A bioelectric neural interface towards intuitive prosthetic control for amputees

Anh Tuan Nguyen, Jian Xu, Ming Jiang, Diu Khue Luu, Tong Wu, Wing Kin Tam, Wenfeng Zhao, Markus W. Drealan, Cynthia K. Overstreet, Qi Zhao, Jonathan Cheng, Edward W. Keefer

Research output: Contribution to journal › Article › peer-review

Abstract

OBJECTIVE: While prosthetic hands with independently actuated digits have become commercially available, state-of-the-art human-machine interfaces (HMIs) only permit control over a limited set of grasp patterns, an improvement too modest in daily activities to make an active prosthesis genuinely useful to amputees.

APPROACH: Here we present a technology platform combining fully-integrated bioelectronics, implantable intrafascicular microelectrodes and deep learning-based artificial intelligence (AI) to provide this missing bridge by tapping into the intricate motor control signals of peripheral nerves. The bioelectric neural interface includes an ultra-low-noise neural recording system to sense electroneurography (ENG) signals from microelectrode arrays implanted in the residual nerves, and AI models employing the recurrent neural network (RNN) architecture to decode the subject's motor intention.
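The decoding stage described above can be illustrated with a minimal sketch. This is not the authors' published model; it assumes an Elman-style RNN that maps windowed ENG features from a hypothetical number of recording channels to 15 degrees-of-freedom outputs, with all dimensions and weights chosen purely for illustration.

```python
import numpy as np

# Minimal sketch (not the paper's model): an Elman-style RNN mapping
# windowed ENG features from N_CH channels to N_DOF prosthesis outputs.
# All sizes below are illustrative assumptions, not values from the study.
N_CH, HIDDEN, N_DOF = 16, 64, 15

rng = np.random.default_rng(0)
W_xh = rng.normal(0, 0.1, (HIDDEN, N_CH))    # input -> hidden
W_hh = rng.normal(0, 0.1, (HIDDEN, HIDDEN))  # hidden -> hidden (recurrence)
W_hy = rng.normal(0, 0.1, (N_DOF, HIDDEN))   # hidden -> DOF readout

def decode(eng_windows):
    """Run the RNN over a (T, N_CH) sequence of per-window ENG features
    and return a (T, N_DOF) sequence of predicted DOF commands."""
    h = np.zeros(HIDDEN)
    outputs = []
    for x in eng_windows:
        h = np.tanh(W_xh @ x + W_hh @ h)  # recurrent state update
        outputs.append(W_hy @ h)          # linear readout per time step
    return np.array(outputs)

preds = decode(rng.normal(size=(100, N_CH)))  # 100 time steps of features
print(preds.shape)  # (100, 15)
```

In practice such a decoder would be trained on paired ENG recordings and intended movements; the recurrence is what lets per-window nerve features inform a smooth, temporally coherent movement estimate.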

MAIN RESULTS: A pilot human study has been carried out on a transradial amputee. We demonstrate that the information channel established by the proposed neural interface is sufficient to provide high-accuracy control of a prosthetic hand with up to 15 degrees of freedom (DOF). The interface is intuitive, as it directly maps complex prosthesis movements to the patient's true intention.

SIGNIFICANCE: Our study lays the foundation for not only a robust and dexterous control strategy for modern neuroprostheses at a near-natural level approaching that of the able hand, but also an intuitive conduit for connecting human minds and machines through the peripheral neural pathways. (Clinical trial identifier: NCT02994160).

Original language: English (US)
Article number: 066001
Journal: Journal of Neural Engineering
Volume: 17
Issue number: 6
Early online date: Oct 22 2020
DOIs
State: Published - Nov 11 2020

Bibliographical note

Funding Information:
The surgery and patient-related costs were supported in part by DARPA under Grants HR0011-17-2-0060 and N66001-15-C-4016. The human motor decoding experiments, including the development of the prototype, were supported in part by the MnDRIVE Program and the Institute for Engineering in Medicine at the University of Minnesota, in part by the NIH under Grant R01-MH111413-01, in part by NSF CAREER Award No. 1845709, in part by Fasikl Incorporated, and in part by Singapore Ministry of Education funding R-263-000-A47-112 and Young Investigator Award R-263-000-A29-133. Z. Yang is co-founder of, and holds equity in, Fasikl Incorporated, a sponsor of this project. This interest has been reviewed and managed by the University of Minnesota in accordance with its Conflict of Interest policy. J. Cheng and E. W. Keefer have ownership in Nerves Incorporated, a sponsor of this project.

Publisher Copyright:
© 2020 IOP Publishing Ltd.

Keywords

  • Artificial intelligence
  • Deep learning
  • Frequency-shaping amplifier
  • Fully-integrated bioelectronics
  • Intrafascicular microelectrodes
  • Motor decoding
  • Peripheral nerve interface

PubMed: MeSH publication types

  • Journal Article
