Editors' Choice | PROSTHETIC DEVICES

A real-life cyborg brain


Science Translational Medicine  16 Oct 2019:
Vol. 11, Issue 514, eaaz3717
DOI: 10.1126/scitranslmed.aaz3717

Abstract

A wireless brain neural interface and a four-limb exoskeleton enabled a tetraplegic man to grasp and walk again.

The idea of a “cyborg brain” that allows machines to be controlled by thought has inspired countless works of art and literature. This fantasy is becoming reality. Using neural interfaces, scientists have learned to record signals from the brain and use them to control robotic limbs. This brings great hope for patients with tetraplegia, the loss of use of all four limbs due to spinal cord injury. However, many technological gaps must be closed to achieve this goal. Major challenges include the lack of neural interfaces reliable enough for long-term signal recording and the difficulty patients face in controlling multiple limbs simultaneously, which most everyday activities, such as walking while carrying items, demand.

Using a comprehensive rehabilitation strategy, Benabid and co-workers addressed these technological gaps in a 28-year-old man with tetraplegia. The brain regions to be recorded were identified by magnetic resonance imaging while the patient imagined moving different parts of his body, including the elbows, wrists, ankles, and legs. Wireless signal recorders, each carrying 64 electrodes, were implanted in the patient’s epidural space above the identified brain regions. Brain signals were then collected wirelessly while the patient imagined moving those body parts. The signals were decoded by an algorithm and fed to a game-like interface that displayed the corresponding animated motions to the patient, and his feedback was used to complete the training of the decoding algorithm. After training, the patient’s brain signals were used to control an exoskeleton capable of 14 independent movements, enabling him to perform reach-and-touch gestures with the upper limbs and gait motions with the lower limbs. The algorithm remained usable for up to 7 weeks without recalibration, showing that the brain interfaces were stable over long-term implantation.
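To make the decoding step concrete, the sketch below shows, in simplified form, how recorded signal features might be mapped to discrete movement intents. It is an illustrative toy only, not the decoder used in the study: the feature definition, class labels, and nearest-centroid classifier are all assumptions chosen for clarity.

```python
# Illustrative sketch only: a toy decoder mapping epidural recording features
# to discrete movement intents. Feature sizes, labels, and the classifier are
# assumptions for illustration, not the algorithm reported in the study.
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 64          # the study used 64-electrode wireless epidural recorders
N_FEATURES = N_CHANNELS  # e.g., one band-power feature per electrode (assumed)
MOVEMENT_INTENTS = ["rest", "left_hand_reach", "right_hand_reach", "walk"]

# Simulated training data: feature vectors labeled with the imagined movement.
X_train = rng.normal(size=(200, N_FEATURES))
y_train = rng.integers(0, len(MOVEMENT_INTENTS), size=200)

# Toy decoder: assign each new signal window to the nearest class centroid.
centroids = np.array([X_train[y_train == k].mean(axis=0)
                      for k in range(len(MOVEMENT_INTENTS))])

def decode(features: np.ndarray) -> str:
    """Return the movement intent whose centroid is closest to the features."""
    distances = np.linalg.norm(centroids - features, axis=1)
    return MOVEMENT_INTENTS[int(np.argmin(distances))]

# Example: decode one new window of simulated signal features.
print(decode(rng.normal(size=N_FEATURES)))
```

In the study itself, the decoder was trained in a closed loop, with the patient's feedback on the displayed animations used to refine the mapping before it was applied to the exoskeleton.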

Although real body movements are far more sophisticated than those the exoskeleton can perform, and studies in larger cohorts are needed to fully assess the potential of brain-controlled exoskeletons, this work represents a major milestone in medical engineering.
