
Brain-Computer Interface Enables Quadriplegic Man to Feed Himself


Source: Placidplace/Pixabay

A new study published in Frontiers in Neurorobotics shows how a brain-computer interface enabled a quadriplegic man to feed himself for the first time in three decades by operating two robotic arms with his mind. Brain-computer interfaces (BCIs), also known as brain-machine interfaces (BMIs), are neurotechnologies powered by artificial intelligence (AI) that enable people with speech or motor impairments to live more independently.

“This demonstration of bimanual robotic system control through a BMI in collaboration with intelligent robot behavior has major implications for the restoration of complex motor behaviors for those living with sensorimotor disabilities,” the authors wrote in the study. The study was led by principal investigator Pablo A. Celnik, MD, of Johns Hopkins Medicine, as part of a clinical trial conducted under a Food and Drug Administration Investigational Device Exemption.

The participant, a partially paralyzed 49-year-old quadriplegic man who had lived with a spinal cord injury for about 30 years before the study, was implanted with six Blackrock Neurotech NeuroPort electrode arrays in the motor and somatosensory cortices of the left and right hemispheres of his brain to record his neural activity. Four arrays were implanted in the left hemisphere: two 96-channel arrays in the primary motor cortex and two 32-channel arrays in the somatosensory cortex. In the right hemisphere, a 96-channel array was implanted in the primary motor cortex and a 32-channel array in the somatosensory cortex.

The participant was asked to perform tasks while the implanted microelectrode arrays recorded his brain activity via a wired connection to three 128-channel NeuroPort Neural Signal Processors. He sat at a table between two robotic arms, with a pastry on a plate placed in front of him, and was tasked with using his mind to guide the robotic arms, fitted with a fork and a knife, to cut a piece of the pastry and bring it to his mouth.

The goal was for the robotic arms to perform most of the task while the participant retained control over certain aspects of the movement. The researchers confirmed that this shared control of the robotic limbs made the task, which requires both fine maneuvering and bimanual coordination, much faster to complete. The robot was given the approximate locations of the participant’s plate, the food, and his mouth in advance.
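As a rough illustration of the shared-control idea described above (a minimal sketch only; the function name, weighting scheme, and parameters are hypothetical and not the study's actual controller), one common approach blends a neurally decoded velocity command with an autonomous controller's command, with a weight setting how much authority the user has:

```python
import numpy as np

def shared_control_step(decoded_velocity, autonomous_velocity, user_authority):
    """Blend a neurally decoded command with an autonomous robot command.

    decoded_velocity:    velocity the BCI decodes from neural activity
    autonomous_velocity: velocity the robot's planner proposes
    user_authority:      weight in [0, 1]; 1.0 gives the user full control
    (Hypothetical linear-blending scheme for illustration.)
    """
    decoded = np.asarray(decoded_velocity, dtype=float)
    autonomous = np.asarray(autonomous_velocity, dtype=float)
    return user_authority * decoded + (1.0 - user_authority) * autonomous

# Example: the robot handles the coarse reach while the user fine-tunes.
cmd = shared_control_step([0.2, 0.0, 0.1], [0.0, 0.5, 0.0], user_authority=0.3)
```

In such a scheme, setting `user_authority` per degree of freedom (rather than globally) lets the robot own the coarse reaching motion while the participant steers only the dimensions he can reliably modulate.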

“Using neurally-driven shared control, the participant successfully and simultaneously controlled the movements of both robotic limbs to cut and feed food in a complex bimanual self-feeding task,” the researchers reported.

