Paralyzed, he can communicate again thanks to a brain-machine interface


A 30-year-old man with locked-in syndrome, unable even to move his eyes, managed to address his family by means of electrodes implanted in his motor cortex.

Two arrays of microelectrodes, each 3.2 mm square, were inserted into his motor cortex – the part of the brain responsible for movement. Each array carries 64 needle-like electrodes that record neural signals. (Image: Wyss Center)

Completely paralyzed and unable to speak, a 30-year-old man was nonetheless able to address his 4-year-old son. “Do you want to watch Disney’s Robin Hood with me?” he managed to ask him. Researchers from the Wyss Center for Bio and Neuroengineering in Geneva have enabled this man to communicate once more, thanks to two arrays of electrodes implanted in his brain and a computer interface. The clinical case study has been ongoing for over two years with the participant, who has advanced amyotrophic lateral sclerosis (ALS) – a progressive neurodegenerative disease in which people lose the ability to move and speak.

The results show that communication is possible for people with locked-in syndrome caused by ALS, in whom consciousness and cognitive function remain unaffected. The study was published Tuesday in Nature Communications, the Wyss Center said in a press release.

“This study answers a long-standing question regarding whether people with complete locked-in syndrome – who have lost even eye movement – also lose their brain’s ability to generate commands to communicate,” said Jonas Zimmermann, senior neuroscientist at the Wyss Center in Geneva and co-author of the study.

Spelling program

The study participant is a man in his thirties who has been diagnosed with a rapidly progressive form of ALS. Two arrays of microelectrodes were surgically implanted in his motor cortex. The participant, who lives at home with his family, learned to generate brain activity by attempting different movements. These brain signals are picked up by the implanted microelectrodes and are decoded in real time by a machine learning model.
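The study's decoding software is not reproduced in this article. Purely to illustrate the idea of a real-time read-out – turning binned spike counts from the implanted electrodes into a binary “yes”/“no” decision – here is a minimal Python sketch. The function names, the linear read-out, and all parameters are assumptions for illustration, not the Wyss Center's implementation.

```python
import numpy as np

def spike_rate_features(spike_counts: np.ndarray, bin_s: float = 0.05) -> np.ndarray:
    """Convert per-electrode spike counts in one time bin to firing rates (Hz).

    spike_counts: shape (n_electrodes,), e.g. 128 channels
    for two 64-electrode arrays.
    """
    return spike_counts / bin_s

def decode_intent(rates: np.ndarray, weights: np.ndarray, bias: float) -> str:
    """Linear read-out: project firing rates onto a trained weight vector
    and threshold. A real decoder would be fit on labelled movement
    attempts and smoothed over time; this is only the skeleton of the idea.
    """
    return "yes" if float(rates @ weights) > bias else "no"

# Toy usage with random numbers standing in for recorded activity.
rng = np.random.default_rng(seed=0)
counts = rng.poisson(lam=3.0, size=128)   # one 50 ms bin of spike counts
weights = rng.normal(size=128)            # pretend these were trained offline
print(decode_intent(spike_rate_features(counts), weights, bias=0.0))
```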

The model interprets the signals as meaning “yes” or “no”. To reveal what the participant wants to communicate, a spelling program reads the letters of the alphabet aloud. Using auditory neurofeedback, the participant can choose “yes” or “no” to confirm or reject the letter, and ultimately form words and whole sentences.
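The speller loop itself can be made concrete with a short sketch. The version below assumes a decoder like the `decode_intent()` sketch above, wrapped in a `get_answer()` callback; the actual system's letter groupings, pacing, and auditory feedback are not modeled here.

```python
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def spell_sentence(get_answer, max_letters: int = 50) -> str:
    """Auditory speller loop: letters are offered one at a time, and the
    participant's decoded "yes"/"no" confirms or rejects each one.

    get_answer(prompt) stands in for reading the prompt aloud and decoding
    the neural response; it must return "yes" or "no".
    """
    sentence = []
    while len(sentence) < max_letters:
        if get_answer("End of sentence?") == "yes":
            break
        for letter in ALPHABET:
            if get_answer(f"Select '{letter}'?") == "yes":
                sentence.append(letter)
                break
    return "".join(sentence)

# Demo: a scripted "participant" who wants to spell the word "HI".
target, pos = "HI", 0

def scripted_answer(prompt: str) -> str:
    global pos
    if prompt.startswith("End"):
        return "yes" if pos >= len(target) else "no"
    letter = prompt[8]                    # the quoted letter in "Select 'X'?"
    if pos < len(target) and letter == target[pos]:
        pos += 1
        return "yes"
    return "no"

print(spell_sentence(scripted_answer))    # -> HI
```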

(comm/pmi)
