New implant helps patient spell out entire sentences using only brain signals

A man who cannot speak simply needs to think of letters to spell his words.


Andrew Paul

Published Nov 10, 2022 4:30 PM

Researchers hope the implant will one day be wireless. YouTube

Researchers at the University of California, San Francisco have made improvements to a brain implant that allows a paralyzed man who is unable to speak to communicate by translating his brain signals into text. The research, published this week in Nature Communications, demonstrates how the patient can now “type” from a roughly 1,150-word vocabulary bank at a speed of about 29 characters per minute (roughly seven words), with a 94 percent accuracy rate, simply by thinking.

Although a version of the device was first described last year in The New England Journal of Medicine, its capabilities were then far more restricted than the current iteration’s, and relied on the patient attempting to speak words aloud, which a computer system then translated.

[Related: How a personalized brain implant helped one woman’s extreme depression. ]

“Neuroprostheses have the potential to restore communication to people who cannot speak or type due to paralysis,” states the abstract of the research group’s paper, adding that it was unclear whether silent attempts at speaking could be used to control a communication neuroprosthesis.


To tackle this issue, the team used deep machine learning and language modeling techniques to decode letter sequences from the patient, who silently spelled out his words using the NATO phonetic alphabet (“alpha” for “a,” “bravo” for “b,” and so on).

“The NATO phonetic alphabet was developed for communication over noisy channels,” one of the study’s co-authors told Live Science this week. “That’s kind of the situation we’re in, where we’re in this noisy environment of neural recordings.” When the patient thinks the code words, algorithms translate the brain activity picked up by a network of 128 electrodes previously laid across his brain’s surface, particularly atop a region controlling vocal tract muscles and an area involved in hand movements. To end a sentence, the trial participant attempted to squeeze his right thumb, which the algorithm interpreted as the sentence’s endpoint.
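The core decoding idea described above can be illustrated in miniature. The sketch below is purely hypothetical and is not the study’s actual decoder: it simply maps NATO code words to letters and accepts a spelled word only if it appears in a known vocabulary, standing in very loosely for the paper’s neural decoding and language-modeling stages (the `VOCAB` set here is a tiny placeholder for the roughly 1,150-word bank).

```python
# Illustrative sketch only, not the researchers' system: map NATO code
# words to letters, then constrain the spelled result to a vocabulary.
NATO = {
    "alpha": "a", "bravo": "b", "charlie": "c", "delta": "d",
    "echo": "e", "foxtrot": "f", "golf": "g", "hotel": "h",
    "india": "i", "juliett": "j", "kilo": "k", "lima": "l",
    "mike": "m", "november": "n", "oscar": "o", "papa": "p",
    "quebec": "q", "romeo": "r", "sierra": "s", "tango": "t",
    "uniform": "u", "victor": "v", "whiskey": "w", "xray": "x",
    "yankee": "y", "zulu": "z",
}

# Hypothetical stand-in for the ~1,150-word vocabulary bank.
VOCAB = {"hello", "help", "water"}

def decode(code_words):
    """Join the letters for a sequence of code words, accepting the
    result only if it falls inside the known vocabulary."""
    word = "".join(NATO[w] for w in code_words)
    return word if word in VOCAB else None

print(decode(["hotel", "echo", "lima", "papa"]))  # prints "help"
```

In the real system the code words themselves are uncertain predictions from neural activity, so the vocabulary constraint (and a language model over whole sentences) does far more work than this lookup suggests.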

The device currently requires a wired connection to operate, but the researchers hope to eventually move to a wireless interface. They believe the system could combine the approaches of both their previous experiments and this most recent study to allow for both thought-based and vocal communication.
