Turning thoughts into language for people with disabilities

A group of scientists has managed to "translate" brain activity into words with an accuracy of up to 74 percent, a breakthrough that could allow people with severe speech disabilities to communicate.
The research, led by Stanford University in California (United States) and published this Thursday in the journal Cell, could help people who cannot speak communicate more easily using brain-computer interface (BCI) technologies.
"This is the first time we've been able to understand what brain activity is like when you're just thinking about speaking," says lead author Erin Kunz of Stanford University.
"For people with severe motor and speech disabilities, BCIs capable of decoding internal speech could help them communicate much more easily and naturally," he explains.
Brain-computer interfaces (BCIs) are a tool that can help people with disabilities. Using sensors implanted in the brain regions that control movement, these systems decode neural signals related to movement and convert them into actions, such as moving a prosthetic hand.
For people with paralysis, some BCIs can already interpret the brain activity produced when users try to speak aloud, activating the related muscles, and "write out" what they are trying to say.
But both approaches are tiring and slow for people with limited muscle control, whether they are attempting to speak or using systems that track eye movements to type words.
This raised an obvious question: could BCIs decode inner speech instead? "If you just have to think about speech instead of trying to speak it, it's potentially easier and faster for those people," says Benyamin Meschede-Krasa, co-first author and a researcher at Stanford.
To find out, the researchers recorded neural activity from microelectrodes implanted in the motor cortex, the brain region that controls speech movements, in four people with severe paralysis caused by amyotrophic lateral sclerosis (ALS) or a brainstem stroke.
They then asked participants either to attempt to speak or to imagine saying a series of words. Attempted speech and inner speech activated overlapping brain regions and evoked similar patterns of neural activity, although inner speech produced weaker activation overall.
Using the inner-speech data, the team trained AI models to interpret imagined words. In a proof-of-concept demonstration, the BCI decoded imagined sentences drawn from a vocabulary of up to 125,000 words with 74 percent accuracy.
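The study's actual decoder is far more sophisticated than anything that fits in a news article, but the basic recipe, mapping recorded neural features to the most likely word in a fixed vocabulary, can be sketched. The toy example below is purely illustrative: the vocabulary, the 128-channel recordings, and the nearest-centroid classifier standing in for the study's neural networks are all simulated assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each word evokes a characteristic pattern across
# 128 recorded neural channels; inner speech is a weaker, noisier copy.
VOCAB = ["hello", "water", "help", "yes", "no"]
N_CHANNELS = 128

# One "true" template of neural activity per word (unknown in practice).
templates = rng.normal(size=(len(VOCAB), N_CHANNELS))

def record_trial(word_idx, gain=0.6, noise=1.0):
    """Simulate one inner-speech trial: a scaled template plus noise.
    The gain < 1 reflects the finding that inner speech evokes weaker
    activation than attempted speech."""
    return gain * templates[word_idx] + noise * rng.normal(size=N_CHANNELS)

# Train a nearest-centroid decoder from labelled trials.
train_X, train_y = [], []
for w in range(len(VOCAB)):
    for _ in range(50):
        train_X.append(record_trial(w))
        train_y.append(w)
train_X, train_y = np.array(train_X), np.array(train_y)

centroids = np.array([train_X[train_y == w].mean(axis=0)
                      for w in range(len(VOCAB))])

def decode(trial):
    """Return the vocabulary word whose centroid is closest to the trial."""
    dists = np.linalg.norm(centroids - trial, axis=1)
    return VOCAB[int(np.argmin(dists))]

# Evaluate on fresh simulated trials.
correct = sum(decode(record_trial(w)) == VOCAB[w]
              for w in range(len(VOCAB)) for _ in range(100))
print(f"accuracy: {correct / (len(VOCAB) * 100):.0%}")
```

Real speech BCIs reach large vocabularies differently: broadly, a neural network produces probabilities over speech sounds, and a language model assembles them into likely sentences, which is how a 125,000-word vocabulary stays tractable.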
The BCI also picked up inner speech that participants had never been instructed to produce, such as numbers when they were asked to count the pink circles on a screen.
The team also found that although attempted speech and inner speech produced similar patterns of neural activity in the motor cortex, they were different enough to be reliably distinguished from each other.
According to Frank Willett, a Stanford researcher and the paper's senior author, this distinction can be used to train BCIs to ignore inner speech entirely.
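The article does not describe how that filtering works. One hedged way to picture it, consistent with the finding that inner speech evokes weaker activation, is a second classifier that labels each trial as attempted or inner speech so the system can discard the latter. Everything below (channel count, gain values, the linear rule) is a simulated illustration, not the study's method.

```python
import numpy as np

rng = np.random.default_rng(1)
N_CHANNELS = 128

# Illustrative assumption: attempted speech produces the same spatial
# pattern as inner speech, but with stronger activation (higher gain).
pattern = rng.normal(size=N_CHANNELS)

def trial(gain):
    return gain * pattern + rng.normal(size=N_CHANNELS)

attempted = np.array([trial(1.0) for _ in range(200)])
inner = np.array([trial(0.5) for _ in range(200)])

# A simple linear rule: project each trial onto the difference of the
# two class means and threshold at the midpoint between them.
w = attempted.mean(0) - inner.mean(0)
threshold = (attempted.mean(0) + inner.mean(0)) @ w / 2

def is_attempted(x):
    return x @ w > threshold

# A BCI could decode only trials classified as attempted speech,
# ignoring inner speech entirely.
test = [trial(1.0), trial(0.5)]
print([is_attempted(x) for x in test])  # expected: [True, False]
```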
The team also demonstrated a password-controlled mechanism that prevents the BCI from decoding inner speech unless the user temporarily unlocks it with a chosen keyword.
In the experiment, participants could think the phrase "chitty chitty bang bang" to start inner-speech decoding; the system recognized the password with more than 98 percent accuracy.
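The gating logic itself is simple to picture. The sketch below is a hypothetical reconstruction, not the study's code: decoded words arrive one at a time from some decoder (not shown), and nothing is passed through until the full unlock phrase has been recognized in sequence.

```python
from dataclasses import dataclass

PASSWORD = ("chitty", "chitty", "bang", "bang")

@dataclass
class PasswordGate:
    """Suppress decoded inner speech until the unlock phrase is thought.

    `progress` counts how many consecutive password words have been
    matched; decoded output is released only after the full phrase."""
    unlocked: bool = False
    progress: int = 0

    def feed(self, decoded_word: str) -> str | None:
        if self.unlocked:
            return decoded_word          # pass decoded speech through
        if decoded_word == PASSWORD[self.progress]:
            self.progress += 1           # matched the next password word
            if self.progress == len(PASSWORD):
                self.unlocked = True
        else:
            # On a mismatch, restart (the word may still begin the phrase).
            self.progress = 1 if decoded_word == PASSWORD[0] else 0
        return None                      # stay silent while locked

gate = PasswordGate()
stream = ["water", "chitty", "chitty", "bang", "bang", "hello", "help"]
for word in stream:
    out = gate.feed(word)
    if out is not None:
        print(out)   # prints only "hello" and "help"
```

A production system would presumably also need a way to re-lock, such as a timeout or a second phrase; the sketch keeps only the unlocking behavior described in the article.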
"The future of BCIs is bright. This work offers real hope that speech BCIs could one day restore communication as fluid, natural, and comfortable as conversational speech," Willett emphasizes.