Science

Scientists create an AI device that converts internal thoughts into language

A device implanted in the brain has been able to read, in real time, the thoughts of four people with severe paralysis.

Cells and artificial intelligence.
ARA
14/08/2025

A team of scientists has managed to decipher the brain activity that occurs during inner speech, the silent monologue that runs through people's minds when they think about speaking, and has translated it into words with up to 74% accuracy. The research, led by Stanford University in the United States, was published this Thursday in the journal Cell and could help people who cannot speak communicate more easily using brain-computer interface (BCI) technologies.

"This is the first time we've been able to understand what brain activity is like when you're just thinking about speaking," says lead author Erin Kunz of Stanford University. "For people with severe speech and motor disabilities, BCIs that can decode inner speech could help them communicate much more easily and naturally."

BCIs are tools that can assist people with disabilities. Using sensors implanted in the brain regions that control movement, these systems decode movement-related neural signals and translate them into actions, such as moving a prosthetic hand.

For people with paralysis, some BCIs can already interpret the brain activity of users as they try to speak aloud, activating the related muscles, and "type" out what they are trying to say. But even with systems that track users' eye movements to type words, trying to speak is tiring and slow for those with limited muscle control.

With these cases in mind, the research set out to determine whether BCIs could decode inner speech instead: "If they can just think about speech instead of trying to speak, it's potentially easier and faster for these people," says Benyamin Meschede-Krasa, co-senior author of the study and a researcher at Stanford.

To find out, they recorded neural activity from microelectrodes implanted in the motor cortex (a brain region involved in producing speech) of four people with severe paralysis caused by amyotrophic lateral sclerosis (ALS) or a brainstem stroke. They then asked participants either to attempt to speak or to imagine saying a series of words, and found that attempted speech and inner speech activated overlapping brain regions and evoked similar patterns of neural activity, although inner speech produced weaker activation overall.

The password "chitty chitty bang bang"

Using the inner speech data, the team trained AI models to interpret imagined words. In a proof-of-concept demonstration, the BCI decoded imagined sentences drawn from a vocabulary of up to 125,000 words with 74% accuracy.

The team also found that although attempted speech and inner speech produce similar patterns of neural activity in the motor cortex, they were different enough to be reliably distinguished from one another. According to Stanford researcher Frank Willett, senior author of the study, this distinction can be used to train BCIs to ignore inner speech entirely. The team also developed a password-controlled mechanism that keeps the BCI from decoding inner speech until the user wants it to: in the experiment, users could think of the phrase "chitty chitty bang bang" to start decoding their inner speech. The system recognized the password with over 98% accuracy.

"The future of BCIs is bright. This work offers real hope that speech BCIs could one day restore communication as fluent, natural, and comfortable as conversational speech," Willett emphasizes.
