A development in brain–computer interfaces may help people who have lost the physical ability to speak. At The Scientist, Sahana Sitaraman reports:
There’s a voice inside most people’s minds that comes alive when they listen, read, or prepare to speak. This “internal monologue” is thought to support complex cognitive processes like working memory, logical reasoning, and motivation. In fact, inner speech continues to thrive in many individuals who are unable to speak owing to injury or disease. More than half a century ago, Jacques Vidal, a computer scientist at the University of California, Los Angeles, proposed the idea for brain-computer interfaces (BCIs): systems that could use electrical signals in the brain to control prosthetic devices.
… Francis Willett, a neurosurgeon at Stanford University, and his team developed a speech BCI that could decipher imagined sentences. Their findings, published in Cell, enabled people with speech paralysis to audibly communicate by simply thinking about what they wanted to say in real time. “If you just have to think about speech instead of actually trying to speak, it’s potentially easier and faster for people,” said Benyamin Meschede-Krasa, a graduate student at Stanford University and coauthor of the study, in a statement.
“Brain-Computer Interface Lets Users Communicate Using Thoughts,” August 14, 2025
Implanted in the Brain
The four study participants in the BrainGate2 clinical trial had speech problems due to amyotrophic lateral sclerosis (ALS) or stroke. After devices to record signals from the motor cortex were implanted in their brains, participants were asked to either imagine or try to speak words in English. The device could pick up signals even from imagined words, though those signals were weaker, and a machine learning model could decode them with up to 74 percent accuracy.
From Live Science:
Scientists have pinpointed brain activity related to inner speech — the silent monologue in people’s heads — and successfully decoded it on command with up to 74% accuracy. Publishing August 14 in the Cell Press journal Cell, their findings could help people who are unable to audibly speak communicate more easily using brain-computer interface (BCI) technologies that begin translating inner thoughts when a participant says a password inside their head.
“This is the first time we’ve managed to understand what brain activity looks like when you just think about speaking,” says lead author Erin Kunz of Stanford University. “For people with severe speech and motor impairments, BCIs capable of decoding inner speech could help them communicate much more easily and more naturally.”
“Brain-computer interface could decode inner speech in real time,” August 14, 2025
The system could decode only imagined words, not thoughts in general. The researchers hope it can become sophisticated enough to restore conversational speech.
Technology Raises Concerns About Privacy
The open-access paper hints at a problem, though:
Separately, inner speech may be a way to bypass the current approach of requiring speech BCI users to physically attempt speech, which is fatiguing and can slow communication. Using multi-unit recordings from four participants, we found that inner speech is robustly represented in the motor cortex and that imagined sentences can be decoded in real time. The representation of inner speech was highly correlated with attempted speech, though we also identified a neural “motor-intent” dimension that differentiates the two. We investigated the possibility of decoding private inner speech and found that some aspects of free-form inner speech could be decoded during sequence recall and counting tasks. Finally, we demonstrate high-fidelity strategies that prevent speech BCIs from unintentionally decoding private inner speech.
Kunz EM, et al., “Inner speech in motor cortex and implications for speech neuroprostheses.” Cell. 2025 Jul 9:S0092-8674(25)00681-6. doi: 10.1016/j.cell.2025.06.015. Epub ahead of print. PMID: 40816265; PMCID: PMC12360486.
Note this: “We investigated the possibility of decoding private inner speech and found that some aspects of free-form inner speech could be decoded during sequence recall and counting tasks.” While it may not be the researchers’ intention to “read minds” for the purpose of sniffing out dissidents, many governments would welcome the ability.
Thus, as Rudy Molinek reports at Smithsonian Magazine,
To protect users’ privacy, they chose a passphrase to activate the device that was unlikely to come up in everyday speech: “Chitty Chitty Bang Bang,” the title of the 1964 Ian Fleming novel and 1968 movie. The technology would start translating thoughts when it detected the phrase, which, for one participant, it did with 98.75 percent accuracy.
“Science Fiction? Think Again. Scientists Are Learning How to Decode Inner Thoughts,” August 15, 2025
Without Invading Privacy
And from Gemma Conroy at Nature:
A brain implant can decode a person’s internal chatter — but the device works only if the user thinks of a preset password.
The mind-reading device, or brain–computer interface (BCI), accurately deciphered up to 74% of imagined sentences. The system began decoding users’ internal speech — the silent dialogue in people’s minds — only when they thought of a specific keyword. This ensured that the system did not accidentally translate sentences that users would rather keep to themselves.
The study, published in Cell on 14 August, represents a “technically impressive and meaningful step” towards developing BCI devices that accurately decode internal speech, says Sarah Wandelt, a neural engineer at the Feinstein Institutes for Medical Research in Manhasset, New York, who was not involved in the work. The password mechanism also offers a straightforward way to protect users’ privacy, a crucial feature for real-world use, adds Wandelt.
“A mind-reading brain implant that comes with password protection,” August 14, 2025
In other words, though none of these reports says so directly, the researchers specifically designed the device to work without invading privacy. Others will doubtless have the opposite intent. Stay tuned.
Cross-posted at Mind Matters News.