Scientists say they have found a way to decode the “inner voices” of people who can’t speak

This could be a major breakthrough…

Stanford University researchers have developed a brain-computer interface (BCI) capable of decoding the “inner speech” of people unable to speak, potentially easing communication for those with conditions like amyotrophic lateral sclerosis (ALS).

Published in Cell, the study explored bypassing the physical strain of traditional BCIs, which require users to attempt speech. “If we could decode that, then that could bypass the physical effort,” said neuroscientist Erin Kunz. “It would be less tiring, so they could use the system for longer.”

ALS patient Casey Harrell, a participant in the long-running trial, had earlier used a BCI that restored his speech by decoding his brain signals and drawing on recordings of his old voice. In this new phase, researchers trained AI models to link imagined words directly to text, accurately producing sentences like “I don’t know how long you’ve been here.”

However, the system sometimes detected thoughts participants did not intend to share. To address privacy concerns, researchers introduced an “inner password,” the phrase “Chitty Chitty Bang Bang,” that users think to switch decoding on or off; the system recognized the password with 98.75% accuracy.

Kunz described the work as “proof-of-concept,” marking a significant step toward giving voice to the voiceless while protecting mental privacy.

READ MORE AT FUTURISM
