AI - August 20, 2025

Decoding Inner Speech: Privacy Concerns Rise with Brain-Computer Interfaces That Could Unintentionally Leak Private Thoughts

In a groundbreaking study published in the journal Cell, researchers have demonstrated that brain-computer interfaces (BCIs) can decode not only attempted speech but also a person’s inner monologue. This development, while promising for improving communication for paralyzed individuals, raises concerns about mental privacy and the potential for unintentional oversharing.

The study, led by Erin Kunz of Stanford University’s Neural Prosthetics Translational Laboratory, focused on four participants already using BCIs to communicate. The team aimed to decode imagined speech, a more subtle brain signal than those produced during attempted speech.

During the experiment, participants simply thought about words or sentences instead of actively trying to speak them. Surprisingly, these brain signals were found to be similar to, but weaker than, those produced during attempted speech. With artificial intelligence assistance, researchers were able to translate these faint signals into understandable words, achieving up to a 74% accuracy rate for sentences drawn from a 125,000-word vocabulary.

This advancement made communication faster and less strenuous for the participants. However, it also sparked a concern: if inner speech is similar enough to attempted speech, could it inadvertently be decoded when someone is using a BCI? The research indicated that this was indeed possible, particularly during tasks like recalling a sequence of directions.

To address these privacy concerns, the team implemented two strategies. First, they programmed the device to ignore inner speech signals, although this method sacrificed the speed and ease that decoding inner speech provides. Second, they adopted an approach used by virtual assistants like Alexa and Siri, which respond only when they detect a specific phrase. In this case, the team chose “Chitty Chitty Bang Bang,” since it rarely occurs in conversation and is highly distinctive. This allowed participants to control when their inner speech could be decoded.
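The wake-phrase strategy described above resembles the keyword gating used by consumer voice assistants. The sketch below illustrates the general idea, not the study's actual system: a hypothetical gate that discards decoded words until the trigger phrase appears in a rolling window. The class name, buffering scheme, and trigger handling are all illustrative assumptions.

```python
# Illustrative sketch of keyword-gated decoding, analogous to a wake-word
# system. This is NOT the Stanford team's implementation; names and the
# windowing scheme are assumptions for demonstration only.

TRIGGER = "chitty chitty bang bang"


class GatedDecoder:
    """Emit decoded inner-speech words only after a trigger phrase is seen."""

    def __init__(self, trigger: str = TRIGGER):
        self.trigger = trigger.lower()
        self.unlocked = False
        self.buffer = ""  # rolling window of recently decoded words

    def feed(self, decoded_word: str):
        """Process one decoded word; return it only once unlocked."""
        if self.unlocked:
            return decoded_word
        # Append to a bounded rolling window and scan for the trigger.
        self.buffer = (self.buffer + " " + decoded_word.lower()).strip()
        self.buffer = self.buffer[-(len(self.trigger) + 20):]
        if self.trigger in self.buffer:
            self.unlocked = True
        return None  # gated: nothing leaves the device
```

In this toy design the gate sits between the decoder and any output channel, so inner speech that precedes the trigger phrase is simply dropped rather than transmitted.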

Despite these privacy measures, Nita Farahany, a professor of law and philosophy at Duke University, expresses reservations. She notes that participants were still unable to prevent the BCI from detecting numbers they were thinking about, even though they did not intend to share them. This suggests that the boundary between public and private thought may be more blurred than assumed.

Privacy concerns are less significant for surgically implanted BCIs, which will be subject to Food and Drug Administration regulation upon market release. However, this level of oversight and regulation might not extend to consumer BCIs, expected to be worn as caps for activities such as video gaming. Early consumer devices may lack the sensitivity to detect words in the same manner as implanted devices, but the study suggests that this capability could be developed in the future.

If so, companies like Apple, Amazon, Google, and Meta might gain access to a consumer’s inner thoughts, even if the individual does not intend to share them. Farahany emphasizes, “We have entered an entirely new frontier with this era of brain transparency.” Encouragingly, researchers are already exploring ways to help individuals protect their private thoughts in this brave new world.