Models & Releases

How AI can read our scrambled inner thoughts

A paralyzed woman's internal monologue appears on screen as AI decodes her brain signals into text.

Deep Dive

Stanford University researchers used an implanted electrode array and AI to decode the brain signals of a 52-year-old woman paralyzed by a stroke, turning her imagined speech into text on a screen. A separate Japanese study combined three AI tools with non-invasive brain scans to perform 'mind captioning', generating text descriptions of visual thoughts. These brain-computer interfaces (BCIs) aim to restore communication for patients with paralysis or ALS, and companies such as Neuralink are pushing toward commercial scale.
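For readers curious how a speech BCI decoder is typically put together, here is a minimal, hypothetical sketch in PyTorch: windowed neural features feed a small recurrent network that emits phoneme probabilities, and a greedy pass collapses those into a sequence. The channel count, model size, and phoneme inventory are illustrative assumptions, not details of the Stanford system.

```python
# Hypothetical sketch of a speech-BCI decoding pipeline (not the Stanford model):
# neural activity is windowed into feature vectors, a recurrent network maps each
# window to phoneme probabilities, and a CTC-style greedy pass produces phonemes.

import torch
import torch.nn as nn

PHONEMES = ["<blank>", "AH", "B", "D", "EH", "K", "S", "T"]  # toy inventory

class SpeechDecoder(nn.Module):
    def __init__(self, n_channels: int = 256, hidden: int = 128):
        super().__init__()
        # n_channels: per-window electrode features (assumed count, for illustration)
        self.rnn = nn.GRU(n_channels, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, len(PHONEMES))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_windows, n_channels) -> (batch, time_windows, phonemes)
        out, _ = self.rnn(x)
        return self.head(out)

def greedy_decode(logits: torch.Tensor) -> list[str]:
    """Collapse repeated predictions and drop blanks (CTC-style greedy pass)."""
    ids = logits.argmax(dim=-1).squeeze(0).tolist()
    tokens, prev = [], None
    for i in ids:
        if i != prev and i != 0:  # index 0 is the blank symbol
            tokens.append(PHONEMES[i])
        prev = i
    return tokens

if __name__ == "__main__":
    model = SpeechDecoder()
    fake_neural_windows = torch.randn(1, 50, 256)  # 50 windows of 256 features
    print(greedy_decode(model(fake_neural_windows)))  # untrained: arbitrary output, shows the flow
```

In real systems the phoneme stream is further passed through a language model to produce readable sentences, which is where much of the reported speed and accuracy comes from.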

Key Points
  • Stanford's 2025 BCI trial achieved real-time decoding of imagined speech at 18 words per minute using implanted electrodes and AI.
  • Japanese researchers developed 'mind captioning' by combining three AI tools with non-invasive brain scans to decode visual thoughts into text descriptions (see the conceptual sketch after this list).
  • Neuroengineers expect commercialization within years, led by Neuralink and others, aiming to restore speech for paralyzed patients.
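
The mind-captioning idea referenced above can be illustrated with a similarly hedged sketch: a linear decoder maps brain-scan features into a text-embedding space, and the nearest candidate caption is returned. The Ridge decoder, random data, and toy captions below are assumptions for illustration; the published pipeline is considerably more elaborate.

```python
# Hypothetical sketch of caption retrieval from brain scans (not the published method):
# decode scan features into a semantic embedding, then pick the closest candidate caption.

import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Pretend training data: 200 scans of 5000 voxel features each, paired with
# 64-dim text embeddings of captions describing what the subject was viewing.
X_train = rng.standard_normal((200, 5000))   # fMRI feature vectors
Y_train = rng.standard_normal((200, 64))     # caption embeddings (e.g. from a language model)

# Linear decoder from brain activity to the semantic embedding space.
decoder = Ridge(alpha=10.0).fit(X_train, Y_train)

# Candidate captions and their (pretend) embeddings.
captions = ["a dog running on grass", "a person drinking coffee", "a red car on a road"]
caption_embeddings = rng.standard_normal((len(captions), 64))

def caption_for(scan: np.ndarray) -> str:
    """Decode a scan to an embedding and return the most similar candidate caption."""
    predicted = decoder.predict(scan[None, :])[0]
    sims = caption_embeddings @ predicted / (
        np.linalg.norm(caption_embeddings, axis=1) * np.linalg.norm(predicted)
    )
    return captions[int(np.argmax(sims))]

new_scan = rng.standard_normal(5000)
print(caption_for(new_scan))  # with random data this is arbitrary; it shows the flow
```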

Why It Matters

BCI technology could restore communication for paralyzed patients and transform human-computer interaction.