Brain Computer Interface neural scanning
Credit: JEAN-PIERRE CLATOT/AFP via Getty Images

Don’t speak! This AI-powered neural interface takes dictation…by silently ‘reading’ your mind

May 12, 2021, 9:56 PM EDT

As AI inches closer to brain-like mimicry (and science shows off the neurally linked power of teaching monkeys to play video games), the tech-based connection between humans and machines just keeps getting tighter. But in a new twist that frames mind-reading in a whole new light, there’s now an AI-powered neural interface that can literally tune in to your text-based thoughts.

Stanford Prof. Krishna V. Shenoy and a team of researchers have reported on their findings after observing the interactions between a human test subject and their specially designed brain-computer interface (BCI). After linking the AI to the human volunteer via a series of sensors implanted in the subject’s brain — a 65-year-old man paralyzed from the neck down — the BCI was able, in real time, to “read” the individual letters of the alphabet that the man was envisioning, one by one, as handwritten symbols in his mind.

On top of that, the AI was able to "write" the letters down by displaying them back to the research team on a screen. Think of it as taking dictation in the most direct way possible: not from recognizing patterns of speech, but by looping straight into the human brain itself to decode “the neural signals associated with writing letters,” according to the study’s press page.

Check out the remarkable results in the video below — especially around the 20-second mark, when the AI begins scripting out the letters it's sensing from the subject's brain:

HHMI Howard Hughes Medical Institute on YouTube

Even with a paralyzed human subject, the neural sensors were able to pick up on brain activity stimulated by the mere thought of executing an action. “When an injury or disease robs a person of the ability to move, the brain’s neural activity for walking, grabbing a cup of coffee, or speaking a sentence remains,” the team explains. “Researchers can tap into this activity to help people with paralysis or amputations regain lost abilities.”

For this test, the BCI and human subject stayed in sufficient sync for the AI to accurately decode 90 characters per minute, according to the findings. And though at least one previous BCI study has accomplished a similar feat (at a far slower speed), the new research marks the first time that science has successfully “deciphered the brain activity associated with trying to write letters by hand.”

Technophiles and sci-fi fans don’t need much prompting to dream of all the ways mind-reading tech like this could someday be used. But it’s crucial to recognize that the BCI involved in this study was designed only to achieve a specific task — detecting and recognizing the brain’s handwriting activity — and to communicate it back to the team by “writing” it out. In other words, the AI and the human here interact on a very simple two-way street: this research bot isn’t attempting any sophisticated communication, and there’s no interpretive analysis that would cue a computer to respond with its own calculated next move.

That means this mind-reading BCI points toward future tech that serves as a helpful tool for people with disabilities, rather than a mind-scanning super-intelligence that reacts to your thoughts. Researchers involved in the study say their findings could lead to tech that one day helps “people with paralysis rapidly type without using their hands.”