
Unconscious patients can now 'speak' with brain-computer interface tech

By Elizabeth Rayne
The 9th Life of Louis Drax

When you see an unconscious patient in a movie, you sometimes see their thoughts onscreen (like in The 9th Life of Louis Drax, above) or at least hear a voiceover.

That may not entirely stay in science fiction. Critical decisions for patients who cannot communicate are usually made for them. Adrian Owen, neuroscientist and Professor of Cognitive Neuroscience and Imaging at the University of Western Ontario, Canada, and his research team are using brain-computer interfaces to get answers directly from people who can’t answer for themselves any other way.

When the decision is a matter of life and death, hearing from the patient directly could literally be life-altering.

“Brain-computer interfaces (BCIs) are becoming increasingly popular as a tool to improve the quality of life of patients with disabilities,” Owen said in a study recently published in Frontiers in Neuroscience. “Recently, time-resolved functional near-infrared spectroscopy (TR-fNIRS) based BCIs are gaining traction because of their enhanced depth sensitivity.”

BCIs are devices that let the brain communicate with an external device that “speaks” for the patient. Owen’s version used functional near-infrared spectroscopy, a non-invasive method that measures changes in near-infrared light and processes those signals to detect the hemodynamic response, the increase in blood oxygen levels that occurs when more blood flows to the front of the brain. Even NASA developed its own version to monitor what is going on in astronauts’ brains, except those astronauts are conscious.

Owen’s BCI also adds time-resolved (TR) detection to this already unreal tech. Brain activity detected by fNIRS shows up on a digital screen, and photons, particles of light, are what make that screen light up when blood oxygenation changes. TR detection records when each of those photons arrives, which gives the system more depth sensitivity (sensitivity to what goes on deeper in the brain). Because photons that only pass through shallow tissue don’t have to travel as far, those that arrive later can be identified as having come from deeper regions.
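That time-gating idea is simple enough to sketch. The Python snippet below is a hypothetical illustration, not the study’s actual processing pipeline: photon arrival times are split at a made-up “late gate” threshold, and the fraction of late-arriving photons serves as a rough proxy for what is happening in deeper tissue.

```python
import numpy as np

# Hypothetical sketch of time-gating in TR-fNIRS. Photon arrival times
# (picoseconds after the laser pulse) are split into early and late gates;
# late-arriving photons have, on average, travelled deeper into tissue,
# so changes in the late-gate fraction track deeper brain activity.
# All numbers here are simulated and purely illustrative.

rng = np.random.default_rng(0)
baseline_times = rng.gamma(shape=3.0, scale=400.0, size=50_000)  # ps
task_times = rng.gamma(shape=3.0, scale=380.0, size=50_000)      # ps

LATE_GATE_PS = 1500  # photons arriving after this are treated as "deep"

def late_gate_fraction(arrival_times_ps, gate=LATE_GATE_PS):
    """Fraction of detected photons arriving after the late gate."""
    return float(np.mean(np.asarray(arrival_times_ps) > gate))

# A change in the late-gate fraction during the task hints at altered
# absorption (blood oxygenation) in deeper layers of the brain.
print("baseline:", late_gate_fraction(baseline_times))
print("task:    ", late_gate_fraction(task_times))
```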

With the detection technology down, Owen needed to see the actual brain activity that occurred when healthy participants were given questions to respond to. He wasn’t immediately going to ask about things like whether or not to stay on a ventilator. The subjects were instead told to imagine playing tennis as a “yes” answer and to stay relaxed as a “no” answer. Sure enough, for positive answers, the section of the brain that responds to movement lit up on the screen.

Later, the changes in oxygenated and deoxygenated hemoglobin were measured, and oxygenated hemoglobin (oxyhemoglobin) turned out to be the better indicator that a given task had switched brain activity on. “Yes” responses elicited a spike in oxyhemoglobin and a slight decrease in deoxyhemoglobin.
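To make that readout concrete, here is a toy decision rule in Python. It assumes synthetic hemoglobin traces and made-up thresholds, and it is only a sketch of the idea, not the statistical analysis used in the study: a trial counts as a “yes” when oxyhemoglobin rises and deoxyhemoglobin dips relative to a pre-question baseline.

```python
import numpy as np

def classify_trial(hbo, hbr, baseline_samples=20, oxy_threshold=0.5):
    """Toy 'yes'/'no' readout from one trial's hemoglobin traces.

    hbo, hbr: 1-D arrays of oxy-/deoxy-hemoglobin concentration changes
    (arbitrary units). The threshold is an illustrative placeholder.
    """
    hbo, hbr = np.asarray(hbo), np.asarray(hbr)
    hbo_change = hbo[baseline_samples:].mean() - hbo[:baseline_samples].mean()
    hbr_change = hbr[baseline_samples:].mean() - hbr[:baseline_samples].mean()
    if hbo_change > oxy_threshold and hbr_change < 0:
        return "yes"  # hemodynamic response consistent with tennis imagery
    return "no"       # no clear response: the participant stayed relaxed

# Synthetic example: a spiking "yes" trial vs. a flat "no" trial.
yes_hbo = np.concatenate([np.zeros(20), 1.2 * np.ones(80)])
yes_hbr = np.concatenate([np.zeros(20), -0.2 * np.ones(80)])
print(classify_trial(yes_hbo, yes_hbr))              # -> "yes"
print(classify_trial(np.zeros(100), np.zeros(100)))  # -> "no"
```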

“The ‘yes’ responses show the expected hemodynamic changes in oxy- and deoxy-hemoglobin, which are absent in the ‘no’ responses,” Owen observed, adding that “this work highlights the potential of TR-fNIRS as a BCI for mental communication.”

BCIs using TR-fNIRS still need further study to rule out errors that could lead to misreading an answer from an actual unconscious patient, but we’re that much closer to giving these patients a voice.

(via Frontiers in Neuroscience)