Reaching out to those locked-in
Locked-in syndrome is the stuff of nightmares: You’re awake and aware of your environment, taking in everything, but communicating out nothing. You can’t move – except, if you’re lucky, an occasional voluntary blink of the eye – can’t talk, can’t tell the world that YES I’M ALIVE AND CONSCIOUS AND THINKING.
Locked-in syndrome, or LIS, usually results from specific damage to the lower brain and brainstem, causing loss of control of nearly all voluntary muscles in the body except the eyes. Those suffering from TOTAL LIS lose the ability to blink at will as well, cutting off their only route of communication with the world. They’re literally trapped in their own minds.
Scientists have long tried to establish ways for LIS sufferers to reach out. Since in most cases voluntary blinking is retained, the eyes have been considered the most effective channel of communication, whether by answering yes-no questions (one blink=yes, two=no) or by using the number of blinks to spell out letters of the alphabet. More recently, scientists have turned to brain-computer interfaces, using EEG caps or fMRI* to directly measure brain activity and decode its meaning through machine-learning algorithms. Unfortunately, such methods often require surgery, expensive equipment and extensive training with the individual, the latter of which is nearly impossible to achieve with those completely locked-in. (*REALLY cool new stuff presented at the Canadian Association for Neuroscience conference a while back - will blog about it for sure!)
What if, instead of relying on training, we could use a subconsciously controlled behavioral readout to measure consciousness and communicate?
Josef Stoll et al. (2013). Pupil responses allow communication in locked-in syndrome patients. Current Biology, 23(15), 647-648.
The “behavior” in question is dilation of the pupil. Our pupils automatically change size in response to external cues, such as sunlight; they also respond to internal cues, dilating during arousal and mental effort. Here, the scientists cleverly exploited the latter – specifically, the effort needed to perform mental arithmetic – as a tool that lets LIS patients deliberately dilate their pupils to answer a yes/no question.
Here’s the setup. Six healthy volunteers were first recruited to test out the system. To establish “ground truth”, the scientists sat them down in front of a computer and asked a binary question with one correct answer, such as “Are you 26 years old?” The computer then displayed a math problem while announcing “yes” through the speakers. Several seconds later, a different problem was displayed, accompanied by the answer “no”. The crux of the system is that volunteers had to calculate the problem associated with the right answer and deliberately ignore the other. Since mental effort alone is sufficient to dilate the pupils, it didn’t matter whether the volunteers got the answer right – they just had to try. A nearby camera closely monitored their pupil size as they went through the 30 trials.
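To make the decoding step concrete, here’s a minimal sketch in Python of how such a pupil-based yes/no decoder could work. This is my own simplified illustration, not the authors’ analysis code; the function names and the baseline-subtraction comparison are assumptions on my part.

```python
import numpy as np

def decode_trial(pupil_yes, pupil_no, baseline_yes, baseline_no):
    """Guess one trial's answer from pupil-diameter traces.

    pupil_yes / pupil_no: samples recorded while the math problem paired
    with "yes" / "no" was on screen; baseline_*: samples from just before
    each problem appeared. If the pupil dilated more (relative to baseline)
    during the "yes" problem, we guess the intended answer was "yes".
    (Hypothetical simplification of the paper's analysis.)
    """
    dilation_yes = np.mean(pupil_yes) - np.mean(baseline_yes)
    dilation_no = np.mean(pupil_no) - np.mean(baseline_no)
    return "yes" if dilation_yes > dilation_no else "no"

def decode_session(trials):
    """Decode a whole session; trials is a list of
    (pupil_yes, pupil_no, baseline_yes, baseline_no) tuples."""
    return [decode_trial(*t) for t in trials]
```

The paper’s actual analysis is more sophisticated, but the underlying idea – whichever interval shows the stronger effort-related dilation marks the intended answer – is what the sketch tries to capture.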
Amazingly, the method had an 84-99% correct rate in the healthy volunteers (green, HC). Encouraged, the researchers then tried out the system in 7 “typical” LIS patients, who had suffered stroke-related damage to the brainstem but had normal cognitive function. Most patients could not get through all 30 trials, tiring out before the end. In this population, 3 patients achieved higher-than-chance results: the computer managed to decode the “right” answer 67-84% of the time (blue). The method is obviously not perfect; however, when one patient was retested the next day, the computer’s performance soared from 77% to 90% correct, suggesting that training patients to become more adept at focusing their mental effort could enhance the system’s accuracy.
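A quick aside on what “higher than chance” means here: with two possible answers, blind guessing averages about 50% correct, so what matters is whether the hit rate is statistically unlikely under guessing given the small number of trials. The numbers below are my own illustrative example, not the paper’s statistics.

```python
from scipy.stats import binomtest

# With two possible answers per trial, blind guessing averages 50% correct.
# How surprising would, say, 20 correct decodings out of 24 trials be?
result = binomtest(k=20, n=24, p=0.5, alternative="greater")
print(f"{20/24:.0%} correct over 24 trials: p = {result.pvalue:.4f}")
# ~83% correct over 24 trials is very unlikely under pure guessing (p < 0.001);
# the same percentage over only a handful of trials would not be.
```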
Taking it one step further, the researchers next tried the system in LIS patients with much more severe brain damage, which could impair their cognitive function. Unfortunately, the computer’s decoding fluctuated around chance (38-59%) for all of them (pink). There was a silver lining, however: when the same question was asked twice and the computer gave a consistent response both times, the decoding was almost always correct. Finally, in a non-communicative patient in a minimally conscious state, when instructed to perform the calculation, the computer program managed to get 82% of the answers right (orange).
The system is not a magical mind-reading machine – its decoding accuracy is still fairly flawed. It will need more maturation to be useful for diagnosing consciousness, and it most certainly cannot be used to question or interrogate LIS patients. However, the idea of using automatic behaviour as a means of communication is fascinating, as it (in theory) allows those with total LIS to reach out to the external world. The setup is cheap, simple and usable in daily life or in remote areas without access to fancy machinery. Even in well-equipped hospitals, the system could offer a secondary approach to gauging a person’s consciousness.
After all, for those torturously trapped in their own minds, “pretty good” is still better than nothing.
Josef Stoll, Camille Chatelle, Olivia Carter, Christof Koch, Steven Laureys, & Wolfgang Einhäuser (2013). Pupil responses allow communication in locked-in syndrome patients. Current Biology, 23(15), 647-648. DOI: 10.1016/j.cub.2013.06.011