January 11, 2008 — Peggy Chun is a popular artist known for her bold watercolor paintings that capture the spirit of her Hawaiian home. But in 2002, the painter was diagnosed with amyotrophic lateral sclerosis, also known as Lou Gehrig's disease. This debilitating neurological disorder progressively destroys a person's motor neurons. As a victim of this incurable disease, Chun can feel, see, smell, taste, think and imagine, but can no longer move in any way. She is, in the parlance of the medical profession, "locked-in."
ALS is the most frequent cause of locked-in syndrome, which begins with numbness in the extremities and progresses until all motor function disappears.
"Usually the last thing you lose is eye movement," says University of Virginia cognitive psychologist Dennis Proffitt. "When you lose that, you are cognitively alert, you can think, you can feel, but you can't move a thing. As a result, you can't communicate in any way. It's awful."
Proffitt, whose research focuses on creating computer interfaces that help improve human cognitive processes, is working with colleagues at Georgia Tech and a company called Archinoetics in Hawaii to develop technology that may one day make life for locked-in patients more bearable. Their work is being funded by the National Science Foundation.
Chun is one of several locked-in patients currently testing a technology that allows her to answer simple yes-or-no questions. It involves an interface that uses functional near-infrared imaging to assess activity in Broca's area, a part of the brain where verbal working memory occurs. Just above Chun's left ear, researchers strap a device that projects a light beam through the skull and measures changes in blood volume and oxygenation when Broca's area is engaged.
With the device in place, researchers ask subjects like Chun to count in their heads when they want to activate verbal working memory and initiate a "yes" response. When they want to say "no," subjects think of clouds, rest, or think "la la la." It's a process most people can engage in easily, with little training.
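In principle, the yes/no protocol amounts to a threshold check on the measured signal: counting raises blood oxygenation in Broca's area above a resting baseline, while "la la la" or rest leaves it near baseline. The sketch below is illustrative only; the function name, sample values, baseline, and threshold are assumptions, not the team's actual signal processing.

```python
# Illustrative sketch: classify a window of hypothetical fNIRS
# oxygenation samples from Broca's area as "yes" (engaged) or
# "no" (at rest), relative to a calibrated baseline.

def classify_response(samples, baseline, threshold=0.1):
    """Return 'yes' if the mean reading exceeds baseline by more
    than `threshold`, else 'no'."""
    mean_level = sum(samples) / len(samples)
    return "yes" if mean_level - baseline > threshold else "no"

# Silent counting drives activity up -> "yes"
print(classify_response([0.32, 0.35, 0.34], baseline=0.20))  # yes
# Resting or thinking "la la la" stays near baseline -> "no"
print(classify_response([0.21, 0.19, 0.22], baseline=0.20))  # no
```

In a real system the baseline would be recalibrated per session, since blood-flow readings drift from person to person and day to day.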
"It was hard for us to think of something we could ask a person to do — something easy to control, something you can turn on and off — that we could measure in this way," explains Proffitt, Commonwealth Professor of Psychology. "What we came up with was subvocal speech … talking to yourself.
"It's not reading your thoughts," Proffitt stresses. "We can't do that."
Scientists know the kinds of things the brain is doing because different parts of the brain are activated when a person performs different functions. They know, for example, that moving the left arm activates a particular area on the right side of the brain. The back of the brain is active with visual imagery, and the frontal area of the brain is active when one tries to focus attention on something. Proffitt's system simply detects whether or not a particular area of the brain is actively engaged at the time.
"You could be counting, or you could be reciting a poem. We couldn't tell the difference," Proffitt says. "We have no idea what you're doing. We just know the kind of thing you're doing."
At this time the system is primitive, Proffitt admits, but it's a start. "Right now it's an on/off switch. What we want to do is to get continuous control so the person is not just activating [verbal working memory], but can say by how much. So, not just 'yes' or 'no,' but small to large, continuous control within some range. If we could achieve that in the next few years, that would be a huge improvement in what we will be able to do with the technology."
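The continuous control Proffitt describes could, in principle, be a calibrated mapping from activation level onto a bounded scale rather than a binary switch. This is a guess at what such a mapping might look like; the calibration values are hypothetical and nothing here reflects the actual system.

```python
# Sketch of continuous control: linearly map a hypothetical
# oxygenation reading onto [0.0, 1.0], given per-session
# calibration of the resting and maximum activation levels.

def activation_to_scale(level, rest_level, max_level):
    """Return the reading's position in the calibrated range,
    clamped to [0.0, 1.0]."""
    fraction = (level - rest_level) / (max_level - rest_level)
    return max(0.0, min(1.0, fraction))

# A reading halfway between rest and maximum maps to 0.5.
print(activation_to_scale(0.75, rest_level=0.25, max_level=1.25))  # 0.5
```

Clamping matters because a noisy reading can fall outside the calibrated range; without it, a single spike would push the control value past its limits.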
For the 500,000 people in the world with locked-in syndrome, the ability to communicate, even in this primitive fashion, can make the difference between suffering in silence and living a meaningful life.
Peggy Chun isn't waiting for the technology to evolve, though. She's using the system now as a tool for creativity. With the sensor in place over her left ear, the artist activates Broca's area to select shades from a palette that show up on a computer screen as horizontal gradations of color. She calls it "brain art," and it may be simple, but it's selling like hotcakes.
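One way a painting interface like Chun's might use the on/off signal (purely a guess for illustration; the article does not describe the actual Archinoetics software, and the palette and stepping logic below are invented): each "yes" advances a cursor along the horizontal gradient, and the shade under the cursor is kept when the signal stream ends.

```python
# Hypothetical "brain art" palette selector driven by the
# binary yes/no signal: "yes" steps the cursor along the
# gradient; the final cursor position selects the shade.

palette = ["crimson", "coral", "gold", "teal", "indigo"]

def select_shade(signals):
    """Advance a cursor through the palette on each 'yes'
    signal and return the shade it lands on."""
    cursor = 0
    for s in signals:
        if s == "yes":
            cursor = (cursor + 1) % len(palette)
    return palette[cursor]

# Three "yes" responses step the cursor to the fourth shade.
print(select_shade(["yes", "yes", "no", "yes"]))  # teal
```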
— Written by Linda J. Kobert