August 8, 2008 — In 2002, Peggy Chun, a Hawaii artist popular for her colorful and sometimes whimsical paintings of lush tropical vegetation and blue seas, was diagnosed with amyotrophic lateral sclerosis, a debilitating neurological disorder also known as Lou Gehrig's disease.
Her mother, twin sister and a grandfather died of the same hereditary disease. Chun knew that with time she, too, would lose control of her muscles and eventually become fully paralyzed.
But until then, she was determined to continue to paint, her profession and lifelong passion. Creativity was a way of life for her, no matter how she felt physically. In fact, on the days she felt the worst, she often painted her best — colorful, funny pictures of flying cows and smiling dolphins and dancing tigers and giraffes and hippopotamuses, and of pigs roasting marshmallows.
As she lost control of her right hand, she learned to paint with her left. And when she could no longer paint with either hand, and had become dependent on a ventilator to breathe, she learned to paint with a brush gripped tightly in her mouth. Soon her jaw muscles weakened, and it appeared that she would never paint again.
Then she began using a computer system called ERICA (Eye-Gaze Response Interface Computer Aid), developed by Thomas Hutchinson, a now-retired University of Virginia systems engineering professor. The system allowed Chun to paint digitally by gazing at a computer screen and using a visual keypad to choose colors and create patterns.
Eventually, though, she became fully paralyzed — "locked-in" — and lost control of her eye movement, even as her mind continued to function normally. Dennis Proffitt, a U.Va. psychology professor, learned of Chun's story and began exploring new ways to use technology to help her continue to create art. Proffitt specializes in working with emerging technologies to create interfaces with the mind, among his other research interests.
• View the UVA Today video, produced by Rob Smith
Chun now creates "brain art," using a new technology that Proffitt and his team are testing and refining with colleagues from Georgia Tech and a Hawaiian company called Archinoetics. The National Science Foundation funds Proffitt's research.
"Our hope is to allow people with locked-in syndrome to not only communicate their needs, but also to continue to contribute to society and their families, and their own sense of fulfillment, by allowing them to creatively express themselves," Proffitt said.
The system allows Chun to answer simple yes-or-no questions by responding in her mind.
A sensor placed on her head, in effect, reads her answers. Using infrared imaging, it detects changing blood flow in Broca's area, a small region at the front left of the brain that becomes active when a person is using language, whether silently or aloud.
Chun activates that area of her brain when she rapidly counts 1, 2, 3, 4, 5 …, which is used to signify a "yes" answer. When she wants to say "no," she sings mentally. Some patients simply say in their mind, very slowly, "la … la … la … la."
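The decision logic described above can be sketched in a few lines. This is purely an illustrative assumption, not the actual Archinoetics/Georgia Tech implementation: it imagines the sensor reporting relative blood-flow readings from Broca's area, and treats sustained activation above a calibrated baseline as "yes".

```python
# Hypothetical sketch of the yes/no decision: counting silently drives
# Broca's-area blood flow up (a "yes" pattern), while singing leaves it
# nearer baseline (a "no" pattern). All names, units, and thresholds here
# are illustrative assumptions.

def classify_response(samples, baseline, threshold=0.2, min_fraction=0.6):
    """Return 'yes' if enough samples exceed `baseline` by more than
    `threshold`, otherwise 'no'. `samples` is a list of relative
    blood-flow readings from the sensor."""
    if not samples:
        raise ValueError("no samples recorded")
    active = sum(1 for s in samples if s - baseline > threshold)
    return "yes" if active / len(samples) >= min_fraction else "no"

# Sustained activation well above baseline reads as "yes" ...
print(classify_response([0.9, 1.1, 1.0, 1.2, 0.95], baseline=0.7))  # yes
# ... while readings hovering near baseline read as "no".
print(classify_response([0.72, 0.68, 0.75, 0.70, 0.71], baseline=0.7))  # no
```

In practice the real system would need per-patient calibration and noise filtering, which is why Proffitt describes the accuracy as "increasing" rather than perfect.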
"We're not reading their minds," Proffitt said, "but we are able to determine, with increasing accuracy, simple responses to simple questions. This allows the patient to articulate an answer and not be so locked-in."
In the case of Peggy Chun, the technology allows her to continue to make art. By answering yes or no, she is able to choose colors and shades from a palette on a computer screen, and brain-paint in gentle horizontal lines, images that look much like a sky at dusk or dawn.
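A sequence of yes/no answers can drive the row-by-row painting described above. The sketch below is an assumed illustration, not the actual software: the program offers each palette color in turn, and fills the next horizontal line with the first color the painter accepts.

```python
# Illustrative sketch of yes/no-driven "brain painting": each canvas row is
# filled with the first offered color the painter answers "yes" to.
# The palette names and the answer callback are hypothetical.
import itertools

PALETTE = ["indigo", "violet", "rose", "amber", "gold"]

def paint_rows(n_rows, answer):
    """Build a list of row colors. `answer(color)` returns True ("yes") or
    False ("no") for each offered color; offers cycle through the palette
    until one is accepted for the current row."""
    canvas = []
    for _ in range(n_rows):
        for color in itertools.cycle(PALETTE):
            if answer(color):
                canvas.append(color)
                break
    return canvas

# Example: a painter whose answers arrive as a fixed sequence of yes/no.
offers = iter([False, True, True, False, False, True])
print(paint_rows(3, lambda c: next(offers)))  # ['violet', 'indigo', 'rose']
```

Stacked top to bottom, such rows of gradually shifting color would produce exactly the kind of soft horizontal bands the article compares to a sky at dusk or dawn.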
Jonathan Zadra, a doctoral student in Proffitt's lab, was with Chun when she saw the first full-sized prints of her brain art.
"I could see what it meant to her family that she was able to do that," he said. "It became a driving factor for me to want to continue in this field. … It's very satisfying to know that we are developing ways to help people in a paralyzed state continue to express themselves creatively."
— By Fariss Samarrai