July 1, 2009 — Researchers have struggled for more than 50 years to build machines that work like the human brain. But even the most advanced supercomputers don't come close to the brain in terms of computational efficiency.
Now an interdisciplinary research team at the University of Virginia is tackling this elusive challenge.
William Levy, professor of neurological surgery, and Mircea Stan, professor of electrical and computer engineering, received a $27,000 nanoSTAR Institute seed grant last year that is allowing them to explore biologically inspired computational possibilities.
Levy is a theorist, and his laboratory creates realistic computer simulations of what goes on in the brain as it performs different cognitive tasks. Stan specializes in designing high-performance, low-power electronics at the nanoscale. Their shared goal is to find a way to characterize the brain's highly efficient algorithms and identify specific hardware designs that can run them.
Advances in computation are currently limited by problems of power consumption and energy dissipation. Simulating just one cubic millimeter of brain tissue – about one-millionth of the brain's overall capacity – consumes about 146,000 watts, Levy said. In contrast, the entire brain runs on about 20 to 25 watts.
"It's like a dim light bulb," Levy said. "Nature is able to do what must be on the order of trillions of computations a second with almost no energy."
In a typical computer, the power supply is housed in one corner of the unit, and power must be distributed from there to every component. In contrast, the brain has miniature power supplies distributed throughout each neuron – roughly one every 10 nanometers.
"The power supplies are right where the power is needed," Levy said. "And energy consumption is proportional to its use."
Computers also keep information storage and processing in separate areas and must shuttle data between them. The brain carries out storage and computation in the same place at the same time, which is a huge advantage.
"It turns out that the way that the brain and neurosystems work is that the signals are doing computation and communication at the same time and in essentially the same format," Stan said. The team hopes to mimic these valuable characteristics.
This research could be applied to both high- and low-end computational problems. On the high end, Stan and Levy can imagine using their biologically inspired system to address complex challenges that include vast numbers of variables. This could come in handy for data mining or predicting weather or a market crash.
"Current technology fails miserably and is very quickly overwhelmed by increasing levels of complexity," Stan said. "Biologically inspired systems may have a better clue as to how to deal with these higher levels of complexity."