In 1994, University of Virginia computer science professor emeritus William Wulf and his then-graduate student, Sally McKee, identified what would become a defining challenge in the field of computer science for decades to come. They called it the “memory wall.”
The memory wall stems from two issues: an outdated computing architecture that physically separates processors from memory, and the fact that a processor can run far faster than memory chips can supply data.
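The gap between processor and memory speed can be felt even from a high-level language: summing the same array in sequential versus random order performs identical arithmetic, but random order defeats the caches and prefetchers that normally hide memory latency. A minimal sketch (the array size is illustrative, and interpreter overhead means the slowdown here understates what bare-metal code would show):

```python
import random
import time

N = 2_000_000
data = list(range(N))

# Sequential order: the hardware prefetcher keeps the CPU fed from cache.
seq_order = list(range(N))

# Random order: most reads miss the cache and stall on main memory.
rand_order = seq_order[:]
random.shuffle(rand_order)

def timed_sum(order):
    """Sum the data in the given visitation order, returning (total, seconds)."""
    start = time.perf_counter()
    total = sum(data[i] for i in order)
    return total, time.perf_counter() - start

total_seq, t_seq = timed_sum(seq_order)
total_rand, t_rand = timed_sum(rand_order)

# Same work, same result; only the memory-access pattern differs.
assert total_seq == total_rand
print(f"sequential: {t_seq:.3f}s  random: {t_rand:.3f}s")
```

On most machines the random-order pass is noticeably slower, even though both loops execute the same number of additions; the exact ratio depends on cache sizes and hardware.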
As early as the 1980s, researchers were predicting that computer systems would not be able to keep up with the projected growth of data. Then came the internet of things – devices connected through the cloud, collecting vast amounts of data. The rapid growth of bioinformatics has been another source of the data explosion.
By 2018, Forbes reported that 90% of the world’s data had been generated in the previous two years alone. The servers processing these data have not kept pace, delaying time-sensitive results such as identifying new COVID variants or responding quickly when a patient falls ill.
That same year, researchers in the University of Virginia’s Department of Computer Science and Charles L. Brown Department of Electrical and Computer Engineering were selected to lead a $29.7 million research effort to remove the memory wall.

