The Future of Computing

By Fariss Samarrai

The Architecture

For 70 years, computers have operated on an architectural framework that has worked exceedingly well, multiplying speed and functionality many times over through improvements to core, individual components.

Engineers and computer scientists have continually improved the processors – the lightning-fast digital brains – that operate the machines, crunch data and make sense of it. They’ve also improved the capability of the memory chips that store data.

And as a result of the meteoric rise in digital power, today’s machines can enable big data discoveries, supercharge economic activity and productivity, and spark innovations that benefit society.

But even as computers have grown more efficient and robust, the exponential growth of data produced by our technological society is outpacing the capabilities of even the most advanced computer systems.

This is partly because of a fundamental flaw built into the otherwise reliable architecture of computers. The weak link – the wires that connect the processor and the memory chip – creates a bottleneck that backs up the flow of data between the two components. Computer scientists call it the “memory wall,” and it’s time for an architectural reboot.

The Center

A new national center, led by UVA computer science professor and chair Kevin Skadron, is focused on finding a way around, over or through that wall. The goal is to create a generation of computers that can handle the modern flood of data that is bogging down existing systems.

“We need to start building computers that operate more like the way our brain operates,” Skadron said. “The brain’s neurons retrieve memories from storage and process them – the thinking that we do – in essentially a single integrated action. That’s what computers need to do.”

The $27.5 million center, which is a consortium of universities, is dedicated to redesigning computer architecture to operate more like the brain, with data storage and the central processing unit coupled as a single component, working in unison. Only then will computers have the full capability to think, in essence, and help humans make sense of ever-expanding data sets.


With computers that can quickly process masses of seemingly random data for meaningful correlations, patterns will become apparent in a range of societally important areas, from health care to national security to smart technologies and beyond. For example, it might eventually become possible to analyze massive data sets of the genomic patterns of tumors from perhaps millions of cancer patients worldwide, leading to new targeted therapies tailored to the genetics of individuals – the key to obstructing cancer before it can start, or stopping it in its tracks.

“The data explosion has created an urgent situation, making ever more apparent the need to finally resolve this problem.”

The memory wall problem is an inadvertent result of the computer architecture first proposed by pioneering computer scientist John von Neumann in 1945. That structure, based on the technologies of the time, creates the separation between processors and data storage devices.

Only so much data can travel along the wires at a time; the rest waits until it, too, can be retrieved and processed, in a step-by-step sequence of follow-up computations that can take a long time when data sets are massive. It’s like trying to draw ever more water through a narrow pipe from a rapidly growing reservoir: over time, the flow is reduced to a stream of sips. Because of this cumbersome infrastructure, in which the processor frequently sits idle waiting for data to arrive, many highly complex societal challenges never get tackled, and big questions go unanswered.
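The imbalance described above can be sketched with a toy cost model. All of the numbers below – the bus bandwidth, the processor rate, the data-set size – are illustrative assumptions chosen to make the arithmetic simple, not measurements of any real system:

```python
# Toy model of the von Neumann bottleneck: compare the time needed to
# stream a data set across a fixed-bandwidth memory bus with the time
# the processor needs to actually compute on that data.

def transfer_seconds(bytes_moved: float, bus_gb_per_s: float) -> float:
    """Seconds spent moving data across the processor-memory wires."""
    return bytes_moved / (bus_gb_per_s * 1e9)

def compute_seconds(ops: float, proc_gops_per_s: float) -> float:
    """Seconds the processor needs for the arithmetic itself."""
    return ops / (proc_gops_per_s * 1e9)

# Assumed figures: a 1 GB data set, one simple operation per byte,
# a 25 GB/s memory bus, and a processor doing 500 billion ops/s.
data_bytes = 1e9
t_mem = transfer_seconds(data_bytes, bus_gb_per_s=25)
t_cpu = compute_seconds(data_bytes, proc_gops_per_s=500)

print(f"memory time: {t_mem * 1e3:.1f} ms, compute time: {t_cpu * 1e3:.1f} ms")
print(f"processor idle for {100 * (1 - t_cpu / t_mem):.0f}% of the run")
```

Under these assumed numbers the processor finishes its share of the work long before the bus can deliver the data, so it spends most of the run waiting – the memory wall in miniature.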


“As data keeps growing, and we develop increasingly complex algorithms, the bottleneck keeps becoming more apparent,” Skadron said. “We’ve got to break past that. New technological capabilities involving new transistor types make it possible, and we will exploit those capabilities with this project.”

Skadron’s team, which includes 20 researchers at UVA and seven other universities, is funded through a national program sponsored by the computer industry. The group proposes to solve the bottleneck problem by pairing processors and memory chips into a single unit, a “3-D stack,” as they call it. This layer cake-like integration would create more electronic connections while minimizing the amount and length of wire needed to connect the two components, thereby removing, or at least lowering, the memory wall. This embedded construct also would reduce heat generation and electrical use – wasted energy – making the end product not only much faster, but also cooler and more energy efficient. That research will be led by UVA electrical and computer engineering professor Mircea Stan, an expert on the design of computer chips and circuits.
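One way to see why stacking helps is a back-of-the-envelope comparison. The sketch below models the 3-D stack simply as a much wider, shorter path between memory and processor; the bandwidth figures are assumptions for illustration, not numbers from the center’s actual designs:

```python
# Toy comparison of a conventional design (processor and memory joined
# by long board-level wires) against a 3-D stack (memory layered on the
# processor, giving many more, much shorter connections). We model the
# stack only as a higher-bandwidth bus; all figures are assumed.

def runtime_seconds(bytes_moved: float, bus_gb_per_s: float,
                    ops: float, proc_gops_per_s: float) -> float:
    # In the worst case the run is bounded by whichever is slower:
    # moving the data or computing on it.
    t_mem = bytes_moved / (bus_gb_per_s * 1e9)
    t_cpu = ops / (proc_gops_per_s * 1e9)
    return max(t_mem, t_cpu)

data = 1e9   # 1 GB streamed once
ops = 1e9    # one simple operation per byte
conventional = runtime_seconds(data, bus_gb_per_s=25, ops=ops, proc_gops_per_s=500)
stacked = runtime_seconds(data, bus_gb_per_s=400, ops=ops, proc_gops_per_s=500)

print(f"conventional: {conventional * 1e3:.2f} ms, stacked: {stacked * 1e3:.2f} ms")
print(f"speedup from stacking: {conventional / stacked:.0f}x")
```

With the assumed 16-fold increase in bandwidth, the run stops being memory-bound and the compute time itself becomes the limit – which is the point of moving memory next to the processor.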

The term “memory wall,” by the way, was coined in 1994 by UVA computer scientist William Wulf and then-graduate student Sally McKee.

“They characterized a problem that had long been recognized, but nobody really knew how severe it was or how to fix it,” Skadron said. “We’ve had ideas for some time about how to redesign the architecture, but not the technological solutions, or the financial dedication. Now we do.”

Skadron said that processor speeds have improved dramatically over the years with new materials and technologies, but memory and storage speeds are lagging, while data continues to grow.
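The widening gap compounds year over year: if processor performance grows faster annually than memory speed, the ratio between them grows exponentially. The growth rates below are illustrative assumptions in the spirit of Wulf and McKee’s analysis, not measured figures:

```python
# Sketch of the diverging growth curves behind the memory wall.
# Assumed annual improvement rates: 50% for processors, 7% for memory.
# These are hypothetical illustration values, not industry data.

def gap_after(years: int, cpu_growth: float = 0.50, mem_growth: float = 0.07) -> float:
    """Processor-to-memory speed ratio after `years`, normalized to 1 at year 0."""
    return ((1 + cpu_growth) / (1 + mem_growth)) ** years

for y in (5, 10, 20):
    print(f"after {y:2d} years, the processor-memory gap has grown {gap_after(y):6.1f}x")
```

Even modest-looking differences in annual growth rates open a gap of orders of magnitude over a couple of decades, which is why faster processors alone cannot solve the problem.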


The Initiative

It is this acute and nearly crisis situation, he said, that is pushing government, universities and the computing industry to work together to bring solutions to bear. The Semiconductor Research Corporation, a North Carolina-based consortium of scientists and engineers from technology companies, universities and government agencies, is managing a $200 million, five-year program that funds six research centers at leading universities, including UVA.

Each of the initiative’s centers – which are made up of multiple universities – is tasked with solving a particular challenge in microelectronics crucial to the U.S. economy and national defense. The universities work together on the problems, bringing forth individual areas of expertise, while sharing resources.

“Most of the grand challenges the National Academy of Engineering has identified for humanity in the 21st century will require effective use of big data,” said Craig Benson, dean of UVA’s School of Engineering and Applied Science. “This investment affirms the national research community’s confidence that UVA has the vision and expertise to lead a new era for technology.”

Through this initiative, and under Skadron’s leadership, a new Center for Research in Intelligent Storage and Processing in Memory brings together experts from eight public universities to tackle the problem of integrating memory and storage using innovative techniques and technologies.

“This team approach to solving big problems for the nation is the way to do this,” Skadron said. “Universities working together across disciplines, teamed up with industry and government agencies, can really accelerate the pace of discovery through technological breakthroughs.”

The new architecture, Skadron said, will create some challenges for software engineers, because the 3-D stack platform will alter the way software is programmed. UVA computer science professor Samira Khan, an expert on the implications of computer architecture for software systems, will guide the center’s efforts to make the hardware and software elements of the new paradigm compatible, and to make the software portable to existing architectures. The end goal is to make it easy to use.


The new architecture will be developed to solve practical, real-world problems, using case studies. The research team will work with experts in application areas to ensure the systems work well in data-heavy realms involving large-scale population studies, environmental change and lifestyles, understanding the effects of the intestinal biome on genetics, and data mining to improve home health care.

The program will fund the research of about a dozen Ph.D. students at UVA in engineering, and about 75 more across the center. Undergraduate students also will join in, while gaining professional experience through internships with companies sponsoring the program.

“We had to compete against some of the best computer science programs in the nation to bring this center to UVA,” Skadron said. “This really demonstrates that we are at the top of our field. Our faculty and students will benefit greatly as we work hard to solve one of the most important challenges in information technology.”