PALO ALTO, Calif. — Hewlett-Packard researchers have proposed a fundamental rethinking of the modern computer for the coming era of nanoelectronics — a marriage of memory and computing power that could drastically reduce the energy used by computers.
The semiconductor industry has long warned about a set of impending bottlenecks described as “the wall,” a point at which more than five decades of progress in continuously shrinking the transistors used in computation will end. If progress stops, it will not only slow the rate of consumer electronics innovation but also end the exponential increase in the speed of the world’s most powerful supercomputers, which have become 1,000 times faster each decade.
However, in an article published in IEEE Computer in January, Parthasarathy Ranganathan, a Hewlett-Packard electrical engineer, offers a radical alternative to today’s computer designs, one that would permit new designs for consumer electronics products as well as for the next generation of supercomputers, known as exascale systems.
Today, computers constantly shuttle data back and forth among faster and slower memories. The systems keep frequently used data close to the processor and then move it to slower and more permanent storage when it is no longer needed for the ongoing calculations.
In this approach the microprocessor sits at the center of the computing universe, but the energy spent moving information, first to the processor to be computed upon and then back to storage, dwarfs the energy used in the actual computing operation.
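The imbalance described above can be made concrete with a back-of-the-envelope model. The per-operation and per-byte energy figures below are illustrative assumptions chosen for the sketch, not measurements from the article; the point is only that when each byte must cross an off-chip memory bus, movement can dominate computation by an order of magnitude or more.

```python
# Back-of-the-envelope sketch: energy of computing vs. energy of moving data.
# The constants are assumed, round-number values for illustration only.
PICOJOULE = 1e-12

ENERGY_PER_OP = 1 * PICOJOULE          # assumed cost of one arithmetic operation
ENERGY_PER_BYTE_DRAM = 20 * PICOJOULE  # assumed cost of fetching one byte from off-chip DRAM

def energy_joules(ops, bytes_moved):
    """Total energy for a workload doing `ops` operations on
    `bytes_moved` bytes fetched from off-chip memory."""
    return ops * ENERGY_PER_OP + bytes_moved * ENERGY_PER_BYTE_DRAM

# A workload that performs one operation per byte it touches:
n = 10**9
compute = n * ENERGY_PER_OP
movement = n * ENERGY_PER_BYTE_DRAM
print(f"compute:  {compute:.3f} J")
print(f"movement: {movement:.3f} J")  # 20x the compute energy under these assumptions
```

Under these assumed constants, moving the data costs twenty times as much as operating on it, which is the motivation for putting computation next to storage rather than shipping the data to a distant processor.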
Moreover, the problem is rapidly worsening because the amount of data consumed by computers is growing even more quickly than the increase in computer performance.
“What’s going to be the killer app 10 years from now?” asked Dr. Ranganathan. “It’s fairly clear it’s going to be about data; that’s not rocket science. In the future every piece of storage on the planet will come with a built-in computer.”
To distinguish the new type of computing from today’s designs, he said that systems will be based on memory chips he calls “nanostores” as distinct from today’s microprocessors. They will be hybrids, three-dimensional systems in which lower-level circuits will be based on a nanoelectronic technology called the memristor, which Hewlett-Packard is developing to store data. The nanostore chips will have a multistory design, and computing circuits made with conventional silicon will sit directly on top of the memory to process the data, with minimal energy costs.
Within seven years or so, experts estimate that one such chip might store a trillion bytes of memory (about 220 high-definition digital movies) in addition to containing 128 processors, Dr. Ranganathan wrote. If these devices become ubiquitous, it would radically reduce the amount of information that would need to be shuttled back and forth in future data processing schemes.
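The capacity comparison in that estimate is easy to sanity-check: a trillion bytes spread across roughly 220 movies implies about 4.5 gigabytes per movie, a plausible size for a high-definition film file.

```python
# Sanity-checking the article's figure: one trillion bytes vs.
# "about 220 high-definition digital movies".
TRILLION_BYTES = 10**12
MOVIES = 220

bytes_per_movie = TRILLION_BYTES / MOVIES
print(f"{bytes_per_movie / 1e9:.2f} GB per movie")  # ~4.55 GB each
```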
For years, computer architects have been saying that a big new idea in computing was needed. Indeed, as transistors have continued to shrink, rather than continuing to innovate, computer designers have simply adopted a “multicore” approach, adding processors as more chip real estate becomes available.
The absence of a major breakthrough was underscored by a remarkable confrontation that took place two years ago at Hot Chips, a computer design conference held each summer at Stanford University.
John L. Hennessy, the president of Stanford and a computer design expert, stood before a panel of some of the world’s best computer designers and challenged them to present one fundamentally new idea. He was effectively greeted with silence.
“What is your one big idea?” he asked the panel. “I believe that the next big idea is going to come from someone who is considerably younger than the average age of the people in this room.”
This article has been revised to reflect the following correction:
Correction: March 2, 2011
An article on Tuesday about measures to rethink computer technology for the coming era of nanoelectronics misstated the amount of energy needed to keep a 100-watt bulb lighted for an hour. It is 360,000 joules, not eight million joules.
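The corrected figure follows directly from energy equaling power multiplied by time:

```python
# Verifying the corrected figure: energy (joules) = power (watts) x time (seconds).
watts = 100          # the bulb's power draw
seconds = 60 * 60    # one hour
joules = watts * seconds
print(joules)  # 360000, matching the correction
```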
Monday, 7 March 2011
Remapping Computer Circuitry to Avert Impending Bottlenecks - NYTimes.com