Futurology: Global demand for AI computing has data centers guzzling electricity like dorm rooms guzzling beer, but researchers at the University of Minnesota may have a solution to curb AI's growing power demands: a new device that promises far greater energy efficiency.
The researchers designed a prototype chip for a new "computational random access memory" (CRAM) that could cut the energy consumption of AI applications by a factor of 1,000 or more compared with current methods. In one simulation, CRAM delivered energy savings of 2,500x.
Traditional computing relies on the decades-old von Neumann architecture of separate processors and memory units, which require data to be constantly passed back and forth in an energy-intensive process. The University of Minnesota team’s CRAM completely upends this model by using a spintronics device called a magnetic tunnel junction (MTJ) to perform calculations directly in the memory itself.
Spintronic devices harness the spin of electrons rather than relying on electric charge to store data, offering a more efficient alternative to traditional transistor-based chips.
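The energy argument above can be illustrated with a toy model. The sketch below is not based on measured figures from the CRAM paper; all per-operation energy numbers are invented for illustration. It only shows the structural point: when every operand must shuttle between memory and processor, data transfer dominates the energy budget, so computing inside the memory array avoids most of the cost.

```python
# Toy energy model contrasting von Neumann data shuttling with
# in-memory computing. All energy costs below are ILLUSTRATIVE
# ASSUMPTIONS, not figures from the University of Minnesota paper.

TRANSFER_PJ = 100.0   # assumed cost (picojoules) to move one word
                      # between memory and the processor
COMPUTE_PJ = 1.0      # assumed cost of one arithmetic op in the CPU
IN_MEMORY_PJ = 0.5    # assumed cost of one logic step performed
                      # directly inside the memory array

def von_neumann_energy(n_ops: int) -> float:
    """Each op fetches two operands, computes, and writes back one result,
    so it pays for three memory transfers plus the computation itself."""
    return n_ops * (3 * TRANSFER_PJ + COMPUTE_PJ)

def in_memory_energy(n_ops: int) -> float:
    """Operands never leave the array; only the in-array op costs energy."""
    return n_ops * IN_MEMORY_PJ

ops = 1_000_000
ratio = von_neumann_energy(ops) / in_memory_energy(ops)
print(f"Energy ratio (von Neumann / in-memory): {ratio:.0f}x")
```

With these made-up constants the transfer-dominated design spends roughly 600x more energy, which conveys why eliminating the memory-processor round trip, rather than speeding up the arithmetic itself, is the lever that yields the large savings reported by the researchers.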
"Because CRAM is a highly energy-efficient, digital-based, in-memory computing substrate, it is extremely flexible in that computations can be performed anywhere in the memory array. Thus, CRAM can be reconfigured to best meet the performance needs of different AI algorithms," said Ulya Karpuzcu, a co-author of the paper, which was published in npj Unconventional Computing, part of the Nature portfolio. Karpuzcu added that this approach is more energy efficient than the traditional building blocks of today's AI systems.
By eliminating power-hungry data transfers between logic and memory, CRAM technology like this prototype could significantly improve AI energy efficiency at a time when energy demands are exploding.
The International Energy Agency predicted in March that global electricity consumption for AI training and applications could double from 460 terawatt-hours in 2022 to more than 1,000 terawatt-hours by 2026, roughly the amount consumed by the entire country of Japan.
According to the researchers' press release, the foundation for this breakthrough was laid more than 20 years ago, dating back to pioneering work by engineering professor Jian-Ping Wang on the use of MTJ nanodevices for computing.
Wang acknowledged that the original proposal to abandon the von Neumann model was "considered insanity" 20 years ago. The Minnesota team built on Wang's patented MTJ research, which underpins magnetic RAM (MRAM), now used in smartwatches and other embedded systems.
Of course, as with any breakthrough of this nature, the researchers must still address challenges such as scalability, manufacturing, and integration with existing silicon. They are already planning collaborations with semiconductor industry leaders to demonstrate the technology and make CRAM commercially viable.