Artificial intelligence (AI) computing requires huge amounts of power, and a team of US researchers has developed a technique that could reduce the energy consumption of AI processing by at least a thousand-fold.
A group of engineering researchers at the University of Minnesota Twin Cities has demonstrated a technique to make AI computing far more efficient, publishing a peer-reviewed paper on their work and findings in npj Unconventional Computing, a Nature Portfolio journal. In essence, they created a shortcut around the conventional way of doing AI calculations, dramatically reducing the energy the task requires.
In current AI computing, data is constantly transferred between the components that process it (logic) and those that store it (memory/storage). According to the study, this shuttling of information can consume up to 200 times the energy used for the computation itself.
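To see why that ratio matters, here is a minimal back-of-the-envelope model of the energy split. The 200x movement overhead comes from the study; the per-operation compute energy and operation count are illustrative assumptions, not measured values.

```python
# Toy model of the "memory wall": in a conventional system, every operand
# is shuttled between memory and logic, and that movement can cost up to
# 200x the compute energy (per the study). Numbers are illustrative only.

COMPUTE_ENERGY_PJ = 1.0    # assumed energy per arithmetic op, in picojoules
MOVEMENT_MULTIPLIER = 200  # data movement costs up to 200x compute (per the study)

def total_energy_pj(num_ops: int, movement_multiplier: float) -> float:
    """Total energy for num_ops operations: compute plus data movement."""
    compute = num_ops * COMPUTE_ENERGY_PJ
    movement = compute * movement_multiplier
    return compute + movement

ops = 1_000_000
conventional = total_energy_pj(ops, MOVEMENT_MULTIPLIER)  # logic <-> memory shuttling
in_memory = total_energy_pj(ops, 0)                       # CRAM-style: no shuttling
print(f"conventional: {conventional:.0f} pJ")
print(f"in-memory:    {in_memory:.0f} pJ")
print(f"ratio:        {conventional / in_memory:.0f}x")
```

Under these assumptions, eliminating the shuttling alone yields a roughly 200x reduction, which is why in-memory computing is such an attractive target.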
To address this issue, the researchers turned to computational random-access memory (CRAM), which they developed by building a high-density, reconfigurable spintronic in-memory computing substrate into the memory cell itself.
This differs from existing in-memory processing solutions such as Samsung's PIM technology, which places a programmable computing unit (PCU) inside the memory core. Data still has to move from the memory cells to the PCU and back again; it just doesn't have to travel very far.
With CRAM, data is processed entirely within a computer’s memory array without ever leaving memory, which the researchers say improves energy consumption for systems running AI computing applications “by as much as 1,000 times compared to state-of-the-art solutions.”
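The distinction between the three dataflow styles described above can be sketched as a simple comparison. The 200x movement figure for the conventional path comes from the study; the near-memory figure is an assumed placeholder for "shorter hop, cheaper," not a number from the paper.

```python
# Toy comparison of the three dataflow styles: conventional (off-chip
# shuttling), near-memory (PIM-style PCU beside the cells), and in-memory
# (CRAM, where operands never leave the array). Movement costs are
# normalized to compute energy; only the 200x figure is from the study.

COMPUTE = 1.0  # normalized compute energy per operation

MOVEMENT_COST = {
    "conventional (off-chip bus)": 200.0,  # per the study's up-to-200x figure
    "near-memory (PIM-style PCU)": 10.0,   # assumed: short hop to adjacent PCU
    "in-memory (CRAM)": 0.0,               # operands processed where they sit
}

for scheme, movement in MOVEMENT_COST.items():
    total = COMPUTE + movement
    print(f"{scheme:30s} total energy = {total:5.1f}x compute")
```

The point of the sketch is structural, not numerical: near-memory designs shrink the movement term, while CRAM removes it entirely.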
Other examples suggest the potential for even greater energy savings and faster processing: In one test, CRAM was 2,500 times more energy efficient and 1,700 times faster than a near-memory processing system built on the 16nm technology node when running the MNIST handwritten digit classification task, a standard benchmark for machine-learning systems that recognize handwriting.
The importance of this work is hard to overstate: A recent report said that AI workloads already consume roughly as much electricity as the entire nation of Cyprus did in 2021. AI's total power draw was expected to reach 4.3GW in 2023, growing at a rate of 26-36% over the next few years, and Arm's CEO recently suggested that AI could consume a quarter of all energy produced in the United States by 2030.
Lead author Jan Ruff, a postdoctoral researcher in the Department of Electrical and Computer Engineering at the University of Minnesota, and other members of the research team have already filed several patents based on the new technology. They plan to work with semiconductor industry leaders, including in Minnesota, to conduct large-scale demonstrations and build hardware that will improve AI's capabilities and efficiency.