Pelican Press · Posted July 30, 2024

New memory tech could make AI 1,000 times more efficient

We all know AI has a power problem: global AI usage already consumes far more energy than it did in 2021. But engineering researchers at the University of Minnesota Twin Cities have developed and demonstrated a new computer memory design that could drastically reduce the amount of energy AI systems consume, helping to temper this problem. Their research was published in the journal npj Unconventional Computing.

Most modern computing systems are built on what is known as the von Neumann architecture, in which the logic and memory subsystems are separated. During normal operation, data is shuttled back and forth between the memory modules and the processors. This is the basic foundation of how modern computers operate. However, as workloads grow, this data transfer becomes a bottleneck both in terms of processing speed (the so-called von Neumann bottleneck) and power consumption. As the researchers pointed out, just shuffling the data back and forth can consume as much as 200 times the power that the computations themselves do.

Developers have sought to work around this issue by bringing the logic and memory physically closer together with "near-memory" and "in-memory" computing designs. Near-memory systems stack the logic and memory on top of one another in a 3D array, layered PB&J-style, while in-memory systems intersperse clusters of logic throughout the memory on a single chip, more like a peanut butter and banana sandwich.
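The 200× data-movement overhead described above can be sketched with a back-of-envelope calculation. The energy values here are hypothetical placeholders; only the 200× multiplier comes from the article.

```python
# Back-of-envelope sketch of the von Neumann data-movement overhead.
# The article cites data shuttling costing up to ~200x the energy of
# the computation itself; the per-op energy below is an assumption.

COMPUTE_ENERGY_PJ = 1.0    # assumed energy per operation (picojoules)
MOVEMENT_MULTIPLIER = 200  # article's worst-case data-transfer overhead

def total_energy(ops: int, multiplier: float = MOVEMENT_MULTIPLIER) -> float:
    """Total energy in pJ for `ops` operations, including data shuttling."""
    compute = ops * COMPUTE_ENERGY_PJ
    movement = compute * multiplier
    return compute + movement

# Fraction of total energy spent just moving data rather than computing:
frac_movement = MOVEMENT_MULTIPLIER / (MOVEMENT_MULTIPLIER + 1)
print(f"{frac_movement:.1%} of energy goes to data movement")  # 99.5%
```

Under these assumptions, data movement dominates the energy budget, which is exactly the motivation for near-memory and in-memory designs.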
The Twin Cities research team's solution is a novel, fully digital in-memory design dubbed computational random-access memory (CRAM), wherein "logic is performed natively by the memory cells; the data for logic operations never has to leave the memory," per the researchers. The team achieved this by integrating a reconfigurable spintronic compute substrate directly into the memory cell, an advance that the researchers found could reduce an AI operation's energy consumption by an "order of 1,000x over a state-of-the-art solution."

And that 1,000x improvement could be just the baseline. The research team tested CRAM on an MNIST handwritten-digit classification task and found it to be "2,500× and 1,700× less in energy and time, respectively, compared to a near-memory processing system at the 16 nm technology node."

The emerging AI industry is already facing significant resource issues. The ever-faster, ever more powerful and capable GPUs that underpin AI software are immensely energy hungry. NVIDIA's newest top-of-the-line Blackwell GPUs, for example, draw so much power and generate so much waste heat that they require liquid cooling, itself another resource-intensive operation. With the major hyperscalers all scrambling to build out the physical infrastructure necessary to power the oncoming AI revolution, i.e. massive new data centers, some with their own dedicated power sources, creating more energy-efficient compute and memory resources will become increasingly critical to the long-term viability of AI technology.
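To put the reported factors in perspective, here is a small illustrative calculation (not a benchmark) applying the 2,500× energy and 1,700× time reductions to a hypothetical near-memory workload; the baseline values are assumptions.

```python
# Illustrative arithmetic applying the reported CRAM reduction factors
# (2,500x energy, 1,700x time vs. a near-memory system at 16 nm).
# Baseline task figures below are hypothetical, chosen for readability.

BASELINE_ENERGY_UJ = 1_000.0  # assumed near-memory task energy (microjoules)
BASELINE_TIME_US = 1_000.0    # assumed near-memory task time (microseconds)

cram_energy = BASELINE_ENERGY_UJ / 2500  # reported energy reduction
cram_time = BASELINE_TIME_US / 1700      # reported time reduction

print(f"energy: {cram_energy:.3f} uJ, time: {cram_time:.3f} us")
# energy: 0.400 uJ, time: 0.588 us
```

Even against this already-optimized near-memory baseline, the reported factors shrink the task's energy and time by more than three orders of magnitude, consistent with the "order of 1,000x" figure the researchers describe.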