AI may eventually consume a quarter of America's power by 2030, warns Arm CEO

In an interview cited by The Wall Street Journal earlier this week, Rene Haas, CEO of Arm, warned of AI's "insatiable" thirst for electricity, saying that AI datacenters, which currently account for roughly 4% of U.S. power grid usage, could grow to consume as much as 25% by 2030.

Haas may have been citing a report from January stating that ChatGPT consumes roughly 2.9 watt-hours of electricity per request, about 10 times as much as a standard Google search. By that math, if Google made the full hardware and software switch for its search engine, search alone would consume at least 11 terawatt-hours of electricity per year, up from roughly 1 TWh today. The original report notes that 2.9 watt-hours is enough to run a 60-watt lightbulb for just under three minutes. Mirroring that roughly tenfold gap between a ChatGPT query and a standard search, industry-wide AI power demand is expected to grow tenfold.

These statements were made ahead of an expected U.S.-Japan partnership on AI and alongside recent developments like OpenAI's Sora, the current version of which is estimated to require at least one Nvidia H100 GPU running for an hour to generate five minutes of video. Grok 3 has also been estimated to require 100,000 Nvidia H100s just for training, and a single 700-watt Nvidia H100 can consume roughly 3,740 kilowatt-hours per year.

Without major improvements in efficiency and/or significantly increased government regulation, Haas says, the current trend is "hardly very sustainable," and he might be correct. The U.S. Energy Information Administration (EIA) reports that the United States generated a total of 4.24 trillion kilowatt-hours, or 4,240 terawatt-hours, in 2022, with only 22% of that coming from renewables. That compares to a total consumption of 3.9 trillion kWh, or 3,900 terawatt-hours, of the available ~4,240. An AI-powered Google search alone would claim 11 of the roughly 340 terawatt-hours of headroom left at current generation levels, before counting the rest of the growth the AI industry seems to be aiming for over the next decade. Any sustainability calculation must also account for the likely increasing demands of other industries and the balance of renewable to non-renewable generation. Given that the cost of power has nearly doubled since 1990, perhaps calls for more regulation are justified.

Of course, outlets like The New York Times are already suing OpenAI and Microsoft, so the current AI industry is not without existing legal challenges. Rene Haas expressed hope that the international partnership between Japan and the U.S. may yet bring these dramatically high power estimates down. However, corporate greed and compute demand are also international, so only time will tell.
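For anyone who wants to sanity-check the numbers above, the minimal Python sketch below reproduces the back-of-envelope arithmetic using only the figures quoted in the article. The ~61% H100 utilization it prints is derived here solely to reconcile the quoted 3,740 kWh/year with a 700 W TDP; it is an assumption for illustration, not a sourced number.

```python
# Back-of-envelope reproduction of the power figures quoted above.
# Inputs are the article's numbers; the implied utilization is an assumption.

WH_PER_CHATGPT_QUERY = 2.9                      # watt-hours per request (January report)
WH_PER_STANDARD_SEARCH = WH_PER_CHATGPT_QUERY / 10  # "10 times as much" as a standard search
LIGHTBULB_W = 60                                # watts

# 2.9 Wh runs a 60 W bulb for just under three minutes.
minutes_of_light = WH_PER_CHATGPT_QUERY / LIGHTBULB_W * 60
print(f"Standard search estimate: {WH_PER_STANDARD_SEARCH:.2f} Wh")
print(f"60 W bulb runtime per ChatGPT query: {minutes_of_light:.1f} minutes")

# A 700 W H100 quoted at ~3,740 kWh/year implies roughly 61% average
# utilization: 3,740 / (0.7 kW * 8,760 h).  (Derived, not sourced.)
H100_TDP_KW = 0.7
HOURS_PER_YEAR = 8_760
h100_kwh_quoted = 3_740
implied_utilization = h100_kwh_quoted / (H100_TDP_KW * HOURS_PER_YEAR)
print(f"Implied H100 utilization: {implied_utilization:.0%}")

# 100,000 H100s (the Grok 3 training estimate) at that per-GPU rate.
grok3_gwh = 100_000 * h100_kwh_quoted / 1e6
print(f"Grok 3 training fleet: ~{grok3_gwh:.0f} GWh/year")

# Grid headroom per the EIA figures: 4,240 TWh generated vs 3,900 TWh
# consumed in 2022, against the 11 TWh AI-search scenario and a
# 25%-of-generation scenario by 2030.
generated_twh, consumed_twh = 4_240, 3_900
headroom_twh = generated_twh - consumed_twh
print(f"Current headroom: ~{headroom_twh} TWh")
print(f"AI search scenario: 11 TWh ({11 / headroom_twh:.0%} of headroom)")
print(f"25% of current generation: ~{0.25 * generated_twh:.0f} TWh")
```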