
Training AI Models Could Eat Up 4 Gigawatts of Power by 2030, Report Warns

Newsweek

11-08-2025



The energy required to train large new artificial intelligence (AI) models is growing rapidly, and a report released on Monday projects that within a few years a single AI training run could consume more than 4 gigawatts of power, enough to supply entire cities.

"If trends continue in training, compute growth and hardware efficiency, then the largest individual training runs will likely require several gigawatts of power by 2030," Joshua You, a data analyst at Epoch AI, told Newsweek.

Epoch AI, a research institute investigating the trajectory of AI, partnered with the independent nonprofit Electric Power Research Institute (EPRI) to produce the report. While other studies of electricity demand for AI have estimated energy use by examining company orders for chips and real estate deals for data center construction, the researchers at EPRI and Epoch AI focused on the energy required just to train new AI models, known as frontier AI training runs.

Recent AI model training, such as that for the Grok AI developed by Elon Musk's xAI, required about 100–150 megawatts, the report found. But the power demands of AI training have been more than doubling every year, the researchers said. By 2028, each frontier AI training run is projected to gobble up 1 to 2 gigawatts of power; a gigawatt is 1,000 megawatts. By 2030, an individual AI training run could require 4 gigawatts.

An Amazon Web Services data center in Ashburn, Virginia. Northern Virginia is the largest data center market in the world but is facing headwinds from the availability of land and electric power.

"It is a lot of power," You said. "It's approaching the power draw of some individual U.S. states."

The researchers factored in efficiency improvements in chips, server cooling systems and software that have greatly boosted the computing power available from a unit of energy. Even with those gains, they found that overall energy consumption continues to rise sharply, a modern version of the Jevons paradox, in which economists observed that improvements in energy efficiency tend to lead to greater consumption.

"AI companies tend to just reinvest those efficiency gains into scaling up," You said. "So that tends to swamp these efficiency improvements."

The findings have big implications for electric utilities in areas where tech companies want to locate large AI data centers. By better understanding the nature and timing of power demands for AI, tech and power companies can work together on better ways to meet that demand, Tom Wilson, principal technical executive at EPRI, told Newsweek.

"We have technical demonstrations where you can distribute the training," Wilson said. EPRI worked with the tech companies Oracle and NVIDIA and a startup called Emerald AI on a recent demonstration at data centers in the Phoenix, Arizona, area. Using software developed by Emerald AI, the companies were able to shift AI computing work to other data centers to avoid straining the local power provider during times of peak energy demand. Wilson said that sort of workload shifting could help regional power providers deliver electricity when other customers need it most, such as on very hot days when air-conditioning use is high. Last year, EPRI launched DCFlex (the DC stands for data centers), a partnership with tech and power companies to explore more flexible ways to power AI.
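The report's year-over-year doubling, described earlier, can be checked with simple arithmetic. The baseline figure (~130 MW in 2025) and the exact growth factor below are illustrative assumptions, not numbers taken from the report itself:

```python
# Back-of-envelope projection of frontier AI training power under the
# "more than doubling every year" trend the researchers describe.
# The 2025 baseline of 130 MW and the exact factor of 2.0 are assumptions
# chosen for illustration.

def projected_power_mw(year, base_year=2025, base_mw=130.0, annual_factor=2.0):
    """Project a single training run's power draw under exponential growth."""
    return base_mw * annual_factor ** (year - base_year)

for year in (2025, 2028, 2030):
    gw = projected_power_mw(year) / 1000.0  # 1 gigawatt = 1,000 megawatts
    print(f"{year}: ~{gw:.1f} GW")
# 2028 lands at ~1.0 GW and 2030 at ~4.2 GW, consistent with the report's
# 1-2 GW (2028) and 4 GW (2030) projections.
```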
Wilson said that on-site power generation co-located with data centers, along with large-scale battery storage systems, can also offer more options. The energy sources that Big Tech and power companies choose will also shape the path toward decarbonizing the power system. Tech companies are already among the world's biggest purchasers of renewable energy. However, many are also turning to fossil fuel sources for new data center developments, and the AI boom has pushed several tech giants, including Google and Microsoft, off track for their ambitious net-zero climate goals. Newsweek will examine the issue at "Powering Ahead," a live event on September 25 during Climate Week NYC.
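The workload shifting demonstrated in Phoenix, mentioned above, can be sketched in code. The site names, capacities, and scheduling rule below are invented for illustration; this is not Emerald AI's actual software, only the general idea of routing a job to wherever the local grid has headroom:

```python
# Hypothetical sketch of demand-aware workload placement: send a training
# job to the data center whose grid feeder has the most spare capacity,
# or defer the job if no site can absorb it during peak demand.
# All names and numbers are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class DataCenter:
    name: str
    capacity_mw: float    # power the site itself can draw
    grid_load_mw: float   # current demand from other customers on its feeder

    def headroom_mw(self, feeder_limit_mw: float) -> float:
        """Power the site can take without pushing the feeder past its limit."""
        return min(self.capacity_mw, feeder_limit_mw - self.grid_load_mw)

def place_job(job_mw: float, sites: list[DataCenter], feeder_limit_mw: float):
    """Pick the site with the most headroom; return None to defer off-peak."""
    best = max(sites, key=lambda s: s.headroom_mw(feeder_limit_mw))
    if best.headroom_mw(feeder_limit_mw) >= job_mw:
        return best.name
    return None

sites = [
    DataCenter("phoenix-1", capacity_mw=150, grid_load_mw=480),   # hot afternoon, AC peak
    DataCenter("portland-1", capacity_mw=150, grid_load_mw=300),  # off-peak elsewhere
]
print(place_job(120, sites, feeder_limit_mw=500))  # → portland-1
```

In this toy scenario the Phoenix site has only 20 MW of headroom during the air-conditioning peak, so the 120 MW job is routed to the other region instead of straining the local grid.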
