Latest news with #energyConsumption


The Guardian
22-05-2025
- Business
AI could account for nearly half of data centre power usage 'by end of year'
Artificial intelligence systems could account for nearly half of data centre power consumption by the end of this year, according to new analysis.

The estimates by Alex de Vries-Gao, the founder of the Digiconomist tech sustainability website, came as the International Energy Agency (IEA) forecast that AI would require almost as much energy by the end of this decade as Japan uses today.

De Vries-Gao's calculations, to be published in the sustainable energy journal Joule, are based on the power consumed by chips made by Nvidia and Advanced Micro Devices that are used to train and operate AI models. The paper also takes into account the energy consumption of chips used by other companies, such as Broadcom.

The IEA estimates that all data centres – excluding cryptocurrency mining – consumed 415 terawatt hours (TWh) of electricity last year. De Vries-Gao argues in his research that AI could already account for 20% of that total. A number of variables fed into his calculations, he said, such as the energy efficiency of a data centre and the electricity consumed by cooling systems for servers handling AI workloads. Data centres are the central nervous system of AI technology, and their high energy demands make sustainability a key concern in the development and use of artificial intelligence systems.

By the end of 2025, De Vries-Gao estimates, AI systems could account for up to 49% of total data centre power consumption, again excluding crypto mining. AI consumption could reach 23 gigawatts (GW), the research estimates – equivalent to roughly twice the Netherlands' total electricity consumption.

However, De Vries-Gao said a number of factors could slow hardware demand, such as waning appetite for applications like ChatGPT. Another is geopolitical tension constraining the production of AI hardware, such as export controls. De Vries-Gao cites the barriers on Chinese access to chips, which contributed to the release of the DeepSeek R1 AI model that reportedly used fewer chips. 'These innovations can reduce the computational and energy costs of AI,' he said. But any efficiency gains could encourage even more AI use, and multiple countries attempting to build their own AI systems – a trend known as 'sovereign AI' – could also increase hardware demand.

De Vries-Gao also pointed to the US data centre startup Crusoe Energy securing 4.5GW of gas-powered energy capacity for its infrastructure, with the ChatGPT developer OpenAI among the potential customers through its Stargate joint venture. 'There are early indications that these [Stargate] data centres could exacerbate dependence on fossil fuels,' he writes. On Thursday OpenAI announced the launch of a Stargate project in the United Arab Emirates, the first outside the US.

Microsoft and Google admitted last year that their AI drives were endangering their ability to meet internal environmental targets. De Vries-Gao said information on AI's power demands had become increasingly scarce, describing it as an 'opaque industry'. The EU AI Act requires AI companies to disclose the energy consumed in training a model, but not in its day-to-day use.
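A quick back-of-the-envelope check (a sketch with assumed round figures, not taken from De Vries-Gao's paper) shows how the 23 GW projection squares with the IEA's 415 TWh figure and the Netherlands comparison:

```python
# Back-of-envelope check (assumed round figures; not from De Vries-Gao's paper).
HOURS_PER_YEAR = 24 * 365  # 8,760

# IEA: all data centres (excluding crypto mining) used 415 TWh last year;
# De Vries-Gao argues AI may already account for ~20% of that.
ai_share_now_twh = 415 * 0.20                    # ~83 TWh

# A sustained 23 GW draw, run all year, expressed as energy:
ai_projected_twh = 23 * HOURS_PER_YEAR / 1000    # GW x hours -> TWh, ~201 TWh

# The Netherlands uses roughly 110-120 TWh of electricity a year (assumed
# comparison figure), so ~201 TWh is indeed about double that.
print(f"AI today: ~{ai_share_now_twh:.0f} TWh of 415 TWh")
print(f"23 GW sustained: ~{ai_projected_twh:.0f} TWh/year")
```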
Prof Adam Sobey, the mission director for sustainability at the UK's Alan Turing Institute, an AI research body, said more transparency was needed on how much energy is consumed by artificial intelligence systems – and how much they could potentially save by helping make carbon-emitting industries such as transport and energy more efficient. Sobey said: 'I suspect that we don't need many very good use cases [of AI] to offset the energy being used on the front end.'


Reuters
20-05-2025
- Business
AI power demand is generating hallucinations
NEW YORK, May 20 (Reuters Breakingviews) - A sprawling global network of silicon has sprung up to power artificial intelligence. Housed within data centers, these chips demand vast sums of electricity to crunch the numbers behind the likes of ChatGPT. Yet both technology giants and energy utilities are incentivized to pad projections for how many of these facilities need to be built and how many electrons they will ultimately consume. When trying to reconcile canceled plans from cloud giant Microsoft (MSFT.O) with, say, multi-year backlogs at gas turbine manufacturer GE Vernova (GEV.N), it's worth keeping these distortions in mind.

After two decades of flatlining, the U.S. Energy Information Administration now expects American power demand to grow by 2% annually. Above all, that's thanks to a wave of data centers being set up as Alphabet (GOOGL.O), Amazon (AMZN.O), Meta Platforms (META.O) and Microsoft aim to spend over $300 billion in combined capital expenditure this year. These server farms are expected to triple as a share of total grid usage, rising to 12% by 2030, McKinsey reckons.

This economy-reshaping boom has dramatically lifted AI-adjacent stocks, but left investors on edge about any potential sign of a slowdown. The success of China's DeepSeek, for instance, raised questions about whether chatbots really need the monumental supply of chips being churned out by Nvidia (NVDA.O), briefly lopping $600 billion off its market value.

Perhaps the most important bellwether is the company led by Satya Nadella. Microsoft ushered in the AI age by partnering with ChatGPT developer OpenAI in 2016, and its Azure cloud division provides computing grunt to a raft of chatbot developers and users. When an executive said in April that the company had slowed or paused some of its data center build-out, it seemed an ominous sign. A few weeks later, Azure nonetheless reported accelerating growth, and a roughly $80 billion infrastructure spending target was reaffirmed.

This should be a familiar dynamic for tech watchers. The semiconductor industry is notoriously plagued by 'double ordering': customers order more chips than they need on the theory that canceling some shipments is better than going without a mission-critical part. Whether Microsoft and Google are consciously doing the same or not, the worst thing that can happen in the lightning-fast AI race is falling behind. There are plenty of stumbling blocks - like Nvidia's essential chips, constantly in short supply - that are out of their control. But it's simple enough to earmark land and request power hookups by starting new data center projects. It is far less damaging to cancel any surplus developments, especially early-stage projects where minimal capital has been spent, than to be left without capacity to serve customers.

This flows up to utilities and grid operators like ERCOT, PJM Interconnection and MISO, which together monitor over two dozen states accounting for roughly half the U.S. population, including data center hotspots in Northern Virginia and Texas. The three organizations estimate that 'large load demand' - essentially, server projects - will reach 140 gigawatts by 2030, according to nuclear reactor operator Constellation Energy. Yet that figure is double estimated data center consumption for the entire country, according to S&P Global and McKinsey.
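A rough conversion (assumed round numbers, not from the column) shows why the 140 GW projection sits so far above economy-wide estimates:

```python
# Back-of-envelope comparison (assumed round figures for illustration).
US_ANNUAL_TWH = 4_000      # approximate total U.S. electricity use per year
HOURS_PER_YEAR = 8_760

avg_us_load_gw = US_ANNUAL_TWH * 1000 / HOURS_PER_YEAR   # ~457 GW average load

# McKinsey's 12%-of-grid-by-2030 figure, expressed as average load:
data_center_avg_gw = 0.12 * avg_us_load_gw               # ~55 GW

# The grid operators' 140 GW 'large load' projection is well over double that,
# consistent with the column's point about padded forecasts.
print(f"Average U.S. load: ~{avg_us_load_gw:.0f} GW")
print(f"12% of that: ~{data_center_avg_gw:.0f} GW, vs. 140 GW projected")
```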
Of course, companies like OpenAI are planning data centers as large as 5 gigawatts, roughly equivalent to the power draw of four million homes based on average annual home use according to the EIA. It therefore takes very few projects to move utilities' projections around substantially.

They also face incentives to over-estimate, though. These can be as benign as wanting a bigger buffer of potential power generation on tap, rather than too little, thus avoiding blackouts. On the other hand, as strictly regulated businesses, defraying the cost of capital improvements is among the few means utilities have of gaining permission to raise prices and earnings.

Whatever the cause, the energy industry's crystal balls often produce overcooked figures. PJM, which manages the grid across 13 states, overestimated peak summer demand for 17 years in a row before 2024, according to Wilson Energy Economics. The Rocky Mountain Institute reckons that utilities and grid operators similarly forecast growth 12 percentage points too high between 2005 and 2015.

Once projects get started, of course, they ripple up the supply chain. GE Vernova estimates that, by this summer, its gas turbines will be effectively sold out through 2028. Similarly, Hitachi Energy recently said that the wait time for new power transformers has reached four years. This wash of new orders is stretching capacity and pushing up prices. NextEra Energy (NEE.N) reckons that the cost of building new gas-fired generation has tripled since 2022. Laments of a skilled-worker shortage abound.

If any of the demand driving this proves illusory, it might also over-egg the apparent need for alternative energy sources. OpenAI boss Sam Altman has joined a scramble to explore shrunken-down nuclear plants. Google said it would commit capital to Elementl Power to advance three nuclear projects - though their location, or even basic technology, was still to be determined. The outlook for such efforts is uncertain at best.

There are less far-fetched or disruptive ways to handle whatever influx really does arise. The current U.S. energy grid could absorb 76 gigawatts of new load, about 10% of aggregate peak demand, so long as new users agree to switch off their facilities just 0.25% of the time - less than one day a year on average - according to researchers at Duke University. For any new generation that does get built, utilities and regulators could try to ensure that big tech customers understand they will be on the hook for at least some payments.

For now, though, all of this simply makes the AI tea leaves more difficult to read. Microsoft might step back from deals, utility load growth might underwhelm - and Azure and its rivals might still continue to grow at a rapid clip all the same. Like a malfunctioning chatbot that suggests combining glue and pizza, tripping up over apparent connections between unrelated data is an investing hazard.
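Two of the figures above can be checked with simple arithmetic (assuming an EIA-style average of roughly 10,500 kWh per U.S. home per year; the figures are illustrative):

```python
# Rough check of two claims above (assumed average values, for illustration).
HOURS_PER_YEAR = 8_760

# EIA-style average U.S. home use: roughly 10,500 kWh per year.
avg_home_kw = 10_500 / HOURS_PER_YEAR            # ~1.2 kW average draw
homes_per_5gw = 5_000_000 / avg_home_kw          # 5 GW = 5,000,000 kW
print(f"5 GW ≈ {homes_per_5gw / 1e6:.1f} million homes")   # ~4.2 million

# Duke's flexibility condition: facilities off 0.25% of the time.
curtailed_hours = 0.0025 * HOURS_PER_YEAR
print(f"0.25% of a year ≈ {curtailed_hours:.0f} hours")    # ~22 hours, < 1 day
```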


Fast Company
07-05-2025
- Business
Three ways data centers can solve for energy independence
Today's technology boom isn't just reliant on innovation—it's reliant on energy supply. AI has the potential to create upwards of $4.4 trillion in economic value worldwide, but the U.S. alone would need 50 to 60 gigawatts of new data center infrastructure to support those ambitions. Because of the high energy demands of AI, cloud infrastructure, and big data initiatives, data center energy consumption is expected to rise to 12% of total annual U.S. electricity consumption by 2028, up from just 1.9% in 2018.

Where will that energy come from? And how will data centers ensure they have access to it to support their growing computing needs? To stay ahead of energy demands, data center operators and commercial real estate leaders can turn to flexible energy options that offer more control, cost savings, and independence from traditional grids.

Organizations have big plans for AI and other scaling technologies like cloud computing and data storage. In 2024, 78% of companies said they used AI in at least one business function, up from just 20% in 2017, and AI, cloud, and big data are all essential business needs today. To support these tech innovations, companies are turning to data centers for the space and infrastructure to run their hardware. But these technologies require a lot of energy—for AI, about 10 times the electricity of equivalent non-AI software—which the data center needs to provide.

And it's not just computing that data centers are powering. For many, traditional air or fan cooling is becoming less efficient at cooling hardware, and chilled water or other liquid cooling options can use up to 37% of a data center's energy.

Data centers rarely have easy access to the amount of energy they need. And it's not as simple as plugging into the grid, because the grid may not be able to deliver the supply needed for high-energy computing demands. Data centers around the world face interconnection queues of many months to years, and even once connected, they're still at the mercy of that grid's costs and outages. Data centers can't wait for outside energy solutions to catch up, especially when customer demand is so high.

There is a way out of the bottleneck: rethinking the power strategy to incorporate more sustainable and self-sufficient options, like creating a microgrid, building strategically, and increasing operational efficiency.

1. MICROGRID AND HYBRID ENERGY SOLUTIONS

Data center operators and commercial real estate leaders looking to stay ahead of computing demands do have options that can generate more energy independence. One is to create their own microgrid so they can access energy on their own terms. By creating a microgrid independent of a larger grid, or one that connects to a larger grid as a backup, data centers can choose which energy modalities they want to use, allowing better control over their costs and their sustainability efforts.

They could choose to run their microgrid on solar and battery storage, among the cheapest and most renewable options today. The fuel cost for solar is zero, and data centers can store excess solar energy for future use. Natural gas is another fuel being touted as a potential solution, or data centers can create a hybrid of solar, wind, battery, and natural gas. Data centers can also take advantage of peak shaving, drawing on their own generation and storage during peak-price periods and selling excess energy back to the grid when prices are highest, as the sketch below illustrates.
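Here is a minimal sketch of the peak-shaving logic described above; the thresholds, capacities, and prices are hypothetical, invented purely for illustration:

```python
# Minimal peak-shaving sketch: serve load from the battery when grid prices
# spike, charge when prices are low, and export any surplus at peak.
# All numbers are hypothetical.

def dispatch(load_kw: float, solar_kw: float, price: float,
             battery_kwh: float, capacity_kwh: float = 10_000.0,
             peak_price: float = 0.20, hours: float = 1.0):
    """Return (grid_kw, battery_kwh) for one interval; negative grid = export."""
    net_kw = load_kw - solar_kw  # demand left after on-site solar
    if price >= peak_price:
        # Peak period: discharge the battery instead of buying from the grid.
        discharge_kw = min(max(net_kw, 0.0), battery_kwh / hours)
        battery_kwh -= discharge_kw * hours
        grid_kw = net_kw - discharge_kw      # may go negative -> sell surplus
    else:
        # Off-peak: buy from the grid and top the battery up with cheap energy.
        charge_kw = min(500.0, (capacity_kwh - battery_kwh) / hours)
        battery_kwh += charge_kw * hours
        grid_kw = net_kw + charge_kw
    return grid_kw, battery_kwh

grid, soc = dispatch(load_kw=2_000, solar_kw=800, price=0.35, battery_kwh=6_000)
print(f"grid: {grid:.0f} kW, battery: {soc:.0f} kWh")  # battery covers the peak
```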
A microgrid also increases a data center's energy resilience and security, since it won't be at the mercy of grid outages and instability, nor as susceptible to natural disasters that may knock the grid out for weeks at a time.

2. STRATEGIC LOCATION DECISIONS

Another way to increase energy independence and secure needed supply is to build new data centers in strategic locations with access to the right power infrastructure, such as next to hydropower dams, a nuclear reactor, or a large transmission station. Data center developers today can look to secondary and tertiary markets where there's available land and grid capacity to build. For example, 70% of U.S. data center growth is projected to be concentrated in Virginia, Ohio, Illinois, Iowa, Oregon, and Georgia, not in densely populated metropolitan areas.

3. CREATIVE APPROACHES TO ON-SITE EFFICIENCY

Data center operators can also pursue energy efficiencies that increase energy availability and control costs. One is getting creative about cooling. Data centers can expand underground storage to allow for more natural cooling and less reliance on energy-hungry HVAC systems. Developers can also look to colder regions, where less energy is spent on rack cooling, and recycled water can further reduce cooling costs. Geothermal power, an option Meta is pursuing for its new data centers, is another renewable source that can cut cooling demand and costs.

Upgrading to more energy-efficient infrastructure creates further gains, especially when moving from air cooling to liquid cooling: water can carry roughly 3,000 times more heat than the same volume of air, so the same heat load can be removed with far less energy. Using AI to optimize data center conditions based on sensor outputs can also improve operational efficiency and support sustainability efforts.

Another option is to colocate near renewable energy plants to use curtailed energy—the excess electricity that would otherwise be wasted during peak production, when supply exceeds grid demand. This approach not only provides data centers with reliable power but also helps renewable energy producers monetize what would otherwise be lost revenue from unused capacity.
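To see why cooling's share of energy matters so much, consider power usage effectiveness (PUE), the standard ratio of total facility power to IT power. The sketch below uses the 37% cooling share quoted earlier and otherwise illustrative assumptions (it simplifies by ignoring overheads other than cooling):

```python
# Illustrative PUE arithmetic (assumed values except the 37% cooling share above).
it_load_mw = 10.0                      # hypothetical IT (compute) load

# If cooling takes 37% of total facility energy and IT the rest:
# total = it + cooling, cooling = 0.37 * total  =>  total = it / (1 - 0.37)
total_mw = it_load_mw / (1 - 0.37)     # ~15.9 MW
pue = total_mw / it_load_mw            # ~1.59
print(f"Total draw: {total_mw:.1f} MW, PUE: {pue:.2f}")

# Halving cooling's share (e.g., via liquid cooling or a colder climate):
total_eff_mw = it_load_mw / (1 - 0.185)
print(f"With cooling halved: {total_eff_mw:.1f} MW, "
      f"PUE: {total_eff_mw / it_load_mw:.2f}")   # ~12.3 MW, PUE ~1.23
```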