
Powering Intelligence: The Energy Challenge Behind The AI Revolution
The rise of generative AI, and large language models (LLMs) in particular, represents a turning point in our digital evolution. These models, capable of producing human-like text, making decisions for people and sometimes even speeding up the pace of science, are transforming industries at a breathtaking pace.
But with this transformation comes one massive and often overlooked consequence—our skyrocketing thirst for electricity.
The Energy Demands Of Intelligence
A recent IEEE PES technical report indicates that investment in new data centers has dramatically increased. This growth, partly driven by generative AI workloads and the broader digital economy, means data centers are expected to account for up to 44% of total U.S. electricity load growth over the next three to five years.
According to the report, some experts project that data centers may consume 9% to 12% of all U.S. electricity by 2028. To put a single AI task in perspective: a ChatGPT query uses around 2.9 watt-hours (Wh) of energy, roughly 10 times that of a typical Google search. Multiply that by the billions of queries made daily around the globe, and the numbers add up quickly.
More alarming, the scale of new hyperscale and AI-optimized facilities keeps growing. Data center developers are now planning campuses of 500 MW to 2 GW, whole cities in terms of electrical demand. For context, a 1 GW load is roughly the electricity demand of 800,000 to 1 million U.S. homes. These data centers are not just another category of demand; they are increasingly becoming the 21st-century tech industry's new backbone.
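To make these figures concrete, here is a minimal back-of-the-envelope sketch. The 2.9 Wh per query comes from the report cited above; the per-search figure, the daily query volume and the average household consumption are illustrative assumptions, not numbers from the report.

```python
# Back-of-the-envelope arithmetic for AI query energy (illustrative assumptions).
WH_PER_AI_QUERY = 2.9            # Wh per ChatGPT-style query (from the report)
WH_PER_SEARCH = 0.3              # Wh per conventional search (assumed ~1/10th)
QUERIES_PER_DAY = 1_000_000_000  # assumed daily AI query volume
KWH_PER_HOME_PER_DAY = 29.6      # assumed average U.S. household consumption

daily_ai_kwh = WH_PER_AI_QUERY * QUERIES_PER_DAY / 1_000
daily_search_kwh = WH_PER_SEARCH * QUERIES_PER_DAY / 1_000
homes_equivalent = daily_ai_kwh / KWH_PER_HOME_PER_DAY

# A continuous 1 GW campus, for comparison with the 800,000 to 1 million homes figure.
gw_campus_kwh_per_day = 1_000_000 * 24
gw_homes_equivalent = gw_campus_kwh_per_day / KWH_PER_HOME_PER_DAY

print(f"1B AI queries/day:       {daily_ai_kwh:,.0f} kWh (~{homes_equivalent:,.0f} homes)")
print(f"Same volume of searches: {daily_search_kwh:,.0f} kWh")
print(f"Continuous 1 GW campus:  {gw_campus_kwh_per_day:,.0f} kWh (~{gw_homes_equivalent:,.0f} homes)")
```

Under these assumptions, one billion AI queries a day draw roughly the daily electricity of about 100,000 U.S. homes, and a continuous 1 GW campus maps to roughly 800,000 homes, consistent with the range above.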
The Grid Is Feeling The Pressure
The electric grid, already groaning under the strain of multiple electrification trends (EVs, smart cities, renewable integration and more), now faces another challenge with an added twist: Data center loads are enormous, volatile and highly location-specific.
In this landscape, forecasting energy needs is hard. Commercially confidential expansion plans, fast-changing chip efficiency, on-site self-generation and a regulatory landscape that varies from one jurisdiction to the next all cloud the picture. Some organizations forecast that data centers will claim 5% of local grid capacity in certain coastal areas, while others expect shares of up to 15% within just two or three years.
It's also worth noting that our substations and transmission lines were never built for AI-era demand peaks. A single large data center can swamp local infrastructure, requiring years of permitting and construction to upgrade.
Grid interconnection applications have also surged. In regions such as Texas and PJM, I've seen data centers drive new levels of reliability planning and infrastructure coordination.
It is also possible that this thirst for digital intelligence will thwart decarbonization. As energy needs grow, the goal of reducing greenhouse gas emissions may slip out of reach unless AI growth is paired with energy sources that are clean and reliable.
Digital Intelligence, Physical Fragility
According to IEEE's recent Energy Sustainability Magazine, we must look at AI not only as a new tool, but also as a driving force behind energy consumption. The AI community must realize that it can shape sustainable demand.
Although I applaud AI's role in optimizing grids, cutting waste and cost and bringing social infrastructure online, I believe we must also consider whether all this resource consumption is justified. Data centers not only stress power systems; they also consume large volumes of water for cooling and generate significant electronic waste as hardware becomes obsolete and often cannot be recycled.
This intersection of digital and physical systems demands a new model. We need to move from disconnected planning to an integrated approach that balances sustainability, reliability and compute power.
So, What Can We Do?
As an industry professional leading multiple AI, cloud and infrastructure transformation projects for clients across energy, utilities and public sector organizations, I believe the way forward must include the following five imperatives:
1. Grid-Aware AI Design: We should schedule the rollout of AI alongside low-carbon infrastructure, so that LLMs are trained and put into operation at times when renewable energy is abundant and grids are least stressed (a minimal scheduling sketch follows this list).
2. Behind-The-Meter Generation: We need to build clean energy resources (solar, hydrogen, etc.) alongside data centers to reduce grid impacts while advancing corporate ESG goals. However, this kind of design requires policy clarity and is not without reliability risks.
3. Flexible AI Workloads: Some AI jobs are not time-critical and can be shifted to off-peak hours. We should keep this in mind when building AI into grid assets, remembering that this technology can be both an asset and a liability to the grid.
4. Decentralized Edge Intelligence: By doing more computation on edge devices closer to where data is generated, we can cut backhaul traffic, latency and energy use, while also making way for more resilient mini-grids or even modular energy systems.
5. Cross-Sectoral Cooperation: I believe we need the whole industry to pull together: utilities, large cloud operators, government regulators and universities. This isn't simply an engineering problem; it's a question of governance and policy that will shape our digital future.
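As a minimal sketch of the carbon-aware scheduling idea behind imperatives 1 and 3: given an hourly forecast of grid carbon intensity, a deferrable training or batch-inference job can be shifted into the cleanest contiguous window. The forecast values and function names below are illustrative assumptions, not a real utility or grid-operator API.

```python
# Minimal sketch of carbon-aware scheduling for a deferrable AI job.
# The hourly carbon-intensity forecast is illustrative; in practice it would
# come from a grid operator or carbon-intensity data service (assumed here).

from typing import List, Tuple

def best_start_hour(forecast_gco2_per_kwh: List[float], job_hours: int) -> Tuple[int, float]:
    """Return (start_hour, avg_intensity) of the cleanest contiguous window."""
    if job_hours > len(forecast_gco2_per_kwh):
        raise ValueError("Job is longer than the forecast horizon")
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast_gco2_per_kwh) - job_hours + 1):
        window = forecast_gco2_per_kwh[start:start + job_hours]
        avg = sum(window) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Illustrative 24-hour forecast in gCO2/kWh: midday solar lowers intensity.
forecast = [420, 410, 400, 390, 380, 360, 330, 290, 240, 200, 170, 150,
            140, 150, 180, 230, 290, 350, 400, 430, 440, 445, 440, 430]

start, avg = best_start_hour(forecast, job_hours=6)
print(f"Schedule the 6-hour job at hour {start:02d}:00 "
      f"(average intensity ~{avg:.0f} gCO2/kWh)")
```

In practice, the same logic could also weigh electricity price and local grid stress, but even this simple window search captures the core idea of running flexible AI workloads when renewables are abundant.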
It's Time For The AI Ecosystem To Wake Up
In the last decade, we have been optimizing our systems for algorithmic efficiency. Now is the time to optimize for energy efficiency, grid stability and social resilience.
The AI we are constructing has great strength, but that strength does not come without power, in the literal sense. Don't forget that each intelligent prompt, video stream and AI-generated insight affects the environment. As engineers, architects and practitioners of AI, it is our responsibility to make sure this intelligence is not just transformative, but sustainable.