
How much energy does your AI prompt use? I went to a data center to find out.
There they were. Golden boxes, louder than a toddler on a red-eye, hotter than a campfire in a heat wave, pricier than a private Caribbean island.
Yes, real, working Nvidia GPUs.
I was under strict "look, don't touch" orders—as if I'd, what, lick the mesh metal enclosure. Just standing there, I could hear and feel the electricity being devoured.
We've all heard about AI's insatiable energy appetite. By 2028, data centers like this one I visited in Ashburn, Va., could consume up to 12% of all U.S. electricity, according to a report from the Energy Department and Lawrence Berkeley National Lab.
And yes, we're the problem, it's us. (Insert Taylor Swift-related groan here.) Every time we ask AI to write an email, draw an anime-style George Washington or generate a video of a cat doing a back flip, we're triggering another roar in those massive halls of GPUs.
What I wanted to know was, how much power do my AI tasks actually use? The equivalent of charging a phone? A laptop? Cooking a steak on an electric grill? Powering my house?
After digging into the research, visiting a data center, bugging just about every major AI company and, yes, firing up that grill, I got some answers. But not enough. Tech companies need to tell us more about the energy they're using on our behalf.
Let's start with a recent, popular example: "a video of a cat diving off an Olympic diving board." The moment you hit enter, that prompt gets routed to a massive data center.
When it arrives, it kicks off inference, where pretrained AI models interpret and respond to your request. In most cases, rows of powerful Nvidia graphics processing units get to work turning your weird idea into a weirder reality. Rival chips from companies like Amazon, Google or Groq are also starting to be used for inference. The model training itself happens earlier, with Nvidia chips.
The facility where I saw that "SuperPod" of Nvidia H100 GPUs was run by Equinix, one of the world's largest operators of data centers that provide cloud infrastructure—and now, AI.
Chris Kimm, Equinix senior vice president of customer success, said that while AI training can happen just about anywhere, inference is best done geographically closer to users to deliver the best speed and efficiency.
Figuring out how much energy your individual AI prompts use would be a lot easier if the major AI companies actually shared the darn info. Google, Microsoft and Meta declined. Google and Meta pointed me to their sustainability reports.
OpenAI shared something. Chief Executive Sam Altman said that the average ChatGPT query uses about 0.34 watt-hours of energy. OpenAI wouldn't break out details on text, image or video energy usage.
Researchers have stepped in to fill the gap. Sasha Luccioni, the AI and climate lead at open-source AI platform Hugging Face, has run tests to estimate the energy required to generate different types of content. Along with other researchers, she also maintains an AI Energy Score leaderboard. Since the top AI players use their own proprietary models, she relies on open-source alternatives.
The energy required to generate content varies widely depending on the model and GPU setup. Compare Luccioni's findings with charging a typical smartphone, which uses around 10 watt-hours of energy:
• Text: A lightweight, single-GPU Llama model from Meta used about 0.17 watt-hours, while a larger Llama model running across multiple GPUs used 1.7 watt-hours.
• Images: Generating a single 1024 x 1024 image with one GPU also used 1.7 watt-hours.
• Video: This is the most intensive. Even making 6-second, standard-definition videos used anywhere between 20 and 110 watt-hours.
I wanted to better understand the stakes—literally. So I grabbed an electric grill from Home Depot, a power meter and my video producer, David Hall. About 10 minutes and 220 watt-hours later, we had a thin, medium-well steak. Translation: The energy it took to cook a decent dinner was about the same as generating two AI videos, at the high end. (Watch the video above for more steak breakdowns.)
Remember the short AI film I made using Google Veo and Runway a few weeks ago? We generated about a thousand 8-second, 720p clips for our film. Going by these estimates, we might have used roughly 110,000 watt-hours. That's nearly 500 steaks!
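If you want to sanity-check that back-of-the-envelope math, here's a quick sketch in Python using the figures above. The per-clip number is the high end of Luccioni's range for shorter, lower-resolution clips, not a measured value for Veo or Runway, so treat the result as a rough estimate rather than a real meter reading.

```python
# Back-of-envelope sketch using the article's figures (assumptions, not measured data).
clips = 1_000          # roughly how many 8-second clips we generated for the film
wh_per_clip = 110      # high end of Luccioni's estimate for a short AI video, in watt-hours
wh_per_steak = 220     # what our electric grill drew for one steak, per the power meter
wh_per_phone_charge = 10  # typical smartphone charge

total_wh = clips * wh_per_clip                     # ~110,000 Wh
steaks = total_wh / wh_per_steak                   # ~500 steaks
phone_charges = total_wh / wh_per_phone_charge     # ~11,000 phone charges

print(f"{total_wh:,} Wh ~ {steaks:.0f} steaks ~ {phone_charges:,.0f} phone charges")
```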
But, as I said, Luccioni doesn't have the power-consumption data for the commercial AI tools, and her numbers aren't a perfect match: On the one hand, our video was higher quality than the 6-second, 480p clips in Luccioni's research. On the other hand, the popular video models are likely optimized for greater efficiency, experts say.
"Until we get access to these models," Luccioni said, "all we can do is estimate."
Her tests also use Nvidia's last-generation Hopper chips. Nvidia has seen a jump in energy efficiency with its latest Blackwell Ultra chips, according to Josh Parker, the company's head of sustainability. "We're using 1/30th of the energy for the same inference workloads that we were just a year ago," Parker said.
That said, plenty of companies are still using those older chips. The pod I saw at Equinix's facility? It cost over $9 million in Nvidia hardware alone. You don't just toss that in the dumpster when newer chips come out.
And I've only covered electricity. These hot GPUs also require a lot of water to stay cool, but that's a whole other story.
Data-center providers and tech companies I spoke to all said the same thing: Demand for these GPU-filled buildings keeps multiplying. Just driving through Ashburn, I saw five massive data centers going up.
The companies also stressed the improving efficiency of models and chips, and their efforts to shift to cleaner, more renewable energy sources.
No matter how efficient things get, more of us are using AI. We could all buy more efficient air conditioners, but if the planet keeps getting hotter, we're going to crank the AC more—and burn more energy.
Luccioni hopes we at least consider energy use when we use these tools, maybe think twice about generating a dozen cat videos. And it's on the companies to start sharing real numbers, so that we can make informed choices.
Back to Virginia, and those screaming GPUs. Turns out, they weren't generating Olympic kitty videos. They were owned by Bristol Myers Squibb—and they were searching for new cures for diseases.
Not all AI prompts are what you'd call a waste of energy.