How Much Electricity It Actually Takes to Use AI May Surprise You
By now, most of us should be vaguely aware that artificial intelligence is hungry for power.
Even if you don't know the exact numbers, the charge that "AI is bad for the environment" is well-documented, bubbling up from sources ranging from the mainstream press to pop-science YouTube channels to tech trade media.
Still, the AI industry as we know it today is young. Though startups and big tech firms have been plugging away on large language models (LLMs) since the 2010s, the release of consumer generative AI in late 2022 brought about a huge increase in AI adoption, leading to an unprecedented "AI boom."
In under three years, AI has come to dominate global tech spending in ways researchers are just starting to quantify. In 2024, for example, AI companies nabbed 45 percent of all US venture capital tech investments, up from only nine percent in 2022. Over the medium term, big-name consulting firms like McKinsey expect AI infrastructure spending to grow to $6.7 trillion by 2030; compare that to just $450 billion in 2022.
Given how fast it's all happening, research on AI's climate and environmental impacts can seem vague and scattered, as analysts race to establish concrete environmental trends amid the industry's extraordinary explosion.
A new survey by MIT Technology Review is trying to change that. The authors spoke to two dozen AI experts working to uncover the tech's climate impact, combed "hundreds of pages" of data and reports, and probed the top developers of LLM tools in order to provide a "comprehensive look" at the industry's impact.
"Ultimately, we found that the common understanding of AI's energy consumption is full of holes," the authors wrote. That led them to start small, looking at the energy use of a single LLM query.
Beginning with text-based LLMs, they found that model size directly predicted energy demand: bigger LLMs use more chips, and therefore more energy, to process each query. While smaller models like Meta's Llama 3.1 8B used roughly 57 joules per response (or 114 joules once the authors factored in cooling and other overhead), the largest models needed 3,353 joules (or 6,706), which in MIT Tech's point of reference is enough to run a microwave for about eight seconds.
Image-generating AI models, like Stable Diffusion 3 Medium, needed 1,141 joules (or 2,282) on average to spit out a standard 1024 x 1024 pixel image, the type that is rapidly strangling the internet. Doubling the image quality roughly doubles the energy use, to 4,402 joules, worth over five seconds of microwave warming time, though still less than the largest language models.
Video generation is where the sparks really start flying. The least demanding AI video generator tested, a nine-month-old version of the CogVideoX model, took an eye-watering 109,000 joules to spew out a choppy, 8fps clip that the authors described as "more like a GIF than a video."
Better models use a lot more. With a recent update, that same model takes 3.4 million joules to spit out a five-second, 16fps video, equivalent to running a microwave for over an hour.
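For readers who want to sanity-check those microwave comparisons, here's a minimal back-of-the-envelope sketch in Python. The only figure it adds is the microwave's wattage, an assumption of roughly 800 watts that the article implies but never states, so treat the output as illustrative rather than official.

```python
# Rough conversion of the article's per-generation energy figures into
# "seconds of microwave time." The 800 W rating is an assumption; the
# MIT Technology Review comparison implies it but doesn't state it.

MICROWAVE_WATTS = 800  # assumed appliance draw (joules per second)

# Energy figures quoted in the article, joules per generation,
# using the cooling/overhead-adjusted numbers where given.
tasks = {
    "small text model (Llama 3.1 8B)": 114,
    "large text model": 6_706,
    "standard 1024x1024 image": 2_282,
    "higher-quality image": 4_402,
    "5-second, 16fps video": 3_400_000,
}

for task, joules in tasks.items():
    seconds = joules / MICROWAVE_WATTS
    print(f"{task}: {joules:,} J ≈ {seconds:,.1f} s of microwave time")
```

Run under that assumption, the script reproduces the article's comparisons: about eight seconds for the largest text model's reply, a bit over five seconds for the higher-quality image, and roughly 71 minutes for the five-second video.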
Whether any of those numbers amount to a lot or a little is open to debate. Running the microwave for a few seconds isn't much, but if everybody starts doing so hundreds of times a day (or, in the case of video, for hours at a time), it'll make a huge impact on the world's power consumption. And of course, the AI industry is currently trending toward models that use more power, not less.
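To make that scaling argument concrete, here's a hypothetical aggregate estimate. The user count and per-person query volume below are illustrative assumptions, not figures from the MIT survey; only the 6,706-joule per-query number comes from the article.

```python
# Hypothetical scale-up of the per-query figure to a global user base.
# USERS and QUERIES_PER_DAY are illustrative assumptions, not data
# from the MIT Technology Review survey.

JOULES_PER_QUERY = 6_706       # large text model, with overhead (from the article)
USERS = 1_000_000_000          # assumed: one billion daily users
QUERIES_PER_DAY = 100          # assumed: "hundreds of times a day", low end

joules_per_day = JOULES_PER_QUERY * USERS * QUERIES_PER_DAY
kwh_per_day = joules_per_day / 3.6e6        # 1 kWh = 3.6 million joules
twh_per_year = kwh_per_day * 365 / 1e9      # 1 TWh = one billion kWh

print(f"~{kwh_per_day / 1e6:,.0f} GWh per day, ~{twh_per_year:,.0f} TWh per year")
```

Under those placeholder assumptions, text queries alone land in the tens of terawatt-hours per year, roughly the annual electricity use of a small country, which is the intuition behind the paragraph above.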
Zooming out, the MIT Tech survey also highlights some concerning trends.
One is the overall rise in power use correlating with the rise of AI. While US data centers' power consumption remained mostly steady between 2005 and 2017, it had doubled by 2023, our first full year with mass-market AI.
As of 2024, 4.4 percent of all electricity consumed in the US went toward data centers. Meanwhile, data centers' carbon intensity (the amount of iceberg-melting exhaust spewed per unit of energy used) has become 48 percent higher than the US average.
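As a rough illustration of what that carbon-intensity gap means, the sketch below multiplies an assumed average US grid intensity by the article's 48 percent figure. The grid-intensity and total-consumption values are placeholder assumptions for the sake of the arithmetic, not numbers from the survey.

```python
# Illustration of the "48 percent higher carbon intensity" claim.
# US_GRID_INTENSITY and US_ELECTRICITY_TWH are placeholder assumptions,
# not figures from the MIT survey.

US_GRID_INTENSITY = 0.37      # assumed US average, kg CO2 per kWh
DATACENTER_SHARE = 0.044      # 4.4 percent of US electricity (from the article)
INTENSITY_MULTIPLIER = 1.48   # 48 percent dirtier than the US average (from the article)
US_ELECTRICITY_TWH = 4_000    # assumed annual US electricity consumption, TWh

datacenter_twh = US_ELECTRICITY_TWH * DATACENTER_SHARE
datacenter_intensity = US_GRID_INTENSITY * INTENSITY_MULTIPLIER
emissions_mt = datacenter_twh * 1e9 * datacenter_intensity / 1e9  # megatonnes CO2

print(f"{datacenter_twh:.0f} TWh at {datacenter_intensity:.2f} kg CO2/kWh "
      f"≈ {emissions_mt:.0f} Mt CO2 per year")
```

Under those placeholder numbers, US data centers would account for something on the order of 100 megatonnes of CO2 a year, the kind of figure researchers are still working to pin down precisely.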
All that said, the MIT authors have a few caveats.
First, we can't look under the hood at closed-source AI models like OpenAI's ChatGPT, and most of the leading AI titans have declined to join in on good-faith climate mapping initiatives like AI Energy Score. Until that changes, any attempt to map such a company's climate impact is a stab in the dark at best.
In addition, the survey's writers note that data centers are not inherently bad for the environment. "If all data centers were hooked up to solar panels and ran only when the Sun was shining, the world would be talking a lot less about AI's energy consumption," they wrote. But unfortunately, "that's not the case."
In countries like the US, the energy grid used to power data centers is still heavily reliant on fossil fuels, and surging demand for immediate energy is only making that worse. For example, the authors point to Elon Musk's xAI data center outside of Memphis, which is using 35 methane gas generators to keep its chips humming rather than wait for approval to draw from the civilian power grid.
Unless the industry is made to adopt strategies to mitigate AI's climate impact — like those outlined in the Paris AI Action Declaration — this will just be the beginning of a devastating rise in climate-altering emissions.
More on AI: New Law Would Ban All AI Regulation for a Decade