
AI could consume more power than Bitcoin by the end of 2025
AI could soon surpass Bitcoin mining in energy consumption, according to a new analysis that concludes artificial intelligence could use close to half of all the electricity consumed by data centers globally by the end of 2025.
The estimates come from Alex de Vries-Gao, a PhD candidate at Vrije Universiteit Amsterdam's Institute for Environmental Studies who has tracked cryptocurrencies' electricity consumption and environmental impact in previous research and on his website Digiconomist. He published his latest commentary on AI's growing electricity demand last week in the journal Joule.
AI already accounts for up to a fifth of the electricity that data centers use, according to de Vries-Gao. It's a tricky number to pin down without big tech companies sharing data specifically on how much energy their AI models consume. De Vries-Gao had to make projections based on the supply chain for specialized computer chips used for AI. He and other researchers trying to understand AI's energy consumption have found, however, that its appetite is growing despite efficiency gains — and at a fast enough clip to warrant more scrutiny.
'Oh boy, here we go.'
With alternative cryptocurrencies to Bitcoin — namely Ethereum — moving to less energy-intensive technologies, de Vries-Gao says he figured he was about to hang up his hat. And then 'ChatGPT happened,' he tells The Verge. 'I was like, Oh boy, here we go. This is another usually energy-intensive technology, especially in extremely competitive markets.'
There are a couple key parallels he sees. First is a mindset of 'bigger is better.' 'We see these big tech [companies] constantly boosting the size of their models, trying to have the very best model out there, but in the meanwhile, of course, also boosting the resource demands of those models,' he says.
That chase has led to a boom in new data centers for AI, particularly in the US, where there are more data centers than in any other country. Energy companies plan to build out new gas-fired power plants and nuclear reactors to meet growing electricity demand from AI. Sudden spikes in electricity demand can stress power grids and derail efforts to switch to cleaner sources of energy, problems similarly posed by new crypto mines that are essentially like data centers used to validate blockchain transactions.
The other parallel de Vries-Gao sees with his previous work on crypto mining is how hard it can be to suss out how much energy these technologies are actually using and their environmental impact. To be sure, many major tech companies developing AI tools have set climate goals and include their greenhouse gas emissions in annual sustainability reports. That's how we know that both Google's and Microsoft's carbon footprints have grown in recent years as they focus on AI. But companies usually don't break down the data to show what's attributable to AI specifically.
To figure this out, de Vries-Gao used what he calls a 'triangulation' technique. He turned to publicly available device details, analyst estimates, and companies' earnings calls to estimate hardware production for AI and how much energy that hardware will likely use. Taiwan Semiconductor Manufacturing Company (TSMC), which fabricates AI chips for other companies including Nvidia and AMD, saw its production capacity for packaged chips used for AI more than double between 2023 and 2024.
After calculating how much specialized AI equipment can be produced, de Vries-Gao compared that to information about how much electricity these devices consume. Last year, those devices likely burned through as much electricity as de Vries-Gao's home country of the Netherlands, he found. He expects that figure to approach the consumption of a country as large as the UK by the end of 2025, with power demand for AI reaching 23 GW.
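To make the arithmetic behind this kind of bottom-up estimate concrete, here is a minimal sketch in Python. Every input below is an illustrative assumption for demonstration only, not a figure from de Vries-Gao's analysis:

```python
# Rough bottom-up estimate of AI power demand, in the spirit of the
# "triangulation" approach described above: hardware in service x power
# per device x data center overhead x utilization.
# All inputs are illustrative assumptions, not de Vries-Gao's figures.

ai_servers_in_service = 1_000_000   # assumed number of multi-GPU AI servers deployed
power_per_server_kw = 10.0          # assumed draw of one server at load (kW)
pue = 1.2                           # assumed data center overhead (cooling, power delivery)
utilization = 0.8                   # assumed average utilization

power_demand_gw = ai_servers_in_service * power_per_server_kw * pue * utilization / 1e6
annual_energy_twh = power_demand_gw * 8760 / 1000  # GW x hours per year -> TWh

print(f"Power demand: {power_demand_gw:.1f} GW")           # ~9.6 GW with these inputs
print(f"Annual consumption: {annual_energy_twh:.0f} TWh")  # ~84 TWh with these inputs
```

Changing any of the assumed inputs shifts the result substantially, which is part of why third-party estimates of AI's energy use vary so widely.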
Last week, a separate report from consulting firm ICF forecast a 25 percent rise in electricity demand in the US by the end of the decade, thanks in large part to AI, traditional data centers, and Bitcoin mining.
It's still really hard to make blanket predictions about AI's energy consumption and the resulting environmental impact — a point laid out clearly in a deeply reported article published in MIT Technology Review last week with support from the Tarbell Center for AI Journalism. A person using AI tools to promote a fundraiser might create nearly twice as much carbon pollution if their queries were answered by data centers in West Virginia than if they were answered in California, for example. Energy intensity and emissions depend on a range of factors, including the types of queries made, the size of the models answering those queries, and the share of renewables and fossil fuels on the local power grid feeding the data center.
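To see why the local grid matters so much, here is a minimal sketch of the underlying arithmetic. The per-query energy and grid carbon-intensity figures are illustrative assumptions, not values from the MIT Technology Review reporting:

```python
# Emissions = energy consumed x carbon intensity of the grid serving the data center.
# Both inputs below are illustrative assumptions.

energy_per_query_wh = 3.0  # assumed energy for one AI query, in watt-hours

# Assumed grid carbon intensities, in grams of CO2 per kWh (illustrative only):
grid_intensity_g_per_kwh = {
    "higher-carbon grid": 600,
    "lower-carbon grid": 300,
}

for grid, intensity in grid_intensity_g_per_kwh.items():
    grams_co2 = energy_per_query_wh / 1000 * intensity
    print(f"{grid}: {grams_co2:.2f} g CO2 per query")
```

The same query, answered with the same amount of electricity, produces twice the emissions on the dirtier grid in this toy example — which is why the location of the data center matters as much as the model serving the answer.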
It's a mystery that could be solved if tech companies were more transparent
It's a mystery that could be solved if tech companies were more transparent about AI in their sustainability reporting. 'The crazy amount of steps that you have to go through to be able to put any number at all on this, I think this is really absurd,' de Vries-Gao says. 'It shouldn't be this ridiculously hard. But sadly, it is.'
Looking further into the future, there's even more uncertainty when it comes to whether energy efficiency gains will eventually flatten out electricity demand. DeepSeek made a splash earlier this year when it said that its AI model could use a fraction of the electricity that Meta's Llama 3.1 model does — raising questions about whether tech companies really need to be such energy hogs in order to make advances in AI. The question is whether they'll prioritize building more efficient models and abandon the 'bigger is better' approach of simply throwing more data and computing power at their AI ambitions.
When Ethereum transitioned to a far more energy-efficient strategy for validating transactions than Bitcoin mining, its electricity consumption suddenly dropped by 99.988 percent. Environmental advocates have pressured other blockchain networks to follow suit. But others — namely Bitcoin miners — are reluctant to abandon the investments they've already made in existing hardware, or to give up ideological arguments for sticking with old habits.
There's also the risk of Jevons paradox with AI, that more efficient models will still gobble up increasing amounts of electricity because people just start to use the technology more. Either way, it'll be hard to manage the issue without measuring it first.
Related Articles
Yahoo
What is a GPT?
The introduction of generative pre-trained transformers (GPTs) marked a significant milestone in the adoption and utility of artificial intelligence in the real world. The technology was created by the then-fledgling research lab OpenAI, building on transformer research published by Google researchers in 2017. It was Google's paper "Attention Is All You Need" that laid the foundation for OpenAI's work on the GPT concept.

Transformers gave AI scientists an innovative method of taking user input and converting it into something the neural network can use, employing an attention mechanism to identify the important parts of the data. This architecture also allows information to be processed in parallel rather than sequentially, as in traditional neural networks, which delivers a huge and critical improvement in the speed and efficiency of AI processing. (A minimal code sketch of this attention step appears at the end of this article.)

OpenAI's GPT architecture debuted in 2018 with GPT-1. By significantly refining Google's transformer ideas, the GPT model demonstrated that large-scale unsupervised learning could produce an extremely capable text generation model that operated at vastly improved speeds. GPTs also uprated the neural network's understanding of context, which improved accuracy and provided human-like coherence. Before GPT, AI language models relied on rule-based systems or simpler neural networks such as recurrent neural networks (RNNs), which struggled with long-range dependencies and contextual understanding.

The story of the GPT architecture is one of constant incremental improvement every year since launch. GPT-2, in 2019, introduced a model with 1.5 billion parameters, which started to provide the kind of fluent text responses AI users are now familiar with. However, it was the introduction of GPT-3 (and subsequently 3.5) in 2020 that was the real game-changer. It featured 175 billion parameters, and suddenly a single AI model could cope with a vast array of applications, from creative writing to code generation.

GPT technology went viral in November 2022 with the launch of ChatGPT. Based on GPT-3.5 and later GPT-4, this astonishing technology instantly propelled AI into the public consciousness. Unlike previous GPT models, ChatGPT was fine-tuned for conversational interaction. Suddenly, business users and ordinary citizens could use an AI for things like customer service, online tutoring, or technical support. So powerful was this idea that the product attracted 100 million users in a mere 60 days.

Today GPT is one of the top two AI system architectures in the world (along with Google's Gemini). Recent improvements have included multimodal capabilities, i.e. the ability to process not just text but also images, video, and audio. OpenAI has also updated the platform to improve pattern recognition and enhance unsupervised learning, as well as adding agentic functionality via semi-autonomous tasks. On the commercial front, GPT-powered applications are now deeply embedded in many different business and industry enterprises.
Salesforce has Einstein GPT to deliver CRM functionality, Microsoft's Copilot is an AI-assisted coding tool that incorporates Office suite automation, and there are multiple healthcare AI models fine-tuned to provide GPT-powered diagnosis, patient interaction, and medical research. At the time of writing, the most significant rivals to the GPT architecture are Google's Gemini system and the work being done by DeepSeek, Anthropic with its Claude models, and Meta with its Llama models. The latter products also use transformers, but in a subtly different way to GPT. Google, however, is a dark horse in the race, as it's becoming clear that the Gemini platform has the potential to dominate the global AI arena within a few short years.

Despite the competition, OpenAI remains firmly at the top of many leaderboards in terms of AI performance and benchmarks. Its growing range of reasoning models such as o1 and o3, and its superlative image generation product, GPT Image-1, which uses the same underlying technology, continue to demonstrate that there is significant life left in the architecture, waiting to be exploited.
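As a rough illustration of the attention step described earlier, here is a minimal NumPy sketch of scaled dot-product attention. It is a toy example with illustrative shapes and values, not OpenAI's or Google's production implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention: every position attends to every
    other position in parallel, weighting values by query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                            # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)    # softmax over the keys
    return weights @ V                                          # weighted sum of the values

# Illustrative input: a sequence of 4 tokens, each an 8-dimensional vector.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)  # self-attention over the sequence
print(out.shape)  # (4, 8): one updated vector per token
```

Because the score matrix is computed for all token pairs at once, the whole sequence is processed in parallel, which is the speed advantage over recurrent networks that the article describes.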
Yahoo
Coach Launch Responds to Rising Sales Resistance by Championing Virtual Events for Entrepreneurial Success
Melbourne, Australia, June 03, 2025 (GLOBE NEWSWIRE) -- As businesses navigate an era of heightened sales resistance and growing distrust driven by AI-generated content, Coach Launch, a Melbourne-based consultancy with a global client base, has announced a strategic initiative to help entrepreneurs embrace virtual events as a scalable solution to rebuild engagement and trust. The company now offers tailored online virtual event consulting designed to replace outdated sales tactics with immersive, rapport-building digital experiences.

Skepticism Fuels a New Sales Era

Across industries, sales teams are encountering a new form of resistance — one deeply tied to the rise of artificial intelligence and automation in marketing. Potential buyers are becoming more selective, with an increased focus on authenticity and real-time engagement. According to recent findings, traditional one-to-one selling methods are proving costly and increasingly ineffective, while automated systems often fail to create the trust needed to convert modern customers. In response, businesses are shifting their approach, and Coach Launch is leading the way with a consulting model focused on high-impact virtual experiences.

Virtual Events Offer an Effective Alternative

Virtual events are emerging as a vital tool in combating sales resistance, offering entrepreneurs a way to scale outreach while maintaining a human touch. These experiences create an opportunity for businesses to connect with their audiences in real time, delivering value, insights, and interaction that static digital content cannot match. 'Virtual events have become a necessary evolution in business communication,' said Mr. Matthew White, spokesperson for Coach Launch. 'They allow entrepreneurs to demonstrate their value in real time, answer questions, and build a genuine connection. In today's environment, that's what moves the needle.' Coach Launch specializes in guiding entrepreneurs through the entire event process — from strategy and planning to execution and follow-up — ensuring each virtual engagement is structured for both authenticity and profitability.

Immersive Experiences Increase Engagement

The effectiveness of virtual events lies in their immersive nature. Reports such as Immersion Causes Conversion point to a measurable link between active participation and consumer response. When attendees are drawn into a live, engaging experience, they are more likely to remember the message, build trust with the host, and take action. This insight is especially relevant given data from The Rise of Sales Resistance, which outlines the declining effectiveness of conventional outreach tactics such as cold emails and sales calls. In this landscape, passive strategies are falling short, and immersive approaches like live digital events are rising in value.

Scalable and Repeatable Business Growth Model

Coach Launch is not merely offering one-time solutions. Its consulting services are built around a repeatable framework that enables entrepreneurs to run profitable, recurring virtual events — accessible to businesses anywhere in the world. Unlike conventional marketing funnels, the model empowers entrepreneurs to host sessions that engage audiences, showcase expertise, and move prospects naturally toward conversion — all within a controlled, measurable environment. 'Our goal is to provide a roadmap that entrepreneurs can replicate,' said White.
'The beauty of virtual events is in their flexibility. You can iterate, adapt, and run them as often as needed without the overhead of traditional events.' Coach Launch's upcoming session will be held on April 19, 2025, with new sessions recurring every two weeks. Entrepreneurs and business leaders interested in learning more about this model are encouraged to get a free ticket and attend the next live event.

Rebuilding Trust Through Connection

Trust is at the heart of every business transaction, and Coach Launch emphasizes that it must be earned through transparency, interaction, and value. By replacing static content and impersonal outreach with live, human-driven experiences, entrepreneurs are reclaiming lost ground in a digitally crowded marketplace. Coach Launch's approach aligns with the evolving expectations of today's buyers, who value authenticity over automation and are more likely to engage when they feel seen and heard, regardless of geography or time zone.

About Coach Launch

Coach Launch is a global online virtual event consulting company based in Melbourne, Australia. It specializes in helping entrepreneurs and small business owners plan, launch, and optimize profitable virtual events. As virtual event consultants, Coach Launch supports coaches, consultants, and service experts who want to run profitable virtual events in small, intimate groups with highly engaged attendees ready to buy — and to do so consistently, like clockwork. With a focus on scalability, engagement, and trust-building, Coach Launch provides strategic consulting to support business growth through immersive digital experiences. Its bi-weekly events demonstrate the effectiveness of this approach, offering entrepreneurs a clear path toward modernizing their outreach.

Media Contact
Company Name: Coach Launch
Contact Person: Mr. Matthew White
Email: contactus@
Phone: +1 844-780-1448
Country: Australia
Website:


TechCrunch
Windsurf says Anthropic is limiting its direct access to Claude AI models
Windsurf, the popular vibe coding startup that's reportedly being acquired by OpenAI, said Anthropic significantly reduced its first-party access to the highly popular AI models Claude 3.7 Sonnet and Claude 3.5 Sonnet. Windsurf CEO Varun Mohan said in a post on X that Anthropic gave Windsurf little notice of the change, and the startup now has to find other third-party compute providers to run Claude AI models on its platform. 'We have been very clear to Anthropic that this is not our desire – we wanted to pay them for the full capacity,' said Mohan on X. 'We are disappointed by this decision and short notice.'

In a blog post, Windsurf said it has some capacity with third-party inference providers, but not enough, so the change may create short-term availability issues for Windsurf users trying to access Claude.

With less than five days of notice, Anthropic decided to cut off nearly all of our first-party capacity to all Claude 3.x models. Given the short notice, we may see some short-term Claude 3.x model availability issues as we have very quickly ramped up capacity on other inference… — Varun Mohan (@_mohansolo) June 3, 2025

The decision comes just a few weeks after Anthropic seemed to pass over Windsurf during the launch of Claude 4, the company's new family of models, which offer industry-leading performance on software engineering tasks. Anthropic gave several popular vibe coding apps — including Anysphere's Cursor, Cognition's Devin, and Microsoft's GitHub Copilot — immediate access to run Claude Sonnet 4 and Claude Opus 4. Those apps started supporting the new Claude 4 models on launch day. Windsurf said at the time it did not receive direct access from Anthropic to run Claude 4 on its platform, and it still hasn't — forcing the company to rely on a workaround that's more expensive and complicated for developers who want to access Claude.

Anthropic's AI models have become a favorite among developers, and in the past, Anthropic has worked with Windsurf to power its vibe coding tools. The AI-assisted coding sector, also known as vibe coding, has heated up in recent months. OpenAI reportedly closed on a deal to acquire Windsurf in April. At the same time, Anthropic has invested more in its own AI-coding applications. In February, Anthropic launched its own AI coding application, Claude Code, and in May, the startup held its first Code with Claude developer conference.

'We're prioritizing capacity for sustainable partnerships that allow us to effectively serve the broader developer community,' said Anthropic spokesperson Steve Mnich in an email to TechCrunch on Tuesday, noting that it's still possible to access Claude on Windsurf via an API key. 'Developers can also access Claude through our direct API integration, our partner ecosystem, and other development tools.'
Windsurf has grown quickly this year, reaching $100 million in ARR in April as it tries to catch up with more popular AI coding tools such as Cursor and GitHub Copilot. However, Windsurf's limited access to Anthropic's models may be stunting its growth. Several Windsurf users who spoke with TechCrunch were frustrated by the lack of direct access to Anthropic's best AI coding models. Ronald Mannak, a startup founder who specializes in Apple's programming language, Swift, told TechCrunch that Claude 4 represented a significant jump in capabilities for his workloads. While Mannak has been a Windsurf customer since late 2024, he's switched to using Cursor in recent weeks so that he can vibe code more easily with Claude 4.

As a short-term solution to support Claude 4, Windsurf allows users to connect their own Anthropic API keys to their Windsurf accounts. However, developers have noted that this 'bring your own key' solution (sketched in the example below) is more expensive and complicated than if Windsurf provided the models itself.

When it comes to vibe coders, optionality is the name of the game. Every few months, OpenAI, Google, and Anthropic release new AI models that seem to outperform the industry on coding tasks. Because of that, it benefits vibe coding startups to support AI models from all the leading developers. Windsurf spokesperson Payal Patel tells TechCrunch via email that the company has always believed in providing optionality for users. In this case, it seems Anthropic has made that a bit more challenging.
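For context on the 'bring your own key' workaround mentioned above, calling Claude directly with your own credentials looks roughly like the sketch below, using Anthropic's Python SDK. The model identifier and prompt are assumptions for illustration, not details confirmed by Windsurf or Anthropic:

```python
# Minimal sketch of calling Claude with a developer's own Anthropic API key,
# as one might when a tool does not provide first-party access to the model.
# The model name and prompt below are illustrative assumptions.
import anthropic

client = anthropic.Anthropic(api_key="YOUR_OWN_KEY")  # billed to the developer's own account

message = client.messages.create(
    model="claude-sonnet-4-20250514",  # assumed identifier for Claude Sonnet 4
    max_tokens=512,
    messages=[{"role": "user", "content": "Refactor this function to be more idiomatic Swift."}],
)

print(message.content[0].text)  # the model's reply text
```

Usage billed this way goes through the developer's own Anthropic account rather than the coding tool's pooled capacity, which is part of why users describe the approach as more expensive and complicated.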