Latest news with #SashaLuccioni


Saudi Gazette
2 days ago
- Business
Data centers to be expanded across UK as concerns mount
LONDON — The number of data centres in the UK is set to increase by almost a fifth, according to figures shared with BBC News.

Data centres are giant warehouses full of powerful computers used to run digital services from movie streaming to online banking. There are currently an estimated 477 of them in the UK. Construction researchers Barbour ABI have analysed planning documents and say that number is set to jump by almost 100, as the growth in artificial intelligence (AI) increases the need for processing power. The majority are due to be built in the next five years.

However, there are concerns about the huge amount of energy and water the new data centres will consume. Some experts have warned it could drive up prices paid by consumers.

More than half of the new data centres would be in London and neighbouring counties. Most are privately funded by US tech giants such as Google and Microsoft and major investment firms. A further nine are planned in Wales, one in Scotland, five in Greater Manchester and a handful in other parts of the UK, the data shows.

While the new data centres are mostly due for completion by 2030, the biggest single one planned would come later: a £10bn AI data centre in Blyth, near Newcastle, for the American private investment and wealth management company Blackstone. It would involve building 10 giant buildings covering 540,000 square metres - the size of several large shopping centres - on the site of the former Blyth Power Station. Construction is set to begin in 2031 and last for more than three years.

Four further data centres are planned in the UK at a total cost of £330m, with an estimated completion between 2027 and 2029: two in the Leeds area, one near Newport in Wales, and a five-storey site in Acton, north-west London. Google is building a data centre in Hertfordshire, an investment worth £740m, which it says will use air to cool its servers rather than water.

By some analyses, the UK is already the third-largest nation for data centres behind the US and China. The government has made clear it believes data centres are central to the UK's economic future, designating them critical national infrastructure. But there are concerns about their impact, including the potential knock-on effect on people's energy bills.

It is not known what the energy consumption of the new centres will be, as this data is not included in the planning applications, but US data suggests newer centres can be considerably more power-hungry than older ones.

Sasha Luccioni, AI and climate lead at machine learning firm Hugging Face, explains that in the US "average citizens in places like Ohio are seeing their monthly bills go up by $20 (£15) because of data centres". She said the timeline for the new data centres in the UK was "aggressive" and called for "mechanisms for companies to pay the price for extra energy to power data centres - not consumers".

According to the National Energy System Operator, NESO, the projected growth of data centres in Great Britain could "add up to 71 TWh of electricity demand" in the next 25 years, which it says redoubles the need for clean power such as offshore wind.

Owen, regional president of data centre operator Equinix, said the UK's high energy costs, as well as concerns around lengthy planning processes, were prompting some operators to consider building elsewhere. "If I want to build a new data centre here within the UK, we're talking five to seven years before I even have planning permission or access to power in order to do that," he told BBC Radio 4's Today programme. "So you're starting to see some of these AI workloads move into other countries, where the UK has always been a very important hub."

UK deputy prime minister Angela Rayner has overturned some local councils' rejection of planning permission for data centres, citing their importance to the country's infrastructure and the government's growth agenda.

There are also growing concerns about the environmental impact of these enormous sites. Existing data centre plants require large quantities of water to prevent them from overheating - and most current owners do not share data about their water use.

Hone, chief executive of industry body the Data Centre Alliance, says "ensuring there is enough water and electricity powering data centres isn't something the industry can solve on its own". But he insisted "data centres are fixated with becoming as sustainable as possible", such as through dry-cooling technology.

Such promises of future solutions have failed to appease everyone. In Potters Bar, Hertfordshire, residents are objecting to the construction of a £3.8bn cloud and AI centre on greenbelt land, describing the area as the "lungs" of their town. And in Dublin there is currently a moratorium on the building of any new data centres because of the strain existing ones have placed on Ireland's national electricity grid. In 2023 they accounted for one fifth of the country's energy use.

Last month, Anglian Water objected to plans for a 435-acre data centre site in North Lincolnshire. The developer says it aims to deploy "closed loop" cooling systems which would not place a strain on the water supply.

The planning documents suggest that 28 of the new data centres would be likely to be serviced by troubled Thames Water, including 14 more in Slough, which has already been described as having Europe's largest cluster of data centres. The BBC understands Thames Water was talking to the government earlier this year about the challenge of water demand in relation to data centres and how it can be met.

Water UK, the trade body for all water firms, said it "desperately" wants to supply the centres but "planning hurdles" need to be cleared so that more new reservoirs can be built in Lincolnshire, the West Midlands and south-east England.

A spokesperson for the UK government said data centres were "essential" and an AI Energy Council had been established to make sure supply can meet demand, alongside £104bn in water infrastructure investment. — BBC


WIRED
19-06-2025
- Business
How Much Energy Does AI Use? The People Who Know Aren't Saying
Jun 19, 2025 6:00 AM A growing body of research attempts to put a number on the energy use of AI—even as the companies behind the most popular models keep their carbon emissions a secret. 'People are often curious about how much energy a ChatGPT query uses,' Sam Altman, the CEO of OpenAI, wrote in an aside in a long blog post last week. The average query, Altman wrote, uses 0.34 watt-hours of energy: 'About what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes.' For a company with 800 million weekly active users (and growing), the question of how much energy all these queries are using is becoming an increasingly pressing one. But experts say Altman's figure doesn't mean much without much more public context from OpenAI about how it arrived at this calculation—including the definition of what an 'average' query is, whether or not it includes image generation, and whether or not Altman is including additional energy use, like from training AI models and cooling OpenAI's servers. As a result, Sasha Luccioni, the climate lead at AI company Hugging Face, doesn't put too much stock in Altman's number. 'He could have pulled that out of his ass,' she says. (OpenAI did not respond to a request for more information about how it arrived at this number.) As AI takes over our lives, it's also promising to transform our energy systems, supercharging carbon emissions right as we're trying to fight climate change. Now, a new and growing body of research is attempting to put hard numbers on just how much carbon we're actually emitting with all of our AI use. This effort is complicated by the fact that major players like OpenAI disclose little environmental information. An analysis submitted for peer review this week by Luccioni and three other authors looks at the need for more environmental transparency in AI models. 
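Altman's comparison is easy to sanity-check with back-of-the-envelope arithmetic. Below is a minimal sketch assuming a 1,000 W oven element and a 10 W LED bulb — illustrative wattages chosen here, not figures from OpenAI — plus a purely hypothetical 10-queries-per-user-per-week rate to show how quickly the per-query number scales across 800 million users:

```python
# Sanity-check Sam Altman's figure of 0.34 Wh per "average" ChatGPT query.
# The appliance wattages and per-user query rate are illustrative assumptions.
QUERY_WH = 0.34
query_joules = QUERY_WH * 3600            # 1 Wh = 3600 J, so ~1224 J per query

OVEN_WATTS = 1000                         # assumed: a modest electric oven element
LED_WATTS = 10                            # assumed: a high-efficiency LED bulb

oven_seconds = query_joules / OVEN_WATTS
led_minutes = query_joules / LED_WATTS / 60

print(f"oven time: {oven_seconds:.2f} s")     # ~1.2 s -> "a little over one second"
print(f"LED time:  {led_minutes:.2f} min")    # ~2.0 min -> "a couple of minutes"

# Scale to OpenAI's stated 800 million weekly users, assuming (hypothetically)
# 10 queries per user per week:
weekly_mwh = 800_000_000 * 10 * QUERY_WH / 1_000_000
print(f"weekly energy at that rate: {weekly_mwh:,.0f} MWh")  # 2,720 MWh
```

The appliance figures line up with Altman's phrasing only under these assumed wattages, which is exactly the critics' point: without OpenAI publishing its methodology, the per-query number cannot be checked against anything firmer than guesses like these.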
In Luccioni's new analysis, she and her colleagues use data from OpenRouter, a leaderboard of large language model (LLM) traffic, to find that 84 percent of LLM use in May 2025 was for models with zero environmental disclosure. That means that consumers are overwhelmingly choosing models with completely unknown environmental impacts. 'It blows my mind that you can buy a car and know how many miles per gallon it consumes, yet we use all these AI tools every day and we have absolutely no efficiency metrics, emissions factors, nothing,' Luccioni says. 'It's not mandated, it's not regulatory. Given where we are with the climate crisis, it should be top of the agenda for regulators everywhere.' As a result of this lack of transparency, Luccioni says, the public is being exposed to estimates that make no sense but which are taken as gospel. You may have heard, for instance, that the average ChatGPT request takes 10 times as much energy as the average Google search. Luccioni and her colleagues traced this claim back to a public remark that John Hennessy, the chairman of Alphabet, the parent company of Google, made in 2023. A claim made by a board member from one company (Google) about the product of another company to which he has no relation (OpenAI) is tenuous at best—yet, Luccioni's analysis finds, this figure has been repeated again and again in press and policy reports. (As I was writing this piece, I got a pitch with this exact statistic.) 'People have taken an off-the-cuff remark and turned it into an actual statistic that's informing policy and the way people look at these things,' Luccioni says. 'The real core issue is that we have no numbers. So even the back-of-the-napkin calculations that people can find, they tend to take them as the gold standard, but that's not the case.' One way to take a peek behind the curtain for more accurate information is to work with open source models. 
Some tech giants, including OpenAI and Anthropic, keep their models proprietary—meaning outside researchers can't independently verify their energy use. But other companies make some parts of their models publicly available, allowing researchers to more accurately gauge their emissions. A study published Thursday in the journal Frontiers in Communication evaluated 14 open-source large language models, including two Meta Llama models and three DeepSeek models, and found that some used as much as 50 percent more energy than other models in the dataset responding to prompts from the researchers. The 1,000 benchmark prompts submitted to the LLMs included questions on topics such as high school history and philosophy; half of the questions were formatted as multiple choice, with only one-word answers available, while half were submitted as open prompts, allowing for a freer format and longer answers. Reasoning models, the researchers found, generated far more thinking tokens—measures of internal reasoning generated in the model while producing its answer, which are a hallmark of more energy use—than more concise models. These models, perhaps unsurprisingly, were also more accurate with complex topics. (They also had trouble with brevity: During the multiple choice phase, for instance, the more complex models would often return answers with multiple tokens, despite explicit instructions to only answer from the range of options provided.) Maximilian Dauner, a PhD student at the Munich University of Applied Sciences and the study's lead author, says he hopes AI use will evolve to think about how to more efficiently use less-energy-intensive models for different queries. He envisions a process where smaller, simpler questions are automatically directed to less-energy-intensive models that will still provide accurate answers. 'Even smaller models can achieve really good results on simpler tasks, and don't have that huge amount of CO2 emitted during the process,' he says. 
Some tech companies already do this. Google and Microsoft have previously told WIRED that their search features use smaller models when possible, which can also mean faster responses for users. But generally, model providers have done little to nudge users toward using less energy. How quickly a model answers a question, for instance, has a big impact on its energy use—but that's not explained when AI products are presented to users, says Noman Bashir, the Computing & Climate Impact Fellow at MIT's Climate and Sustainability Consortium. 'The goal is to provide all of this inference the quickest way possible so that you don't leave their platform,' he says. 'If ChatGPT suddenly starts giving you a response after five minutes, you will go to some other tool that is giving you an immediate response.' However, there's a myriad of other considerations to take into account when calculating the energy use of complex AI queries, because it's not just theoretical—the conditions under which queries are actually run out in the real world matter. Bashir points out that physical hardware makes a difference when calculating emissions. Dauner ran his experiments on an Nvidia A100 GPU, but Nvidia's H100 GPU—which was specially designed for AI workloads, and which, according to the company, is becoming increasingly popular—is much more energy-intensive. Physical infrastructure also makes a difference when talking about emissions. Large data centers need cooling systems, light, and networking equipment, which all add on more energy; they often run in diurnal cycles, taking a break at night when queries are lower. They are also hooked up to different types of grids—ones overwhelmingly powered by fossil fuels, versus those powered by renewables—depending on their locations. 
Bashir compares studies that look at emissions from AI queries without factoring in data center needs to lifting up a car, hitting the gas, and counting revolutions of a wheel as a way of doing a fuel-efficiency test. 'You're not taking into account the fact that this wheel has to carry the car and the passenger,' he says. Perhaps most crucially for our understanding of AI's emissions, open source models like the ones Dauner used in his study represent a fraction of the AI models used by consumers today. Training a model and updating deployed models takes a massive amount of energy—figures that many big companies keep secret. It's unclear, for example, whether the light bulb statistic about ChatGPT from OpenAI's Altman takes into account all the energy used to train the models powering the chatbot. Without more disclosure, the public is simply missing much of the information needed to start understanding just how much this technology is impacting the planet. 'If I had a magic wand, I would make it mandatory for any company putting an AI system into production, anywhere, around the world, in any application, to disclose carbon numbers,' Luccioni says. Paresh Dave contributed reporting.