
Latest news with #MITTechnologyReview

Stargate and beyond: The global data centre arms race

Yahoo

2 days ago


In May 2025, OpenAI announced plans to develop a new data centre in Abu Dhabi in the United Arab Emirates (UAE). The planned 5GW data centre campus may become one of the largest in the world. The development is linked to the broader Stargate Project, a $500bn AI infrastructure initiative focused on building large-scale data centres across the United States. For every dollar the UAE invests in Stargate UAE and the broader data centre project in Abu Dhabi, it will invest an additional dollar in US AI infrastructure. This latest development highlights the US' race to stay ahead in AI.

The Stargate Project is a joint venture between OpenAI, SoftBank, Oracle, and MGX, and aims to create a network of facilities to support AI training and development. The project is intended to expand existing AI infrastructure and establish the US as a leader in AI innovation.

China, however, is not falling behind. The central government has designated AI infrastructure as a national priority, urging governments to accelerate the development of AI-focused data centres. Hundreds of new infrastructure projects were announced in 2023 and 2024. Interestingly, in March 2025, the MIT Technology Review revealed that data centres in China were underutilised, probably due to weaker-than-expected demand and shifting AI trends following the rise of DeepSeek. In Q1 2025, DeepSeek released a reasoning model called R1 that achieved performance comparable to OpenAI's o1 at a significantly lower cost, prompting many AI companies to rethink their hardware and scale requirements.

Globally, as demand for data storage and processing surges, data centre expansion is accelerating. According to GlobalData, the number of data centre projects by construction start date remained relatively stable between 2019 and 2024, before surging in 2025 as companies scaled up data centre projects to support growing AI workloads. GlobalData also estimates that combined investment in new infrastructure construction projects by Alphabet, Amazon, Apple, Meta, and Microsoft will grow 114% in 2025 compared with 2024.

On April 2, 2025, US President Donald Trump imposed various tariffs on imports into the US, sending global financial markets into turmoil. The announcement has disrupted the global economy and is expected to negatively impact the US data centre industry. Steel and aluminium are essential materials for data centres, used not only in construction but also in critical components such as power equipment and cooling systems. The 25% tariff on all US steel and aluminium imports, first announced in February 2025, will increase construction and component costs, raising the cost per square foot of new data centres. As a result, investment in US infrastructure projects is likely to fall for the foreseeable future.

Data centres currently in the planning or pre-construction phase will likely be hit hardest economically. Increased material costs could lead to scaled-back plans, delayed timelines, or a shift toward regions with more stable prices.

Data centres also rely on advanced chips to power AI models, cloud computing, and other high-performance workloads. Currently, semiconductors are exempt from the new tariffs, but imported equipment used to produce chips is not. The increased cost of chip-related components will be passed down to the companies building data centres, slowing their upgrades and expansion.
These facilities are also notoriously energy-intensive and increasingly use renewables to meet sustainability targets. In recent years, spurred by technological developments and favourable policy incentives, North America's energy transition has gathered pace. A key part of this transition has been the rapid expansion of energy storage, which is crucial to grid stability. However, the current administration may delay North America's energy transition and shift away from renewables, back towards thermal power and conventional fuels. With the US reliant on Chinese lithium-ion batteries, the tariffs will significantly impact energy storage development, including the battery energy storage systems (BESSs) deployed in data centres. More broadly, reduced support for energy transition technologies may limit data centres' ability to source renewable energy.

In addition, these centres face challenges from a congested power grid, and tariffs on key materials like steel and aluminium are worsening the issue. The cost of essential grid components, which rely on steel, will likely rise, potentially delaying infrastructure upgrades. These delays threaten the reliable power supply needed for large-scale data centres.

It will be difficult for companies that build and operate data centres to scale AI capabilities and digital infrastructure while navigating an increasingly fragmented global economic and geopolitical landscape. As a result of these challenges, the cost of data centre capacity is expected to rise, with the increases ultimately passed down to enterprises and consumers.

"Stargate and beyond: The global data centre arms race" was originally created and published by Verdict, a GlobalData owned brand.

What Pennsylvania stands to lose if federal research dollars dry up

Technical.ly

4 days ago


This is a guest post by Thomas P. Foley, a former college president and the current president of the Association of Independent Colleges and Universities of Pennsylvania.

In early March, on behalf of 85 independent nonprofit colleges, I sent a letter to Congress about the many ways that funding research benefits each one of us. In the weeks since, this topic has exploded across news headlines, as the effects of cuts to the National Science Foundation (NSF) and National Institutes of Health (NIH) become clear. Your daily routine — even what you're doing right now, reading an online article — is likely shaped by the innovations born from academic research funded with federal dollars. I could list hundreds of examples for you, many even pioneered in PA, like WiFi and Java code, but simply put, according to the MIT Technology Review, 'every major technological transformation in the US, from electric cars to Google to the iPhone, can trace its roots back to basic science research once funded by the federal government.'

NIH and NSF funding is a major economic driver for Pennsylvania. In fact, Pennsylvania ranks fourth in the nation in winning NIH federal research grants. Local researchers won $1.8 billion in NIH funds last year alone, and just one year of NIH funding in Pennsylvania generates $5.2 billion in economic activity and supports 21,787 jobs. NSF funding amounted to another $332 million for Pennsylvania in fiscal year 2024. Every one of the 67 counties in Pennsylvania, whether rural or urban, receives research funding through NIH and NSF. Cutting these programs is estimated to mean a $27 million loss to Dauphin County (Harrisburg), a $259 million loss to Allegheny County (Pittsburgh), and a $397 million loss in Philadelphia. Cutting research funding means real dollars pulled out of Pennsylvania's economy.

Education is an export that pays off

Don't sell education short. Higher ed is doing the heavy lifting for our state's economy. One of the nation's biggest exports is education, even bigger than coal, corn and natural gas. The PA Chamber of Business and Industry found that the fifth-largest industry in PA isn't steel – it's higher education. Two hundred thousand jobs in PA are supported by the independent nonprofit higher ed sector alone, with thousands more supported by our outstanding trade schools, community colleges and state-owned and state-related colleges.

We've made strides in 'brain gain' here in Pennsylvania, and we changed a Rust Belt narrative into a story of success. PA is a magnet for talent: our colleges attract the second-highest number of out-of-state college students in the country (considered an economic 'export' for the state), and the number of college graduates moving into the state has ticked up (a 51% increase in 2023, according to Newsweek). Sixteen percent of all American Nobel Prize winners were affiliated with one of Pennsylvania's independent nonprofit universities and colleges. We're winning in innovation, and we've built a thriving ecosystem where university research fuels businesses and supports startup culture. So why lose all that by reversing course and cutting the research that underpins our state's prosperity?

An impending brain drain

Nationwide, we're looking at potentially 68,000 job losses due to NIH cuts alone, according to the Science & Community Impacts Mapping Project. That doesn't include the unknown loss in medical breakthroughs or tech innovations that won't happen now without much-needed research.
The losses pile up from there. Universities are cutting back on their doctoral programs, which means fewer doctors at your local hospital and fewer researchers working on treatments for diabetes and cancer. Foreign countries are actively seizing their moment to poach American talent and lure away our best and brightest minds (see: Australia, China, the EU, France, Germany, Ireland, the Netherlands, Norway, South Korea). Remember, WWII and the Cold War were fought in labs and lecture halls as well as on battlefields, when America was a shining beacon for émigré scientists (see: the Manhattan Project, Project Paperclip).

Tomorrow's competitive edge can be found today on college campuses, where A.I. and drones were first developed, and Pennsylvania's higher ed sector is already a significant contributor. This isn't about conservative or liberal, Republican or Democrat. In fact, in years past, it was a Republican who pushed for more funding for higher ed and NASA, when President Eisenhower realized the competitive advantage of America's universities. Let's not give away what took so many years to win.

Is AI growing faster than we can power it? MIT study warns of explosive energy demands in a machine-led future

Time of India

24-05-2025


Artificial Intelligence may be the sharpest tool in humanity's digital shed, but behind its sleek interface lies a growing climate conundrum. From helping us choose our next binge-worthy show to whispering sweet nothings as a virtual romantic partner, AI is rapidly becoming an inseparable part of everyday life. But what powers this 'magic' comes with a carbon footprint big enough to leave scorch marks on the planet. A new investigation by MIT Technology Review has pulled back the curtain on the escalating environmental cost of AI—and the findings are as alarming as they are eye-opening. Experts now suggest that what we're seeing today might just be the calm before a very energy-hungry storm.

The Energy Avalanche You Didn't See Coming

For every chatbot reply or AI-generated painting, there's a surge of electricity flowing through data centres that rarely sleep. According to Professor Sajjad Moazeni, asking ChatGPT a single question may use 10 to 100 times more energy than sending an email. Multiply that by the billion messages ChatGPT receives daily, and you begin to understand the scale.

And it doesn't stop there. OpenAI's Sam Altman admitted that even the 'politeness' in our prompts costs tens of millions of dollars in operational expenses. AI systems like Google's Gemini and image generators that churn out 78 million images a day aren't just consuming bandwidth—they're devouring energy.

The MIT report reveals that by 2028, over half of all electricity used by data centres could go directly into powering AI. That translates to between 165 and 326 terawatt-hours annually—more electricity than what all U.S. data centres currently use for everything. It's also enough to power nearly a quarter of all American homes. To put it in a wilder perspective: this energy use would emit the same carbon as 1,600 round trips from Earth to the Sun in a car. It's a statistic so surreal it almost feels fictional—except it's not.

Data Centres: The Silent Gas Guzzlers of the Digital World

AI infrastructure isn't just greedy—it's relentless. 'AI data centres need constant power, 24-7, 365 days a year,' said Rahul Mewawalla, CEO of Mawson Infrastructure Group. And despite the optimism around renewables, the bulk of that power still comes from fossil fuels. As AI adoption accelerates, so does the dependency on energy grids that are far from green.

This has led to serious concerns from environmentalists. 'It's not clear to us that the benefits of these data centres outweigh these costs,' said Eliza Martin of Harvard's Environmental and Energy Law Program. 'Why should we be paying for this infrastructure? Why should we be paying for their power bills?'

A Future Too Hot to Handle?

The AI revolution is pushing boundaries, but it's also pushing climate scientists to the brink of panic. With global warming already spinning out of control, the sudden explosion of energy-intensive AI tools adds a new layer to an already urgent crisis. If the trajectory continues unchecked, this hidden environmental tax may soon become too heavy to ignore.
While AI may promise smarter futures, the question remains: at what cost? And perhaps more pressingly—was today's AI energy footprint the smallest it will ever be? If so, the future could be brighter for tech but bleaker for the planet.
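The figures in the report above lend themselves to a quick sanity check. The following Python sketch is a back-of-the-envelope estimate only: the per-email energy, average US household consumption, and household count are assumed values that do not come from the article, layered on top of the article's 10-to-100-times multiplier, billion-messages-a-day volume, and 165 to 326 TWh projection.

```python
# Back-of-the-envelope check of the energy claims above.
# Every input here is a rough, illustrative assumption, not a measurement.

EMAIL_WH = 0.3                 # assumed watt-hours to send one email
QUERY_MULTIPLIER = (10, 100)   # article: one ChatGPT query uses 10-100x an email
DAILY_QUERIES = 1_000_000_000  # article: roughly a billion messages per day

# Daily energy for all chat queries, converted to gigawatt-hours.
for mult in QUERY_MULTIPLIER:
    daily_gwh = EMAIL_WH * mult * DAILY_QUERIES / 1e9  # Wh -> GWh
    print(f"{mult}x an email -> ~{daily_gwh:.0f} GWh per day for chat queries")

# Projected AI share of data-centre electricity by 2028 (article figures).
US_HOME_KWH_PER_YEAR = 10_500   # assumed average US household consumption
US_HOUSEHOLDS = 130_000_000     # assumed number of US households

for twh in (165, 326):
    homes_powered = twh * 1e9 / US_HOME_KWH_PER_YEAR   # 1 TWh = 1e9 kWh
    share = homes_powered / US_HOUSEHOLDS
    print(f"{twh} TWh/year ~= electricity for {share:.0%} of US households")
```

Under those assumptions, chat queries alone work out to somewhere between a few and a few tens of gigawatt-hours per day, and the 2028 projection corresponds to roughly 12 to 24 percent of US household electricity, consistent with the article's "nearly a quarter of all American homes" comparison.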

How Much Electricity It Actually Takes to Use AI May Surprise You

Yahoo

22-05-2025


By now, most of us should be vaguely aware that artificial intelligence is hungry for power. Even if you don't know the exact numbers, the charge that "AI is bad for the environment" is well documented, bubbling up from sources ranging from the mainstream press to pop-science YouTube channels to tech trade media.

Still, the AI industry as we know it today is young. Though startups and big tech firms have been plugging away on large language models (LLMs) since the 2010s, the release of consumer generative AI in late 2022 brought about a huge increase in AI adoption, leading to an unprecedented "AI boom." In under three years, AI has come to dominate global tech spending in ways researchers are just starting to quantify. In 2024, for example, AI companies nabbed 45 percent of all US venture capital tech investments, up from only nine percent in 2022. Medium-term, big-name consulting firms like McKinsey expect AI infrastructure spending to grow to $6.7 trillion by 2030, compared with just $450 billion in 2022.

Even so, research on AI's climate and environmental impacts can seem vague and scattered, as analysts race to establish concrete environmental trends amid the extraordinary explosion of the AI industry. A new survey by MIT Technology Review is trying to change that. The authors spoke to two dozen AI experts working to uncover the tech's climate impact, combed "hundreds of pages" of data and reports, and probed the top developers of LLM tools in order to provide a "comprehensive look" at the industry's impact. "Ultimately, we found that the common understanding of AI's energy consumption is full of holes," the authors wrote.

That led them to start small, looking at the energy use of a single LLM query. Beginning with text-based LLMs, they found that model size directly predicted energy demand, as bigger LLMs use more chips — and therefore more energy — to process questions. While smaller models like Meta's Llama 3.1 8B used roughly 57 joules per response (or 114 joules when the authors factored in cooling power and other energy needs), larger models needed 3,353 joules (or 6,706), or, in MIT Tech's point of reference, enough to run a microwave for eight seconds.

Image-generating AI models, like Stable Diffusion 3 Medium, needed 1,141 joules (or 2,282) on average to spit out a standard 1024 x 1024 pixel image — the type that is rapidly strangling the internet. Doubling the quality of the image likewise doubles the energy use, to 4,402 joules, worth over five seconds of microwave warming time, though still less than the largest language bot.

Video generation is where the sparks really start flying. The lowest-quality AI video generator tested, a nine-month-old version of the open-source CogVideoX model, took an eye-watering 109,000 joules to spew out a low-quality, 8fps film — "more like a GIF than a video," the authors noted. Better models use a lot more. With a recent update, that same tool takes 3.4 million joules to spit out a five-second, 16fps video, equivalent to running a microwave for over an hour.

Whether any of those numbers amount to a lot or a little is open to debate. Running the microwave for a few seconds isn't much, but if everybody starts doing so hundreds of times a day — or, in the case of video, for hours at a time — it will make a huge impact on the world's power consumption. And of course, the AI industry is currently trending toward models that use more power, not less. Zooming out, the MIT Tech survey also highlights some concerning trends.
One is the overall rise in power use correlating with the rise of AI. While US data center power use remained mostly steady between 2005 and 2017, it had doubled by 2023, our first full year with mass-market AI. As of 2024, 4.4 percent of all energy consumed in the US went toward data centers. Meanwhile, data centers' carbon intensity — the amount of iceberg-melting exhaust spewed relative to energy used — became 48 percent higher than the US average.

All that said, the MIT authors have a few caveats. First, we can't look under the hood of closed-source AI models like OpenAI's ChatGPT, and most of the leading AI titans have declined to join good-faith climate mapping initiatives like AI Energy Score. Until that changes, any attempt to map such a company's climate impact is a stab in the dark at best.

In addition, the survey's writers note that data centers are not inherently bad for the environment. "If all data centers were hooked up to solar panels and ran only when the Sun was shining, the world would be talking a lot less about AI's energy consumption," they wrote. But unfortunately, "that's not the case." In countries like the US, the energy grid used to power data centers is still heavily reliant on fossil fuels, and surging demand for immediate energy is only making that worse. For example, the authors point to Elon Musk's xAI data center outside of Memphis, which is using 35 methane gas generators to keep its chips humming rather than wait for approval to draw from the civilian power grid.

Unless the industry is made to adopt strategies to mitigate AI's climate impact — like those outlined in the Paris AI Action Declaration — this will just be the beginning of a devastating rise in climate-altering emissions.
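The microwave comparisons above are simple unit conversions, and the short sketch below reproduces them. The 800-watt microwave rating is an assumption chosen to match the article's framing rather than a number stated in the survey; the joule values are the ones quoted in the piece.

```python
# Convert the per-response energy figures quoted above into
# "seconds of microwave time", the comparison MIT Technology Review uses.
# The microwave wattage is an assumed typical rating, not from the survey.

MICROWAVE_WATTS = 800  # assumption: ~800 W microwave (watts = joules per second)

energy_per_response_joules = {
    "Llama 3.1 8B text reply (with cooling overhead)": 114,
    "Large text-model reply (with cooling overhead)": 6_706,
    "Stable Diffusion 3 Medium image (with cooling overhead)": 2_282,
    "Higher-quality image": 4_402,
    "Five-second, 16fps AI video": 3_400_000,
}

for task, joules in energy_per_response_joules.items():
    seconds = joules / MICROWAVE_WATTS  # time = energy / power
    print(f"{task}: {joules:,} J ~= {seconds:,.1f} s of microwave time")
```

At that assumed wattage, the large-model text reply comes out to about eight seconds of microwave time and the five-second video to roughly 4,250 seconds, a bit over an hour, which matches the article's comparisons.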

MIT Technology Review Releases In-Depth Reporting Package: Power Hungry: AI and our energy future

Yahoo

20-05-2025


New reporting reveals the growing demands of AI's energy use and its climate impact.

CAMBRIDGE, Mass., May 20, 2025 /PRNewswire/ -- MIT Technology Review today launched Power Hungry: AI and our energy future, a first-of-its-kind content package that tells a story you haven't yet heard. It's well documented that AI is a power-hungry technology. But there has been far less reporting on the extent of that hunger, how much its appetite is set to grow in the coming years, where that power will come from, and who will pay for it.

For the past six months, MIT Technology Review's team of expert reporters and editors has worked to answer those questions. The result is an unprecedented look at the state of AI's energy and resource usage: where it is now, where it is headed in the years to come, and why we have to get it right.

The centerpiece of this new content package is a groundbreaking line of reporting into the demands of inference—the way human beings interact with AI when we make text queries or ask AI to come up with new images or create videos. Experts say inference is set to eclipse the already massive amount of energy required to train new AI models. We were so startled by what we learned reporting this story that we also put together a brief on everything you need to know about estimating AI's energy and emissions burden.

And then we went out into the world to see the effects of this hunger. Our in-depth reporting takes you into the deserts of Nevada, where data centers in an industrial park the size of Detroit demand ever more water to keep their processors cool and running. In Louisiana, where Meta plans its largest-ever data center, we expose the dirty secret that will fuel its AI ambitions—along with those of many others. Separately, we look at why the clean energy promise of powering AI data centers with nuclear energy will long remain elusive. Finally, we also look at the reasons to be optimistic, and examine why future AI systems could be far less energy intensive than today's.

Power Hungry: AI and our energy future is available now. Readers can subscribe to access the full series, every story on the site, and MIT Technology Review's exclusive Insider's Panel from our latest EmTechAI conference, for a deeper understanding of the future. Members of the press may obtain additional information and access by emailing press@

About MIT Technology Review

Founded at the Massachusetts Institute of Technology in 1899, MIT Technology Review is the world's leading authority on technology and its influence. Through award-winning journalism and premium events, we break down complex innovations and analyze their commercial, social, and political impact. From AI and biotech to climate tech and computing, we deliver trusted insights and expert analysis, empowering a global audience to navigate the emerging technologies shaping our future. We are the destination for those seeking to better understand where technology is headed next. Subscribe. Attend. Follow: Facebook, LinkedIn, Instagram, Reddit.

Media Contact: MIT Technology Review, press@

SOURCE MIT Technology Review
