Latest news with #NvidiaH100s
Yahoo
18-03-2025
- Business
- Yahoo
OpenAI's First Stargate Site to Hold Up to 400,000 Nvidia Chips
(Bloomberg) -- The first data center complex for OpenAI's $100 billion Stargate infrastructure venture will have space for as many as 400,000 of Nvidia Corp.'s powerful AI chips — an amount that, if filled, would make it one of the largest known clusters of artificial intelligence computing power.

Construction of the site, in the small Texas city of Abilene, will be completed by mid-2026 with a capacity of 1.2 gigawatts of power, according to developer Crusoe, which is set to announce the next phase of development on Tuesday. Though the facility will be large enough to support hundreds of thousands of advanced AI chips, it's unclear how many have been committed to the project.

The Stargate joint venture was unveiled by OpenAI, SoftBank Group Corp. and Oracle Corp. at a White House event in January, with the goal of providing the physical infrastructure needed for more advanced AI models from the ChatGPT maker. OpenAI previously said Stargate will expand to as many as 10 sites around the country.

Oracle has already agreed to use the Abilene location's full build for Stargate, according to people familiar with the matter, who spoke on condition of anonymity to discuss private information. OpenAI currently plans to use roughly 1 gigawatt of capacity at the facility, one person said. Crusoe declined to comment on the site's clients. OpenAI declined to comment. Oracle didn't respond to a request for comment.

Stargate joins a race among leading tech companies to build up capacity of Nvidia's latest chips. Elon Musk's xAI recently inked a $5 billion deal with Dell Technologies Inc. for AI servers for a supercomputer in Memphis. Meta Platforms Inc. has said it planned to have computing power equal to 600,000 Nvidia H100s — a previous generation of the company's data center semiconductors — by the end of 2024. And CoreWeave Inc., an AI-focused cloud provider, has more than 250,000 Nvidia graphics processing units across 32 data centers, it said in paperwork for a public offering earlier this month.

While Stargate was formally announced in January, the Abilene data center complex was in the works before that. 'If you came here a year ago, this would be a field of mesquite trees and shrubs,' said Crusoe Chief Executive Officer Chase Lochmiller in an interview. 'We broke ground in June of last year and have been on a very accelerated build pace.'

Currently, about 2,000 people are working on construction for the project, with plans for that to increase to nearly 5,000 workers, Crusoe said. There will be eight data center buildings, each designed to hold as many as 50,000 Nvidia GB200 semiconductors, the company said.

The White House announcement gave the long in-the-works project 'tremendous credibility,' said Michael McNamara, CEO of energy startup Lancium, which is also developing the site and first inked an agreement with local officials to build a data center campus in Abilene in 2021. There's now a sense among 'all stakeholders that these projects need to be built bigger and faster,' he said.

--With assistance from Shirin Ghaffary.
©2025 Bloomberg L.P.
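A quick back-of-the-envelope check of how the headline capacity figure follows from the per-building plan (a minimal arithmetic sketch in Python; the variable names are illustrative, not Crusoe's): eight buildings at up to 50,000 GB200 chips each account for the full 400,000-chip figure.

```python
# Illustrative arithmetic only, using figures reported in the article above.
buildings = 8                  # planned data center buildings at the Abilene site
chips_per_building = 50_000    # GB200 chips each building is designed to hold

total_chip_capacity = buildings * chips_per_building
print(f"Planned chip capacity: {total_chip_capacity:,}")  # 400,000, matching the headline figure
```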


Axios
05-02-2025
- Business
- Axios
Stunning breakthroughs from China's DeepSeek AI alarm U.S. rivals
Breakthroughs from Chinese AI startup DeepSeek have stunned Silicon Valley and could bring turbulence to Wall Street: they were accomplished at a fraction of what the U.S. giants are spending, and despite export bans on top-of-the-line chips.

Why it matters: China's rapid advances suggest America's strategy of withholding technology from China might just be speeding up the evolution of its rival's AI know-how.

DeepSeek's rise is alarming the likes of Meta, which announced Friday that it plans $60 billion-$65 billion in capital investment this year as it scales up its own AI projects. But it could also be bad news for Nvidia, which designs the world's most advanced AI chips, because DeepSeek is proving that rapid advances are possible even with fewer and less sophisticated chips. Nvidia's stock slid on Friday and again in overnight trading last night, pulling the Nasdaq down with it.

Driving the news: DeepSeek hit No. 1 on Apple's App Store a week after the Jan. 20 release of its R1 model, which works along similar lines to OpenAI's o1. Presented with a complex challenge, it takes time to consider alternate approaches before picking the best solution — and it explains its chain of reasoning to users. These "reasoning" models are especially good at coding and math.

Just last month another DeepSeek model, v3, stunned AI experts by delivering performance comparable to OpenAI's and Anthropic's most advanced publicly available general models, as Axios reported. The kicker is that DeepSeek created and released its project as entirely open source, with about $6 million in training costs ("a joke of a budget," in one expert's words). OpenAI is spending hundreds of millions of dollars. The results from China have turned heads around the world and revved up concerns in the U.S. that its lead in the so-called AI race between the two superpowers may be shrinking.

DeepSeek spun out of a Chinese hedge-fund firm two years ago, hired ambitious young AI scientists and set them to figure out more efficient ways to develop models, per Wired, focusing on basic research rather than consumer product development.

The other side: It's not as though the U.S.' efforts to keep the best chips out of China haven't had an impact. DeepSeek is said to have already amassed a training network of 10,000 Nvidia H100s by the time U.S. sanctions were introduced in 2022. But DeepSeek's CEO has said that the export controls, and the resulting scarcity of Nvidia's top-of-the-line chips, have been "a problem." No one knows where DeepSeek would stand today if it didn't face those roadblocks. We also can't say whether DeepSeek would be making such rapid advances on its own without having the latest work from OpenAI and its U.S. competitors to aim at.

Our thought bubble: Scarcity drives innovation, China has some great AI minds at work, and the U.S. might need to rethink its strategy. Silicon Valley's startup culture relentlessly pushes research in the direction of the consumer market. But sometimes, particularly when a field is young and applications aren't immediately obvious, basic research is even more important than market share — and open research tends to overwhelm secret research. OpenAI's pivot from "world's best AI lab" to "aspiring consumer tech giant" has wowed the financial world but could leave the U.S. in second place.