
Applied Digital and CoreWeave ink 15-year lease worth $7 billion
Applied Digital said on Monday it has entered into two 15-year leases with CoreWeave, a specialized cloud services provider backed by Nvidia, which will generate about $7 billion in revenue for the data center operator over the lease period.
WHY IT'S IMPORTANT
The deal could prove to be a major lifeline for Applied Digital, which has been facing challenges in the data center hosting sector as it transitions into a data center real estate investment trust.
The company's shares surged by 17% in premarket trading following the lease announcement.
KEY QUOTES
"These leases solidify Applied Digital's position as an emerging provider of infrastructure critical to the next generation of artificial intelligence and high-performance computing,' CEO of Applied Digital, Wes Cummins, said in a statement.
"Through these newly signed long-term leases with CoreWeave, we are taking a step forward in our strategic expansion into advanced compute infrastructure."
CONTEXT
New cloud service providers known as "neoclouds," such as CoreWeave, focus on renting out Nvidia's highly sought-after chips to software developers.
Leasing data center infrastructure from companies like Applied Digital helps reduce some of the financial burden of providing AI-centric cloud services.
CoreWeave's shares were up close to 4%.
(Reporting by Arsheeya Bajwa in Bengaluru; Editing by Tasim Zahid)
Related Articles


The Star, 10 hours ago
Anthropic CEO says proposed 10-year ban on state AI regulation 'too blunt' in NYT op-ed
(Reuters) - A Republican proposal to block states from regulating artificial intelligence for 10 years is "too blunt," Anthropic Chief Executive Officer Dario Amodei wrote in a New York Times opinion piece.

Amodei instead called for the White House and Congress to work together on a federal transparency standard for AI companies, so that emerging risks are made clear to the public.

"A 10-year moratorium is far too blunt an instrument. AI is advancing too head-spinningly fast," Amodei said. "Without a clear plan for a federal response, a moratorium would give us the worst of both worlds - no ability for states to act, and no national policy as a backstop."

The proposal, included in President Donald Trump's tax cut bill, aims to preempt AI laws and regulations recently passed in dozens of states, but has drawn opposition from a bipartisan group of attorneys general who have regulated high-risk uses of the technology.

A national standard, according to Amodei's opinion piece, would instead require developers working on powerful models to adopt policies for testing and evaluating them, and to publicly disclose how they plan to test for and mitigate national security and other risks. Such a policy, if adopted, would also mean developers would have to be upfront about the steps they took to make sure their models were safe before releasing them to the public, he said.

Amodei said Anthropic already releases such information, and that competitors OpenAI and Google DeepMind have adopted similar policies. Legislative incentives to keep companies disclosing such details could become necessary, he argued, since the corporate incentive to provide this level of transparency might weaken as models become more powerful.
(Reporting by Arsheeya Bajwa in Bengaluru; Editing by Anil D'Silva)


The Star, 13 hours ago
Taiwan's Wistron to raise up to $923 million in share sale
SYDNEY (Reuters) - Taiwanese electronics manufacturer Wistron Corp is looking to raise up to $923 million by selling global depository shares that will be listed in Luxembourg, according to a term sheet reviewed by Reuters on Thursday.

The company, a supplier to Nvidia, is selling up to 250 million depository shares in a price range of $36.20 to $36.93, the term sheet said. That is a discount of 4% to 6% to a closing stock price of NT$115 ($3.85) on Thursday.

Wistron did not immediately respond to a Reuters request for comment.

The company plans to use the money raised in the share sale to buy raw materials in foreign currencies, the term sheet showed. The shares are due to start trading on June 16.

Wistron said last month its new U.S. manufacturing facilities for customer Nvidia would be ready next year and that it was in talks with other potential customers. The facilities will produce high-performance computing and AI-related products.

($1 = 29.9060 Taiwan dollars)

(Reporting by Scott Murdoch; Additional reporting by Ben Blanchard; Editing by Tom Hogue)


Malaysian Reserve, 20 hours ago
Why is Nvidia the king of AI chips, and can it last?
INVESTORS poured money into Nvidia Corp and made it the world's most valuable chipmaker, convinced that its lead in artificial intelligence (AI) computing would deliver riches. Attention has now shifted to whether AI itself will pay off for companies investing tens of billions of dollars in the vast data centres required to power it.

For now, Nvidia remains the preeminent picks-and-shovels seller in an AI gold rush. Revenue is still soaring, and the orderbook for the company's accelerator chips is bulging. The company's continued success depends on CEO Jensen Huang's ability to manage myriad challenges.

Huang is pushing forward the capability of his chips to prove to his biggest customers, including Microsoft Corp and Amazon.com Inc, that the products are the best they can get. That's doubly important because those companies are developing in-house technology that could eventually replace some of Nvidia's less-advanced semiconductors. He's also trying to help a broader range of companies more easily use AI computing in their businesses, while navigating geopolitical tensions that threaten to cut Nvidia off from major global markets.

Here's a look at what's been driving Nvidia's spectacular growth and the challenges ahead.

What are Nvidia's most popular AI chips?

The current moneymaker is the Hopper H100, the name of which is a nod to computer science pioneer Grace Hopper. It's a beefier version of a graphics processing unit that originated in personal computers used by video gamers. Hopper is being replaced at the top of the lineup by the Blackwell range, named for mathematician David Blackwell.

Both Hopper and Blackwell include technology that turns clusters of computers using Nvidia chips into single units that can process vast volumes of data and make computations at high speeds. That makes them a perfect fit for the power-intensive task of training the neural networks that underpin the latest generation of AI products.
Founded in 1993, Nvidia pioneered this market with investments dating back more than a decade, when it bet that the ability to do work in parallel would one day make its chips valuable in applications outside of gaming.

The Santa Clara, California-based company will sell the Blackwell products in a variety of options, including as part of the GB200 superchip, which combines two Blackwell GPUs with one Grace CPU, a general-purpose central processing unit. (The Grace CPU is also named for Grace Hopper.)

Why are Nvidia's AI chips special?

So-called generative AI platforms learn tasks such as translating text, summarising reports and synthesising images by ingesting vast quantities of preexisting material. The more they absorb, the better they perform. They develop through trial and error, making billions of attempts to achieve proficiency and sucking up huge amounts of computing power along the way.

Blackwell delivers 2.5 times Hopper's performance in training AI, according to Nvidia. The new design has so many transistors — the tiny switches that give semiconductors their ability to process information — that it can't be produced conventionally as a single unit. It's actually two chips married to each other through a connection that ensures they act seamlessly as one, the company said.

How did Nvidia become a leader in AI?

Nvidia was already the king of graphics chips, the components that generate the images you see on a computer screen. The most powerful of those are built with thousands of processing cores that perform multiple simultaneous threads of computation. This allows them to produce the complex 3D renderings, such as shadows and reflections, that are a feature of today's video games.

What are Nvidia's competitors doing?

Nvidia controls about 90% of the market for data centre GPUs, according to market research firm IDC.
Dominant cloud computing providers and major Nvidia customers such as Amazon's AWS, Alphabet Inc's Google Cloud and Microsoft's Azure are trying to develop their own chips, as are Nvidia rivals Advanced Micro Devices Inc (AMD) and Intel Corp.

How does Nvidia stay ahead of its competitors?

Nvidia has updated its offerings, including the software that supports the hardware, at a pace no other firm has yet been able to match. The company has also devised cluster systems that help its customers buy chips in bulk and deploy them quickly. Huang keeps up a frantic pace of appearances at tech shows and company events all over the world to tout new offerings and tie-ups.

Nvidia has committed to annual introductions of new main products for years to come, reflecting what Huang says is an unprecedented commitment to advancing innovation in the industry. Such pledges serve as a warning to rivals that they are trying to catch a moving train. — BLOOMBERG

This article first appeared in The Malaysian Reserve weekly print edition.