
Nvidia ruffles tech giants with move into cloud computing
Cloud computing generates big profits for Amazon.com, Microsoft and Google. Now that cash cow faces a nascent threat with the rise of artificial-intelligence cloud specialists and a new industry power broker: Nvidia.
AI-chip maker Nvidia launched its own cloud-computing service two years ago called DGX Cloud. It has also nurtured upstarts competing with the big cloud companies, investing in AI cloud players CoreWeave and Lambda.
Those moves have yet to make an enormous dent, but a competitive shift is easy to imagine if computing demand continues to shift toward AI and Nvidia remains the sector's principal arms dealer.
DGX Cloud is already growing fast. UBS analysts estimated when it launched that it could grow into a more than $10 billion annual revenue business. And CoreWeave, which listed shares on the Nasdaq in March, is forecasting around $5 billion of revenue this year.
Those businesses are limited by their narrow focus on AI computing, and they pale in comparison to the more than $107 billion of sales Amazon's market-leading cloud business generated last year.
Yet any challenge in cloud computing would be worrying for Amazon: While the company's cloud division accounted for 29% of its revenue in its latest quarter, it accounted for more than 60% of its operating income thanks to its high margins.
Microsoft and Alphabet's Google, the next two largest cloud companies, have a lot to lose if the cloud-computing landscape shifts, too. Growing macroeconomic concerns are raising caution about IT spending. Google is under antitrust scrutiny in the U.S., and its golden goose—the search engine—is being challenged by OpenAI.
All the big cloud companies effectively offer AI chips for rent, many of them made by Nvidia, which has a market share estimated at around 80%. In what is perhaps a testament to Nvidia's market power, though, the cloud companies are helping Nvidia grow its own cloud business.
Under DGX Cloud's unusual arrangement, the cloud giants buy and manage equipment—including Nvidia's chips—that forms the backbone of the service. Then Nvidia leases back that equipment from them and rents it out to corporate clients. It also offers access to its AI experts and software as part of the package.
That has left cloud-computing giants in an uncomfortable position. While they make money through the arrangement, they are also being asked to help a service that could compete with them. Some of them haven't rushed to participate, even if they do eventually join up; Google was notably absent from a roster of companies participating in a DGX Cloud chip-rental marketplace announced in May.
Roy Illsley, the chief analyst at tech research firm Omdia, said participating made sense for the cloud companies a couple of years ago because their own AI services weren't well-developed, although they have gotten better recently. "They needed to respond to the market when the AI revolution took off, and what Nvidia did was give them a solution when they hadn't got their own ducks in order," he said.
It isn't clear how big DGX Cloud is—Nvidia doesn't break out its revenue or profits. But the company said in its latest fiscal year that it had $10.9 billion in multiyear cloud service agreements, up from $3.5 billion the year before, in large part to support DGX Cloud. If the service is more than breaking even—a fair bet given usually high cloud-computing profit margins—that is already a fairly sizable business.
Nvidia is adamant that it isn't trying to outshine the cloud-computing giants with DGX Cloud. In the company's telling, it aims to help connect customers with AI computing power and Nvidia's expertise in ways that weren't otherwise possible.
That may be true for now, but it would be naive to think Nvidia doesn't have any further designs. At minimum, DGX Cloud gives Nvidia an option in the future to grow a big cloud business and the power now to help shape how AI is developed. Some of the same logic likely helps drive its investments in CoreWeave and Lambda.
Nvidia has reason to be wary of the durability of its relationships with the existing cloud players, too: As it encroaches on the cloud giants' territory, they are increasingly trespassing on Nvidia's. All of the cloud giants are developing their own custom AI chips that eventually could supplant Nvidia's. That would save them money at the expense of Nvidia's revenue, unless the chip maker can find it somewhere else.
As Nvidia seeks new areas to conquer after its runaway AI success, cloud computing looks increasingly like fertile territory—even if some of its biggest customers may not be thrilled about it.
Write to Asa Fitch at asa.fitch@wsj.com

Related Articles


Indian Express
23 minutes ago
Google powers up developer workflows with new open-source AI tool Gemini CLI
Google has announced Gemini CLI, its open-source AI agent to rival Claude Code and Codex CLI. Developers rely on the command-line interface (CLI) terminal for almost everything, and according to the tech giant, the terminal's ubiquity and portability make it a go-to utility for getting work done. The latest offering from Google addresses developers' demand for integrated AI assistance.

Gemini CLI brings Gemini directly into the terminal, offering lightweight access to the model and the most direct path from a developer's prompt to it. "While it excels at coding, we built Gemini CLI to do so much more. It's a versatile, local utility you can use for a wide range of tasks, from content generation and problem solving to deep research and task management," the company said in its official release.

Google has also integrated Gemini CLI with its AI coding assistant, Gemini Code Assist. This gives all Code Assist developers on free, Standard, and Enterprise plans prompt-driven, AI-first coding in both VS Code and Gemini CLI.

With Gemini CLI, Google is offering generous usage limits for individual developers. To use Gemini CLI for free, users simply log in with a personal Google account to get a free Gemini Code Assist licence, which includes access to Gemini 2.5 Pro and its one-million-token context window. If a user hits a limit during the preview, Google says it offers the industry's largest allowance: 60 model requests per minute and 1,000 requests per day at no charge. Professional developers who need to run multiple agents at the same time can opt for a Google AI Studio or Vertex AI key with usage-based billing, or a Gemini Code Assist Standard or Enterprise licence.

Gemini CLI, now in preview, offers AI capabilities ranging from code understanding and file manipulation to command execution and dynamic troubleshooting.
The AI agent upgrades the command-line experience, letting developers write code, debug issues, and streamline their workflows using natural language. According to Google, Gemini CLI's capabilities come from its built-in tools: users can ground prompts with Google Search to fetch web pages and provide real-time, external context to the model, and they can extend Gemini CLI's capabilities through built-in support for the Model Context Protocol (MCP) or bundled extensions. With customised prompts and instructions, developers can tailor Gemini to specific needs and workflows, and they can automate tasks and integrate them into existing workflows by invoking Gemini CLI non-interactively from their scripts. Since Gemini CLI is fully open source (Apache 2.0), developers can inspect the code to understand how it works and verify its security implications.


Time of India
an hour ago
AI startup that Mark Zuckerberg paid $14 billion 'leaked' sensitive data of Google, Elon Musk's xAI and Facebook: What the company has to say
Scale AI, the AI startup that is a key partner for companies like Meta and xAI, is facing scrutiny after a report revealed that the company had been exposing sensitive client and contractor data through publicly accessible Google Docs. The findings, published by Business Insider, raise serious questions about Scale AI's security practices and its commitment to client confidentiality. The report uncovered thousands of pages of project documents across 85 individual Google Docs, some containing highly confidential information related to Scale AI's work with major tech clients.

Documents show information on training data related to Google and Elon Musk's xAI

These documents included details on how Google used ChatGPT to refine its Bard chatbot, as well as at least seven "confidential" instruction manuals from Google outlining issues with Bard and how contractors should address them. For Elon Musk's xAI, public documents revealed specifics of "Project Xylophone," including training materials and 700 conversation prompts aimed at enhancing the AI's conversational abilities on diverse topics. Similarly, Meta training documents marked confidential were found publicly accessible, containing links to audio files demonstrating "good" and "bad" speech prompts for its AI products.

Contractors reported that despite attempts at codenaming, it was often easy to identify the client they were working for from the nature of the tasks or even explicit company logos embedded in presentations. Some said they could determine the client simply by prompting the AI model directly.

Contractor data also exposed: Report

Beyond client information, Scale AI also left sensitive personal data belonging to thousands of its contractors exposed in unsecured Google Docs.
Spreadsheets, readily viewable by anyone with the URL, listed private Gmail addresses, work performance details, and even categorized some workers as "high quality" or suspected of "cheating."

Scale AI responds to report on 'leaking' data

In response to the publication's findings, Scale AI stated that it takes data security seriously and is investigating the matter. "We are conducting a thorough investigation and have disabled any user's ability to publicly share documents from Scale-managed systems," a spokesperson said. "We remain committed to robust technical and policy safeguards to protect confidential information and are always working to strengthen our practices," the spokesperson added.

The revelations come on the heels of Meta's blockbuster investment in Scale AI. However, clients including Google, OpenAI, and xAI reportedly paused work with Scale AI following Meta's investment.


Business Standard
an hour ago
Schneider Electric Accelerates the Development and Deployment of AI Factories at Scale With NVIDIA
VMPL

Paris [NVIDIA GTC], June 25: Schneider Electric, the leader in the digital transformation of energy management and automation, today announced it is collaborating with NVIDIA to serve the growing demand for sustainable, AI-ready infrastructure. Together, Schneider Electric and NVIDIA are advancing research and development (R&D) initiatives for power, cooling, controls, and high-density rack systems to enable the next generation of AI factories across Europe and beyond.

This unique global partnership, announced during NVIDIA GTC Paris, brings together the world leaders in sustainability and accelerated computing to support the European Union's AI infrastructure ambitions and its "InvestAI" initiative, which plans to mobilize a €200 billion investment in AI. Leveraging their expertise in AI-ready infrastructure, sustainability, and grid coordination, Schneider Electric and NVIDIA are together responding to the European Commission's "AI Continent Action Plan," which outlines a shared mission to set up at least 13 AI factories across Europe, while establishing up to five AI gigafactories.

"Schneider Electric and NVIDIA are not just partners: our teams are driving advanced R&D, co-developing the infrastructure needed to power the next wave of AI factories globally," said Olivier Blum, CEO of Schneider Electric. "Together, we've seen tremendous success in deploying next-generation power and liquid cooling solutions, purpose-built for AI data centers. This strategic partnership, bringing together the world leaders in sustainability and accelerated computing, allows us to further accelerate this momentum, pushing the boundaries of what's possible for the AI workloads of tomorrow."

"AI is the defining technology of our time, the most transformative force reshaping our world," said Jensen Huang, founder and CEO, NVIDIA.
"Together with Schneider Electric, we are building AI factories: the essential infrastructure that brings AI to every company, industry, and society." New NVIDIA-Enabled Infrastructure Solutions In support of today's announcement, Schneider Electric has also unveiled a suite of AI-ready data center solutions, including new EcoStruxure™ Pod and Rack Infrastructure. Designed to accelerate AI developments globally, the Prefabricated Modular EcoStruxure Pod Data Center is a scalable, pod-based architecture, enabling rapid AI data center deployment. As part of this, a new Schneider Electric Open Compute Project (OCP) inspired rack system has also been developed to support the NVIDIA GB200 NVL72 platform that uses the NVIDIA MGX modular architecture, integrating Schneider Electric into NVIDIA HGX and MGX ecosystems for the first time. These announcements build on a series of milestones shared by the two global leaders earlier this year, including Schneider Electric and ETAP unveiling the world's first digital twin for electrical and large-scale power systems in AI factories using the NVIDIA Omniverse Blueprint. Together, Schneider Electric and NVIDIA have also co-developed a series of full electrical and liquid cooling-based reference designs as an approved CDU vendor for NVIDIA -- many of which also include solutions from Motivair's liquid cooling portfolio, following its acquisition by Schneider Electric in March 2025. Through this expanded and deepened strategic partnership, Schneider Electric and NVIDIA will continue to accelerate their infrastructure initiatives, fast-tracking new product rollouts and reference designs to build the AI factories of the future.