
Latest news with #NVL72

AMD's Helios Server launch signals bold challenge to Nvidia's AI supremacy, backed by OpenAI and Crusoe

Mint | 13-06-2025

Advanced Micro Devices (AMD) has revealed its next major push into artificial intelligence hardware, unveiling a new AI server platform called Helios, designed to challenge Nvidia's dominance in the sector. Speaking at the company's 'Advancing AI' developer conference, CEO Lisa Su outlined AMD's roadmap through 2026, including the launch of the MI350 and upcoming MI400 series AI chips. The Helios servers, set for release next year, will be powered by 72 of the MI400 chips, placing them in direct competition with Nvidia's NVL72 servers built around the Blackwell architecture.

Crucially, AMD announced that core elements of the Helios system, including networking standards, would be made openly available, a sharp contrast to Nvidia's historically closed NVLink technology, which has only recently begun to be licensed out under industry pressure. Su stated, 'The future of AI is not going to be built by any one company or in a closed ecosystem. It's going to be shaped by open collaboration across the industry.'

In a significant endorsement, OpenAI CEO Sam Altman joined Su onstage and confirmed the company is working closely with AMD on the development of the MI450 chip series, helping tailor the design for large-scale AI workloads. Altman reflected on OpenAI's exponential infrastructure growth, calling it 'a crazy, crazy thing to watch.' Executives from Meta, Oracle, xAI, and AI-focused cloud provider Crusoe also took part in the keynote, with Crusoe revealing plans to purchase $400 million worth of AMD chips, a vote of confidence in the company's renewed AI ambitions.

Despite the fanfare, AMD shares dipped 2.2 per cent following the announcement. Analysts like Kinngai Chan of Summit Insights suggested the new chips are unlikely to dramatically shift AMD's current market position in the near term, given Nvidia's strong lead in both hardware and supporting software ecosystems.

To address these gaps, AMD has made a series of acquisitions aimed at bolstering its AI software capabilities and server infrastructure. The firm acquired server manufacturer ZT Systems in March and recently brought on talent from Untether AI and generative AI startup Lamini. Over the past year, AMD has made 25 strategic investments to accelerate its AI agenda. Nonetheless, AMD's ROCm software platform still trails Nvidia's CUDA in developer adoption and ecosystem maturity. CUDA remains a cornerstone of Nvidia's AI stronghold, widely regarded as a key factor in its dominance.

AMD, headquartered in Santa Clara, California, continues to forecast strong double-digit growth in its AI chip segment for the coming year, even as export restrictions on high-end chips to China intensify. (With inputs from Reuters)

Latest MLPerf Shows AMD Catching Up With Nvidia, Sort Of...

Forbes | 04-06-2025

As you AI pros know, the 125-member MLCommons organization alternates training and inference benchmarks every three months. This time around, it's all about training, which remains the largest AI hardware market, though not by much, as inference drives more of the growth while the industry shifts from research (building models) to production (using them). As usual, Nvidia took home all the top honors. For the first time, AMD joined the training party (it had previously submitted only inference benchmarks), while Nvidia trotted out its first GB200 NVL72 runs to demonstrate industry leadership. Each company focused on its best features: for AMD, that is larger HBM memory, while Nvidia exploited its Arm/GPU GB200 superchip and NVLink scaling. The bottom line is that AMD can now compete head-to-head with the H200 for smaller models that fit into the MI325's memory. That also means AMD cannot compete with Blackwell today, and certainly not with NVLink-enabled configurations like the NVL72. Let's take a look. (Note that Nvidia is a client of Cambrian-AI Research, and I am a former employee of AMD.)

AMD's MI325 platform has more HBM memory than any Nvidia GPU and can therefore hold an entire medium-sized model on a single chip. So AMD ran the training benchmark that fits: Llama 2-70B LoRA fine-tuning. The results are reasonably impressive, besting the Nvidia H200 by an average of 8%. While a good result, I doubt many would choose AMD for 8% better performance, even at a somewhat lower price. The real question, of course, is how much better the MI350 will be when it launches next week, likely with higher performance and even more memory. One thing AMD will not offer soon is better networking for scale-up; the UALink needed to compete with NVLink is still months away (possibly in the MI400 timeframe in 2026). So, if you only need a 70B model, AMD may be a better deal than the Nvidia H200, but not by much.

AMD is also showing traction with partners, and better performance from its software, which took quite a beating from SemiAnalysis last December. With better ease of use from ROCm, partners can benefit from offering customers a choice; many enterprises do not need the power of an NVL72 or NVLink, especially if they are focused on simple inference processing. And of course, AMD can offer better availability, as the Nvidia GB200 is much harder to obtain due to overwhelming demand and pre-sold capacity. The rumor mill says a GB200 ordered today still takes over a year to deliver.

[Chart: AMD partners also submitted MLPerf results. Source: AMD]

So, if you net it out, the MI325 result foreshadows a decent position for the MI350, but support for only up to 8 GPUs per cluster limits its use for large-scale training deployments.

Nvidia says the GB200 NVL72 has now arrived, if you were smart enough to put in an early order. With over fifty benchmark submissions using up to nearly 2,500 GPUs, Nvidia and its partners ran every MLPerf benchmark on the ~3,000-pound rack, winning each one. CoreWeave submitted the largest configuration, with nearly 2,500 GPUs.

[Chart: Nvidia focused on the GB200 NVL72 in this round. Source: Nvidia]

While the GB200 NVL72 can outperform Hopper by some 30X for inference processing, its advantage for training is 'only' about 2.5X; that's still a lot of savings in time and money. The reason is that inference processing benefits greatly from the lower 4- and 8-bit precision math available in Blackwell, and the new Dynamo "AI Factory OS" optimizes inference processing and reuses previously calculated tokens in the KV cache.
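The KV-cache point is the standard key/value caching trick in autoregressive decoding: keys and values for already-generated tokens are stored once and reused, so each new token only requires projecting itself rather than re-projecting the whole prefix. Below is a minimal single-head NumPy sketch of that general idea; it is not Nvidia Dynamo's implementation, and the dimensions, weights, and function names are illustrative assumptions.

```python
# Minimal single-head attention decode loop illustrating KV-cache reuse.
# Toy sketch of the general technique, not Nvidia Dynamo; all names, shapes,
# and random weights are illustrative only.
import numpy as np

d_model = 64          # embedding width (illustrative)
rng = np.random.default_rng(0)
W_q, W_k, W_v = (rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(3))

k_cache, v_cache = [], []   # grows by one row per generated token

def decode_step(x_t):
    """Attend the newest token over all previously cached keys/values.

    Without the cache, every step would re-project K and V for the entire
    prefix; with it, each step only projects the newly generated token and
    reuses the stored rows.
    """
    q = x_t @ W_q                      # (d_model,)
    k_cache.append(x_t @ W_k)          # project the new token only
    v_cache.append(x_t @ W_v)
    K = np.stack(k_cache)              # (t, d_model)
    V = np.stack(v_cache)
    scores = K @ q / np.sqrt(d_model)  # (t,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()           # softmax over the cached prefix
    return weights @ V                 # attention output for this step

# Generate a few steps with dummy "token embeddings".
for t in range(5):
    out = decode_step(rng.standard_normal(d_model))
print("cached tokens:", len(k_cache), "output dim:", out.shape)
```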
While AMD does not yet have the scale-up networking required to train larger models at Nvidia's level of performance, this benchmark shows that it is getting close enough to be a contender once that networking is ready next year. And AMD can already outperform the Nvidia H200 once you clear the CUDA development hurdle. It could take a year or more for AMD to replicate the NVL72's architectural benefits, and by then Nvidia will have moved on to the Kyber-based NVL576 with the new NVLink 7, Vera CPU, and upgraded Rubin GPU. If you start late, you stay behind.
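As a footnote to the memory argument above, a rough back-of-envelope shows why a 70B-parameter LoRA run is comfortable on the MI325X and tight on the H200. The sketch assumes bf16 (2-byte) frozen base weights and uses the publicly quoted HBM capacities of 256 GB (MI325X) and 141 GB (H200); activations, LoRA optimizer state, and KV cache are ignored, so this is only a lower bound on what each GPU must hold.

```python
# Back-of-envelope: can the frozen base weights of a 70B-parameter model
# fit on a single accelerator? Ignores activations, LoRA adapter gradients,
# optimizer state, and KV cache, so it is a lower bound only.
PARAMS = 70e9                 # Llama 2-70B parameter count
BYTES_PER_PARAM = 2           # bf16 weights (assumed precision)

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"frozen base weights: ~{weights_gb:.0f} GB")

# Publicly quoted HBM capacities for the two parts compared in the article.
for name, hbm_gb in [("AMD MI325X", 256), ("Nvidia H200", 141)]:
    headroom = hbm_gb - weights_gb
    print(f"{name}: {hbm_gb} GB HBM -> {headroom:+.0f} GB left for "
          "activations, LoRA states, and KV cache")
```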

Prediction: These 4 Explosive AI Megatrends Will Catapult Nvidia to a $5 Trillion Market Cap

Yahoo | 03-06-2025

Nvidia CEO Jensen Huang spoke about four key AI growth drivers for his company on Nvidia's Q1 earnings call. These AI megatrends -- reasoning AI, AI diffusion, enterprise AI, and industrial AI -- present huge opportunities for Nvidia.

Nvidia (NASDAQ: NVDA) is breathing down Microsoft's neck to become the world's most valuable company. I think it's only a matter of time before the GPU maker takes the No. 1 spot. In Nvidia's latest quarterly update, Huang spoke about four artificial intelligence (AI) growth drivers that "are really kicking into turbocharge." Huang was onto something, in my opinion. I even predict that the four explosive AI megatrends he mentioned will catapult Nvidia to a $5 trillion market cap.

Huang discussed reasoning AI extensively during Nvidia's first-quarter earnings call. Reasoning AI solves problems step by step. It's also a critical technology for taking AI agents to the next level. Huang noted that there has been "a huge breakthrough in the last couple of years" that has resulted in "super agents" that use multiple tools and work in clusters to solve problems. These reasoning AI agents will almost certainly become heavily used by lots of companies over the next few years. However, they require exponentially more computing power than past AI models. That's great news for Nvidia. Huang believes that his company's Grace Blackwell NVL72 system (which connects Grace CPUs to Blackwell GPUs) makes "the ideal engine" for reasoning AI. I think he's right. And I predict the skyrocketing demand for this technology -- and the newer-generation versions on the way -- will provide a huge tailwind that helps get Nvidia to a $5 trillion market cap.

Huang praised the Trump administration for rescinding the AI diffusion rule established during the Biden administration. This rule, which had been scheduled to take effect on May 15, 2025, would have restricted U.S. AI chip exports to many countries. AI won't be limited to a handful of technologically advanced nations. Huang correctly observed on the Q1 earnings call that "countries around the world are awakening to the importance of AI as an infrastructure, not just as a technology of great curiosity and great importance, but infrastructure for their industries and start-ups and society." As countries build AI infrastructure, Huang thinks it will create a tremendous opportunity for Nvidia. Again, I fully agree.

Enterprise AI is the integration of AI throughout a large organization to improve its business processes. Huang said on Nvidia's Q1 call, "Enterprise AI is just taking off." This megatrend is joined at the hip with reasoning AI: many of the AI agents that reasoning AI makes possible will be deployed enterprise-wide. Huang pointed out that enterprise information technology consists of three major components -- compute, storage, and networking -- and these components are also critical for enterprise AI. Nvidia has put all of them together. I expect the company will see strong revenue and earnings growth as a result of its enterprise AI leadership, which will propel its market cap higher.

Industrial AI was the last AI megatrend Huang mentioned. It's the application of AI to industrial processes to improve efficiency and productivity. Huang predicted, "[E]very factory today that makes things will have an AI factory that sits with it."
He believes these AI factories will create and operate AI for the physical factory, plus "power the products and the things that are made by the factory." His vision also includes robots in the factories. Nvidia's Omniverse product already helps manufacturers build 3D simulations and "digital twins" of real-world facilities. They can also use Omniverse to train and test autonomous vehicles and robots used in factories.

Is Huang right that "every factory will have an AI factory"? Maybe not. However, I suspect that many factories will. And I predict industrial AI will be a significant growth driver for Nvidia over the next decade that helps catapult the company to a $5 trillion market cap.

This article was originally published by The Motley Fool.

Nvidia Wins Praise as Analysts Cheer 'Very Bullish' AI Guidance

Yahoo | 29-05-2025

May 29: Nvidia's (NASDAQ:NVDA) upbeat quarterly results and guidance drew positive reactions from analysts on Wednesday, who pointed to rising global demand for AI chips despite headwinds from China. Wedbush's Dan Ives described the performance as robust and said demand signals remain strong, even with export restrictions to China. Ives noted recent deals in the Middle East, including from Saudi Arabia and the UAE, as signs of growing interest from governments in AI infrastructure.

Morgan Stanley's Joseph Moore said Nvidia's report helped ease concerns over slowing momentum, especially with signs of growth in all markets except China. He raised his price target to $170 from $160 and reaffirmed an Overweight rating. Jefferies analyst Blayne Curtis said earlier worries about supply mismatches have faded, with hyperscalers now deploying Nvidia's NVL72 systems at scale. He added that the networking and gaming segments showed healthy recovery. Bank of America's Vivek Arya lifted his price target to $180 and said Nvidia's earnings potential could reach $10 per share as sovereign demand picks up. J.P. Morgan's Harlan Sur said the results reinforce Nvidia's position as a leader in the AI chip market.

This article first appeared on GuruFocus.

Why export restrictions aren't the only thing to pay attention to in Nvidia's earnings

Yahoo | 28-05-2025

After the market closes on Wednesday, Nvidia will report earnings for the first quarter of its fiscal year 2026, which ended on April 27. While many in the industry are eager to hear how the recent whiplash surrounding U.S. chip export controls will affect Nvidia's international chip business and future guidance, not everyone thinks that is the most important piece of Nvidia's results to pay attention to.

Kevin Cook, a senior equity strategist at Zacks Investment Research who has followed Nvidia for a decade, told TechCrunch he believes the company's rollout of its new GB200 NVL72 hardware -- a single-rack exascale computer that started shipping in February -- is a much more important area for shareholders to focus on. These GB200 NVL72 machines include 72 GPUs and cost around $3 million. Cook said that, despite strong demand and high expectations heading into this year, the chaos around DeepSeek in late January prompted many analysts to halve their delivery estimates for the unit. Cook added that, since this is the first quarter the company has shipped the machine, there isn't yet a clear indicator of how things are going.

"If Jensen [Huang] says we are going to deliver 10,000 units in Q2, the street will be very impressed," Cook said. "That's a big doable number; 10,000 is $30 billion on a $3 million product. I think they are going to do less than 5,000."

Cook added that these results will start to paint a picture of enterprise appetite for the latest AI tech. Will companies upgrade their AI hardware each time a new system comes out, similar to how consumers upgrade to the latest iPhone each year? Cook isn't sure. Whether or not enterprises adopt that behavior could have a significant impact on Nvidia down the line.

There will be immediate effects on Nvidia's stock based on what the company says regarding U.S. export controls, Cook predicted. But he doesn't think it will affect Nvidia's valuation or stock price long term in the same way that demand for the GB200 NVL72 might. Nvidia's stock price has proven it can recover from short-term market reactions, he added. "We basically had a flash crash, and it's right back up," Cook said of Nvidia's stock price after the chip export restrictions were announced. "That's unique to Nvidia. Lots of companies are going to have hiccups, but Nvidia has the biggest moat. They have the most resilience to any of this. It's such an irony that they could have this issue with China -- whether or not they can sell -- and it basically gets shrugged off, right?"

Even if chip export restrictions on China remain or become more stringent, Cook argued that Nvidia isn't struggling to find customers elsewhere. The company currently sells to all the major hyperscalers and will likely continue to see strong demand for its AI chips. He added that the recent announcements regarding Stargate's new project in the Middle East will likely be another win for the company.

For Cook, it really comes down to those GB200 NVL72 units. "As long as we hear that deliveries are expected to be steady to exceptional, then whatever fluctuations in this quarter's revenue, I think, are going to be put on the back burner, because the wind is in their sails for the rest of the year," Cook said.

This article originally appeared on TechCrunch.
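Cook's math is simple enough to write out. The short sketch below uses only the figures quoted in the piece (roughly $3 million per GB200 NVL72 rack, with 5,000 and 10,000 units as his reference points); the 2,500-unit line is an illustrative sub-5,000 scenario, not a number from the article.

```python
# Back-of-envelope from the quote above: revenue = units shipped x ~$3M per
# GB200 NVL72 rack. Only the ~$3M figure and the 5,000/10,000-unit scenarios
# come from the article; the 2,500-unit case is illustrative, not a forecast.
PRICE_PER_RACK = 3e6   # ~$3 million per GB200 NVL72 system (as quoted)

for units in (2_500, 5_000, 10_000):
    revenue = units * PRICE_PER_RACK
    print(f"{units:>6,} racks -> ${revenue / 1e9:.1f}B")
# 10,000 racks -> $30.0B, matching the "$30 billion on a $3 million product" line.
```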
