

Nvidia Builds An AI Superhighway To Practical Quantum Computing

Forbes

05-05-2025



At the GTC 2025 conference, Nvidia announced plans for a new, Boston-based Nvidia Accelerated Quantum Research Center, or NVAQC, designed to integrate quantum hardware with AI supercomputers. Expected to begin operations later this year, the center will focus on accelerating the transition from experimental to practical quantum computing. 'We view this as a long-term opportunity,' says Tim Costa, Senior Director of Computer-Aided Engineering, Quantum and CUDA-X at Nvidia. 'Our vision is that there will come a time when adding a quantum computing element into the complex heterogeneous supercomputers that we already have would allow those systems to solve important problems that can't be solved today.'

Quantum computing, like AI (i.e., deep learning) a decade ago, is an emerging technology with an exceptional affinity for Nvidia's core product, the GPU. It is another milestone in Nvidia's successful ride atop the technological shift re-engineering the computer industry: the massive move from serial data processing (executing instructions one at a time, in a specific order) to parallel data processing (executing multiple operations simultaneously). Over the last twenty years, says Costa, there were several applications where 'the world was sure it was serial and not parallel, and it didn't fit GPUs. And then, a few years later, rethinking the algorithms has allowed it to move on to GPUs.'

Nvidia's ability to diversify from its early focus on graphics processing (initially to speed up the rendering of three-dimensional video games) is due to the mid-2000s development of its Compute Unified Device Architecture, or CUDA. This parallel computing platform and programming model lets developers harness the power of GPUs for general-purpose computing.
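The serial-versus-parallel distinction above can be made concrete with a minimal Python sketch. This is an illustration of the programming model only, not Nvidia's CUDA code: a GPU executes thousands of such independent element-wise operations at once, while the thread pool here merely mimics the idea in standard-library Python.

```python
from concurrent.futures import ThreadPoolExecutor

def scale(x: float) -> float:
    """An element-wise operation: each result depends only on its own input."""
    return 2.0 * x

data = [1.0, 2.0, 3.0, 4.0]

# Serial data processing: one operation at a time, in a fixed order.
serial = [scale(x) for x in data]

# Parallel data processing: the same operations are independent, so they
# can all be dispatched at once. (On a GPU, each would map to its own
# hardware thread; a Python thread pool only illustrates the structure.)
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(scale, data))

assert serial == parallel == [2.0, 4.0, 6.0, 8.0]
```

The point Costa makes is that many workloads once assumed to be inherently serial turn out, after the algorithms are rethought, to have exactly this independent-per-element structure.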
The key to CUDA's rapid adoption by developers and users of a wide variety of scientific and commercial applications was a decision by CEO Jensen Huang to apply CUDA to the entire range of Nvidia's GPUs, not just the high-end ones, thus ensuring its popularity. This decision, and the required investment, caused Nvidia's gross margin to fall from 45.6% in the 2008 fiscal year to 35.4% in the 2010 fiscal year. 'We were convinced that accelerated computing would solve problems that normal computers couldn't. We had to make that sacrifice. I had a deep belief in [CUDA's] potential,' Huang told Tae Kim, author of the recently published The Nvidia Way.

This belief continues to drive Nvidia's search for opportunities where 'we can do lots of work at once,' says Costa. 'Accelerated computing is synonymous with massively parallel computing. We think accelerated computing will ultimately become the default mode of computing and accelerate all industries. That is the CUDA-X strategy.' Costa has been working on this strategy for the last six years, introducing the CUDA software to new areas of science and engineering, including quantum computing, where it helps developers of quantum computers and their users simulate quantum algorithms. Now, Nvidia is investing further in applying its AI mastery to quantum computing.

Nvidia became one of the world's most valuable companies because the performance of the artificial neural networks at the heart of today's AI depends on the parallelism of the hardware they run on, specifically the GPU's ability to process many linear algebra multiplications simultaneously. Similarly, the basic units of information in quantum computing, qubits, interact with other qubits, allowing many different calculations to run simultaneously. Combining quantum computing and AI promises to improve AI processes and practices and, at the same time, accelerate the development of practical applications of quantum computing.
The focus of the new Boston research center is on 'using AI to make quantum computers more useful and more capable,' says Costa. 'Today's quantum computers are fifty to a hundred qubits. It's generally accepted now that truly useful quantum computing will come with a million qubits or more that are error corrected down to tens to hundreds of thousands of error-free or logical qubits. That process of error correction is a big compute problem that has to be done in real time. We believe that the methods that will make that successful at scale will be AI methods.'

Quantum computing is a delicate process, subject to interference from 'noise' in its environment, resulting in at least one failure in every thousand operations. Increasing the number of qubits introduces more opportunities for errors. When Google announced Willow last December, it called it 'the first quantum processor where error-corrected qubits get exponentially better as they get bigger.' Its error correction software includes AI methods such as machine learning, reinforcement learning, and graph-based algorithms, helping identify and correct errors accurately, 'the key element to unlocking large-scale quantum applications,' according to Google.

'Everyone in the quantum industry realizes that the name of the game in the next five years will be quantum error correction,' says Doug Finke, Chief Content Officer at Global Quantum Intelligence. 'The hottest job in quantum these days is probably a quantum error correction scientist, because it's a very complicated thing.' The fleeting nature of qubits, which 'stay alive' for about 300 microseconds, requires speedy decisions and very complex math. A ratio of 1,000 physical qubits to one logical qubit would result in many possible errors. AI could help find out 'what are the more common errors and what are the most common ways of reacting to it,' says Finke.
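The redundancy behind that 1,000-to-1 ratio can be sketched with a classical toy: the textbook repetition code, where one logical bit is spread across many noisy physical bits and a decoder infers the most likely logical value. This is only an analogy, not Google's or Nvidia's actual decoder; real quantum error correction works on syndrome measurements, and the AI methods described above are aimed at doing that inference fast and accurately at scale.

```python
from collections import Counter

def majority_decode(physical_bits: list) -> int:
    """Recover one logical bit from many redundant, noisy physical bits.

    A classical stand-in for the decoding step of quantum error
    correction: with an error rate around 1 in 1,000 per operation,
    a few physical bits flip, but the majority still reveals the
    intended logical value.
    """
    counts = Counter(physical_bits)
    return counts.most_common(1)[0][0]

# A logical 0 encoded in 1,000 physical bits, with three flipped by noise,
# mirroring the roughly one-failure-per-thousand-operations rate above.
noisy = [0] * 997 + [1] * 3
assert majority_decode(noisy) == 0
```

The catch, as Finke notes, is timing: with qubits surviving only about 300 microseconds, this inference must run in real time across enormous numbers of qubits, which is why the industry is turning to learned, AI-based decoders rather than hand-built ones.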
Researchers from the Harvard Quantum Initiative in Science and Engineering and the Engineering Quantum Systems group at MIT will test and refine these error correction AI models at the NVAQC. Other collaborators include quantum startups Quantinuum, Quantum Machines, and QuEra Computing. They will be joined by Nvidia's quantum error correction research team and Nvidia's most advanced supercomputer. 'Later this year, we will have the center ready, and we'll be training AI models and testing them on integrated devices,' says Costa.
