
Latest news with #quantumAI

QBTS Stock Gains on Tangible AI Use Cases: More Upside Ahead?

Yahoo

07-08-2025

  • Business
  • Yahoo

QBTS Stock Gains on Tangible AI Use Cases: More Upside Ahead?

D-Wave Quantum QBTS is building momentum across both real-world quantum AI applications and deep-tech hardware innovation. On the applications side, the company's quantum-enhanced AI is already yielding measurable gains across several domains. Earlier this week, D-Wave's shares jumped after the company announced that Japan Tobacco, which leveraged D-Wave's technology for quantum-assisted drug discovery, achieved superior results compared with traditional classical model training. Similarly, the Julich Supercomputing Centre in Germany reported improved accuracy in protein-DNA binding predictions, and TRIUMF, Canada's national particle accelerator center, demonstrated significant simulation speedups by integrating AI with quantum systems. These early successes validate the practical advantages of combining quantum computing with artificial intelligence.

Complementing these use-case achievements is D-Wave's strategic investment in advanced cryogenic packaging, an initial step toward scaling both its annealing and gate-model quantum architectures. The company is collaborating with NASA's Jet Propulsion Laboratory (JPL) to develop superconducting bump-bond interconnects, a critical innovation aimed at enhancing the performance and manufacturability of quantum processors. This initiative is expected to unlock multiple hardware advantages, including higher qubit density, extended coherence times and support for multichip quantum processor designs, all essential for progressing toward D-Wave's ambitious 100,000-qubit roadmap. During the first few days of August, shares of D-Wave rallied 6.5% on the back of these developments.

[Chart: Month-to-Date QBTS Share Rally. Image Source: Zacks Investment Research]

Diverging Paths Among Quantum Computing Rivals

IonQ IONQ: IonQ is advancing in quantum AI through its gate-based architecture, focusing on hybrid AI model training and partnerships with cloud providers. While strong in developer tooling and quantum machine learning, IonQ has yet to introduce a blockchain framework like that of QBTS, leaving a potential gap in its emerging-markets strategy.

Rigetti Computing RGTI: The company remains hardware-focused, prioritizing qubit fidelity and government contracts via its QCS platform. Though it is exploring AI, Rigetti lacks blockchain-specific initiatives and domain toolkits like D-Wave's PyTorch integration, making its approach more tech-centric and less diversified across emerging commercial applications.

Average Target Price for QBTS Suggests Near-Term Upside

Based on short-term price targets offered by nine analysts, D-Wave Quantum's average price target represents an increase of 8.7% from the last closing price of $17.18.

[Chart: Analyst price targets for QBTS. Image Source: Zacks Investment Research]

D-Wave Quantum currently carries a Zacks Rank #3 (Hold). This article was originally published on Zacks Investment Research.
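Workflows like the Japan Tobacco and Julich examples above ultimately come down to encoding part of a machine-learning problem as a quadratic optimization model that an annealer can sample. As a hedged, minimal sketch of that pattern (not D-Wave's actual pipeline), the toy QUBO below uses the open-source dimod package with a classical reference solver; the feature names and weights are invented for illustration, and on real hardware the solver would be replaced by a QPU-backed sampler such as dwave.system's DWaveSampler.

```python
# Toy QUBO in the style an annealer samples: pick a subset of features
# with attractive individual scores (negative biases) while penalizing
# redundant pairs (positive couplings). All numbers are illustrative.
import dimod

linear = {"f0": -1.0, "f1": -0.8, "f2": -0.6, "f3": -0.4}   # per-feature rewards
quadratic = {("f0", "f1"): 0.9, ("f0", "f2"): 0.3,          # redundancy penalties
             ("f1", "f3"): 0.7, ("f2", "f3"): 0.2}

bqm = dimod.BinaryQuadraticModel(linear, quadratic, 0.0, dimod.BINARY)

# Classical stand-in for the quantum annealer; exact enumeration is fine
# for four variables.
sampleset = dimod.ExactSolver().sample(bqm)
print(sampleset.first.sample, sampleset.first.energy)
```

The lowest-energy sample is the feature subset that best trades off rewards against penalties, which is the kind of answer an annealing QPU returns as a distribution of samples.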

A Quantum Battery Has Outperformed a Classical One for the First Time Ever

Yahoo

17-07-2025

  • Science
  • Yahoo

A Quantum Battery Has Outperformed a Classical One for the First Time Ever

Here's what you'll learn when you read this story:

  • For more than a decade, scientists have been investigating ways to develop a 'quantum battery' that stores energy using photons rather than electrons or ions.
  • While quantum batteries—thanks to properties like superabsorption and quantum entanglement—could theoretically charge more quickly than their classical counterparts, there has been little evidence of this sort of 'quantum advantage.'
  • In a new study, scientists have developed a model battery that reaches the known quantum speed limit and provides a measurable advantage over classical batteries, but building such a device remains remarkably complicated.

Over the last few decades, it seems that every classical piece of technology has gotten a quantum counterpart. People spend the most time and resources trying to develop quantum computers, of course, but there have also been advances in things like the quantum internet, quantum cryptography, and, yes, quantum AI. One technology that doesn't get much time in the spotlight, however, is the quantum battery. As the name suggests, quantum batteries store energy using photons—particles of light and the carriers of the electromagnetic force—rather than electrons or ions, as is the case with classical electrochemical batteries. But since their introduction in 2012, scientists have yet to convincingly establish the technology's 'quantum advantage': does a quantum battery really surpass the function and capability of regular old classical batteries?

Quantum batteries do have a few unique tricks up their sleeves. Due to attributes such as quantum entanglement and superabsorption—where the rate of absorption of light increases with the number of molecules—quantum batteries could charge much more quickly than even the very best classical batteries. And now a new study shows the first evidence of quantum advantage in this new type of battery.

In an article published in the journal Physical Review Letters, scientists from PSL Research University in Paris and the University of Pisa describe how they developed a simple quantum battery model at a microscopic scale. So, no—this one won't be charging your iPhone anytime soon. But it does provide a major boost for efforts to develop these potentially game-changing pieces of tech. 'Our model consists of two coupled harmonic oscillators: one acts as the 'charger,' and the other serves as the 'battery,'' the authors explained. 'The key ingredient enabling the quantum advantage is an anharmonic interaction between the two oscillators during the charging process. This anharmonic coupling allows the system to access non-classical, entangled states […] enabling faster energy transfer than in classical dynamics.'

The researchers also showed that the battery could theoretically reach the quantum speed limit (QSL), the maximum rate of change in a quantum system. This would clearly exceed the performance of classical batteries.

This isn't the first theoretical model of a quantum battery. Last year, a group of researchers developed a model quantum battery the size of an atom that used intermediate cavities to avoid 'decoherence,' the process through which a system loses its quantum properties. But both of these examples are just theoretical models—building a practical device is something else entirely. It's for this reason that researchers believe quantum batteries are still far removed from everyday applications. The authors of this new paper say their battery would need to be built using superconducting circuits, which exhibit zero electrical resistance at near-absolute-zero temperatures. 'To the best of our knowledge, this work provides the first rigorous certification of a genuine quantum advantage in a solvable model,' the authors said. 'We hope that our work will stimulate further research on this exciting topic, fostering progress on both the theoretical and experimental fronts.'
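For readers who want to see what a 'charger plus battery' oscillator model looks like in practice, the sketch below uses the QuTiP library to evolve two coupled oscillators and track the battery's energy over time. The anharmonic coupling term (two charger quanta exchanging with one battery quantum) is an illustrative guess, not the Hamiltonian from the Physical Review Letters paper, and the frequencies are chosen so that the exchange is resonant. For reference, one standard statement of the quantum speed limit is the Mandelstam-Tamm bound, τ ≥ πħ/(2ΔE), which ties the minimum time for a state to evolve into an orthogonal one to its energy uncertainty ΔE.

```python
# Toy charger-battery model, loosely inspired by the description above.
# The coupling term is an assumption for illustration only; the published
# model's exact Hamiltonian may differ.
import numpy as np
from qutip import destroy, qeye, tensor, basis, mesolve

N = 12                                # Fock-space truncation per oscillator
a = tensor(destroy(N), qeye(N))       # charger mode
b = tensor(qeye(N), destroy(N))       # battery mode

wa, wb, g = 1.0, 2.0, 0.05            # wb = 2*wa makes the exchange resonant
# Anharmonic exchange: two charger quanta <-> one battery quantum.
H = wa * a.dag() * a + wb * b.dag() * b + g * (a.dag()**2 * b + a**2 * b.dag())

psi0 = tensor(basis(N, 6), basis(N, 0))   # charger holds 6 quanta, battery empty
times = np.linspace(0.0, 100.0, 400)
result = mesolve(H, psi0, times, [], e_ops=[b.dag() * b])

print("peak battery occupation:", max(result.expect[0]))
```

Because the coupling is nonlinear in the charger operators, the joint state passes through entangled, non-classical configurations during charging, which is the qualitative mechanism the authors credit for the speedup.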

Doing The Work With Frontier Models: I'll Talk To AI

Forbes

15-06-2025

  • Business
  • Forbes

Doing The Work With Frontier Models: I'll Talk To AI

[Image: Artificial Intelligence processor unit. Quantum AI component on a PCB motherboard handling data transfers.]

Within the industry, where people talk about the specifics of how LLMs work, they often use the term 'frontier models.' But if you're not connected to this business, you probably don't really know what that means. You can intuitively apply the word 'frontier' to infer that these are the biggest and best new systems that companies are pushing. Another way to describe frontier models is as 'cutting-edge' AI systems that are broad in purpose and serve as overall frameworks for improving AI capabilities. When asked, ChatGPT gives us three criteria: massive data sets, compute resources, and sophisticated architectures.

Here are some key characteristics of frontier models to help you flesh out your vision of how these models work. First, there is multimodality: frontier models are likely to support non-text inputs and outputs, things like image, video or audio. In other words, they can see and hear, not just read and write. Another major characteristic is zero-shot learning, where the system is more capable with less prompting. And then there's the agent-like behavior that has people talking about the era of 'agentic AI.'

If you want to play 'name that model' and get specific about which companies are moving this research forward, you could say that GPT-4o from OpenAI represents one such frontier model, with multimodality and real-time inference. Or you could tout the capabilities of Gemini 1.5, which is also multimodal, with a sizable context window. And you can point to any number of other examples of companies doing this kind of research well. But what about digging into the build of these systems?

At a recent panel at Imagination in Action, a team of experts analyzed what it takes to work in this part of the AI space and create these frontier models. The panel moderator, Peter Grabowski, introduced two related concepts for frontier models: quality versus sufficiency, and multimodality. 'We've seen a lot of work in text models,' he said. 'We've seen a lot of work on image models. We've seen some work in video, or images, but you can easily imagine, this is just the start of what's to come.'

Douwe Kiela, CEO of Contextual AI, pointed out that frontier models need a lot of resources, noting that 'AI is a very resource-intensive endeavor.' 'I see the cost versus quality as the frontier, and the models that actually just need to be trained on specific data, but actually the robustness of the model is there,' said Lisa Dolan, managing director of Link Ventures. (I am also affiliated with Link.) 'I think there's still a lot of headroom for growth on the performance side of things,' said Vedant Agrawal, VP of Premji Invest.

Agrawal also talked about the value of using non-proprietary base models. 'We can take base models that other people have trained, and then make them a lot better,' he said. 'So we're really focused on all the components that make up these systems, and how do we (work with) them within their little categories?'

The panel also discussed benchmarking as a way to measure these frontier systems. 'Benchmarking is an interesting question, because it is single-handedly the best thing and the worst thing in the world of research,' he said. 'I think it's a good thing because everyone knows the goal posts and what they're trying to work towards, and it's a bad thing because you can easily game the system.'

How does that 'gaming the system' work? Agrawal suggested that it can be hard to use benchmarks in a concrete way. 'For someone who's not deep in the research field, it's very hard to look at a benchmarking table and say, 'Okay, you scored 99.4 versus someone else scored 99.2,'' he said. 'It's very hard to contextualize what that .2% difference really means in the real world.' 'We look at the benchmarks, because we kind of have to report on them, but there's massive benchmark fatigue, so nobody even believes it,' Dolan said.

Later, there was some talk about 10x systems, and some approaches to collecting and using data:

  • Identifying contractual business data
  • Using synthetic data
  • Teams of annotators

When asked about the future of these systems, the panel returned to three concepts:

  • AI agents
  • Cross-disciplinary techniques
  • Non-transformer architectures

Watch the video to get the rest of the panel's remarks about frontier builds.

What Frontier Interfaces Will Look Like

Here's a neat little addition: curious about how we will interact with these frontier models in 10 years' time, I put the question to ChatGPT. Here's some of what I got: 'You won't 'open' an app—they'll exist as ubiquitous background agents, responding to voice, gaze, emotion, or task cues … your AI knows you're in a meeting, it reads your emotional state, hears what's being said, and prepares a summary + next actions—before you ask.' That combines two aspects: the mode and the feel of what new systems are likely to be like. This goes back to the personal approach where we start seeing these models more as colleagues and conversational partners, and less as something that stares at you from a computer screen.

In other words, the days of PC-DOS command-line systems are over. Windows changed the computer interface from a single-line monochrome system to something vibrant, with colorful windows, reframing, and a tool-based desktop approach. Frontier models are going to do even more for our sense of interface progression. And that's going to be big. Stay tuned.
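Of the characteristics described above, zero-shot behavior is the easiest to demonstrate concretely. The snippet below is a minimal sketch using the Hugging Face transformers pipeline; the model and candidate labels are illustrative choices of mine, not anything the panel endorsed. The point is that the model assigns labels it was never explicitly fine-tuned on.

```python
# Minimal zero-shot demo: classify a sentence against arbitrary labels
# using an off-the-shelf NLI model. Model name and labels are
# illustrative assumptions, not from the article.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

result = classifier(
    "The company announced a 100,000-qubit roadmap for its processors.",
    candidate_labels=["quantum computing", "cooking", "sports"],
)
print(result["labels"][0], round(result["scores"][0], 3))
```

Frontier models extend this same no-task-specific-training behavior from classification to open-ended generation, tool use, and multimodal inputs.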

Multiverse Computing Raises $215M to Scale Ground-Breaking Technology that Compresses LLMs by up to 95%

Yahoo

12-06-2025

  • Business
  • Yahoo

Multiverse Computing Raises $215M to Scale Ground-Breaking Technology that Compresses LLMs by up to 95%

Technology breakthrough attracts international investment. Quantum AI leader to turbocharge AI proliferation, reduce power concerns and bring the technology to the edge.

[Photo: Multiverse Computing's Founding Team]

SAN SEBASTIAN, Spain, June 12, 2025 (GLOBE NEWSWIRE) -- Multiverse Computing, the global leader in quantum-inspired AI model compression, has developed CompactifAI, a compression technology capable of reducing the size of LLMs (Large Language Models) by up to 95% while maintaining model performance. Having spent 2024 developing the technology and rolling it out to initial customers, the company today announces a €189 million ($215 million) investment round. The Series B will be led by Bullhound Capital with the support of world-class investors such as HP Tech Ventures, SETT, Forgepoint Capital International, CDP Venture Capital, Santander Climate VC, Quantonation, Toshiba and Capital Riesgo de Euskadi - Grupo SPRI. The company has attracted widespread support for this push from a range of international and strategic investors. The investment will accelerate widespread adoption to address the massive costs inhibiting the rollout of LLMs, revolutionizing the $106 billion AI inference market.

LLMs typically run on specialized, cloud-based infrastructure that drives up data center costs. Traditional compression techniques—quantization and pruning—aim to address these challenges, but the resulting models significantly underperform the original LLMs. With the development of CompactifAI, Multiverse discovered a new approach. CompactifAI models are highly compressed versions of leading open-source LLMs that retain the original accuracy, run 4x-12x faster, and yield a 50%-80% reduction in inference costs. These compressed, affordable, energy-efficient models can run in the cloud, in private data centers or—in the case of ultra-compressed LLMs—directly on devices such as PCs, phones, cars, drones and even Raspberry Pi.

'The prevailing wisdom is that shrinking LLMs comes at a cost. Multiverse is changing that,' said Enrique Lizaso Olmos, Founder and CEO of Multiverse Computing. 'What started as a breakthrough in model compression quickly proved transformative—unlocking new efficiencies in AI deployment and earning rapid adoption for its ability to radically reduce the hardware requirements for running AI models. With a unique syndicate of expert and strategic global investors on board and Bullhound Capital as lead investor, we can now further advance our laser-focused delivery of compressed AI models that offer outstanding performance with minimal infrastructure.'

CompactifAI was created using tensor networks, a quantum-inspired approach to simplifying neural networks. Tensor networks are a specialized field of study pioneered by Román Orús, Co-Founder and Chief Scientific Officer at Multiverse. 'For the first time in history, we are able to profile the inner workings of a neural network to eliminate billions of spurious correlations to truly optimize all sorts of AI models,' said Orús. Compressed versions of top Llama, DeepSeek and Mistral models are available now, with additional models coming soon.

Per Roman, Co-founder & Managing Partner, Bullhound Capital, said: 'Multiverse's CompactifAI introduces material changes to AI processing that address the global need for greater efficiency in AI, and their ingenuity is accelerating European sovereignty. Román Orús has convinced us that he and his team of engineers are developing truly world-class solutions in this highly complex and compute-intensive field. Enrique Lizaso is the perfect CEO for rapidly expanding the business in a global race for AI dominance. I am also pleased to see that so many high-profile investors such as HP and Forgepoint decided to join the round. We welcome their participation.'

Tuan Tran, President of Technology and Innovation, HP Inc., commented: 'At HP, we are dedicated to leading the future of work by providing solutions that drive business growth and enhance professional fulfillment. Our investment in Multiverse Computing supports this ambition. By making AI applications more accessible at the edge, Multiverse's innovative approach has the potential to bring the AI benefits of enhanced performance, personalization, privacy and cost efficiency to life for companies of any size.'

Damien Henault, Managing Director, Forgepoint Capital International, said: 'The Multiverse team has solved a deeply complex problem with sweeping implications. The company is well-positioned to be a foundational layer of the AI infrastructure stack. Multiverse represents a quantum leap for the global deployment and application of AI models, enabling smarter, cheaper and greener AI. This is only just the beginning of a massive market opportunity.'

Multiverse Computing extends its sincere gratitude to its current investors for their continued trust and support, as well as to the European institutions whose backing has been instrumental in achieving this milestone. For more information about Multiverse Computing and CompactifAI, visit the company's website.

About Multiverse Computing

Multiverse Computing is the leader in quantum-inspired AI model compression. The company's deep expertise in quantum software and AI led to the development of CompactifAI, a revolutionary AI model compressor. CompactifAI compresses LLMs by up to 95% with only 2-3% precision loss. CompactifAI models reduce computing requirements and unleash new use cases for AI across industries. Multiverse Computing is headquartered in Donostia, Spain, with offices across Europe, the US, and Canada. The company won DigitalEurope's 2024 Future Unicorn award and was recognized by CB Insights as one of the Top 100 Most Promising AI Companies in 2025. With over 160 patents and 100 customers globally, including Iberdrola, Bosch, and the Bank of Canada, Multiverse Computing has raised c.$250M to date.

About Bullhound Capital

Bullhound Capital is the investment management arm of GP Bullhound, building with founders creating category-leading technology companies. Launched in 2008 with over €1 billion deployed, it has invested in global leaders like Spotify, Klarna, Revolut, Slack, Unity, ConnexAI and EcoVadis. Operating from 13 offices worldwide, its platform delivers hands-on, founder-focused support across strategy, growth, and execution. From quantum to entertainment, Bullhound Capital backs global leaders applying artificial intelligence to solve real-world problems.

About SETT

The Sociedad Española para la Transformación Tecnológica (SETT), a public business entity attached to the Ministry for Digital Transformation and Public Function, is dedicated to the financing and promotion of advanced and transformative digital technologies. SETT operates through the Next Tech fund, whose objective is to encourage private investment and improve access to financing in strategic Spanish sectors such as disruptive technologies. Implementation of Next Tech, foreseen in the Recovery, Transformation and Resilience Plan, is among SETT's functions; it also manages two other financial instruments to boost the technology business ecosystem: PERTE Chip, dedicated to microelectronics and semiconductors, and the Spain Audiovisual Hub, which promotes the digitization of the audiovisual sector.

Contact Information: LaunchSquad for Multiverse Computing
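CompactifAI itself is proprietary, but the underlying intuition of tensor-network compression, which involves factorizing large weight matrices into smaller pieces and discarding weak correlations, can be sketched with a plain truncated SVD. The NumPy example below is a deliberately simplified stand-in under stated assumptions: real tensor-network methods reshape weights into higher-order tensors such as matrix product operators, and real LLM weight matrices compress far better than the random matrix used here.

```python
# Minimal sketch of the *general idea* behind low-rank / tensor-network
# compression: replace a dense weight matrix with a truncated
# factorization. Not CompactifAI's actual (proprietary) algorithm.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((1024, 1024))      # stand-in for one LLM weight matrix

U, s, Vt = np.linalg.svd(W, full_matrices=False)
r = 64                                     # retained rank: the compression knob
A = U[:, :r] * s[:r]                       # shape (1024, r)
B = Vt[:r, :]                              # shape (r, 1024)

params_before = W.size
params_after = A.size + B.size
print(f"compression: {1 - params_after / params_before:.1%} fewer parameters")

# A random matrix has no low-rank structure, so this error is large;
# trained weight matrices are far more compressible in practice.
err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
print(f"relative reconstruction error: {err:.3f}")
```

The retained rank r plays roughly the role of the bond dimension in a genuine tensor network: smaller r means fewer parameters, at the cost of a coarser approximation of the original layer.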
