
Latest news with #MultiverseComputing

Buzzy AI startup Multiverse creates two of the smallest high-performing models ever

TechCrunch

6 days ago

  • Business
  • TechCrunch

One of Europe's most prominent AI startups has released two AI models so tiny that it has named them after a chicken's brain and a fly's brain. Multiverse Computing claims these are the world's smallest models that are still high performing, able to handle chat, speech, and, in one case, reasoning. The new tiny models are intended to be embedded into internet-of-things devices, as well as run locally on smartphones, tablets, and PCs.

'We can compress the model so much that they can fit on devices,' co-founder Román Orús told TechCrunch. 'You can run them on premises, directly on your iPhone or on your Apple Watch.'

As we previously reported, Multiverse Computing is a buzzy European AI startup headquartered in Donostia, Spain, with about 100 employees in offices worldwide. It was co-founded by Orús, a top European professor of quantum computing and physics; quantum computing expert Samuel Mugel; and Enrique Lizaso Olmos, the former deputy CEO of Unnim Banc. It just raised €189 million (about $215 million) in June on the strength of a model-compression technology it calls CompactifAI. (Since it was founded in 2019, it has raised about $250 million, Orús said.)

CompactifAI is a quantum-inspired compression algorithm that reduces the size of existing AI models without sacrificing those models' performance, Orús said. 'We have a compression technology that is not the typical compression technology that the people from computer science or machine learning will do, because we come from quantum physics,' he said. 'It's a more subtle and more refined compression algorithm.'

The company has already released a long list of compressed versions of open-source models, especially popular small models like Llama 4 Scout and Mistral Small 3.1, and it just launched compressed versions of OpenAI's two new open models. It has also compressed some very large models; it offers a DeepSeek R1 Slim, for instance. But since it's in the business of making models smaller, it has focused extra attention on making the smallest yet most powerful models possible.

Its two new models are so small that they can bring chat AI capabilities to just about any IoT device and work without an internet connection, the company says. It humorously calls this family the Model Zoo because it names the products after animal brain sizes.

A model it calls SuperFly is a compressed version of Hugging Face's open-source model SmolLM2-135M. The original has 135 million parameters and was developed for on-device uses. SuperFly is 94 million parameters, which Orús likens to the size of a fly's brain. 'This is like having a fly, but a little bit more clever,' he said.

SuperFly is designed to be trained on very restricted data, like a device's operations. Multiverse envisions it embedded into home appliances, allowing users to operate them with voice commands like 'start quick wash' for a washing machine, or to ask troubleshooting questions. With a little processing power (like an Arduino), the model can handle a voice interface, as the company showed in a live demo for TechCrunch.

The other model, named ChickBrain, is larger at 3.2 billion parameters but is also far more capable and has reasoning capabilities. It's a compressed version of Meta's Llama 3.1 8B model, Multiverse says, yet it's small enough to run on a MacBook, no internet connection required.

More importantly, Orús said that ChickBrain actually slightly outperforms the original on several standard benchmarks, including the language-skill benchmark MMLU-Pro, the math benchmarks Math 500 and GSM8K, and the general-knowledge benchmark GPQA Diamond. Here are the results of Multiverse's internal tests of ChickBrain on those benchmarks:

[Chart: Multiverse Computing's ChickBrain benchmarks. Image credits: Multiverse Computing]

The company didn't offer benchmark results for SuperFly, but it also isn't targeting SuperFly at use cases that require reasoning.

It's important to note that Multiverse isn't claiming that its Model Zoo will beat the largest state-of-the-art models on such benchmarks; Zoo performances might not even land on the leaderboards. The point is that its tech can shrink model size without a performance hit, the company says.

Orús says the company is already in talks with all the leading device and appliance makers. 'We are talking with Apple. We are talking with Samsung, also with Sony and with HP, obviously. HP came as an investor in the last round,' he said. The round was led by well-known European VC firm Bullhound Capital, with participation from many others, including HP Tech Ventures and Toshiba.

The startup also offers compression tech for other forms of machine learning, like image recognition, and in six years has signed clients including BASF, Ally, Moody's, and Bosch. In addition to selling its models directly to major device manufacturers, Multiverse offers its compressed models via an API hosted on AWS that any developer can use, often at lower token fees than competitors.
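Multiverse has not published the details of CompactifAI, so the snippet below is only a minimal, generic sketch of the broader idea the article describes: tensor-network-style compression replaces one large weight tensor with a product of much smaller factors. The truncated-SVD factorization and the rank of 256 are illustrative assumptions, not Multiverse's method.

```python
import numpy as np

def low_rank_compress(W: np.ndarray, rank: int):
    """Replace a dense weight matrix with two thin factors (A @ B ~= W).

    Generic truncated-SVD factorization, used here only to illustrate the
    family of ideas behind tensor-network compression; it is NOT
    Multiverse's proprietary CompactifAI algorithm.
    """
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # shape (out_features, rank)
    B = Vt[:rank, :]             # shape (rank, in_features)
    return A, B

# Toy example: one 4096 x 4096 layer kept at an assumed rank of 256.
rng = np.random.default_rng(0)
W = rng.standard_normal((4096, 4096)).astype(np.float32)
A, B = low_rank_compress(W, rank=256)

original, compressed = W.size, A.size + B.size
print(f"parameters: {original:,} -> {compressed:,} "
      f"({100 * (1 - compressed / original):.0f}% fewer)")

# At inference time the layer computes A @ (B @ x) instead of W @ x.
x = rng.standard_normal(4096).astype(np.float32)
y = A @ (B @ x)
```

In practice such factorizations are applied layer by layer and usually followed by a short fine-tuning pass to recover accuracy; the article's claim is that Multiverse's quantum-inspired variant avoids the accuracy loss that simpler schemes tend to incur.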

Multiverse Computing Plans to Transform the AI Inference Market

Bloomberg

15-07-2025

  • Business
  • Bloomberg

Spanish AI startup Multiverse Computing says it has managed to compress Large Language Models (LLMs) by 95% without sacrificing performance. So far, Multiverse Computing has raised $215 million to scale its quantum-inspired AI model compression tool. CEO and co-founder Enrique Lizaso spoke to Bloomberg's Tom Mackenzie about his company's plans to improve energy efficiency and compete with larger AI providers. (Source: Bloomberg)

Multiverse raises $215m to advance CompactifAI technology

Yahoo

12-06-2025

  • Business
  • Yahoo

Spanish quantum software startup Multiverse Computing has raised €189m ($215m) in a Series B funding round to advance its CompactifAI technology. CompactifAI, developed throughout 2024 and now rolled out to initial customers, can reduce the size of Large Language Models (LLMs) by up to 95% while preserving performance.

The investment round was led by Bullhound Capital with participation from investors including HP Tech Ventures, SETT, Forgepoint Capital International, CDP Venture Capital and Santander Climate VC, as well as Quantonation, Toshiba, and Capital Riesgo de Euskadi - Grupo SPRI.

Multiverse said it plans to leverage the funding to support broader adoption of CompactifAI, targeting the $106bn AI inference market. The technology is claimed to address the high costs of running LLMs, which typically require specialised cloud-based infrastructure.

Multiverse Computing founder and CEO Enrique Lizaso Olmos said: 'With a unique syndicate of expert and strategic global investors on board and Bullhound Capital as lead investor, we can now further advance our laser-focused delivery of compressed AI models that offer outstanding performance with minimal infrastructure.'

Unlike traditional compression methods such as quantisation and pruning, which often degrade model performance, CompactifAI maintains original accuracy, achieves 4x-12x faster processing, and cuts inference costs by 50%-80%, according to the company. It enables compressed models to operate on cloud platforms, private data centres, or devices including PCs, phones, cars, drones, and Raspberry Pi. Compressed versions of Llama, DeepSeek, and Mistral models are available now, with more models expected soon.

The technology leverages tensor networks, a quantum-inspired approach to neural network simplification pioneered by Román Orús, Multiverse's co-founder and chief scientific officer. Orús said: 'For the first time in history, we are able to profile the inner workings of a neural network to eliminate billions of spurious correlations to truly optimise all sorts of AI models.'

At the end of 2024, Multiverse Computing received an investment from CDP Venture Capital, an Italian venture capital investor, as part of its Series A funding round. The investment was made through two compartments of the Corporate Partners I fund, ServiceTech and EnergyTech, which include participation from major Italian corporations such as Baker Hughes, BNL BNP Paribas, Edison, GPI, Italgas, Snam, and Terna Forward.

"Multiverse raises $215m to advance CompactifAI technology" was originally created and published by Verdict, a GlobalData owned brand.
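To put the 'up to 95%' figure and the Raspberry Pi claim in perspective, here is a rough back-of-the-envelope sizing in Python. The 8-billion-parameter model size and FP16 precision are assumptions for illustration; the article does not say which models reach the full 95% reduction or at what numeric precision.

```python
# Rough, illustrative sizing only: the article claims "up to 95%" size
# reduction and deployment on devices like a Raspberry Pi, but does not
# specify which models hit that figure or how the weights are stored.
params = 8_000_000_000          # e.g. a Llama 3.1 8B-class model (assumed)
bytes_per_param = 2             # assuming FP16/BF16 weights
original_gb = params * bytes_per_param / 1e9
compressed_gb = original_gb * (1 - 0.95)   # "up to 95%" smaller

print(f"original:   ~{original_gb:.1f} GB")
print(f"compressed: ~{compressed_gb:.1f} GB")
# ~16 GB -> ~0.8 GB, i.e. small enough to fit in a single-board computer's RAM.
```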

Multiverse Computing raises $215M for tech that could radically slim AI costs

Yahoo

12-06-2025

  • Business
  • Yahoo

On Thursday, Spanish startup Multiverse Computing announced that it has raised an enormous Series B round of €189 million (about $215 million) on the strength of a technology it calls CompactifAI.

CompactifAI is a quantum-computing-inspired compression technology that can reduce the size of LLMs by up to 95% without impacting model performance, the company says. Specifically, Multiverse offers compressed versions of well-known open-source LLMs, primarily small models such as Llama 4 Scout, Llama 3.3 70B, Llama 3.1 8B, and Mistral Small 3.1. It will soon release a version of DeepSeek R1, with more open-source and reasoning models coming; proprietary models from OpenAI and others are not supported.

Its 'slim' models, as the company calls them, are available on Amazon Web Services or can be licensed for on-premises use. The company says its models are 4x-12x faster than the comparable uncompressed versions, which translates to a 50%-80% reduction in inference costs. For instance, Multiverse says that its Llama 4 Scout Slim costs 10 cents per million tokens on AWS, compared with Llama 4 Scout's 14 cents.

The company says that some of its models can be made so small and energy-efficient that they could run on PCs, phones, cars, drones, and even the DIY enthusiast's favorite tiny PC, the Raspberry Pi. (We are suddenly imagining those fantastical Raspberry Pi Christmas-light houses upgraded with LLM-powered interactive talking Santas.)

Multiverse has some technical might behind it. It was co-founded by CTO Román Orús, a professor at the Donostia International Physics Center in San Sebastián, Spain. Orús is known for his pioneering work on tensor networks (not to be confused with all the AI-related things named Tensor at Google). Tensor networks are computational tools that mimic quantum computers but run on classical computers; one of their primary uses these days is compressing deep learning models.

Multiverse's co-founder and CEO, Enrique Lizaso Olmos, holds multiple mathematics degrees and has been a college professor, but he spent most of his career in banking and is best known as the former deputy CEO of Unnim Banc.

The Series B was led by Bullhound Capital (which has backed companies like Spotify, Revolut, Delivery Hero, Avito, and Discord) with participation from HP Tech Ventures, SETT, Forgepoint Capital International, CDP Venture Capital, Santander Climate VC, Toshiba, and Capital Riesgo de Euskadi - Grupo SPRI.

Multiverse says it has 160 patents and 100 customers globally, including Iberdrola, Bosch, and the Bank of Canada. With this funding, it has raised about $250 million to date.
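As a quick sanity check on the quoted AWS pricing, here is a minimal cost comparison in Python using the article's per-million-token figures. The monthly token volume is a hypothetical workload chosen for illustration, and real AWS billing will depend on usage details the article does not cover.

```python
# Per-million-token prices quoted in the article (USD).
PRICE_SLIM = 0.10   # Llama 4 Scout Slim on AWS
PRICE_FULL = 0.14   # Llama 4 Scout on AWS

# Hypothetical workload: 5 billion tokens per month (assumed, not from the article).
monthly_tokens = 5_000_000_000

cost_slim = monthly_tokens / 1_000_000 * PRICE_SLIM
cost_full = monthly_tokens / 1_000_000 * PRICE_FULL

print(f"slim model:     ${cost_slim:,.2f}/month")
print(f"original model: ${cost_full:,.2f}/month")
print(f"price saving:   {100 * (1 - cost_slim / cost_full):.0f}%")
# Note: this ~29% per-token price gap is separate from the company's
# 50%-80% inference-cost-reduction claim, which it attributes to the
# faster processing of the compressed models themselves.
```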
