
Breathe lands $21M Series B to predict battery performance
Few markets are moving as rapidly as China's automotive sector. There, new models are rolled out in as little as 18 months, putting tremendous pressure on legacy Western automakers, which need four-plus years to go from concept to sales floor.
'With the increasingly short development cycles in China, it's driving a huge amount of cost and time focus,' Ian Campbell, co-founder and CEO of Breathe Battery Technologies, told TechCrunch. 'In both geographies, in the East — in China and Asia — and in the West as well.'
Much of that focus has been centered around batteries — the components that can make or break electric vehicle sales. Automakers are forced to predict where the market will be a few years out, but those forecasts don't always pan out given how quickly the EV landscape is evolving.
Making changes to physical components can be expensive and unpredictable, which is why Campbell's startup has been trying to give batteries more flexibility via software.
Breathe has developed a suite of tools that Campbell said helps automakers and others get the most out of their batteries. The startup recently raised a $21 million Series B led by Kinnevik Online AB, the company exclusively told TechCrunch. Lowercarbon Capital and Volvo Cars Tech Fund participated.
The new funding will help Breathe continue to push its software earlier in the battery development process. The company currently has four products: Design, Model, Map, and Charge.
Charge was Breathe's first offering, and it optimized charging strategies to either shorten charge times or extend a battery's lifespan.
Chinese mobile phone maker Oppo was the first to adopt it, and the software cut charging time by 27%. On the automotive side, Volvo has Breathe's code installed on its forthcoming ES90 sedan, helping it charge from 10% to 80% in 20 minutes.
Though battery manufacturing is tightly controlled, no two cells that roll off the line are 100% identical. As a result, some might generate more heat during fast charging, while others might withstand more charge and discharge cycles than their peers. In essence, Breathe's software lets customers make the most of each cell given its individual quirks.
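Breathe hasn't published how Charge works under the hood, but the basic idea, charging each cell according to its own measured characteristics instead of a one-size-fits-all profile, can be sketched in a few lines of Python. Everything below (the names, the numbers, and the derating heuristic) is a hypothetical illustration, not Breathe's algorithm.

from dataclasses import dataclass

@dataclass
class CellProfile:
    # Hypothetical per-cell measurements taken at end-of-line or in service.
    capacity_ah: float               # usable capacity in amp-hours
    internal_resistance_mohm: float  # higher resistance means more heat while charging
    max_temp_c: float                # thermal ceiling set by the pack designer

def charge_current_limit(cell: CellProfile, ambient_c: float = 25.0) -> float:
    """Toy heuristic: start from a nominal 3C fast-charge rate and derate it
    for cells with less thermal headroom or higher internal resistance."""
    nominal_c_rate = 3.0
    headroom = max(0.0, min(1.0, (cell.max_temp_c - ambient_c) / 30.0))
    resistance_derate = 20.0 / max(cell.internal_resistance_mohm, 20.0)
    return nominal_c_rate * headroom * resistance_derate * cell.capacity_ah

# Two cells from the same production line, with slightly different quirks:
print(charge_current_limit(CellProfile(5.0, 24.0, 45.0)))  # weaker cell gets a gentler ~8.3 A
print(charge_current_limit(CellProfile(5.0, 20.0, 45.0)))  # healthier cell gets the full 10.0 A

A real battery-management system would work from live voltage and temperature telemetry rather than a static profile, but the trade the snippet encodes, gentler charging for weaker cells, is the same one Campbell describes.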
The startup's other offerings help automakers and electronics companies design and predict how their batteries will perform years down the line, letting them determine where to invest development resources. For example, if a new chemistry is lower cost and looks to have a longer lifespan, then designers may decide to let it charge a little faster at the expense of some of that longevity.
'They want to understand what room they have and what will happen when they make trade-offs throughout the development program of their battery system,' Campbell said.
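As a rough worked example of the kind of headroom question Campbell describes, with entirely made-up numbers rather than a real degradation model, a designer might check whether a longer-lived chemistry still clears a program's cycle-life target once it is pushed to charge faster:

def estimated_cycle_life(base_cycles: float, charge_c_rate: float) -> float:
    # Toy rule of thumb: assume each 1C of charge rate above 1C costs 20% of cycle life.
    penalty = 0.8 ** max(0.0, charge_c_rate - 1.0)
    return base_cycles * penalty

target_cycles = 1500                                               # what the vehicle program needs
current_chemistry = estimated_cycle_life(2000, charge_c_rate=1.0)  # 2,000 cycles at a gentle 1C
new_chemistry = estimated_cycle_life(3000, charge_c_rate=2.5)      # roughly 2,150 cycles at 2.5C

print(current_chemistry >= target_cycles, new_chemistry >= target_cycles)  # True True

In this toy case the longer-lived chemistry still meets the target even when charged two and a half times as fast; quantifying that kind of headroom before the hardware is locked in is the question Breathe's modeling tools are aimed at.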
To do that, Breathe has built a lab in London where it can run a range of tests on batteries its customers are interested in using. In as little as four weeks, it has enough data to ship the customer a model (called Breathe Model) that can simulate likely future performance.
After that, the cells stay on in the lab, contributing more data so that Breathe can eventually ship the customer its Map product, which augments simulated data with more real-world results, Campbell said. The Design product will round out the suite when it's released in the coming months, providing customers with a set of software tools to speed — you guessed it — battery design.
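Breathe hasn't detailed the math inside Model or Map, but the workflow Campbell describes, fitting a predictive model to a few weeks of lab data and then refining it as the cells keep cycling, might look loosely like the following sketch; the square-root fade curve and all of the numbers are illustrative assumptions.

import numpy as np

def fit_fade_rate(cycles: np.ndarray, capacity: np.ndarray) -> float:
    """Least-squares fit of capacity ≈ 1 - k * sqrt(cycles); returns k."""
    x = np.sqrt(cycles)
    return float(np.sum(x * (1.0 - capacity)) / np.sum(x * x))

def predict_capacity(k: float, cycles: np.ndarray) -> np.ndarray:
    return 1.0 - k * np.sqrt(cycles)

# Four weeks of accelerated lab testing (synthetic data): ~200 cycles of fractional capacity.
rng = np.random.default_rng(0)
early_cycles = np.arange(1, 201)
early_capacity = 1.0 - 0.004 * np.sqrt(early_cycles) + rng.normal(0, 0.002, 200)

k = fit_fade_rate(early_cycles, early_capacity)
print("predicted capacity after 2,000 cycles:", predict_capacity(k, np.array([2000.0]))[0])

# As the same cells keep cycling, the longer record is simply re-fitted,
# loosely analogous to Map layering real-world results on top of Model's simulation.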
The goal is to reduce the amount of 'brute force lab testing' needed to bring a battery to market, Campbell said. He likens Breathe's software tools to those used in the semiconductor industry, which have helped companies like Apple and Nvidia work closely with foundries like TSMC to implement their processor designs in silicon.
'We want to try and do for batteries what we've seen the simulation software from Cadence and Synopsys do so effectively in semiconductor design,' he said.