Scientists debut revolutionary battery built to deliver energy for lifetimes: 'Advancements in this technology could reshape the future'
Experts at Northwest Normal University in China have unveiled an uncommon invention: a small nuclear battery that can power pacemakers and other tiny tech, according to multiple outlets.
The fascinating invention could theoretically provide electricity for hundreds to thousands of years, according to the reports. The news made headlines after a slew of tests, including 35,000 LED pulses, successful integration into Bluetooth chips for signal transmission, and even powering a clock, per Interesting Engineering and pv magazine.
The applications could be vast. IE added that deep-sea and outer-space uses are possible for nuclear batteries, powering sensors and small gadgets. On Earth, imagine smartphones that never die or tech that helps to keep us alive.
"The researchers are confident that the battery could permanently power implantable devices like pacemakers or brain-computer interfaces," IE's Ameya Paleja wrote.
Nuclear batteries leverage the power of radioactive materials, which at first seems like a dangerous idea when considered for use inside the body. But a news release about similar research in South Korea noted that not all nuclear elements harm living organisms, and radiation can be blocked if the material is properly encased. Power is generated from the decay energy of radioactive isotopes, according to the experts' descriptions.
The Korean scientists built a prototype pack with carbon-14, the same atomic material used in China, where experts encased it in a silicon carbide semiconductor material. This prevents leaks and ensures safety. The Chinese battery is called Candle Dragon One, per pv.
"Nuclear battery technology represents the next generation of micro-power solutions, driving transformation in advanced manufacturing, national security, and aerospace applications," Beita Pharmatech chairman Li Gang said in the pv story. Beita collaborated with Northwest Normal University on the project.
Candle Dragon One's energy conversion efficiency clocked in at 8%, with 10 times the storage capacity per pound of lithium-ion packs. The units can work across an astounding temperature range of minus-148 degrees to 392 degrees Fahrenheit, pv added.
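For readers who think in metric, that reported operating range works out to minus-100 to 200 degrees Celsius. A quick sketch of the conversion (the figures are the ones pv magazine reported; the function is just standard arithmetic):

```python
def f_to_c(fahrenheit: float) -> float:
    """Convert a temperature from Fahrenheit to Celsius."""
    return (fahrenheit - 32) * 5 / 9

# Candle Dragon One's reported operating range, per pv magazine
low_f, high_f = -148, 392
print(f_to_c(low_f))   # -100.0
print(f_to_c(high_f))  # 200.0
```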
It's not the only nuclear battery being developed in China, as experts at Soochow University are working on one involving the element americium. U.S. researchers are also making packs with radioactive parts. Experts elsewhere even have ideas for a theoretical black hole battery.
The small storage units could play a big role as we shift to cleaner energy, which is key to reducing harmful planet-warming dirty fuel emissions. Air pollution is not only an overheating force; it has also been linked to brain damage and dementia.
Larger nuclear projects already provide about 9% of global electricity, according to the World Nuclear Association. That power is generated without air pollution but produces radioactive waste that needs to be safely stored. There's also the risk of rare yet catastrophic meltdowns.
For the smaller batteries, experts have more milestones to achieve before commercial use can happen, as the power output is low, per pv.
"While currently limited to niche applications, advancements in this technology could reshape the future of energy storage," the magazine's Vincent Shaw wrote.
In the meantime, you can reshape energy use in your home immediately by switching out traditional bulbs for better LEDs. The move could save you up to $600 in energy costs annually while preventing five times the pollution of the old bulbs.
Join our free newsletter for weekly updates on the latest innovations improving our lives and shaping our future, and don't miss this cool list of easy ways to help yourself while helping the planet.