
Is the moon about to go nuclear?

Fast Company · a day ago
NASA wants to build a nuclear reactor on the surface of the moon—and fast.
Transportation Secretary Sean Duffy, who is also serving as NASA's interim administrator, has issued new directives to speed up the agency's timeline for a fission reactor designed to power human activity on the moon, even as the space agency faces deep cuts to other parts of its budget.
The plot beat is straight out of sci-fi, but it's also key to unlocking humanity's future on the moon, where two-week-long lunar nights make relying on solar energy a challenge. 'To properly advance this critical technology to be able to support a future lunar economy, high power energy generation on Mars, and to strengthen our national security in space, it is imperative the agency move quickly,' Duffy wrote in an agency directive sent out late last week.
Under Duffy's more aggressive plan, NASA has been directed to put out a broad call encouraging private companies to craft designs for a powerful 100-kilowatt reactor that could be ready to go by 2030. Politico first reported the expedited plan for a lunar reactor, which a senior NASA official characterized as a priority for 'winning the second space race.'
NASA's lunar fission dreams
For a few years now, the U.S. space agency has been working on designs for an energy source that could power lunar development. In 2022, NASA doled out three $5 million contracts for concept designs for small nuclear fission reactors that could be used on the moon and adapted for future Mars exploration.
Those designs each weighed under six metric tons and were capable of producing 40 kilowatts of electricity, 'ensuring enough for demonstration purposes and additional power available for running lunar habitats, rovers, backup grids, or science experiments.'
'A demonstration of a nuclear power source on the Moon is required to show that it is a safe, clean, reliable option,' Trudy Kortes, director of Technology Demonstrations for NASA's Space Technology Mission Directorate, said last year of the initiative, called the Fission Surface Power Project. 'The lunar night is challenging from a technical perspective, so having a source of power such as this nuclear reactor, which operates independent of the Sun, is an enabling option for long-term exploration and science efforts on the Moon.'
Other nations' plans
The U.S. isn't alone in its ambitions for a sustainable source of power on the moon—nor is its timing a coincidence.
Russia and China are working together on a joint lunar program that could include building a nuclear reactor on the moon's surface as soon as 2033. Yuri Borisov, the former head of Russia's space agency Roscosmos, said last year that Russia was 'seriously considering a project, somewhere at the turn of 2033-2035, to deliver and install a power unit on the lunar surface together with our Chinese colleagues.'
The project is viewed as a precursor to lunar colonization for both countries, enabling power production greater than what a solar array could generate. While that plan appears intact, Borisov was fired from his position earlier this year. Russia still has ambitious plans for exploring the moon on an aggressive timeline, but the country's space program suffered a devastating setback when its first moon mission in almost 50 years smashed into the lunar surface.
Beyond China, Russia is also partnering with North Korea, another U.S. adversary, on space. Former U.S. Secretary of State Antony Blinken warned in January that Russia planned to share advanced satellite and other space technology with Pyongyang, an extension of the new military alliance between the two countries. The Biden administration maneuvered to block Russian plans to place an anti-satellite nuclear weapon in orbit, though that weapons program may be derailed for now after suffering a technical failure in April.

Related Articles

Google takes on ChatGPT's Study Mode with new 'Guided Learning' tool in Gemini

Yahoo · 15 minutes ago

As the new school year approaches, Google announced on Wednesday that it's launching a new tool called Guided Learning within Gemini. The tool sort of functions like an AI tutor, as it's designed to help users build a deep understanding instead of just getting answers.

The launch comes just over a week after OpenAI rolled out Study Mode for ChatGPT, which is also designed to go beyond simply providing answers and instead help users develop critical thinking skills. Both companies' launches come amid concerns that AI chatbots undermine the learning process because they spit out direct answers. The new tools from Google and OpenAI likely aim to address these concerns by positioning their chatbots as learning tools rather than simple answer engines.

With Guided Learning, Gemini will break down problems step by step and adapt explanations to its users' needs. The feature responds using images, diagrams, videos, and interactive quizzes to help users build and test their knowledge, rather than simply giving them the answer. Google says the feature will help users uncover the 'why' and 'how' behind concepts.

'Whether you're preparing for an exam about enzymes, starting the first draft of a paper on the importance of bee populations in supporting our food systems, or exploring your passion for photography, Guided Learning is a collaborative thinking partner that helps you get it — each step of the way,' wrote Maureen Heymans, Google's VP of Learning and Sustainability, in a blog post.

In addition to the new feature, Google announced that it's working to make Gemini as a whole better equipped to help users learn. Gemini will now automatically incorporate images, diagrams, and YouTube videos directly into responses to help users better understand complex topics. Plus, users can now also ask Gemini to create flashcards and study guides based on their quiz results or other class materials.

Google also announced on Wednesday that it's offering students in the U.S., Japan, Indonesia, Korea, and Brazil a free one-year subscription to Google's AI Pro plan. The plan includes expanded access to Gemini 2.5 Pro, NotebookLM, Veo 3, Deep Research, and more.

123Invent Inventor Develops Quick Cartridge Replacement Tool (OSK-1214)

Yahoo · 15 minutes ago

PITTSBURGH, Aug. 6, 2025 /PRNewswire/ - "As a plumber, I thought there could be a better way to pull a 1300, 1400, 1700, or 10000 series cartridge from the shower/tub rough-in valve," said an inventor from Alamogordo, N.M., "so I invented the QUICK CARTRIDGE REPLACEMENT TOOL. My design turns a 30-60 minute plumbing job into a 30-second job."

The invention provides an improved way to pull a 1300, 1400, 1700, or 10000 series cartridge from the shower/tub rough-in valve. In doing so, it saves time and effort. It also has fewer moving parts and less material than traditional tools. The invention features a simple and reliable design that is easy to use, so it is ideal for plumbers.

The QUICK CARTRIDGE REPLACEMENT TOOL is currently available for licensing or sale to manufacturers or marketers. For more information, visit Or contact Stephen Fulmer at 575-214-9454 or email info@ View original content to download multimedia: SOURCE InventHelp

Nvidia's 'most underappreciated' business is taking off like a 'rocket ship'

Yahoo · 15 minutes ago

When Nvidia (NVDA) reports its second quarter earnings on Aug. 27, investors will focus squarely on the company's data center results. After all, that's where the chip giant realizes revenue on the sale of its high-powered AI processors. But the Data Center segment includes more than just chip sales. It also accounts for some of Nvidia's most important, though often overlooked, offerings: its networking technologies.

Composed of its NVLink, InfiniBand, and Ethernet solutions, Nvidia's networking products are what allow its chips to communicate with each other, let servers talk to each other inside massive data centers, and ultimately ensure end users can connect to it all to run AI applications.

'The most important part in building a supercomputer is the infrastructure. The most important part is how you connect those computing engines together to form that larger unit of computing,' explained Gilad Shainer, senior vice president of networking at Nvidia.

That also translates into some big sales. Nvidia's networking sales accounted for $12.9 billion of its $115.1 billion in data center revenue in its prior fiscal year. That might not seem impressive when you consider that chip sales brought in $102.1 billion, but it eclipses the $11.3 billion that Nvidia's second-largest segment, Gaming, took in for the year. In Q1, networking made up $4.9 billion of Nvidia's $39.1 billion in data center revenue. And it'll continue to grow as customers continue to build out their AI capacity, whether that's at research universities or massive data centers.

'It is the most underappreciated part of Nvidia's business, by orders of magnitude,' Deepwater Asset Management managing partner Gene Munster told Yahoo Finance. 'Basically, networking doesn't get the attention because it's 11% of revenue. But it's growing like a rocket ship.'

Connecting thousands of chips

When it comes to the AI explosion, Nvidia senior vice president of networking Kevin Deierling says the company has to work across three different types of networks. The first is its NVLink technology, which connects GPUs to each other within a server or multiple servers inside of a tall, cabinet-like server rack, allowing them to communicate and boost overall performance. Then there's InfiniBand, which connects multiple server nodes across data centers to form what is essentially a massive AI computer. Finally, there's the front-end network for storage and system management, which uses Ethernet connectivity.

'Those three networks are all required to build a giant AI-scale, or even a moderately sized enterprise-scale, AI computer,' Deierling explained.

The purpose of all of these various connections isn't just to help chips and servers communicate, though. They're also meant to allow them to do so as fast as possible. If you're trying to run a series of servers as a single unit of computing, they need to talk to each other in the blink of an eye. A lack of data going to GPUs slows the entire operation, delaying other processes and impacting the overall efficiency of an entire data center.

'[Nvidia is a] very different business without networking,' Munster explained. 'The output that the people who are buying all the Nvidia chips [are] desiring wouldn't happen if it wasn't for their networking.'

And as companies continue to develop larger AI models and autonomous and semi-autonomous agentic AI capabilities that can perform tasks for users, making sure those GPUs work in lockstep with each other becomes increasingly important. That's especially true as inferencing, or running AI models, requires more powerful data center systems.

Inferencing powers up

The AI industry is in the midst of a broad reordering around the idea of inferencing. At the onset of the AI explosion, the thinking was that training AI models would require hugely powerful AI computers and that actually running them would be somewhat less power-intensive. That led to some trepidation on Wall Street earlier this year, when DeepSeek claimed that it trained its AI models on Nvidia chips that fall below the company's top-of-the-line offerings. The thinking at the time was that if companies could train and run their AI models on underpowered chips, then there was no need for Nvidia's pricey high-powered systems.

But that narrative quickly flipped as chip companies pointed out that those same AI models benefit from running on powerful AI computers, allowing them to reason over more information more quickly than they would while running on less-advanced systems.

'I think there's still a misperception that inferencing is trivial and easy,' Deierling said. 'It turns out that it's starting to look more and more like training as we get to [an] agentic workflow. So all of these networks are important. Having them together, tightly coupled to the CPU, the GPU, and the DPU [data processing unit], all of that is vitally important to make inferencing a good experience.'

Nvidia's rivals are, however, circling. AMD is looking to grab more market share from the company, and cloud giants like Amazon, Google, and Microsoft continue to develop their own AI chips. Industry groups also have their own competing networking technologies, including UALink, which is meant to go head-to-head with NVLink, explained Forrester analyst Alvin Nguyen. But for now, Nvidia continues to lead the pack. And as tech giants, researchers, and enterprises continue to battle over Nvidia's chips, the company's networking business is all but guaranteed to keep growing as well.

Email Daniel Howley at dhowley@ Follow him on X/Twitter at @DanielHowley.
