Orbital Intelligence: When Satellites Meet Machine Learning
How BCG X, research institutions, and space agencies are using generative AI to supercharge weather forecasting with the GAIA Foundation Model.
The 20,000-Foot (or Mile?) View
Here's a fact that almost everyone on the planet is becoming increasingly familiar with: As the Earth's climate warms and its weather systems become less reliable, so do the weather prediction capabilities underpinning the business practices of countless agriculture, insurance, public safety, and scientific research organizations around the world.
Here's a less obvious fact: As those prediction capabilities deteriorate, so do many of the public and private services we take for granted every day.
'Having better, more reliable, more detailed intelligence about what's going on in the weather system has a lot to do with who's going to win and lose in financial industries like insurance and lending, in infrastructure sectors like energy, and places like state and local government,' says David Potere, geospatial tech leader and BCG X managing director and partner. 'As an example, the way we characterize risk affects the homes we buy, the businesses we invest in, the cities that grow or don't. And right now, there is a known gap in the insurance industry being able to cover a rapidly changing game board.'
The impact that this weather intelligence gap—and countless other gaps like it—has on organizational margins can trickle down to consumers in harsh ways. New volatility in the climate system can manifest as extended droughts and high winds that fuel record-breaking wildfires, or back-to-back 100-year storms that cause property damage at massive scales. A societal inability to forecast those kinds of events drives up our insurance rates, undercuts public safety measures, and strains governmental relief efforts.
The world needed a novel solution to a rapidly growing problem. The BCG X AI Science Institute may have found it alongside a growing new class of gen AI-powered weather models.
Turning to Eyes in the Sky
Enter the GAIA (Geospatial Artificial Intelligence for Atmospheres) Foundation Model, an open source foundation model built through a partnership between the BCG X AI Science Institute and several of the world's leading aerospace organizations to help researchers across the world better understand and anticipate weather's next move.
Similar to large language models (LLMs) trained on text, the GAIA Foundation Model is a gen AI vision model trained on 25 years of satellite imagery, allowing researchers to study climate and weather patterns with greater speed and accessibility than ever before. Specifically, GAIA works with images from a constellation of school bus-sized satellites that 'stare' at the planet from a stationary position more than 22,000 miles above the surface, capturing high-resolution images of the entire 'disk' of the Earth every 30 minutes. This provides a continuous, real-time stream of images and atmospheric data. Combined with readings from a global array of thousands of hyper-detailed weather ground stations, those images let meteorologists visually map weather developments in near real time across the entire globe.
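How does a 'gen AI vision model' learn from raw imagery? One common pretraining approach for vision foundation models is masked image modeling: hide patches of each image and train a network to reconstruct them, an image-side analogue of how language models learn by predicting missing text. The article does not detail GAIA's exact recipe, so the tiny architecture, patch size, and masking ratio below are purely illustrative assumptions, not the team's published method.

```python
# Illustrative sketch only: masked image modeling, a common way to pretrain
# vision foundation models. All shapes and hyperparameters are assumptions.
import torch
import torch.nn as nn

PATCH, DIM, MASK_RATIO = 16, 256, 0.75

class TinyMaskedAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(PATCH * PATCH, DIM)          # patch -> token
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.decode = nn.Linear(DIM, PATCH * PATCH)          # token -> pixels

    def forward(self, patches, mask):
        tokens = self.embed(patches)
        # Zero out the tokens of masked patches so the model must infer them
        tokens = torch.where(mask[..., None], torch.zeros_like(tokens), tokens)
        return self.decode(self.encoder(tokens))

# One training step on a fake, single-band satellite tile
model = TinyMaskedAutoencoder()
patches = torch.randn(8, 64, PATCH * PATCH)                  # batch of 64-patch tiles
mask = torch.rand(8, 64) < MASK_RATIO                        # hide ~75% of patches
recon = model(patches, mask)
loss = ((recon - patches)[mask]).pow(2).mean()               # reconstruct hidden patches
loss.backward()
```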
'There are naturally gaps in this record, including a 'soft spot' when it comes to tracking weather in polar regions,' says Potere. 'What we're talking about is investments on the ground through generative AI capabilities like GAIA that have the potential to unlock a synthetic fourth satellite constellation.'
That kind of visualization capability, produced via open source gen AI technology, is groundbreaking on its own. But the setup behind that tooling is equally innovative.
Consider compute power: Depending on the bands and mosaicking process, global satellite imagery can clock in at 3298 x 9896 pixels (and more), and a 15-year span of data measured every 30 minutes yields roughly 263,000 images—more than the total frames in a typical Hollywood feature-length film. That's 17 TB of data to be crunched per training session for the GAIA model. The team is also working with live weather data, tapping into the same operational satellites that weather forecasters use on the news at night. These foundation model approaches require a lot of GPUs—a common reason why vision-based gen AI tools have traditionally been a lesser-explored space.
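For a rough sense of that scale, a back-of-the-envelope calculation from the figures above (one full-disk mosaic every 30 minutes over 15 years) reproduces the image count; the bytes-per-pixel value is an assumption chosen for illustration, not a published specification.

```python
# Back-of-the-envelope estimate of the GAIA training data volume, using the
# figures quoted in the article. Bytes per pixel is an assumption (band
# selection and numeric precision vary by product).

IMAGES_PER_DAY = 24 * 2              # one full-disk mosaic every 30 minutes
YEARS = 15
DAYS = round(YEARS * 365.25)

num_images = IMAGES_PER_DAY * DAYS
print(f"Images: {num_images:,}")     # ~263,000, matching the article

WIDTH, HEIGHT = 3298, 9896           # pixels per mosaic (varies by product)
BYTES_PER_PIXEL = 2                  # assumed average; e.g. one 16-bit band

total_bytes = num_images * WIDTH * HEIGHT * BYTES_PER_PIXEL
print(f"Approx. volume: {total_bytes / 1e12:.1f} TB")   # ~17 TB, as in the article
```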
'Up until now, the sheer compute and the algorithms and the know-how you need to be able to translate pixels into answers has been very rare,' Potere says.
Tackling the Compute Problem
BCG X made two conscious decisions when scoping the endeavor that not only proved novel but also allowed the team to bring the project online in just one year, rather than the 18 to 24 months typical of comparable efforts.
The first was to commit to creating an environment that could be deployed in the cloud, rather than being tethered to a purpose-built supercomputer. According to Tom Berg, BCG X lead engineer for the project, 'There was something really daunting here; it's almost become an accepted truth that to roll up your sleeves and build your own foundational model is too expensive, if you look at the immense resources the hyperscalers are using. One of the things we wanted to show is that if it works, you don't have to have a dedicated supercomputer to do these kinds of builds.'
To that end, the GAIA team turned to the National Research Platform (NRP), a national network of university computing resources distributed across the United States. This constellation of off-the-shelf GPUs (ranging from state-of-the-art hardware to 10-year-old cards) is precisely what BCG X's development team had in mind.
'That profile, rather than matching a supercomputer, gave us a lot of parameters to work with,' Berg says. 'It's a very, very adaptable system, and at one point we were using 15 percent of the NRP's entire cloud.'
Still, such a setup presented some interesting challenges. Where a dedicated supercomputer has all of its processing power in one building with one uniform power configuration, Berg and Potere's team would instead be connecting GPUs on opposite sides of the Earth. There were also acute issues like power outages, or a university unexpectedly cycling its data centers. Crucially, GAIA was sharing compute space with hundreds of other research applications running at the same time. 'You're basically on a busy public road rather than a dedicated racetrack,' Berg says.
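On shared, preemptible infrastructure like that, the standard defensive pattern is to checkpoint aggressively and resume from the last checkpoint whenever a node disappears. The sketch below shows that generic pattern in PyTorch; it is not GAIA's actual training code, and the model, data, and checkpoint path are placeholders.

```python
# Minimal sketch of preemption-tolerant training on shared, unreliable
# compute: checkpoint frequently and resume from the last checkpoint on
# restart. Generic pattern only; model, data, and path are placeholders.
import os
import torch

CKPT = "checkpoint.pt"

model = torch.nn.Linear(64, 1)                 # stand-in for the real model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
start_step = 0

# Resume if a previous run was cut short (node preemption, power cycle, ...)
if os.path.exists(CKPT):
    state = torch.load(CKPT)
    model.load_state_dict(state["model"])
    optimizer.load_state_dict(state["optimizer"])
    start_step = state["step"] + 1

for step in range(start_step, 10_000):
    x = torch.randn(32, 64)                    # placeholder batch
    loss = model(x).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    if step % 100 == 0:                        # checkpoint often; restarts are cheap
        torch.save(
            {"model": model.state_dict(),
             "optimizer": optimizer.state_dict(),
             "step": step},
            CKPT,
        )
```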
The team's second operational decision was to initially narrow their focus to precipitation and top-of-cloud temperature data—as opposed to modeling all aspects of every layer of the atmosphere. Because that selected data closely corresponds to a range of weather phenomena, it provided researchers with the flexibility needed to prove out the foundation model and run experiments at a still-manageable level of initial effort.
Critically, by focusing on this 'lower complexity' problem statement, the GAIA team was able to immediately scale their modeling to global atmospheric conditions, putting a dent in the problem of weather predictability. That surgically targeted start allowed the team to reach an equally targeted—yet incredibly meaningful—outcome.
Building a Global Toolset and Modeling Solutions
A key reason the team was able to build so rapidly: open source tools and resources, often combined with earlier work from equally pioneering research teams.
'We have the benefit of standing on the shoulders of some of the earliest groups working on this,' Potere says. 'Even now, the literature has gotten three times denser since we started, but there was something of a literature, so we certainly benefited from the second mover advantage.'
That open source, iterative mindset will now define the project's next phase, as well: To give back to the research community and contribute to its ever-evolving toolset, BCG X and their collaborators released the GAIA Foundation Model to the global open source community. In other words, they modeled the Earth, for the Earth.
And their work couldn't have come at a better time. As governments, businesses, and research institutions increasingly grapple with the new normal of disruptive weather volatility, gen AI weather and environmental models like GAIA can fuel faster and better decision making—something experts and organizations across the world need more every day.
Climate change may very well be the defining issue of our time—and GAIA may very well be part of how the world as a whole is able to meet it.
Learn more about Boston Consulting Group here.