
Why FEMA's flood maps often miss dangerous flash flood risks
The same region of Texas Hill Country where a flash flood on July 4 killed more than 130 people was hit again with downpours a week later, forcing searchers to temporarily pause their efforts to find missing victims. Other states, including New Mexico, Oklahoma, Vermont, and Iowa, also saw flash flood damage in July.
The U.S. Federal Emergency Management Agency's flood maps are intended to be the nation's primary tool for identifying flood risks. Originally developed in the 1970s to support the National Flood Insurance Program, these maps, known as Flood Insurance Rate Maps, or FIRMs, are used to determine where flood insurance is required for federally backed mortgages, to inform local building codes and land-use decisions, and to guide flood plain management strategies.
In theory, the maps enable homeowners, businesses, and local officials to understand their flood risk and take appropriate steps to prepare and mitigate potential losses.
But while FEMA has improved the accuracy and accessibility of the maps over time with better data, digital tools, and community input, the maps still don't capture everything—including the changing climate. There are areas of the country that flood, some regularly, that don't show up on the maps as at risk.
I study flood-risk mapping as a university-based researcher and at First Street, an organization created to quantify and communicate climate risk. In a 2023 assessment using newly modeled flood zones with climate-adjusted precipitation records, we found that more than twice as many properties across the country were at risk of a 100-year flood than the FEMA maps identified.
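To put that "100-year flood" label in perspective: it describes a flood with a 1% chance of occurring in any given year, not one that arrives on a century schedule. A short calculation, assuming each year's 1% risk is independent, shows why that still adds up over the life of a typical mortgage:

```python
# Chance of seeing at least one "100-year" flood over a mortgage term.
# A 100-year flood has a 1% chance of occurring in any given year, so over
# n independent years the probability of at least one event is 1 - (1 - p)^n.
annual_prob = 0.01          # 1% annual exceedance probability
years = 30                  # typical mortgage term

prob_at_least_one = 1 - (1 - annual_prob) ** years
print(f"Chance of at least one 100-year flood in {years} years: {prob_at_least_one:.0%}")
# -> roughly 26%
```

In other words, a property sitting just inside the 100-year flood zone faces roughly a one-in-four chance of flooding at least once over a 30-year mortgage.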
Even in places where the FEMA maps identified a flood risk, we found that the federal mapping process, its overreliance on historical data, and political influence over the updating of maps can lead to maps that don't fully represent an area's risk.
What FEMA flood maps miss
FEMA's maps are essential tools for identifying flood risks, but they have significant gaps that limit their effectiveness.
One major limitation is that they don't consider flooding driven by intense bursts of rain. The maps primarily focus on river channels and coastal flooding, largely excluding the risk of flash flooding, particularly along smaller waterways such as streams, creeks, and tributaries.
This limitation has become more important in recent years due to climate change. Rising global temperatures can produce more frequent and more intense bursts of rainfall, leaving more areas vulnerable to flooding, yet unmapped by FEMA.
For example, when flooding from Hurricane Helene hit unmapped areas around Asheville, North Carolina, in 2024, it caused a huge amount of uninsured damage to properties.
Even in areas that are mapped, such as the Camp Mystic site in Kerr County, Texas, which was hit by a deadly flash flood on July 4, 2025, the maps may underestimate the risk because they rely on historical data and outdated risk assessments.
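One way to see how a reliance on historical rainfall records understates today's risk: warmer air can hold roughly 7% more moisture per degree Celsius of warming, a relationship (the Clausius-Clapeyron relation) often used as a rule of thumb for "climate-adjusting" historical design storms. The sketch below applies that scaling to a hypothetical historical design rainfall; the numbers are illustrative, and this is not FEMA's or First Street's actual methodology.

```python
# Simplified Clausius-Clapeyron scaling of a design storm.
# Warmer air can hold ~7% more water vapor per degree C, a common rule of
# thumb for adjusting historical rainfall totals to a warmer climate.
# The numbers below are illustrative, not from any official rainfall atlas.
historical_design_rainfall_mm = 150.0   # hypothetical 100-year, 24-hour storm
warming_deg_c = 1.5                     # assumed warming since the record period
scaling_per_deg_c = 0.07                # ~7% per degree C

adjusted_rainfall_mm = historical_design_rainfall_mm * (1 + scaling_per_deg_c) ** warming_deg_c
print(f"Climate-adjusted design rainfall: {adjusted_rainfall_mm:.0f} mm")
# -> about 166 mm, roughly 11% more than the historical estimate
```

A map calibrated only to the historical record quietly bakes in the smaller number.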
Political influence can fuel long delays
Additionally, FEMA's mapping process is often shaped by political pressures.
Local governments and developers sometimes fight high-risk designations to avoid insurance mandates or restrictions on development, leading to maps that may understate actual risks and leave residents unaware of their true exposure.
An example is New York City's appeal of a 2015 FEMA Flood Insurance Rate Maps update. The delay in resolving the city's concerns has left it with maps that are roughly 20 years old, and the current mapping project is tied up in legal red tape.
On average, it takes five to seven years to develop and implement a new FEMA Flood Insurance Rate Map. As a result, many maps across the U.S. are significantly out of date, often failing to reflect current land use, urban development, or evolving flood risks from extreme weather.
This delay directly affects building codes and infrastructure planning, as local governments rely on these maps to guide construction standards, development approvals, and flood mitigation projects. Ultimately, outdated maps can lead to underestimating flood risks and allowing vulnerable structures to be built in areas that face growing flood threats.
How technology advances can help
New advances in satellite imaging, rainfall modeling, and high-resolution lidar, which is similar to radar but uses light, make it possible to create faster, more accurate flood maps that capture risks from extreme rainfall and flash flooding.
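To make the lidar point concrete, the sketch below runs a minimal "bathtub" comparison between a lidar-derived elevation grid and an estimated flood water-surface elevation. Real flood mapping layers hydraulic and rainfall-runoff modeling on top of terrain data, so treat this only as an illustration of how higher-resolution elevation data translates into finer-grained flood extents; the grid and water level are made up.

```python
import numpy as np

# Minimal "bathtub" flood check on a lidar-derived elevation grid.
# Real flood mapping adds hydraulic and rainfall-runoff modeling; this only
# shows how high-resolution terrain data translates into mapped flood depth.
terrain_m = np.array([
    [102.0, 101.5, 101.0],
    [101.2, 100.4,  99.8],
    [100.9,  99.5,  99.1],
])                              # hypothetical lidar elevations (meters)
water_surface_m = 100.6         # estimated flood water-surface elevation

depth_m = np.clip(water_surface_m - terrain_m, 0, None)   # depth where water tops the ground
flooded = depth_m > 0

print("Flood depth (m):\n", depth_m.round(2))
print("Cells flooded:", int(flooded.sum()), "of", flooded.size)
```

The finer the elevation grid, the more precisely small streams, swales, and low-lying lots show up as flood-prone, which is exactly what the current maps tend to miss.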
However, fully integrating these tools requires significant federal investment. Congress controls FEMA's mapping budget and sets the legal framework for how maps are created. For years, updating the flood maps has been an unpopular topic among many publicly elected officials, because new flood designations can trigger stricter building codes, higher insurance costs, and development restrictions.
In recent years, the rise of climate-risk analytics models and private flood-risk data has allowed the real estate, finance, and insurance industries to rely less on FEMA's maps. These newer models incorporate forward-looking climate data, including projections of extreme rainfall, sea-level rise, and changing storm patterns, factors that FEMA's maps generally exclude.
Real estate portals like Zillow, Redfin, Realtor.com, and Homes.com now provide property-level flood risk scores that consider both historical flooding and future climate projections. The models they use identify risks for many properties that FEMA maps don't, highlighting hidden vulnerabilities in communities across the U.S.
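As a purely hypothetical illustration of what a property-level score blending historical and projected risk might look like, the sketch below averages a property's estimated annual flood probability today with a projected probability a few decades out and maps the result onto a 1-10 scale. The weights, thresholds, and numbers are invented for illustration and do not reflect how Zillow, Redfin, Realtor.com, Homes.com, or First Street actually compute their scores.

```python
# Hypothetical property-level flood score blending current and projected risk.
# All weights and thresholds are illustrative, not any vendor's real model.
def flood_risk_score(annual_prob_today: float, annual_prob_2055: float) -> int:
    """Return a 1-10 score from current and projected annual flood probabilities."""
    blended = 0.5 * annual_prob_today + 0.5 * annual_prob_2055  # equal weighting, by assumption
    # Map 0% to a score of 1 and 2% or worse (riskier than a 50-year zone) to 10.
    return round(1 + 9 * min(blended / 0.02, 1.0))

# Example: a home at 0.8% annual risk today, projected to rise to 1.6%.
print(flood_risk_score(0.008, 0.016))   # -> 6
```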
Research shows that the availability and accessibility of climate data on these sites has started to drive property-buying decisions that increasingly take climate change into account.
Implications for the future
As homebuyers understand more about a property's flood risks, that may shift the desirability of some locations over time. Those shifts will have implications for property valuations, community tax-revenue assessments, population migration patterns, and a slew of other considerations.
However, while these may feel like changes being brought on by new data, the risk was already there. What is changing is people's awareness.
The federal government has an important role to play in ensuring that accurate risk assessments are available to individuals and communities everywhere. As better tools and models for assessing risk evolve, FEMA's flood maps need to evolve, too.
Depending on how it's treated, this North Carolina soil can be a blessing or a curse. In its natural state, the soggy, spongy soil known as peat stores exceptional amounts of planet-warming carbon. Peatlands cover only about 3 percent of land on Earth, but they sock away twice as much carbon as all the world's forests put together. They also offer protection from wildfires, floods and drought, and support rare species. But decades ago, in peatlands across North Carolina, people dug ditches to drain the waterlogged earth, often to fell old-growth trees or plant new ones for timber. As peat dries, its virtues turn upside down. The soil itself becomes highly flammable. Even without burning, drained peat starts to emit the carbon it once stored, converting a climate solution into a climate problem. The land no longer soaks up floodwaters. And in times of drought, there's little water for the ecosystem to fall back on. Now, nonprofit, state, federal and private sector scientists and engineers have teamed up on what amounts to a series of giant plumbing projects. They are coaxing water to stay on the land to restore moisture to the peat. Tell Us About Solutions Where You Live Want all of The Times? Subscribe.