AI In Disaster Response Planning

Explore top LinkedIn content from expert professionals.

  • View profile for Juan M. Lavista Ferres

    CVP and Chief Data Scientist at Microsoft

    31,716 followers

    Today, Nature Communications published our latest research, led by Amit Misra from Microsoft’s AI for Good Lab: a global flood detection model built using 10 years of Synthetic Aperture Radar (SAR) satellite data. It can detect floods through clouds, at night, and in remote areas—filling a critical gap in global disaster data. Already in use in Kenya and Ethiopia, this open-source tool is helping governments respond faster and plan smarter. It’s a powerful example of how AI can drive climate resilience.

  • View profile for Matt Forrest
    Matt Forrest is an Influencer

    🌎 Helping geospatial professionals grow using technology · Scaling geospatial at Wherobots

    69,373 followers

    ☁️ Clouds block satellites. Floods don’t care. Here’s how foundation models are being adapted to see through the storm. During extreme weather events like floods, clouds often block optical satellites from capturing usable data, and that's exactly when timely insight matters most. This study, originally shared by Heather Couture, PhD, tackled the problem head-on: it adapted the Prithvi foundation model, originally trained on optical imagery, by incorporating Synthetic Aperture Radar (SAR) to detect floods across the UK and Ireland.
    ✅ SAR can “see” through clouds
    ✅ Fine-tuning the model with SAR bands boosted flood segmentation accuracy from 0.58 to 0.79
    ✅ Even small amounts of local data were enough to adapt the model to new regions
    This research shows that Earth Observation foundation models can be effectively adapted for disaster response, even in data-scarce areas, and shows how AI can be useful for real-world problems. 🌎 I'm Matt and I talk about modern GIS, AI, and how geospatial is changing. 📬 Want more like this? Join 6k+ others learning from my newsletter → forrest.nyc
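
    For readers who want to see the fine-tuning pattern in code, here is a minimal sketch: extra SAR input channels are adapted into a pretrained optical segmentation backbone, and the combined model is fine-tuned on labelled flood data. The adapter design, channel counts, and stand-in backbone are assumptions for illustration, not the actual Prithvi interface.

```python
# Illustrative only: adapt a pretrained optical backbone to accept extra SAR
# bands, then fine-tune for flood segmentation. The adapter, channel counts,
# and stand-in backbone are assumptions, not the actual Prithvi code.
import torch
import torch.nn as nn

class SarAdaptedSegmenter(nn.Module):
    def __init__(self, backbone: nn.Module, optical_bands: int = 6, sar_bands: int = 2):
        super().__init__()
        # A 1x1 conv maps the optical+SAR stack to the channel count the
        # pretrained backbone expects, so its learned weights can be reused.
        self.input_adapter = nn.Conv2d(optical_bands + sar_bands, optical_bands, kernel_size=1)
        self.backbone = backbone                      # pretrained encoder-decoder
        self.head = nn.Conv2d(256, 2, kernel_size=1)  # flood / not-flood logits (assumes 256-ch features)

    def forward(self, x):
        return self.head(self.backbone(self.input_adapter(x)))

# Tiny stand-in backbone so the sketch runs end to end.
backbone = nn.Sequential(nn.Conv2d(6, 256, 3, padding=1), nn.ReLU())
model = SarAdaptedSegmenter(backbone)
optimiser = torch.optim.AdamW(model.parameters(), lr=1e-4)
logits = model(torch.randn(1, 8, 64, 64))             # 6 optical + 2 SAR bands
```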

  • View profile for James Manyika
    James Manyika is an Influencer

    SVP, Google-Alphabet

    89,626 followers

    For those of you following developments with AI & Science, particularly around weather forecasting… At Google Research and Google DeepMind we have introduced an experimental model for tropical cyclone prediction, which can predict a cyclone’s formation, track, intensity, size and shape – generating 50 possible scenarios, up to 15 days in advance. And as we head into this year’s cyclone season, we’re partnering with the US National Hurricane Center to support their forecasts and warnings. We’re publicly sharing this experimental model in Weather Lab, a new platform for accessing experimental weather forecast visualizations, and we hope to gather feedback and enable researchers and forecasters to leverage our models and predictions to inform their own work. You can learn more in our blog post (https://lnkd.in/geG62c2v) or this New York Times story (https://lnkd.in/gAFPbUrD).
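
    As an illustration of what "50 possible scenarios" means in practice, the toy sketch below samples an ensemble of tracks from a stand-in probabilistic model and summarises their spread. The random-walk model and all numbers are placeholders, not the Google model.

```python
# Illustrative only: a probabilistic cyclone model is sampled repeatedly and
# the spread of tracks is summarised. `sample_track` is a toy random walk,
# standing in for a real learned model.
import numpy as np

rng = np.random.default_rng(0)

def sample_track(start_lat, start_lon, steps=15):
    """One simulated daily track as (lat, lon) pairs."""
    daily_moves = rng.normal(loc=[0.8, -1.2], scale=0.5, size=(steps, 2))
    return np.cumsum(np.vstack([[start_lat, start_lon], daily_moves]), axis=0)

ensemble = np.stack([sample_track(18.0, -55.0) for _ in range(50)])  # 50 scenarios
mean_track = ensemble.mean(axis=0)
spread = ensemble.std(axis=0)        # grows with lead time, like a forecast cone
print(mean_track[-1], spread[-1])
```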

  • View profile for Eric Kant ∴

    Tech Scout | Innovation strategy & market intel | Digital Twins • VR • Low-Code • Responsible AI | From idea to deployed

    18,084 followers

    Using Generative AI to Assist Emergency Managers in Data Collection: With this prototype, we will explore how generative AI can be used to assist emergency managers in data collection and entry, and how it can support disaster response efforts.

    Emergency managers have a huge responsibility when disaster strikes. One of their primary tasks is to collect and analyze data from the disaster area to determine the extent of damage, prioritize rescue efforts, and make informed decisions. However, the process of data collection and entry can be daunting and time-consuming, especially when done manually. This is where generative AI comes in.

    The GeoTalk essential elements of information (EEI) extractor prototype is being developed by Kant Consulting Group, LLC. The prototype uses technologies such as generative AI (ChatGPT) and LangChain to extract EEI from data communications, ask follow-up questions, and enter the data into an Esri ArcGIS Online layer. This provides a valuable resource for emergency managers to use when analyzing disaster situations and making informed decisions.

    Generative AI is a relatively new field of AI technology that involves creating algorithms that can generate content, such as text or images, based on given parameters. While generative AI is often associated with creative writing, it can also be used in data collection and entry. Emergency managers can take advantage of this technology by using it to input field notes, analyze situation reports, and input data into geographic information systems (GIS).

    One of the main advantages of using generative AI is its ability to analyze and interpret data at scale. Unlike traditional methods that require emergency managers to manually input data into an incident management system, generative AI can automatically analyze and extract data from a variety of sources, which saves time and makes data entry more efficient. This means that emergency managers can focus on other critical tasks related to disaster response.

    Moreover, generative AI can help identify inconsistencies and errors in collected data. The algorithms can flag these issues, allowing emergency managers to correct and revise the data as needed. This can lead to more accurate and reliable decision-making.

    Overall, the integration of generative AI in the field of emergency management is an exciting new development that holds tremendous promise. By utilizing AI technology for data collection and entry, emergency managers will be able to operate more efficiently, make higher-quality decisions, and ultimately save more lives. While this is only the beginning of what is possible, we can certainly be excited about the future possibilities for AI technology in disaster response efforts. If you are an emergency manager interested in contributing to the development of this technology, don't hesitate to reach out.
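
    A minimal sketch of the kind of pipeline described here: an LLM turns a free-text field report into structured EEI fields, which are then written to a GIS feature layer. The `call_llm` placeholder, field names, and endpoint handling are assumptions for illustration, not the actual GeoTalk implementation.

```python
# Hypothetical sketch of an EEI extraction pipeline: LLM -> structured fields
# -> GIS feature layer. Not the GeoTalk code; names and URLs are placeholders.
import json
import requests

EEI_PROMPT = (
    "Extract these essential elements of information from the report as JSON: "
    "location, hazard_type, damage_summary, people_affected, needs. "
    "If a field is unknown, ask a follow-up question instead of guessing.\n\nReport:\n"
)

def call_llm(prompt: str) -> str:
    """Placeholder for a ChatGPT/LangChain call; swap in your provider here."""
    # Canned response so the sketch runs without an API key.
    return json.dumps({"location": {"lat": 30.03, "lon": -99.14},
                       "hazard_type": "flash flood",
                       "damage_summary": "road washout, structures flooded",
                       "people_affected": 250,
                       "needs": "swift-water rescue, shelter"})

def report_to_feature(report_text: str) -> dict:
    fields = json.loads(call_llm(EEI_PROMPT + report_text))
    return {
        "attributes": {k: fields.get(k) for k in
                       ("hazard_type", "damage_summary", "people_affected", "needs")},
        "geometry": {"x": fields["location"]["lon"], "y": fields["location"]["lat"]},
    }

def push_to_layer(feature: dict, layer_url: str, token: str) -> None:
    # ArcGIS-style addFeatures request; URL and token handling are illustrative.
    requests.post(f"{layer_url}/addFeatures",
                  data={"features": json.dumps([feature]), "f": "json", "token": token})

feature = report_to_feature("Flash flooding along the river, about 250 residents cut off.")
# push_to_layer(feature, layer_url="https://.../FeatureServer/0", token="...")  # with real credentials
```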

  • View profile for Bob Lord
    Bob Lord is an Influencer
    18,564 followers

    Wildfires destroyed over 30 million acres globally in 2023. Now, a groundbreaking AI model from the ECMWF is changing how we fight back. Their new “Probability of Fire” (PoF) model doesn’t rely on flashier algorithms; it thrives on better data. By integrating real-time weather patterns, vegetation conditions, and human activity, PoF offers wildfire risk predictions that are not only more accurate, but also more accessible to smaller agencies with limited resources. This is a perfect example of how better data > better algorithms when it comes to real-world impact. As climate change accelerates the frequency and severity of wildfires, tools like PoF could be game changers in helping communities prepare, respond, and ultimately save lives. #AI #ClimateTech #WildfirePrevention
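
    To make the "better data beats better algorithms" point concrete (this is not the ECMWF PoF model), even a plain logistic regression produces usable gridded fire probabilities once weather, fuel-state, and human-activity covariates are available. All data below is synthetic, for illustration only.

```python
# Illustration of data-driven fire probability, with synthetic features and
# labels; not the ECMWF Probability of Fire implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
X = np.column_stack([
    rng.normal(25, 8, n),      # 2 m temperature (degC)
    rng.uniform(0, 100, n),    # fuel moisture / vegetation dryness proxy
    rng.poisson(3, n),         # human activity proxy (e.g. road or population density)
])
# Synthetic labels so the sketch runs end to end.
logits = 0.08 * X[:, 0] - 0.05 * X[:, 1] + 0.2 * X[:, 2] - 1.0
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

model = LogisticRegression().fit(X, y)
prob_of_fire = model.predict_proba(X[:5])[:, 1]   # per-cell ignition probability
```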

  • View profile for Jordi Visser
    Jordi Visser is an Influencer

    22V Research | Macroeconomics, Data-Driven Insights, Hedge Funds

    7,837 followers

    AI-Powered Weather Forecasting: The GraphCast Innovation

    In the quest to enhance weather forecasting and provide early warnings for extreme events like hurricanes, artificial intelligence has emerged as a potent tool. Traditional forecasting systems have undoubtedly improved over the years, but AI's ability to swiftly analyze historical data and make predictions is transforming the field. Google #DeepMind's innovative AI tool, #GraphCast, has demonstrated its potential by outperforming conventional models and significantly expediting forecast delivery.

    Weather forecasts serve a crucial purpose beyond helping us decide our daily attire; they offer a lifeline in anticipating and preparing for severe weather events such as storms, floods, and heatwaves. However, traditional weather forecasting demands immense computational power, processing hundreds of variables across various atmospheric layers worldwide.

    GraphCast takes a fundamentally different approach. Instead of attempting to model intricate atmospheric processes, it leverages machine learning to analyze extensive historical weather data, including output from the European Centre for Medium-Range Weather Forecasts (ECMWF) model, to understand the evolution of weather patterns. This AI-driven approach enables it to predict how current conditions are likely to change in the future with remarkable precision. GraphCast has demonstrated exceptional accuracy, outperforming traditional models on more than 90% of the factors crucial for weather forecasting. Moreover, it produces forecasts in under a minute, using only a fraction of the computing power required by traditional numerical weather prediction (NWP) models.

    An illustrative example of its success is its prediction of Hurricane Lee's landfall in Canada in September. The AI tool accurately forecasted the storm's path nine days in advance, surpassing the ECMWF's six-day prediction window. This extended lead time can be pivotal in preparing for extreme weather events, potentially saving lives and mitigating property damage.

    Crucially, AI models like GraphCast do not supplant traditional weather forecasts but complement them. These AI models rely on data generated by traditional approaches, emphasizing the symbiotic relationship between AI and traditional meteorological methods. Despite these advances, climate change brings unpredictable weather extremes that challenge AI models trained on historical data. Rising ocean temperatures introduce a previously unseen variable that can accelerate storm intensification, as in Hurricane Otis's swift escalation from a tropical storm to a Category 5 hurricane within 24 hours.

    GraphCast by Google DeepMind represents a significant advancement in weather forecasting. As climate change continues to reshape weather patterns, AI's role in forecasting becomes increasingly crucial in safeguarding communities worldwide. #JordiPlusJavis Note: This is an #AI generated image
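
    A minimal sketch of the autoregressive pattern such models use: a learned one-step model maps the current atmospheric state to the state six hours later, and longer forecasts come from rolling it forward. The tiny MLP below is a stand-in for illustration, not DeepMind's graph-network architecture.

```python
# Illustrative only: learned one-step weather model rolled forward in time.
import torch
import torch.nn as nn

class OneStepModel(nn.Module):
    def __init__(self, n_vars: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_vars, 128), nn.GELU(), nn.Linear(128, n_vars))

    def forward(self, state):                 # state: (batch, n_vars) per grid cell
        return state + self.net(state)        # predict the 6-hour increment

def rollout(model, state, steps=40):          # 40 x 6 h = 10-day forecast
    states = []
    with torch.no_grad():
        for _ in range(steps):
            state = model(state)
            states.append(state)
    return torch.stack(states)

forecast = rollout(OneStepModel(n_vars=6), torch.randn(1, 6))
```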

  • Every year, natural disasters hit harder and closer to home. But when city leaders ask, "How will rising heat or wildfire smoke impact my home in 5 years?"—our answers are often vague. Traditional climate models give sweeping predictions, but they fall short at the local level. It's like trying to navigate rush hour using a globe instead of a street map. That’s where generative AI comes in. This year, our team at Google Research built a new genAI method to project climate impacts—taking predictions from the size of a small state to the size of a small city.

    Our approach provides:
    - Unprecedented detail – regional environmental risk assessments at a small fraction of the cost of existing techniques
    - Higher accuracy – reduces fine-scale errors by over 40% for critical weather variables, and cuts error in extreme heat and precipitation projections by over 20% and 10%, respectively
    - Better estimates of complex risks – demonstrates remarkable skill in capturing complex environmental risks driven by regional phenomena, such as wildfire risk from Santa Ana winds, which statistical methods often miss

    The dynamical-generative downscaling process works in two steps (see the sketch below):
    1) Physics-based first pass: a regional climate model downscales global Earth system data to an intermediate resolution (e.g., 50 km) – much cheaper computationally than going straight to very high resolution.
    2) AI adds the fine details: our AI-based Regional Residual Diffusion-based Downscaling model (“R2D2”) adds realistic, fine-scale details to bring it up to the target high resolution (typically less than 10 km), based on its training on high-resolution weather data.

    Why does this matter? Governments and utilities need these hyperlocal forecasts to prepare emergency response, invest in infrastructure, and protect vulnerable neighborhoods. And this is just one way AI is turbocharging climate resilience. Our teams at Google are already using AI to forecast floods, detect wildfires in real time, and help the UN respond faster after disasters. The next chapter of climate action means giving every city the tools to see—and shape—their own future.

    Congratulations Ignacio Lopez Gomez, Tyler Russell MBA, PMP, and teams on this important work! Discover the full details of this breakthrough: https://lnkd.in/g5u_WctW PNAS Paper: https://lnkd.in/gr7Acz25
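
    A schematic of the two-step process above, under stated assumptions: both components below are placeholders (a simple upsampling and added noise), where a real system would run a regional climate model for step 1 and a trained diffusion model such as the paper's R2D2 for step 2.

```python
# Schematic of dynamical-generative downscaling; placeholder components only.
import numpy as np

def regional_model_downscale(global_field: np.ndarray, factor: int = 2) -> np.ndarray:
    """Step 1 stand-in: physics-based downscaling to an intermediate ~50 km grid."""
    return np.kron(global_field, np.ones((factor, factor)))       # placeholder upsampling

def generative_refine(intermediate: np.ndarray, factor: int = 8) -> np.ndarray:
    """Step 2 stand-in: a diffusion model would add learned fine-scale residuals."""
    fine = np.kron(intermediate, np.ones((factor, factor)))
    residual = np.random.default_rng(0).normal(0, 0.1, fine.shape)  # learned detail in reality
    return fine + residual

global_field = np.random.default_rng(1).normal(size=(8, 8))         # coarse global projection
hi_res = generative_refine(regional_model_downscale(global_field))  # target high-resolution grid
```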

  • View profile for Sherrie Wang

    Assistant Professor, MIT MechE/IDSS

    3,457 followers

    Thrilled to unveil our latest work: multi-modal machine learning to forecast localized weather! We construct a graph neural network to learn dynamics at point locations, where typical gridded forecasts miss significant variation. Paper: https://lnkd.in/eBmfsJin Weather dataset: https://lnkd.in/ejCG8bKs Code: https://lnkd.in/eQg-JzQJ

    AI weather models have made huge strides, but most still emulate products like ERA5, which struggle to capture near-surface wind dynamics. The correlation between ERA5 and ground weather station data is low due to topography, buildings, vegetation, and other local factors. In this work, we forecast near-surface wind at localized off-grid locations using a message-passing graph neural network ("MPNN"). The graph is heterogeneous, integrating both global forecasts (ERA5) and historical local weather station data as different nodes.

    What do we find? First off, ERA5 interpolation performs poorly, failing to capture local wind variations, especially in coastal and inland regions with complex conditions. An MLP trained on historical data at a location performs better than ERA5 interpolation, as it learns from the station's past observations. However, it struggles with longer lead times and lacks the spatial context necessary to capture weather patterns. Meanwhile, our MPNN dramatically improves performance, reducing the error by over 50% compared to the MLP. This is because the MPNN incorporates spatial information through message passing, allowing it to learn local weather dynamics from both station data and global forecasts.

    Interestingly, adding ERA5 data to the MLP does not improve its performance significantly. The MLP struggles to integrate spatial information from global forecasts, while the MPNN excels, highlighting the importance of combining global and local data. Large improvements in forecast accuracy occur at both coastal and inland locations. Our model shows a 92% reduction in MSE relative to ERA5 interpolation overall.

    This work showcases the strength of machine learning in combining multi-modal data. By using a graph to integrate global and local weather data, we were able to generate much more accurate localized weather forecasts! Congrats to Qidong Yang and Jonathan Giezendanner for the great work, and thanks to Campbell Watson, Daniel Salles Chevitarese, Johannes Jakubik, Eric Schmitt, Anirban C., Jeremy Vila, Detlef Hohl, and Chris Hill for a wonderful collaboration. Thanks also to our partners at Amazon Web Services (AWS) for providing cloud computing and technical support!
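
    A minimal sketch of the idea behind a heterogeneous MPNN of this kind: station nodes aggregate messages from neighbouring ERA5 grid nodes and from other stations, then predict local wind components. This is a simplified, hand-rolled illustration with assumed feature sizes; see the linked code release for the actual model.

```python
# Simplified illustration of station-level message passing; not the paper's code.
import torch
import torch.nn as nn

class StationWindMPNN(nn.Module):
    def __init__(self, era5_dim=8, station_dim=4, hidden=64):
        super().__init__()
        self.msg_era5 = nn.Linear(era5_dim, hidden)        # messages from ERA5 grid nodes
        self.msg_station = nn.Linear(station_dim, hidden)  # messages from station nodes
        self.embed = nn.Linear(station_dim, hidden)        # target station's own history
        self.update = nn.GRUCell(hidden, hidden)
        self.readout = nn.Linear(hidden, 2)                # (u, v) wind components

    def forward(self, station_x, era5_neighbors, station_neighbors):
        # era5_neighbors: (n_stations, k1, era5_dim); station_neighbors: (n_stations, k2, station_dim)
        m = self.msg_era5(era5_neighbors).mean(dim=1) + self.msg_station(station_neighbors).mean(dim=1)
        h = self.update(m, self.embed(station_x))
        return self.readout(h)

model = StationWindMPNN()
pred = model(torch.randn(10, 4), torch.randn(10, 5, 8), torch.randn(10, 3, 4))
```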

  • View profile for William "Craig" F.

    Craig Fugate Consulting

    11,886 followers

    Another recommendation that didn't make the op-ed:

    AI-Powered Debris Estimation for Faster, More Accurate Assessments
    Current Challenge: The existing debris reimbursement model relies on post-disaster damage assessments, which can be slow, bureaucratic, and often lead to disputes over the actual volume and cost of debris removal.
    AI Solution: FEMA should develop an AI-driven debris estimation tool that uses satellite imagery, LiDAR, historical disaster data, and machine learning models to predict debris volume immediately after an event. The model could be trained on past disaster events and refined with real-time inputs (e.g., wind speed, storm path, structural damage reports) to generate automated, rapid debris cost estimates. This would allow FEMA to pre-authorize funding within days instead of waiting weeks or months for full damage assessments.

    Upfront Payments to States Instead of Reimbursement
    Current Challenge: The reimbursement model requires local and state governments to front the costs, which can strain budgets and delay cleanup.
    Proposed Reform: Based on AI-generated debris estimates, FEMA could provide states with upfront lump-sum payments rather than relying on a reimbursement system tied to cubic yards of debris collected. This would allow states to mobilize debris contractors immediately instead of waiting for reimbursement approvals. A true-up process could follow, where adjustments are made if actual costs exceed or fall short of estimates.

    Benefits of This Approach
    ✅ Faster Recovery: Reduces delays caused by slow reimbursement processes, getting debris cleared quickly to restore infrastructure.
    ✅ Cost Efficiency: AI modeling can improve cost projections, reducing disputes and fraud associated with overestimated cubic yard measurements.
    ✅ Better Resource Allocation: States won’t have to wait for FEMA assessments before securing contracts and mobilizing cleanup efforts.
    ✅ Equity in Funding: Helps underfunded local governments that struggle with cash flow for immediate debris removal efforts.
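
    As a rough illustration of the proposed workflow (not FEMA policy, and with a placeholder in place of the satellite/LiDAR-trained model): an estimator produces a debris-volume figure right after the event, the state receives an upfront lump sum, and a later true-up reconciles actual costs.

```python
# Illustrative workflow: model estimate -> upfront payment -> true-up.
# The estimator and unit cost are placeholders, not a calibrated model.
from dataclasses import dataclass

COST_PER_CUBIC_YARD = 12.0      # placeholder unit cost

def estimate_debris_cubic_yards(wind_speed_mph: float, structures_damaged: int) -> float:
    """Stand-in for the satellite/LiDAR-trained model described above."""
    return 50.0 * structures_damaged * (wind_speed_mph / 100.0) ** 2

@dataclass
class DebrisGrant:
    upfront: float

    def true_up(self, actual_cost: float) -> float:
        """Positive = additional payment owed to the state; negative = refund due."""
        return actual_cost - self.upfront

estimate = estimate_debris_cubic_yards(wind_speed_mph=130, structures_damaged=4000)
grant = DebrisGrant(upfront=estimate * COST_PER_CUBIC_YARD)
adjustment = grant.true_up(actual_cost=4.1e6)
```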

  • View profile for Sohail Elabd

    Passionate About Putting the World on the Map—Literally | Helping Governments & Organizations Unlock the Power of GeoSpatial Data. Turning Complex Geospatial Challenges into Scalable Solutions

    10,404 followers

    When Earth Strikes, Can Technology Heal?

    Over the July 4th weekend, Central Texas faced a deadly flash flood. The Guadalupe River rose 26 feet in just 45 minutes, overwhelming entire communities. More than 120 lives were lost, with over 170 still missing — including children at summer camps. It was sudden. It was devastating. It was a moment that exposed the fragility of our systems. But it also spotlighted how Geospatial Technology + AI — or GeoAI — can change the way we respond.

    Here’s how GeoAI is healing what nature has shattered:
    - Real-time Damage Assessment: Using satellite imagery and AI models, agencies are mapping destroyed buildings, washed-out roads, and flooded zones within hours, not weeks. This empowers rescue teams to act with precision and speed.
    - Smarter Search and Rescue: GeoAI analyzes flood paths, terrain, and population density to identify where survivors might be — even in rural or disconnected areas.
    - Infrastructure Recovery and Risk Mapping: By overlaying flood impact with infrastructure data, governments can plan better, restore faster, and prevent future disasters.
    - Informed Recovery Planning: AI-powered change detection helps assess loss, allocate resources, and simulate recovery under future climate scenarios.

    Agencies like NASA and platforms like Esri are already delivering flood extent maps and pretrained damage classification models — helping responders make better decisions, faster. As disasters become more intense and more frequent, GeoAI isn’t just innovation. It’s intervention. When the Earth strikes, technology must become the healer.
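
    A minimal sketch of the real-time damage assessment idea above: derive a flood mask by differencing pre- and post-event water indices, then flag building footprints that fall inside it. The arrays, threshold, and grid indexing are illustrative assumptions, not the NASA or Esri products mentioned.

```python
# Illustrative flood change detection; synthetic data and placeholder threshold.
import numpy as np

def flood_mask(pre_water_index: np.ndarray, post_water_index: np.ndarray,
               threshold: float = 0.3) -> np.ndarray:
    """Cells whose water index rose sharply after the event are flagged as flooded."""
    return (post_water_index - pre_water_index) > threshold

def flag_affected_buildings(mask: np.ndarray, footprints_rc: list) -> list:
    """Return building footprint cells (row, col) that intersect the flood mask."""
    return [rc for rc in footprints_rc if mask[rc]]

rng = np.random.default_rng(0)
pre, post = rng.random((100, 100)), rng.random((100, 100))
affected = flag_affected_buildings(flood_mask(pre, post), [(10, 12), (40, 41), (77, 5)])
```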
