New research models the likelihoods of different climate scenarios. It shows that 3°C isn't a worst case; it's the most likely outcome.

Until now, climate scenarios have been treated as narrative pathways without assigned probabilities. Climate scientists have resisted assigning likelihoods because of deep uncertainty: the full range of outcomes driven by physical, social, political and technological change can't be known, so probabilities cannot be reliably estimated. Climate scenarios were described as exploratory tools, not forecasts, designed to illuminate plausible pathways rather than predict them.

But intuitively, we know that some climate futures are more likely than others, and this information is valuable for business decision-making.

A new paper from the EDHEC Climate Institute challenges the idea that probabilities can't be assigned to climate scenarios and provides two robust, data-driven methods to do it.

The first is an 'informative method', which starts with economists' views on the social cost of carbon (SCC). In effect, it converts wishful thinking into plausible expectations.

The second is a 'maximum entropy method'. It makes as few assumptions as possible, using current carbon prices and basic policy constraints as the only inputs.

What's remarkable is that both approaches produce very similar results. Does this mean that some climate pathways are more locked in than we think?

Model outputs:
🔸 The most likely temperature anomaly in 2100 is between 2.8–3.0°C
🔸 There is a 35–40% chance of exceeding 3.0°C
🔸 There is just a 1% chance of staying below 1.5°C

The model was also tested using the Oxford Economics scenarios. The results were even more shocking.
🔸 The 'Climate Catastrophe' scenario carries a likelihood of 57.5%
🔸 The 'Climate Distress' scenario carries a likelihood of 35%
🔸 Together, they make up 92.5% of the total

Such high temperatures increase the likelihood of triggering irreversible tipping points, for which standard damage functions no longer apply. This is dangerous territory.

𝗠𝘆 𝗧𝗮𝗸𝗲
Most companies use climate scenarios that treat all futures as equally plausible exploratory pathways. But this doesn't allocate future risk efficiently. Without probabilities, we cannot optimise capital allocation between mitigation (transition risk) and adaptation (physical risk).

Assigning probabilities to scenarios changes the conversation. It equips firms to weigh investment in risk reduction not just by severity but also by likelihood.

Personally, I believe this is a critical next step in climate risk planning. Assigned likelihoods should be accompanied by uncertainty bounds, so decision-makers can assess not just what's likely, but how confident we can be in those estimates.

Source: https://lnkd.in/exy5TDS8
_____________
𝘍𝘰𝘭𝘭𝘰𝘸 𝘮𝘦 𝘰𝘯 𝘓𝘪𝘯𝘬𝘦𝘥𝘐𝘯: Scott Kelly
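To make the maximum-entropy idea concrete, here is a minimal Python sketch of how a maximum-entropy distribution over scenarios can be pinned down by a single observable such as today's carbon price. This is not the EDHEC implementation; the scenario names, the scenario-consistent carbon prices, and the observed price are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the EDHEC method): maximum-entropy scenario
# probabilities constrained only to reproduce an assumed market-implied carbon price.
import numpy as np
from scipy.optimize import brentq

scenarios = ["Net Zero", "Delayed Transition", "Climate Distress", "Climate Catastrophe"]
scenario_prices = np.array([250.0, 150.0, 60.0, 20.0])  # hypothetical carbon prices, USD/tCO2
observed_price = 65.0  # assumed market-implied price; must lie between min and max above

def maxent_probs(values, target, bracket=(-1.0, 1.0)):
    """Maximum-entropy distribution p_i proportional to exp(lam * v_i), with E_p[v] = target."""
    centred = values - values.mean()  # centring only improves numerical stability

    def gap(lam):
        w = np.exp(lam * centred)
        p = w / w.sum()
        return float(p @ values) - target

    lam = brentq(gap, *bracket)  # solve for the multiplier that matches the constraint
    w = np.exp(lam * centred)
    return w / w.sum()

probs = maxent_probs(scenario_prices, observed_price)
for name, p in zip(scenarios, probs):
    print(f"{name:22s} {p:.1%}")
```

With only one constraint, the resulting weights are as "non-committal" as possible while still being consistent with the observed price; adding policy constraints would simply add more rows to the same optimisation.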
System-wide scenario modeling for climate impact
Explore top LinkedIn content from expert professionals.
Summary
System-wide scenario modeling for climate impact is an approach that uses data-driven methods to predict how various climate futures might unfold across entire regions or sectors, assigning probabilities to these scenarios rather than treating them as equally likely possibilities. This helps organizations, governments, and businesses plan more confidently for physical risks, policy changes, and financial impacts caused by climate change.
- Prioritize probable risks: Focus your climate planning on scenarios that data show are most likely, rather than assuming all outcomes are equally possible.
- Integrate uncertainty: Include uncertainty ranges in your forecasts to help decision-makers weigh both the likelihood and confidence of climate-related predictions.
- Refine local impact: Use advanced modeling tools, like AI-powered downscaling, to bring scenario analysis to the city or neighborhood level for more actionable results.
-
📢 Research Alert: A Probabilistic Framework for Climate Scenario Analysis 🌍
"Median global warming expected at 2.7°C - well above the #ParisAgreement goal"

As climate risks become central to #financial and #regulatory decision-making, one challenge remains critically unmet: most climate scenarios lack probabilistic grounding.

To address this, the EDHEC Climate Institute, with Lionel Melin, Riccardo Rebonato and FANGYUAN ZHANG, has released a groundbreaking study:
📘 "How to Assign Probabilities to Climate Scenarios"

The research proposes an innovative framework to quantify the likelihood of long-term temperature outcomes, enriching narrative-based scenarios with a probabilistic layer essential for asset pricing, risk management, and policy planning.

✅ Key contributions:
• Based on 5,900+ Social Cost of Carbon estimates from 207 academic sources
• Uses two rigorous methods: an elicitation-based approach and a maximum-entropy framework
• Integrates real-world policy constraints and macroeconomic data

🔍 Findings:
• 35–40% chance of >3°C warming by 2100
• The 1.5°C target is technologically feasible but highly improbable
• Median expected warming: 2.7°C, well above the Paris Agreement goal
• Physical climate damages outweigh the cost of transition, underlining the urgency of financial realignment

🔗 The study also maps #probabilities onto Oxford Economics' scenario framework, assigning over 90% likelihood to pathways involving limited or delayed emissions cuts: Climate Catastrophe, Climate Distress, and Baseline.

👉 A must-read for those in climate finance, regulatory strategy, and risk modeling. This research pushes the frontier in integrating uncertainty and feasibility into climate scenario analysis.

#ClimateChange and #Mitigation remain both the greatest source of risk and of opportunity of our time. Let's prepare!

radicant bank #InvestInSolutionsNotProblems
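For intuition on the elicitation-based leg, here is a toy illustration, not the paper's estimator: fit a distribution to a sample of published SCC estimates and weight each narrative scenario by how consistent its implied carbon price is with that evidence. Every number, scenario name and price below is invented for the example.

```python
# Toy illustration (not the EDHEC methodology): score scenarios against SCC evidence.
import numpy as np
from scipy import stats

scc_estimates = np.array([35, 50, 80, 120, 185, 250, 410, 600])  # USD/tCO2, hypothetical sample
scenario_prices = {  # hypothetical carbon prices implied by four narrative scenarios
    "Net Zero": 300.0,
    "Delayed Transition": 150.0,
    "Climate Distress": 60.0,
    "Climate Catastrophe": 20.0,
}

# Fit a lognormal to the SCC sample, weight each scenario by the density at its
# implied price, then normalise the weights into probabilities.
shape, loc, scale = stats.lognorm.fit(scc_estimates, floc=0)
weights = {k: stats.lognorm.pdf(v, shape, loc, scale) for k, v in scenario_prices.items()}
total = sum(weights.values())
for name, w in weights.items():
    print(f"{name:22s} {w / total:.1%}")
```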
-
Every year, natural disasters hit harder and closer to home. But when city leaders ask, "How will rising heat or wildfire smoke impact my home in 5 years?", our answers are often vague. Traditional climate models give sweeping predictions, but they fall short at the local level. It's like trying to navigate rush hour using a globe instead of a street map.

That's where generative AI comes in. This year, our team at Google Research built a new genAI method to project climate impacts, taking predictions from the scale of a small state down to the scale of a small city.

Our approach provides:
- Unprecedented detail: regional environmental risk assessments at a small fraction of the cost of existing techniques
- Higher accuracy: reduces fine-scale errors by over 40% for critical weather variables, and reduces error in extreme heat and precipitation projections by over 20% and 10% respectively
- Better estimates of complex risks: demonstrates remarkable skill in capturing complex environmental risks driven by regional phenomena, such as wildfire risk from Santa Ana winds, which statistical methods often miss

The dynamical-generative downscaling process works in two steps:
1) Physics-based first pass: a regional climate model downscales global Earth system data to an intermediate resolution (e.g., 50 km), which is much cheaper computationally than going straight to very high resolution.
2) AI adds the fine details: our AI-based Regional Residual Diffusion-based Downscaling model ("R2D2") adds realistic, fine-scale details to bring it up to the target high resolution (typically less than 10 km), based on its training on high-resolution weather data.

Why does this matter? Governments and utilities need these hyperlocal forecasts to prepare emergency response, invest in infrastructure, and protect vulnerable neighborhoods.

And this is just one way AI is turbocharging climate resilience. Our teams at Google are already using AI to forecast floods, detect wildfires in real time, and help the UN respond faster after disasters. The next chapter of climate action means giving every city the tools to see and shape their own future.

Congratulations Ignacio Lopez Gomez, Tyler Russell MBA, PMP, and teams on this important work!

Discover the full details of this breakthrough: https://lnkd.in/g5u_WctW
PNAS Paper: https://lnkd.in/gr7Acz25
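For readers unfamiliar with residual downscaling, here is a hedged numerical sketch of the two-step idea. A trivial placeholder stands in for the trained generative model (the real R2D2 model is not reproduced here), and all grid sizes and values are illustrative.

```python
# Sketch of the residual-downscaling pattern: coarse physics output is upsampled
# cheaply, then a learned model adds fine-scale corrections on top.
import numpy as np

def upsample_nearest(field, factor):
    """Cheap upsampling from intermediate (~50 km) to target (~10 km) resolution."""
    return np.kron(field, np.ones((factor, factor)))

def predict_residual(coarse_upsampled, model=None):
    """Placeholder for the generative step: returns fine-scale corrections.
    Here it returns zeros; a trained model would add realistic local detail
    (terrain effects, coastlines, local extremes)."""
    if model is None:
        return np.zeros_like(coarse_upsampled)
    return model(coarse_upsampled)

# Step 1 (physics-based): pretend a regional climate model produced this
# intermediate-resolution temperature field (values are fake).
intermediate = np.random.default_rng(0).normal(15.0, 3.0, size=(8, 8))

# Step 2 (AI adds fine details): upsample, then add the predicted residual.
base = upsample_nearest(intermediate, factor=5)  # 8x8 -> 40x40 grid
high_res = base + predict_residual(base)
print(high_res.shape)  # (40, 40)
```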
-
📗 𝐓𝐡𝐞 𝐍𝐆𝐅𝐒 𝐒𝐡𝐨𝐫𝐭-𝐭𝐞𝐫𝐦 𝐒𝐜𝐞𝐧𝐚𝐫𝐢𝐨𝐬 𝐚𝐫𝐞 𝐡𝐞𝐫𝐞! The group of over 100 central banks and supervisors just published a first-of-its-kind, publicly available tool to analyse the near-term impacts of climate policies and climate change on financial stability and economic resilience.

🖍 𝗛𝗲𝗿𝗲'𝘀 𝘄𝗵𝗮𝘁 𝘆𝗼𝘂 𝘀𝗵𝗼𝘂𝗹𝗱 𝗸𝗻𝗼𝘄:
𝐓𝐡𝐞 #𝐍𝐆𝐅𝐒 𝐬𝐡𝐨𝐫𝐭-𝐭𝐞𝐫𝐦 𝐬𝐜𝐞𝐧𝐚𝐫𝐢𝐨𝐬 𝐚𝐫𝐞 𝐡𝐢𝐠𝐡𝐥𝐲 𝐫𝐞𝐥𝐞𝐯𝐚𝐧𝐭 𝐟𝐨𝐫 𝐜𝐥𝐢𝐦𝐚𝐭𝐞 𝐫𝐢𝐬𝐤 𝐚𝐧𝐚𝐥𝐲𝐬𝐢𝐬 𝐚𝐧𝐝 𝐩𝐨𝐥𝐢𝐜𝐲𝐦𝐚𝐤𝐢𝐧𝐠. The four different scenarios show that:
➡️ regional extreme weather events generate temporary but material GDP losses, with effects on the global economy, and could increase the cost of transition;
➡️ delaying transition efforts increases the economic costs of transitioning and could cause additional financial stress.

𝗟𝗲𝘁'𝘀 𝗹𝗼𝗼𝗸 𝗮𝘁 𝘁𝗵𝗲𝘀𝗲 4 𝘀𝗰𝗲𝗻𝗮𝗿𝗶𝗼𝘀:
1. 𝗛𝗶𝗴𝗵𝘄𝗮𝘆 𝘁𝗼 𝗣𝗮𝗿𝗶𝘀: A technology-driven and orderly transition unfolds gradually. (Transition risk in a relatively orderly transition)
2. 𝗦𝘂𝗱𝗱𝗲𝗻 𝗪𝗮𝗸𝗲-𝗨𝗽 𝗖𝗮𝗹𝗹: A world of widespread climate unawareness is challenged by a sudden change in policy preferences. (Transition risk in a more disorderly transition)
3. 𝗗𝗶𝘃𝗲𝗿𝗴𝗶𝗻𝗴 𝗥𝗲𝗮𝗹𝗶𝘁𝗶𝗲𝘀: Advanced economies pursue a net-zero transition in line with Highway to Paris, while the rest of the world is hit by a sequence of extreme weather events. (Partial transition with mounting physical risks)
4. 𝗗𝗶𝘀𝗮𝘀𝘁𝗲𝗿𝘀 𝗮𝗻𝗱 𝗣𝗼𝗹𝗶𝗰𝘆 𝗦𝘁𝗮𝗴𝗻𝗮𝘁𝗶𝗼𝗻: A sequence of region-specific extreme weather events results in capital destruction, reduced productivity and production, and cascading economic impacts. (Stalled transition and severe physical risks)

𝗛𝗼𝘄 𝗺𝗶𝗴𝗵𝘁 𝘁𝗵𝗲𝘆 𝗯𝗲 𝘂𝘀𝗲𝗱? These scenarios are 𝐩𝐚𝐫𝐭𝐢𝐜𝐮𝐥𝐚𝐫𝐥𝐲 𝐰𝐞𝐥𝐥-𝐬𝐮𝐢𝐭𝐞𝐝 𝐟𝐨𝐫 𝐜𝐥𝐢𝐦𝐚𝐭𝐞 𝐬𝐭𝐫𝐞𝐬𝐬-𝐭𝐞𝐬𝐭𝐢𝐧𝐠 𝐞𝐱𝐞𝐫𝐜𝐢𝐬𝐞𝐬 and for analysing financial risks that may materialise within a business-planning, policy-relevant timeframe. They also provide users with granular outputs across a wide range of financial variables, sectors and countries.

𝐓𝐡𝐞 𝐬𝐡𝐨𝐫𝐭-𝐭𝐞𝐫𝐦 𝐬𝐜𝐞𝐧𝐚𝐫𝐢𝐨𝐬 𝐚𝐫𝐞 𝐚 𝐦𝐚𝐣𝐨𝐫 𝐚𝐝𝐯𝐚𝐧𝐜𝐞 𝐢𝐧 𝐟𝐢𝐧𝐚𝐧𝐜𝐞'𝐬 𝐭𝐨𝐨𝐥𝐤𝐢𝐭 𝐟𝐨𝐫 𝐮𝐧𝐝𝐞𝐫𝐬𝐭𝐚𝐧𝐝𝐢𝐧𝐠 𝐜𝐥𝐢𝐦𝐚𝐭𝐞-𝐫𝐞𝐥𝐚𝐭𝐞𝐝 𝐫𝐢𝐬𝐤𝐬. While climate change is a long-term challenge, sudden events and policy shifts can already have a significant impact within a policy-relevant timeframe. The next five years will be important in mitigating climate change, and the NGFS short-term scenarios can help you navigate through these uncertain times.

💡 Stay tuned as we will have lots more analysis on the new scenarios in the weeks ahead!
Access the full dataset here: https://lnkd.in/ePrs6hZV

#climate #climaterisk #financialrisk #risk #finance #climatescenarios #climatedata
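As a rough illustration of how granular scenario outputs could feed a stress-testing exercise, the sketch below applies per-scenario sector shocks to a small portfolio. The exposures and shock values are invented placeholders, not NGFS outputs.

```python
# Hypothetical stress-test sketch: scenario-specific sector shocks applied to exposures.
exposures = {"Energy": 40.0, "Real Estate": 35.0, "Agriculture": 25.0}  # EUR m, illustrative

scenario_shocks = {  # fractional valuation shocks by sector (all values invented)
    "Highway to Paris":                {"Energy": -0.05, "Real Estate": -0.02, "Agriculture": -0.01},
    "Sudden Wake-Up Call":             {"Energy": -0.20, "Real Estate": -0.08, "Agriculture": -0.03},
    "Diverging Realities":             {"Energy": -0.10, "Real Estate": -0.06, "Agriculture": -0.12},
    "Disasters and Policy Stagnation": {"Energy": -0.08, "Real Estate": -0.15, "Agriculture": -0.20},
}

for scenario, shocks in scenario_shocks.items():
    loss = -sum(exposures[s] * shocks[s] for s in exposures)  # positive number = loss
    print(f"{scenario:32s} stressed loss: EUR {loss:5.1f}m")
```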
-
As we've discussed in previous posts, regulators expect climate risk to be integrated into ECL frameworks. ECL requires scenario weighting, but on the climate side we don't have these weights ready.

This is where EDHEC's research offers a practical breakthrough (https://lnkd.in/e3bNRgQM). It develops a way to introduce scenario probabilities for climate risk, which is a great starting point for aligning climate risk with ECL-style calculations.

While there are many useful outputs from this research, here I want to focus on Step 1: assigning scenario probabilities and comparing this to what we intuitively do in ECL.

In typical ECL modelling:
- The baseline scenario gets most of the probability weight (often 70-80%),
- With downside and upside scenarios splitting the rest.

But in the climate space, this intuition breaks down. According to the EDHEC scenario probabilities:
- "Climate Catastrophe" receives 57.5% probability.
- The "Baseline" scenario receives only 5%.

In other words, climate catastrophe becomes the "new baseline", and the whole risk modelling approach needs to change. Currently, we are modelling climate risk as if catastrophe were a tail event, while sleepwalking into catastrophe being the most probable scenario. Other scenarios, e.g. net zero, delayed transition, climate distress, should be modelled relative to this "new baseline" of catastrophe, not the other way around.

This is not just a technical nuance. If we are serious about integrating climate risk into ECL, this reframing is necessary. A simple numerical sketch of what the reweighting does to a probability-weighted ECL follows below.
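The sketch compares a probability-weighted ECL under the conventional weighting intuition with one using climate-informed weights in the spirit of the EDHEC mapping (57.5% Climate Catastrophe, 35% Climate Distress, 5% Baseline). The per-scenario loss rates and the residual 2.5% weight on an upside scenario are illustrative assumptions.

```python
# Illustrative probability-weighted ECL under two weighting schemes.
scenario_loss_rates = {          # assumed expected loss rate per scenario (made up)
    "Baseline": 0.010,
    "Climate Distress": 0.025,
    "Climate Catastrophe": 0.060,
    "Net Zero / Upside": 0.008,
}

conventional_weights = {         # typical ECL-style intuition: baseline dominates
    "Baseline": 0.75,
    "Climate Distress": 0.10,
    "Climate Catastrophe": 0.05,
    "Net Zero / Upside": 0.10,
}

climate_weights = {              # weights in the spirit of the EDHEC mapping
    "Baseline": 0.05,
    "Climate Distress": 0.35,
    "Climate Catastrophe": 0.575,
    "Net Zero / Upside": 0.025,  # assumed residual weight, not from the paper
}

def weighted_ecl(loss_rates, weights):
    """Probability-weighted ECL as the sum of loss rate times scenario weight."""
    return sum(loss_rates[s] * weights[s] for s in loss_rates)

print(f"ECL, conventional weights:     {weighted_ecl(scenario_loss_rates, conventional_weights):.2%}")
print(f"ECL, climate-informed weights: {weighted_ecl(scenario_loss_rates, climate_weights):.2%}")
```

Even with these placeholder loss rates, shifting most of the weight onto the severe scenarios roughly quadruples the weighted ECL, which is the practical point of treating catastrophe as the most probable scenario rather than a tail event.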