One of the most challenging questions regarding future climate change impacts on hurricanes and other tropical cyclones (TCs) is, “Will there be more storms, and how bad will they be?” The question has been asked for decades, yet it remains the focus of active research and debate.
Analysis of output from nine state-of-the-art GCMs shows an overall increase in tropical cyclone frequency under a much higher CO2 concentration.
What have we learned in 30 years?
In early 1990, Anthony J. Broccoli and Syukuro Manabe, both from the Geophysical Fluid Dynamics Laboratory (GFDL), conducted a study describing results from a set of very coarse general circulation model (GCM) climate simulations that yielded changes in global TC counts between -6% and +6%, depending on model resolution and cloud parameterizations.
Thirty years later, Thomas Knutson, also from the GFDL, summarized results from hundreds of peer-reviewed studies that used more sophisticated models and found global frequency changes on the order of -30% to +20%. So, are we less certain now about how TC frequency may change than we were more than 30 years ago?
GCMs have become more sophisticated
The lines of code in GCMs are numerical representations of what scientists know about the physical, chemical, and biological processes in the earth’s atmosphere, ocean, biosphere, and land surface. These processes describe how heat, moisture, momentum, and chemicals move across space and time. As science and technology have advanced, GCMs have become more sophisticated, capable of accommodating many more factors and variables and thus generating a wider range of outcomes.
GCMs don’t explicitly simulate tropical cyclones
A constraining aspect of many coarse-resolution GCMs is their inability to explicitly simulate strong TCs. These storms are relatively localized phenomena that result from the complex interplay of many factors, which is challenging for these large-scale global models to capture. Even weak TCs may not be sufficiently represented because the proxy-based algorithms and criteria used to identify TCs in these models under the current climate may not work the same way for future climates.
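To make that caveat concrete: one widely cited proxy of this kind is the genesis potential index of Emanuel and Nolan (2004), which scores how favorable a grid cell’s large-scale environment is for TC formation using fields that coarse GCMs do resolve. The sketch below is illustrative only; the input values are placeholders rather than output from any particular model, and the article does not specify which proxies the studies discussed here rely on.

```python
import numpy as np

def genesis_potential_index(abs_vort_850, rel_hum_600, pot_intensity, shear_850_200):
    """Emanuel-Nolan (2004) genesis potential index, one example of a
    TC proxy computed from large-scale fields a coarse GCM does resolve.

    abs_vort_850  : absolute vorticity at 850 hPa (s^-1)
    rel_hum_600   : relative humidity at 600 hPa (%)
    pot_intensity : potential intensity (m/s)
    shear_850_200 : 850-200 hPa wind shear magnitude (m/s)
    """
    return (np.abs(1e5 * abs_vort_850) ** 1.5
            * (rel_hum_600 / 50.0) ** 3
            * (pot_intensity / 70.0) ** 3
            * (1.0 + 0.1 * shear_850_200) ** -2)

# Placeholder values, not taken from any real model grid cell:
print(genesis_potential_index(abs_vort_850=5e-5,   # s^-1
                              rel_hum_600=55.0,    # %
                              pot_intensity=65.0,  # m/s
                              shear_850_200=8.0))  # m/s
```

Because an index like this is calibrated against where storms form in today’s climate, carrying the same thresholds into a warmer climate is exactly the kind of assumption flagged above.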
Modeling at a higher resolution will help decrease the uncertainty
The optimal way to explicitly model TCs in the current or future climate is to run GCMs at a one- to two-kilometer resolution. Unfortunately, doing so with current technology would take so long as to be infeasible. However, studies that downscale GCM output, especially from simulations with resolution coarser than 100 km, show more evidence of increasing frequency. Downscaling GCMs can be done using the following techniques:
- Running the coarse-scale conditions through a high-resolution, regional-scale model
- Feeding the GCM’s coarse-scale information into a simpler, physically based model (a toy sketch of this approach follows the list)
- Using a higher-resolution (e.g., 25 km) GCM that explicitly simulates TCs directly
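As a rough illustration of the second technique, the toy sketch below takes two environmental quantities one might pull from a coarse GCM grid cell, potential intensity and vertical wind shear, and feeds them into a deliberately simple intensity equation. The equation, its constants, and the input numbers are all invented for illustration; this is not Emanuel’s downscaling system or any published model, just a picture of the data flow: coarse GCM fields in, an explicitly simulated storm life cycle out.

```python
# Toy illustration only: placeholder GCM environmental values drive a
# made-up, physically flavored intensity equation. Not a published model.

def downscale_intensity(pot_intensity, shear, v0=15.0, hours=120, dt=1.0):
    """March a toy intensity equation forward in time.

    pot_intensity : potential intensity from the GCM environment (m/s)
    shear         : vertical wind shear (m/s), treated as a simple drag
    v0            : initial disturbance intensity (m/s)
    Returns the storm's peak wind speed (m/s) over its lifetime.
    """
    growth_rate = 0.05     # per hour, arbitrary toy constant
    shear_penalty = 0.002  # per hour per (m/s) of shear, arbitrary toy constant
    v = peak = v0
    for _ in range(int(hours / dt)):
        # Intensify toward the environmental potential intensity,
        # weakened in proportion to the wind shear.
        dv = growth_rate * (pot_intensity - v) - shear_penalty * shear * v
        v = max(v + dv * dt, 0.0)
        peak = max(peak, v)
    return peak

# "Current" vs. "doubled-CO2" environments (placeholder numbers):
print(downscale_intensity(pot_intensity=70.0, shear=10.0))
print(downscale_intensity(pot_intensity=78.0, shear=8.0))
```

The appeal of this kind of approach is the division of labor: the small-scale storm physics lives in the simple model, while the large-scale response to higher CO2 comes from the GCM.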
Climate change impact goes beyond air and ocean temperature
Since the start of the industrial age, rising CO2 levels have been warming the atmosphere and oceans, allowing more storms to develop and become stronger. Of course, climate change affects factors beyond air and ocean temperatures. For example, the poles are warming faster than the equator, which is expected to reduce the jet stream’s strength and decrease vertical wind shear, resulting in circumstances conducive to TC development.
Meanwhile, two lesser-known ingredients could be altered by climate change in a way that would inhibit tropical cyclone development. The same greenhouse gases that increase ocean temperatures will also likely warm the upper troposphere in the tropics more than the lower troposphere, stabilizing the atmosphere and weakening the background rising motion. Additionally, the mid-levels of the atmosphere will likely become drier.
Even as modern, sophisticated GCMs generate a wider range of results, they are also enabling scientists and modelers to apply innovative techniques that narrow down how climate change affects the ingredients of TCs.
CO2 levels could double by 2070
The most recent IPCC report on climate change includes climate change scenarios with high and very high greenhouse gas (GHG) emissions (SSP3-7.0 and SSP5-8.5). These scenarios posit that CO2 concentrations will roughly double from their current levels by 2070 and 2090, respectively, causing higher air and ocean temperatures. If no actions are taken to reduce CO2 emissions, many of us will likely experience more intense TCs.
A doubling of CO2 concentration should increase the number of TCs
Dr. Kerry Emanuel of MIT recently downscaled output from nine state-of-the-art GCMs using the second of the techniques listed above and concluded that an overall increase in TC frequency will occur under a much higher CO2 concentration.
Emanuel’s study used simplified, high-spatial-resolution, physically based equations of motion, driven by environmental conditions from the GCMs, to investigate how doubling CO2 would affect TC activity. Globally, the results showed increases in frequency for all Saffir-Simpson categories, including weak storms and storms at landfall, with greater changes for the North Atlantic Basin than for other basins.
One strength of Emanuel’s approach is that it uses multi-model GCM output, which captures the large-scale impacts of increased CO2 on temperature, wind shear, stability, and moisture. The downscaling procedure then shows how TCs would develop under those conditions.
While the Emanuel downscaling approach isn’t perfect, it is a high-resolution, dynamically based solution that yields explicitly simulated TCs without relying on proxies or indices. In that sense, the results are difficult to second-guess, even if they haven’t been widely accepted yet, especially because other recent high-resolution studies are reaching similar conclusions.
How (re)insurers can benefit from these recent findings
Using results from the Emanuel study, Verisk has leveraged its industry-leading catastrophe models for U.S. hurricanes and Caribbean tropical cyclones and developed Climate Change Projections in the form of mapping files. (Re)insurers can use these files to adjust U.S. and Caribbean year event loss tables (YELTs) to account for changes in TC frequency by the Saffir-Simpson category under four climate scenarios at four future time horizons. These Climate Change Projections provide a probabilistic view of future risk, including how average annual losses (AALs) and other loss metrics, such as 100-year and other return period losses, may change in the future.
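Conceptually, the adjustment behaves like re-weighting the events in a YELT by Saffir-Simpson category. The sketch below assumes a simplified, hypothetical table layout and invented frequency-change factors; Verisk’s actual mapping-file format, factors, and software workflow are not described in this article.

```python
import pandas as pd

# Hypothetical, simplified stand-in for a YELT: one row per simulated event,
# with the simulation year, Saffir-Simpson category, and loss. Every column
# name and number below is illustrative, not Verisk's format or data.
yelt = pd.DataFrame({
    "year":     [1, 1, 2, 3, 3, 4, 5],
    "category": [1, 3, 2, 4, 1, 5, 3],
    "loss":     [2.0e6, 4.5e7, 8.0e6, 1.2e8, 1.5e6, 3.0e8, 5.0e7],
})
n_years = 5  # number of simulated years the YELT represents

# Invented frequency-change factors by Saffir-Simpson category for one
# climate scenario and time horizon (a real mapping file would supply these).
freq_factor = {1: 1.02, 2: 1.04, 3: 1.08, 4: 1.12, 5: 1.18}

# Baseline AAL: total simulated loss divided by the number of simulated years.
baseline_aal = yelt["loss"].sum() / n_years

# Frequency-adjusted AAL: weight each event by its category's factor, which
# is equivalent to changing how often events of that category occur.
weights = yelt["category"].map(freq_factor)
adjusted_aal = (yelt["loss"] * weights).sum() / n_years

print(f"Baseline AAL: {baseline_aal:,.0f}")
print(f"Adjusted AAL: {adjusted_aal:,.0f}")
```

The same re-weighting logic extends to other metrics, such as 100-year and other return period losses, although those require recomputing the full annual loss distribution rather than a single average.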
(Re)insurers can further leverage these mapping files to assess future potential losses for residential, commercial/industrial, manufactured (mobile) homes, and automobile lines of business. They can also use them to help develop and enhance climate risk insights and gain peril-specific data, which can inform mitigation and adaptation strategies, aid in stress-testing and rebalancing their portfolios, and assist them in responding to regulatory requirements. To receive more information about Verisk’s climate change solutions, please get in touch with the Verisk Climate Change Practice.
The time to take action is now
While the impacts of climate change on other aspects of TCs, such as wind speed, atmospheric pressure, and precipitation, were not addressed in this article, the increasing frequency of strong storms has become more of a certainty than a theory. And because these storms cause the majority of the damage, we shouldn’t wait for 200-year storms to become 100-year storms before taking further action, both to mitigate future climate change and to better adapt to the change that is already inevitable.