New publication: innovative techniques to advance climate change projection coverage.


New research published in January’s International Journal of Climatology presents a novel method to increase the efficiency of generating climate prediction data. The study, co-authored by EarthSystemData and led by UEA’s Climatic Research Unit, describes and demonstrates a new research method that enables daily climate projection data to be produced that can represent a greater range of potential climate futures than could otherwise be achieved. Read more below:

The recent increase in complexity of the mathematical models used to predict Earth’s climate is a double-edged sword: more features can be added to these models, better representing the real world, but running the more complex models becomes expensive — both computationally and in real-money terms.

This expense is compounded because, to adequately prepare for the continued heating of Earth, a variety of predictions is needed to represent different possible futures: for example, one set predicting what would happen to the climate system if all nations reduce greenhouse gas emissions; another set simulating a future in which only some countries do – and so on.

Devising ways to reduce the expense of climate predictions is therefore of critical importance, especially since information is needed now, to plan and prepare for current and future heating.

Certain properties of how the Earth’s climate system responds to global heating can be exploited to help reduce the cost of assembling the necessary predictions. Fundamental here is the tendency of many local climate parameters to change linearly with the planetary-mean temperature change. For example, the temperature change at a specific city, ∆T_x, will often scale, by a factor a, with the global-mean temperature change ∆T:

∆T_x = a∆T      [i]

This is useful since it means that once we know the scaling factors, a, for all points on the planet, we can then derive the local climate change knowing just the future global temperature change, ∆T.

In essence, we do not need to run the ‘full’ climate model. Instead, we can use a less-complex (i.e. quicker and cheaper) model to establish ΔT for a pool of different future scenarios and then calculate the local climate response according to [i].
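As a toy illustration of how eq. [i] is applied in practice, the sketch below multiplies a small grid of hypothetical pattern values by two assumed global-mean warming amounts. All numbers here are invented for the example; the real patterns come from analysing full climate model output.

```python
import numpy as np

# Illustrative pattern values ('a' in eq. [i]) for a toy 2 x 3 grid, in degC of
# local change per degC of global-mean change. Invented numbers for the example.
a = np.array([[0.8, 1.1, 1.3],
              [1.0, 1.5, 2.1]])

# Global-mean warming (dT) for two hypothetical scenarios, e.g. taken from a
# simple, cheap climate model rather than a full GCM run.
scenarios = {"moderate": 1.6, "high": 3.2}

for name, dT in scenarios.items():
    local_change = a * dT   # eq. [i]: dT_x = a * dT, evaluated at every grid cell
    print(name, "\n", local_change)
```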

This technique, pattern scaling, was developed in the 1990s and is frequently used to increase the information available to international and national policy makers, including for the UN-IPCC Assessment Reports, which are fundamental in forming planning strategies for future climate impacts.

Pattern scaling utilises the relationship between the change of a climate parameter (e.g. monthly rainfall amount) at a given location and the change in the overall global-mean temperature. Shown here are the 'pattern values' – i.e. 'a' in eq. [i] – for an example analysis [Author's own study, 2020].

Usually pattern scaling produces monthly-scale data. However, effective climate planning requires information on the daily climate (i.e. the future weather) as well, especially in the case of hydrological planning for floods and droughts. Providing daily climate data can be difficult for complex global climate models, which excel in simulating the slower, planetary-scale processes which modify the climate, but may be less accurate in simulating shorter time-scale, and geographically local (i.e. at the city or village scale), data.

For that reason, climate scientists developed separate mathematical tools to blend climate model predictions with statistical properties of local climate taken from real-world weather observations. These tools, via this amalgamation, can produce physically-plausible daily series of data at a very precise geographical location that represents the future climate. 

Within this class of tools are so-called stochastic weather generators (bear with me). These generators have a number of statistical parameters which are set to represent the ambient climate in the location they are simulating, and which are then modified to represent the future climate. For example, in order to generate a series of plausible rainfall data for a given month, say a March, the generator will need to know characteristics such as: how many days does it usually rain in March at this specific location? What are the chances of a March day being a rain day if the previous one, two, or three days also experienced rain? How much rain might fall in a given day, at this location in March?
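To make those questions concrete, here is a minimal sketch of deriving that kind of statistic for one site and one calendar month from an observed daily rainfall series. The 1 mm wet-day threshold and the rainfall values are assumptions for illustration only.

```python
import numpy as np

# One March of daily rainfall observations (mm) for a single site - invented values.
march_rain_mm = np.array([0.0, 2.3, 5.1, 0.0, 0.0, 12.4, 0.2, 0.0, 3.8, 0.0,
                          0.0, 0.0, 7.6, 1.4, 0.0, 0.0, 0.0, 9.9, 4.2, 0.0,
                          0.0, 6.3, 0.0, 0.0, 0.0, 0.0, 2.9, 0.0, 0.0, 0.0, 1.1])

wet = march_rain_mm >= 1.0                   # classify each day as wet or dry (1 mm threshold)

p_wet = wet.mean()                           # how often does it rain in March here?
p_wet_given_wet = wet[1:][wet[:-1]].mean()   # chance of rain if the previous day was wet
mean_wet_amount = march_rain_mm[wet].mean()  # typical rainfall on a wet day (mm)

print(f"wet-day frequency:        {p_wet:.2f}")
print(f"P(wet | previous wet):    {p_wet_given_wet:.2f}")
print(f"mean wet-day amount (mm): {mean_wet_amount:.1f}")
```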

To run these tools for a particular future climate, scientists would ordinarily analyse the climate model’s daily data (for that future) to obtain the required statistical parameters for the future, with the remaining set of required statistical parameters being derived from real-world observations. 

What if, though, the statistical parameters representing the future could themselves be pattern scaled, in the way described above? In other words, if you knew, by prior analysis, how each statistical parameter responded per degree of planetary warming, then equation [i] could simply be solved – not to predict temperature – but to predict each weather generator setting, knowing only the global-mean temperature change.
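A hedged sketch of that idea is below, with invented numbers rather than values from the study: if a prior analysis has estimated how each generator parameter changes per degree of global-mean warming, the future parameter values follow directly from ∆T.

```python
# Illustrative only: 'pattern scale' the generator parameters themselves.
baseline_params = {"p_wet": 0.35,            # present-day wet-day probability for March
                   "p_wet_given_wet": 0.55,  # persistence of wet spells
                   "mean_wet_amount": 4.8}   # mm per wet day

sensitivity_per_degC = {"p_wet": -0.02,           # assumed change in each parameter per
                        "p_wet_given_wet": -0.01, # degC of global-mean warming (the 'a'
                        "mean_wet_amount": 0.30}  # values for the parameters)

dT = 2.4   # global-mean warming from a simple, cheap model (degC)

future_params = {name: baseline_params[name] + sensitivity_per_degC[name] * dT
                 for name in baseline_params}
print(future_params)
```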

Our study outlines the first working framework for this precise combination of techniques – that is, the global pattern scaling of weather generator parameters. It demonstrates the validity of the approach using output data from one of the thirty-or-so IPCC-class fully-coupled global climate models, IPSL-CM6A-LR, yet the technique can be applied to any climate model that has the necessary diagnostic data.

In short, this means that from just a few sets of training simulations, a very large volume of additional data can be synthesized, approximating any variety of future climate conditions or emissions scenarios not explicitly modelled by the full climate model: the approach enables an extremely cost-effective route to substantially deepening the pool of climate data available to inform climate-impact decision making.

To read the full details of the study you can access the paper online here. This study was performed by lead author Dr Sarah Wilson Kemsley as part of her successful PhD research at the University of East Anglia. EarthSystemData’s Dr Craig Wallace was a co-author of the study and a co-supervisor of Sarah’s PhD project.

MULTIPLE MONTHS HAVE BREACHED 1.5°C OF GLOBAL WARMING


Dr. Craig Wallace, Lead Climatologist.

We routinely evaluate the latest, leading, global climate data sets, both to generate information for our clients and, when time allows, for our own general scientific interest.

As far as global observed temperature records go, one of the (if not the) most established and well-respected records is the Climatic Research Unit & UK Meteorological Office HadCRUT dataset, whose current version [July 2023] is HadCRUT5 [1].

This is a blended data set of monthly-mean ocean surface temperatures and land surface air temperatures (~2 m). It has been a mainstay of scientific climate analysis for decades and a key element in the United Nations Intergovernmental Panel on Climate Change’s (UN-IPCC) assessment reports evaluating anthropogenic influence on the climate system: HadCRUT, in its various incarnations over the years, has been central to the detection and attribution of anthropogenic warming.

This month we implemented new code to analyse HadCRUT5, together with other data, part of which involved re-calibrating HadCRUT5 anomalies to a pre-industrial benchmark, as opposed to their standard provided 1961-1990 climatology. 

Re-adjusting these data to a pre-industrial average is now a critical requirement, since it is against the pre-industrial climate state that the United Nations has set global warming limits – such as the 1.5 and 2.0°C limits associated with the well-known Paris Agreement [2].  Therefore this is what we did.

In our calculations, we follow the UN-IPCC adoption of defining the pre-industrial period as the 51 years between 1850-1900 [3] [4] [5], a period commensurate with near-global observations and a low (but not zero) level of human greenhouse gas emissions. 

Example definition of the pre-industrial period within the IPCC Special Report on 1.5°C [5]

Other pre-industrial period definitions have been discussed [6] [7] on the basis that some anthropogenic influence on the global climate system is likely to have been present even by 1850, that volcanic forcing was present in the 1880s, and that the balance between those two opposing forcings is uncertain. We will return to this point below; however, in line with the bulk of scientific assessments, we predominantly use the 1850-1900 period given the rationale noted above.

To proceed with our re-adjustments, we compose a vector, x, of 12 average 1850-1900 anomalies, in their 1961-1990 form, one for each calendar month, January to December. Each of the 12 values is the mean of the 51 Januarys (or Februarys, Marches, etc.) within the 1850-1900 time slice. The resulting values are shown below:

Global mean temperature anomalies of each month, relative to 1961-1990, averaged for the 1850-1900 time period (n = 51).

Thereafter, for each respective calendar month, i, January … December, we add the absolute value |x_i| (equivalently, subtract x_i, since these pre-industrial means are all negative) to all Januarys, Februarys … Decembers in the raw, 1961-1990 anomaly data, thereby re-expressing the 1961-1990 anomaly values relative to 1850-1900. To use January as an example, the 1961-1990 anomaly series looks like this:

HadCRUT5 January global-mean temperature anomalies bench-marked to the 1961-1990 period. The mean of the 1961-1990 values = 0.

The 1850-1900 mean of the 1961-1990 January anomalies is -0.4238. By adding absolute(-0.4238) to the series, they become re-expressed with reference to (wrt) 1850-1900 (orange series in the below plot). The mean of the 1850-1900 anomalies is, accordingly, zero: 

Orange: HadCRUT5 January anomalies expressed as deviations from the 1850-1900 average.
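For readers who want to reproduce the transform, a minimal sketch is below. The anomaly array is a placeholder standing in for the real HadCRUT5 monthly global-mean series; the loading step is omitted.

```python
import numpy as np

# Placeholder data only: a smooth trend plus noise, shaped (n_years, 12) and starting
# in 1850, standing in for monthly global-mean anomalies relative to 1961-1990.
rng = np.random.default_rng(0)
years = np.arange(1850, 2024)
anom = 1.2 * ((years - 1850) / 170.0)[:, None] ** 2 + rng.normal(0.0, 0.15, (len(years), 12))

# The vector x: mean of each calendar month over the 1850-1900 pre-industrial period
pre_industrial = (years >= 1850) & (years <= 1900)
x = anom[pre_industrial, :].mean(axis=0)

# Re-express all anomalies relative to 1850-1900 (subtracting x is equivalent to
# adding |x| when the pre-industrial means are negative, as in the January example)
anom_pi = anom - x

# Sweep the re-based series for months exceeding 1.5 degC above pre-industrial
breach_rows, breach_cols = np.where(anom_pi > 1.5)
for r, c in zip(breach_rows, breach_cols):
    print(f"{years[r]}-{c + 1:02d}: +{anom_pi[r, c]:.2f} degC")
```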

Having applied the transform for the remaining 11 months, we can then sweep the monthly series for anomalies >1.5°C. In doing so, we find none before 2015 and a total of six months since 2016, the most recent being March 2023 (at the time of this analysis HadCRUT5 had not been updated beyond March 2023, due to the operational delay of collecting, cleaning and calculating the component data that comprises the global mean).

The six months with warming in excess of 1.5°C are:

January 2016: +1.51°C

February 2016: +1.63°C

March 2016: +1.62°C

February 2020: +1.53°C

March 2020: +1.53°C

March 2023: +1.56°C

We can examine these breaches by plotting the time series of the HadCRUT5 values (wrt 1850-1900) for the relevant months of January, February and March (nb. only values after the year 2000 are plotted, since no breaches occur before 2016):

Given the scientific consensus that limiting global warming to 1.5°C is required to avoid dangerous levels of climate change (compared to higher levels, say, 2.0°C) [5] [8] [9], we should be extremely alarmed at these findings – the first evidence that Earth has, in fact, heated beyond the threshold for periods longer than a few days [10].

To assess these findings conservatively in the context of the UN-FCCC, however, we must point out that the UN-FCCC has, since its formation, been clear that warming limits, thresholds etc. refer to – and only to – the anthropogenic component of heating [11] [12].

Excerpt from the formative UN Framework Convention on Climate Change Treaty text, 1992. (highlight added)

This nuanced definition does not detract from the importance of the above findings but, rather, points to the fact that a formal, technical breach of the 1.5°C limit (in UN-FCCC parlance) would require an extended period, of at least, say, two decades, in which the mean anomaly benchmarked to 1850-1900 is >1.5°C. This requirement is to ensure that the warming signal is indeed anthropogenic: to do this we must sample a sufficiently long period within which natural forcing agents effectively cancel each other out. The principal process in this consideration is the La Niña & El Niño phenomenon, which modifies global temperatures by a few tenths of a °C either way. Indeed, the 2016 breaches noted above coincide with an exceptionally strong El Niño, suggesting the human-driven warming was <1.5°C.

More worryingly, however, the most recent breach, March 2023, occurred during the tail-end of an extended La Niña phase, which imparted a cooling upon the human-driven warming. This would suggest not only that March 2023’s warming was human-driven but, also, that the human-caused heating was likely in excess of the observed 1.56°C.

To establish confidence in the 1.5°C breaches revealed in HadCRUT5 we can refer to an alternative dataset – GISTEMP4 – produced by NASA’s Goddard Institute for Space Studies, and another ‘de-facto’ data set used by the UN-IPCC for climate monitoring [13] [14]. Aside from using different assemblage techniques, these data are useful since they force us to use an alternative pre-industrial period: GISTEMP4 records commence in 1880, rather than 1850. We therefore repeat the transformation described above, but using an 1880-1900 mean for each month, and apply this to the GISTEMP4 anomalies (expressed in their raw form relative to 1951-1980). In doing so, the following months are identified as having breached 1.5°C – all of which cross-validate as HadCRUT5 breaches:

February 2016: +1.64°C

March 2016: +1.59°C

February 2020: +1.52°C

The three absent months, compared to HadCRUT5, are January 2016, March 2020 and, of course, March 2023 (+1.44°C).

One final interesting exercise is to re-define the pre-industrial period used in our initial HadCRUT5 transform from 1850-1900 to 1880-1900 so as to match NASA-GISTEMP4 and then re-scan HadCRUT5 for 1.5°C breaches. We find an additional breach, January 2020, and that three of the previously-noted breaches (January 2016, February 2016 and February 2020) are stronger when using this benchmark:

 

January 2016: +1.56°C

February 2016: +1.66°C

March 2016: +1.62°C

January 2020:  +1.53°C

February 2020: +1.54°C

March 2020: +1.53°C

March 2023: +1.56°C

It is likely that this enhancement of the breach magnitudes relates to the global cooling imparted by the 1883 Krakatoa eruption (e.g. [15] and others), reducing the 1880-1900 benchmark temperatures compared to a longer period. On balance this lends weight to the original selection of 1850-1900 as a superior period against which to diagnose bona fide human-driven warming.

In conclusion, our analysis shows numerous months exceeding 1.5°C of global warming, including the latest breach, March 2023, which occurred despite natural conditions that act to cool against greenhouse gas warming. This should serve as a stark and sobering indicator of our current position with respect to ‘safe’ climate warming levels.

A ‘by-the-book’ UN-FCCC breach will be composed of a multi-year period in which some months exceed 1.5°C by some distance, others less so, and some in which the observed warming anomalies are <1.5°C.

The findings reported here are consistent with the first emergence of such a pattern. 

 

Dr Craig Wallace is Lead Climatologist at EarthSystemData. He holds a PhD in physical climatology from the world-renowned Climatic Research Unit at the UK’s University of East Anglia (co-creators of the HadCRUT5 data set). He has 20 years’ experience of post-doctoral climate change research involving analysis of instrumental and modelled climate data, and he is an IPCC-cited author.

References:

[1] Morice, C.P., Kennedy, J.J., Rayner, N.A., Winn, J.P., Hogan, E., Killick, R.E., Dunn, R.J.H., Osborn, T.J., Jones, P.D., and Simpson, I.R., 2021: An updated assessment of near-surface temperature change from 1850: the HadCRUT5 dataset. Journal of Geophysical Research 126, e2019JD032361, doi:10.1029/2019JD032361

[2] https://unfccc.int/files/essential_background/convention/application/pdf/english_paris_agreement.pdf

[3] IPCC, 2013: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change [Stocker, T.F., D. Qin, G.-K. Plattner, M. Tignor, S.K. Allen, J. Boschung, A. Nauels, Y. Xia, V. Bex and P.M. Midgley (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, 1535 pp.

[4] IPCC, 2021: Climate Change 2021: The Physical Science Basis. Contribution of Working Group I to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change [Masson-Delmotte, V., P. Zhai, A. Pirani, S.L. Connors, C. Péan, S. Berger, N. Caud, Y. Chen, L. Goldfarb, M.I. Gomis, M. Huang, K. Leitzell, E. Lonnoy, J.B.R. Matthews, T.K. Maycock, T. Waterfield, O. Yelekçi, R. Yu, and B. Zhou (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, In press, doi:10.1017/9781009157896.

[5] IPCC (2018). Global Warming of 1.5°C: IPCC Special Report on impacts of global warming of 1.5°C above pre-industrial levels in context of strengthening response to climate change, sustainable development, and efforts to eradicate poverty (1 ed.). Cambridge University Press. doi:10.1017/9781009157940.001. ISBN 978-1-009-15794-0.

[6] https://www.climate-lab-book.ac.uk/2017/defining-pre-industrial/

[7] Hawkins, E., and Coauthors, 2017: Estimating Changes in Global Temperature since the Preindustrial Period. Bull. Amer. Meteor. Soc., 98, 1841–1856, https://doi.org/10.1175/BAMS-D-16-0007.1.

[8] David I. Armstrong McKay et al., 2022: Exceeding 1.5°C global warming could trigger multiple climate tipping points. Science, 377, eabn7950, doi:10.1126/science.abn7950

[9] Lenton TM, Rockström J, Gaffney O et al. 2019. Climate tipping points – too risky to bet against. Nature 575: 592– 595.

[10] https://climate.copernicus.eu/tracking-breaches-150c-global-warming-threshold

[11] UN-FCC, 1992, FCCC/INFORMAL/84 GE.05-62220 (E) 200705

[12] pers. comm. Professor Tim Osborn, 2023

[13] GISTEMP Team, 2023: GISS Surface Temperature Analysis (GISTEMP), version 4. NASA Goddard Institute for Space Studies. Dataset accessed 20YY-MM-DD at https://data.giss.nasa.gov/gistemp/.

[14] Lenssen, N., G. Schmidt, J. Hansen, M. Menne, A. Persin, R. Ruedy, and D. Zyss, 2019: Improvements in the GISTEMP uncertainty model. J. Geophys. Res. Atmos., 124, no. 12, 6307-6326, doi:10.1029/2018JD029522.

[15] Gao, Chaochao, Alan Robock, and Caspar Ammann, 2008:  Volcanic forcing of climate over the past 1500 years: An improved ice-core-based index for climate models.  J. Geophys. Res., 113, D23111, doi:10.1029/2008JD010239.  

 

February 2022: Major model agreement for forecasts


December’s long-range forecasts for February 2022 are displaying major agreement over continental North America. This is a rare occurrence – indicating a high degree of confidence in the forecasts. Conditions are therefore likely to be colder-than-normal over Alaska and central/eastern Canada, with warmer-than-normal conditions prevailing in the central and southern USA.

Five long-range forecasting systems involved in EarthSystemData’s routine analyses for our users – the UK’s UKMO-GLO-SEA6, Germany’s DWD-GCFS2.0, Meteo-France’s System 7, the European consortium’s ECMWF SEAS-5 and America’s NCEP/NOAA-CFSystemv2 – are showing a rare consensus for late winter 2022 North American conditions, very likely a consequence of the present strong La Nina Pacific Ocean state.

All five modelling systems are projecting colder-than-normal weather for the Pacific north-west portion of the continent with warmer-than-normal weather to the south and east of the USA, in what is termed a ‘dipole’ anomaly.

Multi-model average temperature forecasts over North America for Feb. 2022 (five forecasting systems detailed in main text)

La Nina Teleconnection

The pattern of these temperature conditions is a signature response to the La Nina phenomenon farther south in the Tropical Pacific Ocean, characterised by colder-than-normal sea surface temperatures which in turn perturb the regular atmospheric circulation.

The scientific community remarked earlier in the autumn on the emergence of strong La Nina conditions and anticipated that the phase would continue throughout the winter, a prediction that is evidently being borne out.

Nonetheless, the fact that all five long-range forecasting systems are capturing the atmospheric response to the La Nina conditions is fairly remarkable, given how far ahead these forecasts are issued, in relative forecasting terms.

Ordinarily, variations between each forecasting model’s projections cause some degree of uncertainty in the forecasts themselves; for December’s runs, however, the agreement between projections appears to be solid.

La Nina ocean state (blue colours indicating colder-than-normal sea surface temperatures) latest conditions (as of December 2021) as plotted by NCEP/NOAA : https://www.cpc.ncep.noaa.gov/products/analysis_monitoring/lanina/enso_evolution-status-fcsts-web.pdf

Impacts on Expected Temperatures

Regular February temperature averages in Anchorage, Alaska – central in the cold anomaly area – would be expected to be around -6°C, although these may fall to around -9°C, with minimum temperatures being reduced by a similar amount (e.g. the February average minimum temperature [~-10°C] reducing to approximately -13°C). 

Within the U.S.A. Gulf Coast sector, for example Houston Texas, regular February average temperatures (around +13°C) may rise by two or three degrees – with more moderate warming over the vast area of south central United States and Central American countries.

 

How are Temperatures Modified?

These temperature departures can be explained by the atmospheric circulation changes associated with La Nina (in fact, La Nina strengthens the ‘normal’ local atmospheric circulation over the tropical eastern Pacific), leading to interference with the Pacific Ocean jet stream just to the north.

Much like the Atlantic jet stream, this wind pattern would ordinarily bring relatively warm, maritime air masses onto the North American seaboard, especially in the northern mid-latitudes. A weaker jet stream therefore leaves the door open for Arctic air mass incursions.

In Europe a similar atmospheric perturbation can occur, weakening the winter-time westerly airflow: this phenomenon (or, more precisely, the measurement of the phenomenon) is called the ‘North Atlantic Oscillation’ (NAO). When the NAO adopts a so-called negative phase, maritime airflow into northern Europe is much weaker, favouring colder, continental winter air.

In extreme negative phases, these conditions are indicative of ‘Beast from the East’ type weather, with northern Europe dominated by air masses of eastern, or north-eastern, origin. However, unlike its Pacific counterpart and the resulting forecasts discussed here, there is no evidence that the European phenomenon is linked to La Nina.

 

It will be interesting to monitor January and February’s weather developments for this sector; the early-autumn expectations of La Nina-type domination of the winter northern hemisphere weather patterns are being borne out – in the modelling systems at least. Stand by.

-CW 20/12/2021

 

Ensemble seasonal forecasts for water planning


Our ESDV3 Seasonal Forecasting service now includes bespoke long-range hydro-meteorological forecast data for 51 ECMWF ensemble members. This allows users to test-case planning strategies against a selection of possible forthcoming weather scenarios and better manage risk using cutting-edge NWP data.


We’ve blogged previously about how forecasts of climate and weather are sensitive to small differences in the starting conditions of forecasts: the equations that predict future weather conditions need numbers to start with – i.e. observations of the current weather – and it is impossible to *perfectly* observe the current weather across the globe, with no error, to provide these numbers. Due to the characteristics of the equations in leading weather forecasting systems (e.g. UKMO GLO-SEA6, Meteo-France System 7, ECMWF SEAS-5, DWD GCFS2.0, NCEP-NOAA CFSystemv2), small differences in starting numbers can grow into different forecasts – the so-called chaos effect.

To combat this, forecasting centres conduct multiple runs of forecasts, each with slightly different starting numbers. Users then have a choice of how to use these forecasts: [1] take the average of all the individual runs and use that information; or [2] analyse all of the individual runs and establish the worst and best possible outcomes (or option [3]: do both [1] and [2]!).
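A minimal sketch of options [1] and [2], assuming each ensemble member supplies a single forecast value; the numbers are invented for illustration.

```python
import numpy as np

# One forecast value per ensemble member, e.g. monthly rainfall total (mm) - invented.
ensemble = np.array([42.0, 55.3, 61.8, 38.9, 70.2, 47.5, 58.1, 66.4, 51.0, 44.7])

# Option [1]: the ensemble mean - a single 'best guess'
mean_forecast = ensemble.mean()

# Option [2]: the spread of outcomes - useful for best/worst-case planning
driest, wettest = ensemble.min(), ensemble.max()

print(f"ensemble mean: {mean_forecast:.1f} mm")
print(f"range: {driest:.1f} - {wettest:.1f} mm across {ensemble.size} members")
```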

 

 

ESD Seasonal Forecaster enhancements: ESDV3

When we launched our Seasonal Forecasting service, we provided clients with information derived from five numerical weather prediction models in the form of [1], the average of all the individual runs for each centre. This information is a useful guide, but users have no information on the probabilities of certain weather conditions being met. Is the average forecast the way it is because all the multiple runs came out with the same answer? Or because a small number had a result strongly in one direction and an equal number of runs had a result in the opposing direction (e.g. cold temperatures and warm temperatures)?

ESDV3 provides global coverage of temperature, rainfall and evaporation indices for an ensemble of 51 ECMWF members.

We started work several weeks ago to enhance the ESD Seasonal Forecaster to enable users to answer these types of questions and are pleased to say the ESDV3 Seasonal Forecaster is now available for our clients’ use! In the same way as the initial Forecaster, we configure the code base with our users’ specific geographic and weather-dependent metrics, and run the code each month, but now provide a full ‘ensemble’ set of output (51 runs) for one of our primary component modelling systems, ECMWF.

Identifying best & worst cases for planning purposes

We confer with our users to establish their unique weather criteria (in terms of exposure for their business / resources / operations) and then isolate the ‘best’ and ‘worst’ forecasts from the full set each month – allowing easy test-cases for their management strategies for the coming month, under best and worst potential cases.

Like the original ESD Seasonal Forecaster, ESDV3 has full global coverage and a maximum lead time of seven months. It will be configured to output the most relevant weather-related intelligence needed by your organisation to make the most resilient planning decisions to protect your community, resources and customers.

The ESD Seasonal Forecaster and ESDV3 are monthly services utilising raw data from five of the world’s leading national seasonal forecasting models and transforming these data into user-relevant packages for onward, inhouse planning. Apply to join now as a user, or learn more via the buttons below.

How financial systems and companies are managing climate risk: the TCFD program


As the pace of global climate change accelerates and weather extremes re-organise to the new ‘climate normal’, societal systems face no option but to adapt and mitigate risk. This includes the global financial system – defined here as the complex of banks, lenders, investors and insurers, and the associated transactions, that maintain much of the services and infrastructure on which our way of life hinges.

Climate-change is disrupting business and the financial system


Rapid global heating and the continuing jolts to global weather extremes have the capacity to catastrophically shock the financial system, if capital allocation and risk management strategies have not kept pace with change, nor adequately anticipated further change. 

Climate-related risk to the financial system – and constituent companies – can be separated into two parts. Firstly, there are the direct physical impacts from climate change as weather systems re-organise under new regimes: flooding, drought and excessive heat, for example, all interrupting business viability, either in the regions where companies are based or in remote regions that contribute to the company’s supply chain.

Secondly, so-called transition risks exist as the world attempts to de-carbonise: how will changing energy delivery and generation affect a company’s operations? What risks arise if consumer preferences continue to become greener? And, of course, what are the implications of possible emissions targets within industrial sectors?

The G20 acts

In recognition of the severity that these risks pose, the G20’s Financial Stability Board commissioned a task force in 2015 to devise a reporting system incumbent on companies and businesses that could serve two purposes: firstly, to steer businesses to take their own climate-related risks seriously (if not already) and to enable them to develop risk-management strategies increasing their resilience to ongoing global warming – whilst also identifying and managing any opportunities and advantages that may exist. Secondly, the reporting system would be such that it would provide lenders and investors with standardised cross-industry information that could help ensure efficient capital allocation by the market – in the face of disruptive and volatile change. 


Climate risk recommended reporting: win win. 

Within two years, the task force – the Task Force on Climate-related Financial Disclosures (TCFD) – released an inaugural set of reporting recommendations that companies could align to in order to increase their climate resilience and provide reliable strategic information to potential investors. These recommendations cover not just identifying and managing a company’s climate-related risks, but also include reporting on critical ancillary activities: how does the company structure ensure sufficient governance of climate-related risks and mitigation work? How does the company track performance against climate risks, and what measured metrics are used?

Uptake of recommendations has been rapid as global heating has continued unabated in the last few decades. Over 60% of the world’s largest companies are now TCFD supporters and / or return TCFD-aligned reports every year. But the benefits of TCFD reporting are not limited to just the largest companies.

Climate scenarios for risks and opportunities

Companies of all sizes should assess exposure to current and forthcoming climate change. Supply-chain interference, local physical risks and transition risks (to a low-carbon economy) are certain to exert stress on business continuity and profitability, and where risks exist, so too may lie opportunities: innovating services or products to help others mitigate climate-related risks, diversifying your own products / assets to ensure ongoing security, etc.

A key element of effective climate-related risk management, and recommended by the TCFD for particular focus, is appropriate scenario analysis. What does this mean? In a nutshell, it can be boiled down to this: how resilient is your organisation’s climate risk / opportunity management under differing futures? For instance, what if global greenhouse gas emissions fail to be checked, and a ‘worst-case’ climate scenario unfolds, instead of a more moderate one in which climate change is kept to a 2°C limit?

These different futures will have marked regional differences in physical climate impacts and transition pathways – and the jury is still out on which scenario will unfold. Risk strategies tied to one scenario, with no contingencies, will therefore be of limited use if reality differs from the assumed scenario. A better strategy is to devise risk management solutions that are as resilient as possible to as many potential future scenarios as possible. Do this and you not only maximise your business and protect customers, but also offer a far more secure proposition for future investment.

TCFD-aligned assessments add value to companies.  

Assessing your company via a TCFD-aligned report will provide you with valuable insights into your resilience against current and future climate change and identify areas for potential improvement. The exercise will enable you to consolidate against potentially catastrophic risk and allow you to convey this information to current and potential investors and customers. 

EarthSystemData are registered TCFD Supporters and have completed official TCFD Climate Disclosure Standards Board training for TCFD reporting. Through EarthSystemData’s ‘THEMIS’ service we can produce full TCFD-aligned assessments of your company and advise on impacts, opportunities, governance and risk-management areas to produce reports that match or exceed TCFD recommendations.

We are especially suited to working with SMEs that have regional exposure to climate risks, e.g. supply chains, clients or assets spanning several countries on a continental scale and beyond. Our service excels in identifying climate impacts and, since we are climate projection specialists dealing day-to-day with the latest United Nations IPCC climate change projection data, working with ESD guarantees that your climate risk evaluation will use the most scientifically-robust, internationally-accredited climate data.

We provide dedicated, one-to-one client engagement throughout the process and guaranteed delivery times. We’re a small, highly-experienced team, and therefore can offer superb value with zero quality compromise. 

To find out how we can help, contact us now – we’re ready to talk. Many companies commission TCFD-aligned returns as part of wider ESG workstreams; if this describes you, we can offer ESD’s TCFD service as part of a wider ESG service incorporating our ESG-specialist partners ESGABLE.

Craig Wallace PhD, EarthSystemData

EarthSystemData designs and deploys climate data solutions to meet organisational needs. We specialise in climate risk assessment, risk disclosure and adaptation projects assisting multi-national firms through to local clients. To discuss using our research or consultancy services contact us here: info[at]earthsystemdata.com 

 

Managing chaos: ensembles and forecasting the climate system


Forecasting the weather, the climate or the oceans is difficult for a host of reasons – devising a set of equations that can describe how air and water circulate across the planet under different conditions (height, temperature, changing density etc.), to name but one. But beyond that, even with a good set of equations, predictions are plagued by another problem: very small differences in the starting numbers for the forecast equations – ‘initial conditions’ – lead to vastly different end forecasts. How do we address this problem for end-users, especially for long-range forecasts?

The exact problem may be known to you already as the ‘Butterfly effect’ – the idea that the flapping of a butterfly’s wings can set off a chain reaction of events leading to vastly different outcomes elsewhere. This illustration was, in fact, coined by the mathematician and meteorologist Edward Lorenz himself, who was central to discovering the ‘starting condition problem’, together with researching its effects on how forecasts evolve through time and, ultimately, arrive at different predictions.

In the 1960s Lorenz created a simplified set of equations for an aspect of weather dynamics called ‘convection’ – describing how to simulate heat-driven rising and sinking of air masses. When implementing his equations in weather forecasting computer code, Lorenz was at a loss to explain how two simulations using the same computer arrived at radically different weather forecasts.

All he had done, for the second forecast, was to stop the computer half way through its run, print out the weather conditions at that point, and then re-input the numbers later to re-start, and finish, the second forecast. 

Unbeknown to Lorenz at the time, the printer used a slightly lower decimal precision for the numbers than that used in the computer’s physical memory. In the first forecast (with no stopping involved) the whole forecast proceeded at high numerical precision. In the second forecast, however, the numbers had very slightly changed precision half way through.

Lorenz was still perplexed: both sets of numbers were of reasonably high precision, so what difference could five, rather than six, decimal places make to the same numbers? In fact, as his subsequent research showed, a lot.
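A small illustration of what Lorenz found: integrating his 1963 convection equations twice, with one starting number differing in the fifth decimal place (mimicking the truncated printout), shows the two runs drifting apart. This is a crude forward-Euler sketch for illustration, not Lorenz’s original code.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system (crude, but enough to show divergence)."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

run_a = np.array([1.000000, 1.000000, 1.000000])
run_b = np.array([1.000010, 1.000000, 1.000000])   # one starting number differs very slightly

for step in range(3001):
    if step % 1000 == 0:
        # Distance between the two 'forecasts' grows dramatically with time
        print(f"step {step:4d}: separation = {np.linalg.norm(run_a - run_b):.6f}")
    run_a = lorenz_step(run_a)
    run_b = lorenz_step(run_b)
```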

Equations describing a system have beautiful shapes - depicting the tendency of values of the system's properties through time - 'attractors'. Small differences in the starting co-ordinates within the shape lead to different paths through the shape: predicted weather is similar initially, but increasingly divergent after a while - proportional to the space between the two paths. This attractor: Nicolas Desprez, Chaos Scope http://www.chaoscope.org/

Lorenz had discovered the initial condition sensitivity of weather forecast equations, where, from even very small differences in the starting numbers, profoundly different forecasts emerge, and the differences grow bigger the farther into the future the forecasts run. This characteristic behaviour of solutions to this type of equation was eventually written up and gave rise to new research fields within fluid dynamics – Lorenz systems, chaos theory, non-periodic flow – together with ideas about how this effect could be managed, especially given that we do not have perfect observations (starting conditions) for the whole planet when launching a forecast.

What, then, are ‘ensembles’ and how and why are they used? A good analogy is to consider the meaning of the word in a musical sense – a collection of components (instruments) all contributing together to make a set. In our case, however, the ensemble is a collection of forecast simulations, all identical but each with slightly varying starting conditions. An ensemble approach is needed for forecasting because our real-time weather observations, whilst good, are nowhere near perfect – in terms of geographical coverage and accuracy. To account for this, an ensemble comprises as many runs as is computationally feasible, reflecting our uncertainty in the observations. The end user is provided with a range of possible outcomes that reflects how the forecasts vary across the ensemble members.

Four forecasts – each identical, other than the starting observations are slightly different, accounting for the fact that weather observations are not 100% accurate. They agree well up to a point and then diverge significantly. Each is as likely as the other. Is it better to give users the average of the four forecasts - or all four individual forecasts?

In part, this approach is why you hear phrases such as a ‘10% chance’ of rainfall, or a ‘20% probability’ of a warmer-than-average summer: the numbers reflect how many of the ensemble members give rise to that particular weather condition. Long-range forecasts are especially susceptible to initial condition sensitivity because, as Lorenz showed, the differences due to small differences in the starting conditions become very large the farther in time the simulations are run.
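As a sketch of how such a probability is read off an ensemble (the member values below are invented for illustration):

```python
import numpy as np

# One seasonal-mean temperature anomaly (degC) per ensemble member - invented values.
members = np.array([0.4, 1.1, -0.2, 0.8, 1.5, 0.3, 0.9, -0.1, 1.2, 0.6])

warmer_than_average = members > 0.0
probability = warmer_than_average.mean()   # fraction of members meeting the condition

print(f"{probability:.0%} probability of a warmer-than-average season")
```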

By averaging all ensemble members together and providing the user with ‘mean forecast conditions’ we can give users a general indication of forthcoming weather – up to a point. However, since all ensemble members are equally possible – a particular ensemble member that predicts a very warm month may be just as likely as an ensemble member predicting average conditions – it is valuable, where possible, to provide the user with the full set of ensemble members.

This is especially important for organisations needing to plan for ‘worst’ or ‘best’ case scenarios. It may not be possible to say which forecast is likely, but stress-testing an organisation’s strategy against the worst and best case conditions can identify shortcomings, and ways to improve resilience. In facing adverse weather, this preparedness can prove extremely valuable – especially if the worst-case forecasts transpire.

Version 1.0 of ESD’s Seasonal Forecaster service currently provides water-industry-specific ensemble-average data from five national numerical weather prediction systems. Our intention has been to expand this to provide full sets of ensemble data – each transformed into the area and time frame of interest to our users – and we are pleased to announce that development of this feature has now started. V2.0 will continue to give users bespoke, ensemble-average output for the DWD, NCEP, UKMO and Meteo-France systems, but will also offer full ensemble products for the ECMWF system.

More soon as this feature is launched!  

Craig Wallace, EarthSystemData

More on ensemble modelling:  https://www.ecmwf.int/en/elibrary/10729-ensemble-forecasting

Nice MIT video documenting Lorenz and colleague’s, Jule Charney’s, work:  https://www.youtube.com/watch?v=MTwga6dch2s

EarthSystemData designs and deploys climate data solutions to meet organisational needs. We specialise in climate risk assessment, risk disclosure and adaptation projects assisting multi-national firms through to local clients. To discuss using our research or consultancy services contact us here: info[at]earthsystemdata.com 

 

SCENARIO ANALYSES FOR TCFD


As climate change accelerates and an increasing number of global impacts are emerging, global companies are being encouraged, either by best-practise recommendations or, in some cases, legislative requirements, to disclose their climate-related financial risks within annual reports. The main impetus for this step change has been the Financial Stability Board’s Task Force on Climate-Related Financial Disclosures (The TCFD).

The TCFD recommends eleven disclosures that should be made by all organisations exposed to climate-related impacts. These can be separated into physical climate risks (the direct risks an organisation has, or will likely have, due to changing weather patterns, shifts in climate regimes etc.) and transition risks (e.g. the threat to an organisation’s business viability / continuity due to greening their operations and supply chain). In fact, both elements can be further partitioned into risks and opportunities – for example, adapting and planning for climate risk may well invoke further ‘wins’ in other aspects, not least securing an organisation as a stable prospect for lending and investment.

More scenario analyses are needed.

The annual TCFD status report has shown encouraging uptake by global organisations aligning themselves with TCFD expectations. However, organisations in all global regions are scoring poorly in terms of test-casing their Strategy and Risk Management using scenario-based data. This is important, since stakeholders, investors and lenders want to know how resilient an organisation is to differing levels of future climate change.

TCFD returns using scenario-based test-casing are few. The TCFD has identified this area for improvement. [TCFD Status Report, 2020]

In response to this, ESD is excited to launch a dedicated work-stream, ‘THEMIS’, which will serve partner organisations in accessing and applying the very latest United Nations Intergovernmental Panel on Climate Change climate scenario data. These data are the world-leading suite of climate model output used to form international climate treaty objectives (e.g. Kyoto, 1997; Paris, 2015) – as well as underpinning international academic research into global climate impacts.

Obtaining and processing UN IPCC scenario data can be difficult. ESD will do the technical work for you, leaving you with the TCFD-ready output.

THEMIS will do the technical work for you

The UN IPCC scenarios can be technically difficult to navigate, and accessing and applying the data to inform business impacts requires programming and data handling skills. ESD and their staff have been using UN IPCC data for 15+ years (IPCC CMIP3, CMIP5 and now CMIP6), both for university-affiliated research and for commissions from private and public sector companies to inform project and business decisions needing future climate intelligence.

We are excited to make our computer programming and climate analysis skills available via THEMIS to allow TCFD-returning companies to access official UN climate scenarios, and to produce specific scenario-based content that informs their work and meets or surpasses TCFD standards.

Low, medium, high emissions scenarios.

A good evaluation of organisational risk management efficacy under differing climate scenarios demonstrates security to your stakeholders and investors. We can assist with all elements of TCFD reporting (Governance, Strategy, Risk Management and Metrics), but THEMIS excels in producing material related to physical climate risks (and opportunities). We have the programming and analytical specialism to work with companies of all sizes joining the TCFD reporting community. We can advise on (and perform) bespoke analyses of future scenarios, or provide appraisals of existing research, to produce tailored information for your reporting.

Example final output based on an IPCC high-emissions scenario.

We would love to hear from you and learn about your specific TCFD work. Our specialism and skills are here to help and we would love to engage with organisations of all sizes, anywhere on the globe.

[e] contact us at info@earthsystemdata.com with subject ‘TCFD’

THEMIS HOMEPAGE

DOWNLOAD THE 2020 STATUS REPORT

Rain maker: data generation for climate services


Generating the right data for a client is a crucial part of any climate service work stream and there are many techniques we can call upon. EarthSystemData uses many of these techniques for our clients and we are also a co-supervisor for a PhD student, Sarah Wilson, based at the University of East Anglia. Sarah is exploring mathematical techniques to advance some of these methods and her first publication has been accepted in IJOC. Here we take a quick look at the topic from a climate services perspective.

Effective climate management strategies – those that prepare business or an organisation for future climate regimes – most often need plausible series of data, at the daily timescale, from which to devise adequate coping strategies. One option is to take daily series of weather directly from global numerical models that simulate future climate.

A second option is to blend characteristics of the projected climate with real-world, observed information in order to synthesize a series of daily data that has many of the statistical properties of the real world, but adjusted in some form to reflect the future climate. This second option has, in itself, a number of sub-options as to how exactly the unification of real-world characteristics and the projected climate change can be achieved, and it has been the subject of focussed research to identify the most effective techniques given the end use of the data. This is an important topic within climate services, where the trade-offs between the methods must be carefully assessed and acknowledged when applying the data to quantify your climate risks.

One popular technique in climate science, for the generation of rainfall data in particular, is to adopt a mathematical technique known as a Markov chain, named after the Russian mathematician Andrey Markov. The technique is a powerful tool for representing systems that move between states – wet and dry in this case – and it can be designed by ‘learning’ from a series of real-world observations for the desired location and then run for a future climate to generate a sequence of dry or wet days. Markov chains are popular since they are able to embody a memory of the system you are trying to represent. For example, a chain can be devised to produce a series of data indicating whether each day is dry or wet, whilst accounting for whether the previous 2, 3, or 4 days have been wet or dry themselves. The parameters that control the probability of the day’s state (and how much emphasis to place on each of the preceding days’ states) are derived from the real-world observation series you feed the chain (this is the ‘learning’).

Markov state equations: the probability of a day being wet or dry can be calculated to account for the preceding days' states [EarthSystemData]

Having established a series of wet or dry days, a second mathematical module, not strictly part of the Markov process, will provide rainfall amounts – by randomly choosing a value from a population of possible amounts – and then the whole series is scaled (either inflated or deflated) to match the future climate provided by the numerical climate model (e.g. a monthly total). The bottom-line advantage of this technique is that the day-to-day sequence of rain or no rain adheres to statistical properties of the real world, rather than those of the numerical climate model, which may have implicit differences from the real world, especially when it comes to simulating daily processes.
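The sketch below illustrates that two-part idea with a first-order chain. The transition probabilities, the gamma distribution for wet-day amounts and the target monthly total are all assumptions for the example, not values from Sarah’s paper.

```python
import numpy as np

rng = np.random.default_rng(42)

p_wet_given_dry = 0.25    # P(today wet | yesterday dry), 'learned' from observations (assumed)
p_wet_given_wet = 0.60    # P(today wet | yesterday wet) (assumed)
n_days = 31
target_total_mm = 85.0    # monthly total supplied by the climate model / pattern scaling (assumed)

# Part 1: first-order Markov chain of wet (True) / dry (False) states
wet = np.empty(n_days, dtype=bool)
wet[0] = rng.random() < p_wet_given_dry          # start as if the previous day was dry
for day in range(1, n_days):
    p = p_wet_given_wet if wet[day - 1] else p_wet_given_dry
    wet[day] = rng.random() < p

# Part 2: draw amounts for wet days, then scale the whole series to the target total
amounts = np.zeros(n_days)
amounts[wet] = rng.gamma(shape=0.8, scale=6.0, size=wet.sum())
if amounts.sum() > 0:
    amounts *= target_total_mm / amounts.sum()

print(f"wet days: {wet.sum()}, monthly total: {amounts.sum():.1f} mm")
```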

Until recently, very little attention had been given to how long the ‘memory’ of the Markov chain should be when synthesizing data for different climatic zones. Should the Markov chain know about just the previous 2 days’ conditions to decide whether or not today is wet? Or 3? And which number is best for applications in Seattle (a pretty wet climate) as opposed to Los Angeles (dry)?

In the IJOC paper, Sarah built Markov Chains using over 44 000 rainfall observation series covering the globe to see which memory length (or ‘order’) was optimal for recreating the observed rainfall characteristics at each location. 


Perhaps surprisingly, in most cases a Markov chain with just one day’s memory appears to be optimal for recreating the actual observed sequence of wet or dry days. This is intriguing, although likely a reflection of the time-frame of many weather systems that result in rain.

The exception to this finding occurs in the tropics, where zeroth-order (i.e. no memory whatsoever) and third-order (three days’ memory) chains perform well and, in the case of re-creating the lengths of dry spells, better than first-order chains.

These findings may sound trivial in a sense, since they concern an aspect of climate science that remains hidden to many people away from the front face of research and statistical consultancy. But they are very important inputs for updating existing mathematical tools and for offering further flexibility, depending on the geographical location of the project in question.

 

Craig Wallace, EarthSystemData

Full paper and results: Kemsley, S.W., Osborn, T.J., Dorling, S., Wallace, C. and Parker, J. (2021), Selecting Markov Chain Orders for Generating Daily Precipitation Series Across Different Köppen Climate Regimes. IJOC. Accepted Author Manuscript. https://doi.org/10.1002/joc.7175

PhD Candidate:  Sarah Wilson Kemsley, University of East Anglia, UK

Supervisory Panel:  Professor T J Osborn, Professor S Dorling,  University of East Anglia UK; Dr. Craig Wallace,  EarthSystemData UK; Dr. Jo Parker,  Atkins UK

More on Markov Chains here!  https://en.wikipedia.org/wiki/Markov_chain

EarthSystemData designs and deploys climate data solutions to meet organisational needs. We specialise in climate risk assessment, risk disclosure and adaptation projects assisting multi-national firms through to local clients. To discuss using our research or consultancy services please contact us here: info[at]earthsystemdata.com 

Cool Northern Hemisphere April 2021


Early April has been dominated by notable cold surface temperature anomalies over much of the northern hemisphere – shown in ESD48’s code run on NOAA/NCEP data for 28th March through 6th April [Image 1]. Curiously, these cold anomalies are positioned over sub-Arctic regions, which normally exhibit strong warmer-than-normal conditions – since these areas are heating the most rapidly as Earth’s mean temperature continues to rise. An expanse of cold anomalies, too, occupies Eurasia. The pattern and expanse of these anomalies suggest something larger is afoot for now, rather than, simply, the regional displacement of cold air to another (relatively warmer) region – i.e. the type of circulation response to things like polar vortex displacements.

One primary candidate for this type of signal – at least as a contributing party – is the sister of the widely-known Pacific Ocean El Nino phenomenon, ‘La Nina’. Bluntly put, La Nina is the reverse of the warm sea temperatures associated with El Nino, resulting in massive swathes of cooler-than-normal sea temperatures in the eastern Pacific, large enough to exert an influence on the global mean temperature. Earth is currently in a sustained La Nina phase – this plot [Image 2, November 2020, NOAA OISST] shows the extent, and the episode (just) kept 2020 from being the warmest year globally on record. The current episode began approximately in September 2020. If five consecutive 3-month averages of sea temperature anomalies over the eastern Pacific (specifically an area known as Nino3.4) are below -0.5°C (compared to normal) then we’re officially in a La Nina phase.

We’ve had seven 3-month means well below that criterion to date. An ongoing imprint on global temperatures is therefore well expected. The scale of La Nina, and its capacity to influence the planet’s mean temperature, owes to the ocean dynamics that cause it: enormous pools of deeper, colder ocean water rise to the surface at the South American coast and spread westwards.   – CW

Image 1: Global temperature anomalies relative to 1981-2010 for early April, via ESD48 [raw un-anomalised data NOAA/NCEP PSD]
Image 2: Sea surface temperatures November 2020, relative to expected conditions [via NOAA OISST visualiser]
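As a footnote to the La Nina criterion described in the post above (consecutive overlapping 3-month Nino3.4 means at or below -0.5°C), here is a short sketch of how such a check might be coded. The anomaly series is invented for illustration; the official index values are maintained by NOAA.

```python
import numpy as np

# Twelve months of Nino3.4 sea-surface temperature anomalies (degC) - invented values.
nino34_monthly = np.array([-0.1, -0.3, -0.5, -0.7, -0.9, -1.0,
                           -1.1, -1.0, -0.9, -0.8, -0.6, -0.4])

# Overlapping 3-month means (e.g. JFM, FMA, MAM, ...)
seasonal = np.convolve(nino34_monthly, np.ones(3) / 3, mode="valid")

# Longest run of seasons at or below the -0.5 degC threshold
below = seasonal <= -0.5
run = longest = 0
for flag in below:
    run = run + 1 if flag else 0
    longest = max(longest, run)

print("La Nina criterion met" if longest >= 5 else "criterion not met")
```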