
Modelling climate change: known unknowns

About the author
David Stainforth is chief scientist at climateprediction.net and is at the department of atmospheric physics at Oxford University.

When planning how to mitigate and adapt to climate change, it is important to understand as much as possible about the full range of ways in which the Earth’s climate may change. There are three fundamental sources of uncertainty in climate forecasts:

  • natural variability. The climate system is chaotic, which means that small changes in one location at one point in time can lead to large differences at other locations at some future point in time. This is the familiar “butterfly effect” whereby a butterfly flapping its wings in Indonesia is said to be able to affect whether a hurricane might hit Florida at some point in the future.
  • changing boundary conditions. The climate is affected by many factors which are considered to be separate from, or outside, the climate system. These include natural factors such as volcanic eruptions and solar output, and anthropogenic factors such as the emission of greenhouse gases.
  • limits to scientific understanding of how the climate behaves and how it responds to changing boundary conditions such as a rapid increase in atmospheric concentrations of greenhouse gases.

All three can be studied using sets of climate-model simulations known as ensembles. An ensemble consists of many simulations, each one slightly different from the rest.

Natural variability is studied using “initial condition” ensembles in which the differences are in the distribution of temperature, wind, humidity and other factors at the beginning of the simulation.
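
The idea can be sketched in a few lines of code. The toy model below is not a climate model (it is a logistic map, chosen only because it is chaotic), but it illustrates how an initial-condition ensemble works: the same model is run many times from starting states that differ imperceptibly, and chaos amplifies those tiny differences into a wide spread of outcomes.

```python
import numpy as np

# Illustrative only: the logistic map is a simple chaotic system standing
# in for a climate model, which would really evolve three-dimensional
# fields of temperature, wind and humidity rather than a single number.
def toy_model(x0, n_steps=100, r=3.9):
    x = x0
    for _ in range(n_steps):
        x = r * x * (1.0 - x)  # one "time step" of the toy model
    return x

# An initial-condition ensemble: the identical model, run from starting
# states that differ only around the eighth decimal place.
rng = np.random.default_rng(0)
perturbations = rng.normal(scale=1e-8, size=20)
ensemble = [toy_model(0.4 + p) for p in perturbations]

print(f"spread after 100 steps: {max(ensemble) - min(ensemble):.3f}")
```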

Boundary condition uncertainty is studied using ensembles with different scenarios for outside factors. Greenhouse gas emissions from human activities are likely to have a much larger effect on the climate during the 21st century than any natural factors, short of very low likelihood events such as a collision with a large meteor. Consequently scenarios of greenhouse gas emissions are most important for this source of uncertainty.

The third source of uncertainty, “model uncertainty”, can be studied using ensembles in which each simulation is produced by a different model. Unfortunately only a handful of climate models have been developed around the world so it is unlikely that they span the whole range of plausible behaviour. This source of uncertainty has therefore been little studied to date but is an area in which research is rapidly expanding.

In recent years it has been suggested that this problem could be addressed using an approach in which many “model versions” are created from one model by changing the way it represents certain aspects of climate physics. This is known as a “perturbed-physics” ensemble.
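
A perturbed-physics ensemble can be sketched as follows. The parameter names and values here are purely illustrative stand-ins, not the actual parameters perturbed by climateprediction.net; the point is simply that each combination of settings defines a distinct model version.

```python
import itertools

# Uncertain physics parameters and candidate settings (illustrative
# names and values only).
parameter_settings = {
    "entrainment_coefficient": [0.6, 3.0, 9.0],
    "ice_fall_speed": [0.5, 1.0, 2.0],
    "cloud_water_threshold": [1e-4, 2e-4],
}

# Every combination of settings defines one "model version".
model_versions = [
    dict(zip(parameter_settings, combo))
    for combo in itertools.product(*parameter_settings.values())
]

print(f"{len(model_versions)} model versions")  # 3 * 3 * 2 = 18
```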

A final problem is that these three sources of uncertainty interact. For instance natural variability may be different with different boundary conditions or in different models. It is therefore necessary to study them together in one large “ensemble of ensembles”, known as a “grand ensemble”.
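
Put together, the three kinds of ensemble form a cross-product, which is what makes a grand ensemble so large. A minimal sketch, with purely illustrative counts (the real experiment used hundreds of model versions and thousands of simulations):

```python
import itertools

# Illustrative ensemble sizes only.
model_versions = [f"version_{i}" for i in range(3)]
initial_states = [f"ic_{j}" for j in range(4)]
scenarios = ["control", "double_co2"]

# The grand ensemble is the cross-product of the three ensembles: every
# model version runs from every initial state under every scenario.
grand_ensemble = list(itertools.product(model_versions, initial_states, scenarios))

print(f"{len(grand_ensemble)} simulations")  # 3 * 4 * 2 = 24
```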

Climateprediction.net

The climateprediction.net project has done just this using a version of the UK Meteorological Office Unified Model (see figure 1).

The experiment carried out by climateprediction.net was designed to explore the way our climate might respond to doubling levels of CO2 concentrations in the atmosphere (see figure 1b).


Figure 1: Schematic of the experimental design

A grand ensemble is an ensemble of ensembles designed to explore uncertainty resulting from model construction, initial conditions and forcing. (a) The standard model has its parameters perturbed to create a large perturbed-physics ensemble, and for each member of this ensemble an initial-condition ensemble is created, producing a grand ensemble of simulations. (b) For each member of the grand ensemble, 45 years of simulation are undertaken, including 15 years exploring the response to doubling the concentration of CO2 in the atmospheric component of the model.

Until recently, complex climate models of this type have only been run on supercomputers. This is because vast processing power is needed to simulate changing temperatures, humidity, winds, rainfall and other factors at thousands of points around the Earth, and for each of these at tens of levels in the atmosphere. One cycle of equation-solving, known as a “time step”, advances the simulation by only about thirty minutes, so the process must be repeated hundreds of thousands of times to produce a simulation spanning decades or centuries.
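
The arithmetic behind that “hundreds of thousands” is easy to check. With a thirty-minute time step, the 45-year simulations described in figure 1 each require roughly:

```python
# Number of time steps for one 45-year simulation, assuming a
# thirty-minute time step (leap years ignored for simplicity).
steps_per_day = (24 * 60) // 30        # 48 steps per model day
steps_per_year = steps_per_day * 365   # 17,520 steps per model year
total_steps = steps_per_year * 45

print(f"{total_steps:,} time steps")   # 788,400
```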

The climateprediction.net grand ensemble needed to consist of thousands, or tens of thousands, of these simulations. This was not feasible using supercomputers, so climateprediction.net developed a distributed-computing approach whereby members of the public donate the spare computing capacity of their PCs to run one or more of the simulations.

The interest and commitment have been staggering. Over 100,000 people from 150 countries have taken part, and more than 70,000 simulations have been completed. To put this in context, the largest ensemble in the literature using a complex climate model consisted of 53 simulations at the beginning of 2005.

The first climateprediction.net analysis, published in January 2005, presented results from 2,578 simulations: a small subset of the total, representing an initial investigation of six parameters.

An important and often cited measure of how substantially the climate might respond to changing levels of greenhouse gases is “climate sensitivity”, defined as the equilibrium global warming in response to a doubling of atmospheric carbon dioxide (CO2). The initial conditions seem to have relatively little effect on this quantity, so the climate sensitivity for each model version was calculated as the average over the initial-condition ensemble for that model version. This gave us the climate sensitivity for 414 model versions. The distribution of these is shown in figure 2; they range from less than 2 degrees Centigrade to more than 11 degrees.
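
A sketch of that averaging step, using made-up numbers: each model version’s climate sensitivity is taken to be the mean over its initial-condition ensemble.

```python
import numpy as np

# Sensitivity diagnosed from each initial-condition ensemble member,
# keyed by model version (illustrative values, not project data).
member_sensitivities = {
    "version_A": [3.4, 3.5, 3.3],
    "version_B": [6.8, 7.1, 6.9],
}

# Initial conditions have little effect on this quantity, so each model
# version is summarised by the mean over its IC ensemble.
version_sensitivity = {
    version: float(np.mean(members))
    for version, members in member_sensitivities.items()
}

print(version_sensitivity)  # {'version_A': 3.4, 'version_B': 6.93...}
```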


Figure 2: Distribution of climate sensitivities from the perturbed-physics ensemble.

This is an important new result for several reasons. A number of recent studies (e.g. Forest et al., Science, 2002) have shown that we should not neglect the possibility that the real climate’s sensitivity could be high (greater than 5 degrees Centigrade, for instance). However, this is the first study to find versions of complex models which exhibit such behaviour. Their existence lends further credibility to such high sensitivities and provides a means of studying both the mechanisms leading to them and the impacts which could result. The range of model versions will also enable studies of regional and seasonal uncertainty.

It is worth noting that none of the model versions has a sensitivity much below 2 degrees Centigrade, adding to the evidence that the real climate’s sensitivity is unlikely to be very low. The implication is that even if the most optimistic of these possibilities were correct, we would still face the prospect of very substantial climate change during the next hundred years.

The ranges in predictions of temperature rise and other changes are wide. This does not imply a lack of confidence in the results; it simply means that our current understanding cannot restrict them further. Even wide ranges can provide useful information on the minimum or maximum expected changes, and thereby act as powerful drivers to inform and support policy planning.

This article appears as part of openDemocracy’s online debate on the politics of climate change. The debate was developed in partnership with the British Council as part of their ZeroCarbonCity initiative – a two-year global campaign to raise awareness and stimulate debate around the challenges of climate change.

