
The science of prediction

Dave Frame
22 May 2005

One of the major problems climate scientists face is communicating their knowledge and their ignorance to an interested wider community. If you attend any large meteorological conference you’ll probably come away with a rather schizophrenic view of climate models: to some, they are perfectly adequate tools for examining likely global change; to others, they are inadequate tools that can’t even predict El Niño. Both views, inconveniently, seem equally fair.

It’s true that the relatively coarse resolution of today’s models hampers our ability to predict El Niño (the major source of inter-annual variability), and it’s also true that climate models (even quite simple ones) seem to do a reasonable job of predicting the global response to changes in natural and human-induced climate forcing.


This sounds a little weird, and is the source of much confusion in the public debate about climate change. As climate modellers, we’re often asked why we can make confident-sounding predictions about 2100’s temperature when we can’t say whether this year will bring an El Niño event or a wet winter driven by the North Atlantic Oscillation (NAO).

The answer has to do with just where physics places (observable, measurable, useful) constraints on the earth’s climate system. It turns out that the earth’s energy budget has so far provided quite a strong constraint on global mean temperature on long timescales (and the available evidence suggests that it will continue to do so in the future), while El Niño is subject to much looser or harder-to-measure-and-model constraints (things to do with tropical convection, for instance).
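To make the energy-budget point concrete, here is the textbook zero-dimensional energy-balance relation (a standard illustration, not something spelled out in this article): the global mean temperature anomaly T responds to a radiative forcing F according to

$$ C\,\frac{dT}{dt} = F(t) - \lambda\,T(t), $$

where C is the effective heat capacity of the climate system (dominated by the ocean) and λ is the net feedback parameter. Because the forcing and the heat uptake can be estimated from observations, this relation usefully pins down the long-run behaviour of global mean temperature; no comparably simple and measurable relation governs the tropical ocean-atmosphere dynamics behind El Niño.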

Where we can find and apply such constraints within an appropriate model, we can often predict things reasonably well. Such is the case for global mean temperature in 2090-2100, but not for rainfall in Birmingham in 2007. One implication is that although, as David King says, the fundamentals of the science behind climate change are long established, based on simple physics, and fit very well with the observational record, this does not necessarily exhaust the issue. Even though there is quite a strong consensus surrounding expected 21st-century warming, there may be processes we have yet to identify that will ameliorate or exacerbate that warming.

Benny Peiser suggests that the possibility that there are as yet unidentified negative feedbacks which will lead to lower than expected warming creates a space for climate sceptics. This seems a reasonable position. To be consistent, though, he ought also to ask for a similar space for scientists who think that whatever processes we are missing will amplify the expected warming, rather than mitigate it.

Both positions are equally justified in the interests of free intellectual enquiry. However, both are essentially speculative: they both relate to the putative effects of as yet unidentified or unconfirmed processes. It’s true that scientists shouldn’t scorn or deride new ideas just because they depart from the current consensus. But neither should scientists consider new and unproven speculations as having equal status with a theoretically coherent, empirically verified body of knowledge.

That makes it sound much tidier than it is in real life. Real life is not so conveniently categorised. Our beliefs about the non-linear, chaotic, multi-scale climate system tend not to fall easily into boxes labelled “justified” and “speculative”. Instead, beliefs about climate processes and their effects fall along a spectrum where they may be more or less justified by reference to the available evidence and theory. This is important, and it argues for a shift in the way climate modellers work: a “probabilistic turn” in which we seek to take uncertainty and degrees of belief seriously, where we can.

A model for the century

This shift is quietly underway among the climate research community, as is evident from the increasing prominence given to probabilistic climate forecasting in conference agendas over the last five or six years. This may sound a little irrelevant: I appreciate that worrying about the ontological status of claims about the climate system may seem like an academic’s typically obscure, unhelpful response to a serious global threat – but in fact thinking carefully about the uncertainties surrounding our understanding of climate has some powerful real-world implications.

In particular, it can help us clarify which questions are important. Climate sensitivity – the equilibrium warming in response to a doubling of CO2 – is a centrepiece of global climate policy. It happens to be impossible to measure directly, and very difficult to infer from observations. This uncertainty allows partisans on either side of the debate to make spurious and dramatic claims about very high or very low equilibrium warming, while still remaining within (say) a 5-95% confidence interval.

When analysed within a (Bayesian) probabilistic framework, it turns out that the question is quite poorly posed. We can ask a related, much more policy-relevant question if we focus on the transient climate response: the actual temperature increase we expect by 2100. As well as being a more practical way of thinking about the issue, it also happens to be better constrained observationally.
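To illustrate why the equilibrium question is poorly posed while the transient one is not, here is a minimal Monte Carlo sketch (my illustration, with assumed round-number inputs; none of these figures come from the article). In the simple energy-balance picture above, equilibrium sensitivity is F2x/λ and the transient response is roughly F2x/(λ + κ), where κ is the ocean heat uptake efficiency; a symmetric uncertainty in λ then produces a skewed, fat-tailed distribution for the equilibrium answer but a much tighter one for the transient answer:

```python
# A minimal Monte Carlo sketch (illustration only; all numbers are assumed
# round values, not taken from the article) of why equilibrium sensitivity
# has a fat upper tail while the transient response does not.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

F2x = 3.7                      # W/m^2: assumed forcing from doubled CO2
lam = rng.normal(1.3, 0.4, n)  # W/m^2/K: net climate feedback (assumed)
kap = rng.normal(0.7, 0.2, n)  # W/m^2/K: ocean heat uptake efficiency (assumed)

# Keep physically plausible draws (positive feedback and positive uptake).
ok = (lam > 0.1) & (kap > 0.0)
lam, kap = lam[ok], kap[ok]

ecs = F2x / lam                # equilibrium climate sensitivity, K
tcr = F2x / (lam + kap)        # transient climate response, K

for name, x in [("ECS", ecs), ("TCR", tcr)]:
    lo, hi = np.percentile(x, [5, 95])
    print(f"{name}: 5-95% range {lo:.1f} to {hi:.1f} K")
```

The fat upper tail of the equilibrium distribution – an artefact of dividing by an uncertain feedback that may be small – is exactly the space in which extreme claims can live while remaining technically “within the range”.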

In fact, there is a considerable consensus around the transient response, one that even the most ardent climate sceptics accept. That consensus says that global temperatures will warm by somewhere between 1.8-4.6 degrees Centigrade over the next century, under a scenario in which CO2 concentrations grow by 1% per year over the period. The development of probabilistic approaches to climate forecasting – with the extra information such forecasts contain – and (we hope!) the emergence of such consensus may well combine to move the discourse around climate change forward: from the rather polarised debate we have seen to date, to a more nuanced conversation in which there is the space for marginal voices desired by Benny Peiser, but enough mass in the mainstream to satisfy David King.
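As a point of arithmetic (mine, not the article’s): at 1% compound growth per year, concentrations double in

$$ \frac{\ln 2}{\ln 1.01} \approx 70 \text{ years}, $$

so this idealised scenario delivers roughly a doubling of CO2 within the century, which is what ties the quoted warming range to the transient climate response.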

This article appears as part of openDemocracy’s online debate on the politics of climate change. The debate was developed in partnership with the British Council as part of their ZeroCarbonCity initiative – a two-year global campaign to raise awareness and stimulate debate around the challenges of climate change.
