Mitigation versus Adaptation

Back in 1992 most nations signed the UN Framework Convention on Climate Change (UNFCCC), pledging to prevent dangerous climate change. 'Prevent' is a clear word, but in this context it has largely been replaced by the more cumbersome term 'mitigation'. Literally, mitigation means 'softening', or perhaps 'rendering less harmful', and it could be taken to mean simply slowing down the process of climate change.

In 1992 it was generally thought that there was a fairly simple and reversible linear relationship between injections of 'new' greenhouse gases (GHGs) and the climate response. At any point when the process was thought to have gone too far, it would be fairly straightforward to reduce emissions, let the natural buffering processes take over and bring the global temperature back to its 'natural' (Holocene) level.

This simple model reflected the standard environmental thinking of the time: that left alone, Nature can clean itself up. A polluted river, for example, will soon run clean if you stop polluting it; the same goes for city air, food additives and acid rain. Just stop.

Unfortunately the climate turned out to obey different principles. There were two main differences. The first is that many GHGs are highly persistent and do not go away even if you stop emitting them. For as long as you do emit them they accumulate, as we have observed in the famous Keeling Curve. The second difference is that the effects might not be linear: in particular there might be thresholds beyond which new behaviours emerge. Further, there might be 'feedback' effects that could amplify or dampen the response, and long-stored natural greenhouse gases might be released.
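To make the stock-and-flow point concrete, here is a toy sketch in Python. The numbers are arbitrary and chosen purely for illustration; this is not a climate model of any kind. It simply shows why a persistent gas keeps accumulating even when annual emissions merely level off rather than fall.

```python
# Toy stock-and-flow illustration: a persistent gas accumulates even when
# annual emissions stop growing. All numbers are arbitrary, for illustration only.

emissions_per_year = 10.0   # constant annual emission (arbitrary units)
removal_fraction = 0.01     # assumed tiny fraction removed naturally each year
stock = 0.0                 # accumulated stock in the atmosphere

for year in range(1, 101):
    stock += emissions_per_year          # the flow adds to the stock...
    stock -= stock * removal_fraction    # ...and only a small share is removed
    if year % 25 == 0:
        print(f"year {year:3d}: stock = {stock:7.1f}")

# The stock keeps rising for decades even though emissions never increase:
# levelling off the flow does not level off the stock.
```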

In the early years of the new millennium many of these understandings about non-linearity became condensed in the notion of 'abrupt climate change'. It was considered possible that abrupt changes could kick in at any time, but that the probability was related to the level of GHGs and any consequent temperature effects. Such changes might be mild enough to control, or they might lead to a cascade of feedback effects and so-called 'runaway' climate change.

By about 2008 climate scientists and the global political class had reached a pragmatic agreement that non-linear effects were reasonably unlikely if the temperature rise could be kept below 2 degrees C. This became an important threshold, and it defined the most common meaning of 'mitigation': policies that gave a reasonable chance of avoiding it. 'Reasonable chance' was often defined as two-thirds or 67%; others preferred 80%. But if you think about it, such odds against ultimate disaster make you want to rub your eyes and ask, "Are they really saying that?".

OK, so we are back with the meaning of mitigation as 'prevention': preventing the temperature from rising above 2 degrees C, sometimes translated into a GHG concentration of 450 parts per million (ppm), sometimes into an allowable 'emissions budget' of about 1000 billion tonnes (Gt) between 2010 and 2050. It seems to me the key task of our generation is precisely to achieve these physically defined goals. The next generation should perhaps consider a harder-edged meaning of 'mitigation': that of restoring GHG levels to the Holocene average of around 270 ppm.
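For a rough sense of what that budget implies, here is a back-of-envelope sketch. The 1000 Gt figure is simply the round number quoted above, the assumption of a level emissions path is mine, and the 'current rate' used for comparison is a placeholder of my own, not a measured figure.

```python
# Rough carbon-budget arithmetic based on the round numbers quoted above.
budget_gt = 1000.0      # allowable emissions 2010-2050, Gt (figure from the text)
years = 2050 - 2010     # 40-year budget period

average_allowed = budget_gt / years
print(f"Average allowable emissions: {average_allowed:.0f} Gt per year")  # 25 Gt/yr

# If actual emissions run at some higher rate, the budget is exhausted sooner.
# 'current_rate' is a hypothetical illustrative value, not a measured one.
current_rate = 35.0
print(f"Budget exhausted in about {budget_gt / current_rate:.0f} years at that rate")
```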

Meanwhile there continues to be dispute about whether we can already detect the 'signal' of warming within the 'noise' of natural year-to-year variability, longer cycles such as the sunspot cycle, and the kinds of background variation that gave rise to the Mediaeval Warm Period and the so-called 'Little Ice Age'. After all, atmospheric warming is only one of the many effects that could occur as a result of increased GHGs.

To give a physical analogy, heating a pan full of ice on the gas stove will not show up as a temperature rise until all the ice has melted. Energy is being fed into the system, and changes are inevitably happening, but not, in this instance, warming. Going back to the climatic system of the Earth, it has never been entirely clear how much of the extra energy in the system would express itself as a temperature rise. True, temperature is a 'bottom line' in most physical changes and you would certainly expect some change, but how much, and when, remains uncertain, although most atmospheric models predict quite a lot. We can be very confident that later in the 21st century, at present rates of forcing, the temperature is bound to rise a lot, but has it yet?
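The pan-of-ice analogy can be put in rough numbers using textbook constants for water and ice. The sketch below says nothing about the climate system itself; it only illustrates how much energy melting absorbs before any warming shows up.

```python
# Back-of-envelope: energy goes into melting ice long before it shows up
# as a temperature rise. Standard textbook constants for water/ice.
LATENT_HEAT_FUSION = 334_000   # J per kg to melt ice at 0 degrees C
SPECIFIC_HEAT_WATER = 4_186    # J per kg per degree C for liquid water

mass_kg = 1.0
energy_to_melt = mass_kg * LATENT_HEAT_FUSION

# The same energy, applied to already-melted water, would raise its
# temperature by roughly 80 degrees C.
equivalent_warming = energy_to_melt / (mass_kg * SPECIFIC_HEAT_WATER)
print(f"Melting 1 kg of ice absorbs {energy_to_melt / 1000:.0f} kJ")
print(f"That energy would otherwise warm 1 kg of water by about {equivalent_warming:.0f} degrees C")
```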

As we all know, by about 2000 the official climate change community was convinced that the signal was discernible, and it was expressed in the famous 'hockey stick' graph. While I personally find the Hockey Stick entirely plausible, it has not convinced either informed or fundamentalist sceptics, who invariably point to periods of cooling and historical 'noise' as refutations.

It seems to me that the emphasis on the 'signal', rather than the underlying physics, has damaged our rhetorical stance. It would have been far better to use the analogy of earthquakes, which everybody understands are the result of powerful but slow-acting physical processes. Just because an earthquake has not occurred in a susceptible zone for a long time does not mean there is no risk. On the contrary, the longer the period of quiescence, the greater the energy likely to be released when it comes.

Climate change has similar dynamics. If warming is actually occurring, you could say that is good: at least you can see it. If warming is not occurring, that is in some respects more alarming, because the change is more likely to manifest itself in an unstoppable rush at some future date.

I have written the above to introduce the notion that any climatic effects attributed to raised GHG concentrations are contentious. If GHG production continues, and even if it levels off, the warming will definitely come sooner or later, but it is conceivable, as sceptics aver, that what we have observed so far is not due to GHGs.

Within the mainstream climate community, however, there is a firm belief that climate change is already happening, with on the whole deleterious effects. There are numerous 'natural adaptations' at a personal and institutional level, such as improved flood management, drought-resistant crops, sustainable urban drainage, artificial snow at ski resorts, altered insurance schedules and premiums, and so on. But as warming gathers momentum and becomes an unarguable fact of life, will such piecemeal adaptation be enough? Should we be more interventionist and 'strategic'? How much of our resources should be devoted to helping people 'adapt'?

This is a very important question, and it is widely thought that on humanitarian grounds we should start planning for all the implications of a just-below-2-degrees world, whatever they are considered to be. This sounds fairly obvious, but I wish to argue against it.

My reasons can be explained in terms of a medical analogy. Suppose somebody is severely injured in a traffic accident, with heavy bleeding. As a first aider your task is to stop the bleeding; then you have to call an ambulance so that a blood transfusion can be given as quickly as possible. The patient might need an emergency operation, then intensive care. All these things are 'mitigation' measures designed to ensure survival. Meanwhile it is also true that the patient needs to 'adapt' to the new situation: what about work, not showing up for appointments, letting the family know, insurance claims, 'it hurts', 'I'll need a new dress', and so on. These things are not unimportant in themselves, but they are clearly less important than the need for survival. They would not be pursued in any way that would compromise the primary goal.

It seems to me that the same logic should apply to the climate change problem. The crucial task is to maximise the chance of avoiding 2 degrees. Any resources devoted to premature 'adaptation' are, in some sense, wrongly prioritised. That is not to say that all manner of 'adaptations as usual' will not be pursued, just as we presently respond to natural disasters and local difficulties. But it would not be part of the climate mitigation programme to plan for adaptation as such.

Does this sound harsh, even inhuman? I am sure it could well be seen in such a light. But because it is such an unusual standpoint, I would like to put it forward for debate. Of course it is possible that we can identify many measures that serve both to mitigate and to adapt, such as more energy-efficient buildings, greater national self-sufficiency in food, better public transport and so on. We should definitely work on these.

Having said this, there is another sense in which we might understand the term 'adaptation', one that for many people will appear to impose greater hardships than 'merely' adapting to 2 degrees. This concerns adaptation to the mitigation measures themselves. Successful mitigation entails a very strong strategic process, comparable to the mobilisations of wartime. There are many possible transition scenarios, but generally these 'adaptations' include changes of diet, reduced levels of aviation, more intrusive energy technologies, landscape changes and so on. This view is of course anathema to Sceptics, but even the community of True Believers is reluctant to engage with it, generally proposing some extension of the present pattern of preferences. Expansions of the rapid transition view will be found elsewhere on this site.

Thus we have two meanings of 'adaptation'. One is planned, strategic and necessary, responding to the changes required to prevent dangerous climate change. The other is a range of responses to unplanned effects of climate change, effects that might indeed have other causes. Probably some new terminology is required to underline these important distinctions, but that is for some future occasion.