Kevin Anderson: "Scientists are Cajoled into Developing...Politically Palatable Messages" on Climate

Sun in the Alberta tar sands by Kris Krug.

This is a guest post by Gabriel Levy and was originally published on the blog People and Nature. This post is Part 1 of a two-part interview with Kevin Anderson of the Tyndall Centre. You can read Part 2 here.

The reality about the greenhouse gas emissions cuts needed to avoid dangerous global warming is obscured in UK government scenarios, climate scientist Kevin Anderson has said.

The most important measurements, of total carbon dioxide in the atmosphere, are pushed into the background – and scientists are pressured to tailor their arguments to fit “politically palatable” scenarios – Anderson told a Campaign Against Climate Change conference in London on 8 June.

The government’s scenarios assume that rich countries such as the UK will reduce emissions by some distant – and effectively meaningless – future dates, Anderson explained to more than 200 trade unionists and environmental activists at the conference.

Between conference sessions, Anderson, deputy director of the Tyndall Centre, the UK’s leading climate change research organisation, and professor of energy and climate change at the University of Manchester, gave this interview to People & Nature.

Gabriel Levy (for People & Nature): Could you comment on recently published research showing that global average temperature has risen more slowly in the 2000s than in the 1990s? [1] The usual crowd of climate science deniers are using this as a new and spurious reason to deny the need to do anything about global warming.

Kevin Anderson: Over a relatively short period of time – a decade – the temperatures have not gone up as much as some estimates suggested they might. The first point to bear in mind is that climate change is not about one decade. It’s about longer periods of time. You cannot say that any one year, or two years, or even ten years, is much of a signal that climate change is occurring or is not occurring. You need to look at longer time frames, and the longer-term trends have not changed.

It is interesting, though, that in the last ten years, as carbon dioxide (CO2) emissions have been going up, we have not seen the acceleration in the rate of increase in temperatures that some anticipated. There could be several reasons for this. One possibility is that, as the world gradually warms up, a great deal of the thermal energy is trapped in oceans. There is a thermal lag in the system that the oceans provide: that may or may not explain why the temperatures have not gone up as much as some thought they would.

But bear in mind the context: the temperature has gone up, and continues to go up. If you look at the Met Office plots, the warmest 15 years on record have all occurred since 1990. We had an outlier in 1998 – and we will always have occasions when such extreme weather events occur, that may or may not be related to climate change. (Update, 4 July. New information from the World Meteorological Organisation reported here.)

What concerns me is the conduct of public debate. The slower rise of average temperature has resulted in scientists modifying slightly their estimates of climate sensitivity – that is, their estimates of what temperature rise is most likely if the amount of CO2 in the atmosphere is doubled. The change has not actually been that dramatic – and, of course, if there were further reasons for the range of estimates to come down still more, that would be welcome news.

But while there has been a great deal of discussion of this, where has the discussion been about total emissions of greenhouse gases, which is the really important indicator? During the 2000s, emissions have been much higher than anyone anticipated. Since the global economic downturn of 2008-09 – an event that you might have thought would severely constrain emissions growth – global emissions have continued to rise at an unprecedentedly rapid rate. They rose by 6% in 2010 and by 3% in 2011, and preliminary information indicates something similar for 2012.

So while the climate sensitivity has dropped a little, the emissions are going up at much faster rates. If you think about that from the point of view of overall temperature projections, of overall climate change, the higher-than-anticipated rise in emissions more than counterbalances the reduction in climate sensitivity.

Illustrating this failure to consider emissions adequately, Ed Davey, the UK energy secretary, recently welcomed China’s statement that its emissions would peak by 2025 and that it would cut the carbon intensity of its economy by 40% by 2020. Yet the UK’s own policies are premised on China peaking its emissions in 2017 or 2018, not 2025. How do we deal with that gap?

What we’re doing repeatedly is fudging the numbers to fit within acceptable norms – our analysis must not raise fundamental and uncomfortable questions. The level of emissions reductions necessary to avoid “dangerous climate change” is much, much more challenging than anyone – including many climate scientists – is yet prepared to countenance.

The emissions story has been the Cinderella of climate change debates for the last 20 years…with the unprecedented and ongoing rise in emissions completely offsetting any small change in climate sensitivity.

GL: In the paper Beyond Dangerous Climate Change, you and Alice Bows, your co-author, stress that global emissions peak dates (that is, the dates set by politicians at which they aim for emissions to reach their peak level) and longer-term reduction targets (that is, the targets set by politicians for reducing emissions) obscure reality, and that the real focus should be on cumulative emissions budgets (i.e. the total amount of CO2 in the atmosphere). Could you explain that for non-scientists? 

Kevin Anderson

KA: The climate change story has long been told in terms of “we must have large reductions in emissions by some abstract point in the future” – for example, an 80% reduction by 2050. The message conveyed is that, many years from now, we must have got our carbon emissions down by some arbitrary amount. But when you consider the science of climate change and how this links to the global rise in temperature, it is not what happens in 2050 that matters, but the total quantity of CO2 in the atmosphere. That is the carbon budget. We know how much CO2 we can put in the atmosphere for a given temperature – there or thereabouts…there is some scientific uncertainty, but we have a good handle on what that range looks like. This carbon budget approach is scientifically legitimate, in stark contrast to 2050 emission reductions.

Once you frame climate change issues in terms of the carbon budget, it transfers our focus away from 2050 and towards what we need to do by 2015, 2020 and 2025. For the wealthier nations, reductions after 2030 are much less important in terms of our international commitments on climate change.
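The carbon-budget point can be shown with simple arithmetic. In the illustrative sketch below (all figures are hypothetical round numbers, not Anderson's actual data), two pathways both deliver an "80% reduction by 2050", yet because one peaks late, it puts far more cumulative CO2 into the atmosphere – and it is the cumulative total, not the 2050 endpoint, that sets the temperature outcome.

```python
# Illustrative sketch (hypothetical round numbers): two emission pathways
# can both hit "80% below today by 2050" yet emit very different cumulative
# totals of CO2 - and cumulative emissions are what the carbon budget counts.

def pathway(peak_year, start=2013, end=2050, rate=35.0, growth=0.03):
    """Annual emissions (GtCO2) grow at `growth` until peak_year, then fall
    geometrically to 20% of the starting rate by 2050 (an '80% reduction')."""
    series = []
    r = rate
    for _ in range(start, peak_year + 1):
        series.append(r)
        r *= 1 + growth
    years_left = end - peak_year
    # Pick the constant decline rate that lands on 0.2 * rate in 2050.
    decline = 1 - (0.2 * rate / r) ** (1 / years_left)
    for _ in range(years_left):
        r *= 1 - decline
        series.append(r)
    return series

early, late = pathway(2016), pathway(2025)
print(f"both pathways end near {early[-1]:.1f} GtCO2 in 2050")
print(f"cumulative: early peak {sum(early):.0f} GtCO2, late peak {sum(late):.0f} GtCO2")
```

The 2050 endpoint is identical in both runs; only the area under the curve differs, which is why a 2050 target alone says little about the climate outcome.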

When we present our emission scenarios to the scientific community – with their much greater focus on the shorter-term – we receive no real disagreement with our principal conclusions. The difference between our analysis and that of many others stems from their expedient choice of assumptions – assumptions that enable them to deliver politically palatable outcomes.

GL: Please explain scenarios, for those not familiar with them.

KA: Scenarios illustrate alternative pathways of greenhouse gas emissions out into the future – with our scenarios relating to a particular carbon budget and hence to a particular chance of avoiding the 2°C level, which is characterised as a measure of dangerous climate change. In other words, we look at how to constrain emissions to levels that mean the temperature will not rise by more than 2°C. Our scenarios focus on energy and its associated emissions of greenhouse gases (mostly CO2), and include “what if” illustrations of economic activity, energy demand, energy supply technologies and fuels, how emissions of CO2 are growing, and when they might plausibly reach a global peak. It is scenarios such as these that are used by governments to help determine what low-carbon policies to consider, develop and potentially implement.

A tar sands refinery in Alberta, Canada. Photo by Kris Krug.

GL: But you have a problem with the scenarios…

KA: Yes. Many, if not most, of the scenarios proposed are completely unrealistic in assuming almost immediate changes to current emissions trends. Moreover, they typically neglect what is happening in China and India. They typically neglect how poorer parts of the world need much more energy if they are to develop and improve their welfare. Will they develop with wind turbines, nuclear power and the range of other low-carbon options, or will they develop with fossil fuels? Well, as it stands, their governments are being heavily lobbied by conventional fossil fuel companies, and some of the countries have their own fossil fuel resources. In the short term they are developing, and will continue to develop, fossil fuel energy systems – and our scenarios must factor this in.

Even our own infrastructure continues to be built around a fossil fuel base. Certainly, this is not going to change radically in the next few years – and probably, even if pushed hard, not for another five or ten years.

These global emissions stories have been significantly underplayed in almost all low-carbon scenarios – and as such they have only served to repeatedly reinforce the view that a decarbonised future is just a challenging evolutionary transition rather than a revolution in our use and supply of energy.

GL: At today’s conference, you said that in the scenarios used by government, assumptions about the level of emission reductions compatible with economic growth are dictated by economists. The scientists then have to come up with emissions scenarios to fit those. How does this happen?

KA: The scientists are forced to operate within a set of constraints that are unreasonable to start off with. Firstly, we have to deliver within the framework – or rather, it is very hard for us to question the framework – of a 2°C rise in global temperature. When we do our analysis, we are expected to make sure that our emissions budgets do not rule out the feasibility of staying below a 2°C temperature rise – and that this must also be viable within the paradigm of ongoing economic growth. There are a number of ways we can do that.

According to the Pembina Institute, “Average greenhouse gas emissions for oilsands extraction and upgrading are estimated to be 3.2 to 4.5 times as intensive per barrel as for conventional crude oil produced in Canada or the United States.” Photo by Kris Krug.

We can play with the acceptable probabilities of meeting 2°C and with the choice of models. All this gives us bigger or smaller emissions budgets. But even larger 2°C budgets may not offer sufficient flexibility to deliver politically palatable outcomes – at least not with reasonable practical constraints.

So next we loosen what is practical or feasible, and we begin to adjust the time when emissions will peak. The earlier that emissions of CO2 reach a peak, i.e. their maximum level, the less steep will be the reductions curve. For example, several analysts, publishing in 2011, had peak emissions occurring in the past, around 2005! This despite everyone being aware that emissions were continuing to rise.

What is most disturbing is that such abstract analysis often goes alongside policy recommendations – and, more disturbing still, few policymakers are familiar with the details of the analysis informing their judgements.

Today, virtually all low-carbon scenarios aimed at 2°C assume a peak of global emissions in the period 2010-2016. The UK Committee on Climate Change, an independent, statutory committee set up to advise government and parliament, assumes a 2016 peak in global emissions. The Stern Review, a key report to the UK government on the economic consequences of climate change published in 2006, assumed a 2015 peak. However, once you extend the peak out to 2020 or 2030, the proposed mitigation measures in such reports simply cannot achieve the necessary emissions budgets.
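Why the peak date matters so much can be sketched numerically. In the toy model below (the budget of 1000 GtCO2 and all other figures are hypothetical round numbers, chosen only for illustration), every year of delay before the peak eats into a fixed remaining budget, so the required post-peak annual cut steepens sharply.

```python
# Illustrative sketch (hypothetical numbers): for a fixed remaining carbon
# budget, the later emissions peak, the steeper the constant annual cut
# needed afterwards. We solve for that cut by bisection.

def total_emissions(peak_year, decline, start=2013, rate=35.0,
                    growth=0.03, horizon=2100):
    """Cumulative emissions (GtCO2): grow at `growth` to peak_year,
    then fall at `decline` per year out to `horizon`."""
    total = 0.0
    for year in range(start, horizon + 1):
        total += rate
        rate *= (1 + growth) if year < peak_year else (1 - decline)
    return total

def required_decline(peak_year, budget, lo=0.0, hi=0.99):
    """Bisection: total_emissions falls monotonically as decline rises."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if total_emissions(peak_year, mid) > budget:
            lo = mid
        else:
            hi = mid
    return hi

budget = 1000.0  # hypothetical remaining budget, GtCO2
for peak in (2016, 2020, 2025):
    print(f"peak {peak}: ~{required_decline(peak, budget):.1%} cut per year needed")
```

Even with these made-up numbers, pushing the peak from the mid-2010s to the mid-2020s roughly doubles the required annual reduction rate – the same kind of gap Anderson describes between official scenarios and his own.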

The next ruse is to massage the rate at which emissions will grow, out to the peak date. We know that emissions are growing at around 2-4% per year – and probably nearer to 3-4%, depending on what is happening economically around the globe – but few analyses factor in such growth rates. The reality is that carbon emissions are rising steeply and hence the remaining carbon budget for 2°C is being rapidly consumed.

Tar sands infrastructure near Fort McMurray, Alberta. Photo by Kris Krug.

To sum up: those commissioned to produce these scenarios are essentially obliged to use a reduction rate in emissions (from the emissions peak) that is dictated by what economists assert is viable with economic growth. Consequently, scientists are being cajoled into developing increasingly bizarre sets of scenarios … that are able to deliver politically palatable messages. Such scenarios underplay the current emissions growth rate, assume ludicrously early peaks in emissions and translate commitments “to stay below 2°C” into a 60 to 70% chance of exceeding 2°C.

Moreover, when even these scenarios fail to deliver, Dr. Strangelove – in the guise of geo-engineering – is called upon. Such technologies may be found to work, perhaps even at reasonable scale. So they may one day be used. But, given the levels of uncertainty, their ubiquitous presence in 2°C scenarios only adds to my concern that orthodox economics and political cowardice are unduly influencing science.

To some extent, the cat has been let out of the bag. Increasingly, established organisations are joining the voices of those previously dismissed as alarmists and noting how the optimistic ramblings of many analysts look increasingly ridiculous. The International Energy Agency (IEA), PricewaterhouseCoopers, and a range of others are saying explicitly that emission trends are heading in completely the wrong direction, and that we need something much more radical to avoid 2°C.

However, while the scale of the problem is being grudgingly acknowledged, few are yet prepared to challenge the dominance of financial instruments and the wholly inadequate suite of mitigation proposals – let alone the more thorny issues of economic growth, equity and absolute versus relative emission reductions.

So in 2013 we are left with an increasing recognition of the radical nature of the problem – but a willingness only to consider piecemeal incrementalism as the solution. Anyone daring to highlight the disjuncture continues to be marginalised.

GL: How do the UK government’s scenarios compare with the scenarios that you and your colleagues have published?

KA: The first difference is that the government assumes a volume of total emissions associated with a 63% chance of exceeding a 2°C rise in global temperature. This is flagrantly at odds with the UK’s international commitment to “stay below 2°C”. In our view, it is not reasonable to expect the poor people of the world, living in lower-lying areas in the southern hemisphere, to put up with the sea-level rise, vulnerability to storms, and the plethora of other impacts on agriculture, migration, etc. It is not reasonable to expect the 30 million people – equivalent to half the UK population – living within 1 metre of sea level on the coastal strip of Bangladesh to deal with repercussions of our two-faced attitude towards climate change.

In our analysis, we allow only a 37% chance of going past the 2°C temperature rise. We don’t think it is possible to do much better than that now. It’s too late. We are in 2013, and we have pumped around 400 billion tonnes of carbon dioxide into the atmosphere since 2000. However, if others disagree and can demonstrate the viability of still better chances of 2°C then we would certainly be keen to revisit our analysis.

The government’s analysis is premised on global emissions reaching a peak in 2016 and implies a peak of around 2018 for the poorer nations. In our analysis we adopt 2020 as an extremely challenging but still achievable global peak date. And for the poorer nations we allow a longer period out to 2025. That seems fairer to us. Moreover, we don’t think that deforestation, which may account for 12-20% of the additional carbon in the atmosphere, is the responsibility solely of those nations deforesting now. We have virtually deforested the UK and have reaped the “benefits” from the land cleared for agriculture, etc.

The dotted line = emissions continuing to increase at the present rate for several more years. The solid line = concerted action is taken in the short term. From a presentation by Alice Bows.

The consequence of all this is that there is a significant difference between the mitigation rate – that is, the rate at which emissions will be reduced after the peak year – in our scenarios compared with those used by government. The government scenarios typically assume a 3-4% mitigation rate; we estimate a 10% reduction per year. In other words, we would need a 10% reduction in emissions, every single year, to make our fair contribution to an outside chance of limiting the temperature rise to 2°C.
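The gap between those two mitigation rates compounds dramatically over time, as a couple of lines of arithmetic show (the 20-year horizon here is an illustrative choice, not from the interview):

```python
# Compounding annual cuts: the 3-4%/yr typically assumed in government
# scenarios versus the 10%/yr Anderson and Bows estimate is needed.
for rate in (0.035, 0.10):
    remaining = (1 - rate) ** 20  # fraction of peak emissions left after 20 years
    print(f"{rate:.1%} per year for 20 years leaves {remaining:.0%} of peak emissions")
```

A 3.5% annual cut still leaves roughly half of peak emissions after two decades; a 10% annual cut leaves around an eighth – which is why the two framings imply such different energy systems.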

[1] A group of climate scientists led by Alexander Otto published a letter in Nature Geoscience reporting that they had recalculated how much the global average temperature will rise in the year that carbon dioxide concentrations reach twice their pre-industrial value, taking into account the levelling-off of temperatures in the last decade. Their best estimate is 1.3 degrees hotter than now, compared to 1.6 degrees stated by previous research. A New Scientist article reported the research, with comments from scientists stressing that the big picture of global warming remains unchanged. The article by Otto and colleagues, “Energy budget constraints on climate response”, is available via a paid-for Nature site here; a pdf seems to have ended up here.