Tag Archives: climate sensitivity

Marching Against a Tail (and All Those Who Don’t Get Risk)

This Saturday 7th March, I will be joining the Time to Act Climate March, which seeks greater government action to prevent climate change. Why? Because of RISK! To remind everyone: something can be risky even if it has a low probability, as long as the impact is high. Imagine a game of Russian roulette. You put a revolver against your head with one loaded chamber–and then you pull the trigger. Feeling lucky? I now offer you $1 million to play the game once. Remember, an ordinary revolver usually has six chambers, one of which is loaded. Do you take the bet? What about a pistol with 100 chambers? A thousand? With 1,000 chambers, the probability that you blow your brains out is 0.1%. Is that a good bet?

In statistics, that 0.1% likelihood outcome is firmly in the ‘tail’ of the probability distribution. Some distributions cluster tightly around a central estimate and have insignificant tails; others have long or fat tails. This matters because the tail is generally where the bad stuff happens. In my example above, something really horrible happens in the tail: death. As impacts go, that is pretty bad. So despite the low probability of an adverse outcome and the $1 million potential pay-off, putting a 1,000-chamber pistol against your head with only one bullet is still a very risky bet.
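
To see why, here is a minimal back-of-the-envelope sketch in Python. The utility assigned to death is an arbitrary illustrative assumption; the point is only that a huge negative tail outcome swamps a modest, highly probable gain:

```python
# Toy expected-value versus expected-utility comparison for the
# 1,000-chamber version of the bet. The utility of death is an
# arbitrary, illustrative assumption.
p_death = 1 / 1000           # one loaded chamber in 1,000
payoff = 1_000_000           # $1 million if you survive
utility_of_death = -1e9      # assumed: catastrophic, effectively unbounded

expected_money = (1 - p_death) * payoff
expected_utility = (1 - p_death) * payoff + p_death * utility_of_death

print(f"Expected money, ignoring the tail: ${expected_money:,.0f}")  # $999,000
print(f"Expected utility, tail included:   {expected_utility:,.0f}")  # negative
```

However large the probable gain, a sufficiently bad tail outcome flips the decision.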

So is frying the planet.

And this is why I think scientists like Judith Curry and Nic Lewis don’t really get risk. They argue that doubling atmospheric CO2 isn’t much to worry about because we may only warm a little. Yes, we may warm only a little; but then again we may not. Here is the climate sensitivity table from a recent paper by the two:

[Table: climate sensitivity estimates from the Lewis and Curry paper]

ECS refers to equilibrium climate sensitivity and TCR to transient climate response. Simplistically, the former has a longer time horizon than the latter.

On the back of this paper, Curry was welcomed with open arms by the Wall Street Journal to write an op-ed (which you can find on Curry’s personal web site here) and feted by climate-skeptic blogs across the world. In the op-ed, Curry takes an extract from the table:

Nicholas Lewis and I have just published a study in Climate Dynamics that shows the best estimate for transient climate response is 1.33 degrees Celsius with a likely range of 1.05-1.80 degrees Celsius. Using an observation-based energy-balance approach, our calculations used the same data for the effects on the Earth’s energy balance of changes in greenhouse gases, aerosols and other drivers of climate change given by the IPCC’s latest report.

And then goes on to make a policy recommendation:

This slower rate of warming—relative to climate model projections—means there is less urgency to phase out greenhouse gas emissions now, and more time to find ways to decarbonize the economy affordably. It also allows us the flexibility to revise our policies as further information becomes available.

Now ‘likely’ as used in the WSJ op-ed means the 17-83% range. That leaves 17% of the outcomes in the good tail–warming below 1.05 degrees Celsius–and 17% in the bad tail–above 1.80 degrees Celsius. But from the table we also see that for the 5-95% range, the outcome of a doubling of CO2 may be above 2 degrees Celsius of warming–2.5 degrees to be exact.

Some rough back-of-the-envelope calculations. The pre-industrial level of atmospheric CO2 was 280 parts per million (ppm). We are now at 400 ppm, so to double from pre-industrial concentrations we would need to reach 560 ppm. CO2 concentrations are also rising at around 2.25-2.50 ppm per annum. Warming scales roughly with the logarithm of the CO2 concentration ratio, so if climate sensitivity to a doubling of CO2 were 3 degrees Celsius, we would get 2 degrees of global warming when CO2 concentrations reached around 450 ppm, which will be in about 20-30 years’ time. Two degrees of warming is considered (somewhat arbitrarily) to be the borderline for dangerous climate change.
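
For anyone who wants to check that arithmetic, here is a minimal sketch in Python. It assumes the standard logarithmic relationship between CO2 concentration and warming; the concentrations and growth rate are the ones quoted above:

```python
import math

C0 = 280.0       # pre-industrial CO2 concentration (ppm)
C_NOW = 400.0    # current concentration (ppm)
GROWTH = 2.375   # assumed annual CO2 growth (midpoint of 2.25-2.50 ppm)

def warming(concentration, sensitivity=3.0):
    """Warming scales with the log of the concentration ratio."""
    return sensitivity * math.log2(concentration / C0)

print(f"Warming at 450 ppm: {warming(450):.2f} deg C")               # ~2.05
print(f"Warming at 560 ppm (a doubling): {warming(560):.2f} deg C")  # 3.00
print(f"Years until 450 ppm: {(450 - C_NOW) / GROWTH:.0f}")          # ~21
```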

Now Curry and Lewis themselves see a non-negligible risk–5%–that the sensitivity is 2.5 degrees or higher. If they are right, the current rate of CO2 emissions would still lock us into a 2-degree warmer world, just maybe 10 years later than the consensus suggests–say in 2050. And then we come to the tail.

What is going on further into the tail? I want to know about this low-probability tail risk. It is important. This is the chamber containing the one bullet. If you wouldn’t play Russian roulette even at very low odds, why accept permanently damaging the planet at such odds?

How deeply do we have to get into the tail before we hit catastrophic climate change of 4 to 6 degrees of warming (remember that this particular risk is a composite of how much carbon we put into the atmosphere and how sensitive the climate is to that carbon)? Let’s suppose we have a two-sided 99% confidence interval that we remain outside the disastrous outcomes. That leaves 0.5% in the really bad upper tail–odds of one in 200. Are you happy ignoring a disaster-movie outcome if it only has odds of one in 200?
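
Putting a number on that tail requires a distributional assumption. Purely for illustration, suppose the uncertainty followed a lognormal distribution fitted to the 17-83% ‘likely’ range quoted above (the true posterior is almost certainly fatter-tailed than this):

```python
from math import log
from statistics import NormalDist

# Fit a lognormal to the quoted 'likely' (17-83%) range of 1.05-1.80 deg C.
# The lognormal shape is an illustrative assumption, nothing more.
q17, q83 = 1.05, 1.80
z83 = NormalDist().inv_cdf(0.83)             # ~0.954
mu = (log(q17) + log(q83)) / 2               # log-scale centre
sigma = (log(q83) - log(q17)) / (2 * z83)

def prob_above(threshold):
    """P(response > threshold) under the fitted lognormal."""
    return 1 - NormalDist(mu, sigma).cdf(log(threshold))

print(f"P(> 2.5 deg C): {prob_above(2.5):.1%}")   # ~1.7%
print(f"P(> 3.0 deg C): {prob_above(3.0):.2%}")   # ~0.3%
```

Notice that this thin-tailed fit puts only around 2% of outcomes above 2.5 degrees, whereas Curry and Lewis’s own table puts 5% there. In other words, even their published distribution is fatter-tailed than a lognormal, which is precisely the worry.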

Moreover, if we ignore this tail risk of climate sensitivity and feel “there is less urgency to phase out greenhouse gas emissions now”, isn’t there the possibility that, due to fossil fuel infrastructure lock-in, we commit ourselves to more than a doubling of atmospheric CO2?

It gets worse. What if their climate sensitivity numbers are wrong? Curry and Lewis use one particular approach to reach their figures, but there are others. Michael Mann sets out the alternative approaches and the resulting climate sensitivity numbers in a Scientific American article here. In general, they are nearly all higher than those of Curry and Lewis. So Curry and Lewis’ disastrous climate change tail risk, with odds perhaps measured in the hundreds, may actually be a tail risk with odds measured in the tens. We can’t really be sure at this stage.

[Figure: alternative climate sensitivity estimates, from Michael Mann’s Scientific American article]

Finally, going on a march would appear a quixotic act in the face of the wicked problem of climate change. But one can take the same risk approach to extremely unlikely, but very harmful, events here as well. Voting, demonstrating and lobbying may have only a very small chance of changing the final outcome. But the potential impact of altering that outcome–in this case by encouraging more aggressive CO2 emission mitigation–is monumental. This is reverse Russian roulette, where the chamber with the bullet becomes the benign outcome. And so I march.

The Trillionth Tonne

Communicating climate risk to non-specialists is not easy. Nonetheless, I think it is possible. In my own personal journey to understanding the risks my family and I face, I have found that getting to grips with the idea of a carbon budget has been vital. So I owe a great deal of gratitude to those scientists who have thought long and hard about how to highlight the link between carbon and temperature change.

The carbon budget concept first found a wider audience in the journal Nature with the publication of two papers, Allen et al (here) and Meinshausen et al (here). A less technical commentary piece entitled “The Exit Strategy” accompanied these two papers and is an absolute must-read for any thinking person.

The central tenet of these papers is that only a limited amount of fossil-fuel carbon can be burnt and turned into CO2 before we are committed to warming the earth by 2 degrees Celsius. Given our current state of knowledge, Myles Allen and his colleagues also suggest that our carbon budget is one trillion tonnes (or rather, this is their best estimate of what can be released). The time path over which that trillion tonnes of carbon is emitted has almost no bearing on the level of actual warming, due to the lag of temperature behind CO2 and the fact that CO2 resides in the atmosphere for so long.

[Figure: CO2 emissions paths]
Note that they tackle the question of climate sensitivity to CO2 somewhat differently from the approach taken by the Intergovernmental Panel on Climate Change (IPCC). In short, the IPCC defines climate sensitivity as the rise in global mean temperature from a doubling of atmospheric CO2 over pre-industrial levels. The preferred metric of Allen and his colleagues is how much global mean temperature rises per trillion tonnes of carbon emitted.
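
A minimal sketch of the cumulative-carbon metric, assuming an illustrative figure of 2 degrees Celsius of peak warming per trillion tonnes of carbon (treat both numbers below as assumptions for illustration rather than the paper’s precise estimates):

```python
# Cumulative-carbon metric: peak warming is (roughly) proportional to
# total carbon ever emitted, regardless of the emission path.
WARMING_PER_TTC = 2.0   # assumed deg C per trillion tonnes of carbon (TtC)

def warming_from_cumulative(carbon_ttc):
    """Peak warming implied by cumulative emissions (trillion tonnes C)."""
    return WARMING_PER_TTC * carbon_ttc

# Assuming roughly half a trillion tonnes emitted to date:
print(warming_from_cumulative(0.5))   # ~1 deg C already committed
print(warming_from_cumulative(1.0))   # ~2 deg C at the trillionth tonne
```

The attraction of the metric is exactly the path-independence described above: it turns the climate problem into a budget rather than a rate.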

Helpfully, Oxford University hosts a web site based on this methodology telling us how far we are along the way to burning that trillionth tonne. The answer is here:

[Image: the trillionth tonne counter]


Risk, Sensitivity and Sifting the Studies

Global warming? How bad could it get? Of course, with all of us being knowledgeable about risk, we understand that this is really a question of probability multiplied by impact (that, in turn, spans everything from probable-but-quite-bad stuff happening through to possible-but-bloody-awful stuff happening).

But let’s chunk that up into three manageable variables: 1) how much CO2 we are throwing up into the atmosphere, 2) how much warming that CO2 is creating, and 3) how much damage the warming is causing.
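
Chained together, the three variables look something like the sketch below. The damage function is a pure illustration (loosely inspired by the quadratic forms used in integrated assessment models, with an arbitrary coefficient):

```python
import math

def concentration(ppm_added, pre_industrial=280.0):
    """Variable 1: emissions push up the CO2 concentration (ppm)."""
    return pre_industrial + ppm_added

def warming(conc, sensitivity=3.0, pre_industrial=280.0):
    """Variable 2: sensitivity converts concentration into deg C."""
    return sensitivity * math.log2(conc / pre_industrial)

def damage_pct_gdp(delta_t, coefficient=0.5):
    """Variable 3: an assumed quadratic damage function (% of GDP)."""
    return coefficient * delta_t ** 2

c = concentration(170)   # +170 ppm takes us to 450 ppm
t = warming(c)           # ~2.05 deg C at a sensitivity of 3
print(f"{c:.0f} ppm -> {t:.2f} deg C -> {damage_pct_gdp(t):.1f}% of GDP")
```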

This gives anyone of a “skeptical” disposition three lines of attack: 1) dispute the trajectory of fossil fuel emissions, 2) uncover academic papers that suggest a low climate sensitivity to CO2, or 3) welcome the warmer world as being beneficial to mankind.

Out of the primeval swamp that is the blogosphere, a Darwinian struggle has led to two sites emerging triumphant (one on either side of the Atlantic) as the alpha-male climate “skeptic” clearing houses. From the U.S., we have Watts Up With That, and from the U.K., Bishop Hill. If you read any article bashing climate change, it is a good bet that the columnist or journalist sourced it from one of these two.

Not surprising, therefore, that both blogs have jumped on an as-yet-unpublished study by Norwegian researchers stating that climate sensitivity to a doubling of CO2 is as low as 1.9 °C (see here and here).

As a non-scientist but a student of risk, I suggest a three-step approach to any claim that there is little or no risk from climate change, and I use the Norwegian study as an example of this process.

Back to that Big Number

In my post “A Big Number Gets Tweaked” I focused on ‘climate sensitivity’, aka the global mean surface temperature response to a doubling of CO2. It is an important number, and a basic understanding of what it means is an essential part of what I would call ‘climate change literacy’.

Going back to the Intergovernmental Panel on Climate Change (IPCC)’s Assessment Report 4 (AR4), published in 2007, a definition of climate sensitivity can be found on page 12 of the Summary for Policymakers here.

The equilibrium climate sensitivity is a measure of the climate system response to sustained radiative forcing. It is not a projection but is defined as the global average surface warming following a doubling of carbon dioxide concentrations. It is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded, but agreement of models with observations is not as good for those values. Water vapour changes represent the largest feedback affecting climate sensitivity and are now better understood than in the TAR. Cloud feedbacks remain the largest source of uncertainty.

The chart below gives a sense of the different sensitivity estimates that provided the background to the IPCC’s final number:

This definition of climate sensitivity dates back to a landmark report by Jule Charney et al in 1979 (here). In fact, to avoid confusion, we could call it Charney sensitivity. Now what Charney sensitivity isn’t (surprisingly) is the real-world sensitivity of surface temperatures to a doubling of CO2. This is because Charney sensitivity was a blending of the results of two climate models that held a number of variables constant. Of course, the Charney sensitivity in its modern version is now backed up by a multitude of models of far greater sophistication, but interestingly the sensitivity number that came out of the 30-year-old Charney report has held up pretty well. Nonetheless, the Charney sensitivity has a somewhat narrow definition. The excellent climate-scientist-run blog RealClimate (www.realclimate.org) explains this in more detail here:

The standard definition of climate sensitivity comes from the Charney Report in 1979, where the response was defined as that of an atmospheric model with fixed boundary conditions (ice sheets, vegetation, atmospheric composition) but variable ocean temperatures, to 2xCO2. This has become a standard model metric (because it is relatively easy to calculate). It is not however the same thing as what would really happen to the climate with 2xCO2, because of course, those ‘fixed’ factors would not stay fixed.

A wider definition is usually termed the Earth System sensitivity, which allows all the boundary conditions held fixed in the Charney definition to vary. As such, ice sheets, vegetation changes and atmospheric composition can provide feedbacks to temperature and thus cause a greater temperature response over the longer term. The Earth System sensitivity is, in theory, closer to the real world, as it tells us at what temperature the system will ultimately settle back into equilibrium.

The most influential calculation of Earth System sensitivity has been that made by NASA’s Jim Hansen, since it forms the scientific foundation for the 350.org climate change campaigning organisation. As the name suggests, 350.org urges humanity to strive toward a target of 350 parts per million (ppm) of CO2. The rationale for the target can be found here and rests heavily on a paper by Jim Hansen and his coauthors entitled “Target atmospheric CO2: Where should humanity aim?”.

In the abstract of the Hansen article, we immediately see a differentiation between a sensitivity that includes only fast feedback processes (a Charney sensitivity) and an equilibrium sensitivity that includes slower feedbacks (an Earth System sensitivity):

Paleoclimate data show that climate sensitivity is ~3°C for doubled CO2, including only fast feedback processes. Equilibrium sensitivity, including slower surface albedo feedbacks, is ~6°C for doubled CO2 for the range of climate states between glacial conditions and ice-free Antarctica. Decreasing CO2 was the main cause of a cooling trend that began 50 million years ago, the planet being nearly ice-free until CO2 fell to 450 ± 100 ppm; barring prompt policy changes, that critical level will be passed, in the opposite direction, within decades.

The paper then goes on to make a pretty forceful policy recommendation:

If humanity wishes to preserve a planet similar to that on which civilization developed and to which life on Earth is adapted, paleoclimate evidence and ongoing climate change suggest that CO2 will need to be reduced from its current 385 ppm to at most 350 ppm, but likely less than that.

Note that the article does contain a number of caveats over climate variability, climate models and other uncertainties. Further, as is the usual process in science, it has received various critiques, many suggesting that a figure of 6 degrees Celsius is too high for long-term sensitivity. What is not in dispute, however, is that an Earth System sensitivity with long-term feedbacks will be higher than a Charney sensitivity with only short-term feedbacks (almost by definition).

Despite this fact, we see numerous media reports getting tangled up between the two types of sensitivity following the publication of the new Schmittner et al paper I talked about in a previous post. This from the Houston Chronicle:

To me, the real effect of this paper will be to really impair the credibility of the more extreme environmentalists who have been saying the planet faces certain doom from climate change.

I am thinking about such efforts as Bill McKibben’s 350 campaign, in which he asserts that 350 ppm is the most important number in the world. Such environmentalists assert that the planet will warm as much as 6 Celsius degrees with a doubling of atmospheric carbon dioxide levels.

That’s a big number and doubtless would have catastrophic consequences for the planet. This is not in dispute. But scientists are now telling us this is not going to happen.

Well, ‘no’ actually. We are comparing apples and pears: Schmittner et al estimate a Charney sensitivity, while Hansen’s 6 degrees is an Earth System sensitivity. Scientists are not now telling us that catastrophic outcomes are not going to happen.

Getting back to the topic of risk, we can now see how a better understanding of the different sensitivity concepts allows ordinary people to get a better idea of the climate risk they and their families face.

To reiterate, we are going from CO2, to temperature (via sensitivity), to impacts. To get a good idea of overall risk, we need a sense of how carbon emissions are trending; then we need a feeling for how sensitive temperature is to CO2; and lastly an understanding of how much the earth changes (and the impact of those changes on us) once the world warms.

The Charney sensitivity is very useful since it gives a floor to the kind of temperature changes we will experience. If the best estimate of this sensitivity number is found in the future to be smaller than the current consensus of 3 degrees, then that—other things being equal—is a positive thing. However, we are not yet in a position to reduce the consensus based on the Schmittner paper.

The Hansen 6 degrees Celsius number is probably a little too high, but if we get anywhere close to it, we are still in the badlands of catastrophic climate change. Nonetheless, the time horizon for the full warming stretches generations into the future; thus, it is probably not the risk metric you would use if your concern goes out only as far as your grandchildren. But I think Jim Hansen receives a lot of undeserved ridicule in certain parts of the blogosphere and American press for championing a number that implies the yet unborn have rights too.

Putting this question of human ethics to one side, those alive today are really interested in a Charney sensitivity plus alpha from a climate risk perspective. The components that make up that ‘plus alpha’ are a topic for another post.

A Big Number Gets Tweaked

If I had to nominate candidates for the title of the two most important numbers in the world, they would have to be 1) the atmospheric concentration of CO2 (which you can find here) and 2) the climate sensitivity of the global mean temperature to a doubling of CO2.

As esoteric as this discussion may appear, both numbers rank above such economic heavyweights as inflation, GDP growth and government debt-to-GDP ratios in determining my kids’ life outcomes (in my humble opinion). Basically, bad things happen as CO2 jumps and temperature rises (see here, here and here).

Now there is a lot more I would like to say about atmospheric CO2 concentration, but that will have to wait for future posts. Today, I want to focus on climate sensitivity, because the journal Science has just published a paper (here) claiming that the numbers we have been using up to now for climate sensitivity are too high.

But before I quote the abstract of the new paper, it is useful to restate the existing consensus from the Intergovernmental Panel on Climate Change (IPCC)’s Assessment Report 4 (AR4), published in 2007. It can easily be found on page 12 of the Summary for Policymakers here. The key paragraph is as follows:

The equilibrium climate sensitivity is a measure of the climate system response to sustained radiative forcing. It is not a projection but is defined as the global average surface warming following a doubling of carbon dioxide concentrations. It is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded, but agreement of models with observations is not as good for those values. Water vapour changes represent the largest feedback affecting climate sensitivity and are now better understood than in the TAR. Cloud feedbacks remain the largest source of uncertainty.

Now we turn to the new academic paper by Schmittner et al. and—after noting that a temperature change of one kelvin (K) is equivalent to one degree Celsius (C)—we read this:

Assessing impacts of future anthropogenic carbon emissions is currently impeded by uncertainties in our knowledge of equilibrium climate sensitivity to atmospheric carbon dioxide doubling. Previous studies suggest 3 K as best estimate, 2 to 4.5 K as the 66% probability range, and nonzero probabilities for much higher values, the latter implying a small but significant chance of high-impact climate changes that would be difficult to avoid. Here, combining extensive sea and land surface temperature reconstructions from the Last Glacial Maximum with climate model simulations, we estimate a lower median (2.3 K) and reduced uncertainty (1.7 to 2.6 K 66% probability). Assuming paleoclimatic constraints apply to the future as predicted by our model, these results imply lower probability of imminent extreme climatic change than previously thought.

Very simplistically, the paper reconstructs the temperature record of the Last Glacial Maximum (LGM, the height of the last ice age) 20,000 years ago. Their findings suggest that the LGM was between 2 and 3 degrees Celsius cooler than the present, against current consensus estimates of around 5 degrees. The authors then matched this temperature against the greenhouse gas concentrations of that time. In sum, for the given difference in CO2 with the present, they got less temperature bang for the CO2 buck than climate models currently suggest for the future.
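
The logic can be caricatured with a back-of-the-envelope calculation. This is emphatically not the paper’s actual method (which involves a model ensemble and Bayesian statistics), and the forcing figures are rough assumptions, but it shows the direction: a smaller inferred LGM cooling means a lower inferred sensitivity.

```python
F_2X = 3.7    # assumed radiative forcing from a CO2 doubling (W/m^2)
F_LGM = 6.5   # assumed total LGM forcing: ice sheets, GHGs, dust (W/m^2)

def implied_sensitivity(lgm_cooling_deg_c):
    """Scale LGM cooling by the forcing ratio to estimate sensitivity."""
    return F_2X * lgm_cooling_deg_c / F_LGM

print(f"LGM 5.0 deg C cooler (older estimates): {implied_sensitivity(5.0):.1f}")
print(f"LGM 2.5 deg C cooler (Schmittner-ish):  {implied_sensitivity(2.5):.1f}")
```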

If we believe the new findings, then the best estimate of climate sensitivity should be reduced from 3 degrees Celsius for a doubling of CO2 to 2.3 degrees—and the range also narrows. Just to put things in context, the pre-industrial concentration of CO2 was 280 parts per million and we are now at around 390 ppm, or up 40%. Now the IPCC’s AR4 also has this to say:

450 ppm CO2-eq corresponds to best estimate of 2.1°C temperature rise above pre-industrial global average, and “very likely above” 1°C rise, and “likely in the range” of 1.4–3.1°C rise.

Now I’ve highlighted it before in another post, but I will highlight it again here: CO2 and CO2-equivalent are different concepts. However, at the current time, non-CO2 atmospheric forcing effects roughly cancel out (for a more detailed discussion, see here), so we are in the happy position of being able to capture what is happening by looking at the CO2 number alone—for the time being.

Moving on, we should note that the international community has decided that 2 degrees Celsius of warming marks the point at which we will experience ‘dangerous’ climate change. This is in the opening paragraph of the Copenhagen Accord:

We underline that climate change is one of the greatest challenges of our time. We emphasise our strong political will to urgently combat climate change in accordance with the principle of common but differentiated responsibilities and respective capabilities. To achieve the ultimate objective of the Convention to stabilize greenhouse gas concentration in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system, we shall, recognizing the scientific view that the increase in global temperature should be below 2 degrees Celsius, on the basis of equity and in the context of sustainable development, enhance our long-term cooperative action to combat climate change.

To recap: we have a best estimate of climate sensitivity of 3 degrees, and based on this number, atmospheric CO2-equivalent should be capped at 450 ppm to hold the temperature rise to around 2 degrees. This, in turn, is because 2 degrees of warming is deemed the level at which ‘dangerous’ climate change develops.

Now what happens if the 3 degree number is incorrect and should be 2.3 degrees? Well, the first reaction is to think that the 450 ppm ‘line in the sand’ for dangerous climate change goes out the window. And if this CO2 concentration number goes out the window, so do all the numbers for ‘extremely dangerous’ climate change and, for that matter, ‘catastrophic’ climate change. If so, the carbon emission paths associated with different levels of warming, as discussed in my post here, also have to be radically revised (see here for the original article).
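
To see why the line in the sand moves, compare the two sensitivities directly. A minimal sketch using the standard logarithmic CO2-temperature relationship (ignoring, as before, the CO2 versus CO2-equivalent subtlety):

```python
import math

C0 = 280.0   # pre-industrial CO2 (ppm)

def warming_at(conc_ppm, sensitivity):
    """Warming implied by the logarithmic CO2-temperature relationship."""
    return sensitivity * math.log2(conc_ppm / C0)

def conc_for(target_deg_c, sensitivity):
    """Concentration at which a given warming target is reached."""
    return C0 * 2 ** (target_deg_c / sensitivity)

print(f"450 ppm at sensitivity 3.0: {warming_at(450, 3.0):.2f} deg C")  # ~2.1
print(f"450 ppm at sensitivity 2.3: {warming_at(450, 2.3):.2f} deg C")  # ~1.6
print(f"2 deg C ceiling at 2.3:     {conc_for(2, 2.3):.0f} ppm")        # ~512
```

On the lower sensitivity, the 2-degree ‘line in the sand’ moves from roughly 450 ppm out to roughly 510 ppm, which is why every downstream policy number would need reworking.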

In addition, the deadline for the cessation of new fossil-fuel-based energy plant installation calculated by the International Energy Agency (IEA), as discussed in my last post here, would also have to be reworked.

However, some caution is in order. First, this is only one paper amongst many that have tackled the question of climate sensitivity from a variety of angles; it should be judged within the context of the total body of work. Further, as with all good science, its assumptions will come under intense scrutiny to check whether the methodology is correct. Unlike the climate-skeptic blog commentary, the authors of the paper fully admit the tentative nature of their findings:

“There are many hypotheses for what’s going on here. There could be something wrong with the land data, or the ocean data. There could be something wrong with the climate model’s simulation of land temperatures, or ocean temperatures. The magnitudes of the temperatures could be biased in some way. Or, more subtly, they could be unbiased, on average, but the model and observations could disagree on where the cold and warm spots are, as I alluded to earlier. Or something even more complicated could be going on.

Until the above questions are resolved, it’s premature to conclude that we have disproven high climate sensitivities, just because our statistical analysis assigns them low probabilities.”

The excellent site Skeptical Science has a great post on the Schmittner et al. paper here. After going through the technical challenges in considerable depth, they also note a critical and inconvenient truth if the article’s findings are correct:

In short, if Schmittner et al. are correct and such a small temperature change can cause such a drastic climate change, then we may be in for a rude awakening in the very near future, because their smaller glacial-interglacial difference would imply a quicker climate response to a global temperature change, as illustrated in Figure 4.

As Figure 4 illustrates, although the Schmittner et al. best estimate for climate sensitivity results in approximately 20% less warming than the IPCC best estimate, we also achieve their estimated temperature change between glacial and interglacial periods (the dashed lines) much sooner. The dashed lines represent the temperature changes between glacial and interglacial periods in the Schmittner (blue) and IPCC (red) analyses. If Schmittner et al. are correct, we are on pace to cause a temperature change of the magnitude of a glacial-interglacial transition – and thus likely similarly dramatic climate changes – within approximately the next century.
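
A rough check of that claim, using the numbers floating around this post. The glacial-interglacial temperature gaps are assumptions read off the discussion above, and the calculation ignores non-CO2 forcings and warming lags, so treat it as an illustration only:

```python
C0, C_NOW, GROWTH = 280.0, 390.0, 2.25   # ppm, ppm, assumed ppm per year

def years_to_delta_t(delta_t, sensitivity):
    """Years until CO2 alone produces a warming of delta_t degrees."""
    target_ppm = C0 * 2 ** (delta_t / sensitivity)
    return (target_ppm - C_NOW) / GROWTH

# Schmittner-style: ~2.5 deg C glacial-interglacial gap, sensitivity 2.3
print(f"Schmittner-style numbers: ~{years_to_delta_t(2.5, 2.3):.0f} years")
# IPCC-style: ~5 deg C gap, sensitivity 3.0
print(f"IPCC-style numbers:       ~{years_to_delta_t(5.0, 3.0):.0f} years")
```

On the Schmittner numbers, a glacial-interglacial-sized temperature change arrives in roughly a century; on the IPCC numbers, it takes more than twice as long. That is the rude awakening Skeptical Science is pointing at.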

In the run-up to the publication of the IPCC’s AR5 report in 2013, it will be critical to see whether a new consensus number emerges that differs from that of the AR4 report in 2007—a consensus that takes all the new findings of the last few years into consideration. As this number changes, so will the world.