Tag Archives: james hansen

The Irony of ‘Alarmists’ Who Err on the Side of Least Drama

Try searching the headlines of Anthony Watts’ WUWT blog for the words “alarmist” or “alarmism” over the past year and you will get 17 hits. The words are never clearly defined, but they imply that climate scientists’ claims are exaggerated and thus without merit.

Fortunately, a literature review by Brysse et al. has recently been released that looks into whether this is true (the original is here, and Skeptical Science also provides a good summary here).

Restated, the question is whether the scientific community, for which the Intergovernmental Panel on Climate Change (IPCC) can be used as a proxy, has made exaggerated predictions. For a full answer to this question, please read the original article or the Skeptical Science post. In summary, the article’s conclusion is that the IPCC has been too cautious in its predictions with respect to sea level rise, Arctic sea ice extent retreat, snow cover reduction, the rise in carbon emissions and ice sheet melt. Hurricane intensity and global mean temperature rise have been in line with predictions.

The most interesting part of the article deals with why scientists have had a tendency to be conservative in their predictions: what the study calls erring ‘on the side of least drama’ (ESLD). In their words: Continue reading

Climate Change: A Question of Caves and Mansions 2?

In my last post, I looked at the neoclassical economist’s view of a world undergoing climate change. The consensus within the profession is that global mean temperature rise will become a growing cost to humanity. Further, such a cost is not being borne by those causing it (a so-called ‘externality’ in the economics literature) and therefore justifies a carbon tax. Finally, and most controversially for some, the standard recommendation is for a slow and steady ramp in taxation from a very low starting point. This rests on the recognition that any investment to mitigate carbon emissions now will translate into lost economic output in the future. So the logic goes: it is often better to get rich and dirty first (before cleaning up), rather than staying clean and poor.

In sum, the economics profession calls for a calm, considered but, above all, slow response to climate change. This is in stark contrast to the position of many climate scientists; take, for example, the sentiment expressed in the following statement by the climate scientist Lonnie Thompson:

Why then are climatologists speaking out about the dangers of global warming? The answer is that virtually all of us are now convinced that global warming poses a clear and present danger to civilization.

Since the scientific and economics communities inhabit completely different academic silos, it is rare to find any intelligent discussion that analyses this dichotomy of opinion. Economists cite scholarly articles published in the leading economics journals, and scientists cite scholarly articles in the leading scientific journals. Perhaps the one exception is the Intergovernmental Panel on Climate Change’s periodic assessment reports, which have provided a communal marketplace of ideas where a variety of disciplines can meet. However, the last report was published in 2007 and was based on information available a few years earlier still. Many economists are therefore not well placed to tap into the rising alarm of climate scientists as new data come in and new studies are published.

If I were a climate scientist trying to instill a sense of urgency among economists, how would I plan my avenue of attack? Continue reading

Back to that Big Number

In my post “A Big Number Gets Tweaked” I focused on ‘climate sensitivity’, aka the global mean surface temperature response to a doubling of CO2. It is an important number, and a basic understanding of what it means is a core part of what I would call ‘climate change literacy’.

Going back to the Intergovernmental Panel on Climate Change (IPCC)’s Assessment Report 4 (AR4) published in 2007, a definition of climate sensitivity can be found on page 12 of the Summary for Policy Makers here.

The equilibrium climate sensitivity is a measure of the climate system response to sustained radiative forcing. It is not a projection but is defined as the global average surface warming following a doubling of carbon dioxide concentrations. It is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded, but agreement of models with observations is not as good for those values. Water vapour changes represent the largest feedback affecting climate sensitivity and are now better understood than in the TAR. Cloud feedbacks remain the largest source of uncertainty.

The chart below gives a sense of the different sensitivity estimates that provided the background to the IPCC’s final number:

This definition of climate sensitivity dates back to a landmark paper by Jule Charney et al. in 1979 (here). In fact, to avoid confusion, we could call it the Charney sensitivity. Now what the Charney sensitivity isn’t (surprisingly) is the real-world sensitivity of surface temperatures to a doubling of CO2. This is because the Charney sensitivity was a blending of the results of two climate models that held a number of variables constant. Of course, the Charney sensitivity in its modern version is now backed up by a multitude of far more sophisticated models, but interestingly the sensitivity number that came out of the 30-year-old Charney report has held up pretty well. Nonetheless, the Charney sensitivity has a somewhat narrow definition. The excellent climate-scientist-run blog RealClimate (www.realclimate.org) explains this in more detail here:

The standard definition of climate sensitivity comes from the Charney Report in 1979, where the response was defined as that of an atmospheric model with fixed boundary conditions (ice sheets, vegetation, atmospheric composition) but variable ocean temperatures, to 2xCO2. This has become a standard model metric (because it is relatively easy to calculate). It is not however the same thing as what would really happen to the climate with 2xCO2, because of course, those ‘fixed’ factors would not stay fixed.

A wider definition is usually termed the Earth System sensitivity that allows all the fixed boundary conditions in the Charney definition to vary. As such, ice sheets, vegetation changes and atmospheric composition can provide feedbacks to temperature and thus cause a greater temperature response over the longer term. The Earth System sensitivity is in theory closer to the real world as it tells us at what temperature the system will ultimately get back to equilibrium.

The most influential calculation of Earth System sensitivity has been that made by NASA’s Jim Hansen, since it forms the scientific foundation for the 350.org climate change campaigning organisation. As the name suggests, 350.org urges humanity to strive toward a target of 350 parts per million (ppm) of CO2. The rationale for the target can be found here and rests heavily on a paper by Jim Hansen and his coauthors entitled “Target atmospheric CO2: Where should humanity aim?“.

In the abstract of the Hansen article, we immediately see a differentiation between a sensitivity that includes only fast feedback processes (a Charney sensitivity) and an equilibrium sensitivity that includes slower feedbacks (an Earth System sensitivity):

Paleoclimate data show that climate sensitivity is ~3°C for doubled CO2, including only fast feedback processes. Equilibrium sensitivity, including slower surface albedo feedbacks, is ~6°C for doubled CO2 for the range of climate states between glacial conditions and ice-free Antarctica. Decreasing CO2 was the main cause of a cooling trend that began 50 million years ago, the planet being nearly ice-free until CO2 fell to 450 ± 100 ppm; barring prompt policy changes, that critical level will be passed, in the opposite direction, within decades.

The paper then goes on to make a pretty forceful policy recommendation:

If humanity wishes to preserve a planet similar to that on which civilization developed and to which life on Earth is adapted, paleoclimate evidence and ongoing climate change suggest that CO2 will need to be reduced from its current 385 ppm to at most 350 ppm, but likely less than that.

Note that the article does contain a number of caveats over climate variability, climate models and other uncertainties. Further, as is the usual process in science, it has received various critiques, many suggesting that a figure of 6 degrees Celsius is too high for long-term sensitivity. What is not in dispute, however, is that an Earth System sensitivity with long-term feedbacks will have a higher value than a Charney sensitivity with only short-term feedbacks (almost by definition).

Despite this fact, we see numerous media reports getting tangled up between the two types of sensitivity following the publication of the new Schmittner et al. paper I talked about in a previous post. This from the Houston Chronicle:

To me, the real effect of this paper will be to really impair the credibility of the more extreme environmentalists who have been saying the planet faces certain doom from climate change.

I am thinking about such efforts as Bill McKibben’s 350 campaign, in which he asserts that 350 ppm is the most important number in the world. Such environmentalists assert that the planet will warm as much as 6 Celsius degrees with a doubling of atmospheric carbon dioxide levels.

That’s a big number and doubtless would have catastrophic consequences for the planet. This is not in dispute. But scientists are now telling us this is not going to happen.

Well, ‘no’ actually. Since we are comparing apples and pears, scientists are not now telling us that catastrophic outcomes will not happen.

Getting back to the topic of risk, we can now see how understanding the different sensitivity concepts gives ordinary people a better idea of the climate risk they and their families face.

To reiterate, we are going from CO2, to temperature (via sensitivity), to impacts. To get a good idea of overall risk we need a sense of how carbon emissions are trending; then we need a feeling for how sensitive temperature is to CO2; and lastly an understanding of how much the earth changes (and the impact of those changes on us) once the world warms.
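The CO2-to-temperature step in that chain can be sketched with the standard logarithmic forcing approximation, under which equilibrium warming is roughly the sensitivity per doubling times the number of doublings: ΔT ≈ S × log₂(C/C₀). This is a minimal illustration, not anything from the Hansen or Schmittner papers; the 3°C and 6°C sensitivities and the 385 ppm figure come from the discussion above, while the 280 ppm pre-industrial baseline is an assumption I am adding for the example.

```python
import math

def warming(co2_ppm, sensitivity_per_doubling, baseline_ppm=280.0):
    """Equilibrium warming in degrees C for a given CO2 level, using the
    standard logarithmic approximation: dT = S * log2(C / C0)."""
    return sensitivity_per_doubling * math.log2(co2_ppm / baseline_ppm)

# At a full doubling (560 ppm), the approximation returns the sensitivity itself:
charney = warming(560, 3.0)       # fast feedbacks only (Charney): 3.0 C
earth_system = warming(560, 6.0)  # Hansen's long-term estimate: 6.0 C

# At roughly today's level (the post cites 385 ppm), the Charney 'floor'
# implies a bit under 1.4 C of eventual warming:
today_floor = warming(385, 3.0)
```

The same function makes the apples-and-pears point concrete: plugging the Earth System figure into the 385 ppm case roughly doubles the answer, without either number contradicting the other.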

The Charney sensitivity is very useful since it gives a floor to the kind of temperature changes we will experience. If the best estimate of this sensitivity number is found in the future to be smaller than the current consensus of 3 degrees, then that, other things being equal, is a positive thing. However, we are not yet in a position to reduce the consensus based on the Schmittner paper.

The Hansen 6 degrees Celsius number is probably a little too high, but if we get anywhere close to it, we are still in the badlands of catastrophic climate change. Nonetheless, the time horizon for the full warming stretches generations into the future; thus, it is probably not the risk metric you would use if your concern extends only as far as your grandchildren. But I think Jim Hansen receives a lot of undeserved ridicule in certain parts of the blogosphere and American press for championing a number that implies the yet unborn have rights too.

Putting this question of human ethics to one side, those alive today are really interested in a Charney sensitivity plus alpha from a climate risk perspective. The components that make up that ‘plus alpha’ are a topic for another post.