
Blog on Hold

Apologies for the lack of posts for some time. The one common theme of all the best blogs I follow is their consistency in posting. That is something I have been unable to achieve for a variety of reasons: several deaths of close relatives, two moves between countries, a series of changes in accommodation and a general lack of stability in my life. Posting consistently will require a lot more stability than I have at present, and that won’t happen for at least another six months or so.

In addition, through writing posts for this blog I have become more aware that climate change risk can’t be separated from a variety of other risks we face: technology-led economic instability, demographic transitions and resource depletion in particular. This nexus of risks really needs to be dealt with in a unified manner.

With all this in mind, I hope to relaunch the blog in 2013 (potentially with a new name) explicitly dealing with a wider theme than was my original intention with Climate and Risk.

Thanks

Technology: Singularity or Collapse? (Part 2: The Ozone Hole)

In my last post, I made the point that techno-optimists, such as Ray Kurzweil, see technological change transforming economies through the exponential growth of productivity as the present century progresses. Critically, the analysis of Kurzweil and his fellow travellers makes no mention of societal costs—so-called externalities in the language of economics. Each innovation or invention is treated as basically self-contained—overcoming a particular problem without creating any secondary problems in another part of the system.

Unfortunately, this tunnel vision of the benefits of technology does, on many occasions, not correspond to the actual historical record. One technology I have in mind is Thomas Midgley Jr.’s creation of a compound known as chlorofluorocarbon (CFC-12), better known as Freon. CFCs are a classic Kurzweil-type solution to a particular problem, in this case the need for a substitute for the highly poisonous gases used up until the 1930s for refrigeration. At the time of their creation and for many years afterwards, CFCs were believed to be inert and totally harmless to human health. In reality, as the CFCs accumulated in the upper atmosphere, they led to the creation of the Antarctic ozone hole. The journalist and author Dianne Dumanoski, in her book “The End of the Long Summer”, described the ozone hole phenomenon as the most important single event of the 20th century, even eclipsing Neil Armstrong’s first steps on the moon, since it symbolised “the arrival of a new and ominous epoch when human activity began to disrupt the essential but invisible planetary systems that sustain a dynamic, living Earth.” Even more telling, the environmental historian J.R. McNeill described Midgley himself as having “had more impact on the atmosphere than any other single organism in earth’s history.”

Technology: Singularity or Collapse? (Part 1: For Ever Exponential)

In the opening chapter of Ray Kurzweil‘s “The Singularity Is Near” we are presented with the following parable:

A lake owner wants to stay at home to tend to the lake’s fish and make certain that the lake itself will not become covered with lily pads, which are said to double their number every few days. Month after month, he patiently waits, yet only tiny patches of lily pads can be discerned, and they don’t seem to be expanding in any noticeable way. With the lily pads covering less than 1 percent of the lake, the owner figures that it’s safe to take a vacation and leaves with his family. When he returns a few weeks later, he’s shocked to discover that the entire lake has become covered with the pads, and his fish have perished. By doubling their number every few days, the last seven doublings were sufficient to extend the pads’ coverage to the entire lake. (Seven doublings extended their reach 128-fold.) This is the nature of exponential growth.

While ‘the water lily and the lake’ appears a strange choice of metaphor, since if nothing else it highlights the importance of boundaries to growth, what Kurzweil was trying to communicate is that technology has barely begun to transform our lives.
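To put the parable’s arithmetic into more concrete form, here is a minimal sketch (the total number of doublings and the starting coverage are illustrative assumptions, not figures from Kurzweil’s book) showing how the pads remain barely visible for most of the doublings and then swallow the lake in the last few:

```python
# Illustrative sketch of the lily-pad parable: exponential doubling.
# The total number of doublings is an assumption chosen so that the
# final doubling exactly fills the lake.

def coverage_after(doublings, total_doublings=30):
    """Fraction of the lake covered after a given number of doublings,
    assuming the lake is fully covered after `total_doublings`."""
    return 1.0 / 2 ** (total_doublings - doublings)

for d in range(23, 31):
    print(f"after doubling {d:2d}: {coverage_after(d):7.2%} of the lake covered")

# The last seven doublings take coverage from under 1% (1/128 of the lake)
# to 100% -- a 128-fold increase, as in Kurzweil's example.
```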

By contrast, consider the 1972 report to the Club of Rome published under the title “The Limits to Growth.” Much maligned and mostly misrepresented, The Limits to Growth (LTG) was nothing more than a mathematical analysis of linear and exponential growth rates and ultimate constraints. According to the authors, the tyranny of exponential growth rates would eventually lead population and industrial production to explode, setting off a negative feedback in terms of burgeoning pollution and the eventual exhaustion of food and resources. The report never provided specific dates for the depletion of individual materials, although nine out of ten commentaries on the report claim it did (for a post I did on this particular urban legend, see here). Nonetheless, what the report did do was suggest that the idea of inevitable constant human progress was a dangerous myth.

A Big Number Gets Tweaked

If I had to nominate candidates for the title of two most important numbers in the world, they would have to be 1) the concentration of CO2 in the atmosphere (which you can find here) and 2) the climate sensitivity of the global mean temperature to a doubling of CO2.

As esoteric as this discussion may appear, both numbers rank above such economic heavyweights as inflation, GDP growth and government debt-to-GDP ratios in terms of the life outcomes of my kids (in my humble opinion). Basically, bad things happen as CO2 jumps and temperature rises (see here, here and here).

Now there is a lot more I would like to say about atmospheric CO2 concentration, but that will have to wait for future posts. Today, I want to focus on climate sensitivity because an academic paper in the journal Science has just been released (here) that claims the numbers we have been using up to now for climate sensitivity have been too high.

But before I quote the abstract of the new paper, it is useful to restate the existing consensus from the Intergovernmental Panel on Climate Change (IPCC)’s Assessment Report 4 (AR4) published in 2007. It can easily be found on page 12 of the Summary for Policy Makers here. The key paragraph is as follows:

The equilibrium climate sensitivity is a measure of the climate system response to sustained radiative forcing. It is not a projection but is defined as the global average surface warming following a doubling of carbon dioxide concentrations. It is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded, but agreement of models with observations is not as good for those values. Water vapour changes represent the largest feedback affecting climate sensitivity and are now better understood than in the TAR. Cloud feedbacks remain the largest source of uncertainty.

Now we turn to the new academic paper by Schmittner et al. and—after noting that a temperature change of one Kelvin (K) is equivalent to a change of one degree Celsius (°C)—we read this:

Assessing impacts of future anthropogenic carbon emissions is currently impeded by uncertainties in our knowledge of equilibrium climate sensitivity to atmospheric carbon dioxide doubling. Previous studies suggest 3 K as best estimate, 2 to 4.5 K as the 66% probability range, and nonzero probabilities for much higher values, the latter implying a small but significant chance of high-impact climate changes that would be difficult to avoid. Here, combining extensive sea and land surface temperature reconstructions from the Last Glacial Maximum with climate model simulations, we estimate a lower median (2.3 K) and reduced uncertainty (1.7 to 2.6 K 66% probability). Assuming paleoclimatic constraints apply to the future as predicted by our model, these results imply lower probability of imminent extreme climatic change than previously thought.

Very simplistically, the paper reconstructs the temperature record of the Last Glacial Maximum (LGM, the height of the last ice age) 20,000 years ago. Their findings suggest that the LGM was between 2 and 3 degrees Celsius cooler than the present, against current consensus estimates of around 5 degrees. The authors then matched this temperature against the greenhouse gas concentrations of that time. In sum, for the given difference in CO2 with the present, they got less bang for the buck in terms of CO2 impact on temperature compared with what climate models currently suggest for the future.

If we believe the new findings, then the best estimate of climate sensitivity should be reduced from 3 degrees Celsius for a doubling of CO2 to 2.3 degrees—and the range also narrowed. Just to put things in context, the pre-industrial concentration of CO2 was 280 parts per million and we are now at around 390 ppm, an increase of roughly 40%. Now the IPCC’s AR4 also has this to say:

450 ppm CO2-eq corresponds to best estimate of 2.1°C temperature rise above pre-industrial global average, and “very likely above” 1°C rise, and “likely in the range” of 1.4–3.1°C rise.

Now I’ve highlighted it before in another post, but I will highlight it again in this post: CO2 and CO2-equivalent are different concepts. However, at the current time, non-CO2 atmospheric forcing effects roughly cancel out (for a more detailed discussion of this, see here), so we are in the happy position of being able to capture what is happening by looking at the CO2 number alone—for the time being.
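The post does not show the arithmetic, but a minimal sketch makes the two sensitivity estimates concrete. It assumes the standard simplification that equilibrium warming scales with the logarithm of the CO2 concentration, and it ignores non-CO2 forcings (which, as noted above, roughly cancel for now) and the lag before equilibrium is reached:

```python
import math

def equilibrium_warming(c_now_ppm, c_pre_ppm=280.0, sensitivity=3.0):
    """Equilibrium warming (degrees C) at a CO2 concentration of c_now_ppm,
    assuming warming scales with the logarithm of concentration and that
    `sensitivity` is the warming per doubling of CO2."""
    return sensitivity * math.log(c_now_ppm / c_pre_ppm) / math.log(2.0)

# AR4 best estimate (3.0 C per doubling) vs. Schmittner et al. (2.3 C per doubling)
for s in (3.0, 2.3):
    dt = equilibrium_warming(390.0, sensitivity=s)
    print(f"sensitivity {s} C per doubling -> about {dt:.1f} C of eventual warming at 390 ppm")
```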

Moving on, we should note that the international community has decided that 2 degrees Celsius of warming marks the point at which we will experience ‘dangerous’ climate change. This is in the opening paragraph of the Copenhagen Accord:

We underline that climate change is one of the greatest challenges of our time. We emphasise our strong political will to urgently combat climate change in accordance with the principle of common but differentiated responsibilities and respective capabilities. To achieve the ultimate objective of the Convention to stabilize greenhouse gas concentration in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system, we shall, recognizing the scientific view that the increase in global temperature should be below 2 degrees Celsius, on the basis of equity and in the context of sustainable development, enhance our long-term cooperative action to combat climate change.

To recap, we have a best estimate of climate sensitivity of 3 degrees. And based on this number,  atmospheric CO2-equivalent should be capped at 450 ppm to hold temperature rise to around 2 degrees. This, in turn, is because 2 degrees of warming is deemed the level at which ‘dangerous’ climate change develops.

Now what happens if the 3 degree number is incorrect and should be 2.3 degrees? Well, the first reaction is to think that the 450 ppm ‘line in the sand’ for dangerous climate change goes out the window. Further, if this CO2 concentration number goes out the window, so do all the numbers for ‘extremely dangerous’ climate change, and for that matter ‘catastrophic’ climate change. If so, the carbon emissions paths associated with different levels of warming as talked about in my post here also have to be radically revised (click for larger image below, see here for the original article).
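Using the same logarithmic simplification as the earlier sketch, we can get a rough sense of how the ‘line in the sand’ moves with the sensitivity estimate (a back-of-the-envelope illustration only, and subject to the CO2 versus CO2-equivalent caveat above):

```python
def co2_ceiling_ppm(warming_limit_c, sensitivity_c, c_pre_ppm=280.0):
    """CO2 concentration (ppm) at which equilibrium warming reaches
    warming_limit_c, assuming a logarithmic dependence on concentration
    and `sensitivity_c` degrees of warming per doubling of CO2."""
    return c_pre_ppm * 2 ** (warming_limit_c / sensitivity_c)

for s in (3.0, 2.3):
    print(f"sensitivity {s} C -> roughly {co2_ceiling_ppm(2.0, s):.0f} ppm for 2 C of warming")
```

On this crude arithmetic, lowering the sensitivity from 3 to 2.3 degrees pushes the concentration consistent with 2 degrees of warming from the mid-400s of ppm towards roughly 500 ppm, which is why the associated emissions paths would need to be reworked.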

In addition, the deadline for halting the installation of new fossil-fuel-based power generation plant, as calculated by the International Energy Agency (IEA) and discussed in my last post here, would also have to be reworked.

However, some caution is in order. First, this is only one paper amongst many that have tackled the question of climate sensitivity from a variety of angles; it should be judged within the context of the total body of work. Further, as with all good science, its assumptions will come under intense scrutiny to check whether the methodology is correct. Unlike much of the climate-skeptic blog commentary, the authors of the report fully admit the tentative nature of their findings:

“There are many hypotheses for what’s going on here.  There could be something wrong with the land data, or the ocean data.  There could be something wrong with the climate model’s simulation of land temperatures, or ocean temperatures.  The magnitudes of the temperatures could be biased in some way.  Or, more subtly, they could be unbiased, on average, but the model and observations could disagree on where the cold and warm spots are, as I alluded to earlier.  Or something even more complicated could be going on.

Until the above questions are resolved, it’s premature to conclude that we have disproven high climate sensitivities, just because our statistical analysis assigns them low probabilities.”

The excellent site Skeptical Science has a great post on the Schmittner et al. paper here. After going through the technical challenges in considerable depth, they also note a critical, and inconvenient, truth if the article’s findings are correct:

In short, if Schmittner et al. are correct and such a small temperature change can cause such a drastic climate change, then we may be in for a rude awakening in the very near future, because their smaller glacial-interglacial difference would imply a quicker climate response to a global temperature change, as illustrated in Figure 4.

As Figure 4 illustrates, although the Schmittner et al. best estimate for climate sensitivity results in approximately 20% less warming than the IPCC best estimate, we also achieve their estimated temperature change between glacial and interglacial periods (the dashed lines) much sooner.  The dashed lines represent the temperature changes between glacial and interglacial periods in the Schmittner (blue) and IPCC (red) analyses.  If Schmittner et al. are correct, we are on pace to cause a temperature change of the magnitude of a glacial-interglacial transition – and thus likely similarly dramatic climate changes – within approximately the next century.

In the run-up to the publication of the IPCC’s AR5 report in 2013, it will be critical to see whether a new consensus number emerges that is different from that of the AR4 report of 2007—a consensus that takes all the new findings made over the last few years into consideration. As this number changes, so will the world.

Jeremy Grantham and Climate Change

I am sometimes amazed by how climate change has become almost a taboo issue in certain circles of the investment community. For example, John Mauldin, who writes Thoughts from the Frontline, probably the most widely circulated and read financial newsletter in the world, barely touches on the issue. Is he a skeptic? My sense is ‘yes’, based on his musings on volcanoes in his 2011 New Year piece, although it is difficult to tell.

Yet regardless of his own inclination, my suspicion is that John has decided to ignore climate change for fear of upsetting his readership base, given the political polarisation that the issue produces. Paraphrasing Oscar Wilde, climate change has become the investment topic that dares not speak its name. It is therefore refreshing to find at least one high-profile investment manager who is not afraid to take a very public stand on the issue: Jeremy Grantham.

In the GMO Quarterly Letter written in July 2010, Grantham came out with one of the most succinct explanations of global warming for investment professionals I have seen, under the title “Everything You Need to Know about Global Warming in 5 Minutes”. It is worth spending 5 minutes of your time to read it through:

1) The amount of carbon dioxide (CO2) in the atmosphere, after at least several thousand years of being quite constant, started to rise with the advent of the Industrial Revolution. It has increased by 40% and is rising each year. This is certain and straight forward.

2) One of the properties of CO2 is that it creates a greenhouse effect and, other things being equal, causes the temperature to rise. This is just physics.

3) Several other factors, like changes in solar output, have major influences on climate over millennia, but these effects are known, are observable, and have been allowed for in current models. Critically, there have been no important changes in these other factors over the last 100 years.

4) The doubts arise when it comes to the interaction of CO2 with other variables in a complicated system, especially water vapor. It is impossible to be sure whether the temperature will rise slowly or rapidly. But, the past can be measured. The temperature has indeed steadily risen and is well within the boundaries predicted for the man-made effect. But the forecasts still range very widely, from a harmless negligible rise to a potentially disastrous +6 degrees Fahrenheit or higher within this century. The main danger of the CO2 interaction with water vapor is the high probability that it will cause a great increase in severe precipitation episodes.

5) Skeptics argue that this wide range of uncertainty lowers the need to act: “Why spend money when you’re not certain?” But since the penalties rise hyperbolically at the tail, a wider range implies a greater risk (and a greater expected value of the costs). This is logically and mathematically rigorous and yet is still argued.

6) Pascal asks the question: What is the expected value of a very small chance of an infinite loss? And, he answers, “Infinite.” In this example, what is the cost of lowering CO2 output and having the long-term effect of increasing CO2 turn out to be nominal? The cost appears to be equal to foregoing, once in your life, six months’ to one year’s global growth – 2% to 4%, or less. The benefits, even with no warming, include: energy independence from the Middle East; more jobs, since wind and solar power and increased efficiency are more labor-intensive than another coal-fired power plant; less pollution of streams and air; and an early leadership role for the U.S. in industries that will inevitably become important. Conversely, what are the costs of not acting on prevention when the results turn out to be serious: costs that may dwarf those for prevention; and probable political destabilization from droughts, famine, mass migrations, and even war. And, to Pascal’s real point, what might be the cost at the very extreme end of the distribution: definitely life changing, possibly life threatening.

7) The biggest cost of all from global warming is likely to be the accumulated loss of biodiversity. This features nowhere in economic cost-benefit analysis because, not surprisingly, it is hard to put a price on that which is priceless. 

8.) A special word on the right-leaning think tanks: As libertarians, they abhor the need for government spending or even governmental leadership, which in their opinion is best left to private enterprise. In general, this may be an excellent idea. But global warming is a classic tragedy of the commons – seeking your own individual advantage, for once, does not lead to the common good, and the problem desperately needs government leadership and regulation. Sensing this, these think tanks have allowed their drive for desirable policy to trump science. Not a good idea.

9) Also, I should make a brief note to my own group – die-hard contrarians. Dear fellow contrarians, I know the majority is usually wrong in the behavioral jungle of the stock market. And heaven knows I have seen the soft scientists who lead finance theory attempt to bully their way to a uniform acceptance of the bankrupt theory of rational expectations and market efficiency. But climate warming involves hard science. The two most prestigious bastions of hard science are the National Academy in the U.S. and the Royal Society in the U.K., to which Isaac Newton and the rest of that huge 18th century cohort of brilliant scientists belonged. The presidents of both societies wrote a note recently, emphasizing the seriousness of the climate problem and that it was man-made. (See the attachment to last quarter’s Letter.) Both societies have also made full reports on behalf of their membership stating the same. Do we believe the whole elite of science is in a conspiracy? At some point in the development of a scientific truth, contrarians risk becoming flat earthers.

10) Conspiracy theorists claim to believe that global warming is a carefully constructed hoax driven by scientists desperate for … what? Being needled by nonscientific newspaper reports, by blogs, and by right-wing politicians and think tanks? Most hard scientists hate themselves or their colleagues for being in the news. Being a climate scientist spokesman has already become a hindrance to an academic career, including tenure. I have a much simpler but plausible “conspiracy theory”: that fossil energy companies, driven by the need to protect hundreds of billions of dollars of profits, encourage obfuscation of the inconvenient scientific results.

11) Why are we arguing the issue? Challenging vested interests as powerful as the oil and coal lobbies was never going to be easy. Scientists are not naturally aggressive defenders of arguments. In short, they are conservatives by training: never, ever risk overstating your ideas. The skeptics are far, far more determined and expert propagandists to boot. They are also well-funded. That smoking caused cancer was obfuscated deliberately and effectively for 20 years at a cost of hundreds of thousands of extra deaths. We know that for certain now, yet those who caused this fatal delay have never been held accountable. The profits of the oil and coal industry make tobacco’s resources look like a rounding error. In one notable case, the obfuscators of global warming actually use one MIT professor who also defended tobacco! The obfuscators’ simple and direct motivation – making money in the near term, which anyone can relate to – combined with their resources and, as it turns out, propaganda talents, have meant that we are arguing the science long after it has been nailed down. I, for one, admire them for their P.R. skills, while wondering, as always: “Have they no grandchildren?”

12) Almost no one wants to change. The long-established status quo is very comfortable, and we are used to its deficiencies. But for this problem we must change. This is never easy.

13) Almost everyone wants to hear good news. They want to believe that dangerous global warming is a hoax. They, therefore, desperately want to believe the skeptics. This is a problem for all of us.
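Points 5 and 6 of Grantham’s list rest on a piece of expected-value arithmetic: if damages grow faster than linearly in the tail, then widening the uncertainty range raises the expected cost even when the central estimate stays the same. Here is a minimal sketch of that logic, with an entirely illustrative damage function and warming distributions (none of the numbers come from Grantham):

```python
import random

random.seed(0)

def damage(warming):
    """Illustrative convex damage function: costs rise steeply in the tail."""
    return warming ** 3

def expected_damage(mean, spread, n=100_000):
    """Monte Carlo estimate of expected damage for normally distributed
    warming with the given mean and standard deviation (floored at zero)."""
    total = 0.0
    for _ in range(n):
        w = max(0.0, random.gauss(mean, spread))
        total += damage(w)
    return total / n

# Same central estimate of warming, narrower vs. wider uncertainty.
print("narrow range:", round(expected_damage(3.0, 0.5), 1))
print("wide range:  ", round(expected_damage(3.0, 1.5), 1))
```

Both cases share the same central warming estimate, yet the expected damage is substantially higher in the wide-range case, which is Grantham’s point: greater uncertainty is an argument for action, not against it.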

And Grantham has not been shy of putting his money where his mouth is—not least through funding the Grantham Institute for Climate Change at Imperial College and the Grantham Research Institute on Climate Change and the Environment at the London School of Economics.

The New York Times has just published an article entitled “Can Jeremy Grantham Profit from Ecological Mayhem” that I hope will introduce Grantham’s views to a wider audience. The title of the article is somewhat misleading, to the degree that Grantham’s charitable actions aim to prevent ecological mayhem. But what is new in the article is Grantham’s proposal that climate change should be recast as a resource issue to gain some political traction. In his own words:

“Global warming is bad news. Finite resources is investment advice.”

Personally, I believe that climate change can be sold as a financial threat from a variety of angles. To the question “Have they no grandchildren?”, I think the answer is “Yes, they do”. And the grandchildren will take a variety of hits: some via climate-induced resource limitations bearing down on economic growth, and some through direct hits via sea-level rise and extreme weather events. In the final analysis, those ignorant of global warming will see their wealth, and the wealth of their families, transferred to those who are informed.