Category Archives: Technology

Battery Banter 5: The Relevance (or Not) of Moore’s Law

Concurrently with writing this series of blog posts, I have been reading Steve LeVine’s newly published book “The Powerhouse: Inside the Invention of a Battery to Save the World”. The book is a bit of a mess, full of random jumps, wrong turns and dead ends. Perhaps that is appropriate, since it describes a battery development process that is itself full of random jumps, wrong turns and dead ends.

While the back cover blurb tells me that the book reads like a thriller, it is more like the dog that didn’t bark in Sir Arthur Conan Doyle’s “Silver Blaze”. We have two questing groups of heroes: the public-sector Argonne National Laboratory battery guys and the plucky private-sector upstarts at Envia Systems. Yet the book peters out at the end, with both teams abjectly failing in their respective quests to find the super battery Holy Grail. Argonne’s new version of nickel manganese cobalt batteries (NMC 2.0) suffers from chronic voltage fade (meaning that the performance of the battery slumps after repeated recharging cycles). Meanwhile, Envia’s super battery is spectacularly flawed, based on a collapsing anode and dodgy intellectual property.

Despite the book being in need of a good edit, it is still full of interesting insights into the battery development process. In a chapter recounting conversations with Don Hillebrand, an old school auto expert working at Argonne, Levine makes this observation:

Unlike microchips, batteries don’t adhere to a principle akin to Moore’s law, the rule of thumb that the number of switches on a chip–semiconductor efficiency–doubles every eighteen months. Batteries were comparatively slow to advance. But that did not make electronics superior to electric cars.

Consumer electronics typically wear out and require replacement every two or three years. They lock up, go on the fritz, and generally degrade. They are fragile when jostled or dropped and are often cheaper to replace than repair. If battery manufacturers and carmakers produced such mediocrity, they would be run out of business, sued for billions and perhaps even go to prison if anything catastrophic occurred. Automobiles have to last at least a decade and start every time. Their performance had to remain roughly the same throughout. They had to be safe while moving–or crashing–at high speed.

At this point, I want to refer you back to Gordon Moore’s original 1965 article that ushered in Moore’s Law, entitled “Cramming more components onto integrated circuits”. From this, we have the quintessential exponential chart, which delivers a straight line if you put the y-axis onto a logarithmic scale (click for larger image):

[Image: Moore’s Law paper chart]

This is the world of Ray Kurzweil‘s singularity, which I blogged on a couple of years back in a post called “Singularity or Collapse: Part 1 (For Ever Exponential?)“. As knowledge increases by powers of 10, virtually every challenge faced by mankind dissolves. Continue reading

Battery Banter 4: Could the Grid Cope with a Next Generation EV?

In my last series of posts, I focussed on the war of attrition between electric vehicles (EVs) and traditional internal combustion engine vehicles (ICEs). Due to the recent slump in oil prices, EVs are on the defensive. They need increased volume to get down their cost curves and punch out of their current redoubt of super cars (Tesla) and green credential statement cars (Nissan Leaf). Low gasoline prices have made such an offensive a lot more tricky to pull off.

But let us suppose that a commercial super battery were to emerge that had high energy density and was cheap. What would happen next? Let’s run this thought experiment in a UK context.

First, let’s look at the UK’s existing fleet. Great Britain has a population of 64 million people, who between them drive around 29 million registered cars (source: here, click for larger image).

[Image: Registered cars in the UK]

And annually each car is driven for an average of 8,000 miles, which translates into 22 miles per day (click for larger image; also remember we are smoothing out weekends and holidays).

[Image: Annual average miles travelled]

From a previous blog post, I republish the following chart, which shows the kind of mileage per kilowatt-hour (kWh) a battery achieves at present. Continue reading
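As a back-of-envelope sketch of the thought experiment: the 29 million cars and 22 miles per day come from the figures above, while the 3.5 miles per kWh efficiency is an assumed, illustrative value for a Leaf-class EV, not a number from the charts.

```python
# Back-of-envelope: daily charging energy for an all-EV fleet.
# Car count and daily mileage are from the text; efficiency is an assumption.
cars = 29e6
miles_per_day = 22
miles_per_kwh = 3.5  # assumed Leaf-class efficiency

daily_gwh = cars * miles_per_day / miles_per_kwh / 1e6  # kWh -> GWh
average_gw = daily_gwh / 24  # if charging were spread evenly over the day

print(round(daily_gwh), round(average_gw, 1))  # ~182 GWh/day, ~7.6 GW average
```

Set against UK average electricity demand of very roughly 30-40 GW, that is a material but not obviously unmanageable increment, provided charging is spread through the day rather than concentrated at the evening peak.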

Battery Banter 3: Gasoline’s Dastardly Energy Density

In my last post, I talked about the challenge that low oil prices pose for the electric vehicle industry. The following chart from a 2012 McKinsey battery study shows the key tipping points (click for larger image):

[Image: McKinsey battery study chart]

With US gasoline (petrol) prices currently running at $2.50 per gallon, we are falling into the bottom left corner of the chart. In short, the battery price for battery electric vehicles (BEVs in the chart) must plummet to keep EVs in the game. As stated yesterday, Nissan and Tesla are getting their battery costs down to around $300 per kilowatt-hour (kWh), but this is still far above the current sweet spot of $150-$200 per kWh.

Previously, I also talked about the ‘learning rate’: the rate at which battery prices fall, thanks to manufacturing cost savings gained from experience, with every doubling of battery production volume. The industry is in a ‘Catch-22’ position: it cannot crank up volume sufficiently to get down its cost curve because EVs are still too far adrift from internal combustion engine vehicles on price to secure volume sales. So what is to be done? Continue reading
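The learning-rate idea above can be sketched in a few lines. The $300/kWh starting point is from the text; the 15% learning rate per doubling is an illustrative assumption, not an industry figure:

```python
# 'Learning rate': cost falls by a fixed fraction with each doubling of
# cumulative production volume. Rate of 15% is an assumption for illustration.
def battery_cost(initial_cost, learning_rate, doublings):
    return initial_cost * (1 - learning_rate) ** doublings

# Starting from $300/kWh, cost after 0..3 doublings of volume
costs = [round(battery_cost(300, 0.15, d), 1) for d in range(4)]
print(costs)  # falls below the $200/kWh mark after three doublings
```

This is the Catch-22 in numbers: reaching the sweet spot takes several doublings of cumulative volume, but volume will not double while prices stay high.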

Battery Banter 2: Sliding Down the Electric Vehicle Cost Curve

With impeccable timing (for my current blogging theme), Nature Climate Change has just published a commentary by Bjorn Nykvist and Mans Nilsson reviewing the falling cost of battery packs for electric vehicles (source: here, but apologies as the article is behind a paywall). Bottom line: costs have been falling faster than predicted a few years ago (click for larger image).

[Image: Battery electric vehicle costs]

In line with Tony Seba’s estimates I blogged on two days ago (here), Nykvist and Nilsson saw total battery pack costs fall 14% per annum between 2007 and 2014, from $1,000 per kilowatt-hour (kWh) to $410. The market leaders in auto battery technology, Tesla and Nissan, saw a slightly lower rate of decline of 6 to 9%, since they have been at the cutting edge of improvements and have had less potential for catch-up than the industry as a whole. However, their costs are now seen at around $300 per kWh of battery capacity. Note that a BMW i3 has battery capacity of approximately 19 kWh, a Nissan Leaf 24 kWh and a top-of-the-range Tesla 85 kWh. Continue reading
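Putting the two figures above together, the roughly $300/kWh quoted for the market leaders and the pack capacities just listed imply battery costs per vehicle along these lines:

```python
# Implied pack cost at ~$300/kWh for the three capacities quoted in the text
cost_per_kwh = 300
capacities = {"BMW i3": 19, "Nissan Leaf": 24, "Tesla (top of range)": 85}
pack_costs = {name: kwh * cost_per_kwh for name, kwh in capacities.items()}
print(pack_costs)  # i3 ~$5,700, Leaf ~$7,200, Tesla ~$25,500
```

Which makes clear why the battery remains the single most expensive component of an EV, and why every dollar shaved off the per-kWh cost matters.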

Battery Banter 1: Are Internal Combustion Engines Going the Way of the Horse?

A few days ago, a good friend of mine pointed me toward a presentation on disruptive technologies given by Tony Seba. A YouTube video is available here:

The entire video is worth watching, but today I will restrict myself to the issues he raises relating to battery technology.

Seba stresses that technological change in the transport sector could happen at breakneck speed. With a pair of compelling photos of early-last-century New York, we are asked to remember that a grand disruption in transport has happened before. In the first photo, dating from April 1900, we play a game of spot the car (click for larger image).

[Image: Where is the car?]

In the second, a mere 13 years later, the challenge is to spot the horse.

[Image: Where is the horse?]

The lesson here is that once a disruptive technology reaches a particular tipping point, it doesn’t just take market share from the incumbent industry but rather completely replaces it. For Seba, we are close to reaching that point with electric vehicles.

Continue reading

Collapse Comes of Age

Not long ago, the study of human collapse and extinction was the preserve of cranks (or Hollywood). True, a few maverick scholars have taken on the topic; Joseph Tainter and his book “The Collapse of Complex Societies” spring to mind. Yet little academic infrastructure existed to give collapse studies depth. But just as with happiness studies, another topic covered by this blog, the situation has now changed.

In the UK, our two oldest universities, Oxford and Cambridge, have both set up institutes that probe into the greatest risks faced by mankind. In Oxford, we have the Future of Humanity Institute (FHI), and in Cambridge the Centre for the Study of Existential Risk (CSER). To get a taste of the FHI and its founder Nick Bostrom I recommend you read this in-depth article by Ross Andersen of the magazine Aeon here.

Like this blog, Bostrom is principally concerned with risk; that is, the probability that an event will occur combined with the impact should that event occur.

[Image: Risk]
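The definition above, probability combined with impact, is most simply captured as an expected loss. A minimal sketch, with purely hypothetical numbers, shows why a rare catastrophe can outrank a frequent nuisance:

```python
# Risk as probability combined with impact, here as a simple expected loss
def risk(probability, impact):
    return probability * impact

# Hypothetical illustrative figures: rare-but-devastating vs common-but-mild
catastrophe = risk(0.001, 1_000_000)
nuisance = risk(0.5, 100)
print(catastrophe, nuisance)  # the rare catastrophe dominates
```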

However, Bostrom extends this concept to take in scope: whether a particular risk is limited to a locality or whether it is all-encompassing. This produces a risk matrix like the one below (the source for the following analysis is his paper here; click for larger image):

[Image: Typology of risk]

The X in the grid marks what Bostrom calls “existential risks”, which he defines thus:

A risk where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.

Bostrom then goes on to subdivide such existential risk into four categories:

Bangs: Intelligent life goes extinct suddenly due to accident or deliberate destruction.

Under this category we get traditional disaster movie scenarios of asteroid impact, nuclear holocaust, runaway climate change, naturally occurring modern plague, bioterrorism, Terminator-style super-intelligence and out-of-control nanobots.

Crunches: Society resets to a lower-level of technology and societal organisation. 

This includes bang-lite scenarios that don’t quite kill off intelligent life but rather just permanently cripple it. Crunches also cover resource depletion and ecological degradation, whereby natural assets can no longer support a sophisticated society. A crunch could also come from political institutions failing to cope with the modern world, subsequent to which emergent totalitarian or authoritarian regimes take us backwards.

Shrieks: A posthuman society is attained, but one far below society’s potential or aspirations.

This is a rather nebulous category since the measuring stick of our potential is against something that we may not be able to understand–a reflection of Bostrom’s philosophical roots, perhaps.

Whimpers: Society develops but in so doing destroys what we value. 

Under this scenario, we could pursue an evolutionary path that burns up our resources or we bump up against alien civilisations that out-compete us. Over the time scale that this blog looks at–the lifespan of our young–this existential threat can be ignored.

Building on many of Bostrom’s preoccupations, a joint report by the FHI and the Global Challenges Foundation has just been published under the title “Global Challenges: 12 Risks That Threaten Human Civilisation”. The Executive Summary can be found here and the full report here. The report is again concerned with existential risks, but approaches the idea somewhat differently from Bostrom’s earlier work.

The focus of the report is on low-probability but high-impact events. The logic here is that low-probability events are generally ignored by policy makers, but when such events do occur, they can have catastrophic consequences. Accordingly, policy makers should be duty-bound to plan for them. From a probability perspective, what we are talking about here is the often-ignored right tail of the probability distribution.

[Image: Existential risk probability distribution]

The 12 risks falling into the right tail of the distribution highlighted in the report are:

  1. Extreme climate change
  2. Nuclear war
  3. Global pandemic
  4. Ecological collapse
  5. Global system collapse
  6. Major asteroid impact
  7. Super volcano
  8. Synthetic biology
  9. Nanotechnology
  10. Artificial intelligence
  11. Unknown consequences (Rumsfeld’s unknown unknowns)
  12. Future bad global governance

As an aside, finance is one of the few disciplines that take these tails seriously, since they are the things that will blow you up (or make you a fortune). The industry often doesn’t get tail risk right (incentives often exist to ignore the tail), as the financial crisis of 2008 attests. However, the emphasis is there. A lot of science ignores outcomes that go out more than two or three standard deviations; in finance, half your life is spent trying to analyse, quantify and prepare for such outcomes.
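To see why those two-or-three-standard-deviation cutoffs feel safe, it helps to look at how thin the tails of a normal distribution are; real-world loss distributions with fatter tails are exactly what catches people out. A quick sketch using the standard normal:

```python
import math

# P(X > z) for a standard normal, via the complementary error function.
# Beyond three standard deviations, normal-world probabilities are tiny.
def normal_tail(z):
    return 0.5 * math.erfc(z / math.sqrt(2))

for z in (2, 3, 5):
    print(z, f"{normal_tail(z):.2e}")  # ~2.3e-02, ~1.3e-03, ~2.9e-07
```

Under normality a five-sigma event is essentially impossible; in markets, such moves turn up with uncomfortable regularity, which is the whole point of taking the right tail seriously.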

Returning to the Global Challenges report, the emphasis of the analysis is on dissecting tail risks, with the goal of provoking policy makers into considering them seriously. One of the most interesting proposals within the report is for a kind of existential risk early warning system, which I will look at in a separate blog post.

Finally, I will finish this post with a chart dealing with severe climate change (click for larger image or go to page 64 of the report), a risk that I hope will be at the centre of the upcoming COP 21 climate talks in Paris in December. The fact that our top universities are seriously studying such risks will, I hope, prevent them being seen as the preserve of cranks and disaster movies in future.

[Image: Current climate risk]

 

Chart of the Day, 12 February 2015: The Slow Growth Movement

The holy grail of traditional economics lies in strong technology-led productivity growth.

Other types of growth are resource-intensive (in capital or labour) and suffer from diminishing returns. You can throw more and more capital into the GDP-producing pot (through increasing fixed investment as a percentage of GDP), but it will have less and less of an effect. This is what happened to the Soviet Union and Japan in the past, and this is what will happen to China in the future. (Resource-intensive input growth can also sometimes produce an expansion of GDP, but not necessarily a rise in well-being–think spoiled environment.)

Similarly, you can throw more people into the GDP-producing pot, or make them work harder (more hours), or educate them to work smarter. But all three sources of growth also have limits.
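The diminishing-returns argument above can be sketched with a standard Cobb-Douglas production function, Y = A · K^α · L^(1-α). The capital share α = 0.3 is an illustrative assumption, not a figure from the post:

```python
# Cobb-Douglas production: Y = A * K^alpha * L^(1-alpha); alpha is assumed
def output(A, K, L, alpha=0.3):
    return A * K ** alpha * L ** (1 - alpha)

# With labour fixed, each extra unit of capital adds less output than the last
marginal_gains = [output(1.0, K + 1, 100) - output(1.0, K, 100)
                  for K in (100, 200, 400)]
print([round(g, 3) for g in marginal_gains])  # strictly decreasing
```

Raising A, the technology term, lifts output without piling in more capital or labour, which is why technology-led productivity is the growth worth having.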

So if you want quality, sustainable growth, you want technology-led productivity to expand. Will it? Last week, a short comment published by productivity researchers John Fernald and Bin Wang on the Federal Reserve Bank of San Francisco web site looked at this issue. Fernald and Wang note that the ‘go-go’ GDP growth years of the late 1990s were mostly built on labour productivity (click for larger image).

[Image: Contributions to business sector output growth]

But as I noted above, labour productivity can, in turn, be thought of as resting on the availability of more capital, better education or technological advance. Economists generally refer to the last type as total factor productivity (TFP). Think of it as a measure of innovation. And you can see which industries have been driving that innovation here (click for larger image):

[Image: Contributions by industry type]

In IT-intensive industries in particular, there has been a step-change downward in innovation-led growth. Moreover, the non-IT industries have seen no TFP growth since 2007. But Fernald and Wang want to stress that innovation gains were struggling before the financial crisis hit:

The most recent slowdown in productivity growth predated the Great Recession of 2007–09. Hence, it does not appear related to financial or other disruptions associated with that event. Rather, it appears to mark a pause in—if not the end of—exceptional productivity growth associated with information technology.

This is one reason why I feel that Western countries will struggle to raise GDP growth above 2% any time soon.

Chart of the Day, 8 February 2015: The Primordial Soup of US Renewables

If you like charts (as I do), and you are interested in all things energy and climate change, then Bloomberg New Energy Finance‘s annual “Sustainable Energy in America Factbook” is an absolute treat. The 2015 edition came out last week.

While the United States does not have a Climate Change Act like the UK, it does have top-end academic research, government-backed blue sky thinking (via the Advanced Research Projects Agency for Energy), lots of entrepreneurial zeal, a deep well of venture capital funding and a multitude of innovative state-led renewable initiatives. Just as the primordial soup of complex molecules on early earth once gave rise to the chemical combinations that we call life, so we hope that the US renewable melting pot will also give birth to something transformational.

Of course, we are not there yet. And Voldemort is well represented in the US, courtesy of an anti-science Congress and a fossil fuel lobby that makes the tobacco giants’ lobbying of the 60s and 70s look like amateur hour. But let’s stay upbeat. For a start, King Coal does appear to be in full retreat (click for larger image on all charts).

[Image: Electricity generating capacity by fuel type]

Further, investment continues to pour into the renewable space at a rate 10 times higher than a decade ago:

[Image: Renewable investment]

The exponential explosion in solar capacity is particularly encouraging.

[Image: US solar roll-out]

I could go on.

Nonetheless, for the US to lead the world into a post-carbon age before we are committed to extremely dangerous climate change still requires a step change upward in renewable investment. But the building blocks for a renewable revolution are there; they just need to be put in the right order.

Chart of the Day, 28 Jan 2015: Oil, Cornucopians, Peakists and Jeremy Grantham

The stunning collapse in oil and metal prices since last summer (see yesterday’s post) has brought the cornucopians and abundantites crawling out of the woodwork. From an (otherwise very good) article in The Economist of 17th January titled “Let there be light”:

An increase in supply, a surprising resilience in production in troubled places such as Iraq and Libya, and the determination of Saudi Arabia and its Gulf allies not to sacrifice market share in the face of falling demand have led to a spectacular plunge in the oil price, which has fallen by half from its 2014 high. This has dealt a final blow to the notion of “peak oil”. There is no shortage of hydrocarbons in the Earth’s crust, and no sign that mankind is about to reach “peak technology” for extracting them.

Frankly, this is just sloppy thinking from The Economist: the second sentence, which talks of a “final blow” to the notion of peak oil, doesn’t follow on from the first.

In short, the paragraph muddles the short term and the long term. Why is a fall in oil prices barely six months old a “final blow” to the notion of peak oil? And while fracking shows we are far from “peak technology”, it says nothing about price. Can tight oil keep coming to market for years to come at current prices? I think not. For a longer treatment of oil supply versus oil demand, see my more detailed post titled “Has Shale Killed Peak Oil“.

One of the most vocal advocates of the ‘peakist’ or ‘depletist’ hypothesis is Jeremy Grantham, who has used GMO’s Quarterly Letter as a platform for his views. The chart below is taken from the third-quarter 2014 letter (click for larger image):

[Image: US average hourly manufacturing earnings versus oil price per barrel]

Grantham points out that in 1940 one hour’s work for an American engaged in manufacturing could buy 20% of a barrel of oil. At the twin peaks of oil abundance–1972 and 1999–the same hourly wage could buy over a barrel of oil. But those days, he argues, are long gone. According to Grantham, this has implications not only for oil markets but also for the energy underpinnings of global economic and productivity growth.

Yesterday, I also argued that the rapid slowing of the Chinese economy was the likely culprit behind the havoc in commodity markets, rather than a breakthrough in one particular extraction technology. As evidence, I noted how iron ore and copper prices had collapsed along with the oil price, despite the fact that you can’t frack for copper and iron ore.

The critical question now is what will happen to supply in the face of sluggish demand. Tight oil production is dramatically different from traditional oil production due to the accelerated nature of its depletion schedule. Fracked fields deplete quickly, so to maintain production you must continually invest. If you don’t, aggregate production falls fast–that is, within a year or two. So we won’t witness the decade-long excess capacity work-out seen in previous oil price busts: supply should adjust to demand at breakneck speed this time around.
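The depletion argument above is easy to see in numbers. The 60%-per-year decline rate below is an illustrative assumption, not field data, but it captures the shape of the problem:

```python
# Sketch of tight-oil decline: without continual new drilling, aggregate
# output falls within a year or two. Decline rate is an assumption.
def remaining_production(initial, annual_decline, years):
    return initial * (1 - annual_decline) ** years

# A hypothetical 60%/year decline with all new investment halted
profile = [round(remaining_production(100, 0.6, y), 1) for y in range(4)]
print(profile)  # output collapses to a small fraction of its start within two years
```

Compare a conventional field declining at a few percent a year, which can oversupply a market for a decade: geometric decline at tight-oil rates clears excess capacity almost immediately.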

Consequently, while we are not at “peak technology” for oil extraction, we possibly are at “peak cheap technology”. If so, forget all talk of “final blows” to peak oil.

Links for the Week Ending 6 April 2014

  • The second instalment of the Intergovernmental Panel on Climate Change’s (IPCC) Fifth Assessment Report (AR5), titled “Impacts, Adaptation and Vulnerability”, was released in Tokyo on the 31st March and can be found here. The “Summary for Policymakers” can be downloaded here. On page 19 of the Summary, the IPCC states that “the incomplete estimates of global annual economic losses for additional temperature increases of around 2 degrees Celsius are between 0.2 and 2.0% of income (± one standard deviation around the mean)”, with the risk skewed towards higher rather than lower losses. The report then goes on to say: “Losses accelerate with greater warming, but few quantitative estimates have been completed for additional warming around 3 degrees Celsius or above”. Given that it looks almost impossible to constrain warming to 2 degrees Celsius on the current CO2 emission path and installed fossil fuel energy infrastructure base, the world really is heading into unknown territory of climate risk.
  • A key area of economic loss from climate change relates to drought. To date, most models have focussed on precipitation as the principal driver of drought. A new paper by Cook et al in the journal Climate Dynamics titled “Global Warming and Drought in the 21st Century” gives greater emphasis to the role of evaporation (more technically, potential evapotranspiration or PET) in drought. Through better modelling of PET, the paper sees 43% of the global land area experiencing significant dryness by end of 21st century, up from 23% for models that principally looked at precipitation alone. A non-technical summary of the paper can be found here.
  • Meanwhile, the general public has lapsed back into apathy around the whole climate change question, partially due to the hiatus period in temperature rise we are currently experiencing. However, evidence is slowly mounting that we could be about to pop out of the hiatus on the back of a strong El Nino event (periods of high global temperature are linked to El Ninos). Weather Underground has been doing a good job of tracking this developing story, with another guest post from Dr. Michael Ventrice (here) explaining the major changes in the Pacific Ocean that have taken place over the last two months and which are setting us up for an El Nino event later in the spring or summer.
  • Changing subject, The Economist magazine ran a special report last week on robotics titled “Immigrants from the Future“. In some ways, I came away less impressed, rather than more, by the capabilities of the existing generation of robots.
  • I often blog on happiness issues (most recently here). This may seem strange for a blog whose stated focus is on such global risks as resource depletion and climate change, but I don’t see the contradiction. For me, much of our striving to extract and burn as much fossil fuel as possible comes through the pursuit of goals that don’t necessarily make us happier. A new book by Zachary Karabell titled “The Leading Indicators” adds a new dimension to this argument. Karabell argues that over the last century or so we have created a series of statistics that are more than pure measurements of economic success. In short, they are ideology-laden rather than ideology-free. Political parties set out their manifestos based on a mishmash of economic achievements and goals built on GDP, unemployment, inflation, the trade balance, interest rates, the strength of their national currency and so on and so forth. Yet these numbers encapsulate only part of well-being, and such statistics totally dominate political discourse because that is how we have been taught to keep score in a modern capitalist economy. As we career towards extremely dangerous climate change, I think it is time that we recognise these economic indicators for what they frequently have become: false gods. Karabell has an article in The Atlantic setting out the book’s main ideas here and there is a good review in The Week here.
  • Rising inequality has been one of the major economic developments of the past 40 years. I am a great fan of the World Bank economist Branko Milanovic, who wrote a wonderful book called “The Haves and the Have-Nots: A Brief and Idiosyncratic History of Global Inequality“, in which he pulls together many strands of the inequality literature within a global context. I blogged on this once here. A nice complement to this book is the new web site titled Chartbook of Economic Inequality, which has been put together by two academic economists, Anthony Atkinson and Salvatore Morelli. If you like infographics, you will love this site.