Battery Banter 5: The Relevance (or Not) of Moore’s Law

Concurrently with writing this series of blog posts, I have been reading Steve LeVine's newly published book "The Powerhouse: Inside the Invention of a Battery to Save the World". The book is a bit of a mess, full of random jumps, wrong turns and dead ends. Perhaps that is appropriate, since it describes a battery development process that is itself full of random jumps, wrong turns and dead ends.

While the back-cover blurb tells me that the book reads like a thriller, it reminds me more of the dog that didn't bark in Sir Arthur Conan Doyle's "Silver Blaze". We have two questing groups of heroes: the public-sector battery guys at Argonne National Laboratory and the plucky private-sector upstarts at Envia Systems. Yet the book peters out at the end, with both teams abjectly failing in their respective quests for the super-battery Holy Grail. Argonne's new version of nickel-manganese-cobalt batteries (NMC 2.0) suffers from chronic voltage fade (meaning that the battery's performance slumps after repeated recharging cycles). Meanwhile, Envia's super battery is spectacularly flawed, built on a collapsing anode and dodgy intellectual property.

Although the book is in need of a good edit, it is still full of interesting insights into the battery development process. In a chapter recounting conversations with Don Hillebrand, an old-school auto expert working at Argonne, LeVine makes this observation:

Unlike microchips, batteries don’t adhere to a principle akin to Moore’s law, the rule of thumb that the number of switches on a chip–semiconductor efficiency–doubles every eighteen months. Batteries were comparatively slow to advance. But that did not make electronics superior to electric cars.

Consumer electronics typically wear out and require replacement every two or three years. They lock up, go on the fritz, and generally degrade. They are fragile when jostled or dropped and are often cheaper to replace than repair. If battery manufacturers and carmakers produced such mediocrity, they would be run out of business, sued for billions and perhaps even go to prison if anything catastrophic occurred. Automobiles have to last at least a decade and start every time. Their performance had to remain roughly the same throughout. They had to be safe while moving–or crashing–at high speed.

At this point, I want to refer you back to Gordon Moore's original 1965 article, "Cramming more components onto integrated circuits", which ushered in Moore's Law. From it we get the quintessential exponential chart, which turns into a straight line once you put the y-axis onto a logarithmic scale (click for larger image):

[Figure: chart from Moore's 1965 paper, component counts per integrated circuit plotted against year on a logarithmic y-axis]

This is the world of Ray Kurzweil's singularity, which I blogged about a couple of years back in the post "Singularity or Collapse: Part 1 (For Ever Exponential?)". As knowledge increases by powers of 10, virtually every challenge faced by mankind dissolves.

The problem here is with the rate of exponential growth. If battery capacity and solar efficiency were increasing by orders of magnitude (powers of 10) every few years, then climate change and energy resource depletion would be solved. But it is a bit more complicated than that: technological transformation is a combination of two variables, scientific progress and price. The electronics industry performed its magic on both variables simultaneously. Further, the science was to a degree forgiving: the theoretical constraint governing the number of components that can be placed on a silicon wafer has only recently come into play.

For a solar cell, however, the constraint is visible on day one. In the UK, the average raw power of sunshine is about 110 Watts per square metre. Current solar panels are somewhere between 10% and 20% efficient. According to David MacKay, in his wonderful treatise on all things energy, Sustainable Energy Without the Hot Air, the maximum theoretical efficiency is limited to around 60% by fundamental physical laws (here). Therefore, you will never extract more than about 66 Watts per square metre from your solar panel, so the ability to keep doubling is capped. At best, you will trace out an S curve from a technological perspective alone.
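As a sanity check on that ceiling, here is a minimal Python sketch of the arithmetic, using only the figures quoted above (110 W per square metre, a mid-range 15% panel and the 60% theoretical limit); the numbers are illustrative, not a forecast:

```python
import math

RAW_POWER_W_PER_M2 = 110        # average raw UK sunshine, as quoted above
CURRENT_EFFICIENCY = 0.15       # mid-point of the 10-20% range for today's panels
CEILING_EFFICIENCY = 0.60       # theoretical limit cited from MacKay

current_output = RAW_POWER_W_PER_M2 * CURRENT_EFFICIENCY   # ~16.5 W per square metre
ceiling_output = RAW_POWER_W_PER_M2 * CEILING_EFFICIENCY   # 66 W per square metre

# How many efficiency doublings are even possible before hitting the ceiling?
doublings_left = math.log2(CEILING_EFFICIENCY / CURRENT_EFFICIENCY)

print(f"Today:   {current_output:.1f} W/m2")
print(f"Ceiling: {ceiling_output:.1f} W/m2")
print(f"Doublings available: {doublings_left:.1f}")   # exactly 2 from a 15% panel
```

In other words, from today's panels there are only a couple of doublings left before physics calls a halt.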

[Figure: illustrative S curve of technology performance over time]

Of course, the second variable is price. In theory, once the solar PV panel reaches the top of the S curve above, the price could keep on falling even as efficiency stalls. So while your panels aren't getting any better, you could plausibly plaster the world with them if they were dirt cheap. But this is a different dynamic from that of silicon chips, which benefited from going up an efficiency curve and down a cost curve at the same time.

And this takes us back to batteries and an article in the Bulletin of the Atomic Scientists by Kurt Zenz House titled "The limits of energy storage technology". For this series of posts on batteries, I have been working in units of kilowatt-hours (kWh), but the Bulletin article works in megajoules (MJ). In case you want to follow my link to the Bulletin article, note that 1 kWh equals 3.6 MJ. Consequently, a top-of-the-range Tesla has a battery capable of storing roughly 306 MJ (85 kWh), the Nissan Leaf 86 MJ (24 kWh) and the BMW i3 68 MJ (19 kWh).
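For readers who want to flip between the two units, here is a trivial Python conversion sketch reproducing the figures above (the 1 kWh = 3.6 MJ factor is exact; the pack sizes are the ones quoted in this post):

```python
MJ_PER_KWH = 3.6   # exact: 1 kWh = 3.6 MJ

def kwh_to_mj(kwh):
    """Convert kilowatt-hours to megajoules."""
    return kwh * MJ_PER_KWH

packs = {"Tesla (85 kWh)": 85, "Nissan Leaf (24 kWh)": 24, "BMW i3 (19 kWh)": 19}
for name, kwh in packs.items():
    print(f"{name}: {kwh_to_mj(kwh):.0f} MJ")
# -> roughly 306 MJ, 86 MJ and 68 MJ, matching the numbers above
```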

Crucially, House points out that a kilogram of crude oil contains 50 MJ of potential chemical energy, equivalent to almost 14 kWh, which is not much less than the entire BMW i3 battery pack, a pack that weighs in at 230 kg. Of course, internal combustion engines aren't particularly efficient: in a petrol engine only around 20% of the energy makes it from fuel to wheels (the rest is lost as heat), whereas for an EV the figure is around 90%. So our BMW has a battery capacity of about 80 Watt-hours per kg (roughly 0.3 MJ per kg), of which some 72 Watt-hours gets to the wheels, while its conventional cousin translates 1 kg of petrol into perhaps 2,800 Watt-hours of motion, roughly 40 times better.
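Here is the same back-of-the-envelope calculation as a short Python sketch, using only the figures quoted above (50 MJ/kg for petrol, a 19 kWh / 230 kg i3 pack, 20% engine efficiency, 90% drivetrain efficiency). It lands within rounding distance of the factor of 40 quoted above; the small difference comes from rounding 83 Wh/kg down to 80 in the text:

```python
MJ_PER_KWH = 3.6
OIL_MJ_PER_KG = 50          # chemical energy in 1 kg of crude oil / petrol
I3_PACK_KWH = 19
I3_PACK_KG = 230
ICE_TO_WHEELS = 0.20        # ~20% of fuel energy reaches the wheels in a petrol car
EV_TO_WHEELS = 0.90         # ~90% for an electric drivetrain

battery_wh_per_kg = I3_PACK_KWH * 1000 / I3_PACK_KG     # ~83 Wh/kg of pack
battery_wheels = battery_wh_per_kg * EV_TO_WHEELS       # ~74 Wh/kg reaches the wheels
petrol_wh_per_kg = OIL_MJ_PER_KG / MJ_PER_KWH * 1000    # ~13,900 Wh/kg in the tank
petrol_wheels = petrol_wh_per_kg * ICE_TO_WHEELS        # ~2,800 Wh/kg reaches the wheels

print(f"Battery, to wheels: {battery_wheels:.0f} Wh per kg")
print(f"Petrol,  to wheels: {petrol_wheels:.0f} Wh per kg")
print(f"Petrol advantage:   about {petrol_wheels / battery_wheels:.0f}x")
```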

But what of the theoretical limits posed by battery chemistry as technology improves? House has this to say:

Due to the theoretical limits of lead-acid batteries, there has been serious work on other approaches such as lithium-ion batteries, which usually involve the oxidation and reduction of carbon and a transition metal such as cobalt. These batteries have already improved upon the energy density of lead-acid batteries by a factor of about 6 to around 0.5 mega-joules per kilogram–a great improvement. But as currently designed, they have a theoretical energy density limit of about 2 mega-joules per kilogram. And if research regarding the substitution of silicon for carbon in the anodes is realized in a practical way, then the theoretical limit on lithium-ion batteries might break 3 mega-joules per kilogram.

Therefore, the maximum theoretical potential of advanced lithium-ion batteries that haven’t been demonstrated to work yet is still only about 6 percent of crude oil!

To restate the above numbers in the units used elsewhere in this series: 0.5 MJ equals roughly 140 Watt-hours, 2 MJ roughly 0.56 kWh and 3 MJ about 0.83 kWh, all per kilogram.

But what about some ultra-advanced lithium battery that uses lighter elements than cobalt and carbon? Without considering the practicality of building such a battery, we can look at the periodic table and pick out the lightest elements that have multiple oxidation states and that form compounds. This thought experiment turns up hydrogen-scandium compounds. Even assuming we could actually make such a battery, its theoretical limit would be around 5 megajoules per kilogram.

And this brings us back to Ray Kurzweil’s parable of the doubling lily pads in the lake:

A lake owner wants to stay at home to tend to the lake’s fish and make certain that the lake itself will not become covered with lily pads, which are said to double their number every few days. Month after month, he patiently waits, yet only tiny patches of lily pads can be discerned, and they don’t seem to be expanding in any noticeable way. With the lily pads covering less than 1 percent of the lake, the owner figures that it’s safe to take a vacation and leaves with his family. When he returns a few weeks later, he’s shocked to discover that the entire lake has become covered with the pads, and his fish have perished. By doubling their number every few days, the last seven doublings were sufficient to extend the pads’ coverage to the entire lake. (Seven doublings extended their reach 128-fold.) This is the nature of exponential growth.

So how many doublings does it take to get us from your grandfather's 0.1 MJ per kg lead-acid battery to an ultra-advanced theoretical lithium-ion battery at 5 MJ per kg? Answer: about six. In this lake, the owner can take not just a few weeks' vacation but a year-long sabbatical and not worry about his fish: the lily pads will still be safely confined to one corner. Gordon Moore's lake, by contrast, will have witnessed around 30 doublings.
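To make the comparison concrete, here is a small Python sketch counting the doublings. The 0.1 and 5 MJ/kg figures are the ones above; the Moore's Law cadence of one doubling every 18 to 24 months over roughly 50 years since the 1965 paper is my own illustrative assumption:

```python
import math

LEAD_ACID_MJ_PER_KG = 0.1        # grandfather's lead-acid battery
THEORETICAL_LI_MJ_PER_KG = 5.0   # ultra-advanced theoretical lithium battery

battery_doublings = math.log2(THEORETICAL_LI_MJ_PER_KG / LEAD_ACID_MJ_PER_KG)
print(f"Battery doublings available: {battery_doublings:.1f}")   # ~5.6, i.e. about six

# Moore's Law cadence over ~50 years (assumed: one doubling every 18-24 months)
YEARS = 50
for months_per_doubling in (18, 24):
    doublings = YEARS * 12 / months_per_doubling
    print(f"Chip doublings at one per {months_per_doubling} months: {doublings:.0f}")
# -> 33 and 25, i.e. around 30 doublings for Gordon Moore's lake
```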

Technological efficiency, however, is only one of the two variables that define a disruptive technology; the other, as mentioned above, is cost. In many ways, it is the cost variable that has been the key focus of Elon Musk's Tesla when it comes to battery design. Indeed, the battery technology is relatively old school, using a nickel-cobalt-aluminium chemistry. This accounts for the pack's stunning 550 kg total weight. Preliminary reports on Tesla's Gigafactory for battery fabrication suggest it will principally be dominated by efficiency of design rather than cutting-edge new battery chemistry. But with the underlying technology only in first gear, it will be hard to achieve integrated-circuit-style disruptive change. Cost-cutting design can only go so far.

At this point, I will conclude this series of posts for fear that I have turned into the 'Battery Blog' (although I will return to batteries at a future date, since they are so important). But I will finish with the famous 'rule of 72': divide 72 by a percentage growth rate to get the approximate number of years needed for a doubling. It works pretty well (click for larger image):

[Figure: rule of 72 doubling-time table]
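For those who want to check the rule against the exact formula, here is a small Python sketch comparing 72 divided by the rate with the exact doubling time ln(2) / ln(1 + rate); the specific growth rates are just illustrative picks:

```python
import math

# Rule of 72 versus the exact doubling time for a constant annual growth rate
for rate_pct in (3, 5, 10, 18):
    rule_of_72 = 72 / rate_pct
    exact = math.log(2) / math.log(1 + rate_pct / 100)
    print(f"{rate_pct:>2}% growth: rule of 72 -> {rule_of_72:4.1f} years, exact -> {exact:4.1f} years")
# At high-teens growth a doubling arrives in roughly 4 years;
# at 3-5% growth it takes somewhere between about 14 and 24 years.
```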

Tony Seba, Ray Kurzweil and other assorted techno-cornucopians achieve almost instant doublings by assuming growth rates in the high teens or better. Unfortunately, much science progresses at rates in the low to mid single digits, so change is measured in decades, not years.

The distinction is important. Under the Kurzweil logic, we don't really need to tackle climate change or resource depletion because technology is on the case. Just go about your business as usual, tuck your kids up in bed at night, and scientific innovation will do the rest.

But unless Argonne National Laboratory's battery guys and their peers step up the pace (which looks exceedingly difficult), electric vehicles will not replace conventional internal-combustion cars for a couple of decades or more. That translates into no natural near-term mitigation of carbon emissions from motor transport. And unless we get very lucky with the climate's sensitivity to CO2, it also means we will get a lot closer to exceedingly dangerous climate change.

Sorry, this also means that a ‘do nothing’ political position at both a national and personal level won’t cut it when it comes to climate change.
