Krugman on Climate III: Resorting to Pascal’s Wager
In the two prior posts in this series, I argued that Paul Krugman has presented a flawed cost-benefit analysis for his proposals to reduce greenhouse gas emissions, and that his linked proposal to use carbon tariffs to push recalcitrant developing economies to go along is misguided. In this final post of the series, I will argue that his consideration of the role of uncertainty in determining our course of action is also incorrect, because he fails to provide any principle by which we could establish any limit to how much we should spend to reduce a risk that can never be eliminated entirely.
Like all serious cases for emissions mitigation to avoid climate change damages, Krugman eventually comes to the argument that the possible, rather than the expected, consequences of AGW are so severe that they justify almost any cost. In describing why a simple comparison of expected costs to expected benefits over the next century is an inadequate consideration of the economic trade-offs involved, he (correctly, in my view) focuses on the question of uncertainty.
Finally and most important is the matter of uncertainty. We’re uncertain about the magnitude of climate change, which is inevitable, because we’re talking about reaching levels of carbon dioxide in the atmosphere not seen in millions of years. The recent doubling of many modelers’ predictions for 2100 is itself an illustration of the scope of that uncertainty; who knows what revisions may occur in the years ahead. Beyond that, nobody really knows how much damage would result from temperature rises of the kind now considered likely.
You might think that this uncertainty weakens the case for action, but it actually strengthens it. As Harvard’s Martin Weitzman has argued in several influential papers, if there is a significant chance of utter catastrophe, that chance — rather than what is most likely to happen — should dominate cost-benefit calculations. And utter catastrophe does look like a realistic possibility, even if it is not the most likely outcome.
Weitzman argues — and I agree — that this risk of catastrophe, rather than the details of cost-benefit calculations, makes the most powerful case for strong climate policy. Current projections of global warming in the absence of action are just too close to the kinds of numbers associated with doomsday scenarios. It would be irresponsible — it’s tempting to say criminally irresponsible — not to step back from what could all too easily turn out to be the edge of a cliff.
Krugman is also correct, in my view, that uncertainty in our forecasts strengthens the case for action.
The stronger form of the argument from uncertainty is not only that it is possible that the true probability distribution of potential levels of warming is actually much worse than believed by the IPCC, but that a reasonable observer should accept it as likely that this is the case. As Krugman indicates, the sophisticated version of this argument has been presented by Weitzman. Weitzman’s reasoning on this topic is subtle and technically ingenious. In my view, it is the strongest existing argument that runs exactly counter to mine. (You can see a slightly earlier version of his paper, and my lengthy response here, along with links to the underlying source documents.) In very short form, Weitzman’s central claim is that the probability distribution of potential losses from global warming is “fat-tailed”, or includes high enough odds of very large amounts of warming (20°C or more) to justify taking expensive action now to avoid these low probability / high severity risks.
The big problem with his argument, of course, is that the IPCC has already developed probability distributions for potential warming that include no measurable probability of warming anywhere near this level for any considered scenario. That is, the best available estimates for these probability distributions are not fat-tailed in the sense that Weitzman means it. Therefore, Professor Weitzman is forced to do his own armchair climate science, and argue (as he does explicitly in his paper) that he has developed a superior probability distribution for expected levels of warming than the ones the world climate-modeling community has developed and published. And his proposed alternative probability distribution is radically more aggressive than anything you will find in any IPCC Assessment Report – Professor Weitzman argues, in effect, that there is a one percent chance of a temperature increase greater than 20°C over the next century, while even the scale on the charts that display the relevant IPCC probability distributions goes up only to 8°C. It is not credible to accept Professor Weitzman’s armchair climate science in place of the IPCC’s.
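The mechanics of Weitzman’s fat-tail argument can be illustrated with a small Monte Carlo sketch. Everything here is hypothetical: the two warming distributions and the quadratic damage function are illustrative stand-ins, not the IPCC’s published distributions or Weitzman’s actual figures. The point is only to show how a roughly 1% tail of extreme outcomes can come to dominate an expected-damage calculation even when the central forecast is unchanged:

```python
import random

def expected_damage(sample_warming, damage, n=100_000, seed=0):
    """Monte Carlo estimate of expected damages under a warming distribution."""
    rng = random.Random(seed)
    return sum(damage(sample_warming(rng)) for _ in range(n)) / n

def damage(temp_c):
    # Convex damages: losses grow faster than linearly with warming.
    # The quadratic form is purely illustrative.
    return max(temp_c, 0.0) ** 2

def thin_tail(rng):
    # Thin tail: warming ~ Normal(3, 1.5); extreme outcomes vanishingly rare.
    return rng.gauss(3.0, 1.5)

def fat_tail(rng):
    # Same central forecast, but a ~1% chance of a Pareto-distributed
    # extreme outcome. alpha = 5 keeps the tail heavy while leaving the
    # moments finite; all parameters are hypothetical.
    if rng.random() < 0.99:
        return rng.gauss(3.0, 1.5)
    return 3.0 + 5.0 * rng.paretovariate(5.0)

thin_ev = expected_damage(thin_tail, damage)
fat_ev = expected_damage(fat_tail, damage)
print(thin_ev, fat_ev)  # the rare tail pushes expected damages well above the thin-tailed case
```

Note that everything turns on which distribution you feed in: the fat-tailed expectation exceeds the thin-tailed one only because the 1% tail was assumed into the model, which is exactly the step at issue between Weitzman and the IPCC distributions.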
The only real argument for rapid, aggressive emissions abatement, then, boils down to the weaker form of the argument from uncertainty: the point that we must always retain residual doubt about any prediction – or in plain English, that you can’t prove a negative. The problem with using this rationale to justify large economic costs can be illustrated by trying to find a non-arbitrary stopping condition for emissions limitations. Any level of emissions imposes some risk. Unless you advocate confiscating all cars and shutting down every coal-fired power plant on earth literally tomorrow morning, you are accepting some danger of catastrophic warming. You must make some decision about what level of risk is acceptable versus the costs of avoiding this risk. Once we leave the world of odds and handicapping and enter the world of the Precautionary Principle – the Pascal’s Wager-like argument that the downside risks of climate change are so severe that we should bear almost any cost to avoid this risk, no matter how small – there is really no principled stopping point derivable from our understanding of this threat.
Think about this quantitatively for a moment. Suspend disbelief about the real-world politics, and assume that we could have a perfectly implemented global carbon tax. If we introduced a tax high enough to keep atmospheric carbon concentration to no more than 420 ppm (assuming we could get the whole world to go along), we would expect, using the Nordhaus analysis as a reference point, to spend about $14 trillion more than the benefits that we would achieve in the expected case. To put that in context, that is on the order of the annual GDP of the United States of America. That’s a heck of an insurance premium for an event so low-probability that it is literally outside of a probability distribution. Former Vice President Al Gore has a more aggressive proposal that, if implemented through an optimal carbon tax (again, assuming we could get the whole world to go along), would cost more like $21 trillion in excess of benefits in the expected case. Of course, this wouldn’t eliminate all uncertainty, and I can find credentialed scientists who say we need to reduce emissions even faster. Any level of emissions poses some risk. Without the recognition that the costs we would pay to avoid this risk have some value, we would be chasing an endlessly receding horizon of zero risk.
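The insurance-premium framing can be made concrete with a back-of-envelope calculation. The $14 trillion figure is the Nordhaus-derived net cost quoted above; the catastrophe loss is a number I am inventing purely for illustration (no such estimate appears in this post), just to show how the break-even tail probability falls out of the arithmetic:

```python
def breakeven_probability(premium, catastrophe_loss):
    """Tail probability at which paying `premium` breaks even in expectation,
    assuming the policy would fully avert a loss of `catastrophe_loss`."""
    return premium / catastrophe_loss

# The post's figure: ~$14T net cost for a 420 ppm cap (Nordhaus-derived).
premium = 14e12
# Hypothetical stake: present value of a civilization-scale loss. The $700T
# figure is chosen purely for illustration, not taken from any source.
loss = 700e12

p = breakeven_probability(premium, loss)
print(p)  # -> 0.02: the cap only pays off if catastrophe odds exceed ~2%
```

The arithmetic cuts both ways: pick a large enough hypothetical loss and any premium looks justified, which is precisely why the Precautionary Principle provides no stopping point on its own.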
So then what should we do? At some intuitive level, it is clear that rational doubt about our probability distribution of forecasts for climate change over a century should be greater than our doubt about forecasts of whether we will get very close to 500 heads if we flip a fair quarter 1,000 times. This is true uncertainty, rather than mere risk, and ought to be incorporated into our decision somehow. But if we can’t translate this doubt into an alternative probability distribution that we should accept as our best available estimate, and if we can’t simply accept “whatever it takes” as a rational decision logic for determining emissions limits, then how can we use this intuition to weigh the uncertainty-based fears of climate change damage rationally? The only way I can think of is to attempt to find other risks that we believe present potential unquantifiable dangers of intuitively comparable realism and severity to that of outside-of-distribution climate change, and compare our economic expenditure against each.
Unfortunately for humanity, we face many dimly-understood dangers. Professor Weitzman explicitly considers an asteroid impact and bioengineering technology gone haywire. It is straightforward to identify others. A regional nuclear war in central Asia kicking off massive global climate change (in addition to its horrific direct effects), a global pandemic triggered by a modified version of the HIV or Avian Flu virus, or a rogue state weaponizing genetic-engineering technology are all other obvious examples. Any of these could kill hundreds of millions to billions of people. Further, specialists often worry about the existential risks of new technologies spinning out of control: biosphere-consuming nanotechnology and supercomputers that can replace humans are both topics of intense speculation. In the extreme, leading physicists publicly speculated that starting the Large Hadron Collider in 2009 might create so-called strangelets that would annihilate the world, much as Edward Teller calculated that the initial nuclear explosion at Trinity in 1945 might ignite the world’s atmosphere. This list could be made almost arbitrarily long.
Consider how a few of these dangers compare to outside-of-distribution climate change. The consensus scientific estimate is that there is a 1-in-10,000 chance of an asteroid large enough to kill a large fraction of the world’s population impacting the earth in the next 100 years. That is, we face a 0.01% chance of the sudden death of most people in the world, likely followed by massive climate change on the scale of that which killed off the non-avian dinosaurs, which seems reasonably comparable to outside-of-distribution climate change. Or consider that Professor Weitzman argues that we can distinguish between unquantifiable extreme climate change risk and unquantifiable dangers from runaway genetic crop modification because “there exists at least some inkling of a prior argument making it fundamentally implausible that Frankenfood artificially selected for traits that humans found desirable will compete with or genetically alter the wild types that nature has selected via Darwinian survival of the fittest.” That does not seem exactly definitive. What is the realism of a limited nuclear war over the next century – with plausible scenarios ranging from Pakistan losing control of its nuclear arsenal and inducing a limited nuclear exchange with India, to a war between a nuclearized Iran and Israel?
The U.S. government currently spends about four million dollars per year on asteroid detection (in spite of an estimate that one billion dollars per year spent on detection plus interdiction would be sufficient to reduce the probability of impact by 90 percent). We continue to exploit genetic engineering to improve crop yields because, much like avoiding burning fossil fuels, the human costs of stopping this would be immediate and substantial. We detonated the atomic bomb at Trinity, and fired up the Large Hadron Collider, because the anticipated benefits of both were significant. We are not willing to engage in an unlimited level of military action to prevent nuclear proliferation, despite the risks proliferation creates, since we must weigh risk against risk.
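The asteroid numbers above lend themselves to the same kind of back-of-envelope comparison. Using only the figures quoted in this post (the 1-in-10,000-per-century impact probability and the $1 billion per year mitigation estimate), one can sketch what a unit of probability reduction would cost:

```python
# All inputs are the post's figures; the arithmetic is a rough sketch only.
p_impact_century = 1e-4        # 1-in-10,000 chance of impact per century
detection_spend  = 4e6         # current U.S. asteroid-detection budget, $/yr
mitigation_spend = 1e9         # $/yr said to cut impact probability by 90%

risk_averted  = 0.9 * p_impact_century   # probability reduction bought per century
cost_century  = mitigation_spend * 100   # $100B spent over the century
cost_per_unit = cost_century / risk_averted
print(cost_per_unit)  # ~1.1e15: over $1 quadrillion per unit of probability removed
```

A price that high per unit of probability is one way of expressing why we tolerate the residual risk rather than spend without limit, which is the same trade-off the post argues applies to outside-of-distribution climate risk.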
In the face of massive uncertainty, hedging your bets and keeping your options open is almost always the right strategy. Money and technology are the raw materials for options to deal with physical dangers. A healthy society is constantly scanning the horizon for threats and developing contingency plans to meet them, but the loss of economic and technological development that would be required to eliminate all theorized climate change risk (or all risk from genetic and computational technologies or, for that matter, all risk from killer asteroids) would cripple our ability to deal with virtually every other foreseeable and unforeseeable risk, not to mention our ability to lead productive and interesting lives in the meantime.
We can be confident that humanity will face many difficulties in the upcoming century, as it has in every century. We just don’t know which ones they will be. This implies that the correct grand strategy for meeting them is to maximize total technical capabilities in the context of a market-oriented economy that can integrate highly unstructured information, and, most importantly, to maintain a democratic political culture that can face facts and respond to threats as they develop.
Mr. Krugman’s line of argument reminds me of something much closer to hand than Pascal’s Wager: Dick Cheney’s 1% Doctrine. I’m sure Mr. Krugman would agree that the same logic that drives climate change policy should drive foreign policy (or vice versa).
— David Polansky · Apr 20, 06:51 PM · #
Just a technical note about IPCC emissions scenarios:
Note that even the lowest scenario family – B1 – has an average cumulative emissions figure of about 1,000 gigatons of carbon. This, coincidentally, is very nearly the figure one would reach by adding current cumulative emissions (about 280 Gt) to what would be released if we burned every last bit of carbon in BP’s assessment of current proven fossil fuel reserves (716 Gt).
That means burning all the economically extractable coal, gas, and oil we know about before the end of the century still only gets us to the mildest scenarios.
It would require discoveries of 40% more carbon than we currently know about even to approach A1B’s lower bound of 1,300 Gt, and to get to A1B’s average figure of 1,700 Gt we would need to do no less than double our global fossil fuel reserves.
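The arithmetic behind those percentages checks out, using only the figures given in this comment:

```python
# Checking the carbon-budget arithmetic (all figures from the comment above).
cumulative_to_date = 280    # Gt C already emitted
proven_reserves    = 716    # Gt C in BP's proven fossil-fuel reserves
burn_everything    = cumulative_to_date + proven_reserves
print(burn_everything)      # -> 996 Gt: very nearly B1's ~1,000 Gt average

# Extra discoveries needed to reach the A1B scenario figures:
a1b_low, a1b_avg = 1300, 1700
extra_for_low = (a1b_low - burn_everything) / proven_reserves
extra_for_avg = (a1b_avg - burn_everything) / proven_reserves
print(round(extra_for_low, 2), round(extra_for_avg, 2))  # -> 0.42 0.98
```

So roughly 42% more reserves to touch A1B’s lower bound, and roughly double to hit its average, matching the comment’s claim.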
Even with recent advances in exploration technology, we’re not close to being on track to do that, and my personal judgment is that these projections are quite unrealistic: there is little basis (and plenty of recent strong contrary evidence) for confidence in the IPCC’s new-discoveries model.
And if the new-discoveries model is inaccurate, then not only would the maximum atmospheric concentration be lower than predicted, but the economic time series (the price/supply/demand paths) would change dramatically. The effect ripples through the models until the error bands on the final analysis of economic impact become much larger in scale than the mean results, with plenty of results in the “net benefit” ledger.
Contra Krugman and Weitzman, there then exists an equal argument that there are small chances, not of catastrophe, but of tremendous benefit deriving from current emissions rates. There might be a legitimate argument to emit even faster because of present-value considerations in a scenario of high future discount rates.
But getting back to fuels – you can have climate catastrophe, or you can have peak-fossils, but you can’t have both. You can forecast the former, but you must assume the absence of the latter. Likewise, you can predict the latter, but then you must exclude the former as highly improbable.
— Indy · Apr 20, 09:05 PM · #
Well, no. Climate change itself can cause the release of atmospheric carbon, say, by melting permafrosts releasing trapped methane. It’s unfortunately the case that the carbon emitted by anthropogenic use of fossil fuels isn’t limited to the carbon in those fossil fuels.
Unfortunately, yes, you can.
— Chet · Apr 21, 12:36 AM · #
The climate change I can’t argue about. There are natural causes to climate change.
But human CO2 output could change quite significantly quite soon, because of the trend depicted in this chart.
— Keid A · Apr 21, 01:35 AM · #
@Chet
The “arctic methane” scenario is one of those “very low probability but highly catastrophic” potential developments that Weitzman was talking about, but like Mr. Manzi says, it doesn’t always make logical sense to over-weigh the significance of these possibilities in our decision making.
To my knowledge, the latest IPCC reports estimate that the contribution to climate forcing from non-energy-related sources (including deforestation, land-use changes, and hypothesized arctic feedbacks such as the one you mention) will likely be small relative to that from fossil-fuel use – or, at the very least, that a forcing contribution of similar scale is a very low probability, speculative (though of course not impossible) event.
The reason I’ve read for the methane scenario being unlikely is that the Earth has undergone many prior periods of rapid permafrost thaw – for example, during the recession of recent ice ages or the emergence of ancient warm periods – without producing a methane spike greater than a maximum of about 450 parts per billion in the ice-core record, not nearly enough to produce a thermal runaway. And in any case, this spike is not so much a “permafrost” effect as the normal amount of methane production from a non-ice-age biosphere – that’s why the pre-industrial concentration had already grown to the high end of the historical range.
Now, it’s also true that methane in the upper atmosphere has a relatively short half-life, since its interaction with UV and air eventually oxidizes it into CO2. But that’s a small amount of additional CO2, since we’re only talking about a part per million or so, and it’s good news anyway, since CO2 is a weaker greenhouse gas.
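That “part per million or so” can be sanity-checked from the ice-core figure above, since each CH4 molecule oxidizes to one CO2 molecule and mixing ratios therefore convert one-for-one:

```python
# Sanity check on the "part per million or so" figure. Each CH4 molecule
# oxidizes to one CO2 molecule, so the mixing ratios convert 1:1.
ch4_spike_ppb = 450                          # max historical spike (ice cores)
co2_from_ch4_ppm = ch4_spike_ppb / 1000.0    # ppb -> ppm, one-for-one molar
print(co2_from_ch4_ppm)  # -> 0.45 ppm: negligible next to hundreds of ppm of CO2
```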
More good news is that atmospheric methane levels seem to have leveled off (or nearly so) in the last decade, contrary to most scientists’ expectations ten years ago. There are some guesses at a “new steady state” equilibrium, but at the very least, given the high arctic temperature anomalies of the last decade, if the methane-feedback theory were accurate one would have expected to see higher, not lower, recent growth rates in atmospheric concentrations.
Finally, it’s possible, of course, that such effects won’t manifest themselves until temperatures rise a few more degrees – but if I’m correct in thinking that the IPCC’s projection for future discoveries of additional fossil reserves is wildly optimistic, then it’s also possible that CO2 levels and arctic temperatures may never reach a point where it would be sensible to worry about it.
Perhaps then, controlling future fossil reserve discoveries / exploration efforts would be a better, lower-cost and lower-economic-distortion strategy than controlling annual emissions rates. After all, the IPCC itself remarks that it is really aggregate emissions, not annual emission rates, that matter – so why are we trying to control the latter and not the former?
— Indy · Apr 21, 01:51 AM · #
It may already have started.
Most scientists are coming around to the notion that the impact is going to be greater than the IPCC estimates. Further, an argument based on the panel not knowing how much oil there is, which you were able to find on Google, strikes me as wrong on its face.
— Chet · Apr 21, 03:08 AM · #
Another trend that I think is quite hopeful for the reduction of human CO2 output is the recent large increase in estimates of global natural gas reserves, as a result of new technologies like hydraulic fracturing and horizontal drilling for shale gas and coal-seam gas.
The increase in natural gas production has caused NG prices to drop sharply in the last few years, and it now looks like most thermal power plants built going forward could be NG-powered rather than coal-fired.
The technology was developed in the USA, but is now spreading rapidly around the world, causing global natural gas prices to fall.
This by itself could significantly reduce human CO2 production in the next decade: burning gas intrinsically releases about half as much CO2 as burning coal for a given energy output.
You can get even better results by using more-advanced conversion technologies like combined-cycle generators. (But in principle, you could also use combined cycle with coal if you gasified it first).
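A rough sketch of the arithmetic behind the “about half” claim, and why combined-cycle helps further. The combustion intensities and plant efficiencies below are ballpark textbook values I’m supplying for illustration, not figures from this thread:

```python
# Approximate combustion CO2 intensities (kg CO2 per kWh of thermal energy).
# Ballpark values, supplied for illustration only.
intensity_thermal = {"coal": 0.34, "natural_gas": 0.18}

def plant_intensity(fuel, efficiency):
    """kg CO2 per kWh of *electricity* for a plant of given thermal efficiency."""
    return intensity_thermal[fuel] / efficiency

coal_steam = plant_intensity("coal", 0.38)         # typical coal steam plant
gas_ccgt   = plant_intensity("natural_gas", 0.55)  # combined-cycle gas turbine

# The fuel alone gives roughly a 2x advantage; the higher efficiency of a
# combined-cycle plant stretches that further per delivered kWh.
print(round(gas_ccgt / coal_steam, 2))  # -> 0.37: well under half per kWh(e)
```

The same logic shows why gasified coal in a combined cycle narrows the gap: improving the efficiency denominator helps either fuel, but the intensity numerator still favors gas.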
— Keid A · Apr 21, 03:49 AM · #
Chet, that link you provided wasn’t so convincing. From the conclusion: “For methane to be a game-changer in the future of Earth’s climate, it would have to degas to the atmosphere catastrophically, on a time scale that is faster than the decadal lifetime of methane in the air. So far no one has seen or proposed a mechanism to make that happen.”
Also, I’d appreciate seeing some evidence for your statement about “most scientists” coming around to thinking AR4 underestimates future climate impacts. I would prefer a “review of the literature” type assessment from a peer-reviewed source, as most “most x say y” assertions are rather fungible in practice.
— ROI · Apr 21, 05:19 PM · #
I don’t know if I particularly like the analogy. Pascal’s Wager is bad because the proposed solution doesn’t satisfactorily solve the proposed problem. If we had equally probable mutually exclusive solutions to global warming, then it would be nonsensical to pick one. Just like it would be nonsensical to just pick some God at random to believe in and expect to be right. But the solutions for global warming don’t conflict like this. So the real issue is less philosophical and more fact based. It all boils down to whether or not doomsday probability scenarios are credible.
— Derek · Apr 21, 05:38 PM · #
One of the reasons people underestimate the potential of solar is because they don’t understand how radical the progress being made in solar technology really is. There are technologies, now being commercialized, that have the potential to make large differences to the cost of solar.
One of the more interesting approaches, and one growing rapidly in commercial importance, is thin-film solar PV. The idea is that you have a metallic substrate, and a nanoparticle photovoltaic ink is printed onto the surface, as a thin film, using a kind of inkjet printing technology.
From the manufacturing point of view this gives you a scalable, mass production approach, that is very efficient in raw materials, and has the potential for future efficiency gains as more-advanced nanotechnology inks are developed and improved.
This extremely impressive 6 minute video clip shows one such startup.
Nanosolar foils have been NREL certified at 15.3% efficiency.
— Keid A · Apr 21, 07:56 PM · #
Jim – Thoroughly enjoyed these pieces. To me the comparison to other “tail risks” and our responses to each is particularly compelling.
One thing I’ve been thinking about in terms of the cost-benefit analysis is the marginal utility of economic output. A few % loss in US GDP in 2100 probably means that not everyone can have a hovercar, while in the developing world it could mean the difference between millions of people escaping poverty or not. I don’t think this alters your conclusions, but I think it’s a point to bring up when going over the initial cost-benefit analysis.
— B. Pujanauski · Apr 23, 02:21 AM · #