The American Scene

An ongoing review of politics and culture

Articles filed under Climate Change

Just a Mess

Bill McKibben reprises Al Gore’s argument that refusing to agree with him on climate change policy can only be evidence of an objective psychological disorder; I reply at TNR here.

Re: Innovation and the Gas Tax

Ryan Avent has posted a response to my argument that the example of sustained gas taxes of several dollars per gallon in several major Western European countries undercuts the assumption that any politically feasible carbon price in the U.S. would induce technical innovation that could materially de-link economic growth from carbon dioxide emissions. I presented two arguments for this:

1. Europe is a huge, advanced market that has had these taxes for decades while internal combustion has remained a very stable technology, and people have continued to make choices from among broadly pre-existing technologies such as mass transit, bicycles, walking and so forth.

2. Even if other conditions mean that qualitatively different technical innovation is now achievable in the near future, Europe provides a big enough market to induce it, and there is no obvious reason why the incremental market provided by the U.S. would make much difference.

In response, Ryan argues:

In general, Europeans do drive different automobiles, which tend to be smaller and more efficient. Some of these have been innovative enough in their design to generate raised eyebrows from American tourists (see: the Smart car). In Europe, the scooter is far more popular and differentiated (the scooter with roof is a common sight). Bicycles are also more common and differentiated, and the institutional supports for cyclists are more highly developed (cycle superhighways are old news in Europe).

And then there’s public transport. From buses to trams to trains to high-speed rail, Europe is well ahead of America. When American transit systems go shopping for vehicles, they generally look to European manufacturers. When the District sought a technology that would allow the city to run streetcars without using overhead wires, it looked to France’s Alstom and Canada’s Bombardier (Canadian gas tax rates are considerably higher than those in America). And transit innovation goes beyond vehicle technologies. It includes fare-gathering methods, scheduling, system design and maintenance, and so on.

Brad DeLong’s blog has picked this up, and several commenters on my blog post have made the same basic point.

This strikes me as, at best, a word game. I understand that innovation is not identical to invention. But this is like saying that in response to an increase in the price of peanut butter, I “innovated” by making smaller sandwiches and eating ham-and-cheese more often (while noting that I designed these new sandwiches very well, and am probably healthier anyway with less peanut butter in my diet). If by “innovation” in response to higher gas prices, we mean switching to smaller cars and taking the bus and riding bicycles more often, then I agree entirely that higher gas prices in the U.S. will induce innovation.

Re: Innovation Is the Only Answer

Unsurprisingly, Noah Millman has produced a long, thoughtful and insightful post on how to address climate change. I agree with a lot of it, but not all of it.

Noah (if I may) characterizes my argument correctly, with one important exception. He says that:

The proposed alternative is to encourage innovation to eventually make the fossil fuel economy obsolete. Jim Manzi specifically thinks the government should be spending a few billion dollars a year on such efforts – a drop in the bucket relative to the costs of climate change legislation.

In fact, I have very specifically argued against using government-directed investment for this purpose:

One obvious approach is to have the government fund technology research directly. The danger here, of course, is that we end up back in the failed game of industrial policy. Such dangers are not merely theoretical. The federal government was the key sponsor of, for example, the shale oil and large-scale wind turbine debacles in response to the energy crisis thirty years ago. Setting the right scope for such a program and managing the funding process carefully would each be essential to prevent it from becoming corporate welfare.

We should limit government investments to those topics that meet specific criteria. They should be related to detecting or ameliorating the effects of global warming, should serve a public rather than a private need, and should provide no obvious potential source of profit to investors if successful. Important examples include improved global climate prediction capability, visionary biotechnology to capture and recycle carbon dioxide emissions, or geo-engineering projects to change the albedo of the earth’s surface or atmosphere. On the other hand, most technologies that would contribute to the ongoing long-run transition of the economy away from fossil fuels, like more efficient fuel cells for autos or lower-cost solar power sources, need no government funding, since there is ample profit motive to develop them. As evidence, massive amounts of venture funding and large-company internal capital allocations are flowing to these opportunities right now. Government attempts to direct such development would almost certainly destroy value through political allocation of resources.

In the terms of Noah’s post, I am basically arguing for government-led technical innovation only on what he calls “the other side of the carbon ledger” (roughly speaking, geo-engineering). I think this difference turns out to be material to the broader argument, as Noah then goes on to say:

So I would argue that there are two key points of agreement between the opposing camps. First, that climate change is a real and potentially very serious threat. Second, that significant technological innovation is an absolutely necessary component to addressing that threat.

Which leaves as the proper question to debate: what is the most effective way to get that innovation, “effective” being defined as some function combining “shortest time” and “lowest economic cost” (the precise function being left open as a side debate).

Check on the first asserted point of agreement, but not on the second – at least not exactly. My position is that government has a proper role in a comparatively narrow kind of technical innovation, but not in the kind of broader management of the energy sector of the economy that I think Noah means by those words.

So, if he and I agree that government-led innovation is useful for geo-engineering, and we further agree that government-led technology innovation is not an effective method for pushing the broader economy to adopt less carbon-intensive energy sources, then I think we’re back to the debate about whether or not pricing carbon is a good idea as a means to accomplish this second goal.

I don’t think so. Even if we assume away the gigantic problems of international coordination and the real-world costs that would be imposed by any conceivable US political deal that did this, no realistic US price on carbon is likely to induce the kind of transformation that many advocates rhetorically imagine, whether through a few big breakthroughs or through many incremental steps.

Consider as an important example that most major Western European countries have had very high gas taxes – typically several dollars per gallon – for decades. But despite the efforts of lots of very smart engineers, the automobile has been a pretty stable technology over those same decades. Raising the price of gas does reduce consumption, and will of course induce some incremental innovation. But Western Europe seems to me to be a big enough market that if a low-carbon technology competitive with internal combustion in the face of a ~$5 per gallon gas tax could be developed anywhere in the world, we would already have a big enough end-use market to induce it. Why would raising prices in America work when it hasn’t in Europe? There might be some carbon price that would radically accelerate innovation across the array of uses of fossil fuels (the limit case is simply outlawing coal and petroleum), but it has never, to my knowledge, been imposed anywhere at scale, presumably because it would impoverish any country that tried.

I do agree that a debate about the proper role of government in encouraging technical innovation around the broader goal of accelerating the ongoing de-carbonization of advanced economies is a potentially productive discussion. This is mostly because it is more likely than the current academic economists’ debate about “pricing carbon” to get focused on practical and useful questions. I’m generally opposed to industrial policy for what I consider to be very well-founded reasons, but such abstractions are only the starting point for analyzing specific proposals. The history of technological-economic innovation in America is a complicated story that involves markets, but also government interventions from canals and railroads to WWII, Apollo and the War on Cancer. Though I haven’t seen a persuasive proposal for such an intervention in this case, I remain in principle open to the concept, and believe that credible ideas merit careful scrutiny.

Ross Douthat and the Fight Against Cap-and-Trade

Ross Douthat has a column in today’s New York Times in which he kindly mentions me, but far more important, manages to make a multi-layered argument for why an informed rational observer should oppose cap-and-trade legislation within the length restrictions of an op-ed. In my view, the position that Ross presents – basically, that the cure is worse than the disease – is the rationally persuasive argument that won the day in recent legislative debates in the Congress.

I believe the debate and politics of this issue have, so far, played out along lines I set forth a couple of years ago. That doesn’t mean, however, that the debate is permanently settled. Nothing in American politics ever is, and the attempt to introduce cap-and-trade through legislation, regulation and/or judicial rulings is likely to continue for many years.

Climate Storm

So, in my first post as an in-house critic from the right at TNR, I took on what I thought was a pretty alarmist piece in the magazine on global warming by Al Gore.

A lot of bloggers have responded. Inside TNR you can see Bradford Plumer, then me, then Plumer, then me.

Ezra Klein responded, and you can see my reply back to him.

Joe Romm responded at length. He used a number of exclamation points. He covered a lot of the same ground that we did in a very detailed exchange two years ago in which I made a set of claims, he interrogated them, I counter-interrogated and so on. You can read the back-and-forth – me, Romm, me, Romm, me – and decide for yourself if you think I ought to spend my time interacting with him.

Obviously I’m incapable of objectivity on this subject, but it doesn’t seem to me that any of these criticisms have disproven any of my central claims.

UPDATE: I can’t believe I missed a reaction from Rortybomb. (H/T Patrick Appel at The Daily Dish).

Re: Reply to Jim Manzi


Thanks for your reply. I think that your directness and clarity have helped to isolate the key point of difference between us.

In response to my question – “Put yourself in the position of a senior government leader tasked with making real decisions that affect the lives of millions. What would you do if faced with a matter of technical disagreement on such a quantitative-prediction question among experts?” – you say this:

I’ll tell you what I would do. I would say that, given our finite capabilities and the shortness of life, AGW may not be a problem at all, and, if it is a problem, it is not urgent enough to obsess over.

In my view, you have assumed away the crucial question. How do you know that “it is not urgent enough to obsess over”?

Earlier in the post you said of the global warming debate that you “haven’t taken the time to study it”. Later in the post you say that “If the issue is truly important enough, the experts will sort that out themselves”. But unless you want to do your own armchair climate science, which I think would be a real mistake, the practical questions become whom we identify as the experts, and what process we require for them to “sort it out”.

When it comes to specific technical questions, the experts I identify are those who have spent years studying the relevant topic areas at recognized universities and research centers, have published peer-reviewed technical articles, and can point to specific scientific results. The narrow process that I support for “sorting it out” is the scientific method, requiring replicated research, peer review, falsification testing of claims and so forth. The broader process that sits around this must include some NRC-like entity, as I described in my prior post, that draws on leading scientific experts from other fields to do another layer of review to minimize groupthink and self-dealing. This method, like all others, is imperfect and takes time to work, but it is superior to any practical alternative, and it has worked out pretty well for America and the Western world across many, many such questions for a long time.

But has this process somehow been hijacked in the case of global warming in the service of a statist agenda?

Let me start to address this by going back to the extended version of how you characterize your preferred policy on global warming:

I would say that, given our finite capabilities and the shortness of life, AGW may not be a problem at all, and, if it is a problem, it is not urgent enough to obsess over. Not if I am a senior government leader of a country trillions of dollars in debt who is also tasked with making real decisions about unsustainable entitlement programs, the high likelihood that states will soon default, 10 percent unemployment, crippling new taxes and inflation on the horizon, a global war against jihadists whose mass-murder attacks — and their catastrophic costs — are impossible to predict, the imminence of game-changing nuclear capability in a revolutionary jihadist state that has threatened to wipe Israel off the map and whose motto is “Death to America,” aggression from other hostile nations, a judiciary that is steadily eroding popular self-government, and a host of other actually pressing problems.

Here is what I said in a post at The Corner this week (echoing what I have written many times in many places):

In the face of massive uncertainty, hedging your bets and keeping your options open is almost always the right strategy. Money and technology are the raw materials for options to deal with physical dangers. A healthy society is constantly scanning the horizon for threats and developing contingency plans to meet them, but the loss of economic and technological development that would be required to eliminate all theorized climate change risk (or all risk from genetic and computational technologies or, for that matter, all risk from killer asteroids) would cripple our ability to deal with virtually every other foreseeable and unforeseeable risk, not to mention our ability to lead productive and interesting lives in the meantime.

We can be confident that humanity will face many difficulties in the upcoming century, as it has in every century. We just don’t know which ones they will be. This implies that the correct grand strategy for meeting them is to maximize total technical capabilities in the context of a market-oriented economy that can integrate highly unstructured information, and, most importantly, to maintain a democratic political culture that can face facts and respond to threats as they develop.

How could I end up with a point of view that shares with yours at least the perspective that in comparison to other dangers AGW does not merit the priority it is given by activists, when I’m so scathing about those who dispute all of the establishment science around global warming? I would put the following in all caps if it didn’t look so crazy: Because the actual science does not support the policies that you oppose. And I don’t mean the “real science” as opposed to the “fake consensus”. I mean that you could take every technical assertion made in the current U.N. IPCC Assessment as scientifically true, and you still couldn’t rationally justify cap-and-trade, carbon taxes, EPA mandates or anything like that. The expected damages from global warming are less than the expected costs of the proposed solutions. This is the central point of the vast bulk of the hundreds of thousands of words I have written about global warming.

If we end up opposing many of the same policies, why, then, don’t I just quiet down? There are two ways to answer that.

The first is that we all have our jobs to do. The job of a writer is to do his best to write things that he believes to be correct. This has been my motivation (as far as it is possible to know my own mind) in writing what I have on the topic. One implication of trying to reason forward from facts to conclusions in this specific case is that the current scientific evidence about the level of climate change threat does justify some actions: primarily, in my view, investing in “break-glass-in-case-of-emergency” geo-engineering technologies, so that we have options available to us in the unlikely event that climate change turns out to be much worse than currently anticipated. Another is that if future scientific evidence of a more severe threat from global warming comes to light, then one should respond to that information rationally by changing policy preferences, and not view this as some kind of philosophical defeat.

The second answer is more tactical. Though this has not been my motivation, it is my view that by attacking the scientific process, conservatives have needlessly disadvantaged themselves in achieving their desired policy outcomes. First, it has prevented conservatives from rolling the ball downhill from widely-accepted scientific findings to the policy conclusion that the costs of emissions mitigation don’t justify the benefits – which would put climate policy advocates in the position of arguing that the science is wrong, or that it is suddenly changing, or that we ought to give up trillions of dollars for what is in effect a massive foreign aid program, or whatever. And second, it takes away a position, in reaction to proposals for new carbon taxes or cap-and-trade, that normal voters would see as natural and believable coming from a Republican / conservative politician: problem exists; solution costs too much.

[Cross-posted at The Corner]

Re: The Real Epistemic Closure

Kathryn / Andy / Mr. Levin,

I accept that it is fair to characterize my tone in the “Epistemic Closure” post as scathing. I apologize (sincerely) if this was offensive to you. All I can say about it is that I was calling a spade a spade as I see it.

Mr. Levin,

Thank you for the reply. I’m happy to give you the last word, and simply invite readers to review both posts and draw whatever conclusions they feel are appropriate.


I read the Richard Lindzen Wall Street Journal Op-Ed that you reference. While I might not have chosen the same words as Professor Lindzen in places, there is very little of scientific substance in that piece with which I disagree. In fact, I’ve made some version of most of the relevant points, often in almost identical language, and often right here at NR and NRO.

I’ll start by repeating my characterization of Lindzen’s views on Anthropogenic Global Warming (AGW) from the post, and compare this to what Lindzen said in a prior WSJ Op-Ed.

In my post I said:

…Richard Lindzen, a very serious climate scientist who disputes the estimated magnitude of the greenhouse effect, but not its existence…

In the Wall Street Journal, in 2006, Lindzen wrote:

[T]here has been no question whatever that carbon dioxide is an infrared absorber (i.e., a greenhouse gas—albeit a minor one), and its increase should theoretically contribute to warming. Indeed, if all else were kept equal, the increase in carbon dioxide should have led to somewhat more warming than has been observed, assuming that the small observed increase was in fact due to increasing carbon dioxide rather than a natural fluctuation in the climate system.

Lindzen has argued for some time that the expected impact of CO2 on temperature – roughly, the “climate sensitivity” – is lower than most climate scientists believe, and that therefore we should not be alarmed; but he clearly acknowledges that CO2 is a greenhouse gas. This is important to keep in mind as we proceed through the current Op-Ed and discussion. Let me take it one piece at a time.

The first several paragraphs of his current Op-Ed make the point that the so-called science underlying much of the historical temperature record – and the use of historical data to establish a causal relationship between CO2 and temperature – is unreliable, and should not be the basis for establishing policy.

Or as I put it in 2009:

I argued over two years ago that: 1) Long-term climate reconstruction was one of the two key trouble spots in climate science; 2) mathematically sophisticated critics had debunked the methodology used to reconstruct long-term climate evidence that is the basis for the famous “hockey stick” increase in global temperatures; and 3) excellent evidence had been presented to the U.S. Senate that, in climate reconstruction, academic peer review meant, in effect, agreement among a tiny, self-selected group of experts. The root problem here is not the eternal perfidy of human nature, but the fact that we can’t run experiments on history to adjudicate disputes, which makes this less like chemistry or physics than like economics or political science.

Today’s Lindzen Op-Ed proceeds to consider the science around projections of warming impacts. I’ll compare the key points in it to what I’ve written.

First, Lindzen:

The IPCC’s position in its Summary for Policymakers from their Fourth Assessment (2007) is weaker, and is essentially simply that most warming of the past 50 years or so is due to man’s emissions. It is sometimes claimed that the IPCC is 90% confident of this claim, but there is no known statistical basis for this claim; it is purely subjective.

Compare, Manzi (2007):

The current summary indicates that the IPCC is “90% confident” that we have caused global warming. The summary further implies that if we double the concentration of carbon dioxide (CO2) in the atmosphere, the IPCC is 90 percent confident that we will cause further warming of 3° C +/- 1.5° C.

But what do these statements of confidence really mean? They are not derived mathematically from the type of normal probability distributions that are used when, for example, determining the margin of error in a political poll (say, +/- 5%). IPCC estimates of “confidence” are really what we would mean by this word in everyday conversation—a subjective statement of opinion. This is a very big deal, since bounding the uncertainty in climate predictions is central to deciding what, if anything, we should do about them.

Second, Lindzen:

There are, however, some things left unmentioned about the IPCC claims. For example, the observations are consistent with models only if emissions include arbitrary amounts of reflecting aerosol particles (arising, for example, from industrial sulfates) which are used to cancel much of the warming predicted by the models. The observations themselves, without such adjustments, are consistent with there being sufficiently little warming as to not constitute a problem worth worrying very much about.

But Messrs. Cicerone and Rees … throw in a very peculiar statement (referring to warming), almost in passing: “Uncertainties in the future rate of this rise, stemming largely from the ‘feedback’ effects on water vapour and clouds, are topics of current research.”

Who would guess from this statement, that the feedback effects are the crucial question? Without these positive feedbacks assumed by computer modelers, there would be no significant problem, and the various catastrophes that depend on numerous factors would no longer be related to anthropogenic global warming. That is to say, the issue relevant to policy is far from settled.

Compare, Manzi (2007):

The most important scientific debate is really about these feedback effects. Feedbacks are not merely details to be cleaned up in a picture that is fairly clear. The base impact of a doubling of CO2 in the atmosphere with no feedback effects is on the order of 1°C, while the United Nations Intergovernmental Panel on Climate Change (IPCC) consensus estimate of the impact of doubling CO2 is about 3°C. The feedback effects dominate the prediction. As we’ve seen, however, feedback effects run in both directions. Feedback could easily dampen the net impact so it ends up being less than 1°C. In fact, the raw relationship between temperature increases and CO2 over the past century supports that idea.

Over the past several decades, teams in multiple countries have launched ongoing projects to develop large computer models that simulate the behavior of the global climate in order to account for feedback effects. While these models are complex, they are still extremely simplistic as compared with the actual phenomenon of global climate. Models have successfully replicated historical climates, but no model has ever demonstrated that it can accurately predict the climate impact of CO2 emissions over a period of many years or decades.

Climate models generate useful projections for us to consider, but the reality is that nobody knows with meaningful precision how much warming we will experience under any emissions scenario. Global warming is a real risk, but its impact over the next century could plausibly range from negligible to severe.
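The feedback arithmetic in the passage quoted above can be illustrated with a back-of-the-envelope calculation. This is only a sketch: the 5.35 W/m² forcing coefficient and the ~3.2 W/m²/°C no-feedback (Planck) response are standard textbook approximations, not figures taken from the post itself.

```python
import math

# Radiative forcing from a doubling of atmospheric CO2, using the
# standard logarithmic approximation F = 5.35 * ln(C/C0) W/m^2
forcing = 5.35 * math.log(2.0)        # ~3.7 W/m^2

# No-feedback (Planck) response: roughly 3.2 W/m^2 of forcing
# per degree C of surface warming
no_feedback_warming = forcing / 3.2   # ~1.2 C, "on the order of 1 C"

# The ~3 C consensus sensitivity estimate therefore implies that net
# feedbacks multiply the base effect by roughly 2.5x
implied_feedback_factor = 3.0 / no_feedback_warming

print(round(no_feedback_warming, 2))      # ~1.16
print(round(implied_feedback_factor, 1))  # ~2.6
```

The point of the exercise is that the feedback multiplier, not the base greenhouse effect, accounts for most of the predicted warming, which is why uncertainty about feedbacks dominates uncertainty about the forecast.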

Third, Lindzen:

[T]he proposed policies are likely to cause severe problems for the world economy.

Compare, Manzi (2007):

In summary, then, the best available models indicate that 1) global warming is a problem that will have only a marginal impact on the world economy, and 2) it is economically rational only to reduce slightly this marginal impact through global carbon taxes. Further, practical knowledge of the world indicates that 1) such a global carbon-tax regime would be very unlikely ever to be implemented, and 2) even if it were implemented, the theoretical benefits it might create would probably be more than offset by the economic drag it would produce.

Why, if I am in such close agreement, am I not just going along with the “don’t worry about AGW” line? Because of uncertainty. The problem of the lack of confidence highlighted in the first Lindzen quote from the current Op-Ed (restated in concept as lack of model certainty in the second quote) is crucial. It is the basis of the sophisticated argument for emissions mitigation.

Lindzen is one expert scientist who forecasts very small net warming as a result of emissions. Most relevant scientists predict quantitatively larger effects. Put yourself in the position of a senior government leader tasked with making real decisions that affect the lives of millions. What would you do if faced with a matter of technical disagreement on such a quantitative-prediction question among experts? The sensible thing to do is to gather a group of the leading subject-matter experts to produce a review of the known science, and then subject that review to scrutiny by a standing body of leading scientists who are not directly in the field, in order to minimize both groupthink and opportunities for self-dealing. In America, this in effect describes the U.S. National Research Council (NRC).

In 1979, prior to any accusations of politicization of which I am aware, the NRC convened exactly such a process, and estimated that climate sensitivity is about 3°C. This estimate has been consistently affirmed by each of the U.N. IPCC Assessment Reports over the past 20 years. It turns out (as I go into in detail in a Corner post from last week rebutting Paul Krugman) that the amount of warming implied by this climate sensitivity doesn’t justify the costs of cap-and-trade, carbon taxes or other emissions mitigation schemes.

However, it is also the case – for the basic reasons that both Lindzen and I reference – that there is substantial uncertainty about this and other related estimates. Thus, the legitimate risk from climate change is that our current best forecast is wrong – more specifically, that climate change will turn out to be worse than currently forecast. In a post from earlier this week at The Corner, I go into excruciating detail about why the argument from uncertainty is nonetheless unpersuasive as support for aggressive mitigation programs.

[Cross-posted at The Corner]

Liberty and Tyranny and Epistemic Closure

Jonah notes Ross Douthat’s very interesting post, in which Ross had this to say:

Conservative domestic policy would be in better shape if conservative magazines and conservative columnists were more willing to call out Republican politicians (and, to a lesser extent, conservative entertainers) for offering bromides instead of substance, and for pandering instead of grappling with real policy questions.

I thought some about this over the past few days, and took this as a direct challenge.

Here goes.

I started to read Mark Levin’s massive bestseller Liberty and Tyranny a number of months ago as debate swirled around it. I wasn’t expecting a PhD thesis (and in fact had hoped to write a post supporting the book as a well-reasoned case for certain principles that upset academics just because it didn’t employ a bunch of pseudo-intellectual tropes). But when I waded into the first couple of chapters, I found that – while I had a lot of sympathy for many of its basic points – it seemed to all but ignore the most obvious counter-arguments that could be raised to any of its assertions. This sounds to me like a pretty good plain English meaning of epistemic closure. The problem with this, of course, is that unwillingness to confront the strongest evidence or arguments contrary to our own beliefs normally means we fail to learn quickly, and therefore persist in correctable error.

I’m no expert on many of the topics the book addresses, so I flipped to its treatment of a subject that I’ve spent some time studying – global warming – in order to see how it treated a controversy for which I’m at least familiar with the various viewpoints and some of the technical detail.

It was awful. It was so bad that it was like the proverbial clock that chimes 13 times – not only is it obviously wrong, but it is so wrong that it leads you to question every other piece of information it has ever provided.

Levin argues that human-caused global warming is nothing to worry about, and merely an excuse for the Enviro-Statist (capitalization in the original) to seize more power. It reads like a bunch of pasted-together quotes and stories based on some quick Google searches by somebody who knows very little about the topic, and can’t be bothered to learn. After pages devoted to talking about prior global cooling fears, and some ridiculous or cynical comments by advocates for emissions restrictions (and one quote from Richard Lindzen, a very serious climate scientist who disputes the estimated magnitude of the greenhouse effect, but not its existence), he gets to the key question on page 184 (eBook edition):

[D]oes carbon dioxide actually affect temperature levels?

Levin does not attempt to answer this question by making a fundamental argument that proceeds from evidence available for common inspection through a defined line of logic to a scientific view. Instead, he argues from authority by citing experts who believe that the answer to this question is pretty much ‘no’. Who are they? – An associate professor of astrophysics, a geologist and an astronaut.

But he says that these are just examples:

There are so many experts who reject the notion of man-made global warming and the historical claims about carbon dioxide they are too numerous to list here.

He goes on to cite a petition “rejecting the theory of human-caused global warming” sponsored by the Oregon Institute of Science and Medicine and signed by more than 31,000 scientists. There are a few problems with this petition that Levin doesn’t mention. More than 20,000 of these “scientists” lack PhDs in any field. There was very little quality control: At least one person signed it as Spice Girl Geri Halliwell. Scientific American did the hard work of actually contacting a sample of individual signatories, and estimated that the signatories include only about 200 climate scientists who agree with the petition’s statement. And most important by far, the text of the petition is not close to Levin’s claim of rejecting the notion of man-made global warming. In the key sentence it says that signatories do not believe that there is compelling scientific evidence that human release of greenhouse gases will cause catastrophic heating and disruption of the earth’s climate. Depending on the definition of “catastrophic”, I could agree to that. Yet I don’t reject the notion of man-made global warming.

On one side of the scale of Levin’s argument from authority, then, we have three scientists speaking outside their areas of central expertise, plus a dodgy petition. What’s on the other side of the scale that Levin doesn’t mention to his readers?

Among the organizations that don’t reject the notion of man-made global warming are: the U.S. National Academy of Sciences; The Royal Society; the national science academies of Australia, Belgium, Brazil, Canada, China, France, Germany, Ireland, Italy, India, Japan, Mexico, New Zealand, Russia, South Africa and Sweden; the U.S. National Research Council; the American Association for the Advancement of Science; the American Chemical Society; the American Physical Society; the American Geophysical Union; and the World Meteorological Organization. That is, Levin’s argument from authority is empty.

Of course, this roll call could be arbitrarily long and illustrious, and that does not make them right. Groupthink or corruption is always possible, and maybe the entire global scientific establishment is wrong. Does he think that these various scientists are somehow unaware that Newsweek had an article on global cooling in the 1970s? Or are they aware of the evidence in his book, but are too trapped by their assumptions to be able to incorporate this data rationally? Or does he believe that the whole thing is a con in which thousands of scientists have colluded across decades and continents to fool such gullible naifs as the U.S. Congressional Budget Office, numerous White House science advisors, Margaret Thatcher and so on? Are the Queen of England and the Trilateral Commission in on it too?

But what evidence does Levin present for any of this amazing incompetence or conspiracy beyond that already cited? None. He simply moves on to criticisms of proposed solutions. This is wingnuttery.

There are many reasons to write a book. One view is that a book is just another consumer product, and if people want to buy Jalapeno-and-oyster flavored ice cream, then companies will sell it to them. If the point of Liberty and Tyranny was to sell a lot of copies, it was obviously an excellent book. Further, despite what intellectuals will often claim, most people (including me) don’t really want their assumptions challenged most of the time (e.g., the most intense readers of automobile ads are people who have just bought the advertised car, because they want to validate their already-made decision). I get that people often want comfort food when they read. Fair enough. But if you’re someone who read this book in order to help form an honest opinion about global warming, then you were suckered. Liberty and Tyranny does not present a reasoned overview of the global warming debate; it doesn’t even present a reasoned argument for a specific point of view, other than that of willful ignorance. This section of the book is an almost perfect example of epistemic closure.

Krugman on Climate III: Resorting to Pascal’s Wager

In the two prior posts in this series, I argued that Paul Krugman has presented a flawed cost – benefit analysis for his proposals to reduce greenhouse gas emissions, and that his linked proposal to use carbon tariffs to push recalcitrant developing economies to go along is misguided. In this final post of the series, I will argue that his consideration of the role of uncertainty in determining our course of action is also incorrect, because he fails to provide any principle by which we could establish any limit to how much we should spend to reduce a risk which can never be eliminated entirely.

Like all serious cases for emissions mitigation to avoid climate change damages, Krugman eventually comes to the argument that the possible, rather than the expected, consequences of AGW are so severe that they justify almost any cost. In describing why a simple comparison of expected costs to expected benefits over the next century is an inadequate consideration of the economic trade-offs involved, he (correctly, in my view) focuses on the question of uncertainty.

Finally and most important is the matter of uncertainty. We’re uncertain about the magnitude of climate change, which is inevitable, because we’re talking about reaching levels of carbon dioxide in the atmosphere not seen in millions of years. The recent doubling of many modelers’ predictions for 2100 is itself an illustration of the scope of that uncertainty; who knows what revisions may occur in the years ahead. Beyond that, nobody really knows how much damage would result from temperature rises of the kind now considered likely.

You might think that this uncertainty weakens the case for action, but it actually strengthens it. As Harvard’s Martin Weitzman has argued in several influential papers, if there is a significant chance of utter catastrophe, that chance — rather than what is most likely to happen — should dominate cost-benefit calculations. And utter catastrophe does look like a realistic possibility, even if it is not the most likely outcome.

Weitzman argues — and I agree — that this risk of catastrophe, rather than the details of cost-benefit calculations, makes the most powerful case for strong climate policy. Current projections of global warming in the absence of action are just too close to the kinds of numbers associated with doomsday scenarios. It would be irresponsible — it’s tempting to say criminally irresponsible — not to step back from what could all too easily turn out to be the edge of a cliff.

Krugman is also correct, in my view, that uncertainty in our forecasts strengthens the case for action.

The stronger form of the argument from uncertainty is not only that it is possible that the true probability distribution of potential levels of warming is actually much worse than believed by the IPCC, but that a reasonable observer should accept it as likely that this is the case. As Krugman indicates, the sophisticated version of this argument has been presented by Weitzman. Weitzman’s reasoning on this topic is subtle and technically ingenious. In my view, it is the strongest existing argument that runs exactly counter to mine. (You can see a slightly earlier version of his paper, and my lengthy response here, along with links to the underlying source documents.) In very short form, Weitzman’s central claim is that the probability distribution of potential losses from global warming is “fat-tailed”, or includes high enough odds of very large amounts of warming (20°C or more) to justify taking expensive action now to avoid these low probability / high severity risks.

The big problem with his argument, of course, is that the IPCC has already developed probability distributions for potential warming that include no measurable probability for warming anywhere near this level for any considered scenario. That is, the best available estimates for these probability distributions are not fat-tailed in the sense that Weitzman means it. Therefore, Professor Weitzman is forced to do his own armchair climate science, and argue (as he does explicitly in his paper) that he has developed a probability distribution for expected levels of warming superior to the ones the world climate-modeling community has developed and published. And his proposed alternative probability distribution is radically more aggressive than anything you will find in any IPCC Assessment Report – Professor Weitzman argues, in effect, that there is a one percent chance of temperature increase greater than 20°C over the next century, while even the scale on the charts that display the relevant IPCC probability distributions only goes up to 8°C. It is not credible to accept Professor Weitzman’s armchair climate science in place of the IPCC’s.
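To see why the shape of the tail matters so much to this dispute, here is a minimal numeric sketch. All of the probabilities and damage figures below are hypothetical, chosen only for illustration – they are not the IPCC’s distributions or Weitzman’s – but they show how moving even 1% of probability mass out to a 20°C outcome can swing an expected-loss calculation when damages rise super-linearly with temperature:

```python
# Illustrative only: hypothetical numbers, not the IPCC's or Weitzman's actual
# distributions. Shows why a "fat tail" looms large in an expected-loss
# calculation when damages rise super-linearly with warming.

def damage_pct_gdp(warming_c, k=0.28):
    # Hypothetical quadratic damage function: about 2.5% of GDP at 3C.
    return k * warming_c ** 2

# Thin-tailed view: warming outcomes clustered around the central cases.
thin = [(0.10, 1.5), (0.60, 3.0), (0.30, 4.5)]  # (probability, degrees C)

# Fat-tailed view: identical, except 1% of mass moved to a 20C catastrophe.
fat = [(0.10, 1.5), (0.59, 3.0), (0.30, 4.5), (0.01, 20.0)]

def expected_damage(dist):
    return sum(p * damage_pct_gdp(t) for p, t in dist)

print(f"thin-tailed expected damage: {expected_damage(thin):.2f}% of GDP")
print(f"fat-tailed expected damage:  {expected_damage(fat):.2f}% of GDP")
```

Under this toy setup, the 1% tail alone accounts for roughly a quarter of the entire expected loss; Weitzman’s formal argument pushes this much further, to tails fat enough that the expectation is dominated by, or even unbounded because of, the catastrophe scenarios.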

The only real argument for rapid, aggressive emissions abatement, then, boils down to the weaker form of the argument from uncertainty: the point that we must always retain residual doubt about any prediction – or in plain English, that you can’t prove a negative. The problem with using this rationale to justify large economic costs can be illustrated by trying to find a non-arbitrary stopping condition for emissions limitations. Any level of emissions imposes some risk. Unless you advocate confiscating all cars and shutting down every coal-fired power plant on earth literally tomorrow morning, you are accepting some danger of catastrophic warming. You must make some decision about what level of risk is acceptable versus the costs of avoiding this risk. Once we leave the world of odds and handicapping and enter the world of the Precautionary Principle – the Pascal’s Wager-like argument that the downside risks of climate change are so severe that we should bear almost any cost to avoid this risk, no matter how small – there is really no principled stopping point derivable from our understanding of this threat.

Think about this quantitatively for a moment. Suspend disbelief about the real world politics for a moment, and assume that we could have a perfectly implemented global carbon tax. If we introduced a tax high enough to keep atmospheric carbon concentration to no more than 420 ppm (assuming we could get the whole world to go along), we would expect, using the Nordhaus analysis as a reference point, to spend about $14 trillion more than the benefits that we would achieve in the expected case. To put that in context, that is on the order of the annual GDP of the United States of America. That’s a heck of an insurance premium for an event so low-probability that it is literally outside of a probability distribution. Former Vice President Al Gore has a more aggressive proposal that if implemented through an optimal carbon tax (again, assuming we can get the whole world to go along) would cost more like $21 trillion in excess of benefits in the expected case. Of course, this wouldn’t eliminate all uncertainty, and I can find credentialed scientists who say we need to reduce emissions even faster. Any level of emissions poses some risk. Without the recognition that the costs we would pay to avoid this risk have some value, we would be chasing an endlessly receding horizon of zero risk.

So then what should we do? At some intuitive level, it is clear that rational doubt about our probability distribution of forecasts for climate change over a century should be greater than our doubt about our forecasts for whether we will get very close to 500 heads if we flip a fair quarter 1,000 times. This is true uncertainty, rather than mere risk, and ought to be incorporated into our decision somehow. But if we can’t translate this doubt into an alternative probability distribution that we should accept as our best available estimate, and if we can’t simply accept “whatever it takes” as a rational decision logic for determining emissions limits, then how can we use this intuition to weigh the uncertainty-based fears of climate change damage rationally? The only way I can think of is to attempt to find other risks that we believe present potential unquantifiable dangers that are of intuitively comparable realism and severity to that of outside-of-distribution climate change, and compare our economic expenditure against each.
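The coin-flip comparison can be made precise, which is exactly what makes it different in kind from the climate forecast. For a fair coin the probability model is fully known, so “very close to 500 heads” is a statement of quantifiable risk, computable with standard binomial formulas and nothing else:

```python
import math

# For a fair coin the probability model is known exactly, so "very close to
# 500 heads in 1,000 flips" is quantifiable risk, not uncertainty.
n, p = 1000, 0.5
mean = n * p                       # expected number of heads: 500
sd = math.sqrt(n * p * (1 - p))    # standard deviation: about 15.8 heads

# Normal approximation to the binomial: about 95% of runs fall within
# two standard deviations of the mean.
lo, hi = mean - 2 * sd, mean + 2 * sd
print(f"mean = {mean:.0f} heads, sd = {sd:.1f}")
print(f"about 95% of runs land between {lo:.0f} and {hi:.0f} heads")
```

Nothing comparable exists for century-scale climate outcomes: there the doubt attaches to the probability distribution itself, which is the distinction between risk and true uncertainty being drawn here.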

Unfortunately for humanity, we face many dimly-understood dangers. Professor Weitzman explicitly considers an asteroid impact and bioengineering technology gone haywire. It is straightforward to identify others. A regional nuclear war in central Asia kicking off massive global climate change (in addition to its horrific direct effects), a global pandemic triggered by a modified version of the HIV or Avian Flu virus, or a rogue state weaponizing genetic-engineering technology are all other obvious examples. Any of these could kill hundreds of millions to billions of people. Further, specialists often worry about the existential risks of new technologies spinning out of control: biosphere-consuming nanotechnology and supercomputers that can replace humans are both topics of intense speculation. In the extreme, leading physicists publicly speculated that starting the Large Hadron Collider in 2009 might create so-called strangelets that would annihilate the world, much as Edward Teller calculated that the initial nuclear explosion at Trinity in 1945 might ignite the world’s atmosphere. This list could be made almost arbitrarily long.

Consider the comparison of a few of these dangers to that of outside-of-distribution climate change dangers. The consensus scientific estimate is that there is a 1-in-10,000 chance of an asteroid large enough to kill a large fraction of the world’s population impacting the earth in the next 100 years. That is, we face a 0.01% chance of sudden death of most people in the world, likely followed by massive climate change on the scale of that which killed off the non-avian dinosaurs, which seems reasonably comparable to outside-of-distribution climate change. Or consider that Professor Weitzman argues that we can distinguish between unquantifiable extreme climate change risk and unquantifiable dangers from runaway genetic crop modification because “there exists at least some inkling of a prior argument making it fundamentally implausible that Frankenfood artificially selected for traits that humans found desirable will compete with or genetically alter the wild types that nature has selected via Darwinian survival of the fittest.” That does not seem exactly definitive. What is the realism of a limited nuclear war over the next century – with plausible scenarios ranging from Pakistan losing control of its nuclear arsenal and inducing a limited nuclear exchange with India, to a war between a nuclearized Iran and Israel?

The U.S. government currently spends about four million dollars per year on asteroid detection (in spite of an estimate that one billion dollars per year spent on detection plus interdiction would be sufficient to reduce the probability of impact by 90 percent). We continue to exploit genetic engineering to improve crop yields because, much like avoiding burning fossil fuels, the human costs of stopping this would be immediate and substantial. We detonated the atomic bomb at Trinity, and fired up the Large Hadron Collider, because the perceived benefits anticipated from both were significant. We are not willing to engage in an unlimited level of military action to prevent nuclear proliferation, despite the risks proliferation creates, since we must weigh risk against risk.

In the face of massive uncertainty, hedging your bets and keeping your options open is almost always the right strategy. Money and technology are the raw materials for options to deal with physical dangers. A healthy society is constantly scanning the horizon for threats and developing contingency plans to meet them, but the loss of economic and technological development that would be required to eliminate all theorized climate change risk (or all risk from genetic and computational technologies or, for that matter, all risk from killer asteroids) would cripple our ability to deal with virtually every other foreseeable and unforeseeable risk, not to mention our ability to lead productive and interesting lives in the meantime.

We can be confident that humanity will face many difficulties in the upcoming century, as it has in every century. We just don’t know which ones they will be. This implies that the correct grand strategy for meeting them is to maximize total technical capabilities in the context of a market-oriented economy that can integrate highly unstructured information, and, most importantly, to maintain a democratic political culture that can face facts and respond to threats as they develop.

Krugman on Climate II: Assuming a Can Opener

In a prior post, I argued that the cost-benefit analysis for aggressive actions to reduce greenhouse gas emissions presented by Paul Krugman is flawed, even if we assume a globally-coordinated program. But if we cannot get broad participation from the major developing countries, the costs that any country that prices carbon will impose upon itself will dwarf the benefits that it will create for itself. In the case of the U.S., the costs of a cap-and-trade regime like that envisioned under Waxman-Markey would be at least ten times its benefits if it were done without international cooperation. It would be very difficult to imagine an effective global emissions mitigation program that does not include the major economies of the developing world.

Krugman clearly recognizes this. How does he suggest that we get, for example, China, India, Brazil and others to participate if they say they don’t want to?

Then you need sticks as well as carrots. In particular, you need carbon tariffs.

A carbon tariff would be a tax levied on imported goods proportional to the carbon emitted in the manufacture of those goods. Suppose that China refuses to reduce emissions, while the United States adopts policies that set a price of $100 per ton of carbon emissions. If the United States were to impose such a carbon tariff, any shipment to America of Chinese goods whose production involved emitting a ton of carbon would result in a $100 tax over and above any other duties. Such tariffs, if levied by major players — probably the United States and the European Union — would give noncooperating countries a strong incentive to reconsider their positions.

They sure would. But isn’t it obvious that the targeted countries might consider other reactions beyond either just joining the carbon pricing regime or choosing to pay the tariff? What if they reacted with counter-tariffs, or set up an outside-the-tariff trading bloc with various resource-rich African and Asian countries, or reduced purchases of U.S. Treasuries, or any of a thousand other ideas? Krugman has this to say:

To the objection that such a policy would be protectionist, a violation of the principles of free trade, one reply is, So? Keeping world markets open is important, but avoiding planetary catastrophe is a lot more important.

But if for the next century “planetary catastrophe” = an expected cost of 2% of economic output 100 years from now (and if avoiding this will likely cost more than this amount, even if such a program works), then maybe running the risk of inciting a global trade war isn’t such a great bet.

He goes on to describe the legality, but not the effectiveness, of such tariffs. Why do we think they will work, and not be met by aggressive counter-action? Here is the argument in its entirety:

Needless to say, the actual business of getting cooperative, worldwide action on climate change would be much more complicated and tendentious than this discussion suggests. Yet the problem is not as intractable as you often hear. If the United States and Europe decide to move on climate policy, they almost certainly would be able to cajole and chivvy the rest of the world into joining the effort. We can do this.

Maybe a direct, aggressive confrontation with countries representing several billion people and a good chunk of world economic output would work, and maybe it wouldn’t; but this is exhortation and wishful thinking in the place of analysis.

Krugman on Climate I: The Cost – Benefit Analysis

Paul Krugman had a long article in the Sunday New York Times Magazine on climate change. Refreshingly, it attempts to compare the costs and benefits of proposed climate policies rationally. But it also, in my view, loads the dice in certain key passages. In the interests of not making this post endless, I’ll discuss only two points today: (1) it asserts an expected level of global warming that is about twice as large as that provided in the current IPCC Assessment; and (2) it appears to compare a resulting inflated damage estimate in the year 2100 to a cost estimate that we would start to bear by 2050. In sum, a corrected simple cost – benefit comparison of a policy to reduce greenhouse gas emissions indicates that we should expect the “solution” to cost more than the damages it averts.

In the article, Krugman says this about how much we should expect temperatures to rise:

At this point, the projections of climate change, assuming we continue business as usual, cluster around an estimate that average temperatures will be about 9 degrees Fahrenheit higher in 2100 than they were in 2000. [Bold added]

Later, he says this:

Nordhaus has argued that a global temperature rise of 4.5 degrees Fahrenheit — which used to be the consensus projection for 2100 — would reduce gross world product by a bit less than 2 percent. But what would happen if, as a growing number of models suggest, the actual temperature rise is twice as great? Nobody really knows how to make that extrapolation. For what it’s worth, Nordhaus’s model puts losses from a rise of 9 degrees at about 5 percent of gross world product. Many critics have argued, however, that the cost might be much higher. [Bold added]

According to the currently-governing 2007 IPCC Fourth Assessment Report (AR4), expected warming through about 2100 under the typical reference case scenario A1B is about 5F (Table SPM.3), not the 9F asserted in Krugman’s piece. No marker scenario for future population and economic growth evaluated by the IPCC in AR4 has an expected temperature increase of 9F by 2100. In the standard Nordhaus models referenced in the article, damage estimates rise super-linearly with temperature; therefore, not quite doubling the estimate of temperature increase from 5F to 9F would more than double the damage estimate from 2% of GDP to the 5% of GDP that Krugman cites.
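As a rough check on that super-linearity claim, one can back out the damage exponent implied by the two figures in Krugman’s article. This assumes a simple power-law damage function, which is only a common textbook simplification of the Nordhaus approach, not his actual model specification:

```python
import math

# Back out the damage exponent implied by the article's figures, assuming a
# power-law damage function D(T) = k * T**b. This is a common simplification
# for illustration, not Nordhaus's actual damage function.
t1, d1 = 5.0, 2.0   # about 2% of GDP at a 5F rise
t2, d2 = 9.0, 5.0   # about 5% of GDP at a 9F rise (Krugman's figure)
b = math.log(d2 / d1) / math.log(t2 / t1)
print(f"implied damage exponent b = {b:.2f}")   # about 1.56, i.e. super-linear

# With b > 1, scaling up the temperature forecast scales damages more than
# proportionally: a 1.8x higher temperature estimate here yields 2.5x the
# damages, where a linear damage function would yield only 1.8x.
```

Any exponent above 1 means damages grow faster than temperature, which is why the disputed choice between a 5F and a 9F forecast does so much work on the benefit side of the ledger.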

The essence of the argument for greater warming is that several new studies project greater warming than those used in the AR4, and that therefore a rational observer should update this forecast and expect a larger temperature increase. Of course, there is a long history of individual studies that project greater future warming than the consensus estimate. These go all the way back to the first major attempt to create such a consensus estimate: the NRC Charney Commission of 1979. [The scientist cited in the Charney Commission working papers who led the climate modeling team that projected much higher warming than the consensus estimate more than thirty years ago was…James Hansen.] In spite of these frequent challenges, however, the estimated value for climate sensitivity – the central scaling parameter that projects how much warming will be created by a given emissions scenario – has remained essentially unchanged for decades at about 3C, from the Charney report through each of the four successive IPCC assessment reports, and remains the estimate in the current AR4.

Synthesizing the various individual studies that have been released over the past several years to determine if projected warming estimates need to be roughly doubled requires integrating knowledge from numerous physical science specialties, software engineering, economics, and various civil and energy engineering disciplines. I’m not competent to do it; but then again, neither is Paul Krugman. No one person or small group is competent to do this – if it were, why would we bother with the immense time and expense of the IPCC assessment process? We’d just go ask that guy to tell us the answer.

Krugman says this about the economic costs to the U.S. of imposing a carbon mitigation regime like that proposed in the Waxman-Markey bill:

Over all, the Budget Office concludes, strong climate-change policy would leave the American economy between 1.1 percent and 3.4 percent smaller in 2050 than it would be otherwise.

He then says this about the costs to the world of applying a similar regime to the world:

One recent review of the available estimates put the costs of a very strong climate policy — substantially more aggressive than contemplated in current legislative proposals — at between 1 and 3 percent of gross world product.

Consider the global cost assertion first.

Krugman does not cite the specific review study, nor does he define a “very strong” climate policy, nor does he specify when we should expect costs to reach 1 to 3 percent of world output. This last point is important, because most serious analyses of this subject that I’ve seen project that a mitigation policy will create costs that continue to rise as a percentage of economic output over the course of the upcoming century.

There have been many reviews of cost estimates produced over the past few years. I’ll purposely choose one produced in 2009 by Resources for the Future (RFF), since it is a moderately left-of-center, well-respected environmental organization. Its estimates are broadly consistent with those produced by many similar bodies. This report provides estimates for the costs of maintaining a global concentration of CO2 at 450 ppm and 550 ppm. I’ll take the more stringent of these two (450 ppm) as representing a “very strong” climate policy. Finally, I’ll use the RFF estimate for the “least cost” estimate (i.e., the most efficient possible policy that assumes, for example, global coordination around emissions reductions without any more realistic geopolitical complexity). It’s possible that Krugman is using an uncited report with very different projections than RFF, but by the choice of study sponsor and political assumptions, I’ll, if anything, bias toward low cost estimates.

According to the RFF review, the global cost of the 450 ppm policy should be in the range of 3% of gross world output in 2050. But by 2100, this is projected to increase to about 6% of gross world output (with a very wide range of estimates). This is very different from costs of “between 1 and 3 percent of gross world product”, for two reasons: (1) if we use damages as of 2100 in a cost-benefit comparison, we need to be consistent about dates, and (2) we should expect to bear large mitigation costs for decades prior to realizing any expected benefit in the form of avoided damages, which has an enormous impact on the expected value calculation in the cost – benefit analysis.
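The timing point can be made concrete with a toy discounting exercise. All of the numbers below are hypothetical illustrations, not taken from the RFF review or from Nordhaus; the point is simply that a cost stream which ramps up from today is worth far more in present-value terms than a benefit stream of similar end-of-century size that arrives only late:

```python
# Toy discounting exercise: hypothetical numbers for illustration. Costs ramp
# linearly to 6% of output by year 90; avoided damages ramp to 5% of output
# but only after year 60; both discounted at 3% per year, sampled per decade.

def pv(flows, rate=0.03):
    # Present value of a list of (years_from_now, pct_of_output) flows.
    return sum(x / (1 + rate) ** t for t, x in flows)

years = range(0, 91, 10)
costs = [(t, 6.0 * t / 90) for t in years]
benefits = [(t, 5.0 * max(0, t - 60) / 30) for t in years]

print(f"PV of cost stream:    {pv(costs):.1f}")
print(f"PV of benefit stream: {pv(benefits):.1f}")
# The early-arriving costs are discounted far less, so they dominate even
# though the end-of-century percentages look comparable.
```

Even before arguing about the right discount rate, paying for many decades before the avoided damages arrive tilts the expected-value calculation heavily toward the cost side.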

Krugman says this about a simple comparison of the costs and benefits of a global policy to reduce emissions:

Despite the uncertainty, it’s tempting to make a direct comparison between the estimated losses and the estimates of what the mitigation policies will cost: climate change will lower gross world product by 5 percent, stopping it will cost 2 percent, so let’s go ahead.

But of course, properly considered, he has the numbers exactly backward. As per the first point in this post, if we use current IPCC temperature forecasts, a better estimate for the expected global economic damages caused by human-induced climate change in 2100 is more like 2%, rather than the 5% asserted in Krugman’s article. And as per the second point, a much better estimate for the economic costs of mitigation in 2100 is about 6%, not the 2% that Krugman asserts.

While Krugman immediately (and correctly) says such a comparison would over-simplify the real decision logic, it does illustrate the crux of the problem for advocates of aggressive efforts to reduce emissions: (1) for at least the next century the expected damages caused by global warming are nothing like the doomsday pictures presented in the popular media, but are rather a few percent of GDP, and (2) the economic costs imposed by aggressive programs designed to avoid these damages are expected to be far greater than the damages that they should expect to avoid for the next 100 years at least.

And to be a little parochial about it, from the perspective of a U.S. taxpayer, the net economic damages to the U.S. from global warming are expected to be materially zero through 2100, while the costs, as indicated in the Krugman article, should be about 2% of GDP by 2050. As per the prior analysis, mitigation costs should be higher by 2100. That is, even under some very favorable assumptions, citizens of the United States are being asked to sacrifice several percent of total income (i.e., trillions of dollars of cumulative consumption) for no tangible expected benefit for the next 100 years.

Introducing the nuances of very long-term considerations, geopolitical complexity, the importance of risk and uncertainty and so forth is the subject of the balance of Krugman’s article. I’ll take these up in the next post.

Hot Air on Climategate and Copenhagen

I have a diavlog up at bloggingheads with Dave Roberts on (what else?) Climategate and Copenhagen. I think it was a productive discussion.

Hacked Climate Science Emails

A set of very damaging e-mails have apparently been hacked from the Hadley Climate Research Unit; they purportedly show climate scientists there manipulating and deploying historical climate data to reach predetermined conclusions, coordinating messaging, and attempting to control the definition of expertise in order to marginalize those who disagree with them.

I have not read the full set of e-mails, nor have I seen authoritative evidence of their provenance, but for the sake of argument let’s assume the allegations are correct. None of this surprises me. I argued over two years ago that: 1) Long-term climate reconstruction was one of the two key trouble spots in climate science; 2) mathematically sophisticated critics had debunked the methodology used to reconstruct long-term climate evidence that is the basis for the famous “hockey stick” increase in global temperatures; and 3) excellent evidence had been presented to the U.S. Senate that, in climate reconstruction, academic peer review meant, in effect, agreement among a tiny, self-selected group of experts. The root problem here is not the eternal perfidy of human nature, but the fact that we can’t run experiments on history to adjudicate disputes, which makes this less like chemistry or physics than like economics or political science.

In human terms, the scandal is obviously a PR disaster for those who believe that climate reconstruction is “science” in the sense we normally use the term, but what it does not change is the basic physics of how CO2 molecules interact with radiation. As I have always argued, this is the real basis for rational concern about greenhouse-gas emissions, and is a key reason that all the major national scientific academies agree that the greenhouse effect is a real risk. Recognizing this risk, however, does not entail accepting the political conclusion that we need laws to radically reduce emissions at enormous cost.

Climate Change Discussion

You may (or may not) have noticed my almost total lack of blogging over the past few months. I have been devoting all of my time available for writing to my book.

I did, however, do a bloggingheads discussion over the weekend with David Orr, an academic who has written a very hard eco-left book on climate change and related subjects. He is also charming, smart, well-informed and well-intentioned. The subject was his book, so I tried mostly to explore his views rather than yak about my own, but I spent probably half the time on various iterations of a question that I find to be the key one for climate change action advocates: “What is the maximum price you would pay to avoid deleterious effects of climate change?” That exchange starts at about the 20-minute mark.

Paul Krugman. Pot. Kettle. Black.

Paul Krugman has argued in his most recent New York Times column that opponents of the Waxman-Markey energy and environment bill (a.k.a. the “cap-and-trade” bill) are dishonest when they argue that it would be expensive to implement. He starts with wisecracks about climate change deniers; in what I assume is a first for Professor Krugman, he cites the behavior of a corporation as positive evidence for the moral worth of his position; and he quotes Joe Romm’s blog complaining that opponents of cap-and-trade are constantly changing their analysis in order to support pre-determined conclusions – which for anybody involved in this debate qualifies as the only really good laugh line in the piece.

When Professor Krugman eventually gets around to addressing substance, here is his argument:

[T]he best available economic analyses suggest that even deep cuts in greenhouse gas emissions would impose only modest costs on the average family. Earlier this month, the Congressional Budget Office released an analysis of the effects of Waxman-Markey, concluding that in 2020 the bill would cost the average family only $160 a year, or 0.2 percent of income. That’s roughly the cost of a postage stamp a day.

By 2050, when the emissions limit would be much tighter, the burden would rise to 1.2 percent of income. But the budget office also predicts that real G.D.P. will be about two-and-a-half times larger in 2050 than it is today, so that G.D.P. per person will rise by about 80 percent. The cost of climate protection would barely make a dent in that growth. And all of this, of course, ignores the benefits of limiting global warming.

Professor Krugman starts here by repeating the already-hackneyed political talking point that over the next decade Waxman-Markey is projected to have the same cost as “a postage stamp a day”. He then, to his credit, proceeds to consider its projected costs by 2050, which is crucial because emissions mitigation would need to be sustained for many, many decades in order to achieve its desired climate effects. He waves his hand at a projected cost of about 1% of income. But 1% of U.S. income is an enormous amount of money. Suppose I proposed some government program, and told you that it would cost “only” $150 billion per year, every year, for more than a hundred years, and then told you that this was no big deal because it’s only about 1% of the economy? I mean, it would “barely make a dent”. His argument is absurd.
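To make the scale concrete, here is a back-of-envelope check of both framings. The GDP and household counts are my own rough ballpark assumptions (roughly $15 trillion of U.S. GDP and about 115 million households), not CBO inputs:

```python
# Back-of-envelope check of the cost framings above. The GDP and
# household figures are rough ballpark assumptions, not CBO inputs.
US_GDP = 15e12        # dollars per year, rough figure
HOUSEHOLDS = 115e6    # rough count of U.S. households

# CBO's 2020 estimate quoted by Krugman: $160 per family per year.
per_family_2020 = 160
cost_per_day = per_family_2020 / 365          # ~ $0.44, "a postage stamp a day"

# The ~1%-of-income burden by 2050, applied to an economy of this size:
annual_cost = 0.01 * US_GDP                   # ~ $150 billion per year
per_household = annual_cost / HOUSEHOLDS      # ~ $1,300 per household per year

print(f"${cost_per_day:.2f} per family per day in 2020")
print(f"${annual_cost / 1e9:.0f} billion per year at 1% of GDP")
print(f"${per_household:,.0f} per household per year")
```

The same number looks trivial framed as a postage stamp and enormous framed as an aggregate annual bill, which is the point of the paragraph above.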

Of course, if there were a persuasive case that it would create benefits that would more than offset this cost, it would be rational to support it. What is his argument about the benefits? “And all of this, of course, ignores the benefits of limiting global warming.” That’s it? Professor Krugman has a Nobel Prize in economics – he’s got to be able to do better than that. Why doesn’t he make any attempt to justify the costs? As I’ve argued at length, even using assumptions that are extremely favorable to the bill, the expected costs of Waxman-Markey are at least ten times larger than the expected benefits to U.S. taxpayers.

Professor Krugman then concludes his column with seven paragraphs that chastise his ideological opponents for lacking fair-mindedness and intellectual rigor.

The Socialism Implicit in the Social Cost of Carbon

Burning fossil fuels creates so-called “external costs” because it contributes to ongoing climate change. This is a fancy way of saying that when I burn such fuels, other people become worse off than they would be otherwise, because I have increased the odds that they will suffer damages from anthropogenic global warming (AGW). This both seems unfair, and means that we will burn more fossil fuels than would seem to be socially optimal. It seems obvious to many people that we should therefore tax fossil fuels in order to prevent this. This is termed a Pigovian tax, and is sometimes referred to as “internalizing the externality”, or taxing fossil fuels to reflect the “social cost of carbon”.

It’s not so obvious to me that this is a good idea. To implement it would be little more than a re-labeling of the kind of comprehensive planning that Hayek attacked sixty years ago.

Over at the Daily Dish I try to explain why.

Has Global Warming Stopped?, ctd.

I blame myself for what I consider to be the pretty disappointing responses to my prior post (especially the normally excellent TAS comboxes). The fact that so many people have reacted to things I wasn’t trying to say indicates that the communication failure is mine. So let me try to be clear about what I was actually trying to say.

When confronted with objections to an apparent scientific consensus, one valid approach is simply to assemble a wide variety of relevant scientists, ensure that the questions posed to them are technical questions within their scope of competence, and rely on their findings. This is the basic idea behind the UN IPCC, and the AGW reports of various national scientific academies. This has been my approach in the case of AGW, where I have always taken the technical findings of the IPCC as the starting point for any policy analysis on this topic.

George Will (or at least the view that was reasonably imputed to him by his interlocutors) questions the validity of the scientific consensus on AGW. His interlocutors, instead of just relying on the IPCC process, tried to engage the substance of George Will’s quasi-scientific objection. They responded by saying that he has not looked at a long enough trend line.

The primary point of my post was that while I agree that George Will’s (implicit) attempted falsification of AGW theory is not compelling, neither is the logic used by his interlocutors. Both logics share a common source of failure: looking for an underlying “trend” in the temperature record independent of physical causality. There is no magic “trend”, but instead a set of causal effects based on physical interactions that drive temperature. The scientific assertion made by the global climate science community is that we have built models that allow us to understand these effects with sufficient precision to make useful forward predictions. When evaluating this assertion, then, the relevant standard is not “Is the rate of warming slowing or accelerating?”, but rather “How accurately are our models predicting the rate of warming?”. That is, we should rationally care about deviation from prediction, not deviation from trend. This is why I described George Will’s (implicit) method for addressing the certainty of our scientific knowledge as “misguided” – which, in addition to explicitly disagreeing with his conclusions, seems like a funny way of defending him.

The secondary point of the post was that a component of any well-structured prediction modeling process is to have model evaluation groups separate from the model-building teams that have different incentives and reporting structures, roughly analogous to a QA team for software development or fact-checkers at a magazine. One key task of such model evaluation teams is typically to escrow copies of code used to make predictions, log forward predictions made at time X for some outcome after time X, then run the code at the time of the predicted event with actual data entered for all inputs other than the asserted causal factor, and compare the resulting model output to actual outcomes. This is done across a range of predictions to create distributions of model error. While there have been some kludgey, one-off attempts to do something like this for the Hansen 1988 testimony, and a group has tried to look at single-year predictiveness of global climate models, there is nothing like a structured program in place to do this for climate models. Such approaches are always imperfect – and I tried to point out in the post some of the reasons that this would be especially problematic in the case of global climate models – but it would still provide a far better basis for the discussion of prediction adequacy than we have now.
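To make the escrow-and-score process concrete, here is a minimal sketch of what such a program might look like. Every name, model version and anomaly value in it is a hypothetical illustration – as noted above, no real archive of this kind exists for climate models:

```python
# Hypothetical sketch of a prediction-escrow registry. All model
# names and temperature values are invented for illustration.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class EscrowedPrediction:
    model_version: str
    made_in_year: int
    target_year: int
    predicted_anomaly_c: float  # predicted temperature anomaly, deg C

@dataclass
class EscrowRegistry:
    predictions: list = field(default_factory=list)

    def log(self, prediction: EscrowedPrediction) -> None:
        """Record a forward prediction at the time it is made."""
        self.predictions.append(prediction)

    def score(self, observed: dict) -> list:
        """Once target years have passed, compare each escrowed
        prediction to the observed anomaly and return the errors."""
        return [
            p.predicted_anomaly_c - observed[p.target_year]
            for p in self.predictions
            if p.target_year in observed
        ]

registry = EscrowRegistry()
registry.log(EscrowedPrediction("model-1988", 1988, 2000, 0.45))
registry.log(EscrowedPrediction("model-1995", 1995, 2005, 0.60))

observed = {2000: 0.40, 2005: 0.65}  # hypothetical observed anomalies
errors = registry.score(observed)
print(f"mean model error: {mean(errors):+.2f} C")
```

The interesting output is the distribution of errors across many logged predictions, not any single comparison – the “distributions of model error” the evaluation team would maintain.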

What’s especially ironic about a lot of the commentary on the post is that many people take assertions of uncertainty in climate forecasts as undercutting the case for emissions mitigation, so those on the Right argue for uncertainty, and those on the Left argue the opposite. In the sophisticated AGW debate, the economic justification for mitigation is seen, conceptually, as an insurance premium. If the expected warming takes place with the expected effects, it is very difficult to justify the economic costs of mitigation; the real rationale is that mitigation is a hedge against much-worse-than-expected effects. Therefore, the greater the uncertainty in climate prediction, the stronger the case for mitigation – uncertainty is not our friend. So before you accuse me of intellectual dishonesty, recognize that in pointing out limitations in the current practice of climate model validation, I am actually arguing a point that cuts against my stated policy preference.
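The insurance logic can be shown with a toy expected-value calculation. All probabilities and damage figures below are invented purely for illustration:

```python
# Toy expected-value illustration of mitigation-as-insurance.
# All probabilities and damage figures are invented for illustration.
scenarios = [
    # (name, probability, damage as % of GDP)
    ("expected warming",  0.95,  2.0),
    ("catastrophic tail", 0.05, 30.0),
]
expected_damage = sum(p * d for _, p, d in scenarios)
tail_share = (0.05 * 30.0) / expected_damage

print(f"Expected damage: {expected_damage:.1f}% of GDP")
print(f"Share contributed by the tail: {tail_share:.0%}")
```

Even a low-probability tail dominates the expected-value calculation, which is why widening the uncertainty strengthens, rather than weakens, the insurance case.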

Has Global Warming Stopped?

Ezra Klein, Kevin Drum and Ryan Avent all have posts up that attack George Will’s statement that “If you’re 29, there has been no global warming for your entire adult life”. Kevin Drum describes this as “idiotic”, Ryan as “moronic”, and Ezra responds, of course, with a chart. So does the always-numerate Kevin Drum, and I’ll use his version of the chart:

The funny thing is that if you zoom in on about the last ten years, you see this:

There has not been a lot of measured warming for the last ten years.

It’s hard to dispute this. What Ezra, Kevin and Ryan are arguing is idiotic, moronic or whatever is the notion that the past ten years of data disproves the theory of AGW. Their basic argument is “sure, but look at the long-term trend”. I agree with them about the conclusion that the last ten years of raw data don’t falsify the theory (and have argued this at many times in many places), but I’m not sure any of them have thought through this question fully.

If I observe that it is cooler in New York today than yesterday, no reasonable person would take that as proof that AGW theory is wrong. On the other hand, if we had rapid growth of human population and rapid fossil-fuel-dependent economic development for the next 1,000 years with no increase in surface temperatures, every reasonable person would conclude that AGW in anything like its current form had been disproven. The question is: at what point between one day and 1,000 years do I have enough evidence to reasonably reject the theory? It seems to me that you need a rational standard to answer this question before you simply call ten years “moronic” a priori.

In fact, it’s more complicated than that. If we had no warming over the past ten years (true) and lots more CO2 in the air (true) but also a huge increase in volcanic activity (not true, but posited as an illustration), this would not be evidence that AGW theory was untrue, because the models used to predict warming would have called for no warming – all the particulate matter thrown up by the volcanoes should have offset the effect of the CO2. So what we are really looking for is the degree of divergence between the predictions of the models used as the basis for long-run warming forecasts and actual temperatures, in order to falsify or corroborate the operational theory that we can predict future long-run temperature impacts attributable to CO2 emissions. The rigorous version of the question, then, is: what is a valid falsification period for AGW models?

So, naturally, we just go to the escrowed set of AGW models with their predictions made over the past 20 years or so, enter in all data for actual emissions, volcanic activity and other model inputs from the time each prediction was made until today, and then run the models and compare their outputs to actual temperature change in order to build a distribution of model accuracy, right? Ha ha. Needless to say, no such repository exists.

Almost all humans resist management and audit, and climate modelers are no exception. Because they have been so poorly managed, we have no well-structured program to evaluate accuracy, and instead must rely only on back-testing (or what climate modelers term “hindcasting”). Now, this would be hard to do, for several reasons: the models (we believe) keep improving, so the accuracy of a 1988 model doesn’t necessarily tell us the accuracy of a 2008 model; the signal-to-noise ratio is poor, so it requires several decades (we believe) to get a useful measure of accuracy, while we are being asked to answer policy questions now; and so on.

But the instincts of those who are grasping for some way to hold the tools used to make temperature predictions accountable to reality in some way are sound, even if their method is somewhat misguided. They aren’t idiots or morons, they’re just not specialists, and the government they pay for, which in turn funds the model construction project, hasn’t bothered to do its job and provide the best feasible measurements of the value of these models.

Models, Models Everywhere And No One Stops to Think

Lots of bloggers have attacked my cost-benefit analysis of Waxman-Markey. The current line of attack seems to be that cost-benefit analysis is just the wrong way to think about this problem. There are, as usual, various intertwined logics for why this might be so. They all seem to center around the idea, however, that while climate science can make reasonably reliable predictions a century or two out, economic projections over this time scale are basically worthless.

While this exaggerates the reliability of climate models, I agree that they are more useful than very long-term economics models. But let’s assume arguendo that my critics are correct in the extreme, and therefore we have no ability to translate a climate forecast into an estimate of economic damages. Then, they say, we know some kind of catastrophe is coming, and it’s our duty to head it off.

There are at least a couple of huge problems with this argument.

First, all estimates of the climate impact of human-induced CO2 emissions rely on a long-term emissions forecast, which in turn relies on (i) forecasts of population growth, (ii) forecasts of economic growth per capita, (iii) forecasts for the energy intensity of economic output per capita, and (iv) technology forecasts for the carbon-intensity of each unit of economic output. That is, we can’t make a long-range climate forecast in the absence of long-range economic forecasts.
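This chain of forecasts is, roughly, the well-known Kaya identity: emissions = population × GDP per person × energy per unit of output × carbon per unit of energy. A toy version, with every number invented purely for illustration:

```python
# Kaya-style decomposition of an emissions forecast. Every number
# below is invented for illustration, not an actual projection.
population       = 9e9       # people
gdp_per_capita   = 30_000    # dollars of output per person per year
energy_intensity = 5e6       # joules of energy per dollar of output
carbon_intensity = 15e-12    # tonnes of CO2 per joule of energy

emissions_t = (population * gdp_per_capita
               * energy_intensity * carbon_intensity)  # tonnes CO2 / year
print(f"{emissions_t / 1e9:.1f} Gt CO2 per year")

# Halving any one factor halves emissions, holding the others fixed -
# which is why the economic terms matter as much as the technology term.
```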

The differences in emissions across economic scenarios are not trivial. They are the basis for the UN IPCC’s scenario-based forecasting approach, which produces estimated temperature impacts by 2100 that are about three times larger for the highest-emissions plausible scenario than for the lowest-emissions plausible scenario.

Second, if we assume that we have literally no technical capability to translate a temperature forecast into a forecast of damages, then we are forced to rely on intuition. A 3°C increase in temperature by 2100 sure doesn’t sound so awful to me. Want to argue that I’m misguided, and point to a long list of awful things that will happen? We’re right back to estimating damages, and I’ll just point to what the IPCC, CBO, EPA and so forth estimate when they try to do comprehensive estimates of net impacts.

Look, I get the point that trying to forecast what our GDP will be in 2136 to within a few percent is ridiculous. It is. Further, I get the point that indefinite accumulations of CO2 in the atmosphere will eventually become very damaging. I also get the point that there is some risk that we might reach that point sooner than we think. These are all true statements.

But if they are to inform rational decision-making, they also require quantitative, not rhetorical, interpretation. When do we expect that CO2 will be how big a problem? How big is the risk that it will be worse than that? And so on. While we may legitimately criticize a specific cost-benefit analysis or methodology, it seems hard to imagine a rational approach to such decision-making that doesn’t try to envision the future world under alternative policy assumptions and assert a preference.

So, show us your alternative forecasts, and provide sources and methodologies. If you choose to respond with a bunch of words describing how awful things could look, or wave your hands, you’re still making a forecast – it’s just not of any real use. False precision is one way to evidence an unwarranted assertion of certainty, but so is simply asserting, without providing the evidence, that the damages we should expect within some finite time outweigh the costs of some proposed program to avert them.

The forecasts of every responsible body, as I have gone to such boring lengths to show over many articles, actually make it very hard to justify any so-far proposed carbon pricing or rationing schemes based on the benefits we should expect them to produce over the next roughly 100 years. (I think the fact that my critics are mostly attacking the idea of cost-benefit analysis itself, rather than my quantitative arguments, is pretty good evidence of this.) Eventually, of course, if we assume linear extrapolation of current trends, CO2 will become a deadly problem; but I think if you’re honest, you’ll find yourself having to justify these programs based on things that you project will happen in the 22nd century and beyond. Who’s being arrogant about predicting the future now?

I have very little idea what the technological, social and political bases for the human economy will be hundreds of years from now, and think that trying to manage such a problem by changing carbon pricing today is foolish in the extreme.

Money is Not the Measure of All Things

I think there is a compelling case that if we were to use the best estimates from the IPCC and similar technical bodies for the most likely impacts of Anthropogenic Global Warming (AGW) on average global GDP over the next century, then proposed programs of emissions mitigation are not economically justified. Many very smart bloggers have made the point that using average global GDP as our only metric to evaluate the relative attractiveness of potential future outcomes misses a lot of what should be important to us.

I believe that there are at least two intertwined strands to these objections:

1. Average GDP misses a lot. We could wipe out the GDP of many poor countries, and still only have a small impact on global GDP, and it doesn’t seem fair to consider lowering U.S. GDP by a fraction of 1% on one hand, and entirely eliminating the country of, say, Bangladesh on the other, as equally bad in some important moral sense.

2. GDP misses a lot. There are many things that we care about that are not captured in GDP statistics, such as human health or suffering, maintaining traditional ways of life, aesthetic beauty and so on.

I’ll try to address these one at a time. I’ll rely heavily on papers by Indur Goklany in which he integrates multiple analyses, predominantly from the IPCC and the UK government.

1. Relative economic impacts on the developing vs. the developed world

There is some trade-off between economic growth and mitigation of AGW damages, at least in the short-term. Mitigation advocates often correctly point out that the global poor will be disproportionately affected by AGW damages, but it is also the case that they will be disproportionately affected by reductions in global economic growth. An empirical question is the relative size of these two effects.

Consider Goklany’s review of research that compares the change in climate and wealth under various UN IPCC scenarios for development over this century. I’ll show the two extreme scenarios to make a point: A1FI (the IPCC scenario for global development that is most heavily dependent on fossil fuels, with a projected increase in global temperature of about 4°C by the end of the century), and B1 (the scenario that assumes the greatest deployment of alternative technologies, with a projected increase of about 2°C by the end of the century). Here are the projections for each scenario, for the developed and then the developing world:

Developed Countries Projected GDP / Capita in 2100:

A1FI: $107,300; B1: $72,800

Developing Countries Projected GDP / Capita in 2100:

A1FI: $66,500; B1: $40,200

In other words, at least through the next hundred years, the average person living in the developing world is better off in money terms with more economic development and more AGW damage, on net. A lot better off, in fact: $66,500 is more than 65% higher than $40,200.
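A quick check of that comparison, using the developing-world figures quoted above from Goklany’s review:

```python
# Verifying the developing-world comparison using the per-capita GDP
# projections for 2100 quoted above from Goklany's review.
developing_2100 = {"A1FI": 66_500, "B1": 40_200}  # projected GDP per capita, $

pct_higher = (developing_2100["A1FI"] / developing_2100["B1"] - 1) * 100
print(f"A1FI is {pct_higher:.0f}% higher than B1")
```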

I’ll note in passing that by 2100 the average person in the developing world is projected to be at a level of income comparable to the U.S. in 2009.

2. Impacts not directly captured by GDP

I’ll focus first on two items which any reasonable analyst would consider to be important, and for which we have some projections: hunger and water.

Goklany has collated detailed projections of the change in various metrics between a baseline year of 1990 and a projection year of 2085 from the UK Government’s Fast-Track Assessment of global climate change (FTA). This is a 95-year projection; since there should be some acceleration of warming effects, it should be a tolerable estimate of impacts for the highly-overlapping 91-year period from 2009 to 2100.

Here are the results for the projected number of humans at risk of hunger under the A1FI versus B1 scenarios:

Under either scenario, the world should be able to push those at risk for hunger down to 1% – 2% of the world’s population by the end of this century, at any projected level of warming. The realistic risks to this are war, other political action, or threats to a world of interdependent trade and economic growth. The impact of global temperature change is rounding error in comparison.

Here are the results for humans at risk of water stress:

Because wealth allows us to insulate ourselves from environmental risks, a warmer but richer world is projected to be better off on this metric.

Here are some other metrics. The percentage of the world’s population that is at risk for coastal flooding is well under 1% in the baseline, and is not projected to rise close to 1% in any scenario within the 95-year forecast. Malaria deaths have historically been in effect eliminated by societies that achieve several thousand dollars per year of per capita income – the key risk here is once again slower economic growth that keeps parts of the developing world poorer longer.

Again and again, we see the same pattern: at least for the next century, changes in human welfare, even on metrics that are not purely economic, are fundamentally driven by changes in economic development, not AGW damages. This is why it makes sense to be focused acutely on risks to economic growth when considering the overall effects of any emissions mitigation program.
