
Error bars

August 8th, 2003

This report in the Guardian cites leading leftwing thinktank the Institute for Public Policy Research as saying that, according to the latest research at the Hadley Centre for Climate Change, the likely change in temperature by 2100 (under business as usual) will be 8 degrees C, as against ‘consensus’ estimates ranging from 1 to 5 degrees C. The argument is apparently based on claims that CO2 stored in the soil will be released with rising temperatures, producing a positive feedback.

I haven’t followed this up, and it seems surprising that such an obvious mechanism should have been overlooked. So I’m not suggesting that this report should be regarded as reliable. Rather, I want to use this report to illustrate a point I’ve made previously.

A lot of critics of Kyoto argue that, since there’s a lot of uncertainty about the estimates produced by the Intergovernmental Panel on Climate Change and similar bodies, we should ‘wait and see’. These critics tend to pounce on any study that produces an estimate lower than the consensus range to bolster their case.

This neglects the fact that uncertainty goes both ways. There’s a nonzero probability that the rate of warming could be lower than the range suggested by the best available estimates, but it’s equally possible that the rate could be higher. The best available estimates suggest we should do something now (Kyoto) and prepare to do a lot more in the future unless we get a favorable surprise.

In the case of global warming, uncertainty actually strengthens the case for action, because the damage costs are convex. That is, an increase of 4 degrees will do more than twice as much damage as an increase of 2 degrees, and an increase of 8 degrees (the IPPR estimate cited above) would be utterly catastrophic. So, the more uncertainty there is, the stronger the case for action.
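The effect of convexity can be sketched numerically. The quadratic damage function and the temperature figures below are illustrative assumptions, not actual damage estimates; the point is only that, for any convex damage function, a spread of outcomes around the same mean warming yields higher expected damage than the mean outcome alone (Jensen’s inequality):

```python
# Toy illustration: with a convex damage function, uncertainty about warming
# raises expected damage even when the mean warming is unchanged.
# The quadratic form and the numbers are assumptions for illustration only.

def damage(temp_rise):
    """Hypothetical convex damage function: damage grows with the square
    of the temperature increase (in degrees C)."""
    return temp_rise ** 2

# Two scenarios, both with mean warming of 3 degrees C:
certain = [3.0]          # warming known exactly
uncertain = [1.0, 5.0]   # equal odds of a low or a high outcome

expected_certain = sum(damage(t) for t in certain) / len(certain)
expected_uncertain = sum(damage(t) for t in uncertain) / len(uncertain)

print(expected_certain)    # 9.0
print(expected_uncertain)  # 13.0 -- higher, despite the same mean warming
```

Because the damage function curves upward, the extra damage from the 5-degree outcome outweighs the damage avoided in the 1-degree outcome, which is why wider error bars argue for more action, not less.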

When there’s a lot of uncertainty, the important thing is not so much immediate action to reduce emissions as the creation of institutions and mechanisms that will allow large reductions to be made in future. With all its imperfections, the Kyoto agreement is the only process that offers any possibility of progress in this respect.

Categories: Environment
  1. Uncle Milton
    August 8th, 2003 at 10:25 | #1


    Your argument assumes that the current ‘best estimates’ are unbiased, in the statistical sense, so that it is ‘equally possible’ that the rate of warming could be higher rather than lower than this best estimate.

    I’m not saying the estimates are biased (I don’t know), but this is an important qualification to your argument. I realise that convex damage costs could undo the force of this qualification, but it all depends on the degree of the convexity compared to the size of the possible bias.

  2. August 8th, 2003 at 14:05 | #2

    John, I think that the big problem with this post is that you have read too much on global warming.

    Having read the comments at Ken Parish’s latest post on global warming, it appears that one’s ability to evaluate global warming is inversely proportional to one’s knowledge of the subject*. Or as 24601 states, “it is not uncommon for people closely involved in an issue to [be] inclined towards over-estimating it as an issue”.

    Perhaps it would be best to leave future policy to those who have read a couple of webpages, and skimmed through The Skeptical Environmentalist.

    * An exception to this rule arises when a global warming skeptic is willing to make a minor concession, such as: there will be warming, but it’s so small we don’t have to worry about it.

  3. rdb
    August 8th, 2003 at 18:05 | #3

    Recently in the news:
    NYTimes 07-08 Salt of the Earth, Paul Krugman

    NYTimes 07-07 Suit Challenges Climate Change Report by U.S.
    An antiregulatory group sued the Bush administration yesterday in an effort to force the government to stop distributing a report on climate change that the group contends is inaccurate and biased.

    The suit says the continued use of the report, which was published in 2000, violates the Federal Data Quality Act, a law enacted that year that requires information disseminated by the government to pass standards for objectivity, quality, and utility – meaning the data are reliable enough to be used by the public.

    The triumph of fringe science
    Global warming naysayers argue that we don’t need to do anything to stop rising temperatures. Mainstream scientists used to be able to ignore them, but now they make White House policy.

    By Katharine Mieszkowski

  4. Jim Birch
    August 8th, 2003 at 19:13 | #4

    What a great law!

    Incidentally, it seems to me that the methodology of the numerical models used to make these estimates has a tendency to produce conservative estimates:

    Basically, the models take the complexities of the earth’s carbon processes, break them up into components, model the components, and combine the results. The component models always rely on simplifying parameterizations of complex, manifold, heterogeneous physical processes. These are simplifications that are deemed to be representative “enough” of what’s really going on out there.

    When is a parameterization considered OK? When it produces results that match what actually happens. When does it need more work, tweaking or even fudging? When it doesn’t. The net result will be to pick up problems with model components that respond too strongly, but to apply less analytical effort to parameterizations that produce sluggish responses to inputs.

    Ideally, good scientific analysis and judgement would eliminate or minimize this bias, but I would expect that overall the process would tend to produce somewhat conservative models.

    The psychology of the commentary is another matter.

Comments are closed.