Sensitivity analysis

One of the points on which economists generally agree is that sensitivity analysis is a good thing. Broadly speaking, this means varying the (putatively) crucial parameters of a model and seeing what happens. If the results change a lot, the parameter justifies a closer look.

In the case of the Stern Review of the economics of global warming, sensitivity analysis quickly reveals that the crucial parameter is the pure rate of time preference. This is the extent to which we choose to discount future costs and benefits simply because they are in the future and (if they are far enough in the future) happening to different people and not ourselves. If, like Stern, you choose a value near zero (just enough to account for the possibility that there will be no one around in the future, or at least no one in a position to care about our current choices on global warming), you reach the conclusion that immediate action to fix global warming is justified. If, like most of Stern’s critics, you choose a rate of pure time preference of 3 per cent, implying that the welfare of people 90 years (roughly three generations) in the future counts for about one-fourteenth as much as the welfare of people alive today, you conclude that we should leave the problem to future generations.
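To spell out the arithmetic: with a pure rate of time preference $\rho$, welfare $t$ years ahead is discounted by the factor $(1+\rho)^{-t}$, so at 3 per cent over 90 years

$$(1.03)^{-90} \approx 0.07 \approx \frac{1}{14}.$$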

So, responses to the Stern Review provide another kind of sensitivity analysis. If you don’t care (much) about future generations, you shouldn’t do anything (much) about global warming.

Robson reverses reality

It turns out that Tim Blair’s post, linked below, is based on this piece by Alex Robson (a former colleague of mine and also briefly a blogger), who argues from claimed prediction errors by “leftist economists” (he’s kind enough not to mention me by name) that we shouldn’t trust climate science. His crucial point is that, whereas we predicted that the 1996 budget cuts would cause increased unemployment:

Howard did indeed reduce overall spending in real terms in the two years following the 1996 election, and he cut the size of the commonwealth public service in each year between 1996 and 2000. But following Howard’s “savage” budget cuts, the unemployment rate did not rise; it fell, from 8 per cent in 1996 to 6 per cent in 2000.

Unfortunately, Alex is being a little economical with the information he reports here. As can be seen below (courtesy of Economagic), unemployment did in fact rise following the 1996 budget cuts. It wasn’t as bad as it might have been (and was in New Zealand, where monetary policy was messed up as well), but the facts are clear: contra Robson, unemployment did not fall, it rose.

The unemployment rate didn’t clearly resume its earlier decline until the shift to an expansionary fiscal and monetary policy during the Howard government’s second term.

Another piece of misdirection is the reference to cuts in the Commonwealth public service, largely driven by contracting out. This is a red herring, apparently designed to distract attention from the more relevant variable, public expenditure.


Blair concedes defeat

Tim Blair quotes a statement that I and others wrote in 1996, criticising expenditure cuts and saying, in part:

More attention needs to be given to the role of government expenditure on repairing the nation’s rundown infrastructure, creating jobs and fostering industry and regional development. If necessary, increased taxation and other revenue options should be under consideration. Savage expenditure cuts are economically irresponsible and socially damaging.

As Blair points out^, this is an argument that has now been pretty generally accepted. Most of the cuts we were criticising have been reversed (not without doing damage along the way). Infrastructure spending is now a high priority for governments. Without getting into sterile arguments about whether or not the current Federal government is the highest-taxing in Australian history, it’s clear that the idea of radical cuts in public expenditure and taxation, which Blair has long advocated, is politically defunct in Australia.

The case was well stated by one of our political leaders in 2004, when he observed:

There is a desire on the part of the community for an investment in infrastructure and human resources and I think there has been a shift in attitude in the community on this, even among the most ardent economic rationalists.

He could just about have been quoting our words from 1996.

As I noted at the time:

A new bipartisan consensus has emerged, in favour of the social-democratic policies that have, until recently, been derisively described as ‘tax and spend’.

The only surprise is that it has taken Blair so long to wake up to the fact that he’s on the losing side of this debate.

^ With yet another kind reference to my success in winning a Federation Fellowship.

The equity premium and the Stern Review

Brad DeLong carries on the discussion about discounting and the Stern Review, responding to a critique by Partha Dasgupta that has already been the subject of heated discussion. As Brad says, all Dasgupta’s assumptions are reasonable, and his formal analysis is correct:

But … The problem I see lies in a perfect storm of interactions:

This brings me to one of my favourite subjects: the equity premium puzzle and its implications, in this case for the Stern Review. I’ll try and explain in some detail over the page but, for those who prefer it, I’ll self-apply the DD condenser and report:

Shorter JQ: It’s OK to use the real bond rate for discounting while maintaining high sensitivity to risk and inequality.

Future of the Family Farm

My piece in Thursday’s Fin was about claims that the family farm is doomed. This is one of those notions that seems impossible to kill: having been around for decades, it looks as if it will outlast me. It seems to appeal to just about everybody. Sometimes it’s pushed by farmers who want more government aid. At the moment, though, it’s mainly being run by economic rationalists who want to sweep away these small and allegedly inefficient operations.


Close to zero?

In yet another round of the controversy over discounting in the Stern Review, Megan McArdle refers to Stern’s use of “a zero or very-near-zero discount rate”. Similarly, Bjorn Lomborg refers to the discount rate as “extremely low”, and Arnold Kling complains that it’s a below-market rate.

So what is the discount rate we are talking about? Stern doesn’t pick a fixed rate, but rather picks parameters that determine the discount rate in a given projection. The relevant parameters are the pure rate of time preference (delta), which Stern sets equal to 0.1 per cent, and the intertemporal elasticity of substitution (eta), which Stern sets equal to 1. The important parameter is eta, which reflects the fact that, since people in the future will mostly be richer than us, additional consumption in the future is worth less than additional consumption now.

Given eta = 1, the discount rate is equal to the rate of growth of consumption per person, plus 0.1 percentage points. A reasonable estimate for the growth rate is 2 per cent, so Stern would have a real discount rate of 2.1 per cent. Allowing for 2.5 per cent inflation, that’s equal to a nominal rate of 4.6 per cent. The US 10-year bond rate, probably the most directly comparable market rate, is currently 4.44 per cent, a bit above its long-run average in real terms. So, Stern’s approach produces a discount rate a little above the real bond rate.
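For reference, the rule being applied here is the standard Ramsey formula, relating the consumption discount rate $r$ to the pure rate of time preference $\delta$, the elasticity parameter $\eta$ and the growth rate of consumption per person $g$:

$$ r = \delta + \eta g = 0.1\% + 1 \times 2\% = 2.1\% \text{ (real)} \approx 4.6\% \text{ (nominal, adding 2.5\% inflation)}. $$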

Arguments about discounting are unlikely to be settled any time soon. There’s a strong case for using bond rates as the basis for discounting the future. There are also strong arguments against, largely depending on how you adjust for risk. But to refer to the US bond rate as “near-zero” or “extremely low” seems implausible, and to say it’s below-market is a contradiction in terms. It seems as if these writers have confused the discount rate with the rate of pure time preference.

We’ll all be rooned

Today’s Courier-Mail has a report pushing the Beattie Government’s plans for new dams, and threatening financial ruin if they aren’t built. Crucial quote:

As its efforts to win approval for the controversial Traveston Crossing Dam in the Mary River Valley move into top gear, the Government has used a consultant’s report on possible economic losses to the region to push its case for the project.

The lack of new water sources could end up costing southeast Queensland at least $55 billion and perhaps as much as $110 billion by 2020, according to the consultants ACIL Tasman.

Even before this episode, the name ACIL Tasman wasn’t one that filled me with confidence. All consultants like to produce reports that support their client’s preferred position, and my experience of ACIL Tasman is that its approach to producing this outcome is “whatever it takes”.

I haven’t been able to find the report yet, but the numbers seem way off-beam to me. This report says that the total revenue for SEQ water and sewerage businesses was about $1.4 billion in 2005/06, growing at about 6 per cent a year. ACIL Tasman wants us to believe that limits on additional supplies could cost between $5 billion and $10 billion a year (the headline figures of $55 billion and $110 billion by 2020, spread over the intervening years).

I find this implausible, at least as an economically meaningful cost estimate. A doubling of water prices would be enough to reduce demand significantly over time (even allowing for underlying growth in population and income), and to make all sorts of supply options, such as desalination, economically feasible, without any need for new dams. The welfare cost of this would be around $0.5 billion a year (I’ll do a proper check on this number later). So, I’d say ACIL Tasman is out by a factor of 10 to 20.
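For what it’s worth, here’s a back-of-the-envelope version of that number, using my own illustrative assumptions rather than anything from the report. With current revenue $PQ \approx \$1.4$ billion a year, suppose that doubling the price eventually halves demand relative to trend (an elasticity of roughly $-0.5$). The standard welfare-triangle approximation then gives

$$ \Delta W \approx \tfrac{1}{2}\,\Delta P\,\Delta Q = \tfrac{1}{2} \times 1.0 \times 0.5 \times \$1.4\text{bn} \approx \$0.35\text{bn} $$

a year, which is in the region of $0.5 billion.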

I haven’t seen enough information to determine whether the proposed dams pass the cost-benefit test. But this report makes me think the case must be pretty weak.

The Stern Review and the long tail

My first post on the Stern review started with the observation that

the apocalyptic numbers that have dominated early reporting represent the worst-case outcomes for 2100 under business-as-usual policies.

Unfortunately, a lot of responses to the review have been characterised by a failure to understand this point correctly. On the one hand, quite a lot of the popular response has reflected an assumption that these worst-case outcomes are certain (at least in the absence of radical changes in lifestyles and the economy) and that they are going to happen Real Soon Now. On the other hand, quite a few critics of the Review have argued that, since these are low-probability worst cases, we should ignore them.*

But with nonlinear (more precisely, strongly convex) damage functions, low-probability events can make a big difference to benefit-cost calculations. Suppose, as an illustration, that under BAU there is a 5 per cent probability of outcomes with damage equal to 20 per cent of GDP or more, and that with stabilisation of CO2 emissions this probability falls to zero. Then this component of the probability distribution alone gives a lower bound for the benefits of stabilisation of 1 per cent of GDP (more when risk aversion is taken into account). That exceeds Stern’s cost estimates, without even looking at the other 95 per cent of the distribution.
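The arithmetic behind that lower bound is just the expected value of the tail:

$$ 0.05 \times 20\%\ \text{of GDP} = 1\%\ \text{of GDP}. $$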

An important implication is that any reasoning based on picking a most likely projection, and ignoring uncertainty around that prediction, is likely to be badly wrong, and to understate the likely costs of climate change. Since the distributions are analytically intractable, the best approach, adopted by the Stern Review, is to take an average over a large number of randomly generated draws from the distribution (the Monte Carlo approach).
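To illustrate the difference this makes, here’s a minimal sketch in code, with made-up numbers rather than the Stern Review’s actual model: a right-skewed distribution of warming outcomes, a convex damage function, and a comparison between the Monte Carlo average and a point estimate taken at the median projection.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical right-skewed distribution of warming outcomes (degrees C);
# the parameters are arbitrary, chosen only for illustration.
warming = rng.lognormal(mean=1.0, sigma=0.4, size=100_000)

# A convex damage function: losses (as a share of GDP) rise with the
# square of warming. Again, the coefficient is arbitrary.
def damage(t):
    return 0.005 * t ** 2

# Monte Carlo estimate: average damage over the whole distribution.
mc_estimate = damage(warming).mean()

# "Best guess" approach: damage evaluated at the median projection only.
point_estimate = damage(np.median(warming))

print(f"Monte Carlo mean damage:     {mc_estimate:.2%} of GDP")
print(f"Damage at median projection: {point_estimate:.2%} of GDP")
```

Because the damage function is convex, the Monte Carlo mean exceeds the point estimate (Jensen’s inequality), so ignoring the spread of outcomes understates expected costs.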

To sum up, the suggestion that, because bad outcomes are improbable, we should ignore them is wrong. If it were right, insurance companies would be out of business (not coincidentally, insurance companies were the first sector of big business to get behind Kyoto and other climate change initiatives).

Milton Friedman: a brief appreciation

Milton Friedman has died at the age of 94. He made some huge contributions to macroeconomics, notably including his permanent income theory of consumption, which paved the way for the modern life-cycle theory, and his work on expectations and the Phillips curve.

He was also the most effective advocate for free-market policies since Adam Smith. As has been said several times over at Crooked Timber recently, everyone, and particularly everyone with a leftwing view of the world, should read Capitalism and Freedom at least once. As Mill said, beliefs you hold merely because you haven’t been exposed to the strongest possible critique of them aren’t really well-founded. Certainly, my own views were changed in some respects by exposure to Friedman, and where they were not, I was forced to reconsider the basis for my positions.

Friedman was effective in part because he was obviously a person of goodwill. I never had the feeling with him, as with many writers in the free-market line, that he was promoting cynical selfishness, or pushing the interests of business. He genuinely believed that economics was about making people’s lives better and that disagreements among economists were about means rather than ends and could ultimately be resolved by careful attention to the evidence.

Stern on the costs of climate change, Part 1

The standard (expected utility) approach to assessing the cost of climate change is to:
(i) derive a probability distribution for possible rates of climate change under some given projections;
(ii) attach a cost (or benefit) number to each possible outcome, expressed in utility terms;
(iii) calculate the expected utility gain (or loss); and
(iv) express the calculated number as a percentage change in some income aggregate (usually GDP).
A rough sketch of how the four steps fit together appears below.
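Here’s that sketch in code, with made-up numbers and an arbitrary convex damage function; the log utility corresponds to Stern’s choice of eta = 1, but everything else is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Step (i): a probability distribution over warming outcomes under a
# given projection (hypothetical parameters, purely for illustration).
warming = rng.lognormal(mean=1.0, sigma=0.4, size=100_000)  # degrees C

# Step (ii): attach a cost to each outcome, via an arbitrary convex
# damage function giving losses as a share of consumption.
damage_share = np.clip(0.005 * warming ** 2, 0.0, 0.99)

# With eta = 1 the CRRA utility function reduces to log utility.
base_consumption = 1.0
utility = np.log(base_consumption * (1.0 - damage_share))

# Step (iii): expected utility loss relative to a no-damage baseline.
expected_utility_loss = np.log(base_consumption) - utility.mean()

# Step (iv): convert the utility loss back into the equivalent
# percentage reduction in consumption.
equivalent_loss = 1.0 - np.exp(-expected_utility_loss)
print(f"Certainty-equivalent consumption loss: {equivalent_loss:.2%}")
```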

In this post, I’m going to look at step (ii). In most respects, the Stern Review has adopted assumptions that favour strong action to mitigate climate change: relatively optimistic regarding the costs of stabilising CO2 levels, and relatively pessimistic regarding projections of changes in climate. But the cost calculations are conservative, probably because the starting point was the previously published estimates of Mendelsohn, Nordhaus and Tol, which have been way too low.