The end of the coal boom

A bit over a year ago, I put up a post with the same title as this one, except that it ended with a question mark. At that point, most of the authorities I cited took the view that the decline in the world price of steaming coal was just a blip. In fact, prices have kept on falling and are now, in real terms, not much higher than they were in 2004. More importantly, there is now no expectation of a recovery any time soon. The clearest evidence of that is the abandonment or deferral of a string of proposals to create or expand coal export terminals, most recently by BHP at Abbot Point. Investors are desperately trying to get out of the most recently completed project, at Wiggins Island.

A few observations on this

* It’s common for participants in the Australian debate to claim that the rest of the world is going ahead with coal-fired power stations and fossil fuel projects at an unprecedented rate. That was the view that motivated these port expansion projects, and it’s been falsified as clearly as it can be by their abandonment.

* Much of the discussion about climate mitigation is based on the assumption that Australia can decide how much or how little of the burden we should bear. Leaving aside the risks of a free-rider strategy, our status as a coal exporter means that the biggest impacts will arise from decisions made overseas.

* Finally, for some light relief here’s former Queensland Treasurer Andrew Fraser (paywalled) citing the now-abandoned Abbot Point project as evidence of the benefits of the Bligh government’s asset sales program, of which he was the biggest booster. It will be interesting to see if he now changes tack and claims that the state was lucky to get rid of these assets when it could (a more plausible line, but one that is both dubious in itself and contradicts his previous position).

Colin Clark Memorial lecture: National Accounting and the Digital Economy – the Case of the National Broadband Network

I’ll be presenting the Colin Clark Memorial lecture on 14 November.

Colin Clark’s greatest contribution to economics was his pioneering role in the construction of national accounts. In the industrial economy of the 20th century, the central problem in national accounting was the need to avoid double counting, by measuring only the value added at each stage of production. This problem is closely related to that of benefit-cost analysis for public projects. In the 21st century digital economy, value is primarily derived from the flow of information rather than physical inputs and outputs. This creates new problems for national accounting, and for benefit-cost analysis. One example of these problems is the question of how to evaluate alternative proposals for the National Broadband Network.

The talk is bundled with a lunch at Customs House, which (from past experience) will be very pleasant, but fairly expensive, so this event is mostly going to appeal to people whose employers can pay. For those who aren’t in this category, or who aren’t in Brisbane, I’ll post a link to the slides after the event.

The macro foundations of micro (crossposted at Crooked Timber)

Twitter alerted me to an amusing exchange between Chris Auld, posting a list of “18 signs you’re reading bad criticism of economics”, and Unlearning Economics, responding with “18 Signs Economists Haven’t the Foggiest”. UL suggests that Stephen Williamson manages an impressive 9 out of 18 in his review of Zombie Economics (my response here, with more from Noah Smith).

Scoring myself against Chris Auld’s list, I’d say I’m in the clear. But quite a few commenters on Zombie Economics have made complaints along the lines of his point 1, that I focus too much on macroeconomics (and finance). The implication is that, even if macro is totally wrong, only a minority of economists do it, and microeconomists are in the clear.

This defense doesn’t work, at least not in general.

Read More »

Why do we *still* have a Nobel Prize in Economic Sciences?

Ingrid Robeyns at Crooked Timber links to some fascinating discussion from Philip Mirowski of the role of Swedish domestic politics in the establishment of the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel, with emphasis on the way in which claims of “scientific” status for economics helped the claim of the Swedish central bank to independence from government.

In the broader context, it seems pretty clear that, if the idea had arisen even a few years later, it would have been rejected. In 1969, economics really did seem like a progressively developing science in which new discoveries built on old ones. There were some challenges to the dominant Keynesian-neoclassical synthesis but they were either marginalized (Marxists, institutionalists) or appeared to reflect disagreements about parameter values that could fit within the mainstream synthesis.

Only a few years later, all of this was in ruins. The rational expectations revolution sought, with considerable success, to discredit Keynesian macroeconomics, while promising to develop a New Classical model in which macroeconomic fluctuations were explained by Real Business Cycles. This project was a failure, but led to the award of a string of Nobels, before macroeconomists converged on the idea of Dynamic Stochastic General Equilibrium models, which failed miserably in the context of the global financial crisis. The big debate in macro can be phrased as “where did it all go wrong?”. Robert Gordon says 1978, I’ve gone for 1958, while the New Classical position implies that the big mistake was Keynes’ General Theory in 1936.

The failure in finance is even worse, as is illustrated by this year’s awards where Eugene Fama gets a prize for formulating the Efficient Markets Hypothesis and Robert Shiller for his leading role in demolishing it. Microeconomics is in a somewhat better state: the rise of behavioral economics has the promise of improved realism in the description of economic decisions.

Overall, economics is still at a pre-scientific stage, at least, as the idea of science is exemplified by Physics and Chemistry. Economists have made some important discoveries, and a knowledge of economics helps us to understand crucial issues, but there is no agreement on fundamental issues. The result is that prizes are awarded both for “discoveries” and for the refutation of those discoveries.

Market monetarism: a first look

One of the more confusing aspects of the macroeconomic debate is the emergence of a profusion of schools of thought with very similar names, but very different viewpoints. The one I’ve had most to deal with is Modern Monetary Theory. I had a go at this topic here and here. My brief summary is that MMT pretty much coincides with traditional Keynesian views in the context of a liquidity trap, but that I reject the claim commonly made in popular presentations of MMT, that increased government spending doesn’t imply increased taxation.

Then there’s New Monetarism, associated with Stephen Williamson. He and I had a set-to a while back, which entertained many but didn’t produce a lot of enlightenment, and left me disinclined to put a lot of effort into understanding the differences between New and Old Monetarism. (For the record, I’m pretty much an Old Keynesian, but I have learnt a fair bit from New Keynesians like Akerlof and Shiller).

The third entrant is “Market Monetarism”, associated mainly with Scott Sumner (though Wikipedia tells me the term was coined by Lars Christensen). I was aware in general terms that Sumner advocated a more expansionary monetary policy in response to the current crisis (I agree), that he prefers Nominal GDP targeting to inflation targeting as the basis for monetary policy (I agree again, though I’d prefer targeting levels rather than growth rates) and that he thinks this would be sufficient to fix the problem without any role for fiscal policy (I disagree). However, I wasn’t really aware that these ideas formed the basis of a school of thought, and I still haven’t investigated the underlying theory in any detail.
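The difference between targeting levels and targeting growth rates can be made concrete with a toy calculation. The figures below are invented for illustration (a 5 per cent target and a one-year shortfall), not drawn from any actual policy rule:

```python
# Hypothetical illustration: nominal GDP level targeting vs growth-rate targeting.
# All figures are invented for illustration.

target_growth = 0.05     # 5% nominal GDP growth target
base = 100.0             # NGDP index at the start
actual_year1 = 102.0     # a shock leaves growth at only 2% in year 1

# Growth-rate targeting: aim for 5% growth from wherever we are
# ("bygones are bygones")
growth_target_year2 = actual_year1 * (1 + target_growth)

# Level targeting: aim to return to the pre-announced 5%-growth path,
# so the year-1 shortfall must be made up in year 2
level_target_year2 = base * (1 + target_growth) ** 2

print(round(growth_target_year2, 2))  # 107.1
print(round(level_target_year2, 2))   # 110.25
```

Under level targeting the central bank must engineer roughly 8 per cent growth in year 2 to make up the shortfall, while growth-rate targeting asks only for 5 per cent from the lower base; this is why level targeting commits the bank to more aggressive reflation after a slump.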

Sumner has commented on my recent posts on fiscal and monetary policy with a couple of his own, so I guess it’s time for me to look more closely at what he is saying. A first response is over the fold.

Read More »

A note on the ineffectiveness of monetary stimulus (updated and corrected)

A commenter on the previous post raised the idea, promoted by the “market monetarist” school, that monetary policy is so effective as to make fiscal policy entirely unnecessary, at least when interest rates are above the zero lower bound. My views on this issue were formed by the experience of the late 20th century, and in particular, the recession that began in 1990, following steep increases in interest rates. Having planned a “short, sharp, shock”, the RBA started cutting rates in January 1990.

They didn’t go for 25 basis point moves in those days. Over the period to March 1993, rates were cut by more than 12 percentage points, from 17.5 per cent to 5.25 per cent. Over the same period, unemployment rose from 6 per cent to nearly 11 per cent, a record for the period since the Depression, and stayed around that level well into 1994, until the adoption of the Working Nation package of fiscal stimulus and active labour market policies. As I said in the previous post, tight monetary policy can reliably cause recessions, but expansionary monetary policy in a deep recession is “pushing on a string”.

Update As pointed out by Mark Sadowski in comments, these are nominal rates of interest. To get the real rate, which is more relevant, you need to subtract the expected rate of inflation, which fell from around 7 per cent to around 4 per cent over this period (as measured by surveys, and by the premium for inflation-adjusted Treasury bonds). So, you get roughly a 9 percentage point reduction in the real rate, from around 10 per cent to around 1 per cent. This doesn’t make much difference to the story. Most economists would regard policy as contractionary/expansionary if real interest rates are above/below the long-run neutral level, about 3 per cent. So, we still have a shift from strongly contractionary to moderately expansionary.
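The adjustment uses the Fisher approximation, real rate ≈ nominal rate minus expected inflation. Plugging in the figures from the post (the exact results, 10.5 and 1.25 per cent, round to the "10 to 1" cited above):

```python
# Fisher approximation applied to the 1990-93 episode, using the
# figures quoted in the post (per cent per year).

nominal_start, nominal_end = 17.5, 5.25      # cash rate
exp_infl_start, exp_infl_end = 7.0, 4.0      # survey-based expected inflation

real_start = nominal_start - exp_infl_start  # 10.5
real_end = nominal_end - exp_infl_end        # 1.25

print(real_start, real_end, real_start - real_end)  # 10.5 1.25 9.25
```

Measured against a neutral real rate of about 3 per cent, the start-of-period rate is far above neutral (strongly contractionary) and the end-of-period rate is below it (moderately expansionary), as the update says.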

However, market monetarists want to argue that the stance of policy should be assessed relative to a policy rule (Taylor rule or NGDP) that already incorporates a prescription of cutting rates when GDP falls and unemployment rises. This doesn’t make a lot of sense to me. It’s like arguing that Obama’s stimulus was actually a contractionary policy because it wasn’t as big as (according to a standard analysis based on Okun’s Law) it should have been. It’s partly a question of semantics, but it’s associated with the claim that, if only rates had been cut even more, we wouldn’t have had the recession, or would have recovered quickly. Having been around at the time, I disagree.

Fiscal multipliers and employment (wonkish)

With two weeks to go in the election campaign, we still haven’t seen anything resembling a budget proposal from Tony Abbott and the LNP. Various people have made estimates of the cost of his promises and the cuts likely to be needed to fund those promises and return to surplus. My main concern is that Abbott has locked himself so thoroughly into the rhetoric of surplus that, in the event of a downturn or recession, he will feel compelled to adopt the kinds of austerity measures that have had a disastrous impact in Europe and prevented any real recovery in the US. To make this point properly, we need some numbers. One way to get such numbers is with a macroeconomic model. That gives you better precision, but often hides the key assumptions. Instead, I will give a very simple Keynesian analysis, yielding back-of-the-envelope estimates.

For illustration, I’ll assume a public expenditure cut of $10 billion a year – the calculation is linear so it can be scaled up or down as needed. In a recession, the fiscal multiplier is likely to be around 1.5 (that’s the value used by Christina Romer when she pushed for a larger fiscal stimulus in 2009, and consistent with recent estimates by the IMF). So, the impact of the cut, when multipliers are taken into account is $15 billion or around 1 per cent of national income (or GDP if you prefer that measure).

Now we can use Okun’s Law to estimate that the cut will raise the unemployment rate by around 0.5 percentage points. Taking participation rates into account, employment will also fall by around 0.5 per cent (about 50 000 jobs).
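The whole back-of-the-envelope calculation can be sketched in a few lines. The $10 billion cut, the 1.5 multiplier and the Okun's Law relationship come from the post; national income of roughly $1.5 trillion and total employment of about 11.5 million are assumptions added here for illustration:

```python
# Back-of-the-envelope estimate of the employment impact of a spending cut.
# Cut, multiplier and Okun coefficient come from the post; GDP and
# employment levels are assumed figures for illustration.

cut = 10e9          # public expenditure cut, $ per year
multiplier = 1.5    # recession-era fiscal multiplier (Romer, IMF)
gdp = 1.5e12        # assumed national income, $ per year

output_loss = cut * multiplier                 # $15 billion
output_loss_pct = 100 * output_loss / gdp      # ~1 per cent of GDP

# Okun's Law: roughly a 0.5 point rise in unemployment per 1% fall in output
okun_coefficient = 2.0
unemployment_rise = output_loss_pct / okun_coefficient  # ~0.5 points

employment = 11.5e6                            # assumed total employment
jobs_lost = employment * unemployment_rise / 100

print(output_loss_pct, unemployment_rise, round(jobs_lost))
```

With these assumed levels the job loss comes out near 57,500, consistent with the "about 50 000 jobs" order of magnitude in the post; since everything is linear, the numbers scale directly with the size of the cut.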

A bunch of qualifications and observations over the fold

Read More »

Cronyism and the global city again (crosspost from crookedtimber.org)

Alex Pareene at Salon points to a bunch of evidence showing, in essence, that the rich look out for themselves and their kids, and no one else, then to a piece by Andrew Ross Sorkin defending nepotism in the US, and by extension in China. There was a time, not so long ago, when Asia’s reliance on guanxi and similar networking practices was denounced as ‘crony capitalism’, to be contrasted with the pure and hard-edged version to be found in the US. This was supposed to explain the vulnerability of Asian economies to the crisis of 1997, and the stability of the US, then well into the Great Moderation.

A few years later, in the very early days of blogging, I wrote a post pointing out that the eagerness of financial sector workers to congregate in the same physical location, even though their work was supposed to be based on objective evaluation of data transmitted by computer, was pretty good evidence that the “global city” phenomenon, much in vogue at the time, was just guanxi writ large.

I turned that into a magazine article at Next American City (now Next City, whose web site seems to have lost it). Then I wrote a longer and more academic version and submitted it to a lot of journals in economic geography, urban geography and so on, none of which were interested. I think it stands up well in retrospect (much more so than most of the ‘global city’ literature, at any rate), but of course I’m biased.

At any rate, at least now everyone, and not least a defender and beneficiary of the system like Sorkin, is comfortable with the notion that capitalism is a rigged game, in which the ability to fix the next round is part of the prize for winning this one.

Update/clarification I’ve implicitly taken the efficient markets hypothesis as a benchmark, and assumed that features of the financial sector (for example, physical colocation) that can’t be explained by EMH are likely indicators of cronyism. It’s possible to take the view that the financial sector does things that are inconsistent with EMH, but nevertheless socially beneficial. An obvious example is the kind of opaque, over-the-counter derivatives that Dodd-Frank has tried to ban, and that the finance sector is lobbying hard to protect: it seems clear that doing these kinds of deals would benefit from face-to-face contact. So, if such deals are, in aggregate, socially beneficial, my argument fails – the converse also holds.

Krugman, Keynes, Kalecki, Konczal

Paul Krugman’s recent columns, responding in various ways to JM Keynes, Michal Kalecki and Mike Konczal have made interesting reading, signalling a marked shift to the left both on economic theory and on issues of political economy.[^1] Among the critical points he has made

* Endorsement of Kalecki’s argument (which he got via Konczal) that “hatred for Keynesian economics has less to do with the notion that unemployment isn’t a proper subject of policy than about the notion of shifting power over the economy’s destiny away from big business and toward elected officials.”

* Rejection of the Hicks-Samuelson synthesis of Keynesian macroeconomics and neoclassical microeconomics and advocacy of (at a minimum) comprehensive financial controls

* Abandonment of the idea that the economics profession is engaged in honest intellectual debate, in favor of the conclusion that the rightwing of the profession, including leading economists, is characterized by denialism and bad faith. As he says, while many economists would like to believe otherwise, “you go to economic debates with the profession you have, not the profession you want.”

Read More »