
Bookblogging Micro-based macro

October 12th, 2009

Over the fold, yet more from my book-in-progress, Zombie Economics: Undead ideas that threaten the world economy. This is from the Beginnings section of the chapter on Micro-based Macro, and covers the breakdown of the Phillips curve and the rise of New Classical and Rational Expectations macro. This (along with the bits to come on DSGE models) is probably the section on which my own background is weakest, so feel free to point out my errors.

I’ve now posted drafts of the first three chapters (+Intro) at my wikidot site, so you can get some context. In particular, before commenting on omissions, take a quick look to see that the point hasn’t been covered elsewhere.

Micro-based macro is here

I’ve got a lot out of comments and discussion so far, and I hope some of this is reflected in what you are reading.


The Phillips curve

Throughout the history of capitalism it has been observed that boom periods tended to be accompanied by inflation (an increase in the general price level), and depressions by deflation. This observation formed a central part of the Keynesian economic system. While Keynes is commonly remembered for his advocacy of budget deficits to stimulate the economy in periods of recession, he also grappled with the problem of how to avoid inflation in the postwar period. In his famous and influential pamphlet, How to Pay for the War, Keynes argued that inflation was the product of an excess of demand over supply, and that the appropriate policy response was for governments to increase taxes and run budget surpluses, to bring demand into line with supply.

In 1958, New Zealand economist A.W. (Bill) Phillips published a statistical study which formalised the relationship between unemployment and inflation in the now-famous Phillips curve. The curve related unemployment to the rate of change in money wages, showing that, at very low rates of unemployment, wages tended to grow rapidly. Since wages account for the majority of production costs, rapid wage inflation also implies rapid price inflation. The higher the rate of unemployment, the lower the rate of wage growth. However, because workers generally resist outright cuts in wages, the curve flattens out, with increases in unemployment beyond a certain rate (typically between 5 and 10 per cent) having little further deflationary effect.
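The shape of the curve can be made concrete with a quick numerical sketch. The hyperbolic functional form and the parameter values below are purely illustrative assumptions, not Phillips’s fitted equation:

```python
# A stylised Phillips curve: wage inflation falls as unemployment rises,
# and the curve flattens out at high unemployment.

def wage_inflation(u, a=-1.0, b=9.0):
    """Annual wage growth (%) as a hyperbolic function of unemployment (%)."""
    return a + b / u

for u in (1, 2, 5, 10):
    print(f"unemployment {u:>2}% -> wage growth {wage_inflation(u):5.2f}%")
```

Moving from 1 to 2 per cent unemployment cuts wage growth sharply, while moving from 5 to 10 per cent barely changes it, which is the flattening described above.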

Phillips was famous (or perhaps notorious) for having designed a hydraulic analog computer that could be used to represent the Keynesian economic model (the Faculty of Economics and Politics at Cambridge University still has a working version) and had, as a Japanese prisoner of war, built a miniature radio at great risk. Despite his engineering skills, and his general reputation as an exponent of ‘hydraulic’ Keynesianism, he did not endorse a mechanical interpretation of the curve. He is said to have remarked that “if I had known what they would do with the graph I would never have drawn it.”

The leading American Keynesian economists of the day, Paul Samuelson and Robert Solow, were less cautious. They estimated similar relationships for the US, and drew the conclusion that society faced a trade-off between unemployment and inflation. That is, society could choose between lower inflation and higher unemployment or lower unemployment and higher inflation. This point was spelt out in successive editions of Samuelson’s textbook, simply entitled Economics, which dominated the market from its initial publication in 1948 until the mid-1970s. Given a menu of choices involving different rates of unemployment and inflation, it seemed obvious enough that, since unemployment was the greater evil, a moderate increase in inflation could be socially beneficial.

The interpretation of the Phillips curve as a stable trade-off between unemployment and inflation led to an acceptance of higher rates of inflation as the necessary price of reducing unemployment still further below the historically low levels of the postwar boom. So, whereas previous episodes of inflation had been met with the orthodox Keynesian response of fiscal contraction aimed at reducing aggregate demand, there was no such response to the acceleration of inflation in the late 1960s. The Phillips curve idea appeared to justify expansionary fiscal policy except when unemployment was very low, and embedded the notion of Keynesian economics as a justification for budget deficits under any and all circumstances.

#

Friedman, Natural Rate and NAIRU…

The Keynesian adoption of the Phillips curve paved the way for Milton Friedman’s greatest intellectual victory, based on a penetrating analysis offered in the late 1960s at a time when inflation, while already problematic, was far below the double-digit rates that would be experienced in the 1970s.

In his famous 1968 Presidential address to the American Economic Association, Friedman argued that the supposed trade-off between unemployment and inflation was the product of illusion. As long as workers failed to recognise that the general rate of inflation was increasing, they would regard wage increases as real improvements in their standard of living and therefore would increase both their supply of labor and their demand for goods. But, Friedman argued, sooner or later expectations of inflation would catch up with reality. If the rate of inflation were held at, say, 5 per cent for several years, workers would build a 5 per cent allowance for inflation into their wage claims, and businesses would raise their own prices by 5 per cent to allow for the increase in anticipated costs.

Once expectations adjusted, Friedman argued, the beneficial effects of inflation would disappear. The rate of unemployment would return to the level consistent with price stability, but inflation would remain high. Interpreted graphically, this meant that the long-term Phillips curve was a vertical line.

Friedman’s analysis gave no specific answer to the question of where unemployment would stabilise. Friedman argued that this could be determined as “the level that would be ground out by the Walrasian system of general equilibrium equations … including market imperfections … the cost of gathering information about job vacancies and labor availabilities, the costs of mobility, and so on”. Friedman introduced the unfortunate description of this outcome as the ‘natural rate of unemployment’, although even on his own telling there was nothing natural about it. The same terminology was adopted by Edmund Phelps, who developed a more rigorous version of Friedman’s intuitive argument, for which he was awarded the Economics Nobel in 2006. These days, most economists prefer the euphemism “NAIRU”, which stands for Non-Accelerating Inflation Rate of Unemployment.

In summary, Friedman and Phelps suggested, the beneficial effects of inflation were the product of illusion on the part of workers and employers. And by implication, they suggested that their Keynesian colleagues were subject to a more sophisticated form of the same illusions.

Within a few years, Friedman’s judgement was vindicated. The Samuelson-Solow interpretation of the Phillips curve as a stable trade-off was soon proved wrong in practice, as inflation rates increased without any corresponding reduction in unemployment, a phenomenon that came to be referred to by the ugly portmanteau word, stagflation (stagnation + inflation). Inflation rates rose steadily, reaching double digits by the early 1970s.

The simplistic Keynesian interpretation of the Phillips curve was discredited forever. No one in the future would suggest that policymakers could exploit a stable trade-off between unemployment and inflation, except under special conditions. But this idea, dating only from the 1960s, was a late development in Keynesian thought, and its failure did not imply that Keynesian macroeconomics itself was unsound. To banish the idea that governments could and should act to stabilise the economy and preserve full employment (or even Friedman’s ‘natural rate’) the critique of Keynesianism had to be pushed further.

#

The New Classical school

Friedman argued that exploitation of the Phillips curve could not work for long, because expectations of inflation would eventually catch up with reality. Experience seems to support this argument, at least once inflation rates are high enough for people to take notice (anything above 5 per cent seems to do the trick).

But Friedman’s reasonable argument was neither logically watertight nor theoretically elegant enough for the younger generation of free-market economists, who wanted to restore the pre-Keynesian purity of classical macroeconomics, and became known as the New Classical school. Their key idea was to replace Friedman’s adaptive model of expectations with what they called ‘rational expectations’ (a term coined much earlier, and in a microeconomic context, by John F. Muth). Although Muth had been cautious about possible misinterpretation of the term, his successors showed no such caution. Having adopted Muth’s characterization of rational expectations as “those that agree with the predictions of the relevant economic model”, and defined the relevant economic model as their own, New Classical economists happily traded on the implicit assumption that any consumer whose expectations did not match those of the model must be irrational.

One of the first and most extreme applications of the rational expectations idea was put forward in 1974 by Robert Barro, then an up-and-coming young professor at the University of Chicago, who now makes regular appearances, not only in academic journals and lists of likely candidates for the Nobel Prize in Economics, but also in the Opinion pages of the Wall Street Journal.

Barro drew on the work of the first great formal theorist in economics, David Ricardo. Ricardo, a successful speculator, financier and member of the House of Commons, developed the ideas presented in Adam Smith’s Wealth of Nations into a rigorous body of analysis. He observed that, if governments borrow money, say to finance wartime expenditures, their citizens should anticipate that taxes will eventually have to be increased to repay the debt. If they were perfectly rational, Ricardo noted, they would increase their savings, by an amount equal to the additional government debt, in anticipation of the higher tax burden. So it should make no difference whether the war is financed by current taxation or by debt. Having observed this theoretical equivalence, Ricardo immediately returned to reality with the observation that “the people who paid the taxes never so estimate them, and therefore do not manage their private affairs accordingly”.
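The two-period arithmetic behind Ricardo’s observation is easy to set out. In this sketch the numbers, and the crucial assumption that households can save at the same interest rate the government borrows at, are illustrative:

```python
# Ricardian equivalence in two periods: tax finance and debt finance
# impose the same present-value tax burden on a household that faces
# the same interest rate r as the government.

r = 0.05   # common interest rate (a key, contestable assumption)
g = 20.0   # government spending to be financed in period 1

# Tax finance: levy g now.
pv_tax_finance = g

# Debt finance: borrow g now, levy g*(1+r) next period;
# discounting back at r gives the same present value.
pv_debt_finance = g * (1 + r) / (1 + r)

print(pv_tax_finance, pv_debt_finance)
```

Since the present value of the tax burden is identical either way, a fully rational household simply saves the amount of the bond issue and its consumption plan is unchanged, which is Barro’s claim; the critics’ point is that once households and governments face different interest rates, the two present values diverge.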

Barro’s big contribution, in an article published in 1974, was to focus on theory rather than reality and suggest that what he called ‘Ricardian equivalence’ actually holds in practice. Barro’s claim was never widely accepted, even among opponents of Keynesianism.

Barro’s claim was made without regard to empirical evidence. Econometric testing strongly rejected the ‘Ricardian equivalence’ hypothesis, that current borrowing by governments would be fully offset by household saving. Some tests suggested that borrowing might moderately increase household saving, but others showed the exact opposite. Critics pointed out numerous theoretical deficiencies in addition to Barro’s reliance on ultra-rational expectations. For example, the argument assumes that households face the same interest rate as governments, which is obviously untrue.

Nevertheless, despite failing to gain significant acceptance, the Ricardian equivalence hypothesis had a significant effect on the debate within the economics profession. Extreme assumptions about the rationality of consumer decisions, that would once have been dismissed out of hand, were now treated as the starting point for analysis and debate. In this way, Barro paved the way for what became known as the Rational Expectations revolution in macroeconomics. And, while Barro’s Ricardian arguments for the claim that Keynesian policies could not possibly work were seen as implausible, other versions of the same claim were soon produced, and widely accepted.

The result has been called New Classical Economics, a body of economic theory which reproduces the classical conclusion that government intervention cannot improve macroeconomic performance and that, in the absence of such intervention, the economy will rapidly adjust to economic shocks, returning in short order to its natural equilibrium position.

#

Lucas critique and RE

The central idea of rational expectations goes back to the early 1960s. Agricultural economists at the time often modelled price cycles in commodity markets as the outcome of lags in the production process. The idea was that a high price for, say, corn, would occur in some season because of, say, a drought or a temporary increase in demand. Farmers would observe the high price and plant a lot of corn for the next season. The result would be a large crop and a low price. Farmers would therefore plant less corn for the following season and the price would go up again. Eventually, this series of reactions and counter-reactions would bring the price back to the equilibrium level where supply (the amount of corn farmers would like to produce and sell at that price) equalled demand. As represented on the supply-and-demand diagrams economists like to draw, the path of adjustment resembled a cobweb, and so the model became known as the cobweb model.

Economist John Muth saw a problem. In the cobweb model, farmers expect a high price this season to be maintained next season, and so produce high output. But this is a self-defeating prophecy, since the high output means that the price next season will be low. Why, Muth asked, would farmers keep on making such a simple, and costly, mistake? If farmers based their expectations on their own experience, they would not expect high prices to be maintained. But what, then, would they expect? An expectation that high prices are followed by low prices, as occurs in the cobweb model, would be similarly self-defeating.

Muth’s answer was both simple and ingenious. The requirement that the price expected by farmers should equal the expected price generated by the model can be incorporated within the model itself, and this requirement closes the circle in which expectations generate prices and vice versa. Muth showed that, with this requirement, the cobweb model could not work. As long as the ‘shocks’ that raise or lower prices in one season are not correlated with the shocks in the next season, the only ‘rational’ expectation for farmers is that the price next season will be equal to the ‘average’ equilibrium price that the model generates in the absence of such shocks. If farmers expect this, they will produce, on average, the supply associated with that price, and, on average, that price will in fact arise.
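The contrast between the two expectations rules can be seen in a minimal simulation. The linear demand and supply curves and their parameters here are hypothetical, chosen only so the cobweb converges:

```python
# Cobweb model with naive expectations vs Muth's rational expectations.

a, b = 10.0, 1.0   # inverse demand: p = a - b*q
c, d = 1.0, 0.5    # supply: q = c + d*E[p], planted on last season's expectation

def next_price(expected_price):
    quantity = c + d * expected_price   # farmers plant based on expected price
    return a - b * quantity             # market then clears at this price

# Naive (cobweb) expectation: next season's price = this season's price.
p = 8.0
path = []
for _ in range(6):
    p = next_price(p)
    path.append(round(p, 3))
print("cobweb path:", path)            # oscillates around the equilibrium

# Rational expectation: solve p* = a - b*(c + d*p*) for the fixed point.
p_star = (a - b * c) / (1 + b * d)
print("rational-expectations price:", p_star)   # 6.0
```

Under the naive rule the price overshoots and undershoots season by season; under Muth’s requirement, farmers expect the fixed-point price from the start, so (apart from uncorrelated shocks) the cobweb cycle never arises.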

In the late 1970s, Robert E. Lucas took Muth’s idea and applied it to the macroeconomic debate about inflationary expectations. Friedman had convinced most economists that, if a high rate of inflation is maintained long enough, companies and workers will come to expect it and build this expectation into price-setting decisions and wage demands. He suggested a simple adjustment process in which expectations gradually catch up with a change in the inflation rate. That process was sufficient to kill off the idea of a stable trade-off between unemployment and inflation, and to explain how continued high inflation, initially associated with low unemployment, could turn into the ‘stagflation’ of the 1970s.

In Friedman’s ‘adaptive expectations’ model, there was a lag between an increase in the rate of inflation and the adjustment of inflationary expectations. That lag left open the possibility that governments could manipulate the Phillips curve trade-off, at least in the short run. Lucas used the idea of rational expectations to close off that possibility. In a rational expectations model, workers and businesses (commonly referred to in this literature as ‘economic agents’) make the best possible estimate of future inflation rates, and therefore cannot be fooled by government policy. Lucas’ ideas were developed by Thomas Sargent and Neil Wallace into the ‘Policy Ineffectiveness Proposition’.
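Friedman’s adaptive rule can be written as a one-line update: each period, expectations absorb a fraction of the latest inflation surprise. The adjustment speed and the inflation path below are illustrative assumptions:

```python
# Adaptive expectations: E[pi] is revised by a fraction lam of each
# period's inflation surprise. When actual inflation jumps from 2% to 7%,
# expectations catch up only gradually -- the lag that leaves room for a
# short-run Phillips trade-off, and that rational expectations removes.

lam = 0.5                       # fraction of the surprise absorbed each year
expected = 2.0                  # inherited inflation expectation (%)
actual = [2.0, 7.0, 7.0, 7.0, 7.0]

for year, pi in enumerate(actual):
    gap = pi - expected         # this year's inflation surprise
    print(f"year {year}: actual {pi}%, expected {expected:.2f}%, surprise {gap:.2f}")
    expected += lam * gap       # adaptive update
```

After four years of 7 per cent inflation, expectations have reached only about 6.7 per cent; a rational-expectations agent who understood the policy regime would jump straight to 7.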

Lucas developed a more general critique of economic policymaking, using the case of the Phillips curve as an example. His point was that there was no reason, in general, to suppose that an empirical relationship observed under one set of policies, like the Phillips curve relationship between unemployment and inflation, would be sustained in the event of a change in policies, which would, in general, imply a change in expectations. The Lucas critique works with a range of assumptions about expectations, including Friedman’s adaptive expectations, but it is most naturally associated with Lucas’ favored rational expectations model. Lucas argued that the only reliable empirical relationships were those derived from the ‘deep’ microeconomic structure of models, in which economic outcomes are the aggregate of decisions by rational agents, making decisions aimed at pursuing their own goals (maximising their utility, in the jargon of economists).

The solution, it seemed, was obvious, though not simple. The Keynesian separation between macroeconomic analysis, based on observed aggregate relationships, and microeconomic analysis must be abandoned. Instead, macroeconomics must be built up from scratch, on the microeconomic foundations of rational choice and market equilibrium.

#

RBC

The micro-based approach to macroeconomics appealed to large segments of the economics profession, who valued the elegance and apparent precision of microeconomics more than the messy empiricism of macro. But there was an obvious problem. General equilibrium models like those of Walras, Arrow and Debreu naturally generated a stable, static equilibrium. But the reality that business conditions fluctuate over time could scarcely be denied. So, the problem was posed as one of producing a general equilibrium model in which such fluctuations could arise.

The first attempt, Real Business Cycle theory, emerged in the early 1980s as a variant of New Classical Economics. The big papers were by Long & Plosser and Kydland & Prescott. The RBC literature introduced two big innovations, one theoretical and one technical.

In theoretical terms, relative to the standard New Classical story that the economy naturally moves rapidly back towards full employment equilibrium in response to any shock, RBC advocates recognised the existence of fluctuations in aggregate output and employment, but argued that such fluctuations represent a socially optimal equilibrium response to exogenous shocks such as changes in productivity, the terms of trade, or workers’ preference for leisure.

In technical terms, RBC models were typically estimated using a calibration procedure in which the parameters of the model were adjusted to give the best possible approximation to the observed mean and variance of relevant economic variables and the correlations between them (sometimes referred to, in the jargon, as ‘stylised facts’). This procedure, closely associated with a set of statistical techniques referred to as the Generalized Method of Moments, differs from the standard approach pioneered by the Cowles Commission, in which the parameters of a model are estimated on the basis of a criterion such as minimisation of the sum of squared errors (differences between predicted and observed values in a given data set).

There’s no necessary link between these two innovations, and there gradually emerged two streams within the RBC literature. In one stream were those concerned to preserve the theoretical claim that the observed business cycle is an optimal outcome, even in the face of data that consistently suggested the opposite. In the other stream were those who adopted the modelling approach, but were willing to introduce non-classical tweaks to the model (imperfect information/competition and so on) to get a better fit to the stylised facts.

The big exception that was conceded by most RBC theorists at the outset was the Great Depression. The implied RBC analysis, that the state of scientific knowledge had suddenly gone backwards by 30 per cent, or that workers throughout the world had suddenly succumbed to an epidemic of laziness, was the subject of some well-deserved derision from Keynesians. A couple of quotes I’ve pinched from a survey by Luca Pensieroso:

<blockquote>“the Great Depression [. . . ] remains a formidable barrier to a completely unbending application of the view that business cycles are all alike.” (Lucas (1980), p. 273)

“If the Depression continues, in some respects, to defy explanation by existing economic analysis (as I believe it does), perhaps it is gradually succumbing under the Law of Large Numbers.” (Lucas (1980), p. 284)</blockquote>

But towards the end of the 1990s, at a time when RBC theory had in any case lost the battle for general acceptance, some of the more hardline RBC advocates tried to tackle the Depression, albeit at the cost of ignoring its most salient features. First, they ignored the fact that the Depression was a global event, adopting a single-country focus on the US. Then, they downplayed the huge downturn in output between 1929 and 1933, focusing instead on the slowness of the subsequent recovery, which they blamed, unsurprisingly, on FDR and the New Deal. The key paper here is by Cole and Ohanian, who put particular emphasis on the National Industrial Recovery Act.

There are plenty of difficulties with the critique of the New Deal, and these have been argued at length by <a href="http://edgeofthewest.wordpress.com/2009/02/02/the-pony-chokers/">Eric Rauchway</a> among others. But the real problem is that RBC can’t possibly explain the Depression as most economists understand it, that is, the crisis and collapse of the global economic system in the years after 1929. Instead, Cole and Ohanian want to change the subject. The whole exercise is rather like an account of the causes of WWII that starts at Yalta.

The failure of RBC is brought into sharp relief by the current global crisis. Not even the most ardent RBC supporter has been game to suggest that the crisis was caused by technological shocks or changes in tastes, and the suggestion that it was all the fault of a minor piece of anti-redlining law (the Community Reinvestment Act) has been abandoned as the speculative excesses and outright corruption of the central institutions of Wall Street have come to light.

Unlike New Keynesian macro, where some useful insights will be relevant to policy in future periods of relative stability, it’s hard to see much being salvaged from the theoretical program of RBC. On the other hand, it has given us some potentially useful statistical techniques. The idea that parameters of macroeconomic models may be selected by calibration rather than by statistical estimation has an appeal that does not depend on accepting the theoretical commitments of the RBC school.

Categories: Dead Ideas book
  1. Michael Harris
    October 12th, 2009 at 18:54 | #1

    John

    A few semi-trivial points after a quick glance over the draft.

    1. I don’t think Barro ever consciously drew on Ricardo. If you ask Geoff Brennan, he’d probably regard Barro as having unknowingly re-invented Ricardian equivalence without knowing it existed in the first place. And Barro’s paper doesn’t actually cite Ricardo.

    2. Interestingly, a few years before that paper, when Barro was at Brown, he worked with Herschel Grossman on a very interesting formalisation of Keynes as a disequilibrium system (spilling over from one market into another).

    And here’s an RBC joke for you:

    New-Keynesian theorist: “You real business cycle theorists have set macroeconomics back by 15 years!”

    RBC theorist: “A-ha! So you DO believe in technical regress!”

  2. Uncle Milton
    October 12th, 2009 at 22:04 | #2

    Friedman’s natural rate of unemployment was supposed to be analogous to Wicksell’s natural rate of interest (aka the real rate of interest). So while inflation changes the nominal rate of interest but not the real rate, it also temporarily puts the rate of unemployment below the natural rate.

  3. Uncle Milton
    October 12th, 2009 at 22:11 | #3

    And, a little OT, finally the economics Nobel has gone to a woman, albeit to a political scientist.

  4. Tim Peterson
    October 14th, 2009 at 15:14 | #4

    Actually, the NAIRU and the natural rate of unemployment are not synonymous. This is because, with rational expectations, inflation can accelerate when unemployment is above the natural rate if expectations become un-anchored (ie if people expect that in future unemployment will fall below the natural rate).

    The NAIRU is a mechanical concept, linked to adaptive expectations.

Comments are closed.