
What went wrong with New Keynesian macro?

October 13th, 2009

More bookblogging! It’s all economics here at CT these days, but normal programming will doubtless resume soon.

Most of what I’ve written in the book so far has been pretty easy. I’ve never believed the Efficient Markets Hypothesis or New Classical Macro and it’s easy enough to point out how the occurrence of a massive financial crisis leading to a prolonged macroeconomic crisis discredits them both.

I’m coming now to one of the most challenging sections of my book, where I look at the New Keynesian program (with which I have a lot of sympathy) and ask why New Keynesians (most obviously Ben Bernanke) didn’t, for the most part, see the crisis coming or offer much in response that would have been new to Keynes himself. Within the broad Keynesian camp, the people who foresaw some sort of crisis were the old-fashioned types, most notably Nouriel Roubini (and much less notably, me), who were concerned about trade imbalances, inadequate savings, and hypertrophic growth of the financial sector. Even this group didn’t foresee the way the crisis would actually develop, but that, I think, is asking too much – every crisis is different.

My answer, broadly speaking, is that the New Keynesians had plenty of useful insights, but that the conventions of micro-based macroeconomics prevented those insights from forming the basis of a progressive research program.

Comments will be appreciated even more than usual. I really want to get this right, or as close as possible.


New Keynesian macroeconomics

In the wake of their intellectual and political defeats in the 1970s, mainstream Keynesian economists conceded both the long-run validity of Friedman’s critique of the Phillips curve, and the need, as argued by Lucas, for rigorous microeconomic foundations. “New Keynesian economics” was their response to the demand, from monetarist and new classical critics, for the provision of a microeconomic foundation for Keynesian macroeconomics.

The research task was seen as one of identifying minimal deviations from the standard microeconomic assumptions which yield Keynesian macroeconomic conclusions, such as the possibility of significant welfare benefits from macroeconomic stabilization. A classic example was the ‘menu costs’ argument produced by George Akerlof, another Nobel Prize winner. Akerlof sought to motivate the wage and price “stickiness” that characterised New Keynesian models by arguing that, under conditions of imperfect competition, firms might gain relatively little from adjusting their prices even though the economy as a whole would benefit substantially.
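To see the menu-cost logic in the simplest possible terms, here is a numerical sketch (my own illustration, with made-up numbers – nothing here is drawn from Akerlof’s papers). Because the firm starts at its profit-maximising price, the loss from leaving the price unchanged is second-order small, so even a trivial adjustment cost can make inaction privately rational:

```python
# Minimal menu-cost sketch (illustrative numbers only).
# Profit is maximised at p_star; near the optimum, the loss from a
# slightly stale price is quadratic, i.e. second-order small.

def profit(p, p_star=1.0, scale=100.0):
    """Stylised profit function with a maximum at p_star."""
    return scale - scale * (p - p_star) ** 2

stale_price = 0.95   # the optimal price has moved 5% since the last adjustment
menu_cost = 0.5      # hypothetical cost of reprinting the "menu"

loss_from_inaction = profit(1.0) - profit(stale_price)
print(f"profit lost by not adjusting: {loss_from_inaction:.2f}")  # 0.25
print(f"menu cost of adjusting:       {menu_cost:.2f}")           # 0.50
print("firm adjusts?", loss_from_inaction > menu_cost)            # False
```

The firm’s private loss from stickiness (0.25) is smaller than the menu cost (0.50), so the price stays put, even though, economy-wide, the resulting stickiness has first-order effects.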

The approach was applied, with some success, to a range of problems that had previously not been modelled formally, including many of the phenomena observed in the leadup to the global financial crisis, such as asset price bubbles and financial instability generated by speculative ‘noise trading’.

A particularly important contribution was the idea of the financial accelerator, a rigorous version of ideas first put forward by Fisher and by Keynesians such as Harrod and Hicks. Fisher had shown how declining prices could increase the real value of debt, making previously profitable enterprises insolvent, and thereby exacerbating initial shocks. The Keynesians showed how a shock to demand would result in declining utilisation, meaning that firms could meet their production requirements without any additional investment. Thus the initial shock to demand would have an amplified effect on the demand for investment goods.
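A toy calculation (mine, with purely illustrative numbers) makes Fisher’s mechanism concrete: nominal debts are fixed in money terms, so a falling price level raises the real burden of debt and can render a previously sound firm insolvent:

```python
# Fisher debt-deflation sketch (illustrative numbers only).
# Debt is fixed in nominal terms; assets are valued in real terms.

nominal_debt = 90.0
real_assets = 100.0

for price_level in (1.00, 0.95, 0.85):
    real_debt = nominal_debt / price_level
    net_worth = real_assets - real_debt
    status = "solvent" if net_worth > 0 else "INSOLVENT"
    print(f"P = {price_level:.2f}: real debt = {real_debt:6.2f}, "
          f"net worth = {net_worth:6.2f}  ({status})")
```

A 15 per cent deflation is enough to wipe out the firm’s net worth, and its distress then feeds back into further cuts in spending and prices – the kind of amplification the financial accelerator formalises.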

In a 1989 paper, Ben Bernanke and Mark Gertler integrated these ideas with developments in the theory of asymmetric information to produce a rigorous model of the financial accelerator.

It would seem, then, that New Keynesian economists should have been well equipped to challenge the triumphalism that prevailed during the Great Moderation. With the explosion in financial sector activity, the development of massive international and domestic imbalances and the near-miss of the dotcom boom and slump as evidence, New Keynesian analysis should surely have suggested that the global and US economies were in a perilous state.

Yet with few exceptions, New Keynesians went along with the prevailing mood of optimism. Most strikingly, the leading New Keynesian, Ben Bernanke, became the anointed heir of the libertarian Alan Greenspan as Chairman of the US Federal Reserve. And as we have already seen, it was Bernanke who did more than anyone else to popularise the idea of the Great Moderation.

Olivier Blanchard summarises the standard New Keynesian approach (which converged, over time, with the RBC approach) using the following, literally poetic, metaphor:

A macroeconomic article today often follows strict, haiku-like, rules: It starts from a general equilibrium structure, in which individuals maximize the expected present value of utility, firms maximize their value, and markets clear. Then, it introduces a twist, be it an imperfection or the closing of a particular set of markets, and works out the general equilibrium implications. It then performs a numerical simulation, based on calibration, showing that the model performs well. It ends with a welfare assessment.

Blanchard’s description brings out the central role of microeconomic foundations in the New Keynesian framework, and illustrates both the strengths and the weaknesses of the approach. On the one hand, as we have seen, New Keynesians were able to model a wide range of economic phenomena, such as bubbles and …, while remaining within the classical general equilibrium framework. On the other hand, precisely because the analysis remained within the general equilibrium framework, it did not allow for the possibility of a breakdown of classical equilibrium, which was precisely the possibility Keynes had sought to capture in his general theory.

The requirement to stay within a step or two of the standard general equilibrium solution yielded obvious benefits in terms of tractability. Since the properties of general equilibrium solutions have been analysed in detail for decades, modeling “general equilibrium with a twist” is a problem of exactly the right degree of difficulty for academic economists – hard enough to require, and exhibit, the skills valued by the profession, but not so hard as to make the problem insoluble, or soluble only with the abandonment of the underlying framework of individual maximization.

A critical implication of Blanchard’s haiku metaphor is that the New Keynesian program was not truly progressive. A study of some new problem, such as the incentive effects of executive pay, would typically, as Blanchard indicates, begin with the standard general equilibrium model, disregarding the modifications made to that model in previous work examining other ways in which the real economy deviated from the modelled ideal. A cumulative approach, by contrast, would imply a model that moved steadily further and further away from the standard GE framework, and therefore became less and less amenable to the standard techniques of analysis associated with that model. So each new twist was analysed in isolation, and the deviations were never combined.

This, I think, is what Paul Krugman had in mind when he suggested in his essay ‘How Did Economists Get It So Wrong?’ that economists, as a group, mistook beauty, clad in impressive-looking mathematics, for truth. The work described by Blanchard was beautiful (at least to economists) and illuminated some aspects of the truth, but beauty came first. An approach based on putting truth first would have incorporated multiple deviations from the standard general equilibrium model and then attempted to work out how they fitted together. In many cases, the only way of doing this would probably be to incorporate ad hoc descriptions of aggregate relationships that fitted observed outcomes, even if they could not be related directly to individual optimization.

New Keynesian macroeconomics, of the kind described by Blanchard, was ideally suited to the theoretical, ideological and policy needs of the Great Moderation. On the one hand, unlike New Classical theory, it justified a significant role for monetary policy, a conclusion in line with the actual policy practice of the period. On the other hand, by remaining within the general equilibrium framework, the New Keynesian school implicitly supported the central empirical inference drawn from the observed decline in volatility, namely that major macroeconomic fluctuations were a thing of the past.


DSGE

Eventually, the New Keynesian and RBC streams of micro-based macroeconomics began to merge. The repeated empirical failures of standard RBC models led many users of the empirical techniques pioneered by Prescott and Lucas to incorporate non-classical features like monopoly and information asymmetries. These “RBC-lite” economists sought, like the purists, to produce calibrated dynamic models that matched the “stylised facts” of observed business cycles, but quietly abandoned the goal of explaining recessions and depressions as optimal adjustments to (largely hypothetical) technological shocks.

This stream of RBC literature converged with New Keynesianism (http://www.econosseur.com/2009/05/leamer-and-the-state-of-macro.html), which also uses non-classical tweaks to standard general equilibrium assumptions with the aim of fitting the macro data.

The resulting merger produced a common approach with the unwieldy title of Dynamic Stochastic General Equilibrium (DSGE) modelling. Although there are a variety of DSGE models, they share some family features. As the “General Equilibrium” part of the name indicates, they take as their starting point the general equilibrium models developed in the 1950s by Kenneth Arrow and Gerard Debreu, which showed how an equilibrium set of prices could be derived from the interaction of households, rationally optimising their work, leisure and consumption choices, and firms, maximizing their profits in competitive markets. Commonly, though not invariably, it was assumed that everyone in the economy had the same preferences, and the same relative endowments of capital, labour skills and so on, with the implication that it was sufficient to model the decisions of a single ‘representative agent’.
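For readers who want to see what the representative-agent building block amounts to, here is a deliberately minimal sketch (my own, not any particular published model): a single agent choosing consumption and labour, with log utility, where the analytical solution n* = 1/(1+b) can be checked by brute force:

```python
# Representative-agent sketch (illustrative, not a specific DSGE model).
# One agent maximises  ln(c) + b*ln(1 - n)  subject to  c = w*n.
# With log utility, labour supply is n* = 1/(1+b), independent of the wage.

import numpy as np

b = 1.5   # hypothetical weight on leisure
w = 2.0   # real wage

n_grid = np.linspace(0.01, 0.99, 9999)
utility = np.log(w * n_grid) + b * np.log(1.0 - n_grid)
n_star = n_grid[np.argmax(utility)]

print(f"numerical  n* = {n_star:.3f}")       # ~0.400
print(f"analytical n* = {1 / (1 + b):.3f}")  # 0.400
```

A full DSGE model embeds a problem like this in time, adds shocks and a production sector, and imposes market clearing; but the representative agent’s first-order conditions remain the engine of the whole apparatus.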

The classic general equilibrium analysis of Arrow and Debreu dealt with the (admittedly unrealistic) case where there existed complete, perfectly competitive markets for every possible asset and commodity, including ‘state-contingent’ financial assets which allow agents to insure against, or bet on, every possible state of the aggregate economy. In such a model, as in the early RBC models, recessions are effectively impossible – any variation in aggregate output and employment is simply an optimal response to changes in technology, preferences or external world markets. DSGE models modified these assumptions by allowing for the possibility that wages and prices might be slow to adjust, by allowing for the possibility of imbalances between supply and demand, and so on, thereby enabling them to reproduce obvious features of the real world, such as recessions.

But, given the requirements for rigorous microeconomic foundations, this process could only be taken a limited distance. It was intellectually challenging, but appropriate within the rules of the game, to model individuals who were not perfectly rational, and markets that were incomplete or imperfectly competitive. The equilibrium conditions derived from these modifications could be compared to those derived from the benchmark case of perfectly competitive general equilibrium.

But such approaches don’t allow us to consider a world where people display multiple and substantial violations of the rationality assumptions of microeconomic theory and where markets depend not only on prices, preferences and profits but on complicated and poorly understood phenomena like trust and perceived fairness. As Akerlof and Shiller observe

It was still possible to discern the intellectual origins of alternative DSGE models in the New Keynesian or RBC schools. Modellers with their roots in the RBC school typically incorporated just enough deviations from competitive optimality to match the characteristics of the macroeconomic data series they were modelling, and preferred to focus on deviations that were due to government intervention rather than to monopoly power or other forms of market imperfection. New Keynesian modellers focused more attention on imperfect competition and were keen to stress the potential for the macro-economy to deviate from the optimal level of employment in the short term, and the possibility that an active monetary policy could produce improved outcomes.

Because New Keynesians were (and still are) concentrated in economics departments on the East and West Coasts of the United States (Harvard, …), while their intellectual opponents are most prominent in the lakeside environments of Chicago and Minnesota, the terms ‘saltwater’ and ‘freshwater’ have been coined (apparently by Robert Hall) to describe the two schools. But such a terminology suggests a deeper divide between competing schools of thought than actually prevailed during the false calm of the Great Moderation. The differences between the two groups were less prominent, in public at least, than their points of agreement. The freshwater school had backed away from extreme New Classical views after the failures of the early 1980s, while the distance from traditional Keynesian views to the New Keynesian position was summed up by Lawrence Summers’s observation that ‘We are now all Friedmanites’. And even these limited differences were tending to blur over time, with many macroeconomists, particularly those involved in formulating and implementing policy, shifting to an in-between position that might best be described as ‘brackish’.

However, the similarities outweigh the differences. Whether New Keynesian or RBC in their origins, DSGE models incorporate the assumption, derived from Friedman, that there is no long-run trade-off between unemployment and inflation, that is, that the long-run Phillips curve is vertical. And nearly all allowed for some trade-off in the short run, and therefore for some potential role for macroeconomic policy.
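The logic of the vertical long-run Phillips curve can be put in a few lines. Here is a sketch (my parameter values are purely illustrative) of Friedman’s expectations-augmented Phillips curve with adaptive expectations: holding unemployment below the natural rate doesn’t buy a stable tradeoff, just ever-accelerating inflation:

```python
# Expectations-augmented Phillips curve sketch (illustrative parameters).
# Short run:  inflation = expected_inflation + a * (u_natural - u)
# Long run:   expectations catch up, so u returns to u_natural at any
#             steady inflation rate -- the long-run curve is vertical.

a = 0.5           # hypothetical slope of the short-run curve
u_natural = 5.0   # natural rate of unemployment (%)

pi_expected = 2.0
for year in range(5):
    pi = pi_expected + a * (u_natural - 4.0)   # hold u at 4%, below natural
    print(f"year {year}: inflation = {pi:.1f}%")
    pi_expected = pi   # adaptive expectations: actual becomes expected
```

Inflation rises by half a point every year for as long as the unemployment gap is maintained; only at the natural rate is inflation stable, whatever its level.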

The differences between saltwater and freshwater DSGE models may be discussed in terms of the venerable Keynesian idea of the multiplier, that is, the ratio of the final change in output arising from a fiscal stimulus to the size of the initial stimulus. Old Keynesians had argued that the multiplier (as the name suggests) was greater than one, since the beneficiaries of government expenditure would increase their consumption of goods and services, leading to more workers being hired who in turn would increase their own consumption, and so on. The ‘policy ineffectiveness’ proposition of the New Classical school implied that the multiplier should be zero or even negative, because of the incentive-sapping effects of government spending and the taxes required to finance it. The DSGE modellers tended to split the difference.
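A back-of-the-envelope sketch of the three positions (the marginal propensity to consume here is my own illustrative number):

```python
# Fiscal multiplier sketch (illustrative numbers).
# In the textbook Keynesian story each round of spending is re-spent at
# the marginal propensity to consume c, so the multiplier is the
# geometric sum 1 + c + c^2 + ... = 1/(1 - c) > 1.

def keynesian_multiplier(mpc, rounds=1000):
    """Sum the spending rounds; converges to 1/(1 - mpc)."""
    return sum(mpc ** k for k in range(rounds))

print(f"Old Keynesian (mpc = 0.6): {keynesian_multiplier(0.6):.2f}")  # 2.50
print("New Keynesian DSGE:        ~1 (offsetting adjustments)")
print("New Classical:             ~0 or negative (full crowding out)")
```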

Although the issue was rarely discussed explicitly, the DSGE models favored by the New Keynesian school typically implied values for the multiplier that were close to 1, while those derived from RBC approaches suggested values that were positive, but closer to zero. Given the mild volatility of the Great Moderation, such models yielded no justification for active use of fiscal policy, and good reasons for governments to maintain budget balance as far as possible. New Keynesians also typically rejected active use of fiscal policy, and relied exclusively on monetary policy to manage the economy. But, compared to their freshwater colleagues, they had a more positive view of the ‘automatic stabilisers’. Since tax revenues tend to fall and welfare expenditures to rise during recessions, a government that maintains a balanced budget on average will tend to run deficits during recessions and surpluses during booms. On a Keynesian analysis, the fact that government spending net of taxes is countercyclical (moves in the opposite direction to fluctuations in the rate of economic growth) tends to stabilise the economy. Vast numbers of journal pages were devoted to refining these different viewpoints, and to defending one or the other. But in practical policy terms, the differences were marginal.

Reflecting their origins in the 1990s, most analysis using DSGE models assumed that macroeconomic management was the province of central banks, using interest rate policy (typically the setting of the rate at which the central bank would lend to commercial banks) as their sole management instrument. The central bank was modelled as following either an inflation target (the announced policy of most central banks) or a “Taylor rule”, in which the aim is to stabilise both output and inflation.
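For concreteness, the original Taylor (1993) rule looks like this (the 0.5 coefficients and the 2 per cent targets are Taylor’s illustrative values; actual central bank behaviour is only loosely approximated by any such formula):

```python
# Taylor rule sketch, using Taylor's (1993) illustrative coefficients.

def taylor_rate(inflation, output_gap, r_star=2.0, pi_target=2.0):
    """Nominal policy rate (%) given inflation (%) and the output gap (%)."""
    return (r_star + inflation
            + 0.5 * (inflation - pi_target)
            + 0.5 * output_gap)

# Inflation a point below target during a 2% output shortfall -> cut rates:
print(taylor_rate(inflation=1.0, output_gap=-2.0))   # 1.5
```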

On the whole, while central banks showed some interest in DSGE models, and invoked their findings to provide a theoretical basis for their operations, they made little use of them in the actual operations of economic management. For practical purposes, most central banks continued to rely on older-style macroeconomic models, with less appealing theoretical characteristics but better predictive performance. However, neither DSGE models nor their older counterparts proved to be of much use in predicting the crisis that overwhelmed the global economy in 2008, or in guiding the debate about how to respond.

  1. gerard
    October 13th, 2009 at 18:09 | #1

    my basic macro course uses Mankiw’s textbook, which states very directly that “money neutrality is true”. I assume that this is a basic assumption of New Keynesian models but is it actually true? Are you going to compare this with theories of endogenous money creation?

    Steve Keen has recently put a new paper on multisectoral modeling with endogenous money on his blog.

    http://www.debtdeflation.com/blogs/2009/10/10/multi-sectoral-production-one-for-geeks/

  2. Ernestine Gross
    October 13th, 2009 at 19:30 | #2

    John,

    I don’t know whom to think of when I read the term ‘New Keynesians’. However, if the New Keynesians had the following in mind, then, I suggest, there is a problem which has nothing to do with ‘general equilibrium theory’ as found in the so-called ‘elegant’ part of the literature.

    Your write:
    “As the “General Equilbrium” part of the name indicates, they take as their starting point the general equilibrium models developed in the 1950s, by Kenneth
    Arrow and Gerard Debreu, which showed how an equilibrium set of prices could be derived from the interaction of households, rationally optimising their work, leisure and consumption choices, and firms, maximizing their profits in competitive markets. Commonly, though not invariably, it was assumed that everyone in the economy had the same preferences, and the same relative endowments of capital, labour skills and so on, with the implication that it was sufficient to model the decisons of a single ‘representative agent’.”

    Three comments:

    1. Contrary to the claim, the Arrow-Debreu model does not “show how an equilibrium set of prices could be derived from ……..” Indeed, Debreu (1959) is quite explicit about this, writing in words that the price formation process is not explained in the model.

    2. The macro-models which use a representative agent are not at all consistent with the research question underlying the Arrow-Debreu model, namely: Is it conceivable, and if so, under which conditions, that the independent actions of many agents, who differ in terms of preferences or production technologies and ownership rights, can be coordinated by a ‘price system’. The representative agent macro-models feature in the critique in the Dahlem report. There are contributors to this report who are established authors in what is called ‘Walrasian equilibrium theory’ (the Arrow-Debreu model is by now a special case).

    3. The Arrow-Debreu model has neither financial securities nor money. The market opens once. Surely the name Keynes is associated with ‘money’. I am at a total loss as to what New Keynesians are about.

  3. Alan
    October 13th, 2009 at 21:29 | #3

    Professor Quiggin
    I’ve been a bit tardy with my blog reading the past couple of weeks, so haven’t had a chance to comment on some of your posts. Regarding this post, I agree with much of what you say, but you might consider some of the following issues:

    * Robert Gordon has written a nice piece on the state of macro that could be used to buttress some of your arguments – furthering your criticism of DSGE models, he argues that DSGE models are internally contradictory in assuming that prices are sticky and that markets clear.
    http://faculty-web.at.northwestern.edu/economics/gordon/GRU_Combined_090909.pdf

    He also has much to say about why policymakers continued to use “older style macroeconomic models” instead of NK models, boosting your final paragraph:

    * The “macro is too micro” argument can be argued from different angles – for example, you focus on the intertemporal utility maximisation framework. Another valid criticism of macro being too micro would be that modern macro’s focus on individual preferences and production functions misses co-ordination failures and macro externalities that are also important sources of fluctuations;

    * representative agent paradigm – not to be too wonky, but assuming identical wealth is not the only way to get to the use of the representative agent. Equally valid is to allow for heterogeneity in wealth, but impose that preferences be homothetic (of course, your argument about the (possibly unrealistic) assumption of identical preferences would still apply even when homothetic preferences are used to derive the representative agent paradigm);

    * credit markets and leverage – while there are some notable cases of models incorporating the theory of debt deflation, another criticism of the majority of NK and RBC models is their omission of the role of credit markets, an omission glaringly obvious given the intimate role played by credit markets in this recession.

  4. been there
    October 14th, 2009 at 06:08 | #4

    A couple of observations:
    * Central banks did indeed have and develop these models: look up Smets and Wouters. But they were in the research department and largely ignored by the real policymakers. The research departments were in some cases (though not, I believe, in the ECB’s) oblivious to this non-use of their models. They just kept churning out unreadable papers.
    * The cost of this model-building activity was that central bank research departments did not have enough resources to build other models that might have been more useful to the financial stability function.
    * Part of the problem is that the canonical form of these papers requires spending two-thirds of the paper setting out the model, even the bits that are now standard. Utility functions, Euler equations, step by step. Nobody ever says: “My model is the same as Bloggs (2008) except that in addition to assuming sticky wages and habit formation in consumption, I assume that firms have a time-to-build constraint that adds a lag between investment decisions and their effects on productive capacity.” and then just shows the equilibrium conditions / derived macro equations. You have to show everything and that’s twenty pages already.
    * Many of these models do not just have a one-twist structure. The problem is that if you add seven frictions with free parameters, you can match the seven moments you are trying to match, and define that as success. Nobody in this literature gets that it’s what ELSE you can predict (that you weren’t explicitly trying to match) that generates the scientific content of your model.

  5. gerard
    October 14th, 2009 at 08:06 | #5

    whatever happened to that “New Australian” online newspaper thing? googled it but it seems to be gone.

  6. gerard
    October 14th, 2009 at 08:12 | #6

    er, too many tabs open, wrong thread

  7. iain
    October 14th, 2009 at 08:40 | #7

    Endogenous money is where the future lies. However you should have a demurrage charge on it. As Keynes said “I believe that the future will learn more from the spirit of Gesell than from that of Marx”.

    We haven’t quite reached that future yet.

  8. James
    October 14th, 2009 at 08:57 | #8

    I can appreciate that this part of the book becomes far harder to write, given that it involves moving from criticism of what went wrong in the past (in the wake of the GFC, the phrase ‘shooting fish in a barrel’ springs to mind) to offering suggestions of what will go right in the future, which involves far more uncertainties and much higher stakes.

    Given the intellectual trajectory you have laid out here, it’s hard to see any other position than Post-Keynesianism as being consistent with your self-description as an “old fashioned Keynesian” who rejects the NKs and micro-based macro. So perhaps you could explore the different strands of that school and explain which you believe is best at providing a consistent macro framework without unrealistic micro underpinnings? Given the financial/monetary nature of the crisis the Chartalists seem to be racking up points at the moment.

  9. ABOM
    October 14th, 2009 at 10:53 | #9

    Neo-Keynesians didn’t get it right because they aren’t Austrians.

    What’s not to understand?

  10. Michael Harris
    October 14th, 2009 at 22:45 | #10

    I have a sense of the saltwater/freshwater distinction being coined by Blinder — perhaps in the address he gave out here that was published in the Economic Record. I won’t swear by that though.

  11. Alan
    October 14th, 2009 at 23:01 | #11

    About the coining of the saltwater and freshwater terms, here’s a 1988 NYTimes article that states it was Robert Hall:

    ”Robert Hall, a Stanford professor and a more mainstream economist, conceived the fresh-water, salt-water distinction a decade ago upon noting the workplaces of the new group’s leaders – Mr. Lucas in Chicago, Thomas J. Sargent, formerly at the University of Minnesota and now at the Hoover Institution at Stanford, and Robert J. Barro, then at the University of Rochester and now at Harvard. They are also known as ‘rational expectationists’ and as ‘neoclassical macroeconomists,’ a term reflecting the school’s roots in the classical economics of Adam Smith and David Ricardo….”

    http://www.nytimes.com/1988/07/23/business/fresh-water-economists-gain.html?pagewanted=2

    This implies that this distinction’s been around for about 3 decades.

  12. Michael Harris
    October 15th, 2009 at 07:03 | #12

    The Blinder reference is from the same era.

    The Fall and Rise of Keynesian Economics
    Blinder, Alan S
    Economic Record, vol. 64, no. 187, December 1988, pp. 278-94

    It’d be a useful reference for this chapter even if it isn’t the source of salt-vs-fresh.

  13. Michael Harris
    October 15th, 2009 at 07:06 | #13

    PS: OK, reading more closely, Hall, in 1988, is being claimed to have made the distinction a decade prior.

  14. gerard
    October 15th, 2009 at 11:47 | #14

    “On the whole, while central banks showed some interest in DSGE models, and invoked their findings to provide a theoretical basis for their operations, they made little use of them in the actual operations of economic management. For practical purposes, most central banks continued to rely on older-style macroeconomic models, with less appealing theoretical characteristics, but better predictive performance.”

    John, would you say that DSGE is considered by mainstream macroeconomists these days as the most promising direction for research in economic modeling? and could you elaborate a bit on the older-style alternatives and how they match up?

  15. timothy watson
    October 15th, 2009 at 15:20 | #15

    TYPO WATCH:

    “Blanchard’s description brings out the central role of microeconomic foundations in the New Keynesian framework… One the one hand,”

  16. Kevin Cox
    October 16th, 2009 at 19:19 | #16

    We invest to create more wealth. Normally investment means building new things. The FACT is that if you want to invest in something new you have to pay at least 20% for equity funds. That is, investors in new products (including established companies deciding on investments) demand at least a 20% return on investment otherwise they will not consider it.

    However, you can get loans at 7% against existing assets. The difference between 20% and 7% (or higher) is not accounted for by the higher risk but by the FACT that there is relatively little money available for new investments.

    To get growth we have to build new things or make old things more efficient and that requires investment. The world will struggle to get investments until the cost of money for new investments drops.

    Take a look at this presentation to see the dramatic effect of removing interest charges on building renewable power plants and the like. I would really like someone to tell me why this will not work and not only solve the money problem but also reduce ghg emissions.

    http://www.slideshare.net/cscoxk/financing-sustainability-2188734

  17. Alice
    October 18th, 2009 at 15:47 | #17

    @Alan
    says

    “They are also known as ”rational expectationists” and as ”neoclassical macroeconomists,” a term reflecting the school’s roots in the classical economics of Adam Smith and David Ricardo….”

    Is rational expectations part of the problem?

    Per Soros – “rational expectations theory totally misinterprets how financial markets operate. Although rational expectations theory is no longer taken seriously outside academic circles, the idea that markets are self correcting and tend towards equilibrium remains the prevailing paradigm on which the various synthetic instruments and valuation models which have come to play such a dominant role in financial markets are based. I contend that the prevailing paradigm is false and urgently needs to be replaced.”

    (The Crash of 2008 and what it means – p.6)

    Soros argues that market participants do not base their actions on the information gained from the actual situation but only on their perceptions of the actual situation and their subsequent actions then have the potential to alter the actual situation, which in turn has the potential to alter their perceptions and so on ie there is a feedback loop with uncertainty.

    Makes sense to me.
