Got this in my email this morning inviting contributions to a special issue of a journal (I won’t name it)

Dynamic stochastic general equilibrium (DSGE) models have become an established framework of reference in empirical macroeconomics. Because DSGE models combine micro- and macroeconomic theory with formal econometric modeling and inference, they are now widely used in policy analysis and academic discourse to address questions in monetary economics and business cycle research, and to inform policy interventions. The continued success of DSGE models will rest on a sustained ability to meet key challenges and improve upon the main modeling paradigm both in terms of its theoretical foundations and its econometric implementation.

And of course, we can thank DSGE models for predicting that nasty crisis in 2008 and prescribing the policy responses that fixed it so completely. Good thing we didn’t have to rely on that old-fashioned Keynesian stuff.

18 thoughts on “Zombies”

  1. I really would like to see evidence of any model that can predict exchange rates more than 2Q out.

    If you cannot predict exchange rates beyond 2Q what is the point of any economic model?

    In general I understand that all attempts to produce models have failed to predict turning points further than 2Q without producing just as many false calls.

    You can model a steam engine’s speed as the product of steam pressure and rail gradient, as the linkage is direct and non-political. But this does not apply to capitalist economies driven by cartels, corruption, lobbying, private self-interest and degrees of monopoly.

  2. It is very frustrating that this obviously falsified theory (DSGE) is still being trotted out as accurate, predictive and (of all things) empirical. A key question perhaps is why so many humans, the great majority in fact, won’t abandon obviously falsified theories. It is clear that for most people wishful thinking, blind belief and self-serving ideology trump objective empirical analysis. Religious blik, corporate propaganda and nationalistic jingoism form the views and mental constructs of 95% of the populace. It’s classic false consciousness of course.

  3. The ‘success’ of DSGE models is a theological success, not an empirical success. This success will continue as long as there are ‘economists’ who genuflect to truths revealed by great ‘old ones’ and scriptural interpretations, rather than truth empirically revealed from nature via the scientific method.

    That is, like creationism, holocaust denial, AGW denial, and a variety of other silliness, the ‘success’ of DSGE models will be sustained, in some quarters, for some time to come.

  4. And of course, we can thank DSGE models for predicting that nasty crisis in 2008 and prescribing the policy responses that fixed it so completely. Good thing we didn’t have to rely on that old-fashioned Keynesian stuff.

    The fact that old-fashioned Keynesian stuff can more reliably explain what happened than the DSGE models will not cause the merits of either to be reassessed by most of the profession. The reason for this is that since the “Rational Expectations Revolution”, mainstream economics has decided that empiricism is not the relevant framework for assessing the validity of a macroeconomic model. Instead there has been a shift to deductive reasoning, in which conclusions are valid if they can be shown to follow logically from a premise which is assumed, a priori, to be true: namely, that “agents” are perfectly rational and perfectly clairvoyant, and that in a valid model the “agents” must behave according to the maximization of a time-discounted utility functional consistent with their having full knowledge of the model they inhabit.
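
    [The maximization being referred to here has, in its textbook form, roughly the following shape – a generic statement of the representative-agent problem with hypothetical notation, not any particular paper’s model:

    ```latex
    \max_{\{c_t,\, k_{t+1}\}} \;
      \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^t u(c_t)
    \quad \text{subject to} \quad
      c_t + k_{t+1} = f(k_t, z_t) + (1-\delta)\, k_t
    ```

    where $\beta \in (0,1)$ is the discount factor, $u$ the period utility function, $k_t$ capital, $z_t$ an exogenous stochastic shock and $\delta$ the depreciation rate. The “perfect clairvoyance” enters through the expectation $\mathbb{E}_0$, which is taken with respect to the model’s own probability distribution.]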

    Hence a model can be “valid” even if it involves an economy consisting of a single, immortal individual and a single consumption good, as long as there is some sort of optimization process involved. Of course this is just about the worst possible representation of reality, but never mind. The old Keynesian models, even if they are more realistic from an empirical perspective, are automatically invalidated if they do not incorporate a rational optimization process.

    The problem is that while the optimization-based models can be extended to incorporate additional factors, once they begin to contain multiple heterogeneous agents, let alone uncertainty, debt finance or the input-output relationships of multiple goods and services, it becomes very difficult if not impossible to solve an optimization-based model, let alone one that has empirical value. But rather than sacrifice the fundamental premise and pursue an alternative framework that might be more promising, the profession has decided that the only serious type of macroeconomic modelling involves endless variations and minor embellishments of the representative-agent DSGE framework. While there is slim chance that any of these models will describe, let alone predict, phenomena such as the mispricing of mortgage-backed securities and the emergence of an unstable market in derivative instruments, that’s not the point.

  5. As far as I know, DSGE models are not much used in finance and insurance. DSGE modeling is not anywhere near as lucrative as financial engineering. It’s not about record private-sector bonuses, it’s more about publications in esteemed journals, and academic tenure.

  6. @gerard

    Gerard’s summation of DSGE seems about right to me. I guess for these DSGE people, the empirical revolution in science and philosophy (Bacon to Hume) never happened. The DSGE advocates seem like the modern equivalent of the medieval schoolmen.

  7. Just for fun, JQ, why don’t you submit a paper critiquing the problems with DSGE models and see if they accept it. Might be good for a hoot – the referees’ comments might be worth framing.

  8. gerard, it seems to me you conflate Walrasian and non-Walrasian theoretical models of economies (deductive reasoning) with statistical macro-economic models which happen to have the words ‘general equilibrium’ in their name (DSGE).

    To illustrate the difference – the important difference, if I may say – the former provides conditions under which a particular solution of a theoretical model can be shown to be conceivable (i.e. to exist in the logic of mathematics). The latter does not check for these conditions but imposes more or less ad hoc equations and identities on macro-economic data.

    By comparing observables with the assumed conditions, it seems to me quite obvious that doing a ‘DSGE’ exercise is not very helpful. For example, there is a history of empirical data on financial market ‘crises’ which violate the assumption of ‘strict risk aversion’ in the theoretical models (deductive reasoning). The minimum wealth condition in the theoretical models is grossly violated in the ‘global economy’ and, IMO, overstretched in so-called developed countries. AGW is one of the most notable examples of the incompleteness of markets. The whole idea of modelling the dynamics of ‘an economy’ as a stochastic process, driven by ‘exogenous events’ seems to me to exclude ‘financial crises’ in line one.

    But then I am not fond of macro-economics – I don’t like being locked into believing that market transactions, as recorded by accountants for individual businesses and by the national accounts for ‘an economy’, are all there is to ‘economics’.

  9. Oh, I forgot Ikonoclast’s interest – physical resource constraints. The theoretical ‘general equilibrium’ models do have a physical resource constraint in the definition of ‘an equilibrium’. As far as I know, there is no such thing in DSGE models; that is, there is no physical resource constraint which is exogenous to the macro-economic data used in the models.

  10. I’m no expert Ernestine but my understanding is that statistical macro models can be either structural or non-structural. The non-structural models don’t bother with theory and simply use statistical techniques to estimate correlations between particular variables, which are then used as the basis for forecasting. As far as I know, these models have the best track record when it comes to forecasting and are more extensively used in the private sector, which generally doesn’t bother with DSGE.

    The structural models, like DSGE, begin with a theory of how the economy works, which is assumed to be equivalent to the solution of an intertemporal stochastic optimization problem, the parameters of which are then calibrated to economic data. A DSGE model begins life as a purely theoretical model (the “Walrasian” part of DSGE is the assumption that the markets for goods, capital and labour all clear at every point in time, thanks to the flexibility of wages and prices, which allows stable equilibria to be found through tatonnement). It is then made into a “statistical” model by calibration: simulating the model a large number of times with many combinations of parameters and choosing the simulation that bears the closest resemblance to the data.

    There is no reason why resource constraints and the like cannot be incorporated into DSGE; most macroeconomics these days seems to involve the addition of bells and whistles to the DSGE framework. A lot of serious and clever people are involved in this research and I don’t want to pretend to know more than they do, but I am reminded of the joke about the guy looking for his keys under the streetlight because that’s the only place he can see.
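
    [The calibration procedure described above – simulate the model many times under different parameter combinations and keep the combination that most resembles the data – can be sketched in a few lines. This is a toy illustration only: the “model economy” is a hypothetical AR(1) process and the single matched moment is a variance, both stand-ins for the far richer models and moment sets used in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Observed" output-gap series (simulated here purely for illustration)
data = rng.normal(size=200).cumsum() * 0.1
data_moment = np.var(np.diff(data))  # the moment we will try to match

def simulate(rho, sigma, n=200):
    """Toy 'model economy': an AR(1) driven by exogenous stochastic shocks."""
    y = np.zeros(n)
    shocks = rng.normal(scale=sigma, size=n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + shocks[t]
    return y

# Grid-search calibration: simulate under many parameter combinations
# and keep the pair whose simulated moment lies closest to the data moment.
best, best_gap = None, np.inf
for rho in np.linspace(0.5, 0.99, 25):
    for sigma in np.linspace(0.05, 0.5, 25):
        sim_moment = np.var(np.diff(simulate(rho, sigma)))
        gap = abs(sim_moment - data_moment)
        if gap < best_gap:
            best, best_gap = (rho, sigma), gap

print(f"calibrated (rho, sigma) = {best}, moment gap = {best_gap:.6f}")
```

    Real calibration exercises match many moments (or maximize a likelihood) over a theoretically derived model rather than grid-searching a single variance, but the “pick the parameters whose simulations look most like the data” logic is the same.]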

  11. Fully concur with you, gerard, there are a lot of serious and clever people involved in the DSGE research program. As with other research, it seems to me it is the researchers, rather than the general users of the output, who have the ‘right’ understanding of what the output means. So it is conceivable that the DSGE research program was never intended to make predictions about global financial crises.

    I can’t agree that non-structural statistical methods don’t bother with theory. Isn’t it the case that these methods exclude, a priori, deterministic processes? Moreover, the success of these methods in industry is also not convincing because, when applied by more than one business at approximately the same time, wrong decisions are made by all of them (a co-ordination problem), such that we observe excess supply of office space, excess debt, and so on. As for real resource constraints, it seems to me the problem is that natural scientists can’t tell us the numbers – eg how much oil is still underground.

    Well, yes, DSGE models assume ‘market clearing’, but only in terms of the actually traded ‘things’ as recorded in the national accounts (I agree, calibration would be a better word). As we have learned from the methodology associated with ‘general equilibrium theory’, under otherwise similar conditions the difference between complete and incomplete markets lies in the properties of the solution – in the latter case the solution (‘equilibrium’) has the property of being generically Pareto inefficient. So the ‘representative agent’ idea becomes theoretically invalid when markets are incomplete.

  12. I am a bit out of my depth here but I will attempt to flounder on.

    I think theorising about “representative agents” is beside the point; in fact it has no point from the empirical view of economics. It is like theorising about “free will” to explain individual or mass human behaviour. It is an unnecessary and indeed untestable hypothesis (empirically speaking) about an idealised essence or quality (a form of idealism and maybe even essentialism).

    Representative agents (as rational choice makers) and free will belong to moral philosophy and the political part of political economy but not to empirical economic studies. On the empirical side of economics we have econophysics or thermoeconomics for the physical issues. On the financial side we have the study of the behaviour of monetary and financial systems under the current sum total of physical and social-political-economic conditions. For the latter, the work of Steve Keen seems to me to be the best approach. He looks at the empirical reality (the capitalist system is clearly not an equilibrium system) and seeks to generate mathematical models which mimic the various disequilibria which manifest themselves under current and recent historical conditions.

  13. @gerard

    What does this mean?

    As far as I know, these models have the best track-record when it comes to forecasting …

    You could “know” nothing – or do you have a reference to some objective evidence?

    What is “the best track record”?

    Are you the bloke only looking under the streetlight?

  14. I know nothing Chris. I’m keen to be educated. I was just pointing out that imposing the structural form of a particular DSGE model on statistical estimates of the relationship between economic indicators is not likely to improve the accuracy of these estimates. My understanding is that this is why DSGE is not as popular in the private sector as it is in academia, even though the private sector puts major effort into developing forecasting models. But their models are proprietary and their results are not published in academic journals.

  15. I think the failures of macroeconomics – both freshwater and saltwater – have been methodological. The empirical failures are a product of the (very hard) methodological problems.

    Freshwater critics of what Gerard calls “non-structural” models are quite right – they are very much subject to the Lucas critique (more specifically, as a policy tool they are of limited use because of Goodhart’s Law). Non-structural models are excellent at predicting the past but unreliable as a policy guide, because you can’t predict the effect of attempts to control the future if you don’t understand the deep structure (and hence understand which relations will break down when control pressure is put on them). Most times that’s not a huge problem for small private-sector players who don’t have their hands on any policy levers – which may be one reason that, as gerard notes, such methodologically naive models are still popular there. But it is a big problem if you’re the US Fed; after all, Lucas started scribbling because both Keynesian and Monetarist approaches had failed in UNANTICIPATED ways. And it’s an intellectually unsatisfactory situation if you’re an academic.

    But then starting with structural models embodying a drastically impoverished model of individual human behaviour, and torturing the data until it sorta, near enough, confesses agreement has, as we have seen, very serious problems of its own. Having said that, I think if you must do this then the general approach of DSGE is not a terrible one – on my limited understanding it ought in principle to be possible to widen the representative-agent methodology to permit such things as intertemporal trade between individuals and heterogeneous utility functions (at the cost, of course, of even more difficult mathematics), to abandon the super-neutrality of money by adding a finance sector / monetary authority with chronic information and agency problems, to get more realistic shock-propagation mechanisms in the labour market, and so on, and still stay within the general framework. But then IANAM (I Am Not A Macroeconomist) so I may be quite wrong here. If so, can someone explain why, in a way a journeyman labour economist can follow?

    All of which is just to say what we all know – forecasting the future (and therefore making policy to affect the future) is really really hard. Humility is warranted all round.

  16. I’m not sure if ‘Rational Expectations’ models are immune to the Lucas Critique themselves. The “deep structure” is supposed to give a reliable prediction of how a change in policy will change agents’ behavior. But assuming that we know what this deep structure is, and we know what the reaction of the Representative Agent to a change in policy will be, we can then use this knowledge as an opportunity for arbitrage (assuming we have free will and are not just all robots controlled by the Representative Agent). And then the Rational Expectations model would break down just like the models that Lucas was critiquing. Lucas identified a problem with economic modelling, but his proposed solution seems to suffer from the same problem.

  17. The big deal (in the early theology, anyway) is that with Rational Expectations, policy can become ineffective, except to the extent that the policy is about changing real variables. Not too sure how the later theology goes.

    As for representative consumers in DSGE, the representative-consumer idea went out the door in 1972-3 when Sonnenschein showed that ‘anything goes’, so the model didn’t even work in theory (unless you made über-heroic assumptions – that is, even more heroic than usual). http://en.wikipedia.org/wiki/Sonnenschein%E2%80%93Mantel%E2%80%93Debreu_theorem
    I suppose we are lucky that there are Supermen, or should I say superpersons, willing to go that extra heroic distance to keep model families like DSGE alive. But then, theoretical and empirical problems never stopped an economist from being awarded the Bank of Sweden prize. Maybe they should have said the prize is in memory of Walt Disney?

    In 1987 I felt sorry for Lucas as I thought the crash would mean he wouldn’t get the prize, and in a way, that would be unfair as he was more deserving than some of his predecessors. I shouldn’t have worried.
