
Two bob each way

May 22nd, 2009

Now that everyone is rushing to bone up on the late Hyman Minsky, and at least some of us are concluding that New Keynesian macro needs to be dumped in favour of something more like behavioural economics, I thought I’d trawl through the hard disk and see what I had to say on the subject in the past. It turns out I had something of an each-way bet. Here’s something from my 2006 paper with Stephen Bell, advocating controls on financial innovation as the best approach to preventing asset price bubbles.

One obstacle to acceptance of Minsky’s work has been the lack of microeconomic foundations, that is, of a rigorous formal account of individual behavior and the markets in which individuals interact. The idea that such an account is a necessary prerequisite for a coherent macroeconomic theory became popular in the 1970s and reached its high point with new classical macroeconomics in the early 1980s. Since then, however, emphasis on microeconomic foundations has declined for several reasons. First, users of new classical models have found it necessary to make ad hoc adjustments to microeconomic assumptions in order to improve the capacity of their models to match the “stylized facts” about the macroeconomy that they seek to capture. Second, it has been shown that, in important instances, modest deviations from standard neoclassical microeconomic assumptions (rational optimization in competitive markets) can produce large changes in macroeconomic outcomes. Third, evidence arising from fields such as generalized expected utility theory and behavioral finance has cast doubt on the empirical validity of the standard assumptions of neoclassical microeconomics.

As improved models of individual behavior are developed, it seems likely that microeconomic foundations for models similar to Minsky’s will emerge. Such foundations will take account of the fundamental role of uncertainty, emphasized by writers as diverse as Keynes, [Frank] Knight, and Minsky. For the moment, it is sufficient to observe that none of the competing models of asset markets combine rigorous microeconomic foundations with empirically realistic predictions about market behavior.

So back in 2006, I hoped that New Keynesianism (modest deviations from standard neoclassical microeconomic assumptions producing large changes in macroeconomic outcomes) would help in the process of shifting macro away from neoclassical microfoundations, along with the generalized expected utility/behavioral economics approach. I’ve now shifted to the view that NK macro is part of the problem, and that the generalized expected utility/behavioral economics understanding is the right way to go.
This reflects the fact that the second approach has helped me to understand the crisis and the first has not.

Categories: Economics - General
  1. Uncle Milton
    May 22nd, 2009 at 21:24 | #1

    “This reflects the fact that the second approach has helped me to understand the crisis and the second has not.”

    Typo. The second second should be first.

    Fixed now, thanks

  2. SJ
    May 22nd, 2009 at 22:07 | #2

    John, have you seen this:

    Memo to Krugman and DeLong: Start a Little Differently

    1. Ordinary folks in the nonfinancial sector want to issue risky liabilities (shares of investments in fruit trees, mortgages on houses) and want to hold riskless assets (demand deposits, money market fund shares).

    2. The financial sector obliges by having a balance sheet with risky assets and supposedly riskless liabilities. The bigger the financial sector gets, the more euphoric the investment climate, because nonfinancial folks get to hold more riskless assets and issue more risky liabilities.

    3. The financial sector’s expansion is based in part on signaling. That is, banks signal that their liabilities are low risk. Signals include fancy buildings, the “FDIC insured” sticker, balance sheets filled with AAA-rated assets, and so on.

    4. Financial expansions are gradual, because it takes a while to come up with new signaling mechanisms and to get the credibility of those mechanisms established with investors.

    5. Financial crashes are sudden, because once investors lose a little bit of their confidence in financial institutions, their natural instinct is to ask for safer assets (they withdraw money from uninsured banks, or they ask AIG to post collateral). This behavior weakens the financial institutions further, leading to a rapid downward spiral. Today, we are seeing that all sorts of signals are discredited.

    So, my advice to Paul and Brad is this: don’t start with a model that focuses on investor beliefs about real economic variables. Instead, start with a model in which financial firms use signaling to expand, and the credibility of those signals increases over time as long as nothing adverse happens. It should be easy to develop a model in which signaling devices gain credibility slowly but lose credibility suddenly. That will (a) produce the asymmetry between euphorias and crashes and (b) tell a story that puts the fragility of the financial sector in the middle, where it belongs.
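    The slow-gain/sudden-loss dynamic in points 4 and 5 can be sketched as a toy simulation (an illustrative sketch of my own, not part of the quoted memo; all parameter values are arbitrary): credibility accrues in small additive steps during calm periods, but is destroyed multiplicatively once an adverse shock hits.

    ```python
    # Toy sketch of the point 4/5 dynamic (illustration only, not from the
    # memo): signal credibility accrues by small additive steps in calm
    # periods, but collapses multiplicatively once an adverse shock hits.
    # All parameter values below are arbitrary.

    def simulate(periods=100, shock_at=60, gain=0.01, loss=0.5, c0=0.5):
        """Return the path of signal credibility over `periods` steps."""
        c = c0
        path = []
        for t in range(periods):
            if t < shock_at:
                c = min(1.0, c + gain)   # slow build-up, capped at full credibility
            else:
                c *= (1 - loss)          # sudden unravelling after the shock
            path.append(c)
        return path

    path = simulate()
    ```

    With these (arbitrary) numbers, fifty calm periods of accumulation are undone within three periods of the shock: path[59] is 1.0 and path[62] is 0.125, reproducing the asymmetry between euphorias and crashes.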

    Looks to me like it’s time to revisit Kahneman and Tversky.

  3. Ernestine Gross
    May 22nd, 2009 at 22:39 | #3

    SJ, the most prominent signalling agents in the financial sector are, to my knowledge, the rating agencies. Let’s drop the macro language where individuals’ actions are hidden behind terms such as ‘finance industry’.

  4. SJ
    May 22nd, 2009 at 22:54 | #4

    No disagreement from me.

  5. May 23rd, 2009 at 03:01 | #5

    The requirements of economies of scale at the macro level impose restrictions on micro choices. Generalized expectations work because of relatively constant measurement uncertainty among individuals. The two together get you a mathematical model of Minsky theory.

  6. Political Economist
    May 23rd, 2009 at 07:34 | #6

    I am finding behavioural economics very useful, particularly the insights evolutionary psychology is providing.

  7. Ian Lucas
    May 23rd, 2009 at 09:32 | #7

    Steve Keen is working on modelling economic behaviour in a Minskian world. Steve’s analysis looks pretty good to me, but I’m not an economist. I’d be interested to see some expert perspectives on Steve’s analysis.

  8. Lord
    May 23rd, 2009 at 09:42 | #8

    I view it as a transition from assessing value to assessing other people’s assessments of value, recursively providing convergence in expectations and a positive feedback loop.

  9. Alice
    May 23rd, 2009 at 10:11 | #9

    #3 I agree, Ernestine, on rating agencies being a major market signalling device in the financial sector, yet few are questioning the obvious inefficacy of the ratings agencies.

  10. Alice
    May 23rd, 2009 at 10:27 | #10

    Nor are we questioning the people that run the ratings agencies for their own self-interested, profit-maximising business. Why do we expect them to have standards when what they sell can be compared to the now infamous, much-varied and plagiarised red heart foundation tick.

    The US “may” lose its triple A rating per today’s paper. I’m astonished the triple A rating is still there…given the deficit and the GFC and the housing market mess…Hello? Am I dreaming?

    If the US still has a triple A rating it must be because the ratings agencies are US-based firms, living in a wonderland of their own making that they want the rest of us to continue subscribing to.
    I’d rather put my money in the safe…the players and signallers in the financial sector are less than honest (in reality transparently dishonest; we need a model that incorporates the behaviour of Viking-style group pillage and plunder raids).

  11. May 23rd, 2009 at 11:00 | #11

    I am starting to read Keen (HT Ian), it is going to be a hard slog for me, but initially I would phrase the problem somewhat differently.

    There are two distribution networks, money distribution and goods distribution; thus converting a price-and-single-good model into a priceless two-good model.

    The profit maximizing rule is equivalent to minimizing the number of transactions in the supply chains while matching the arrival rates of money and goods. The goal is reached when the distributions of lags in the two supply chains match, i.e. equal spectral functions. This converts the problem into a coherency match.

    After this things go beyond a simple blog comment, and much beyond my current analysis.

    But I add the constant measurement uncertainty, and construct the transaction such that the store manager adjusts inventory while the transactions of the current queue of customers are ongoing. The store clerk can make inventory adjustments even while the current group of transactions is ongoing, as long as the observable result rises above the measurement uncertainty.

    This leads to a coherency between the aggregate distribution in the chain of goods and the individual single store queue.

    Then, I add conditions for the most efficient distribution network structure, which is obtained from estimation theory; as if the distribution network is a linear estimator of output designed to minimize estimation steps.

    These conditions result in quantum coherence, the efficient supply network being the Hamiltonian, and the (measurement uncertainty * transaction time) being Planck’s constant, as in physics.

    The solution is a finite series of orthogonal functions which map to the optimum, and specific bond term period along the yield curve. It results in the time independent solution of equilibrium, and identifies clearly stable structure around equilibrium.

    I cheated. I knew I had to get a quantum mechanical result to explain explosive recombination when confronted by technology shock, as per Schumpeter, and I knew the mathematics had to be quantum mechanical.

    When the equilibrium time of money is much faster than that of goods, one can, to a good approximation, use the computed yield curve as the output. The problem should scale up to national economies.

  12. Kevin Cox
    May 23rd, 2009 at 11:22 | #12

    Economists seem to look at the outputs from the economic system, try to infer what happens inside the system, then try to change some of the outputs to change what goes on inside the system. While this might lead to some interesting insights it is not a good way to build and adapt social systems.

    Minsky seems to be no different.

    Shiller and co are better because they discuss interactions between components. All systems – like economic systems – are both simpler and more complex than most people realise.

    I make this statement as a person who designs, constructs and monitors complex systems.

    Systems are simple in that the direct interactions between components are best left as simple as we can make them; otherwise the systems tend to fail. They are more complex because the emergent properties of the systems are complex and multifaceted, and we often have little idea of what will happen when we change the interaction between components.

    Economists seem to concentrate on the emergent properties and look for cause-and-effect relationships between them: increase the money supply too much and you will get inflation; will changing the minimum wage cause unemployment to rise; etc.

    The other way of looking at the problem is to look at the details of how interactions occur between components. So the way we actually construct systems is to build something that works and has very carefully specified but simple interactions between components, turn the system on and see if it operates the way we expect. As you get better at constructing systems and as you improve and learn from your mistakes and as the different components become better understood so you build systems more quickly and with better overall outcomes.

    I see economic systems as no different from any other system. To change the systems for the “better” we need to concentrate on the bottom level transactions and work on those interactions. We observe the outputs of the system and adjust the bottom level transactions if the outputs are not achieving the desired outcome.

    The financial crisis is an extreme instance of the so-called business cycle. As I have pointed out elsewhere, this is an inevitable result of the way we increase the money supply. It has nothing to do with greed, or appetite for risk, or lack of regulation, and everything to do with the method of increasing the money supply by creating loans. That is, it is to do with the bottom-level process of increasing the money supply.

    If we change the way we do this then the emergent properties of the system will change in all sorts of ways we may or may not guess. But that is the way to change a system. Concentrate on the bottom level processes, change them and monitor what happens.

    To give another example. There has been a lot of discussion about shares to employees and taxing those shares. The designers of the policy have forgotten a fundamental principle of taxation: you should only get taxed on income when you realise the income. Taxing notional share value before it is realised means that promises are taxed, not income, and it is very easy to change promises so that they will not be taxed – so defeating the purpose of the change.

    I think that tax needs to be reformed, but I would reform it in the following way: I would only tax money when it becomes available to an individual or group of individuals to spend on their own consumption (note I said consumption, not investment).

    I would implement a GHG reduction scheme by paying people inversely to their contribution to the problem and requiring them to spend the money they receive on ways to reduce GHG concentrations.

    To reform the banks I would remove the privilege of banks lending money that is not on deposit and at the same time I would require banks to implement instant transfer of money between accounts no matter where they reside. This would lower transaction costs dramatically and stop booms and busts.

    Here is a real live example of a simple bottom-level transaction change that is starting to have interesting emergent properties. We have built a system to identify people by putting the responsibility for identification on the person being identified, not on the organisation that requires the identification. This solves the problem, apparently with the same result, but is leading to interesting emergent properties such as practical electronic signatures, helping prevent identity theft, and fixing the damage when it does occur.

    That is, when there is a system problem we can look for ways to change the underlying interactions to improve the system.

Comments are closed.