There’s something Wolfgang on the internet

I like Wolfgang Munchau’s columns on the Eurozone crisis a lot.  But this one, claiming that macroeconomists need new tools, is serious overreach.  Anyone actually building, solving or just reading others’ models will see that it isn’t right, not even a little bit, on a single point it makes.  This is not to say macro doesn’t have challenges to answer, or that it may not indeed need ‘new tools’.  But if it does, it is not for any of the reasons given in Wolfgang’s column.

Error one.  That ‘their system of equations is linear’.  Well, this is true, except for the thousands of models that are not.  And those that are linear are approximated as such.  I dare say some don’t do this as advisedly as they should.  But most will know what they are doing, and when it works and when it does not.  And the importance of NOT making them linear is THE WHOLE POINT of many papers, in fact of many macroeconomists’ WHOLE LIVES.

I’m not at the forefront of this research, but I did teach a five-hour workshop on various methods for solving these models.  Check it out.  And in that time I could barely skim the surface of what’s been done.  Or take a look at this textbook by Ken Judd.  It contains lots of ‘new tools’.  In a 1998 book!  And tools that had been around far longer in working-paper form before they became lecture notes and then his textbook.  Popular topics now are the macroeconomics of borrowing constraints and the zero lower bound, both of which require non-linear methods, which are embraced, in fact positively revelled in, by the practitioners.
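To give a flavour of what those tools involve, here is a minimal sketch – mine, not from the workshop or from Judd, and with made-up parameter values – of value function iteration for a consumption-savings problem with a borrowing constraint:

```python
import numpy as np

# Value function iteration for a consumption-savings problem with a
# borrowing constraint a' >= 0. All parameter values are illustrative.
beta, r, y = 0.95, 0.03, 1.0          # discount factor, interest rate, income
grid = np.linspace(0.0, 10.0, 400)    # asset grid; the lower bound IS the constraint

# Consumption implied by each (current assets, next-period assets) pair.
c = (1 + r) * grid[:, None] + y - grid[None, :]
util = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)

V = np.zeros_like(grid)
for _ in range(2000):                 # iterate the Bellman operator to a fixed point
    V_new = (util + beta * V[None, :]).max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

a_next = grid[(util + beta * V[None, :]).argmax(axis=1)]   # savings policy
print("assets chosen at the constraint:", a_next[0])
```

Nothing exotic: a grid, a loop and a max operator.  But the policy function this produces is kinked at the constraint, and no linear approximation around a steady state would capture that.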

Error two.  Macro restricts itself to single equilibria.  Or ignores ‘chronic instability’.  Nope.  Multiplicity is everywhere.  Some, admittedly, get embarrassed about it, and hide it away.  There’s an amusing interview with William Brock where he explains how some liked to conceal multiple equilibria in footnotes.  [Can’t find it now, but I’ll update with a link when I do.]  But I think most either address this head on or demand a lot of explanation from authors who seek to sidestep it.  Roger Farmer made much of his career out of multiplicity.  The New Keynesians – often bearing the brunt of this ‘failure of macro’ waffle – have spilled lots and lots of ink over the issue of what kinds of policies and environments generate multiplicity of equilibria and what kinds don’t.  [Anyone who doubts this can try a quick google.  I’d suggest terms like ‘multiple equilibria, multiplicity, equilibrium, macroeconomy, monetary, stability’.]

One example [but there are hundreds] of a paper that embraces both issues, nonlinearity and multiplicity: Benhabib, Schmitt-Grohe and Uribe on the ‘perils of Taylor rules’.  This is solved non-linearly, in the presence of the zero bound.  And it explains how there are two steady states: one with inflation at target, and one with nominal interest rates perpetually trapped at the zero bound.
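You can see the logic of those two steady states on the back of an envelope.  Here is a sketch – my illustration with invented numbers, not the paper’s actual model – that intersects the steady-state Fisher relation with a Taylor rule truncated at zero:

```python
from scipy.optimize import brentq

r_star, pi_star, phi = 0.02, 0.02, 1.5   # real rate, inflation target, Taylor coefficient

def taylor(pi):
    # Active Taylor rule, truncated at the zero lower bound.
    return max(0.0, r_star + pi_star + phi * (pi - pi_star))

def fisher(pi):
    # Steady-state Fisher relation: nominal rate = real rate + inflation.
    return r_star + pi

gap = lambda pi: taylor(pi) - fisher(pi)

pi_target = brentq(gap, 0.0, 0.04)       # intended steady state: inflation at target
pi_trap = brentq(gap, -0.04, -0.01)      # unintended one: rate at zero, deflation
print(f"steady states: inflation {pi_target:.3f} and {pi_trap:.3f}")
```

At the second crossing the nominal rate is zero and inflation settles at minus the real rate: the economy sits at the bound forever.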

Error three.  Secular stagnation is hard to capture in modern macro models.  A little more debatable.  For one thing, the idea started out life as a bit of verbal reasoning by one of the great minds in our profession who is no longer building models.  It’s a conjecture, not a fact.  So even if it were hard to build into a model, that would not necessarily be a bad thing.  But in fact it has been done.  Eggertsson and Mehrotra have done it, for one.  Whether they have done it convincingly is up for debate.  But it is done.  And Greg Thwaites has done it too.  I would not like to trivialise what they did by calling it ‘easy’.  But I think it’s fair to say of both pieces of work that they are not exercises in technical virtuosity.  They use bread and butter tools, and the contribution is in the insight and the economics.  The difficulties they had don’t speak to some problem in macro.  They may well speak to a problem with the hypothesis, conjured creatively but speculatively out of a few charts.

Error four.  Well, I’m not actually sure what is meant here.  But Munchau starts out saying that there is ‘an assumption of limitless space.  That wherever you stand, you can go further.’  This is so vague that it’s not easy to know if it’s wrong or not.  But, some examples: microfounded models assume budget and economy-wide resource constraints.  So, in that sense, the ‘space’ is extremely limited.  And usually there is the invocation that the space the variables live in is bounded.  Munchau writes ‘No go zones like a zero bound are technical minefields in a model.’  Well, not really.  I teach my MSc students how to deal with it in two hours.  They are a clever lot, for sure.  But they are also busy and exhausted and have 25 other things to revise.  Yet they still get it.
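For the curious, the core of that two-hour treatment amounts to something like the following sketch – my stylised version of the standard two-state trap calculation, with invented parameter values.  A negative natural-rate shock persists each period with probability mu; while it lasts the nominal rate is stuck at zero, and the trap-state output gap and inflation solve two linear equations:

```python
import numpy as np

# Two-state zero-lower-bound calculation (stylised; parameters illustrative).
# While the shock persists (probability mu each period), i = 0 and:
#   IS:       x_L  = mu*x_L + sigma*(mu*pi_L + r_L)
#   Phillips: pi_L = beta*mu*pi_L + kappa*x_L
beta, sigma, kappa = 0.99, 1.0, 0.05   # discounting, interest sensitivity, PC slope
mu, r_L = 0.5, -0.02                   # shock persistence, natural rate in the trap

A = np.array([[1 - mu, -sigma * mu],
              [-kappa,  1 - beta * mu]])
b = np.array([sigma * r_L, 0.0])

x_L, pi_L = np.linalg.solve(A, b)
print(f"while the trap lasts: output gap {x_L:.3%}, inflation {pi_L:.3%}")
```

Two equations and one matrix solve.  Hardly a minefield.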

The implication is ‘ooh, look at this really obvious real world thingy that economists just can’t deal with’.  But actually, they can and do, and it’s embraced by hundreds of papers now, since Krugman wrote the first modern one in 1998.

Munchau identifies, therefore, a false ‘consensus’, and hopes that ‘new tools’ will come along to help people challenge it, not realising that these tools have been in use by mainstream macro people since the mid-1980s.  I hope that Munchau’s penetrating writing on other topics doesn’t lead people to take him seriously on this one.


15 Responses to There’s something Wolfgang on the internet

  1. Daniel Davies says:

    I kind of disagree with you because I tend to think Munchau is often horribly bad on lots of other subjects too (and some of his Greece columns have been horrible). But you’re dead right here. The one thing that economists don’t need is more tools, and more models! They’ve already got far too much flexibility in modelling approaches. There’s a way of modelling everything – which, of course, means that in practice, there’s no systematic way of knowing which particular things you ought to be modelling.

    Even the much-hated DSGE models – they’re by definition stock-flow consistent. There’s no reason at all that a DSGE model couldn’t have picked up the only thing (in retrospect) worth knowing about the period of the Great Moderation – that personal sector spending was growing faster than personal sector income, and that this meant that gross debt levels were on an unsustainable path. There’s not even anything “non-linear” about it – every year’s debt level was equal to the previous year’s level multiplied by the growth rate.
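    To spell the arithmetic out (numbers invented purely for illustration):

    ```python
    # Spending growing faster than income puts debt-to-income on an
    # explosive path. All numbers invented for illustration.
    income, spending, debt = 100.0, 100.0, 60.0
    g_income, g_spending, r = 0.03, 0.045, 0.05   # growth rates, interest rate

    for year in range(1, 21):
        debt = debt * (1 + r) + (spending - income)   # interest plus new borrowing
        income *= 1 + g_income
        spending *= 1 + g_spending
        if year % 5 == 0:
            print(f"year {year:2d}: debt/income = {debt / income:.2f}")
    ```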

    This is why it kind of worries me that models are now being produced with all sorts of “frictions” and exciting stuff, rather than a few reasonably simple behavioural relationships (which, I guess, people who like microfoundations can tell themselves fairy stories in order to justify, the way they do with Calvo pricing). These bells and whistles are going to end up being dead useless if they’re not accompanied by a wider change of approach.

    Which is to say – economics is the study of the economy. Nobody needs different tools, they just need to look out the window or read a newspaper once in a while and have more of a think about what kinds of things they ought to be modelling. The big problem with the way the subject went in the 00s is that the decision of “what things should we be modelling?” got taken on the basis of a) what it was tractable to model in a particular framework and b) what was considered important twenty years earlier when the workhorse models were being invented.

    • Luis Enrique says:

      I sent this by email to Tony in response to last night’s twitter conversation about stock flow consistent models, but I see it would fit reasonably well here ….

      caveat: I may not have understood Godley and Lavoie’s work [*]

      one attraction of their approach is that you can load it up (I mean populate matrices) with recent data on the cash flows and balance sheets of firms, households, banks etc. and then calibrate whatever it is that does the job of behavioural equations (that govern the transition from one period to the next – you can see my unfamiliarity with the details of what they do here) and then run it forward and say: if this continues, it’s going to end in disaster.

      Now that is obviously sometimes a useful exercise, sometimes an extremely useful exercise, because this might not have been obvious and sometimes people do keep doing things until they end in disaster and that may be really, really important. See 2007. And I think mainstream macro does not have this feature – I don’t think you can populate a mainstream macro model with reasonably fine-grained data covering what firms, households and banks have been doing in recent years, how their balance sheets have been evolving etc. and run it forward in the same way.
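      To make concrete the sort of thing I mean, here is a toy version – reconstructed from my vague memory of the simplest model in their book, so treat the details with suspicion – of an economy run forward period by period with explicit stock-flow accounting:

      ```python
      # Toy stock-flow-consistent economy, after my vague memory of Godley &
      # Lavoie's simplest model: government spending G, a flat tax rate theta,
      # households consuming out of disposable income and money balances H.
      alpha1, alpha2, theta, G = 0.6, 0.4, 0.2, 20.0   # illustrative parameters
      H = 0.0                                          # household money stock (the only asset)

      for period in range(1, 51):
          # Output solves Y = C + G within the period, with C = a1*YD + a2*H(-1).
          Y = (G + alpha2 * H) / (1 - alpha1 * (1 - theta))
          YD = (1 - theta) * Y                         # disposable income
          C = alpha1 * YD + alpha2 * H                 # consumption function
          H = H + YD - C                               # saving accumulates: stocks and flows add up
          if period in (1, 10, 50):
              print(f"period {period:2d}: output {Y:6.2f}, money stock {H:6.2f}")
      ```

      Replace the invented parameters and starting balances with flow-of-funds data and you have, in embryo, the ‘load it up and run it forward’ exercise.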

      You have to work a lot harder to get mainstream macro to end in tears – you need individually rational behaviour that is irrational in aggregate (some combination of heterogeneous agents, strategic behaviour, coordination problems, herds, principal-agent set-ups and so forth) – and then calibrating the initial conditions to match any given economy at a point in time would be challenging.

      What I take from this is that different approaches have different strengths. You (Tony) will not like the fact that Godley and Lavoie’s approach does not, I think, proceed from optimizing agents. Clearly sometimes people do not keep doing what they are doing until it ends in disaster, so any approach that basically works on that basis will be misleading in some places and times.

      [*] my memories are very vague because I skim-read the book in 2007, not because I was far-sighted or on the ball, but because Godley happened to be a friend of my girlfriend’s mum, and I haven’t gone back to it since (although I often think of doing so)

      • Nick Edmonds says:

        Excellent comments here, in my opinion (both Daniel Davies and Luis Enrique).

        All models can do is help us organise our understanding of what is going on or what might happen. But it can be counter-productive to keep staring at models of one particular type.

        So-called SFC models help us think through certain mechanics that are much harder to represent within, say, DSGE. In my own experience in a commercial environment, I have found them invaluable as a framework for understanding the reasons for, and potential consequences of, things like developments in sectoral balance sheets.

        Absence of micro-foundations is a reason to be wary of the results of such models. We do need to ask ourselves whether our assumed behaviour implies some inconsistency. But there are (different) reasons to be wary of the results of any model. It’s not a reason not to use them.

      • Daniel Davies says:

        [“one attraction of their approach is that you can load it up (I mean populate matrices) with recent data on the cash flows and balance sheets of firms, households, banks etc. and then calibrate whatever it is that does the job of behavioural equations (that govern the transition from one period to the next – you can see my unfamiliarity with the details of what they do here) and then run it forward and say: if this continues, it’s going to end in disaster”]

        This is actually I think the key to my and Noah’s disagreement (or at least, to that small proportion of it which wasn’t just driven by our respective personality problems).

        The strength of Godley/Lavoie type models is their weakness and vice versa, and it’s that they don’t reduce the form. This means that it’s very difficult to “estimate” the model as a whole – you end up just getting econometric estimates as good as you can get for some of the behavioural relationships, and finger-in-the-air rules of thumb for the rest. And as Stephen Kinsella correctly says, this wouldn’t be so bad if the models were generally robust to small variations in the behavioural parameters, but they’re usually not. There is basically no way of estimating all the parameters of the model at once, in a way that ensures that the estimates will be consistent, because there’s nowhere near enough data (very few economies have even one business cycle’s worth of basic flow-of-funds tables).

        But …. a) I don’t think that it is reasonable from a philosophy-of-science point of view to say that decisions made fifty years ago about what statistics to collect should be a constraint on what’s considered to be valid theory. If we were to believe that SFC modelling was the approach needed, then the conclusion would be “start collecting the data for it, now!”.

        And b), not least because they don’t reduce the form, policy simulations in SFC models are proper policy simulations. When you estimate the DSGE model, you end up throwing away this ability because (as I seem to remember it was Tony who first impressed on me) “Impulse Response Functions Are Not Measures Of Marginal Effect”. Once you’ve estimated the model in reduced form, it’s no longer at all clear what you ought to be doing with the inputs in order to do a policy simulation.

      • Tony Yates says:

        I don’t follow the remarks about reducing the form. Different people mean different things by that statement.
        There are many ways to estimate a DSGE model. The model has implications for a VAR impulse response; for an autoregression; for a nonlinear regression on simulated data; for a state-space model. All of these routes help estimate the primitive parameters, and all of them leave us able to do policy simulations.
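        To illustrate just one of those routes in miniature – a deliberately trivial stand-in, not a real DSGE exercise – here is estimation on simulated data: choose the structural parameter so that a moment of data simulated from the model matches the same moment of the observed data.

        ```python
        import numpy as np
        from scipy.optimize import minimize_scalar

        # Simulated-moments estimation in miniature. An AR(1) stands in for the
        # model's decision rule; everything here is purely illustrative.
        def simulate(rho, T=2000, seed=1):
            rng = np.random.default_rng(seed)   # fixed seed: common random numbers
            x = np.zeros(T)
            for t in range(1, T):
                x[t] = rho * x[t - 1] + rng.normal()
            return x

        def autocorr(x):
            return np.corrcoef(x[:-1], x[1:])[0, 1]

        data = simulate(0.9, seed=42)           # pretend this is the observed data
        target = autocorr(data)

        # Choose rho so the model-simulated autocorrelation matches the data's.
        fit = minimize_scalar(lambda rho: (autocorr(simulate(rho)) - target) ** 2,
                              bounds=(0.0, 0.99), method="bounded")
        print(f"data autocorrelation {target:.3f}, estimated rho {fit.x:.3f}")
        ```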

    • Luis Enrique says:

      P.S. d2, I reckon it was also worth knowing that the banking system had got itself into a state whereby the end of a credit boom would cause it to implode. It didn’t have to be that way, did it? I mean the end of a housing bubble and private credit expansion could just have led to a moderate recession as opposed to a global financial cataclysm.

      • dsquared says:

        I tend to take the Dean Baker view on this – that when you’ve got such a massive wealth shock, you’re bound to have a big recession, and that the financial crisis is a bit of a misnomer for a crisis that was all about real estate. But yeah, I see your point.

    • Dan – I am surprised that you view the financial crisis as being about household debt rather than about banking sector balance sheet leverage. The unique part of 2008 was the correlated global bank run. If a few countries with high levels of household debt had had recessions the former would look like a valid thesis. Hence, as I see things, it was the likes of Einhorn and Ackman (and a number of notable banks analysts …) who were a lot closer to the mark than Godley.

      • Daniel Davies says:

        Well there’s two things here – the financial crisis of 2007-8, and the economic crisis of 2006-present. The financial crisis was definitely all about bank leverage, which is why it was so severe in Germany, and so many German banks went bust and had to be bailed out by SoFFIN. But the economic crisis looks to me like it had more to do with a housing wealth shock – again my example is Germany, where there wasn’t as much housing wealth or leverage, and so they had a very different recovery.

      • Daniel Davies says:

        Or actually, more to the point, Canada and Australia. They had all of the sins of high personal sector debt, etc. But they didn’t have a housing wealth shock, for a variety of not entirely clear reasons, and so they didn’t have the kind of recession that the USA did. (people have back-formed theories of Canadian financial stability but actually they really weren’t that different from the USA).

  2. metatone says:

    This seems to be more bait and switch.
    The question is: are you going to do something about your colleagues who do not believe that the economy can sit in a low-employment equilibrium? That’s where the religious belief in a single equilibrium bites – and it bites plenty of people who play around with multiple equilibria intellectually. The religious belief says “there is one true equilibrium and the economy automatically self-corrects towards it.”

  3. Nanikore says:

    Your posts read like a preacher who finds an explanation for everything in his “Good Book”. The problem with this book is not that it does not try to look at everything, but that it does so from the same dubious philosophical foundation.

    (This is touched on in the post, comments and links to Coase here:

    https://hbr.org/2012/12/saving-economics-from-the-economists )

    You say that Steve Keen needs to study more economics. I don’t know. But I suspect he sees something fundamentally wrong with the discipline, and I doubt more study would change that.

    But I have a feeling that YOU really do need to take a few 101s in a few other disciplines. I would suggest political science, political philosophy and/or history. Otherwise there is no way you can objectively look at your own discipline. Sure you can study medicine and only medicine and be a good doctor. But that does not apply in subjects like economics.

  4. Daniel Davies says:

    What I meant earlier by “reducing the form” was that for most DSGE models, the model that’s written down is stock-flow consistent in the strictest sense, but the model that’s estimated tends to be:

    1) something like an IS curve
    2) something like a Phillips curve
    3) something like a Taylor rule
    4) because we are talking about modern post-crisis versions with financial frictions in them, something like a consumption/wealth ratio
    5) something like a hysteresis effect
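
    In miniature, and with coefficients invented purely to fix ideas (a backward-looking cartoon of the first three items, nothing more):

    ```python
    import numpy as np

    # Cartoon of the estimated system above: IS curve, Phillips curve, Taylor
    # rule, simulated forward under demand shocks. Coefficients invented.
    rng = np.random.default_rng(0)
    T = 40
    x = np.zeros(T)    # output gap
    pi = np.zeros(T)   # inflation, deviation from target
    i = np.zeros(T)    # policy rate, deviation from neutral

    for t in range(1, T):
        x[t] = 0.7 * x[t - 1] - 0.5 * (i[t - 1] - pi[t - 1]) + rng.normal(0, 0.3)  # IS
        pi[t] = 0.8 * pi[t - 1] + 0.2 * x[t]                                       # Phillips
        i[t] = 1.5 * pi[t] + 0.5 * x[t]                                            # Taylor rule

    print("output gap volatility:", round(float(x.std()), 2))
    ```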

    If your starting point is a Godley-style mega-table with all sorts of budget constraints in it, you’re always going to be very sceptical about the exercise of recovering the parameters of the original DSGE model from the set of relationships that you’ve estimated. Similarly, you’re going to be sceptical about whether those parameters are going to be any use for doing things out of sample – particularly since, as Luis says, everything is being driven off a presumed equilibrium relationship and is trying to drag itself back to the more or less stable relationships 1-5 estimated above, so it’s going to be more difficult than it potentially ought to be to get your model to tell you that everything’s going to blow up.

    I am obviously at a massive disadvantage to TY when it comes to making arguments about econometrics because I am basically shite at it, though! So I’m sure that what I’m describing here is more of a problem for the way I learned this in the 90s at LBS than anything on the cutting edge now. But I find myself wanting to say that there’s an underlying issue here, in that the more sophisticated you get about extracting the model parameters from the estimation, the more arguable it is that you’re basically doing the same thing that Wynne Godley did and just using judgement and statistical good practice to fix up a model that’s basically hugely underidentified by the data available (in which case it comes back to my original point that the correct way to do economics shouldn’t be determined by what decisions were made in statistical agencies fifty years ago!).

    I’m probably talking rubbish though, so sorry for that. I’d be really interested in hearing about nonlinear regression (i.e. regime switching?) on simulated data – how do you simulate the data?
