I had high hopes for this page, but then it fell by the wayside. Think of it like those metal spurs poking out of one of those one-storey concrete houses, waiting forlornly for the money or energy for a second floor that never comes. Maybe I will get back to it.
Below are links to, and brief comments on, interesting papers I have read recently. They won't necessarily be new: just things I have come across, had Tweeted at me, or that the unfolding economic news has reminded me of.
Gertjan Vlieghe: Debt, demographics and the distribution of income: new challenges for monetary policy
How these 3 D's bear down on the natural rate of interest, separately and perhaps also by interacting with each other, and how that means the medium-run neutral nominal rate will be significantly lower than in the pre-crisis period. To be read in conjunction with Charlie Bean's paper/slideshows on secular stagnation.
Narayana Kocherlakota: Fragility of purely real macroeconomic models
Kocherlakota has just stepped down from his role as President of the Minneapolis Fed and FOMC member to resume research and teaching. This paper seems very important to me, though I don't yet grasp all of it.
His paper notes a discontinuity, under an interest rate peg, between monetary, rational expectations macroeconomic models that have almost flexible prices and almost vertical Phillips Curves, and those that have completely flexible prices and a vertical Phillips Curve. Kocherlakota uses the discontinuity to argue that flexible-price – so-called 'real business cycle' – models are not useful for policy and business cycle analysis.
The discontinuity is illustrated using the Neo-Fisherian hypothesis. That is: if interest rates are pegged at a rate, say, higher than current rates, would this cause inflation to rise [the Neo-Fisherian view] or fall [the conventional New Keynesian view]?
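To fix ideas [my shorthand for the logic, not an equation lifted from the paper], the Fisher relation is

i_t = r_t + E_t \pi_{t+1}

In a purely real, flexible-price model the real rate r_t is pinned down by preferences and technology, so pegging the nominal rate i_t permanently higher can only show up, eventually, as higher expected inflation; with sticky prices and conventional New Keynesian dynamics, the same peg raises the real rate in the short run and pushes inflation down.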
The discontinuity can be got rid of by i) eliminating the zero bound on nominal interest rates, and therefore the probability that interest rates might ever be stuck at a peg, or ii) specifying a fiscal policy that would respond to make up for any constraint on monetary policy [I think, for example, by a carefully calibrated commitment not to raise taxes adequate to cover all outstanding nominal debt, à la the fiscal theory of the price level].
The analysis is confined to rational expectations versions of the model, so it remains to be seen whether RBC can be salvaged as a good approximation by recourse to alternative specifications of expectations, for example constant gain least squares learning. Though one might speculate that even if it could, that would not be terribly comforting for RBC modellers who tend to view RE as a vital part of the machinery.
Alwyn Young: Channelling Fisher
Interesting paper. Shows the range of AW's interests, for one thing. It takes randomised control trial papers from major journals and recomputes their test statistics using Fisher's randomisation method in place of the asymptotically valid standard tests originally used. It finds that many papers reporting significant treatment effects no longer do.
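For anyone unfamiliar with the randomisation idea, here is a minimal Python sketch of the basic logic using a difference-in-means statistic. It is my own toy illustration of the technique, not Young's actual procedure, which works with each paper's own specification and randomisation scheme.

import numpy as np

def fisher_randomisation_pvalue(y, d, n_draws=10000, seed=0):
    # Two-sided randomisation-test p-value for a difference in means.
    # Under the sharp null of zero treatment effect for every unit, outcomes y
    # are fixed and only the 0/1 treatment labels d are random, so we re-draw
    # the assignment many times and ask how extreme the observed difference is.
    rng = np.random.default_rng(seed)
    y, d = np.asarray(y, dtype=float), np.asarray(d)
    obs = y[d == 1].mean() - y[d == 0].mean()
    extreme = 0
    for _ in range(n_draws):
        d_perm = rng.permutation(d)  # re-randomise treatment labels
        stat = y[d_perm == 1].mean() - y[d_perm == 0].mean()
        if abs(stat) >= abs(obs):
            extreme += 1
    return extreme / n_draws

The conventional t-test leans on large-sample approximations; the randomisation p-value needs only the randomised assignment itself, which is part of why the two can disagree in small or unbalanced samples.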
Spencer Dale: New Economics of Oil
Very nice think-piece from my old boss, Spencer Dale, now of BP. Interesting points: the likely path of expansion of new reserves means the old Hotelling model of oil as an exhaustible resource is no longer right, and likewise the assumption that the oil price should rise in line with the real rate. Also points out that shale is financed by borrowing by small, non-state companies in the US, hence a 'credit channel' has been introduced into oil [a point made by the BIS a few months earlier].
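For reference [my gloss, not BP's], the textbook Hotelling rule being pushed back against says that, with negligible extraction costs, the price of an exhaustible resource should rise at the real rate of interest,

\frac{\dot{p}_t}{p_t} = r

so that oil in the ground earns the same return as any other asset. Continually expanding recoverable reserves undercut the exhaustibility premise on which that arbitrage rests.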
Angus Deaton: Instruments of development
Expresses AD's scepticism that randomised control experiments can establish a sound enough understanding of causality in development. Nice counterweight to the views expressed in Angrist and Pischke's 'Mostly harmless econometrics', and their follow-up book, that 'structural econometrics' [estimated relationships derived from explicit theoretical models of causality] is a bad teaching tool.
McKay, Nakamura, Steinsson: The power of forward guidance revisited
Just saw this at UCL today, 14.10.15. Great presentation. Paper’s starting point is the puzzle that in the familiar, complete markets, representative agent sticky price model, committing to small changes in monetary policy a long way out has very large effects. Rather than take this as encouragement that forward guidance is effective, some take it as evidence that the model is not right. The paper moves on to present an incomplete markets, heterogeneous consumer version of the NK model, showing that committing to future changes in interest rates has much less of an effect.
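The source of the puzzle in the complete markets benchmark is easiest to see from the iterated, log-linearised Euler equation [my shorthand for the standard logic, not an equation taken from the paper]:

\hat{c}_t = E_t \hat{c}_{t+T} - \frac{1}{\sigma} \sum_{s=0}^{T-1} E_t \left( \hat{i}_{t+s} - \hat{\pi}_{t+s+1} \right)

Every future real rate enters with the same, undiscounted weight, so a promised rate cut far in the future moves consumption today by as much as a cut today. Precautionary saving and the chance of hitting a borrowing constraint are what, roughly speaking, discount those far-off terms in the heterogeneous agent version.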
Garin, Lester and Sims: On the desirability of nominal GDP targeting
This paper has attracted quite a bit of attention in the popular econ media, eg on Twitter, partly because of the viral interest in nominal GDP targeting spread by the 'market monetarists'. I want to emphasise only that this paper does not show that nominal GDP targeting beats inflation targeting as the Fed, the BoE's MPC or the Bank of Canada would interpret it. They would view their mandates as giving them licence to do optimal policy, as best they see it, with the quantitative target for inflation pinning down the expected rate of inflation over the long term. In the UK, a Treasury review of the BoE's mandate in 2013 interpreted things in just this way. Nominal GDP targeting can beat other constrained policies [not least because it's not that dissimilar to flexible inflation targeting, which is optimal in the New Keynesian model] but rarely wins in general. Then again, neither does it lose by much. My beef with the NGDP lot was never that this was a dumb policy, just that it's dumb to think it would change much about the world, in particular that it would magically solve the problems we experienced during the crisis.
Yossi Yakhin: Solving linear rational expectations models
This is a nice, old, compact summary of different methods for doing what it says on the tin, organised around a simple RBC model. These techniques can almost be forgotten about with software like DYNARE, but sometimes things go wrong, and to have a hope of debugging, it helps to have gone through the nuts and bolts. For my purposes, I'm writing a paper using least squares learning, and need to code in the RE equilibrium as a starting point, to ensure stability. [Learning models are often stable only close to the REE, and even then only sometimes.]
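Since I mention it, here is a minimal Python sketch of constant-gain least squares learning, in a toy scalar model of my own choosing rather than anything in Yakhin's paper, with beliefs initialised at the REE coefficient in the way described above.

import numpy as np

# Toy model: y_t = beta * E_t[y_{t+1}] + gamma * x_t,  x_t = rho * x_{t-1} + eps_t.
# Its REE is y_t = a * x_t with a = gamma / (1 - beta * rho).
beta, gamma, rho, gain = 0.95, 1.0, 0.9, 0.02
a_ree = gamma / (1.0 - beta * rho)   # REE coefficient, used as the starting belief

rng = np.random.default_rng(0)
T = 5000
phi = a_ree      # agents' current estimate in their perceived law of motion y_t = phi * x_t
R = 1.0          # running estimate of the second moment of the regressor x
x = 0.0

for t in range(T):
    x = rho * x + rng.standard_normal()
    # Expectations use last period's belief: E_t[y_{t+1}] = phi * rho * x_t,
    # so the actual law of motion is y_t = (gamma + beta * rho * phi) * x_t.
    y = beta * phi * rho * x + gamma * x
    # Constant-gain recursive least squares update of (R, phi).
    R = R + gain * (x * x - R)
    phi = phi + gain * (x / R) * (y - phi * x)

print("REE coefficient:", a_ree, "belief after learning:", phi)

With the gain held constant, phi hovers around the REE value rather than converging to it exactly; whether it stays nearby at all depends on the usual E-stability condition, which in this toy example is beta * rho < 1.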
Ivan Werning: ‘Incomplete markets and aggregate demand’.
Supremely lucid paper about many things, but broadly the connection between representative agent models like the RBC and NK models, and heterogeneous agent models like the Bewley model. Shows that you can get relations involving aggregate consumption and interest rates that look similar to Euler Equations, with implications for those wanting to motivate demand shocks in representative agent models, and those wanting to infer failures of models of consumer optimisation from discrepancies between traditional EEs and actual consumption.
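The representative agent benchmark such relations get compared with is the familiar log-linearised Euler equation [the textbook expression, not Werning's notation]:

\hat{c}_t = E_t \hat{c}_{t+1} - \frac{1}{\sigma} \left( \hat{i}_t - E_t \hat{\pi}_{t+1} \right)

so the point, as I read it, is that aggregating the heterogeneous agent model can deliver something of this shape, but with coefficients – and hence policy implications – that need not coincide with the representative agent ones.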
Just came across this while preparing my new MSc open economy macro lectures. The small open economy version of the RBC model implies a present value relation for the current account. Counterparts to the forecast expressions can be obtained from VARs, and these show that the PVM predictions are badly rejected, spurring attempts to reconcile data with model.
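For reference, the present value relation in question takes the standard textbook form [my reminder, not necessarily the paper's exact notation]:

CA_t = - \sum_{j=1}^{\infty} \left( \frac{1}{1+r} \right)^{j} E_t \, \Delta NO_{t+j}

where NO is net output [output less investment and government spending]: the country should run a surplus today when net output is expected to fall, and the VAR supplies the expectations on the right-hand side.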
Chang and Li: Is economics research replicable?….
I found this through Mike Bird's Twitter feed. 'Usually not', they conclude. And even then many of the replications require assistance from the original authors. When I have tried this, that process is pretty painful, with the back and forth of emails and misunderstandings. I hope these authors get well rewarded for what otherwise is a thankless externality task, and one that is pretty brutalising to execute, I would imagine. The only thing worse than trying to debug your own code is trying to get someone else's to work.
Garcia-Schmidt and Woodford: Are low interest rates deflationary?
This paper responds to Cochrane, who pointed out that pegging interest rates at the zero bound may actually be the cause of, and not the solution to, low inflation. The critique is a proof by counter-example involving a small departure from rational expectations, and thus from the perfect foresight rational expectations equilibrium concept underlying Cochrane [and indeed much of the other analysis that reached the opposite conclusion to Cochrane]. With this departure, the stimulative effects of low rates are rescued, though the exact amount of stimulus can be somewhat less than was indicated in the original perfect foresight analysis of forward guidance. By extension, one can, I think, make the same general point about Cochrane's JPE paper on the whole business of RE New Keynesian macro analysis of Taylor Rules, and how their stabilising properties work, or don't work. That paper debunked the logic policymakers often use to describe the NK model [if I promise to raise the nominal rate a lot, that should raise the real rate and reduce inflation….] and replaced it with a perverse story about threatening to explode the economy with hyperinflation if agents did not coordinate on stabilising expectations. So the comment on that work would be: this takes RE far too seriously. Small departures from it may well rescue the old policymaker logic. Certainly they would in a model of constant gain least squares learning.