Mainstream macroeconomics in a state of ‘intellectual regress’ – Bill Mitchell – Modern Monetary Theory
http://bilbo.economicoutlook.net/blog/?p=35118
At the heart of economic policy making, particularly central bank forecasting, sit the so-called Dynamic Stochastic General Equilibrium (DSGE) models of the economy, which are a blight on the world and the most evolved form of the nonsense that economics students are exposed to in their undergraduate studies. Paul Romer recently published an article on his blog (September 14, 2016) – The Trouble With Macroeconomics – which received a fair amount of attention in the media, given that it represented a rather scathing, and at times personalised (he ‘names names’), attack on the mainstream of my profession. Romer describes mainstream macroeconomics as being in a state of “intellectual regress” for “three decades”, culminating in the latest fad of New Keynesian models, where the DSGE framework presents a chimera of authority. His attack on mainstream macroeconomics is worth considering and linking with other evidence that the dominant approach in macroeconomics is essentially a fraud.
Romer begins by quoting “a leading macroeconomist” (he doesn’t name him but it is Jesús Fernández-Villaverde from UPenn, who I would hardly label “leading”) as saying that he was “less than totally convinced of the importance of money outside the case of large inflations” (meaning that monetary policy cannot have real effects – cause fluctuations in real GDP etc).
This monetary neutrality is a core presumption of the mainstream approach.
Romer then uses the Volcker hike in real interest rates in the US in the late 1970s, which preceded two sharp recessions in a row and drove unemployment up and real GDP growth down, as a vehicle for highlighting the absurdity of this mainstream approach to macroeconomics.
The Fernández-Villaverde quote, as an aside, came from this awful paper from 2009 – The Econometrics of DSGE Models – where the author claims the developments in macroeconomics “between the early 1970s and the late 1990s” represented a “quantum leap” equivalent to “jumping from the Wright brothers to an Airbus 380 in one generation”.
Well at least the Airbus 380 (and the Wright brothers) actually flew and were designed to work in reality! But then Fernández-Villaverde is typical of the mainstream economists, not short on self-promotion and hubris.
Romer proceeds to eviscerate the elements of what Fernández-Villaverde thinks represent the ‘quantum leap’.
He starts with “real business cycle (RBC)” theory, which claims “that fluctuations in macroeconomic aggregates are caused by imaginary shocks, instead of actions that people take”.
In RBC a collapse in nominal spending has no impact. Only real shocks (these “imaginary” shocks) which are driven by random technology changes and productivity shifts apparently drive market responses and economic cycles.
A recession is characterised as an ‘optimal’ response to a negative productivity shock – workers, knowing that they are less productive now than they will be in the future, offer less work (hence choose leisure over labour) and become unemployed, in the knowledge that they can make up the income later when they will be more productive.
While Fernández-Villaverde thinks the RBC approach was a “particular keystone” in the “quantum leap”, any reasonable interpretation of the proposition that recessions are optimal outcomes of individuals choosing to be unemployed would label RBC absurd.
Romer is highly critical of this approach which he calls “post-real”.
He then provides a detailed critique of DSGE models which he rightfully considers to be the next part of the RBC “menagerie” – the “sticky-price lipstick on this RBC pig”.
I won’t go into all the nuts and bolts (they are beyond the interest, and probably the scope, of most of my readership).
I wrote, at some length about the New Keynesian/DSGE approach in this blog (2009) – Mainstream macroeconomic fads – just a waste of time.
The New Keynesian approach has provided the basis for a new consensus emerging among orthodox macroeconomists. It attempted to merge the so-called Keynesian elements of money, imperfect competition and rigid prices with the so-called Real Business Cycle theory elements of rational expectations, market clearing and optimisation across time, all within a stochastic dynamic model.
That might sound daunting to readers who haven’t suffered the years of propaganda that pass for an economics ‘education’ these days, but let me assure you: all the fancy terminology (‘rational expectations’, ‘stochastic dynamics’, ‘intertemporal optimisation’ and the rest of it) cannot hide the fact that these theories, and the attempts to apply them to real-world data, are a total waste of time.
Yes, they give economists a job. A job is a good thing. But these brains would be far better applied to helping improve the lot of real people in need rather than filling up academic journals with tripe.
New Keynesian theory is actually very easy to understand despite the deliberate complexity of the mathematical techniques that are typically employed by its practitioners.
In other words, like most advanced macroeconomic theory, it is made to look complex, and that perception serves the ideological agenda – avoiding scrutiny while appearing authoritative.
Graduate students who undertake advanced macroeconomics become imbued with their own self-importance as they churn out what they think is deep theory that only the cognoscenti can embrace – the rest are stupid (doing Arts or Law or something). If only they knew they were reciting garbage and had, in fact, very little to say (in any meaningful sense) about the topics over which they claim intellectual superiority.
The professors who taught them are worse, if that is possible.
In these blogs – GIGO and OECD – GIGO Part 2 – I discussed how Garbage In Garbage Out is the process that leads us to be beguiled by what amounts to nothing.
The framework is deeply flawed and bears no relation, at the macroeconomic level, to the operational realities of modern monetary economies.
In our 2008 book Full Employment Abandoned: Shifting Sands and Policy Failures, we have a section on the so-called New Keynesian (NK) models, and we argue that they are the latest denial of involuntary unemployment in a long list of efforts that the mainstream has offered over the years.
DSGE/NK models claim virtue (‘scientific rigour’) because they are, in their own terms, ‘microfounded’. What the hell does that mean?
Not much when it comes down to it.
Essentially, the ‘micro foundations’ push arose as a critique of Keynesian-style macroeconomics, which was built by analysing movements in aggregates (output, employment etc). The microeconomists believed that their approach to economics was the valid one and that macroeconomics should just be an aggregated version of what micro theory claimed explained individual choice.
So just as micro theory imposed a particular psychology on individual choice (for example, about decisions to work or consume) – in which individuals (consumers, firms) were assumed to use optimising calculations based on ‘rational expectations’ (essentially, perfect foresight with random errors) to make decisions now about behaviour until they died – macroeconomic theory should embrace the same approach.
I considered some of these assumptions about behaviour in this blog – The myth of rational expectations.
The Rational Expectations (RATEX) literature which evolved in the late 1970s claimed that government policy attempts to stimulate aggregate demand would be ineffective in real terms but highly inflationary.
People (you and me) anticipate everything the central bank or the fiscal authority is going to do and render it neutral in real terms (that is, policy changes do not have any real effects). But expansionary attempts will lead to accelerating inflation because agents predict this as an outcome of the policy and build it into their own contracts.
In other words, they cannot increase real output with monetary stimulus but always cause inflation. Please read my blog – Central bank independence – another faux agenda – for more discussion on this point.
The RATEX theory argued that it was only reasonable to assume that people act rationally and use all the information available to them.
What information do they possess? Well RATEX theory claims that individuals (you and me) essentially know the true economic model that is driving economic outcomes and make accurate predictions of these outcomes with white noise (random) errors only. The expected value of the errors is zero so on average the prediction is accurate.
Everyone is assumed to act in this way and have this capacity. So ‘pre-announced’ policy expansions or contractions will have no effect on the real economy.
For example, if the government announces it will be expanding its fiscal deficit and adding new high powered money, we will also assume immediately that it will be inflationary and will not alter our real demands or supply (so real outcomes remain fixed). Our response will be to simply increase the value of all nominal contracts and thus generate the inflation we predict via our expectations.
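The mechanics of that claim can be seen in a toy sketch (made-up numbers, and a crude quantity-theory relation standing in for the ‘true model’ the agents are assumed to know – all of it illustrative, none of it endorsed): the pre-announced expansion is fully priced in, so it shows up entirely as inflation while real output is untouched.

```python
# Toy sketch of the RATEX neutrality claim (illustrative numbers only).
# Assume agents know the 'true model': prices satisfy P = M / Y_real
# (a crude quantity-theory relation), and real output is fixed by 'real' factors.

def expected_price_level(money_supply, real_output):
    """Agents' rational-expectations prediction of the price level."""
    return money_supply / real_output

real_output = 100.0      # fixed by technology/preferences in the RATEX story
money_before = 1000.0
money_after = 2000.0     # pre-announced doubling of the money supply

p_before = expected_price_level(money_before, real_output)
p_after = expected_price_level(money_after, real_output)

# Agents rewrite all nominal contracts at the new expected price level,
# so the 'stimulus' shows up entirely as inflation:
inflation = p_after / p_before - 1
print(inflation)       # 1.0 -> 100% inflation
print(real_output)     # unchanged: the policy is 'neutral' in real terms
```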
The problem is that all this theory has to be put together in a way that can be mathematically solvable (“tractable”). That places particular restrictions on how rich the analysis can be.
It is impossible to achieve a solution based on the alleged ‘micro foundations’ (meaning at the individual level), so the first dodge we encounter is the introduction of the ‘representative agent’ – a single consumer and a single firm that represent the maximising behaviour of all the heterogeneous ‘agents’ in the economy.
Famous (pre-DSGE) macroeconomist Robert Solow gave evidence to the US Congress Committee on Science, Space and Technology – in its sub-committee hearing on Investigations and Oversight, Science of Economics – on July 20, 2010. The evidence is available HERE.
Here is an excerpt relevant to the topic:
Under pressure from skeptics and from the need to deal with actual data, DSGE modellers have worked hard to allow for various market frictions and imperfections like rigid prices and wages, asymmetries of information, time lags, and so on. This is all to the good. But the basic story always treats the whole economy as if it were like a person, trying consciously and rationally to do the best it can on behalf of the representative agent, given its circumstances. This can not be an adequate description of a national economy, which is pretty conspicuously not pursuing a consistent goal. A thoughtful person, faced with the thought that economic policy was being pursued on this basis, might reasonably wonder what planet he or she is on.
So the micro foundations manifest, in the models, as homogenous (aggregated) maximising entities, with the assumption that everybody behaves in the same way when confronted with market information.
This ‘one’ (mythical) consumer is rational and has perfect foresight so that it (as a representation of all of us) seeks to maximise ‘life-time’ income.
Consumption spending in the real world accounts for about 60 per cent of total spending. So the way in which these DSGE models represent consumption will have important implications for their capacity to mimic real world dynamics.
This mythical consumer is assumed to know the opportunities available for income now and in every period into the future, and solves what is called an ‘intertemporal maximisation’ problem – which just means they choose the consumption plan that maximises their utility in every period from now until infinity.
So the consumer maximises the present discounted value of consumption over their lifetime given the present value of the resources (wealth) that the person knows he/she will generate.
If you don’t know what that means then you are either not a real person (given that the mainstream consider this is the way we behave) or you are just proving what I am about to say next and should feel good about that.
A basic Post Keynesian presumption, which Modern Monetary Theory (MMT) proponents share, and which is central to Keynes’ own analysis in the 1936 General Theory, is that the future is unknowable and so, at best, we make guesses about it, based on habit, custom, gut-feeling etc.
So this would appear to make any approach that depends on being able to optimise at each point in time problematic to say the least.
Enter some more abstract mathematics in the form of the famous Euler equation for consumption, which provides a way in which the mythical consumer links decisions now with decisions to consume later and achieve maximum utility in each period.
This equation is at the centrepiece of DSGE/NK models.
It simply says that the representative consumer must be indifferent (that is, not care) between consuming one more unit today or saving that extra unit of consumption and consuming the compounded value of it in the future (and this must hold in every period).
It is written as:
Marginal Utility from consumption today MUST EQUAL A(1 + R)*Marginal Utility from consumption in the future.
Marginal utility relates to the satisfaction derived from the last unit of consumption. * is a multiply sign. If we save today we can consume (1 + R) units in the future, where R is the compound interest rate.
The weighting parameter A just refers to the valuation that the consumer places on the future relative to today.
The consumer always satisfies the Euler equation for consumption, which means all of us do it individually, if this approach is to reflect the micro foundations of consumption.
Some mathematical gymnastics based upon these assumptions then generates the maximising consumption solution in every period, where the representative consumer spends up to the limits of his/her income (both earned and non-labour).
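For the technically curious, the mechanics can be checked in a stripped-down two-period version of the problem – a sketch with log utility and made-up numbers, not any particular published model:

```python
# Two-period sketch of the intertemporal problem with log utility, to show
# what the Euler equation MU(c_today) = A*(1+R)*MU(c_future) actually requires.
# Wealth W, patience weight A and interest rate R are made-up numbers.

W, A, R = 100.0, 0.95, 0.05

# Maximise ln(c1) + A*ln(c2) subject to c2 = (1 + R) * (W - c1).
# With log utility the closed-form solution is c1 = W / (1 + A).
c1 = W / (1 + A)
c2 = (1 + R) * (W - c1)

# Marginal utility of log consumption is 1/c, so the Euler condition is:
lhs = 1 / c1                  # MU of consuming one more unit today
rhs = A * (1 + R) * (1 / c2)  # discounted MU of saving it instead

print(abs(lhs - rhs) < 1e-9)  # True: the condition binds at the optimum
```

Note how much machinery even this trivial version requires: the consumer must know W, A and R exactly, and solve the optimisation in closed form, before spending a cent.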
Reflect on your own life first. You will identify many flaws in this conception of your consumption decision making.
1. No consumer is alike in terms of random shocks and uncertainty of income. Some consumers spend every cent of any extra income they receive while others (high-income earners) spend very little of any extra income. Trying to aggregate these differences into a representative agent is impossible.
2. No consumer is alike with respect to access to credit.
3. No consumer really considers what they will be doing at the end of their life in any coherent way. Many, as a result of their low incomes or other circumstances, are forced to live day by day. There is no conception of permanent lifetime income, which is central to DSGE models.
4. Even the concept of the representative agent is incompatible with what we observe in macroeconomic data.
For example, a central notion within the DSGE models is that of Ricardian Equivalence.
This notion claims that individuals (consumers and firms) anticipate policy changes now in terms of their implications for their future incomes and adjust their current behaviour accordingly, which has the effect of rendering the policy change ineffective.
Specifically, say a government announces it will cut its net spending (austerity); the representative agent is claimed to believe that the government will be decreasing taxes in the future, which increases the present value of permanent income, and as a result the agent will spend more.
As a result, it is alleged that austerity is not a growth killer because total spending just shifts from public to private.
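The arithmetic behind the Ricardian claim is just present-value accounting. Here is a sketch of the textbook bond-financed tax-cut version (illustrative numbers, not any specific model):

```python
# Present-value arithmetic behind the textbook Ricardian Equivalence claim
# (illustrative numbers; the standard bond-financed tax-cut version).

r = 0.04                 # interest/discount rate assumed by the model's agent
income = [50.0, 50.0]    # after-tax income today and next period, pre-policy

def present_value(stream, r):
    """Discounted value of an income stream at rate r."""
    return sum(x / (1 + r) ** t for t, x in enumerate(stream))

pv_before = present_value(income, r)

# Government cuts taxes by 10 today, financed by bonds repaid next period,
# so the agent expects taxes to RISE by 10*(1+r) next period.
cut = 10.0
income_after = [income[0] + cut, income[1] - cut * (1 + r)]
pv_after = present_value(income_after, r)

# Lifetime wealth is unchanged, so (the model claims) consumption is too:
print(abs(pv_before - pv_after) < 1e-9)  # True: the tax cut is 'neutral'
```

The same discounting logic, run in reverse, underpins the austerity story in the text: the agent is assumed to re-solve this sum instantly whenever fiscal policy is announced.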
The reality is quite different and opposite to what the Euler equation predicts. Consumers actually cut back spending in periods of austerity because unemployment increases. Firms cut back on investment spending because sales flag.
5. The research evidence (from behavioural economics) refutes the rationality and forward-looking assumptions built into DSGE models.
We are not good at solving complex intertemporal financial maximisation mathematical models.
S&P conducted a Global Financial Literacy Survey in 2015 (published November 20, 2015) and found that “57% Of Adults In U.S. Are Financially Literate” (Source).
The survey found that “just one-third of the world’s population is financially literate.”
The questions probed knowledge of interest compounding, risk diversification, and inflation.
You can see the test questions – HERE. I passed, but then I have a PhD in economics.
The knowledge required to pass this literacy test is far less than would be required to compute the intertemporal maximisation solution constrained by the Euler equation, even if there was perfect knowledge of the future.
We tend to satisfice (near enough is good enough) rather than maximise.
We are manipulated by supply-side advertising which distorts any notion that we make rational choices – we binge, impulse buy etc.
In this blog (June 18, 2014) – Why DSGEs crash during crises – the founders of so-called General-to-Specific time series econometric modelling (David Hendry and Grayham Mizon) wrote that we do look into the unknowable future to make guesses when deciding what to do now.
We typically assume that there will be “no unanticipated future changes in the environment pertinent to the decision”. So we assume the future is stationary.
For example, when I ride my motorcycle down the road I assume (based on habit, custom, past experience) that people will stop at the red light, which allows me to accelerate (into the future) through the next green light.
But Hendry and Mizon correctly note that:
The world, however, is far from completely stationary. Unanticipated events occur … In particular, ‘extrinsic unpredictability’ – unpredicted shifts of the distributions of economic variables at unanticipated times – is common. As we shall illustrate, extrinsic unpredictability has dramatic consequences for the standard macroeconomic forecasting models used by governments around the world – models known as ‘dynamic stochastic general equilibrium’ models – or DSGE models …
… the mathematical basis of a DSGE model fails when distributions shift … General equilibrium theories rely heavily on ceteris paribus assumptions – especially the assumption that equilibria do not shift unexpectedly.
So at some point, an unexpected rogue red-light runner will bring my green-light equilibrium into question – how many times have I had to swerve without warning to avoid an accident!
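Hendry and Mizon’s point can be illustrated with a toy forecasting exercise (synthetic data, nothing to do with any actual central bank model): a forecaster who assumes the series is stationary around its historical mean does fine – until the distribution shifts.

```python
# Toy illustration of Hendry and Mizon's point (synthetic data, no real model):
# a forecaster assumes the series is stationary around a fixed mean, then the
# distribution shifts at an unanticipated time and the forecasts fail badly.

import random

random.seed(1)

# Pre-break: growth fluctuates around 3%. Post-break ('crisis'): around -4%.
pre_break = [3.0 + random.gauss(0, 0.5) for _ in range(40)]
post_break = [-4.0 + random.gauss(0, 0.5) for _ in range(10)]

# The 'stationary' forecast: tomorrow looks like the historical average.
forecast = sum(pre_break) / len(pre_break)

pre_errors = [abs(x - forecast) for x in pre_break]
post_errors = [abs(x - forecast) for x in post_break]

print(round(forecast, 1))                            # close to 3.0
print(round(sum(pre_errors) / len(pre_errors), 1))   # small: model looks fine
print(round(sum(post_errors) / len(post_errors), 1)) # large: ~7 points off
```

No amount of in-sample fit before the break warns the forecaster that the break is coming – which is exactly the ‘extrinsic unpredictability’ problem Hendry and Mizon describe.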
Hendry and Mizon’s point is illustrated well by the Bank of England (Staff Working Paper No. 538) analysis presented by Nicholas Fawcett, Riccardo Masolo, Lena Koerber, and Matt Waldron (July 31, 2015) – Evaluating UK point and density forecasts from an estimated DSGE model: the role of off-model information over the financial crisis.
They show that “all of the forecasts fared badly during the crisis”. These forecasts were derived from the Bank’s COMPASS model (Central Organising Model for Projection Analysis and Scenario Simulation), which is a “standard New Keynesian Dynamic Stochastic General Equilibrium (DSGE) model”.
They wrote that:
None of the models we evaluated coped well during the financial crisis. This underscores the role that large structural breaks can have in contributing to forecast failure, even if they turn out to be temporary …
Criticism of the so-called ‘micro-founded’ DSGE approach is not new.
Robert Solow (cited above) also said that the DSGE fraternity “has nothing useful to say about anti-recession policy because it has built into its essentially implausible assumptions the “conclusion” that there is nothing for macroeconomic policy to do”.
Even mainstreamers like Willem Buiter described DSGE modelling as “The unfortunate uselessness of most ‘state of the art’ academic monetary economics”. He noted that:
Most mainstream macroeconomic theoretical innovations since the 1970s (the New Classical rational expectations revolution … and the New Keynesian theorizing …) have turned out to be self-referential, inward-looking distractions at best. Research tended to be motivated by the internal logic, intellectual sunk capital and esthetic puzzles of established research programmes rather than by a powerful desire to understand how the economy works – let alone how the economy works during times of stress and financial instability. So the economics profession was caught unprepared when the crisis struck … the Dynamic Stochastic General Equilibrium approach which for a while was the staple of central banks’ internal modelling … excludes everything relevant to the pursuit of financial stability.
The other problem is that these DSGE models are essentially fraudulent.
In my 2008 book cited above we considered the standard DSGE approach in detail.
The alleged advantage of the New Keynesian approach (which incorporates DSGE modelling) is the integration of real business cycle theory elements (intertemporal optimisation, rational expectations, and market clearing) into a stochastic dynamic macroeconomic model. The problem is that the abstract theory does not relate to the empirical world.
To then get some traction (as Solow noted) with data, the ‘theoretical rigour’ is supplanted by a series of ad hoc additions which effectively undermine the claim to theoretical rigour.
You cannot have it both ways. These economists first try to garner credibility by appealing to the theoretical rigour of their models.
But then, confronted with the fact that these models have nothing to say about the real world, the same economists compromise that rigour to introduce structures (and variables) that can relate to the real world data.
But they never let on that the authority is lost with this compromise – although that authority only ever existed on the terms that this lot themselves define.
No reasonable assessment would associate intellectual authority (knowledge generation) with the theoretical rigour that we see in these models.
This is the fundamental weakness of the New Keynesian approach. The mathematical solution of the dynamic stochastic models as required by the rational expectations approach forces a highly simplified specification in terms of the underlying behavioural assumptions deployed.
Further, the empirical credibility of the abstract DSGE models is highly questionable. There is a substantial literature pointing out that the models do not stack up against the data.
Clearly, the claimed theoretical robustness of the DSGE models has to give way to empirical fixes, which leave the econometric equations indistinguishable from other competing theoretical approaches where inertia is considered important. And then the initial authority of the rigour is gone anyway.
This general ad hoc approach to empirical anomaly cripples the DSGE models and strains their credibility. When confronted with increasing empirical failures, proponents of DSGE models have implemented these ad hoc amendments to the specifications to make them more realistic.
I could provide countless examples, which include studies of habit formation in consumption behaviour; contrived variations to investment behaviour such as time-to-build, capital adjustment costs or credit rationing.
But the worst examples are those that attempt to explain unemployment. Various authors introduce labour market dynamics and pay specific attention to the wage setting process.
One should not be seduced by DSGE models that include real world concessions such as labour market frictions and wage rigidities in their analysis. Their focus is predominantly on the determinants of inflation with unemployment hardly being discussed.
Of course, the point that the DSGE authors appear unable to grasp is that these ad hoc additions, which aim to fill the gaping empirical cracks in their models, also compromise the underlying rigour provided by the assumptions of intertemporal optimisation and rational expectations.
Paul Romer draws a parallel between ‘string theory’, which claimed to provide a unified theory of particle physics (yet failed dramatically) and post-real macroeconomics.
He cites a particle physicist who listed “seven distinctive characteristics of string theorists”:
1. Tremendous self-confidence
2. An unusually monolithic community
3. A sense of identification with the group akin to identification with a religious faith or political platform
4. A strong sense of the boundary between the group and other experts
5. A disregard for and disinterest in ideas, opinions, and work of experts who are not part of the group
6. A tendency to interpret evidence optimistically, to believe exaggerated or incomplete statements of results, and to disregard the possibility that the theory might be wrong
7. A lack of appreciation for the extent to which a research program ought to involve risk
Readers of my current book – Eurozone Dystopia: Groupthink and Denial on a Grand Scale – will identify those 7 features as definitive of a state of Groupthink, where mob rule within a group usurps reasonable practice and attention to facts.
The ‘Them v Us’ mentality is driven by arrogance and denial (Fernández-Villaverde epitomises the capacity to hype up nothing of substance).
Romer believes that the:
… the parallel is that developments in both string theory and post-real macroeconomics illustrate a general failure mode of a scientific field that relies on mathematical theory … In physics as in macroeconomics, the disregard for facts has to be understood as a choice.
A choice to avoid the facts, which are contradictory to one’s pet theory, is equivalent to fraud!
Conclusion
There are New Keynesians who still strut around the literature and the Internet claiming to still be relevant. Some are even advising political parties (for example, the British Labour Party).
The problem is that these theories cannot provide insights of any value about the world we live in for the reasons that Paul Romer discusses and other critics have offered.
The DSGE brigade are really captured in their own Groupthink and appear incapable of seeing beyond the fraud.
That is enough for today!
(c) Copyright 2017 William Mitchell. All Rights Reserved.