You can watch From Keynes to PSST. We will return to our usual time Monday at 8 PM New York time. I am happy to take questions about PSST. And we will have new data on employment and unemployment to chew over.
There, I discuss several transitions in macroeconomic thought. First, from Keynes to 1970s macro. Then from 1970s macro to 1990s macro. Finally, I try to explain why I reject both in favor of Patterns of Sustainable Specialization and Trade.
I see Keynes as deviating from the practice of thinking in terms of markets. If you have ever seen a diagram showing a line representing supply and a line representing demand, with an intersection labeled as “equilibrium,” you have seen how economists think about markets. Markets are self-adjusting. Keynes did not think that way.
For example, a classical economist would have characterized the market for capital goods in terms of supply and demand relative to the interest rate. At a very high interest rate, households will supply abundant savings and firms will undertake little investment. At a low interest rate, households will not save much, and firms will want to invest a lot. Equilibrium, in which saving matches investment, requires an interest rate that is neither too high nor too low.
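As a minimal sketch of that classical loanable-funds story (the linear schedules and numbers below are illustrative assumptions, not estimates), the interest rate is just the price that makes desired saving match desired investment:

```python
# Minimal sketch of the classical loanable-funds market.
# The linear schedules below are illustrative assumptions, not estimates.

def saving(r):
    """Household saving rises with the interest rate r (in percent)."""
    return 100 + 20 * r

def investment(r):
    """Business investment falls as the interest rate r rises."""
    return 300 - 30 * r

# Scan interest rates from 0% to 10% and pick the one where saving and
# investment come closest to matching.
equilibrium_r = min((step / 10 for step in range(0, 101)),
                    key=lambda r: abs(saving(r) - investment(r)))
print(f"equilibrium rate ~ {equilibrium_r:.1f}%: "
      f"saving = {saving(equilibrium_r):.0f}, investment = {investment(equilibrium_r):.0f}")
```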
Keynes instead saw investment as determined by long-term expectations and “animal spirits” on the part of businesses. He saw savings as mechanically determined by the level of income. Neither savings nor investment would respond to the interest rate. If businesses wanted to increase the level of investment, the only way for savings to match was for income to increase sufficiently to supply the required level of savings. An increase in government spending would have the same impact as an increase in business investment—that is, it would raise total income and employment in the economy.
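A hedged way to see the mechanics of that Keynesian story: if saving is just a fixed fraction of income, then income has to rise by a multiple of any increase in investment or government spending before saving catches up. The parameters below are assumptions chosen only to illustrate the multiplier arithmetic.

```python
# Keynesian income determination with saving mechanically tied to income
# (illustrative parameters; not estimates of any actual economy).

mpc = 0.8            # marginal propensity to consume; the saving rate is 1 - mpc
investment = 200.0   # investment set by "animal spirits," not the interest rate
government = 100.0

# Income must rise until saving (1 - mpc) * Y equals investment plus government
# spending, so Y = (I + G) / (1 - mpc): a multiplier of 1 / (1 - mpc) = 5 here.
income = (investment + government) / (1 - mpc)
print(f"income = {income:.0f}, saving = {(1 - mpc) * income:.0f}")

# An extra 50 of government spending raises income by 50 * 5 = 250, not by 50.
income_after = (investment + government + 50) / (1 - mpc)
print(f"income after stimulus = {income_after:.0f}")
```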
The transition from Keynes’ General Theory to 1970s macro had three elements: the re-introduction of the concept of markets; the development of National Income and Product Accounts (NIPA); and the development of large computer models.
The re-introduction of markets took place almost immediately. Within a year of the publication of The General Theory, John Hicks produced an interpretation in which the market for capital goods and the market for money were linked. The two variables determined in these markets were the interest rate and income. In the capital goods market, a high interest rate produces an excess of savings unless income is low. In the money market, a high interest rate produces an excess supply of money unless income is high.
The possible combinations of the interest rate and income in the capital market were called the IS curve, which slopes downward. The possible combinations of the interest rate and income in the money market came to be called the LM curve, which slopes upward. IS-LM became a staple of 1970s macro. In this framework, an expansionary fiscal policy (greater government deficit) shifts the IS curve in a way that increases income and interest rates (moving along the LM curve). An expansionary monetary policy (the Fed creating money to purchase government bonds) shifts the LM curve in a way that lowers the interest rate and raises income (moving along the IS curve).
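As a sketch, the IS-LM apparatus boils down to solving two equations in the interest rate and income. The linear curves and coefficients below are assumptions for illustration, but the two policy experiments reproduce the shifts described above.

```python
# Minimal linear IS-LM sketch (coefficients are illustrative assumptions).
# IS:  Y = a - b * r   (downward-sloping in the interest rate r)
# LM:  Y = c + d * r   (upward-sloping in r)

def solve_islm(a, b, c, d):
    """Return (interest rate, income) where the IS and LM curves intersect."""
    r = (a - c) / (b + d)
    y = a - b * r
    return r, y

a, b, c, d = 1200.0, 50.0, 800.0, 30.0
r0, y0 = solve_islm(a, b, c, d)
print(f"baseline: r = {r0:.1f}, Y = {y0:.0f}")

# Expansionary fiscal policy: shift IS out (raise a) -> higher income and a higher rate.
r1, y1 = solve_islm(a + 80, b, c, d)
print(f"fiscal expansion: r = {r1:.1f}, Y = {y1:.0f}")

# Expansionary monetary policy: shift LM out (raise c) -> higher income and a lower rate.
r2, y2 = solve_islm(a, b, c + 80, d)
print(f"monetary expansion: r = {r2:.1f}, Y = {y2:.0f}")
```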
Another development that took place very soon after The General Theory was the compilation of national income accounts. These provided statistical measures of the concepts that were important in Keynesian macro: total national income, equal to total national output; consumer spending; business investment; government spending; household and business saving; taxes; exports; imports.
By the 1960s, large mainframe computers became available to economists. Using these, they could undertake statistical estimates of the determinants of various components of consumer spending and business spending. They could build models of interlinked markets and then simulate those models to estimate the impact of policy changes. The economists of the Kennedy and Johnson Administrations thought that they could “fine-tune” the economy using these tools.
In practice, the computer models turned out to be garbage. There were a number of reasons for this. See the video or see my essay on macroeconometrics.
From 1970s Macro to 1990s Macro
If IS-LM gave economists comfort in thinking about the capital market and the money market, there remained some discomfort about the labor market. The classical inclination is to think of labor supply and labor demand as determined by the wage rate. If the wage rate is too low, firms will want to hire a lot of workers, but households will be reluctant to offer to work, leading to excess labor demand. Conversely, if the wage rate is too high, households will eagerly offer labor, but firms will not be able to employ many workers profitably, leading to excess labor supply: unemployment.
Thus, the classical interpretation of high unemployment is that wages are too high. Once wages fall, there will be more demand and less supply, and unemployment will disappear.
Keynesian economics purported to describe an economy that could have an underemployment equilibrium. The classical cure of a lower wage did not work. How might this be? One story was that Keynesian economics simply assumed that wages were fixed at a high level. In that case, what expansionary policy did was raise prices, reducing the “real” wage so that the economy could get back to full employment. Rising prices served as a substitute for declining wages.
Another story was that even if wages fell, this would be deflationary. With long-term debt contracts set in money terms, a deflation would cause defaults and bankruptcies. This would drive down the demand for labor. Other things equal, yes, a lower wage would raise labor demand. But the dynamics of reducing the wage—deflation—would lower labor demand. Given the dynamics, lower wages would be a problem, not the solution.
The economics profession acted as if the first story was the important one. That is, wages were fixed in money terms. An expansionary policy would raise prices, lowering real wages. Thus, an expansionary policy would be inflationary, so that there was a Phillips Curve trade-off: higher inflation meant lower unemployment, and conversely.
In his Presidential Address to the American Economic Association late in 1967, Milton Friedman argued that the Phillips Curve trade-off could not be permanent. He took a classical view of the labor market, in which there is an equilibrium level of unemployment, which Friedman called the Natural Rate of unemployment. If policy makers tried to drive the unemployment rate below this Natural Rate, they would end up with ever-accelerating inflation.
1970 saw the publication of the proceedings of a conference organized by Edmund Phelps, called Microeconomic Foundations of Employment and Inflation Theory. This was a collection of attempts to explain the Phillips Curve trade-off in terms of fundamental principles of labor market behavior. Theories of the process of searching for jobs were prominent. Phelps also wrote a paper that articulated the Natural Rate hypothesis.
The story that expansionary policy increased employment by raising prices faster than wages was enshrined in textbooks and encoded in macroeconometric models. But economists came to agree with Friedman and Phelps that in the long run the real wage should settle at an equilibrium level, consistent with the Natural Rate.
In the macroeconometric models, current wage inflation depended on past price inflation. The rationale was that workers set their wage demands using “adaptive expectations,” meaning that they forecast future inflation on the basis of recent past inflation. But extrapolating from the past in that way meant that they were making systematic errors.
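A small simulation (with made-up parameters) shows why Friedman and Phelps thought this mattered: under adaptive expectations, holding unemployment below the Natural Rate builds each year's inflation into the next year's wage demands, so inflation keeps rising without limit.

```python
# Accelerationist Phillips curve with adaptive expectations
# (all parameter values are illustrative assumptions).

natural_rate = 5.0     # the Natural Rate of unemployment, in percent
target_rate = 4.0      # policy tries to hold unemployment one point below it
slope = 0.5            # inflation response to the unemployment gap
expected_inflation = 2.0

for year in range(1, 6):
    # Adaptive expectations: workers forecast this year's inflation from last year's.
    inflation = expected_inflation + slope * (natural_rate - target_rate)
    print(f"year {year}: inflation = {inflation:.1f}%")
    expected_inflation = inflation   # last year's inflation becomes the new forecast
```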
Robert Lucas pointed out that if workers had rational expectations, so that their errors were not systematic, then Keynesian macro models would be completely misleading. If expansionary policy only works by fooling workers into taking lower real wages, and if they cannot be fooled, then expansionary policy will not work.
The response to Friedman, Phelps, and Lucas was what I call 1990s macro. 1990s macro was based on the fear that if macroeconomic models did not have explicit microeconomic foundations, then they might prove useless or even dangerous as guides to policy.
After Lucas, the consensus came to be that microfounded macroeconomics had to incorporate rational expectations. This was a severe constraint, because the mathematics required to do this involved stochastic calculus. The models became very simple on many dimensions.
In 1990s macro, the IS-LM relationship gave way to a mathematical optimization problem in which a representative household tries to smooth consumption over an infinite lifetime. Employment was determined by a mathematical optimization problem in which a representative household decides how much labor to supply this period rather than next period. The “classical” versions of these models yielded fluctuations in output that were entirely determined by shocks to productivity. The “New Keynesian” versions yielded slightly different results, based on the introduction of slow price adjustment. Greg Mankiw’s idea of “menu costs” (you keep your price the same to avoid the cost of printing a new menu) was a typical New Keynesian story.
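For the "classical" flavor of those models, a toy sketch in which a persistent productivity shock is the only driver of output fluctuations may make the point concrete; the Cobb-Douglas form, the parameter values, and the shock sizes are all assumptions for illustration.

```python
# Toy real-business-cycle-style sketch: output fluctuations come entirely from
# a persistent productivity shock. All parameters are illustrative assumptions.
import math
import random

random.seed(0)
alpha = 0.33           # capital share in a Cobb-Douglas production function
rho = 0.9              # persistence of the log productivity process
sigma = 0.01           # standard deviation of productivity innovations
capital, labor = 10.0, 1.0

log_a = 0.0
for quarter in range(1, 9):
    # AR(1) in logs: most of last quarter's shock carries over, plus fresh noise.
    log_a = rho * log_a + random.gauss(0.0, sigma)
    output = math.exp(log_a) * capital ** alpha * labor ** (1 - alpha)
    print(f"quarter {quarter}: productivity = {math.exp(log_a):.3f}, output = {output:.3f}")
```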
From 1990s Macro to PSST
The only thing that I can say for 1990s macro is that it does not commit the sins of 1970s macro. But I think that its own sins are worse. As many critics pointed out, 1990s macro has no chance of explaining something like the Great Depression. The idea that years of high unemployment can be interpreted as a solution to households’ optimization problems seems ridiculous. Franco Modigliani famously scoffed at the notion that the Depression was a “spontaneous outbreak of laziness.”
I think that the ritual of assembling a few dynamic optimization problems is pointless. In that regard, I might seem to be on Alan Blinder’s side. But unlike Blinder, I am not satisfied with 1970s macro. I do not think that macroeconomic developments can be described as a set of equations linking large aggregates.
I think that the right cure for macro is to take into account the complexity of specialization and trade. Intricate patterns of specialization and trade result from the discovery of subtle forms of comparative advantage. We have an economy in which it can be a person’s comparative advantage to be managing social media marketing for a sports ticket app.
Any number of developments can break patterns of comparative advantage, sometimes temporarily but often permanently. A temporary break might occur when, say, auto manufacturers have excess inventory, so that they shut down production for a few months, after which they recall the same workers to the same jobs. A more permanent break occurs when, say, print newspapers are put out of business by competition from the Internet.
When permanent breaks take place, the only way to reduce unemployment is for entrepreneurs to discover new patterns of sustainable comparative advantage. That is what job creation means. It is a process that involves trial and error, and it takes time. It is like solving a jigsaw puzzle. When an entrepreneur creates a profitable business, some pieces click into place.
Consider the effect of the pandemic on the market for urban office buildings and related businesses, such as downtown restaurants. If everyone is going to return to the office, then things can return to the pre-pandemic normal. But if many workers and firms have adapted to remote work and become accustomed to it, then the downtown office sector will not come back to what it was. New patterns of specialization and trade will have to be found to employ workers whose former jobs depended on downtown office buildings being fully occupied.
At all times, Schumpeterian creative destruction is taking place. But often, the firms that have become obsolete try to hang on. In a financial environment in which capital is abundant and labor is scarce, zombie firms will hang on while experimental firms try to expand. Total employment is high, beyond what is sustainable in the long run. Firms are afraid to let workers go, because it will be hard to hire them back if they are needed later.
But late in an expansion, productivity and profits fall off, and capital becomes scarce as investors lower their valuations of businesses. Firms finally let go of excess workers, and you have a recession. In the PSST story, overall economic fluctuations come from this sort of cycle. But every recession is different, because every sudden break in the patterns of sustainable specialization and trade is different. We have had housing-led recessions, oil-shortage recessions, and others.
The PSST story does not lend itself to “fine tuning” the economy. It does not reinforce politicians’ desire to write checks to people in the name of “stimulus.”
Can the PSST story be used to predict fluctuations? Right now, it seems to me that we are nearing a transition from a situation of abundant capital, in which venture capitalists were flush with funds to invest, to one of scarce capital. We still have scarce labor, so firms are hanging on to workers in case those workers are needed. But I keep expecting labor demand to “fall off a cliff,” with firms finally realizing that they need to get rid of excess workers.
[After I wrote this, but before it was scheduled to be published, Matt Yglesias wrote,
I’m not exactly sure why this happened, but roughly a year ago there was a substantial vibe shift in Silicon Valley which holds that most large technology companies are massively overstaffed. Multiple CEOs of privately held tech companies have voiced this critique of their larger peers to me. They’ve also criticized the venture capital community for encouraging excessively rapid headcount growth, but some influential VCs are now saying they agree with this. And there seems to be some competition to engage in the highest possible estimates of overstaffing. Marc Andreessen says the good big companies should lay off half their staff and the bad ones are worse.
]
The first link seems broken; I believe this is the correct version of the same video... https://www.youtube.com/watch?v=MOgxHLciIt4
It seems to me that an important distinction needs to be made between at-risk capital and deferred consumption. The application of and returns to at-risk capital mostly play out through equity investments and returns on equities, which have high and relatively stable expected returns that are driven by real shocks. The application of and returns to deferred consumption play out through low-risk securities and interest rates.
I think it makes more sense to think of deferred consumption as a service provided by the borrower to the lender; in order to provide that service, the borrower needs to have an income-producing asset that can credibly fund the deferred consumption. Frequently some of the assets that function as at-risk capital can serve as collateral for deferred consumption, but otherwise the two markets do not share the same characteristics. I think it is hard to create a coherent model of the economy if they are combined into a single factor of production, called "capital," whose supply and demand are driven by interest rates.