Economics Links, 1/25/2025
Russ Roberts and Mike Munger on the political economy of DOGE; Erik Torenberg on egalitarianism; Aswath Damodaran on the stock market; Reuven Brenner on the reliability of economic data
Russ Roberts and Mike Munger discuss DOGE. Russ points out that libertarians have a hard time believing that government can work.
Yeah--and we wouldn't let the government allocate this money in this better world. We'd have a committee of experts; and they would just actually make a list of the most important pieces of infrastructure--crucial bridges, highways, subways, you name it, high-speed rail. And, instead of it going to the places with the most political power, it would go to where it was most needed, or that was most good for the world or the economy or the American people.
And, I mocked that--politely more or less, and you can go back and listen to it--but we're in the exact same moment. Here is this idea now coming from the Right--from the Republicans--that: Yeah, we can just avoid all these nasty political processes that gave us these bloated agencies; well, we could just actually do some good in the world.
Munger says,
So, on the question about DOGE, the problem is: Well, there's two dimensions of problems that are kind of separate. One is that the budgets and the enabling statutes of these agencies were passed by Congress. They cannot be cut by some--forgive me--bureaucrat, even if that bureaucrat's name is Elon Musk. He's still just a damn bureaucrat who works for a new department called the Department of Government Efficiency.
So, one bureaucrat can't look and say, 'You know, we should get rid of this department,' and everybody will say, 'Yes, thank you for revealing that, and we'll get rid of the department.' It's there because of a statute. It would have to be passed by Congress.
And, it is the nature of these departments that they create quite a few winners that have very concentrated benefits. They will rise up--immediate pop-up lobbyists--against getting rid of the agency.
One worrisome scenario for Trump 2.0 is that it degenerates into a gravy train for Friends of Donald Trump. Some hedge fund managers get rich from “privatizing” Freddie Mac and Fannie Mae. Dozens of cryptocurrency scams thrive. Trump ends up remembered like Warren Harding.
My “tough love for California” essay runs into the problems that Roberts and Munger discuss. Do I really expect that the failures of the California government can be overcome by having the Federal government come up with some board to take over certain state functions in exchange for bailing out California?
I would like to say that there should be no bailout. But I’m taking a bailout as given and hoping for better policies from the Federal government.
The libertarian ideal is for a government that does a few things well. But we cannot get from here to there. I blame FOOLs (Fear of Others’ Liberty).
Munger gives his version of the pessimistic, cynical view:
The reason we have the system of government we have and not some other is there's a set of political forces that have resulted in this. It's not an accident.
The best news is that there is an entire transcript at the link. EconTalk transcripts used to cut off about halfway through.
Erik Torenberg writes,

In order to not be woke, you have to say, we're *not* striving for *equal* opportunity anymore, we're *not* trying to use the government to *reduce* inequality between biological groups. We're not using the government to address historical wrongs.
Equality under the law is the only true equality, and of course it leads to inequality in practice, since inequality is the natural state of the world. We can focus on raising the floor without reducing the ceiling — even if that means increasing inequality.
How much inequality is tolerable? Erik argues that once you decide to measure relative wealth by demographic group and you say that “too much” is intolerable, you are headed down the road that leads to woke.
Wokeness is over when the government stops trying to socially engineer economic outcomes based on biological characteristics. Wokeness is over when we stop tracking those characteristics (and the differential outcomes between them) in the first place.
Aswath Damodaran writes,

If you are sanguine about stock market levels, you could point to the current premium (4.33%) being close to the historical average across the entire time period (4.25%). If you believe that stocks are overpriced, you may base that on the current premium being lower than the average since 2005. I will not hide behind the "one hand, other hand" dance that so many strategists do. I think that we face significant volatility (inflation, tariffs, war) in the year to come, and I would be more comfortable with a higher ERP. At the same time, I don't fall into the bubble crowd, since the ERP is not 2%, as it was at the end of 1999.
…Put simply, at today's price levels, there is an 80% chance that stocks are overvalued and only a 20% chance that they are undervalued.
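Damodaran's 80/20 figure comes from simulating uncertain valuation inputs and counting how often the implied value falls short of today's price. A minimal sketch of that kind of exercise (the distributions, cash-yield, and risk-free numbers below are illustrative assumptions, not his actual inputs):

```python
import random

random.seed(0)

# Hypothetical Monte Carlo in the spirit of Damodaran's exercise: draw
# uncertain inputs, value the index with a simple perpetual-growth model,
# and count how often intrinsic value falls below today's price.
PRICE = 100.0          # today's index level (normalized; assumption)
CASH_YIELD = 0.04      # next year's cash flows as a share of price (assumption)
RISKFREE = 0.045       # 10-year Treasury rate (assumption)

def intrinsic_value(erp, growth):
    """Gordon-growth value: CF / (r - g), with r = riskfree + ERP."""
    r = RISKFREE + erp
    if r <= growth:            # guard against a degenerate draw
        return float("inf")
    return (PRICE * CASH_YIELD) / (r - growth)

trials = 100_000
overvalued = 0
for _ in range(trials):
    erp = random.gauss(0.0433, 0.01)      # ERP centered on the 4.33% in the text
    growth = random.gauss(0.035, 0.01)    # long-run nominal growth (assumption)
    if intrinsic_value(erp, growth) < PRICE:
        overvalued += 1

print(f"share of simulations where stocks look overvalued: "
      f"{overvalued / trials:.0%}")
```

The point of the sketch is the method, not the number: the 80/20 split depends entirely on the input distributions one assumes.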
Reuven Brenner writes,

Morgenstern then lists a range of errors that John von Neumann likewise noted, concluding that unless we have a clue of the magnitude of errors we expect in the data, “the feeding of economic data into high speed computers is meaningless.” He concluded: “The government should be persuaded to state publicly each time a new gross national product figure is made available that it is known only with an error of say, plus/minus 10%, employment figures with no lesser uncertainty, that foreign trade and balance of payments figures are subject to corresponding doubts.” He further notes, “The fundamental reform that will have to take place is to force the government to stop publishing figures with the pretense that they are free from error. There are no such figures, no matter what the layman may think and no matter what the producers of economic statistics may assert.”
…Accountants must estimate the subsidies the buyer received for the computer and the phone in exchange for signing the longer-term contract. Such calculations for this one kind of transaction showed that the adjusted price index deflator for Cellular Telephone Services fell at an annual rate of 7.7 percent during 2010–17, 4 percent faster per year than some conventional measures. The difference suggests faster technical advances in such bundled products.
Consider the complexity of this one relatively narrow example, and now imagine adding up aggregate numbers across a wide range of final products, not knowing in any category, or combination of categories, the magnitude of errors. Now imagine basing political debates on real GDP, or monetary policy. This makes no sense—and never did.
I strongly endorse this essay. Measurement error is the elephant in the room that economists ignore. I have said many times that “the change in trend productivity” is a drop in the bucket compared to the measurement error involved in the data used for its computation.
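To put rough numbers on that claim: if the level of GDP is known only to within Morgenstern's plus/minus 10%, the implied uncertainty in a measured growth rate swamps a half-point shift in trend productivity. A back-of-the-envelope sketch, treating the 10% as a standard deviation on two independent annual levels (both figures are illustrative; if the level errors were perfectly correlated year to year they would largely cancel, so independence is the worst case):

```python
import math

# Error propagation: a +/-10% error in the *level* of GDP in each of two
# years implies, for the growth rate g = Y1/Y0 - 1, roughly
#   sigma_g ≈ sqrt(sigma^2 + sigma^2) = sigma * sqrt(2)
# under the independence assumption stated above.
LEVEL_ERROR = 0.10       # Morgenstern's +/-10% on the GDP level
TREND_SIGNAL = 0.005     # a hypothetical 0.5-point shift in trend productivity growth

growth_rate_error = LEVEL_ERROR * math.sqrt(2)

print(f"std. error of measured growth rate: {growth_rate_error:.1%}")
print(f"signal we are trying to detect:     {TREND_SIGNAL:.1%}")
print(f"noise-to-signal ratio:              {growth_rate_error / TREND_SIGNAL:.0f}x")
```

Even if serial correlation in the errors cut that noise by an order of magnitude, it would still be several times the size of the productivity signal.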
For a decade I consulted with four large federal agencies on matters economic. I know what passes for data; I know how it gets generated; I know it is full of errors, both known and unknown. The "myth of measurement" in economics is pervasive. All aggregate economic data are wrong to an unknown extent. It would be one thing if we had any idea of the size of the error term; we don't.
Reuven Brenner is one of the most underrated economic thinkers of our time and one of my personal favorites. Thank you, Arnold, for linking his article.