Pity the Harvard Undergrad
AI is taking their jobs; in the future, all jobs will be software development jobs
I had dinner with the CEO of a tech company last week. He said, “We just cut almost 20% of our workers, and chances are we’re going to be doing it again in the next year or two. There are a lot of efficiencies we are getting with AI. At the same time, my daughter is in college and she’s looking for a job. She’s trying to avoid tech because she sees what I’m doing. I’m not sure what her classmates are going to do for jobs in a few years.”
A partner at a prominent law firm told me, “AI is now doing work that used to be done by 1st to 3rd year associates. AI can generate a motion in an hour that might take an associate a week. And the work is better. Someone should tell the folks applying to law school right now.”
Remember the three types of jobs: working with things, working with people, and working with symbols (writing, calculating, programming). College is supposed to help you get into the last category. For Harvard graduates, this means management consulting (McKinsey), finance (Goldman Sachs), or law school. The entry-level jobs in management consulting, finance, and law all pay well, but the work is often quite routine.
From a recent WSJ story on McKinsey:
AI is now a topic of conversation at every meeting of McKinsey’s board, said Bob Sternfels, the firm’s global managing partner. The technology is changing the ways McKinsey works with clients, how it hires and even what projects it takes on.
And McKinsey is rapidly deploying thousands of AI agents. Those bots now assist consultants in building PowerPoint decks, taking notes and summing up interviews and research documents for clients. The most-used bot is one that helps employees write in a classic “McKinsey tone of voice”—language the firm describes as sharp, concise and clear. Another popular agent checks the logic of a consultant’s arguments, verifying the flow of reasoning makes sense.
And I would note that as other corporations make more use of AI, this could reduce demand for McKinsey services.
Yang writes,
I spent 5 months as an unhappy corporate attorney doing document review in my mid-20s. It was 100% the kind of work that AI could do faster and more effectively.
Imagine starting at Goldman and being told that your job is to check “offering circulars.” An offering circular describes a security being issued. Suppose that a company like Microsoft wants to raise $800 million in a bond offering. The circular will give all of the legal information about the bonds, relevant financial information about Microsoft, and all sorts of other disclosures and legally required statements. It goes on for dozens of pages. Our young Harvard grad gets to check the calculations, proofread the text, and perform other almost-mindless tasks. Is it any surprise that ChatGPT can be taught to do this just as well?
Suppose that Goldman hires fewer Harvard undergrads and instead uses AI to check offering circulars. Goldman’s expenses go down, and it probably makes fewer mistakes. But Goldman does not have as many junior bankers trying to climb the corporate ladder, and Harvard undergrads do not have as many job opportunities. My guess is that this is a bigger problem for the Harvard undergrads than for Goldman. Goldman does not need every entry-level hire to climb the corporate ladder—there are only so many spaces at the top. And in a few years, perhaps even some positions higher up the corporate ladder can be handled by AI.
There will be less drive for efficiency in government and non-profits, so jobs will remain available there. But government cannot hire everyone.
Jobs for the Harvard undergrad
One way to be employable is to look for jobs that combine working with symbols with other skills. For example, a nurse has to work with people, things, and symbols. I know that when you got into Harvard your parents did not think you would become a nurse, but they did not anticipate that AI would take away so many other options.
If you insist that your job should involve working with symbols, then I have a prediction:
From now on, all symbol-using jobs involve software development
Instead of writing jobs, there will be jobs developing software that writes. Instead of financial analysis jobs, there will be jobs developing software to do financial analysis.
This phenomenon began before there was AI. If your job involves creating spreadsheets, in some sense you are doing software development.
But with AI, more people can do it. If “English is the new programming language,” as Andrej Karpathy put it, then more people can develop software.
And with AI, more work will be done using software. All of the work that entry-level writers, investment bankers, and lawyers used to do.
Software development is a mindset. Whenever a developer catches himself doing something over and over again, he says, “I should have a program that does this.” He either finds one that someone else has written or writes one himself.
You may not have the skillset of an outstanding software engineer. But you must adopt the mindset. Look for ways to automate the steps that you repeat when you undertake a task.
For example, in maintaining this substack, I do a lot of repetitive work digesting other substacks, extracting quotes, and formatting links. Instead, I could use a software tool to do this. If you are just starting out as a writer, work on finding or developing such tools.
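As a toy illustration of that mindset, here is a minimal Python sketch of the kind of digesting tool described above. The function name, the markdown quote-and-link formats, and the sample post are all assumptions made for illustration, not a description of any actual tool.

```python
import re

def digest_post(markdown_text):
    """Pull blockquotes and links out of a markdown post.

    A toy sketch of a substack-digesting helper; the formats
    handled here (markdown blockquotes and inline links) are
    assumptions for the example.
    """
    # Markdown blockquotes are lines beginning with ">"
    quotes = [line.lstrip("> ").strip()
              for line in markdown_text.splitlines()
              if line.startswith(">")]
    # Inline markdown links look like [text](url)
    links = re.findall(r"\[([^\]]+)\]\(([^)]+)\)", markdown_text)
    return quotes, links

post = """Great piece today.
> AI is now doing work that used to be done by associates.
Read it at [In My Tribe](https://arnoldkling.substack.com).
"""
quotes, links = digest_post(post)
print(quotes)  # the blockquoted passages
print(links)   # (text, url) pairs
```

Once a helper like this exists, the repetitive extraction step disappears from the daily routine, which is exactly the point of the mindset: notice the repetition, then automate it.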
Another way to put this is that you have to learn to train an AI to do as much of your work as possible. If you achieve this, you can continue to earn a living. If you fail to achieve this, someone else will.
My prediction for Harvard undergrads? Pain
substacks referenced above:
It seems that, for now, Goldman/McKinsey can capture the value of replacing junior people with AI without changing the rates they charge clients.
Assuming there isn’t regulatory capture, eventually the costs of these services will go way down. Either Goldman/McKinsey disrupt their own pricing or someone else does. Maybe that someone else is a current Harvard undergrad (or, my preference, a current state-college undergrad).
When I consider my own career, I started off in a job doing unsexy tasks that got filtered down to the entry level. But this allowed me to:
1) Prove myself
2) Learn, through trial and error, the things I would need to know to do high-level strategy
Now I'm a highly paid strategist, but I could not have become one without that dues-paying phase. This is especially true for someone like me who doesn't come from a privileged background and has generally preferred as little credentialism as I can manage.
I also wonder to what extent AI was ever really necessary to cut some of these jobs. Could it be the excuse rather than the reason?