I have taught an advanced computer science class at UC Berkeley nine times. The class, Introduction to Database Systems, teaches how data is organized, stored, and managed by computer systems, and is meant to prepare students for their first forays into the software engineering profession.
He then gets mad that they use AI to do the assignments.
I can’t overstate how damaging it is for students to use AI as a means of short-circuiting this process. Part of the purpose of a college course lies in learning how to think about and work through complex problems; none of that is achieved if students trade the experience of coding for the convenience of chatbots. In doing this, my students weren’t just cheating the course. They were cheating themselves out of vital development as engineers, failing to make the connections that practice and hard work help form.
The professor thinks that every student needs to learn how he drives his horse-and-buggy. Maybe a few do. But he shouldn’t be angry that most of them prefer to use a car.
He and others took umbrage. Jain wrote,
It’s really just a question of “what are they accomplishing?”
The correct analogy is mathematics. I don’t fault a mathematician or a scientist for using a calculator. I do not want a child using it before they learn addition, or they never learn the basic skill that is necessary and don’t grow their brain in that manner. This is very similar.
(And compared to the tasks that software engineers get in their jobs, a lot of university coding classes are literally the equivalent of basic arithmetic.)
I replied,
So every software engineer in the future will still have had to learn all of CS, including machine code, how logic gates work and so on, before they can do their jobs?
He responded,
You do realize that learning the principles behind how logic gates work is arguably one of THE most important things in computer science, right? Not being able to understand what AND, OR, and NOT means is a deal-breaker.
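As an aside for readers who have not met the terms: the three gates Jain names are simple enough to sketch in a few lines of Python (my illustration, not his):

```python
# The three basic logic gates, operating on bits (0 or 1).
def AND(a, b):
    return a & b  # 1 only when both inputs are 1

def OR(a, b):
    return a | b  # 1 when at least one input is 1

def NOT(a):
    return 1 - a  # flips the bit

# Any boolean function can be composed from these. For example, XOR:
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))  # 1 only when inputs differ
```

The point of the exercise in a CS course is that everything a computer does, arithmetic included, is ultimately built from compositions like this.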
A deal-breaker for who? What does a software engineer today need to know about computer science? It seems to me that for a long time, well before ChatGPT, you could develop software without knowing elementary computer science. You need to know about integrated development environments, software packages, scripting languages and database integration. It’s cool to know elementary computer science, but I would argue that it’s not essential.
To be a lawyer or a politician, do you first have to master the Dead White Males curriculum, starting with Plato and Aristotle? There are humanities professors who would insist that it is heresy to suggest otherwise. I would say that it’s cool to be able to quote Plato and Aristotle, but it’s not essential.
I plan to teach a seminar on political psychology without assigning the Dead White Males. I think, perhaps wrongly, that it can still be a good course.
To become an economist, do you first have to master advanced mathematics, including topology, stochastic differential equations, and other tools? That was the understanding when I went to graduate school, where the MIT philosophy was that higher math was the only potential source for new insights. I would say that it might have been cool to learn fancy math, but I am not convinced that better understanding of the economy was the result.
To become a great chef, I would imagine that you have to understand ingredients and their sources. But do you have to know how to milk a cow?
Somebody should know how to milk a cow. Somebody should know how an internal combustion engine works. Somebody should know how to use a horse for transportation. Somebody should know Plato and Aristotle.
But whether any of this is essential knowledge in a particular field or profession is contestable. It was contestable before Large Language Models were invented, but it becomes especially contestable now.
I can imagine a scenario in which professor Jain’s computer science course turns out to be useless knowledge for the vast majority of software engineers. The students that he thinks are cheating may be better qualified to do the job. I am not saying that this is certainly going to be the case. But do not rule out such a scenario.
Speaking as a software engineer (albeit with only 3 years of experience, so I'm no CS professor):
Trust me, I absolutely hated it when the college professors forced us to study how to code in an assembly language that was already obsolete. And I used to think that studying how to implement a database is such a waste of time, given that modern databases already have such nice APIs and all you have to do to use them is just call those APIs.
But the more I've worked in this trade, the more I realize: Yes, you can create software without knowing CS. No, the software you create would not be as good as it could be, had you designed and created it with a fuller understanding of how the computer actually works (which is what you learn from CS). Correspondingly, you won't be able to command as high a salary as you could have, either. And yes, you'll be among the first to be replaced by AI (or more precisely, by those who better understand how the computer actually works and know how to use that understanding to command the AI accordingly).
This leads to a benefit of knowing how the computer actually works when working with AI coding agents: You know when the AI is bullshitting you (or when it makes a very subtle mistake). Some have likened the skills for working with AI coding agents to the skills for managing human coders, and I agree: Managers of human coders who know how the computer actually works (or even just know how to code) get more respect from the coders and can better detect when the coders are bullshitting them too. Hence, they are usually more effective at their jobs than those who don't.
I reckon we can apply the car analogy here too: Do F1 cars make "normal" cars obsolete for professional racers? Maybe. Should those who wanna be professional racers learn how to drive using an F1 car instead of a "normal" car? Maybe not.
I think Professor Jain is right to be frustrated here, and your analogies miss something essential about the purpose of foundational education.
The calculator comparison makes his point well. Nobody disputes that professionals can and should use calculators, but children who skip arithmetic entirely never develop number sense. They become dependent, unable to catch errors or to understand what the calculator is even doing. That is not horse-and-buggy thinking; it is scaffolding: you learn the basics so that you can later use tools wisely.
The same holds for computer science. A student who cannot reason about data structures, query optimization, or even basic logic gates is like an engineer who cannot read a blueprint. Sure, they might cobble something together in a high-level environment, but the moment they need to debug performance, integrate across systems, or evaluate the trustworthiness of AI outputs, they are lost. They do not know what they do not know.