93 Comments
Minh:

Speaking as a software engineer (albeit with only 3 years of experience, so I'm no CS professor):

Trust me, I absolutely hated it when the college professors forced us to study how to code in an assembly language that was already obsolete. And I used to think that studying how to implement a database is such a waste of time, given that modern databases already have such nice APIs and all you have to do to use them is just call those APIs.

But the more I've worked in this trade, the more I realize: Yes, you can create software without knowing CS. No, the software you create will not be as good as it could have been had you designed and built it with a fuller understanding of how the computer actually works (which is what you learn from CS). Correspondingly, you won't be able to command as high a salary as you could have, either. And yes, you'll be among the first to be replaced by AI (or, more precisely, by those who better understand how the computer actually works and know how to use that understanding to command the AI accordingly).

This leads to a benefit of knowing how the computer actually works when working with AI coding agents: You know when the AI is bullshitting you (or when it makes a very subtle mistake). Some have likened the skills for working with AI coding agents to the skills for managing human coders, and I agree: Managers of human coders who know how the computer actually works (or even just know how to code) get more respect from the coders and can better detect when the coders are bullshitting them too. Hence, they are usually more effective at their jobs than those who don't.

I reckon we can apply the car analogy here too: Do F1 cars make "normal" cars obsolete for professional racers? Maybe. Should those who wanna be professional racers learn how to drive using an F1 car instead of a "normal" car? Maybe not.

Scrith:

Minh gets to the heart of it in a way I don't think Arnold or Professor Jain have. There is a very large gap between knowing how to code and software engineering. Simple automation written in interpreted languages can and should be replaced with AI. However, real engineering requires an evolving set of techniques to optimize, secure, and scale software. Yes, AI could be trained to learn these techniques, but the complexity grows quickly as you get into either large cloud-based architectures or embedded systems where every bit of memory or bus bandwidth counts. It takes a lot of analysis to design a system like this, especially if it is already in production. Knowing Python or JavaScript is table stakes, and there are no careers based around them. Finding good engineers willing to work in C or Rust is difficult, and those who know how to correctly implement different data structures, optimize networking (e.g., work with different wire protocols), build custom kernels, and leverage different hardware architectures are rare and valuable. I used to run large software organizations for one of the big companies, and there's a reason our top developers make 10x what entry-level developers do.

The problem becomes that almost all of these skills are learned on the job. Even newly-minted Computer Science PhDs struggle to write robust production code. If we're no longer hiring at the entry level, we destroy the pipeline that delivers great engineers.

Minh:

I agree with you that software engineering is a trade that you learn on the job, but that learning is enabled by an understanding of how the computer actually works. I'm certainly very lucky to have got my foot in the door just before ChatGPT upended SWE entry-level hiring. A more optimistic take on the situation is that some of the current CS students, who would have become Big Tech employees had they graduated within the last decade, might found startups that become a serious threat to the incumbent Big Tech companies (Mark Zuckerberg comes to mind). Still, it could be a little bit boring to follow the current startup scene, which has become a monoculture: AI has sucked the air out of the room to the point where you can't get VC funding if your startup doesn't work in (or pretend to work in) AI.

Invisible Sun:

Software has been automating software coding for decades. The idea of writing programs without coding has been around for many decades. LLM / AI is the next iteration of this concept and capability.

Automation will replace some programming jobs. It also creates new opportunities in software.

To assert AI makes programmers obsolete is no different than saying AI makes authors obsolete. Clearly there will be many authors - humans who will write to communicate ideas and stories. So it will be with humans writing software.

RoyalScam:

I think Professor Jain is right to be frustrated here, and your analogies miss something essential about the purpose of foundational education.

The calculator comparison makes his point well. Nobody disputes that professionals can and should use calculators, but children who skip arithmetic entirely never develop number sense. They become dependent, unable to catch errors or to understand what the calculator is even doing. That is not horse-and-buggy thinking; it is scaffolding: you learn the basics so that you can later use tools wisely.

The same holds for computer science. A student who cannot reason about data structures, query optimization, or even basic logic gates is like an engineer who cannot read a blueprint. Sure, they might cobble something together in a high-level environment, but the moment they need to debug performance, integrate across systems, or evaluate the trustworthiness of AI outputs, they are lost. They do not know what they do not know.

MikeDC:

Yep.

Respectfully, I think Arnold conceded the point but continued arguing when he replied to that example of the need to learn a basic principle with a loaded question of whether every student needed to learn "all of CS..."

I also think it was a weak argument when he considered the field he does know, economics. I understand that learning advanced mathematical methods was the paradigm in place at MIT for pushing the boundaries of the field, but that's not remotely the same as developing an understanding of economic principles. Without economic principles, math is just math. Arnold thinks, and I might agree, that advanced math turned out not to be so essential to economics. But he should consider the reasons for that.

The basic formulation for any discipline is that it's a body of specific knowledge outside of the tools used by the discipline. Math is a tool. And various programming languages and AI are tools. But computer science has important conceptual knowledge, just like economics does. If you know how to use a tool and know what you want to do, sure, you can make a program or an economic model.

But if you don't understand the basic principles, once you get beyond a superficial level of detail, or things change, you are very liable to get stuck.

Andy G:

You make some ok points. And arguably database structure *is* more fundamental to software development.

But AK is more correct than not.

I took a compiler development class as an undergrad. It’s simply not the case that being able to write your own compiler enables you to write code in a higher level language better than if you did not.

And having learned about them only later in grad school, I can tell you still more definitively that understanding transistors and ICs does *not* actually make you better at software development, save perhaps if you are writing software drivers for hardware (which 97%+ of programmers did not do back then, and surely 99%+ do not do today).

Had Jain stuck solely to his argument for understanding databases, I’d have had a lot more sympathy for his position.

MikeDC:

I can't say that I went to Twitter and read all the exchanges, but in the other exchange cited here, Jain is only quoted talking about basic principles like logic gates. I'd have to agree with him on that one too.

Not because anyone needs to know how transistors work, but because it's fundamental to all things computing to be able to logic things down to a Boolean answer.

I'm a weirdo who studied political science and then econ, and at work I mostly do database and programming work. Honestly, I think everyone would benefit from learning that, but it does seem essential to programming: I don't think you'd be able to articulate (or check) any sort of complex code without understanding this.
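A toy illustration of what I mean (the rule and the names are invented for the example): once a requirement is reduced to Boolean functions, you can check it exhaustively instead of eyeballing it.

```python
from itertools import product

# A hypothetical access rule, reduced to Boolean logic: a user may
# edit a record if they are an admin, OR they own the record AND it
# is not locked.
def may_edit(is_admin: bool, is_owner: bool, is_locked: bool) -> bool:
    return is_admin or (is_owner and not is_locked)

# Because the rule is pure Boolean logic, checking it means walking
# all eight rows of its truth table rather than trusting a hunch.
for is_admin, is_owner, is_locked in product([False, True], repeat=3):
    allowed = may_edit(is_admin, is_owner, is_locked)
    if is_admin:
        assert allowed                 # admins are never blocked
    elif is_owner and not is_locked:
        assert allowed                 # owners may edit unlocked records
    else:
        assert not allowed             # everyone else is denied
```

That habit of logic-ing a requirement down to a checkable Boolean answer is the thing I mean by fundamental, not the transistors underneath.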

Agree about writing compilers and hardware specific stuff, but again, I'd compare that to comparable stuff in economics. Like, in grad school I had to read, understand and work through Debreu's theory of value. It's a math problem much more than an economics problem, and it contributed little to my understanding of economics to do that.

So I guess my point is that I think every field has some excess of stuff they teach that's probably not essential, but most valuable fields have an important core of distinct knowledge at their heart. Where stuff stops being indispensable is a fair question, but I don't think it's reasonable to force anyone into an all or nothing position.

Roger Sweeny:

"in grad school I had to read, understand and work through Debreu's theory of value. It's a math problem much more than an economics problem, and it contributed little to my understanding of economics to do that."

Hear! Hear!

Andy G:

“Where stuff stops being indispensable is a fair question, but I don't think it's reasonable to force anyone into an all or nothing position.“

Agreed fully.

But is this not in fact almost exactly AK’s OG claim here? That some students using AI for the assignment is far from the end of the world?

stu:

What the teacher is saying is that if they are going to shortcut the assignments with AI, there is little or no benefit to being in the class. AK says nothing to refute that.

Andy G:

We are now mostly agreeing.

Anyone doing software development of course needs to understand AND, OR and NOT functions, without doubt.

But no, they do *not* need to understand the transistors, the logic gates, nor how they are built.

So the fact that Jain is correct about those *functions* definitively does NOT make him correct about “logic gates”, and yet that was the AK claim he was responding to.

AK never claimed there was no value to understanding these things Jain is promoting; he merely said that some of them are not essential.

If they were all essential, there could be no practical specialization and division of labor, could there?

Doctor Hammer:

I think your argument is too strong, or comes across as too strong. I agree there are definitely things that professors think are essential that are pointless, possibly even negative, when it comes to understanding a field. I was dismayed by how many economists know a lot of math and very little economics, for instance. I have also noticed that many professors think things are very important that are merely interesting to them, and not useful to others, like all of their research papers no one reads or cares about.

So sure, just because a professor thinks something is important to learn doesn't mean it is.

But on the other hand there are many critical things to knowing a subject well that simply can't be worked around. If I were training small engine mechanics and one of the assignments was to diagnose why a chainsaw won't start and fix it, it would clearly be a bad idea for a student to just drive over to the Stihl dealer and pay them $50 to do so. Sure, he might claim that he got it fixed, and he wants to fix lawn mowers anyway, but the point is going through the process and exercising the ability to work out a problem, etc. After all, if you can't do that yourself, in what way are you a mechanic as opposed to just the person who delivers your tool to and from the real mechanic?

If that seems silly, consider how many people you have worked with whose job apparently consists of looking at something for 30 seconds and saying "I don't know, I will go ask X". I am pretty sure 80% of the IT people my company hires are basically just a queuing system for the three people who actually know how things work. So what, one might ask. Well, so they aren't actually doing the work they claim, and if those three guys leave the department can do nothing.

It is reasonable to say that modern tools mean we don't have to do a certain job a specific way, but that doesn't mean that outsourcing the job infinitely is also an option. At some point there needs to be a human that really understands how to do something. We can debate what the job title of various functions should be, and what capacities those job titles should include, but it isn't nonsense to say that there are fundamental things a person has to know to do a job.

Andy G:

“At some point there needs to be a human that really understands how to do something”

And yet here it is you who go too far.

Sure, this statement is tautologically true, but you are basically contradicting the wonders of specialization and the division of labor when you go there.

I can use a pencil without having any idea how to produce one.

Web developers can do a fabulous job with HTML and Java without knowing anything about how ICs work. Or machine language.

As I noted above, perhaps database structure *is* indeed more fundamental. Had Jain stopped at that, I’d have a lot more sympathy for his position. But he went way too far when he talked about the need to understand transistors in order to do software development.

Because even if there were some incremental advantage to that from 1975-1990, it is just not necessary today.

Doctor Hammer:

I don’t disagree with that. I am not saying Jain is correct in the particulars; I don’t know enough of the subject to have an opinion there. I am saying that even with specialization there needs to be a human specialized in doing the thing that needs done.

I would add that one can take specialization way too far as well; there is immense value in understanding many different things related to your specialization if you are not to be only a passively reacting cog in the great machine.

Andy G:

We agree here completely.

But your prior comment seemed to claim that AK’s argument went too far, when none of it contradicts your claims here at all.

Doctor Hammer:

Maybe we are reading Arnold's argument differently. At the strongest, I read him as saying that there is no reason to stop students from taking the minimal-effort route while training them to be good at the job; and if I assume he really means what I think he should think, he is giving short shrift to the fact that there are definitely things that need to be trained on to be better at the job later. I think Arnold is making too strong an argument in one direction, or that it at least comes across as way too strong. Consider his choice in quoting himself:

"So every software engineer in the future will still have had to learn all of cs, including machine code, how logic gates work and so on before they can do their jobs?"

That's a bit much, no? One can argue over what specific things are useful for a software engineer to know, but I rather feel he wasn't doing that so much as performing a reductio ad absurdum to suggest that doing the bare minimum is fine because there is no value in learning that other stuff.

If one wants to argue what one needs to understand to call themselves a "software engineer" vs a "web developer" that's fine. Claims of the sort "Why do football players need to lift weights, just let them use a forklift" are drifting pretty far.

Andy G:

There are two axes in question here.

1) How necessary is the knowledge in question, relative to the individual’s ultimate objectives

2) How useful is it as a methodology of learning critical thinking and creativity in the field to work through yourself “by hand” the logic of the functions being discussed

And further, setting aside the complications of a continuous or even multi-point "grading" system, there are at least 3 different "grades" one can give any given function/knowledge being discussed: crucial, useful, non-essential.

So I will agree with you that *both* Jain and Arnold went too far with their assertions, talking past each other about which axis they were discussing, and making too-strong assertions about crucial vs useful vs non-essential.

Personally, I think AK’s initial claim re: my axis 1) is largely correct, he got very weak in the middle, and better by the end.

Whereas Jain’s initial response was highly reasonable re: axis 2). And upon careful review the two professors were talking apples and oranges at that point. But Jain’s subsequent responses got worse and worse, and really did appear to claim that “all of it” on axis 1) is necessary for all students.

So each of them - and for that matter, most of the commenters here, myself included - would have done better to be clearer about “crucial” vs “useful” vs. “non-essential”.

And better still if they had consistently distinguished between “knowledge needed” and “best way to develop the cognitive skills useful for excellence in software development”.

Doctor Hammer:

I think your two-axis model is a good way of thinking about it, yes.

As an aside, I can sympathize with both disputants here. I have definitely had professors who have fixated on their own pet ways of understanding things, or just how they were taught 50 years ago which was based on how their professor was taught 50 years before that, etc. It is easy to miss what actually matters or is core to understanding something amongst the ritual of learning it.

On the other hand, I have worked with people, both students and professionals, whose lack of deeper understanding of underlying principles hamstrings them in understanding problems and coming up with solutions, especially in the awkward corner cases that so many issues come down to. Often the most obnoxious people to work with are those who know a superficial amount about something, "enough to be dangerous" as they so often joke, and don't realize there is a ton of underlying knowledge they lack. They get so certain of their proposed solutions, or of their "this is easy, just..." nonsense, that you basically have to let them fail spectacularly before they believe that the stuff they don't know matters.

Yancey Ward:

Lots of great comments here.

I think Kling and Jain are not discussing the same topic at all. The class under discussion is "Introduction to Database Systems" and not "How To Do A Computer Science Job After Graduation". The entire point of the class was to teach the students exactly what Database Systems consist of and how they work at the fundamental level. On this ground, Jain is correct- it would be senseless to teach students in a class "Addition And Subtraction" by allowing them to use calculators right from the start; or to allow students to use AI to write their term papers on Plato and Aristotle in a class titled "Plato and Aristotle- An Introduction to Greek Philosophy".

While it is true that the majority of the students in Jain's class probably won't ever need the fundamental knowledge of how database systems are constructed and manipulated in the future when they are preparing my venti mocha, that isn't the professor's problem or even his goal: his goal is to make sure the students who will need this basic knowledge acquire it, starting with his class.

luciaphile:

Perhaps it would be more accurate for AK and LJ to acknowledge this: the subject of the class is what it is, and those without curiosity about it should not take it. Let the chips fall where they may.

Andy G:

What this misses is that completing the course might be required for other courses of interest, or simply for completing the major.

luciaphile:

You see this point a lot in reference to turning out doctors in a couple of years. It's not something I can have an opinion about, although if AI is involved, yeah, you might as well skip it. AI is a tool, crude in most realms, perhaps not in CS, and a sign.

Andy G:

You make excellent points.

But how to *use* a database system and how to *create* one yourself are not the same skills.

The former is undeniably something all students should get out of such a class.

The latter is much less clear.

You don’t need to know how to write your own compiler in order to use one, e.g.

Yancey Ward:

Consider that the goal of a computer science degree at the master's or PhD level is actually to know how electronic databases were first created, among lots of other details. I didn't need to know the mechanism of a Curtius rearrangement in order to use it the dozens of times I did as a professional organic chemist in industry. However, it was important to know to qualify for my PhD. Or, in mathematics, you don't need to know how to derive the derivative of the inverse sine function to calculate it: you can just plug the problem into Wolfram Alpha and it will spit out the formula, if you are even too lazy to memorize it. But should a student hoping to get a math degree expect that he doesn't need to do that grunt work even once?
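For what it's worth, that particular piece of grunt work is only a few lines of implicit differentiation, using nothing beyond the chain rule and the Pythagorean identity:

```latex
y = \arcsin x
\;\Rightarrow\; \sin y = x
\;\Rightarrow\; \cos y \,\frac{dy}{dx} = 1
\;\Rightarrow\; \frac{dy}{dx}
  = \frac{1}{\cos y}
  = \frac{1}{\sqrt{1-\sin^2 y}}
  = \frac{1}{\sqrt{1-x^2}}
```

Doing it once by hand is exactly the kind of small investment that makes the Wolfram Alpha answer trustworthy later.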

Andy G:

Note that this is an undergraduate class, not a graduate one.

It seems to me part of AK’s point is that it depends on what the student’s objectives are.

If it’s a PhD in Computer Science, then I agree with you fully.

If it’s to learn how to be a successful software engineer, then the answer is less clear.

Now per other comments, understanding databases I agree is pretty fundamentally useful for that. But Jain’s argument seems to go far beyond that. As when he seems to suggest that all need to understand how transistors work.

And I just don’t agree with that, any more than I’d agree with the claim that one must know how pencils are made in order to use one.

Yancey Ward:

I would say that the goal of the class is set by the department and the professor teaching it, not the student. Nothing is stopping a student from auditing a class such as this, is there? However, if you want to take it for a grade and progress to a CS degree, then you need to bend to the requirements of the class or take the F.

Andy G:

“I would say that the goal of the class is set by the department and the professor teaching it, not the student.”

Yeah, with due respect, now you are making stuff up.

Individuals have their own objectives/goals. Equally true of a department chair and of the professor of a class, separately.

The discussion here that AK started with his post was value judgements about learning and knowledge.

IMO Jain’s strongest arguments were about the idea of generalized learning in the whole broad area of CS/software engineering, and his weakest by far were about what specific knowledge is required by whom.

And AK’s strength of arguments were basically the reverse.

dymwyt:

Students may not realize the limits they inflict on themselves by not fully understanding some of the underlying basics. Short-cutting the learning process may have short-term benefits. Then, in the real world, the business has a problem that leads to a search for that one person who has the background knowledge to find what must be fixed. I'm on the business side, and out of ~50 engineers, I always ended up finding the same guy who held the knowledge. He got promoted several times and I lost access to him. If the demand is there, the educational system should attempt to meet it. The intelligent student will do the boring work and build on that. It's a student's choice to either learn it or take the short-cut. Back to the cow analogy: the cream will rise to the top.

Christopher B:

I'll date myself by noting I learned to program on a UNIVAC system, including assembly language, before switching to IBM equipment for my career. I sometimes found that I had to explain concepts of program construction to other programmers who hadn't switched, like indirect parameter addressing, because they had not been forced to understand the principles behind the code they were writing. I land more on the side of the professor in this argument. You can certainly be successful without understanding basic principles in your field but I think you are going to face some limitations.

You may not need to know how to milk a cow to cook but not understanding the connection of milk, butter, and cheese is likely to limit your flexibility in preparing dishes.

Lupis42:

I look forward to your explanation of why economists no longer need to understand supply and demand.

Everyone wants to argue that grasping the fundamentals of a subject they don't deeply understand is a waste of time when tools are available to automate those operations, but the point is *not* to teach the ability to do basic operations, it's to teach concepts (a way or ways of modeling the domain that will allow you to break down other problems later) and to exercise the brain - to build the muscle of thinking with those concepts, so that it is easier to tackle non-trivial problems with them later.

Alan:

I wonder if there’s a story here about specialization. I’d assume part of what spurs specialization is the commodification of broader knowledge and techniques. Since I can grow wheat at scale someone can now focus on growing almonds and eventually leave agriculture altogether. Eventually that base agricultural knowledge is so commodified and irrelevant that only a few really learn and practice it. Something like that?

luciaphile:

I hope the people in charge of reassuring us that we never need to worry about our food supply do indeed have some "base agricultural knowledge".

I hope we’re still growing people like that.

Andy G:

The students in the class who planned to get PhDs in CS should surely be doing the assignment in question by hand.

The ones who seek only to be professional software developers?

Far less clear.

luciaphile:

I believe you, but college students may not themselves know where their limits and abilities lie.

Lex Spoon:

I agree that students should use AI; indeed, they should be required to use AI. Yes, Prof. Jain is right that you need to work certain exercises to build useful mental muscles. The things you learn in a senior-level class like database implementation are useful skills that you simply need to know in order to be a strong developer.

I get off the train, though, on spending a lot of time trying to catch students who bypass the exercises. If someone wants to pay $100k for a university program and then not learn anything, why not let them? So I'll ignore that part and speak for a moment on what to do for a budding software developer who really wants to learn and is willing to work at it.

There are things you need to know in order to ask for useful things from an AI or a junior developer. As well, you need a way to deal with the agent problems of having someone else do the work rather than doing it yourself.

Prof. Jain's course curriculum strikes me as very helpful for someone who wants to be a strong software developer in the modern world. My apologies if I got it wrong, but I think this may be the class in question:

https://cs186berkeley.net/sp22/

I see disks, buffers, B-trees, indexes, sorting, hashing, relational algebra, query planning, transactions, concurrency, and entity-relationship modeling. I've been in software for 45 years and would say every single one of those is valuable even in the post-AI world, even if you never implement a database. Whether you are building something yourself, talking effectively with CEOs, sitting at high-level design tables, or coaching and growing a team, these areas are so important that you risk embarrassing yourself if you have too many gaps in them.
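To pick one item off that list: the value of indexes comes down to a sorted structure answering a lookup in logarithmically many comparisons where a full scan takes linearly many. A real B-tree does this over disk pages; here is a toy in-memory sketch (my own, not from the course) that just counts comparisons:

```python
# A table's sorted key column -- a stand-in for what an index maintains.
rows = list(range(1_000_000))

def full_scan(key):
    """No index: the query touches rows one by one until it finds a match."""
    comparisons = 0
    for value in rows:
        comparisons += 1
        if value == key:
            break
    return comparisons

def index_lookup(key):
    """A toy in-memory 'index': binary search, O(log n) comparisons.
    A real B-tree does the same narrowing, but over disk pages."""
    comparisons = 0
    lo, hi = 0, len(rows)
    while lo < hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if rows[mid] < key:
            lo = mid + 1
        else:
            hi = mid
    return comparisons

# Finding key 912_345: the scan needs roughly 912k comparisons,
# the binary search about 20 (log2 of a million).
print(full_scan(912_345), index_lookup(912_345))
```

Once you have internalized that gap, query plans, covering indexes, and "why is this report slow?" conversations all make sense, with or without AI in the loop.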

The thing is that, with all the help in the world, you still need to know what to ask your help to do, and you need ways to deal with the agent problems of not doing something yourself. There are general-purpose techniques for the latter; for the former, you need to learn the major concepts of the particular domain until using them is as fluent as speaking your native language.

The way you get there is what the professor says: you practice the mental process by working through realistic projects. You don't build a toy database because the toy database is valuable. You build a toy because the toy builds you.

AI is going to pulverize the majority of software development jobs, but that's just because there are a huge number of roles that are based on thrashing and praying in an IDE until you get something that looks vaguely in the ballpark of what you want. There is an available labor market to do this, and there are a lot of margins where your business does better through this kind of software, but AI simply does it better.

Someone with a 4-year degree, though, who actually worked the exercises and took it as a learning experience, will be in good shape to contribute effectively.

As a parting thought, there's an analogy in music that was called out by the MIT Media Lab, in the old article "Pianos not Stereos". I think I would upgrade pianos to keyboards, and stereos to DJs, but the idea is the same.

https://web.media.mit.edu/~mres/papers/pianos/pianos.html

A DJ can be effective in 1-2 years, and I figure that a good music program should require some DJ training. However, all the top acts, even synth-heavy acts like Daft Punk, include a heavy use of general-purpose instruments like keyboards and guitars. It is not for lack of trying. The instrumentalists are just more flexible and will usually pull ahead.

Jeff B.:

I would say that the best DJs (well, producers) tend to also have sound musical backgrounds in regular old instruments and theory/history, despite mostly making computers do bleeps and bloops.

Frank Stein:

This is an interesting debate that has important practical implications. I think the level of understanding needed depends on who exactly is being taught in a given class--are the students being trained as specialists in the field or are they using the applied methods for other ends?

For example, there are many social scientists who learn statistics who don't need to know the math behind it--they just need to know the strengths and limitations of a given statistical method and how to use the software correctly to produce a sound analysis. However, for people majoring in statistics, knowing the math behind it seems foundational.

The same applies to other fields, like CS, philosophy, and the like.

Slowday:

Though the statistics delivered in research papers, and in everything downstream of them, are as a rule broken and misleading. More than occasionally I wonder if there is a better way to do all this.

Invisible Sun:

Did the computer make mathematics and mathematicians obsolete? No. But it did make math problem solvers obsolete.

Gary's avatar

I am reminded of my freshman French professor in college who insisted (several times per week) that no one could possibly learn French without first mastering Latin grammar. And yes, he lacked any sense of irony. I felt sorry for French people everywhere.

Andy G's avatar

A perfect example of the difference between *useful*, where I agree with your professor, and *essential*, where of course he is just wrong.

Slowday's avatar

You can be a bluffer but only to a certain extent.

In a somewhat earlier era, I had a colleague who firmly argued that learning facts was useless, since you could just look them up on Wikipedia. To a point, Lord Copper. But will you be able to hold up your side in a technical board argument when your answer would be to prompt ChatGPT, over and over?

Can you even be a promptlord without any deeper knowledge of the field? It can easily become one of those situations where the boss of the IT department only has a business degree. Impedance mismatch.

FCA's avatar

I think you are talking past each other.

My son studied the violin for 14 years. There were two components to learning; let's call them domain knowledge and transfer learning. Under the first he learned techniques related to the specific instrument, as well as general elements related to music (but still specialized within the orchestra). Under the second he learned focus, long-term investment, logical skills, problem solving, and rapid skill acquisition, all transferable to a wide range of other domains and arguably more valuable. The extent to which he learned elements of the second was related to the difficulty of the domain (the violin is not the harmonica, especially as the level goes up).

He will not be pursuing a career in the violin - but does that mean he over-invested in a domain? Is it useless?

Domain knowledge, particularly when deep and highly specialized, is indeed often not required by most workers even in the general area, and its immediate usefulness is likely to deteriorate (especially in technology, although math, logic, etc. seem more robust to decay). The problem is that the more transferable skills seemingly cannot be acquired without investing deeply in some specific hard domain, even one somewhat poorly chosen for immediate application.

I suspect Jain's problem is not that he wants all CS students to be DB experts. But he sees some core "hardness" (Boolean logic?) in the subject. I read his complaint of "students cheating themselves" as a suspicion that in NO class are they buckling down and digging in deeply. Just ChatGPT everywhere. And given that they are CS students, he would argue they should be digging down in some CS subject as opposed to some philosophy course. And hence they miss out on both components of learning.

And the problem is not that Lib Arts students aren't reading Plato and Aristotle -- it is they are not reading anything and even their summary readings are pulled from weak sauce.

My own maxim when hiring was to make sure the hire had plumbed the ocean depths in some difficult subject -- so they wouldn't drown when I tossed them into a bathtub. Within reason, I did not much care about which ocean.

The "topology before you can do economics" requirement is a mistake of focusing only on the first component. The "high school teachers can teach math with one undergraduate math course, so long as they have an Education PhD" position is the opposite fallacy: that in-depth domain knowledge does not matter much.

Interestingly, the claim "You need to know about integrated development environments, software packages, scripting languages and database integration. It’s cool to know elementary computer science, but I would argue that it’s not essential." knocked me back.

I have coded since 1984 and have forgotten more languages and dev stacks than I can count. I think you are arguing for becoming very competent at development, and for being able to express that general skill in the context of the current tool stack. But if you think your in-depth knowledge of the Eclipse IDE makes you a coder, then good luck on the breadline; ChatGPT will replace you. I saw many local technical-college graduates with this attitude crash when there was no plug-in for them.

Dave Friedman's avatar

I don't know anything about the professor you reference, but I have often found that academics' understanding of the skills and knowledge employers want their employees to have is divorced from reality.

Minh's avatar

From my experience, because CS university professors mainly work as researchers rather than as industry practitioners (and possibly because of the incentive to recruit more helpers for their research projects), they tend to teach what is useful for CS research rather than for industry jobs. Although, as I commented, they taught me the fundamentals of how the computer actually works, and that has saved me many times during my relatively short time in the industry.

Dave Friedman's avatar

I'm actually sympathetic to the argument that understanding, at a superficial level, the theory that undergirds how a computer operates (logic gates, etc.) could be useful to a software engineer, and that most practicing software engineers don't understand this stuff and so limit their professional ceiling. But it's not clear to me that that theoretical knowledge is a variable considered during the hiring process, or even the promotion process.
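As a concrete sketch of the "logic gates" layer Dave mentions (illustrative, not anyone's curriculum): a half-adder, the building block of binary addition, needs nothing but XOR and AND, and chaining two of them with an OR gives a full adder, the cell a CPU's ALU repeats once per bit.

```python
# A half-adder and full adder expressed with Python's bitwise operators.
def half_adder(a, b):
    """Add two bits; return (sum_bit, carry_bit)."""
    return a ^ b, a & b          # XOR gives the sum, AND gives the carry

def full_adder(a, b, carry_in):
    """Chain two half-adders, plus an OR to merge the two carries."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2
```

Knowing this doesn't make anyone faster at writing a web form, but it is the floor under every arithmetic operation the machine performs.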

Minh's avatar

Certainly not during the promotion process, but it is certainly an essential part of the hiring process. Lots of companies, especially the "Big Tech" ones, require you to pass multiple "coding interviews," where a more experienced engineer gives you a problem and you have to write code to solve it (most places don't let you use a full integrated development environment, or IDE, the daily tool that helps you write code faster with far fewer errors). It's almost always the case that you have to translate the problem into something that can be solved by manipulating some data structures with some algorithms, which are topics many people consider part of CS theory.
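A sketch of the kind of exercise Minh describes (a stock interview staple, not any particular company's question): the problem statement "find two numbers in a list that sum to a target" has to be translated into a data-structure choice. The naive answer is a double loop in O(n^2); the expected answer trades memory for time with a hash map in O(n).

```python
# Classic interview exercise: two-sum via a hash map.
def two_sum(nums, target):
    """Return indices of two entries summing to target, else None."""
    seen = {}                        # value -> index where it was seen
    for i, x in enumerate(nums):
        if target - x in seen:       # the complement appeared earlier
            return seen[target - x], i
        seen[x] = i
    return None
```

The point of the interview is rarely the answer itself; it is whether the candidate recognizes that "have I seen the complement?" is a lookup, and that a hash map makes lookups cheap.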

gas station sushi's avatar

To be a good accountant, you must first learn how to be a good bookkeeper. In order to be a good bookkeeper, you must first start with ledger paper. As boring and as basic as that sounds, the accounting insights of Luca Pacioli from the 15th Century are still relevant today. Unfortunately, there aren’t any significant shortcuts to accelerate learning of complex topics.

For many other occupations, I would assume that this same logic holds.
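A toy sketch of the Pacioli principle invoked above (account names are illustrative): every transaction posts equal debits and credits, so the ledger balances by construction, and a nonzero trial balance proves an entry error somewhere.

```python
# Minimal double-entry ledger: one posting always moves equal amounts.
from collections import defaultdict

class Ledger:
    def __init__(self):
        # account name -> debit-positive balance
        self.balances = defaultdict(float)

    def post(self, debit_account, credit_account, amount):
        """One transaction: debit one account, credit another, equally."""
        self.balances[debit_account] += amount
        self.balances[credit_account] -= amount

    def trial_balance(self):
        """Sum of all balances; zero iff every entry was double-sided."""
        return sum(self.balances.values())
```

Buying $500 of inventory with cash is `ledger.post("Inventory", "Cash", 500.0)`; however many transactions follow, `trial_balance()` stays at zero, which is exactly the 15th-century insight.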

Dallas E Weaver's avatar

If you don't understand the fundamental basics of your field in STEM, being creative and innovative becomes a low-probability event. Unless you understand, you cannot "see".

At the risk of showing my age: I started programming with Hollerith cards, before assembly language. I will point out where economists seem to have dropped the ball by not seeing the mathematical analogies with other fields. In STEM, we noted that analog circuits, acoustical problems, mechanical problems, thermal problems, etc., all obey similar equations, and one system can be modeled in another. An analog computer can calculate where a battleship's main gun will hit, but so can a mechanical computer, or the whole thing can be simulated in a digital computer.

I came across a graph of housing real-estate cost per square foot, adjusted for inflation, from the 1950s to the 2000s for high-regulation, low-regulation, and average states. For California, prices started near the average, then began oscillating, with spikes more than double the average and increasing instability. The instability began in the 1970s, when California passed its development-delaying activist legislation (CEQA), which increased the time delay for expanding the supply of housing. Electrical, mechanical, thermal, and related control systems likewise become unstable with on-off controllers whose feedback is delayed. A little thinking about a supply/demand system, where rising demand sends a price signal to contractors, shows it will become mathematically unstable when a delay is introduced into the supply function. You can't stably control something whose dynamic response time is the same as your response time.

The supply-and-demand differential equations do require complex math for solutions, and I seldom hear economists discussing imaginary numbers.
