I think you pretty much nail it, although I don't think that having your business even be highly software dependent is critical. I think programming teaches you to think hard about exactly what is happening at each step, exactly what inputs are needed, exactly what outputs are produced, etc. That's as opposed to almost every field involving humans where hand waving and vagueness can often be interpreted by others who know better and can make it work. Most people just can't think seriously about processes and interdependencies, even when their job depends on it. (I see this all the time, working in supply chain ERP system implementation. "Technical" people often get just as handwavy as business, but often because the business side lets them get away with it.)
My guess is that this is why folks from the hard sciences tend to do well as business managers too. You can't mumble your way into building a functional bridge, or rely on "then a miracle happens" for step six in mixing up chemicals. Anything where "do it right, in the right order, or it doesn't work" applies is good training.
At a high level of abstraction and 'architecture' coding really is the same skill as coordinating human efforts in organizations.
One has to manage complexity at scale in accomplishing a giant, multifaceted mission: divide-and-conquer and division-of-'labor' approaches, hierarchies of compartmentalized, special-purpose functions, defined paths of communication and authority, intelligent allocation of scarce resources, and constant testing, detection, and elimination of 'bugs', with iterative decision-making and refinement of processes and procedures ('instructions') for quality, reliability, and efficiency.
The basic outline of a modern efficient world-class organization is kind of like a general purpose CPU (they all tend to share similar basic design) with the founders / CEO programming and improving the human-coordination software to make it produce the intended outputs.
With hardware and software, the individual components and routines are both amazing, because each is specialized to do one particular task extremely well, and dumb, because they can't do other tasks efficiently and often only get to work with a tiny piece of the puzzle. The art of managing and arranging the big picture, making sure that functions can run in parallel on different cores (or brains) and still deliver intermediate outputs just in time for subsequent functions, takes extremely high levels of talent.
I've thought the same, I think another advantage programmers have is that we often have to be aware that the world is adversarial, which is not true of most hard sciences.
One of the most underrated damaging things that can happen to a business is an incentive system that looks clever on the surface but is easily abused or distorted. Not only do you encourage the wrong kind of behavior, but in time the wrong people will control the organization.
When writing software you are always aware that inputs are unpredictable. The users can do anything, and even in business systems that should stay internal there are no limits to inattention or malice. You have to be aware that every system you create is going to be tested.
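A minimal sketch of that defensive habit (the function and its rules here are hypothetical, not from the original post): validate every input at the boundary instead of trusting the caller to behave.

```python
def apply_discount(price: float, percent: float) -> float:
    """Compute a discounted price, rejecting inputs outside the expected range."""
    if price < 0:
        raise ValueError(f"price must be non-negative, got {price}")
    if not 0 <= percent <= 100:
        raise ValueError(f"percent must be between 0 and 100, got {percent}")
    return price * (1 - percent / 100)
```

Rejecting bad inputs loudly at the boundary is the coding analogue of designing business rules that hold up under inattention or malice.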
As an organization scales and attention is diluted, the interaction of incentives begins to dominate outcomes.
One thing to keep in mind is that having the programmer mindset is not enough. Having been around in the dawn of the PC age, I can recall dozens of companies run by engineers & programmers that had quick success and then even quicker collapse.
The companies run by businessmen often didn't do much better. (Think Apple under Sculley.)
You need *both* the programmer mindset *and* business insight.
For all his faults, Bill Gates was one of the few people I saw in the 80s (and beyond) who had both, which is why Microsoft succeeded.
Sculley's big failure was Newton, the personal assistant - because it wasn't also a phone, nor was music a big deal then.
Business visions need luck on timing, including improved tech so as to offer the visionary product at a low enough price to be bought. Apple's $10,000 Lisa was too expensive, tho "better" than the $4,000 IBM PC.
The PC was "open", like the popular Apple II, but unlike the Mac. That helped at first, but openness has since become a relative negative, inviting viruses and instability.
The idea of keeping a system that needs constant patching reminds me of a term in software development we call “technical debt”. You implement something in a way that you know will have consequences down the line, but you’re willing to pay the cost in the future in exchange for a quick/less expensive launch. Like real debt, it can be a powerful tool if used responsibly.
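A toy illustration of that trade-off (all names here are hypothetical): the first version hard-codes an assumption to ship sooner, with the debt labeled in a comment; repaying it later turns the assumption into a parameter without changing results for the original market.

```python
# Quick launch: hard-code the only tax rate we support today.
# TECH DEBT: replace with a per-region lookup before expanding to new markets.
TAX_RATE = 0.08

def checkout_total(subtotal: float) -> float:
    """Total due at checkout under the single hard-coded rate."""
    return round(subtotal * (1 + TAX_RATE), 2)

# Repaying the debt: same computation, but the assumption becomes a parameter.
def checkout_total_v2(subtotal: float, region_rates: dict, region: str) -> float:
    """Total due at checkout with a per-region rate lookup."""
    return round(subtotal * (1 + region_rates[region]), 2)
```

The debt was worth taking on only because it was visible and tracked; the dangerous kind is the hard-coded assumption nobody wrote down.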
Great article and important thoughts. I'd abstract the idea a little further and suggest that it's important to know how your product works and how teams succeed under uncertainty. Knowing how to code can teach you both things, but being the best coder generally doesn't make you a good CEO. There are technical and strategic skills that are hard to separate the more you develop them.
"... when it comes to complex modern businesses, luck plays only a small role in success."
I think we all would agree that success in a complex modern business depends on many factors coming together.
Some of these factors are under the control of (or clearly foreseeable by) the founders, such as corporate strategy and execution. For simplicity, let's call these factors "merit".
Some of the factors are not under the control of (or clearly foreseeable by) the founders, such as competitors' unexpected moves, macroeconomic conditions, and historical events (9/11, Covid-19, etc.). No matter how good a restaurateur might be, a restaurant that opened in late 2019 was nearly certain to fail when the lockdowns started. "Luck" is as good a word as any for the combined effect of such factors.
For best-in-many-millions cases like Bezos or Elon, the most reasonable hypothesis is that nearly all factors, including merit and luck and those in-between*, worked out favorably. People rarely achieve notable success without merit, but merit is not sufficient for success (especially in startups, because startups fail much more often than they succeed).
* Some factors that could be controlled will not be because money and time are limited and priorities must be set. Whether these factors are considered merit or luck depends on a judgement about whether failing to prioritize them was reasonable.
Who else had "reasonably good coding skills: Jobs (not in Wozniak’s league, but who was?), Gates"? Bill Gates wrote a version of BASIC in assembler before he dropped out of Harvard, so I'd put him at least in Wozniak's league. Gates also illustrates Handle's point that "At a high level of abstraction and 'architecture' coding really is the same skill as coordinating human efforts in organizations."
Jobs & Woz, together were a fine team - Gates' business acumen was better, tho not as visionary.
Likely a better coder than Woz was Mitch Kapor, the main creator of Lotus 1-2-3, the better-than-VisiCalc spreadsheet that was the prime reason that in 1982-84 all the finance guys in all the big companies were able to go buy IBM PCs. Lotus blew away all the competition. Kapor then created Symphony, trying to integrate the spreadsheet, word processing, and better presentation in one program. In theory elegant, but it was too much of a UI mess. And Gates & MS, with copy & paste integration of Word and Excel and the purchase of PowerPoint (originally made for the Mac), keeping the programs separate but working fairly well together, turned out to be the better strategy in the marketplace. [Gates bailing out Apple & Macs for a while in the late 80s is another fine tech-business story]
(I'm specifically looking for better AI teachers of English as a Second Language - the subject more people pay money to learn than any other in the world, and, for all native-speaking college graduates, an OK way to earn money in any other OECD country.)
This also relates to Noah Smith's note that Comp Sci students are almost as numerous as students of all Humanities together.
Arnold "was willing to say yes to total rewrites and inclined to say no to patching."
IBM tried to follow this idea in its $5 billion (or more?) "Blue Harmony" attempt to rewrite all of its different countries' separate "silos" of data into a single, consistent, customized SAP program. The program was NOT quite successful, so after Germany and China were switched on, IBM quietly stopped switching on the older countries and just put all new countries on it. Not a total failure - the VP in charge was not fired.
Rewrites are certainly costly - and might NOT be successful.
Lots of banks still run functioning COBOL programs.
In classifying goods, software is excludable but non-rival - often called a Natural Monopoly. This is something very different from what most businesses deal with. Gates understood that, most people don't. Aside from that, everyone in and around software development needs to read The Mythical Man Month. Several of the commenters here have discovered his rules on their own.
The Harvard MBA belief that "a good manager can manage anything" is BS. If you are in charge of something you don't understand, like coding, you eventually will have some nerds come to you to make a decision. The only basis for the decision becomes who presents the "better case", not who is correct. Among STEM workers the correlation between knowing what you are doing and presentation skills is not large. A coder knows when a coder is full of nonsense.
The same is true in all areas of STEM. When I was young I had a technical disagreement with a consultant on how some data should be analyzed. At a big meeting, with everyone from the CEO on down, I filled 3 blackboards with differential equations proving the other guy wrong. I lost, and the company lost $17 million on that decision. I did the postmortem.
To succeed in managing technical areas, you need to know something about the subject.
Our daughter had a friend whose father, an excellent labor lawyer, became the manager of an oil refinery. He managed to kill 3 people the first year by letting little errors that he didn't understand fall through the cracks.
I agree with much that has been said here, particularly by Dr. Hammer. I wonder what this might say about new firm entry. If all you need is a good idea, a software-savvy leader with a human touch, and a team of talented programmers, then new firm entry depends on the availability of talent - keep open the gates for smart immigrants.
Besides good interface definitions and documentation, scalable software development requires good testing. If you have a rigorous (and ideally mostly automated) way of verifying that any change does not break the existing commitments that your interface definitions have made, you can make changes much more easily and non-disruptively. This often shifts the tradeoffs for systems upgrading in favor of incremental stepwise change and away from ground-up rewrites, because the former becomes so much safer and cheaper.
A good testing discipline in turn improves your ability to think productively about what you are doing, probably producing a general multiplier effect for business efficiency. It forces you to constantly ask yourself: what does it mean for this thing to work correctly, and how would I know whether it does?
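A minimal sketch of that discipline (the `normalize` function and its contract are hypothetical, written as plain assert functions in the style a runner such as pytest would collect): state what "working correctly" means as executable checks, so any later change is verified against the same commitments.

```python
def normalize(name: str) -> str:
    """Interface commitment: strip surrounding whitespace and lowercase."""
    return name.strip().lower()

# Automated statements of the promises the interface has made. A rewrite of
# normalize() passes or fails these checks regardless of how it works inside.
def test_strips_and_lowercases():
    assert normalize("  Alice ") == "alice"

def test_clean_input_is_unchanged():
    assert normalize("bob") == "bob"

if __name__ == "__main__":
    test_strips_and_lowercases()
    test_clean_input_is_unchanged()
    print("all interface commitments hold")
```

Because the checks exercise the commitment rather than the implementation, they make incremental change cheap: each small step either keeps every promise or fails immediately.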
I am a software engineer with over a decade of experience in the field. There's a saying in the industry that all software problems are people problems. Whenever I look at a piece of software, I always ask: who made this, and why? What were their incentives and constraints? What business problem is the software trying to solve?
What are the risks to the business if I change this software and it introduces a bug? Could the company go out of business?
In other words, I believe Goodhart's law and Conway's law are essential to understanding how software is written and maintained.
Is software really distinct from any other technical field here? A lot of great industrial concerns were headed by engineers and scientists. Only a minority of nerds make good businessmen, but that minority seems to exist in all fields.
> I think that so many complex businesses today depend on software that to be a business leader you need a sense of the software development process.
Sounds right. It's not that the software skillset is special, it's just that software is very important nowadays.
> I formed the opinion, which I took with me to my own business, that putting off rewriting a system and letting it accumulate patches is a bad idea. I was willing to say yes to total rewrites and inclined to say no to patching.
It's more complicated than that. The most formidably excellent software systems I've seen are also formidably crusty. They each embody at least a decade of hard-won knowledge about real-world problems, their solutions, and the wicked interactions between all of the above. Rewrite attempts just bring in a naive optimism that makes things worse.
But the converse is not true. Something is unlikely to be excellent just because it is crusty and ancient. It might just be embodying decades of hard-won stupidity. The developers of the excellent systems have been willing to apply the kind of rigour you are talking about. But part of the rigour was doing some careful study before dismantling any Chesterton fences.
The best time to rewrite a system is when you just got it working. You still have the developers who built it, and they know about the Chesterton fences. Take immediate advantage of "if I had it to do over again" thinking.
How much software have you seen "shipped" recently?
Usually it's already both late AND over budget, so it's shipped even tho there are some bugs. Rewriting it rather than fixing bugs seems likely to be a business disaster. (See my IBM Blue Harmony example - rewritten without any of the original developers.)
Especially when the developers always want to be on the cutting edge of tech.
Many users are sick and tired of MS making new versions of Windows with new, different but not superior user interfaces, which they need to adopt so as to stay consistent at work.
Rewriting a popular User Interface (front end) is more likely a mistake than re-writing internal processing (back end), or adding new features (add-in).
This is true -- work with the intention that the first thing you make is a prototype, and remember to treat it as such when the temptation comes to just keep it in production. After a few years this is not directly relevant to the system as a whole -- but it is relevant to the components.
My most successful rewrite occurred when I found a way to do something much more efficiently than the preexisting system (which wasn't written by me). It was easier to rewrite the entire system to accommodate my new methods (while preserving externally visible behavior) than it was to retrofit the existing system.
I subsequently executed a rewrite of my rewrite. This one benefited from experience in the manner you describe, and I'd also classify it as successful.
But it's the first rewrite that had the greatest impact. That class of insight is difficult to plan for, but I think it's crucial to account for it in any theory of rewrites.
Agreed, but I think this doesn't square with the longer-term problem of time. Most software is going to change over time. I think software itself is "young" enough that we really haven't figured out how to deal with this problem.
Take the IRS software, which IIRC was famously written in COBOL. Just hypothetically, let's imagine the software was written and documented perfectly on Day 1 back in 1970, and then rewritten and re-documented perfectly on Day 365 back in 1971.
The fact that it was perfect 50 years ago means nothing to us now. Yet, continually dumping a lot of effort into rewriting a perfect (or nearly perfect) code base is an obviously bad, expensive idea. And doing nothing is just as obviously a bad idea, but it's a really cheap bad idea.
Economically, I have no idea where the marginal benefits of such efforts start to emerge.
As a practicing software engineer (trained as a scientist), I would agree that software is not distinct from other technical fields. An engineer that can “do” is accustomed to mathematically defining his problem, forming theories around the problem and then refining those theories by confronting reality. He will generally seek to develop intuition in his problem domain and have a toolkit to quickly make approximations. He will also develop a strong appreciation for how easy it is to be wrong.
I would be surprised if the above background skills are not transferable to business problems. Some entrepreneurs that I know and respect greatly have learned to refine their business skills (theories) over time by confronting reality and have developed great intuition in their problem domains. Those entrepreneurs still find great value in working with engineers who can define a problem well. Those entrepreneurs can also appreciate reasoned feedback on why their ideas might be wrong. I am not aware of any of the above skills being an integral part of a business school education.
On rewriting software, I also am a skeptic. My general feeling is that systems produce software and if one’s system has produced a mess, then one needs a better system. One can start all over again with the hope that the second time through the existing system will produce better software, but especially in a group development situation, my null hypothesis is that the second time through will produce a new mess with all the bugs that the old system already fixed. I don’t think that the “better system” technology currently exists for groups, but some individual engineers have thought processes which are better at distilling complexity and tend not to make messes.
> On rewriting software, I also am a skeptic. My general feeling is that systems produce software and if one’s system has produced a mess, then one needs a better system.
Yes. I once joined a firm that was one year into a train-wreck project to rewrite a crusty behemoth straight out of Dilbert. That time should have been spent writing a rigorous set of lower-level tests on the original behemoth so that we could start hacking away at it with confidence.
(At least they had strong system-level tests. But in this case system level meant "put the damn software into the damn robot and rigorously check that the robot does its work".)
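The lower-level tests described here are often called characterization tests: before touching legacy code, you record what it actually does, quirks included. A toy sketch (`legacy_round` is a hypothetical stand-in for one small corner of such a behemoth):

```python
def legacy_round(x: float) -> int:
    """A crusty inherited helper. Note it rounds half-up, unlike Python's
    built-in round(), which rounds 2.5 down to 2 (banker's rounding)."""
    return int(x + 0.5)

# Characterization tests: pin down the current behavior, warts and all, so a
# refactor that changes any of it fails loudly instead of silently.
def test_half_up_quirk():
    assert legacy_round(2.5) == 3   # built-in round(2.5) would give 2

def test_negative_half():
    assert legacy_round(-0.5) == 0  # surprising, but it's what ships today
```

The point is not that the quirks are good; it's that downstream code may depend on them, so they must be surfaced and decided on deliberately rather than changed by accident.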
"Is software really distinct from any other technical field here?"
A unique aspect of software is that it fails fast. There is near immediate feedback that something in the design or implementation is broken. You find this feedback in certain engineering systems - particularly those related to computer hardware and electronics. But you don't have it in structural engineering. A building or bridge can be flawed, and yet take decades to collapse. All those years and the flaw remains hidden.
There is a certain humility required for doing software. It is hard. It is demanding. One needs to be willing to accept being wrong but also capable of accepting feedback to fix what is broken.
Another aspect is that the complexity of system software requires the developers to understand the full process. This burden to have both a holistic view and a detailed understanding is a valuable skill for managing a business - which is also a complex system requiring acute understanding of processes and their dependencies.
>> "Is software really distinct from any other technical field here?"
> A unique aspect of software is it fails fast. There is near immediate feedback that something in the design or implementation is broken. You find this feedback in certain engineering systems - particularly those related to computer hardware and electronics. But you don't have it in structural engineering. A building or bridge can be flawed, and yet take decades to collapse. All those years and the flaw remains hidden.
Good point, I don't mean to imply that all fields are the same and have transferable skills. But that they relate to businesses in the same way. If you were starting a construction firm in 1950 then the engineer with a rigorously-blueprint-and-calculate mindset would be more valuable than one with a software-like prototyping mindset.
Now to Arnold's point: if you were starting a *new* construction company, then maybe the software guy really does have an advantage -- but only if the new innovation that this company brings to the table is better use of software.
Integration with Machine Learning / AI will be a near future business fight. Tyler linked to a fine post on it: http://blog.eladgil.com/2022/08/ai-revolution-transformers-and-large.html
I agree, but I think "just luck" is pretty good as an alternative to "learned how in B-school." :)
A agree with much that has been said here particularly by Dr. Hammer. I wonder what this might say about new firm entry. I mean if all you need is a good idea, a software savvy leader with a human touch and a team of talented programmers? Then new firm entry depends on the availability of talent--keep open the gates for smart immigrants.
Besides good interface definitions and documentation, scalable software development requires good testing. If you have a rigorous (and ideally mostly automated) way of verifying that any change does not break the existing commitments that your interface definitions have made, you can make changes much more easily and non-disruptively. This often shifts the tradeoffs for systems upgrading in favor of incremental stepwise change and away from ground-up rewrites, because the former becomes so much safer and cheaper.
A good testing discipline in turn improves your ability to think productively about what you are doing, probably producing a general multiplier effect for business efficiency. It forces you to constantly ask yourself: what does it mean for this thing to work correctly, and how would I know whether it does?
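The point about tests pinning down interface commitments can be made concrete with a small sketch. Everything here is hypothetical (the `parse_price` function and its contract are invented for illustration): the idea is that the assertions *are* the interface's commitments, so any later change, including a total rewrite of the internals, can be verified cheaply and non-disruptively against them.

```python
def parse_price(text: str) -> int:
    """Interface commitment: accept '$1,234.56'-style strings,
    return the amount in cents, and reject malformed input."""
    cleaned = text.strip().lstrip("$").replace(",", "")
    dollars, _, cents = cleaned.partition(".")
    if not dollars.isdigit() or (cents and not cents.isdigit()):
        raise ValueError(f"not a price: {text!r}")
    return int(dollars) * 100 + int((cents or "0").ljust(2, "0")[:2])

def test_parse_price():
    # These assertions encode the contract; they must keep passing
    # no matter how parse_price is rewritten internally.
    assert parse_price("$1,234.56") == 123456
    assert parse_price("7") == 700
    try:
        parse_price("abc")
    except ValueError:
        pass  # malformed input rejected, as promised
    else:
        raise AssertionError("malformed input must be rejected")

test_parse_price()
```

With this in place, an incremental change (or even swapping out the parsing logic entirely) is safe exactly when the test suite still passes, which is why good testing shifts the tradeoff toward stepwise change.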
I am a software engineer with over a decade of experience in the field. There’s a saying in the industry that all software problems are people problems. Whenever I look at a piece of software, I always ask: who made this, and why? What were their incentives and constraints? What business problem is the software trying to solve?
What are the risks to the business if I change this software and it introduces a bug? Could the company go out of business?
In other words, I believe Goodhart's law and Conway's law are essential to understanding how software is written and maintained.
As for
Is software really distinct from any other technical field here? A lot of great industrial concerns were headed by engineers and scientists. Only a minority of nerds make good businessmen, but that minority seems to exist in all fields.
> I think that so many complex businesses today depend on software that to be a business leader you need a sense of the software development process.
Sounds right. It's not that the software skillset is special, it's just that software is very important nowadays.
> I formed the opinion, which I took with me to my own business, that putting off rewriting a system and letting it accumulate patches is a bad idea. I was willing to say yes to total rewrites and inclined to say no to patching.
It's more complicated than that. The most formidably excellent software systems I've seen are also formidably crusty. They each embody at least a decade of hard-won knowledge about real-world problems, their solutions, and the wicked interactions between all of the above. Rewrite attempts just bring in a naive optimism that makes things worse.
But the converse is not true. Something is unlikely to be excellent just because it is crusty and ancient. It might just be embodying decades of hard-won stupidity. The developers of the excellent systems have been willing to apply the kind of rigour you are talking about. But part of the rigour was doing some careful study before dismantling any Chesterton fences.
The best time to rewrite a system is when you just got it working. You still have the developers who built it, and they know about the Chesterton fences. Take immediate advantage of "if I had it to do over again" thinking.
How much software have you seen "shipped" recently?
Usually it's already both late AND over budget, so it's shipped even though there are some bugs. Rewriting it rather than fixing the bugs seems likely to be a business disaster. (See my IBM Blue Harmony example -- rewritten without any of the original developers.)
See the 15-year development of Duke Nukem Forever, where the developers always wanted to be on the cutting edge of tech:
https://www.looper.com/246833/this-game-took-15-years-to-develop-heres-why/
Many users are sick and tired of MS making new versions of Windows with new, different, but not superior user interfaces, which they need to update to in order to stay consistent with work.
Rewriting a popular User Interface (front end) is more likely a mistake than re-writing internal processing (back end), or adding new features (add-in).
This is true -- work with the intention that the first thing you make is a prototype, and remember to treat it as such when the temptation comes to just keep it in production. After a few years this is no longer directly relevant to the system as a whole -- but it is relevant to the components.
My most successful rewrite occurred when I found a way to do something much more efficiently than the preexisting system (which wasn't written by me). It was easier to rewrite the entire system to accommodate my new methods (while preserving externally visible behavior) than it was to retrofit the existing system.
I subsequently executed a rewrite of my rewrite. This one benefited from experience in the manner you describe, and I'd also classify it as successful.
But it's the first rewrite that had the greatest impact. That class of insight is difficult to plan for, but I think it's crucial to account for it in any theory of rewrites.
Agreed, but I think this doesn't square up to the longer term problems of time. Most software is going to change over time. I think software itself is "young" enough that we really haven't captured how to deal with this problem.
Take the IRS software, which IIRC was famously written in COBOL. Just hypothetically, let's imagine the software was written and documented perfectly on Day 1 back in 1970, and then rewritten and re-documented perfectly on Day 365 back in 1971.
The fact that it was perfect 50 years ago means nothing to us now. Yet, continually dumping a lot of effort into rewriting a perfect (or nearly perfect) code base is an obviously bad, expensive idea. And doing nothing is just as obviously a bad idea, but it's a really cheap bad idea.
Economically, I have no idea where the marginal benefits of such efforts start to emerge.
As a practicing software engineer (trained as a scientist), I would agree that software is not distinct from other technical fields. An engineer who can “do” is accustomed to mathematically defining his problem, forming theories around the problem, and then refining those theories by confronting reality. He will generally seek to develop intuition in his problem domain and have a toolkit to quickly make approximations. He will also develop a strong appreciation for how easy it is to be wrong.
I would be surprised if the above background skills are not transferable to business problems. Some entrepreneurs that I know and respect greatly have learned to refine their business skills (theories) over time by confronting reality and have developed great intuition in their problem domains. Those entrepreneurs still find great value in working with engineers who can define a problem well. Those entrepreneurs can also appreciate reasoned feedback on why their ideas might be wrong. I am not aware of any of the above skills being an integral part of a business school education.
On rewriting software, I also am a skeptic. My general feeling is that systems produce software and if one’s system has produced a mess, then one needs a better system. One can start all over again with the hope that the second time through the existing system will produce better software, but especially in a group development situation, my null hypothesis is that the second time through will produce a new mess with all the bugs that the old system already fixed. I don’t think that the “better system” technology currently exists for groups, but some individual engineers have thought processes which are better at distilling complexity and tend not to make messes.
> On rewriting software, I also am a skeptic. My general feeling is that systems produce software and if one’s system has produced a mess, then one needs a better system.
Yes. I once joined a firm that was one year into a train-wreck project to rewrite a crusty behemoth straight out of Dilbert. That time should have been spent writing a rigorous set of lower-level tests on the original behemoth so that we could start hacking away at it with confidence.
(At least they had strong system-level tests. But in this case system-level meant "put the damn software into the damn robot and rigorously check that the robot does its work".)
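The "lower-level tests on the original behemoth" strategy is sometimes called characterization testing. A minimal sketch, with entirely hypothetical names (`legacy_discount` stands in for some crusty routine nobody dares touch): record what the legacy code currently does, then pin that behavior down so you can start hacking away with confidence.

```python
def legacy_discount(order_total):
    # Stand-in for a legacy routine; its exact behavior, warts and
    # all, is the thing we want to preserve while refactoring.
    if order_total >= 100:
        return order_total * 0.9
    return order_total

# Step 1: capture the current behavior on representative inputs.
golden = {total: legacy_discount(total) for total in (0, 50, 100, 250)}

# Step 2: after each small change, re-check against the recording.
def matches_golden(fn):
    return all(fn(total) == expected for total, expected in golden.items())

assert matches_golden(legacy_discount)  # trivially true before any change
```

The recorded behavior may well include bugs; the point is that any *unintended* change of behavior is caught immediately, which is what makes incremental dismantling safer than a blind rewrite.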
"Is software really distinct from any other technical field here?"
A unique aspect of software is that it fails fast. There is near-immediate feedback that something in the design or implementation is broken. You find this feedback in certain engineering systems -- particularly those related to computer hardware and electronics. But you don't have it in structural engineering. A building or bridge can be flawed, and yet take decades to collapse. All those years, the flaw remains hidden.
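A toy illustration of the fail-fast point (the `average` function is invented for the example): the flaw announces itself the very first time the code runs on the bad input, rather than lying hidden for decades the way a structural defect can.

```python
def average(values):
    # Flaw: the empty-input case was never considered.
    return sum(values) / len(values)

try:
    average([])  # the flaw surfaces immediately as a ZeroDivisionError
except ZeroDivisionError:
    feedback = "flaw surfaced on first run"

# There is no software analogue of a bridge that stands for years
# before its hidden defect finally matters: run the code, get the news.
```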
There is a certain humility required for doing software. It is hard. It is demanding. One needs to be willing to accept being wrong but also capable of accepting feedback to fix what is broken.
Another aspect is that the complexity of system software requires the developers to understand the full process. This burden to have both a holistic view and a detailed understanding is a valuable skill for managing a business - which is also a complex system requiring acute understanding of processes and their dependencies.
>> "Is software really distinct from any other technical field here?"
> A unique aspect of software is that it fails fast. There is near-immediate feedback that something in the design or implementation is broken. You find this feedback in certain engineering systems -- particularly those related to computer hardware and electronics. But you don't have it in structural engineering. A building or bridge can be flawed, and yet take decades to collapse. All those years, the flaw remains hidden.
Good point, I don't mean to imply that all fields are the same and have transferable skills. But that they relate to businesses in the same way. If you were starting a construction firm in 1950 then the engineer with a rigorously-blueprint-and-calculate mindset would be more valuable than one with a software-like prototyping mindset.
Now to Arnold's point: if you were starting a *new* construction company then maybe the software guy really does have an advantage -- but only if the new innovation that this company brings to the table is better use of software.