I just went through an "agile to MVP then refine" process and it was miserable and resulted in the worst product I've used in a decade, and believe me, that's a high bar to have surpassed.
To be fair, my organization is also increasingly dysfunctional. There seems to be an extension of Conway's Law at work: the health of the software development process comes to resemble the health of the organization's core functioning. It follows that there is no one right approach to development; different organizations need very different approaches, or things end in disaster.
"Agile MVP" seems better suited to organizations with strong leadership, hierarchies, accountability, (I.e., some one person is known to everyone to be in charge of the project, and is empowered to actually be in charge), well-defined responsibilities and processes, and experience at formulating and communicating vision and requirements precisely and assessing objectively.
None of that is how you would describe my organization, so the approach crashed on the rocks. After MVP, there was zero motive all around to make even the biggest bang-for-buck improvements, so everything froze with all but severe work stoppage bugs going on the indefinite backlog.
You can imagine what the minimally viable version of anything you own would be like, the version that could have been developed with the absolute smallest budget in the minimum amount of time and still technically satisfying some core function, and then the prospect of being stuck forever with that version of it.
Yup, I just wasted my last working year on a project that exhibited a "discovery by user testing" mindset and I couldn't retire fast enough to avoid the inevitable disaster of an implementation.
The idea that "agile" enables software development with no planning, no requirements, no gap analysis, and no discovery is just nonsense on stilts but is believed by way too many people who should really know better.
I would add that leadership obligated us to do it all remotely, and even the most pro-virtual-everything advocates eventually broke down at the frictions of working that way and begged to get the human beings in the same room for a while, to look over shoulders and doodle flowcharts on paper and so forth. Most of the workforce is just not up to the challenge of collaborating efficiently and effectively on complex projects at a distance.
My suspicion, which may be worth a proper investigation by somebody, is that the average person has become worse at this than their predecessors in preceding generations would have been. What I mean is that our society seems increasingly post-literate: most people read and write much less, and communicate at length strictly in text much less, in part because of new multimedia possibilities and the dumbing down of standards, among other contributing factors.
When communication at a distance and information technology were mostly just words, people had little choice but to focus their effort and become very skilled at explaining exactly what they were thinking and envisioning, with accuracy, precision, economy, and grace, with resort only to words. In short, they had larger vocabularies and were more articulate.
A generation or two of technologically-enabled cultural decline later, I see people, especially younger ones, constantly struggle to put their thoughts into words without a lot of straining and repeated attempts at clarification and so forth.
So, when one is trying to develop the user experience of a software project, many of the points one needs to communicate demand a high capacity for articulation: describing in words how a display should be manipulated or modified. And this is now so frustrating for most people to do in a real-time, Zoom-like meeting, even with screen sharing and drawing tools, that they throw up their hands and quiet quit any time the developers ask whether there are issues anyone wants to bring up.
So perhaps there is a kind of ironic paradox in that familiarity with some IT tech actually erodes the skills needed to efficiently collaborate over other IT tech, making occasional syncing in person more, not less, essential.
Oh, dear. My experience has been that people who couldn’t write clear and concise COBOL or PL/1, couldn’t write clear and concise English either. We may find out to our dismay that the reverse is also true.
There is a similar tension in hardware engineering. Take, for example, building an optical assembly, such as a microscope or telescope, that requires mechanical, electrical, software, and optical engineers.
There are certain optical assemblies for which no MVP can be built, because without a custom-designed and custom-fabricated lens, no feedback can be obtained from the system. Designing and manufacturing the lens assembly, whether reflective or refractive, is almost always the longest-lead item.
This puts tremendous pressure on the optical engineers. And to add more pressure, managers typically don't understand enough optics to see why it takes so long to design and fabricate a lens assembly, so the optical engineers have to spend extra time educating project managers and executives on nuances such as lens fabrication tolerancing, in which the optical designer iteratively goes back and forth with the fabricator to learn which tolerances are too tight. This often requires changing the surface shape of the lens, which in turn requires full re-optimization of all the lens dimensions and the mechanical mounting scheme.
Fabricators are typically unwilling to give away their secret techniques or their weaknesses, so published lens fabrication tolerances don't always apply. For example, some lens materials are so unusual that there is nothing published to use as a reliable reference.
On top of this, there is a subset of optics called stray-light engineering that is very important for seeing faint objects against a noisy background: camouflaged projectiles coming your way, for example, or trace molecules that signal deadly cancer in one's blood. In these life-or-death cases the system's sensitivity must meet specification; otherwise the missiles or the cancer will not be detected in time and people will die.
If people only knew how much concentration it requires to solve such complex problems and how much it would benefit humanity to be able to detect cancers months sooner...
So, people: read _The Three Languages of Politics_, aim for civil dialogue, and end the cancellation attempts, so your optical engineers can design better lenses and we can increase life expectancy and quality of life.
This message has been brought to you by former optical engineers of Silicon Valley who no longer practice optics because, well, who can design lenses in the face of lockdowns, school closures, Supreme Court nominee cancellation attempts, gender transition discussions at public schools, and so on and so forth.
You won’t have nice optics until you read Arnold’s book.
Agile is OK, but a pure agile development process results in the kind of poorly designed product that would normally take a decade of upgrades to reach. One add-on after another leads to a system that may work but could have been designed much better. Teams now refactor code often because of this. A little forethought would also help.
Have you seen Mollick’s post today about the future of organizations & software development? Pertinent to your post.
Well said.
If I may add some color, any project of sufficient complexity will require iteration. Even when NASA put people on the moon, they didn't try it on their first launch. They had many launches leading up to that. The key is that with each iteration, each launch, they had an idea about what they wanted to learn from it, and how that learning fit into the overall mission.
What they weren't doing was just thrashing about until they hit on something. It is easy enough nowadays to thrash about without a vision or plan and call it a methodology. The key with iteration is always to try to learn more about the problem you are trying to solve. Getting the most learning from an iteration requires spending some time thinking about what you are doing and figuring out how to articulate the questions you are asking and the answers you are finding. We can now make iterations smaller, and so we need less time between iterations, but some thought about the overall picture is still required.
All I will say about Agile is that while I have learned much from its proponents, they also tend to be very parochial (everything is viewed through the lens of corporate IT consulting) and love to play "No True Scotsman" whenever their favored methodology is criticized.
In the early 80s, many developers were creating prototypes and expanding usability after debugging the prototype. Similar but inferior to an MVP, which is a prototype that is expected to become an offered product.
Borland had ads with the message “make mistakes faster”. At least for Turbo Pascal, maybe also Turbo C.
Pascal itself was an “educational language” not designed for professional development, but learning it, mastering it, made it easy to keep using it rather than learn C, or any other language.
PCs were starting to replace minicomputers. At Ford Aerospace in Palo Alto, the head of P&BA would bring in his Apple II with VisiCalc to rapidly back-calculate the numbers he wanted to present, so I only had a couple of JCL submissions per month, before switching in ’82 to a new group of young IBM PC experts helping teach our VPs how to use PCs.
Everyone wanted Lotus 1-2-3.
A few jobs later I was a developer, doing prototyping and expanding. One good working tip was to make a backup daily, but first thing in the morning, with coffee and time, rather than late in the day. There were many 10- and even 12-hour days.
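A minimal modern sketch of that morning-backup habit, in Python; the project directory and backup location here are hypothetical, not anything from the original workflow:

```python
# morning_backup.py - a small sketch of the "back up first thing, with coffee" habit.
# The paths and project name are hypothetical examples; adjust to your own setup.
import shutil
from datetime import date
from pathlib import Path

SOURCE = Path.home() / "projects" / "myapp"   # working copy (assumed location)
BACKUP_ROOT = Path.home() / "backups"         # where dated archives accumulate

def morning_backup() -> Path:
    """Zip the working copy into a date-stamped archive and return its path."""
    BACKUP_ROOT.mkdir(parents=True, exist_ok=True)
    stamp = date.today().isoformat()          # e.g. 2024-05-01
    archive = shutil.make_archive(
        str(BACKUP_ROOT / f"myapp-{stamp}"),  # base name, extension added by shutil
        "zip",
        root_dir=str(SOURCE),
    )
    return Path(archive)

if __name__ == "__main__":
    print(f"Backed up to {morning_backup()}")
```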
MVP works much better for smaller projects, as does the agile independent-group process. Yet a small project for a group that is expected to grow, and then does grow, can scale quickly.
Now I’m wondering about hybrid project management with multiple agile scrum groups and AI-based coordination & cooperation.
“But the software engineers of the 20th century should not be faulted for failing to invent the agile approach.”
In my career as a systems analyst, I went from punch cards to agile. One of the first minicomputers I worked with was a PDP 12. It was six feet tall, six feet wide, two feet deep, and had 4K of memory. The type of programming necessary given such severe memory constraints would get you fired today. The “clever” coding that was needed was difficult to understand and maintain.
One “problem” that comes with the luxury of infinite do-overs is the mindset that it fosters. You lose touch with the “real world” in which do-overs may not be an option. A miner loading explosives into a drill hole or a bomb disposal expert dealing with a suspicious-looking package can’t afford errors; there are no iterations.
I'm one of those developers who knows what MVP means but had never heard of JCL. (Although, in my day job, I work on operations. Of course, the actual loading of the programs and data is done by computer programs, but those programs need to be configured, and they are configured in YAML. One thing I do is make sure the YAML is right.)
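To make that concrete, here is a minimal sketch of the kind of check I mean, assuming PyYAML is available; the file name and the required keys (`service`, `replicas`, `image`) are hypothetical examples, not any particular system's schema:

```python
# check_config.py - minimal sketch of "make sure the YAML is right".
import sys
import yaml  # PyYAML, a third-party package

REQUIRED_KEYS = {"service", "replicas", "image"}  # hypothetical schema

def check(path: str) -> None:
    with open(path) as fh:
        config = yaml.safe_load(fh)  # raises yaml.YAMLError on bad syntax
    if not isinstance(config, dict):
        raise ValueError(f"{path}: top level must be a mapping")
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        raise ValueError(f"{path}: missing keys: {sorted(missing)}")
    if not isinstance(config["replicas"], int) or config["replicas"] < 1:
        raise ValueError(f"{path}: replicas must be a positive integer")

if __name__ == "__main__":
    check(sys.argv[1])
    print("config looks right")
```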
I think you're right that microcomputers and the internet enable "agile" in a way that mainframes don't. But just because you have new computers doesn't mean that you can do agile either.
My understanding of the healthcare.gov meltdown is that the mainframe and the mainframe-to-modern-server interface worked just fine. What broke was where the modern servers met the user. It simply wasn't ready to handle tens of millions of daily active users, and this was because the contractors hired to build it had no idea how to build for the modern consumer internet.
Then a small team of agile practitioners patched it back together in a few weeks. If the problem had been the mainframes, or how to integrate the mainframes, they would have been dead in the water for the reasons you laid out. Agile has no power there.
Here is the best explanation I've heard: https://changelog.com/gotime/154
Also, have you read Fred Brooks's _The Mythical Man Month_? It is about building the OS for the IBM 360 mainframe. It is all about program management, not code.
Agile development started in the 1990s as e-commerce and the internet took off. It was necessitated by short timelines and what were seen as dire competitive threats from disintermediation. Object-oriented software also made tinkering easier. Agile was a 20th-century development not widely utilized until the early 21st century.
This is why minicomputers running UNIX and other OSs took off in the early ‘70s. The cheaper, simpler, and less powerful machines made iteration possible. Microcomputers made it even easier, while eating the minicomputers’ lunch. I don’t know if I could have remained in software development for 40 years without them.
The ancestor of agile was worse-is-better: https://dreamsongs.com/RiseOfWorseIsBetter.html
I've been involved with products ranging from fully successful to so bug-laden and ineffective that they never truly worked as they should. Some of the best were created to first collect the data, later perform basic analysis, and then report. Later still, more analysis and reporting were added.
Besides this stepwise process, what I found most helpful was good code writers who could code each function efficiently and, maybe more importantly, who wrote extensive comments and documentation, making it easy to identify what each subroutine was doing and to debug. Messy code was nearly impossible to debug. Sometimes it was good enough that someone could go through and organize and comment it, but just as often it got rewritten.
Sorry. I went off on a tangent in my comment above about political distractions that make it difficult for engineers to solve complex problems. Where was I going with that? It is possible to build software that simulates complex physical phenomena, so that an MVP strategy could be applied to hardware, but this would require a very long-term strategy of investing in such simulators.
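As a toy illustration of that idea (not real lens-design software, and the requirements below are made up for the example), here is a sketch that iterates over a single-lens layout using only the ideal thin-lens equation, so a design can be "tried" in software before anything is fabricated:

```python
# thin_lens_sketch.py - toy illustration of iterating on hardware "in software".
# Uses only the ideal thin-lens equation 1/f = 1/d_o + 1/d_i; real lens design
# involves far more (aberrations, tolerances, stray light), so this is a sketch.

def image_distance(focal_length_mm: float, object_distance_mm: float) -> float:
    """Ideal image distance d_i from the thin-lens equation."""
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)

def magnification(object_distance_mm: float, image_distance_mm: float) -> float:
    """Transverse magnification m = -d_i / d_o."""
    return -image_distance_mm / object_distance_mm

# Hypothetical requirement: image a sample 300 mm away with |m| >= 0.4,
# while keeping the sensor within 250 mm of the lens.
OBJECT_DISTANCE = 300.0
for f in range(50, 200, 10):  # candidate focal lengths in mm
    d_i = image_distance(f, OBJECT_DISTANCE)
    m = magnification(OBJECT_DISTANCE, d_i)
    if d_i <= 250.0 and abs(m) >= 0.4:
        print(f"f = {f} mm: image at {d_i:.1f} mm, magnification {m:.2f}")
```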
It's nice to see this approach evolve into try, get an error, and repeat. I can't imagine having something be "offline" these days, or having real problems. The sad part is that this approach assumes you always have unlimited resources, so errors don't matter. So what if the program failed? Try a new version. So what if data leaked? Just update it. So what if we deleted all the records? Just reinstall and try again. Apply this to medicine: so what if many people died, we will fix that error next time. Or so what if traffic killed five people, we will fix it next time.
That is what lab mice were for