There is an old adage that if your only tool is a hammer, everything looks like a nail. We use the lessons learned and experience we have gained to address current issues. But if the problem (or software project) we face today is fundamentally different from those we’ve dealt with previously, past experience isn’t the proper framework. In effect, we will be using a hammer when a saw or a chisel might be the tools we need.
The solution, of course, is to first gain an understanding of the problem at hand. What are its defining features? How does it behave? Only then can a proper solution be designed and the appropriate tools selected.
To a large degree, our understanding of how products are developed comes from knowledge gained from manufacturing since the beginning of the Industrial Revolution. Our first instinct is to apply those manufacturing lessons to software development. But there is a huge problem with this approach. The creation of software is not a manufacturing process, but rather a knowledge acquisition and learning process that follows different rules. Here is a simple example. If I have an assembly line and want to double my output, I have several choices. I might add a second shift of workers, or I could install an additional assembly line. Because manufacturing is a repetitive process in which design problems are solved before product construction begins, the relationship between labor required and output remains fairly constant. In a nutshell, we already know exactly what we need to do (and how to do it).
Software is a different beast. Because design often “evolves” during product construction, there is always some uncertainty about what is required and how to do it. Software development is the process of resolving those uncertainties, the result of which is the delivered software product. If this were not so, the problems with software development would have been solved with CASE tools in the 1980s: analyze the problem, design the solution, feed it into the CASE tool, and bingo, out comes a functional software product! CASE tools had limited success in large part because it is impossible to completely eliminate uncertainty about what to do and how to do it in advance. Since software is a learning (rather than a manufacturing) process, it follows a different set of rules.
In the manufacturing example above, you could double the output or halve the production time (within the limits of machine capacity) by doubling the staff. The relationship between output (products manufactured) and the inputs to production (labor and machinery) is linear. Doubling the staff on a software project doesn’t cut the schedule in half; it just costs more and produces an inferior product.
Here are a few rules about how software development behaves. The business leader who adheres to these will improve his or her project success rate and may experience the benefits of lower blood pressure:
- Rule 1. Your own project history is the best predictor of how you will perform today. Plan within your demonstrated capabilities. Projects planned outside that region lead to failure. Capabilities can and should be improved over time, but this is a long-term endeavor, one that cannot simply be willed into being.
- Rule 2. While new tools and processes may provide long-term benefits, the learning curve associated with them initially reduces productivity, hurting both schedule and cost, because the team has no experience working with them. They may be good long-term investments, but you won’t see the results immediately: they are not a short-term fix!
- Rule 3. Small teams are more effective than large teams. They cost less, complete the software in about the same time, and produce fewer defects. (They are also easier to manage.) There are two reasons why small teams work better. First, as staff is added to a software project, the number of communication channels increases in a decidedly non-linear fashion, making communication within the project more complex. It goes without saying that good communication is critical to software project success. Second, as more staff is added, the work becomes more and more atomized. While the individual developers may be doing their work well, they will lack the big picture of how their part fits into the whole. This can produce serious problems during integration testing.
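The non-linear growth of communication channels in Rule 3 is easy to quantify: with n people, the number of pairwise channels is n(n-1)/2, a standard combinatorial result (the one Brooks also invokes). A short sketch:

```python
def channels(n: int) -> int:
    """Number of pairwise communication channels in a team of n people.

    Each of the n people can talk to each of the other n - 1; dividing by
    2 avoids counting each pair twice.
    """
    return n * (n - 1) // 2

# Doubling the team size roughly quadruples the channels:
for size in (3, 6, 12, 24):
    print(f"{size:2d} people -> {channels(size):3d} channels")
```

A 3-person team has 3 channels; a 24-person team has 276, which is why adding staff makes coordination, not coding, the dominant cost.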
- Rule 4. The relationship between software schedule and cost/effort is profoundly non-linear. One unit of schedule reduction is purchased at the cost of many additional units of labor (cost).
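One widely cited formalization of the trade-off in Rule 4 is Putnam’s software equation, in which, holding size and process productivity constant, effort scales roughly as the inverse fourth power of schedule. Taking that model as an illustrative assumption (the exact exponent varies by study), the cost of compression looks like this:

```python
def effort_ratio(schedule_fraction: float) -> float:
    """Relative effort when the schedule is compressed to a fraction of
    its original length, assuming a Putnam-style model where
    effort is proportional to 1 / time**4 (size held constant)."""
    return schedule_fraction ** -4

# Compressing the schedule by 20% (to 0.8 of the original length):
print(round(effort_ratio(0.8), 2))  # -> 2.44, i.e. ~2.4x the effort
```

Under this model, shaving one-fifth off the calendar more than doubles the labor bill, which is the "many additional units" the rule warns about.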
- Rule 5. Reducing a project’s schedule below what is optimal is not only costly (see Rule 4); it is dangerous. Frederick Brooks, the author of “The Mythical Man-Month”, captured this succinctly when he said, “More software projects have gone awry for lack of calendar time than for all other causes combined. Why is this cause of disaster so common?”
All of the concepts touched upon here (and many others) are addressed in the QSM Software Almanac: 2014 Research Edition which you can download at no cost.