Practical Software Estimation Measurement

Blogs

Losses Loom Larger Than Gains

Anyone who has gambled (and lost) knows the sting of losing. In 1979, Daniel Kahneman and Amos Tversky, pioneers in the field of behavioral economics, theorized that losses loom larger than gains; essentially, a person who loses $100 loses more satisfaction than someone who wins $100 gains. Behavioral economics weaves psychology and economics together to map the irrational man, the foil of economics' rational man.
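For the quantitatively inclined, Kahneman and Tversky's value function captures this asymmetry. Below is a minimal sketch; the curvature and loss-aversion parameters are commonly cited estimates from their later (1992) work and are included purely for illustration.

```python
# Illustrative sketch of the prospect theory value function.
# Parameter values (alpha, beta, loss_aversion) are commonly cited
# empirical estimates from Tversky & Kahneman's 1992 follow-up study;
# treat them as illustrative assumptions, not part of the 1979 paper.

def prospect_value(x, alpha=0.88, beta=0.88, loss_aversion=2.25):
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    return -loss_aversion * ((-x) ** beta)

if __name__ == "__main__":
    gain = prospect_value(100)    # roughly  +57.5
    loss = prospect_value(-100)   # roughly -129.4
    print(f"Winning $100 feels like {gain:+.1f}; losing $100 feels like {loss:+.1f}")
```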

How can I leverage this theory for software development?

According to the QSM IT Software Almanac (2006), worst in class projects took 5.6 times as long to complete as best in class projects, used roughly 15 times as much effort with a median team size of 17, and were less likely to track defects.

One way to leverage your worst in class projects is to use them as history files in SLIM-Estimate, which adjusts the PI, defect tuning, and other settings to match how you have developed software in the past. Don Beckett recently discussed how to tune effort for best in class analysis and design.

Another way to leverage your worst in class projects is to build a "project graveyard," that is, a database of your organization's worst projects, and load it into SLIM-Metrics. In SLIM-Metrics, you can analyze duration, peak staff, average staff, and defects to view your own organization's weaknesses. Depending on how well documented your SLIM-DataManager database is, you could also analyze some of the custom metrics that ship with SLIM-Metrics, such as who the project was built for (the customer metric) and its complexity.
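If you also keep project data outside the SLIM tools (say, a CSV export of your SLIM-DataManager database), a short script can give you a first look at the same weaknesses. This is a generic sketch, not SLIM-Metrics itself; the file name and column names are hypothetical.

```python
# Hypothetical "project graveyard" analysis outside SLIM-Metrics.
# Assumes a CSV export with one row per project; the file name and
# column names below are made up for illustration.
import pandas as pd

graveyard = pd.read_csv("worst_projects.csv")  # hypothetical export

# Summarize the metrics the post suggests examining.
summary = graveyard[["duration_months", "peak_staff", "avg_staff", "defects"]].describe()
print(summary)

# Slice by a custom metric such as customer to see where the pain concentrates.
by_customer = (graveyard
               .groupby("customer")[["duration_months", "defects"]]
               .median()
               .sort_values("defects", ascending=False))
print(by_customer.head(10))
```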

Blog Post Categories 
SLIM-Metrics SLIM-DataManager

Webinar: Successful Estimating Processes Using the SLIM API

On April 12, 2012 at 1:00 PM EDT, QSM will host a webinar focused on two successful implementations of the SLIM API, presented by IBM's Carl Engel, State Street's Scott Lancaster, and QSM's Larry Putnam, Jr.

How do best in class development organizations achieve maximum return on investment from their estimation programs? By leveraging the SLIM API for integrations between estimation tools and detail-oriented products, development teams are able to simplify estimation processes and broaden the estimation program user base. Presented by Carl Engel of IBM Global Services, Scott Lancaster of State Street, and Larry Putnam, Jr. of QSM, this webinar explores two successful implementations of the SLIM API between third party tools and the SLIM Suite. 

Carl Engel is the Estimating Program Manager for IBM's Global Business Services, responsible for the development and deployment of performance benchmarking and estimating processes, methods, and tools, including support for nearly 1,000 SLIM Suite users. Carl has been with IBM for 12 years as an Associate Partner and previously served as program manager for IBM's project management methodology and tools. He is an IBM-certified Executive Project Manager and PMP with over 30 years of program and project management experience, primarily in very large scale efforts in the nuclear industry and U.S. National Laboratories.

Blog Post Categories 
Webinars SLIM Suite

Software Cost Estimation Article in The DACS Journal

The February issue of the DACS Journal of Software Technology focuses on Software Cost Estimation and Systems Acquisition. My contribution, which you can read here, addresses the challenges faced by estimators and the value of establishing a historical baseline to support smarter planning, counter unrealistic expectations, and maximize productivity.

Using several recent studies, my paper addresses the following questions:

  • What is estimation accuracy, and how important is it really?
  • What is the connection between the Financial Crisis of 2008 and software estimation?
  • Why do small team projects outperform large team projects?
  • How can you find the optimal team size for your project?

Read the full article.

Blog Post Categories 
Estimation Articles

Part III: The Caveats

In Part 1 of How Much Estimation? we noted that there is an optimal amount of time and effort to spend producing an estimate, based on the target cost of the project and the business practice being supported.

In Part 2: Estimate the Estimate, we saw that the formula for this optimal investment (as measured at NASA) expresses the Cost of Estimate as the Target Cost raised to the power 0.35 (approximately the cube root of the Target Cost). The factor that defines the business practice (either by early lifecycle phase or perhaps by the “expected precision” of the estimate) is a linear factor ranging from 24 to 115.
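Read literally, that description is a simple power law. Here is a minimal sketch that assumes the business practice factor acts as a straight multiplier on the Target Cost raised to 0.35; check Part 2 for the exact definition and the units before using the numbers for anything real.

```python
# Rough sketch of the cost-of-estimate relationship described above.
# Assumption: the linear business-practice factor (24 to 115) simply
# multiplies Target_Cost ** 0.35; the exact definition and the units
# of the result should be confirmed against Part 2.

def cost_of_estimate(target_cost, business_factor):
    return business_factor * (target_cost ** 0.35)

if __name__ == "__main__":
    for factor in (24, 115):  # rough-order practice vs. high-precision practice
        print(factor, round(cost_of_estimate(1_000_000, factor)))
```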

Those Caveats!

I mentioned that there were caveats with the calculation.  Here they are:

Blog Post Categories 
Estimation SLIM-Estimate

Part II: Estimate the Estimate

In Part 1 of How Much Estimation, we observed that spending either too much or too little time and effort on estimating is less than optimal. Combining:

  • The cost of producing an estimate—which is a function of the number of people working on the estimate and how long they work
  • The cost of variance in the results of the estimate—that is, how much the estimate varies from experienced actuals and what that variance will likely cost the project.  This is typically a function of the number of unknowns at the time of estimating for which the project cannot easily adjust and which will require additional unplanned resources of time, effort, and staff.

We get a U-shaped curve, at the bottom of which is the optimal time: we’ve spent enough time and effort to minimize the sum of the cost of estimate and the cost of variance.
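To make the trade-off concrete, here is a toy numerical sketch. Both cost curves are invented; the point is only that a rising cost of estimate plus a falling cost of variance has a minimum somewhere in the middle.

```python
# Toy illustration of the U-shaped total cost curve. Both cost
# functions are invented; only the shape of the argument matters.

def cost_of_estimate(days):
    return 500.0 * days                # rises with effort spent estimating

def cost_of_variance(days):
    return 200_000.0 / (1.0 + days)    # falls as uncertainty is worked down

def total_cost(days):
    return cost_of_estimate(days) + cost_of_variance(days)

if __name__ == "__main__":
    best = min(range(1, 61), key=total_cost)
    print(f"Optimal estimating effort (toy model): {best} days, "
          f"total cost {total_cost(best):,.0f}")
```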

The question is: how do we calculate this point? It will not be the same for a very large, complex project and a very small, simple one. Also, we don't want a complicated and time-consuming approach to calculating the cost of estimate—it should be quick and simple.

NASA’s Deep Space Network (DSN) project developed a mechanism for this calculation based on two simple parameters:

Target Cost of Project

This is the goal cost of the project as first envisaged in the project concept. It is NOT the estimated cost of the project (which hasn't been calculated yet). Projects on which we expect and plan to spend a lot of money should clearly have more time and effort spent on estimating, simply because more is at risk.

Blog Post Categories 
Estimation

How Much Estimation?

How much time and effort should we spend to produce an estimate?  Project estimation, like any other activity, must balance the cost of the activity with the value produced.

There are two extreme situations that organizations must avoid:

The Drive-By Estimate

The drive-by estimate occurs when a senior executive corners a project manager or developer and requires an immediate answer to the estimation questions: “When can we get this project done?” “How much will it cost?” and “How many people do we need?” (the equally pertinent questions: “How much functionality will we deliver?” and “What will the quality be?” seem to get much less attention).

Depending on the pressure applied, the estimator must cough up some numbers rather quickly. Since the estimate has not been given much time and attention, it is usually of low quality. Making a critical business decision based on such a perfunctory estimate is dangerous and often costly.

The Never-Ending Estimate

Less common is the estimation process that goes on and on. In order to make an estimate "safer," an organization may seek to remove uncertainty in the project and in the data used to create the estimate. One way to do that is to analyze the situation more and more. Whenever we spend more time and effort producing an estimate, we generally produce a more precise and defensible result. The trouble is that the work required to remove all the uncertainty is pretty much the same work required to run the project. So companies can end up in the odd situation where, in order to decide whether they should do the work and what resources they should allocate to the project, they actually do the work and use up the resources.

Blog Post Categories 
Estimation

"The Difference Engine" by Phillip Armour in Communications of the ACM

January's Communications of the ACM featured an article by QSM consultant Phillip Armour. "The Difference Engine" focuses on building teams of differently skilled people. The article draws in part on The Difference, a book by Scott Page, Professor of Complex Systems at the University of Michigan, which shows the power of cognitive diversity in building systems and solving problems. Phil will elaborate on this subject in an upcoming series on the QSM blog, so stay tuned!

Download the PDF

Phil is a regular contributor to Communications of the ACM. You can read more of his articles here.

 

Blog Post Categories 
QSM News Articles

Webinar Replay Now Available: Shifting to Agile Methods - The Keys for Long-Term Success

If you were unable to attend our webinar, Shifting to Agile Methods - The Keys for Long-Term Success, a replay is now available. 

Changes to the software development process, such as moving toward Agile methods, must demonstrate sustainable results over time versus just short-term wins.  There are two keys to reaching long-term success that should be considered up front – the new process must be repeatable and measurable. 

In this session, AccuRev’s Chris Lucca and QSM’s Larry Putnam, Jr. explore these two keys to success.  

Specifically, they cover:

  • The state of software development projects yesterday versus today and the impact to the software development process
  • The techniques and tools that can help a team to build a process that is repeatable and scalable, even across a distributed team
  • Which metrics and measurement processes are important to measuring the results and improvements of implementing repeatable and scalable processes
  • How to use metrics to estimate project schedules, resources and reliability, and monitor project progress and forecast completion
  • Ways to benchmark the results at project completion for time to market, cost performance and reliability – all of which provide the business case for continued investments in technology and repeatable and scalable processes

View the webinar replay.

View recordings of all of our past webinars.

Blog Post Categories 
Webinars Agile

Part III: Finding the Optimal Team Size for Your Project

In part one of our team size series, we looked at Best and Worst in Class software projects and found that using small teams is a best practice for top performing projects. Part two looked at differences in cost and quality between small and large team projects and found that small teams use dramatically less effort and create fewer defects. But simply knowing that small teams perform better doesn't tell us how small a team to use. Most software metrics scale with project size, and team size is no exception. Management priorities must also be taken into account. Small projects can realize some schedule compression by using slightly larger teams, but for larger projects, using too many people drives up cost while doing little to reduce time to market:

Larger teams create more defects, which in turn beget additional rework… These unplanned find/fix/retest cycles take additional time, drive up cost, and cancel out any schedule compression achieved by larger teams earlier in the lifecycle.

In a study conducted in the spring of 2011, QSM consultant Don Beckett took both system size and management priorities into account. He divided 1,920 IT projects into four size quartiles. Using median effort productivity (SLOC/PM) and schedule productivity (SLOC/Month) values for each size bin, he then isolated top performing projects for schedule, effort, and balanced performance (better than average for both effort and schedule):

Effort vs. Schedule
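If you want to try a similar cut on your own project data, the sketch below is one rough way to do it. The input file and column names are hypothetical, and it treats "top performing" as better than the size bin's median, which may not match the study's exact cutoffs.

```python
# Hypothetical reconstruction of the quartile analysis; the input file,
# column names, and "better than the bin median" cutoff are assumptions,
# not the study's actual method.
import pandas as pd

projects = pd.read_csv("it_projects.csv")  # hypothetical: size_sloc, effort_pm, duration_months

projects["effort_productivity"] = projects["size_sloc"] / projects["effort_pm"]          # SLOC/PM
projects["schedule_productivity"] = projects["size_sloc"] / projects["duration_months"]  # SLOC/Month

# Four size quartiles, then per-bin medians for each productivity measure.
projects["size_bin"] = pd.qcut(projects["size_sloc"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
bin_medians = projects.groupby("size_bin")[["effort_productivity", "schedule_productivity"]].transform("median")

# "Balanced" performers beat their size bin's median on both measures.
balanced = projects[(projects["effort_productivity"] > bin_medians["effort_productivity"]) &
                    (projects["schedule_productivity"] > bin_medians["schedule_productivity"])]
print(f"{len(balanced)} of {len(projects)} projects beat their size bin's median on both measures")
```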

Blog Post Categories 
Team Size

Tuning Effort for Best in Class Analysis and Design

After reading Best Projects/Worst Projects in the QSM IT Almanac, a SLIM-Estimate® user noted that the Best in Class Projects expended around 28% of their total project effort in analysis and design (SLIM Phase II) compared to 10% for the Worst in Class Projects. She wanted to know how she could tune her SLIM-Estimate templates to build in the typical best in class standard for Analysis and Design.

In SLIM-Estimate, effort and duration for phases I and II are calculated as a percentage of Phase III time and effort. To create a template for estimating phases II and III that will automatically allocate 28% of total project effort to analysis and design (Phase II), follow these simple steps.

  • From the Estimate menu, select Solution Assumptions.  Make sure the “Include” check boxes for Phases II and III are selected.  Then click on the Phase Tuning tab.
  • Click on the tab for Phase II.  (If you have previously customized the phase names, the default name for Phase II will reflect that).
  • Click on the Manual button under Effort, and enter 28% for the effort percent.

That’s it. Your estimates based on this template will now automatically allocate 28% of total project effort to Analysis and Design (Phase II).

This procedure assumes that your estimates will be for SLIM Phases II and III, which, we have found, is the typical scope for most project estimates. However, if your estimates include Phases I and/or IV, you may have to increase the effort percent a bit to achieve the desired result.
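As a purely hypothetical illustration of that caveat: suppose the manual effort percent is applied against the Phase II plus Phase III subtotal, and Phases I and IV together add another 12% on top of that subtotal. Then a slightly higher entry is needed for Phase II to still come out at 28% of the four-phase total. Both the 12% figure and the assumption about how the percent is applied are illustrative, not SLIM-Estimate documentation.

```python
# Hypothetical arithmetic behind "increase the effort percent a bit".
# Assumption: the manual percent is taken against the Phase II + III
# subtotal, and Phases I and IV add overhead on top of that subtotal.

target_share_of_total = 0.28   # want Phase II = 28% of all included phases
phase_1_and_4_overhead = 0.12  # hypothetical: I + IV add 12% on top of II + III

# Phase II share of the II + III subtotal that yields 28% of the grand total:
required_percent = target_share_of_total * (1 + phase_1_and_4_overhead)
print(f"Enter roughly {required_percent:.1%} instead of 28%")  # ~31.4%
```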

Blog Post Categories 
SLIM-Estimate Tips & Tricks Effort