Software Estimation Best Practices

Effort: What's Behind that Number?

Effort seems like a straightforward metric, but there is a lot of complexity behind it, particularly if you are performing benchmark analysis. Recently, I was tapped to help with a benchmark assessment in which one of the metrics the customer wanted to analyze was effort per function point. "Effort" on its own is vague: the customer might know which phases or activities his organization counts, but I can't be sure that definition matches the one I have in mind. To benchmark effectively, we need an apples-to-apples comparison that examines what is really behind the effort number, so I sent the client our phase and activity definitions.
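
To see why this matters, here is a minimal sketch (all figures are invented for illustration) of how the same project produces different effort-per-function-point numbers depending on which phases are counted:

```python
# Hypothetical person-month figures for one 500-function-point project,
# broken out by the four high-level phases discussed below.
effort_by_phase = {
    "concept_definition": 6.0,       # phase 1
    "requirements_design": 18.0,     # phase 2
    "construct_test": 60.0,          # phase 3
    "perfective_maintenance": 24.0,  # phase 4
}
function_points = 500

# Effort per FP counting only the commonly reported phases (2 and 3)...
core = effort_by_phase["requirements_design"] + effort_by_phase["construct_test"]
print(core / function_points)  # 0.156 person-months per FP

# ...versus counting the full life cycle: same project, a ~38% higher number.
print(sum(effort_by_phase.values()) / function_points)  # 0.216 person-months per FP
```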

Here are some definitions to help you understand which activities are included in each phase: 

Concept Definition

The earliest phase in the software life cycle, where complete and consistent requirements and top-level, feasible plans for meeting them are developed.

The objectives of this phase are to develop a complete and technically feasible set of requirements for the system and to formulate the top-level approach and plan for their implementation.  Typical products of these activities include:

  • A system specification, marketing specification, statement of need, or list of capabilities that the user or marketing organization expects from the system;
  • A feasibility study report or an assessment of whether the need can be met with the available technology in a timely and economic manner; and
  • A set of plans documenting the project management approach to development, quality assurance, configuration management, verification, etc.

This phase is complete when it is determined that building this system is feasible.  A system-level requirements review may be held at the completion of this phase to determine the scope of the effort and the approach for Phase 2.

Requirements & Design

Phase 2 develops a technically feasible, modular design within the scope of the system requirements.

The objectives of this phase are to complete the system-level design by choosing the appropriate constituent technologies (hardware, software, etc.) and to allocate each system-level requirement to the appropriate technology. Software requirements are defined. The software top-level architecture is defined. Typical products include the following:

  • specifications defining the interfaces between the system-level components;
  • software requirements specifications describing the inputs, processing (logic and/or algorithms), and outputs; and 
  • updated project plans.

A design review (e.g., PDR) may be held during this phase to baseline the system design, software requirements, and software top-level architecture. This phase normally overlaps the start of Phase 3 as the top-level software design is iterated and refined.

Construct & Test

This phase produces a working system that implements the system specifications and meets system requirements for performance and reliability. It typically begins with detailed logic design, continues through coding, unit testing, integration, and system testing, and ends at full operational capability. Once the software requirements defined in Phase 2 have been baselined, the activities in Phase 3 implement them through software detailed design, coding, and integration. Typical products include:

  • design documentation;
  • verification (inspection and/or test) procedures and results;
  • user manuals and other operating documentation;
  • maintenance manuals;
  • acceptance test procedures and reports; and
  • a fully functional software product that has achieved at least 95% reliability and is acceptable for initial use in the customer’s environment.

The phase starts on the date when design of the first module begins. It ends on the date when it first becomes possible to deliver a fully functional system with 95% of the total defects identified. Subjectively, this is the point in the program where a fully functional system can be delivered with sufficient reliability that it will not be returned by the customer.
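
As a minimal sketch of that exit criterion (the defect counts are invented, and estimating the total number of latent defects is a separate problem), the check reduces to a simple ratio:

```python
# Hypothetical milestone check: Construct & Test can end once 95% of the
# estimated total defects have been identified.
ESTIMATED_TOTAL_DEFECTS = 200  # invented figure; in practice this is itself an estimate
defects_found_to_date = 192

if defects_found_to_date / ESTIMATED_TOTAL_DEFECTS >= 0.95:
    print("95% defect-identification criterion met; phase can end.")
else:
    remaining = int(0.95 * ESTIMATED_TOTAL_DEFECTS) - defects_found_to_date
    print(f"Criterion not met; at least {remaining} more defects to find.")
```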

Perfective Maintenance

This phase usually coincides with the operations phase. It may include correcting errors revealed during system operation or enhancing the system to adapt to new user requirements, changes in the environment, and new hardware.

The objective of this phase is to provide the customer base with product support once the system is operational. In some life cycles, this phase includes continued testing to increase reliability and provide quality assurance to customers and certification authorities. During this phase, residual bugs in the system are corrected. In addition, development of later releases and functional enhancements may be performed to meet the new requirements of a changing world. The principal activities of this phase include:

  • increased reliability and assurance testing;
  • correction of latent defects;
  • new enhancements;
  • modification and tuning of the system; and
  • operations support.

In some life cycles, this phase is complete when management decides that the system is no longer of practical use and cuts off additional resources. Some time before this happens, management may seek more improvements and initiate a feasibility study for a new system to replace it. In other life cycles, this phase is complete when some sort of certification occurs. Evolution of the product beyond this point typically spawns new projects.

Effort is defined in the SLIM-Estimate manual as "man months or man years devoted to a single phase or to the entire life cycle. Includes all development staff: analysts, designers, programmers, coders, integration and test-team members, quality assurance, documentation, supervision, and management." Essentially, effort includes the time of everyone working on the project, from programmers to management, across whatever phases your organization's lifecycle uses. This definition is necessarily loose because there is no uniformity across the industry. To provide some consistency, QSM uses high-level phases that everyone can map their own lifecycle to. The advantage of this approach is its flexibility: we didn't set out to describe waterfall, iterative, agile, Rational, or any other methodology. Our approach addresses the fundamental issues every project faces, but while these activities are common to all software projects, in practice not all of them are recorded on every project.
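
As an illustration of that mapping exercise, here is a minimal sketch, assuming a hypothetical set of timesheet activities and an equally hypothetical mapping to four generic phases; a real mapping would come from your own lifecycle definitions:

```python
# Hypothetical mapping from one organization's timesheet activities to four
# generic QSM-style phases; every organization's mapping will differ.
ACTIVITY_TO_PHASE = {
    "feasibility_study": "concept_definition",
    "project_planning": "concept_definition",
    "requirements_analysis": "requirements_design",
    "architecture": "requirements_design",
    "coding": "construct_test",
    "unit_test": "construct_test",
    "system_test": "construct_test",
    "documentation": "construct_test",
    "defect_fixing_post_release": "perfective_maintenance",
}

def roll_up(activity_effort: dict[str, float]) -> dict[str, float]:
    """Sum person-months recorded per activity into the generic phases."""
    phases: dict[str, float] = {}
    for activity, person_months in activity_effort.items():
        phase = ACTIVITY_TO_PHASE[activity]
        phases[phase] = phases.get(phase, 0.0) + person_months
    return phases

# Example: per-activity effort pulled from a (hypothetical) time-tracking export.
print(roll_up({"requirements_analysis": 8.0, "coding": 30.0, "system_test": 12.0}))
# {'requirements_design': 8.0, 'construct_test': 42.0}
```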

Organizations should think very carefully before using lifecycle effort, because of the uncertainty about what it includes. A lifecycle trend is built from all included phases for all projects in the data set, and the problem is that you don't know which activities and phases each project's effort actually covers. A better way to benchmark effort is to compare the specific phases your lifecycle uses (e.g., phase 3 effort to phase 3 effort) rather than relying on a lifecycle trend. If you do use lifecycle effort to benchmark your project, keep in mind that, according to the QSM Software Almanac (2006 IT Metrics Edition), 83% of projects included phase 2 and phase 3 data, while only 38% reported phase 1 data and 36% included phase 4 data. Most projects go through a "Deciding to do something, Determining what to do, Doing it, and Cleaning it up" lifecycle, but not all of that effort is recorded for the project.
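
A minimal sketch of that phase-to-phase comparison (the benchmark rate and project figures are invented) might look like this:

```python
# Compare a project's phase 3 productivity against a phase-3-only benchmark,
# rather than against a lifecycle trend whose phase coverage is unknown.
# All numbers here are invented for illustration.
BENCHMARK_PHASE3_PM_PER_FP = 0.12  # hypothetical industry figure, phase 3 only

def phase3_variance(phase3_person_months: float, function_points: int) -> float:
    """Return the project's phase 3 effort/FP as a ratio to the benchmark."""
    project_rate = phase3_person_months / function_points
    return project_rate / BENCHMARK_PHASE3_PM_PER_FP

print(phase3_variance(60.0, 500))  # 1.0 -> right on the benchmark
print(phase3_variance(90.0, 500))  # 1.5 -> 50% more effort than the benchmark
```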

The information behind the effort number differs from organization to organization and from project to project. Because of this, it was necessary to create a high-level, generic lifecycle that everyone can map to in order to make apples-to-apples comparisons for estimation and benchmarking. While you may include or exclude activities in your own lifecycle, it's important to realize the differences between your lifecycle and the QSM default lifecycle in order to create informed estimates and benchmarks.
