Practical Software Estimation Measurement

Blogs

Webinar - Building an Estimation Center of Excellence

On Thursday, June 13, at 1:00 PM EDT, Larry Putnam, Jr. will present Building an Estimation Center of Excellence.

The pressure to succeed in software development is higher than ever - the current economic climate demands we do more with less, global and domestic competition is fierce, time-to-market expectations are high, and your company's reputation is on the line. When projects fail to meet expectations, the cause is more often an estimation or business-decision failure than a production or execution problem. In this webinar, industry expert Larry Putnam, Jr. takes you through the key elements and a step-by-step process for setting up an estimation center of excellence that will help ensure your projects succeed.

Larry Putnam, Jr. has 25 years of experience using the Putnam-SLIM Methodology. He has participated in hundreds of estimation and oversight service engagements and is responsible for product management of the SLIM Suite of software measurement tools and customer care programs. Since becoming Co-CEO, Larry has built QSM's capabilities in sales, customer support, and product requirements, and most recently in creating a world-class consulting organization. Larry has delivered numerous conference speeches on software estimation and measurement and, over a five-year period, has trained more than 1,000 software professionals in industry best-practice measurement, estimation, and control techniques and in the use of the SLIM Suite.

Watch the replay!

Blog Post Categories 
Webinars Estimation

How does uncertainty expressed in SLIM-Estimate relate to Control Bounds in SLIM-Control? Part II

Several months ago, I presented SLIM-Estimate’s use of uncertainty ranges for size and productivity to quantify project risk.  Estimating these two parameters using low, most likely, and high values predicts the most probable effort and time required to complete the project.  This post shows you how to use SLIM-Estimate’s probability curves to select the estimate solution and associated work plan that includes contingency amounts appropriate to your risk.
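To give a feel for how low, most likely, and high inputs become a probability curve, here is a minimal Monte Carlo sketch in Python. It is not SLIM-Estimate's actual algorithm; it simply samples size and a productivity parameter from triangular distributions and pushes them through one published form of the Putnam software equation at a fixed schedule. Every input number is an illustrative assumption.

```python
import random

def simulate_effort(n=10_000, td_months=12.0):
    """Monte Carlo sketch: sample size and the productivity parameter from
    triangular (low, most likely, high) distributions, then solve one published
    form of the Putnam software equation for effort at a fixed schedule.
    All numbers here are illustrative, not SLIM calibration data."""
    td_years = td_months / 12.0
    efforts = []
    for _ in range(n):
        size = random.triangular(40_000, 90_000, 60_000)   # SLOC: low, high, most likely
        pp = random.triangular(20_000, 36_000, 28_000)      # productivity parameter (assumed)
        # Software equation: size = pp * effort**(1/3) * td**(4/3), effort in person-years
        effort_py = (size / (pp * td_years ** (4.0 / 3.0))) ** 3
        efforts.append(effort_py * 12.0)                     # convert to person-months
    return sorted(efforts)

efforts = simulate_effort()
for pct in (50, 80, 90):
    print(f"{pct}% assurance effort: {efforts[int(len(efforts) * pct / 100)]:,.0f} person-months")
```

The gap between the 50% and 90% values is the contingency you are choosing when you pick a more conservative solution.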

Begin with an unconstrained solution

The default solution method used for new estimates, whether you are using the Detailed Method or another solution option, is what we call an unconstrained solution.  Just as it sounds, no limits have been placed on the effort, schedule, or staffing SLIM-Estimate can predict.  It will calculate the resources required to build your product (size) with the capabilities of your team (PI).  Assuming you have configured SLIM-Estimate to model your life cycle and based your inputs on historical data, you have produced a reasonable, defensible estimate.  

Solution Panel
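For readers who want to see the shape of the calculation behind an unconstrained solution, here is a rough sketch using the two classic Putnam relationships (the software equation and the manpower buildup equation). It is not SLIM-Estimate's implementation, and the inputs are illustrative assumptions only.

```python
def unconstrained_solution(size_sloc, productivity_param, buildup_param):
    """Sketch of an unconstrained solve using the classic Putnam equations:

        software equation:  size = PP * K**(1/3) * td**(4/3)
        manpower buildup:   MBP  = K / td**3

    where K is life-cycle effort in person-years and td is schedule in years.
    Substituting K = MBP * td**3 gives size = PP * MBP**(1/3) * td**(7/3),
    so td can be solved directly, then K follows."""
    td_years = (size_sloc / (productivity_param * buildup_param ** (1 / 3))) ** (3 / 7)
    k_person_years = buildup_param * td_years ** 3
    return td_years * 12.0, k_person_years * 12.0    # months, person-months

# Illustrative inputs only: 60 KSLOC, a mid-range productivity parameter,
# and a moderate manpower buildup parameter.
months, person_months = unconstrained_solution(60_000, 12_000, 8.0)
print(f"unconstrained: ~{months:.1f} months, ~{person_months:.0f} person-months (life cycle)")
```

Constraining any of schedule, effort, or peak staff turns this into a different solve, which is why the unconstrained result is a useful baseline before negotiating.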

Blog Post Categories 
SLIM-Control SLIM-Estimate

QSM Announces Latest Update to the QSM Project Database

We are pleased to announce the latest update to the QSM Project Database! The 8th edition of this database includes more than 10,000 completed real-time, engineering, and IT projects from 19 different industry sectors.

The QSM Database is the cornerstone of our business. We leverage this project intelligence to keep our products current with the latest tools and methods, to support our consulting services, to inform our customers as they move into new areas, and to develop better predictive algorithms. It ensures that the SLIM Suite of tools is providing customers with the best intelligence to identify and mitigate risk and efficiently estimate project scope, leading to projects that are delivered on-time and on-budget. In addition, the database supports our benchmarking services, allowing QSM clients to quickly see how they compare with the latest industry trends.

To learn more about the project data included in our latest update, visit the QSM Database page.

Blog Post Categories 
QSM Database

New Article: Data-Driven Estimation, Management Lead to High Quality

Software projects devote enormous amounts of time and money to quality assurance. It's a difficult task, considering most QA work is remedial in nature - it corrects problems whose roots often lie long before the requirements are complete or the first line of code has been written, but it has little chance of preventing those defects from being created in the first place. By the time the first bugs are discovered, many projects are already locked into a fixed scope, staffing, and schedule that do not account for the complex and nonlinear relationships between size, effort, and defects.

At this point, these projects are doomed to fail, but disasters like these can be avoided. When armed with the right information, managers can graphically demonstrate the tradeoffs between time to market, cost, and quality, and negotiate achievable deadlines and budgets that reflect their management goals. 
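As a rough illustration of why those tradeoffs are so nonlinear, the classic Putnam software equation implies that, for a fixed size and productivity, effort scales with roughly the inverse fourth power of schedule. The short sketch below tabulates that relationship; the numbers are illustrative and are not taken from the article.

```python
# Schedule/effort tradeoff implied by the classic Putnam software equation:
# for fixed size and productivity, effort ~ size**3 / (PP**3 * td**4),
# so relative effort goes as 1 / (schedule factor)**4.
# Baseline and factors below are illustrative, not figures from the article.

def relative_effort(schedule_factor):
    """Effort multiplier when the schedule is scaled by schedule_factor."""
    return 1.0 / schedule_factor ** 4

baseline_months = 12.0
for factor in (1.2, 1.1, 1.0, 0.9, 0.8):
    print(f"schedule {baseline_months * factor:4.1f} months -> "
          f"relative effort x{relative_effort(factor):.2f}")
```

In this model even a 10% schedule compression inflates effort by roughly 50%, which is why negotiating the deadline is a cost and quality decision, not just a calendar one.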

Leveraging historical data from the QSM Database, QSM Research Director Kate Armel equips professionals with a replicable, data-driven framework for future project decision-making in an article recently published in Software Quality Professional.

Read the full article here.

Blog Post Categories 
Articles Data Quality

Let's Get Serious About Productivity

Recently I conducted a study of projects sized in function points that were put into production from 1990 to the present, with a focus on those completed since 2000. For an analyst like me, one of the fun things about a study like this is that you can identify trends and then consider possible explanations for why they are occurring. A notable trend from this study of over 2,000 projects is that productivity, whether measured in function points per person month (FP/PM) or hours per function point, is about half of what it was in the 1990 to 1994 time frame.

Median Productivity

            1990-1994   1995-1999   2000-2004   2005+
FP/PM           11.11           7        9.21    5.84
FP/Mth          17.16        3.92        9.74   22.10
PI               15.3        16.4        13.9   10.95
Size (FP)         394         167         205     144

Part of this decline can be attributed to a sustained decrease in average project size over time. The overhead on small projects doesn’t shrink in proportion to their size, so they are inherently less productive. Technology has changed, too. But aren’t the tools and software languages of today more powerful than they were 25 years ago?
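For anyone who wants to run the same kind of breakdown against their own completed projects, here is a minimal sketch; the field names and sample records are hypothetical placeholders, not QSM data.

```python
from statistics import median
from collections import defaultdict

# Hypothetical completed-project records -- not QSM data.
projects = [
    {"year": 1992, "fp": 420, "person_months": 35},
    {"year": 1997, "fp": 180, "person_months": 24},
    {"year": 2003, "fp": 210, "person_months": 26},
    {"year": 2011, "fp": 150, "person_months": 27},
    # ... one record per completed project
]

def bucket(year):
    """Assign a completion year to one of the study's time buckets."""
    if year < 1995: return "1990-1994"
    if year < 2000: return "1995-1999"
    if year < 2005: return "2000-2004"
    return "2005+"

by_bucket = defaultdict(list)
for p in projects:
    by_bucket[bucket(p["year"])].append(p["fp"] / p["person_months"])

for label in ("1990-1994", "1995-1999", "2000-2004", "2005+"):
    if by_bucket[label]:
        print(f"{label}: median {median(by_bucket[label]):.2f} FP/PM")
```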

Blog Post Categories 
Productivity Project Management

They Just Don't Make Software Like They Used to… Or do they?

With the release of SLIM-Suite 8.1 quickly approaching, I thought I’d take a moment to share a preview of the updated QSM Default Trend Lines and how they affect your estimates. In this post I want to focus on the differences in quality and reliability between 2010 and 2013 for the projects in our database. Since our last database update, we’ve added over 200 new projects to our trend groups.

Here are the breakouts of the percent increases in the number of projects by Application Type:

  • Business Systems: 14%
  • Engineering Systems: 63%
  • Real Time Systems: 144%

Below you will find an infographic outlining some of the differences in quality between 2010 and 2013.

Changes in Software Project Quality between 2010 and 2013

From the set of charts above, we can see some trends emerging that indicate changes in quality between 2010 and 2013. Looking at the data, it’s apparent that two distinct stories are being told:

1. The Quality of Engineering Systems has Increased

Blog Post Categories 
Software Reliability Quality

Updated Function Point Gearing Factor Table

Version 5.0 of QSM's Function Point Gearing Factor table is live!

The Function Point Gearing Factor table provides average, median, minimum, and maximum gearing factors for recently completed function point projects. A gearing factor is the average number of basic work units in your chosen function unit. Originally, it was designed to be used as a common reference point for comparing different sizing metrics by mapping them to the smallest sizing unit common to all software projects. QSM recommends that organizations collect both code counts and final function point counts for completed software projects and use this data for estimates. Where there is no completed project data available for estimation, we provide customers with a starting point to help them choose an appropriate gearing factor for their chosen programming language.
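To show how a gearing factor gets used in practice, here is a minimal sketch; the factor value and project numbers are illustrative placeholders, not values from the QSM table.

```python
def fp_to_sloc(function_points, gearing_factor):
    """Convert a function point count to an approximate source-line count
    using a language-specific gearing factor (SLOC per function point)."""
    return function_points * gearing_factor

# Illustrative only: look up the gearing factor for your language in the
# QSM table (or, better, derive it from your own completed projects).
estimated_sloc = fp_to_sloc(350, 55)   # 350 FP at an assumed 55 SLOC/FP
print(f"~{estimated_sloc:,} SLOC for sizing the estimate")
```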

For this version of the table, we looked at 2192 recently completed function point projects out of the 10,000+ in QSM's historical database. The sample included 126 different languages, 37 of which had enough data to be included in the table. Interestingly, this year we added three new languages: Brio, Cognos Impromptu Scripts, and Cross Systems Products (CSP).

One trend we noticed is that, in general, the range for gearing factors has decreased over time. Similarly, the average and median values have decreased, which we attribute to having more data to work with.

Read the full press release or visit the new table!

Blog Post Categories 
QSM News Function Points

QSM Partners with Digital Celerity for CA World 2013

We are pleased to announce QSM's partnership with Digital Celerity LLC, a leader in Project and Portfolio Management (PPM) and IT Service Management expert services and solutions, for CA World 2013.

At the event, representatives from both QSM and Digital Celerity will showcase how QSM's SLIM Suite of Tools feeds project estimation data into the CA Clarity™ PPM tool to allow for improved planning and resource allocation. Of the top 10 systems integrators in the world, seven rely on SLIM intelligence. With this type of top-down estimating, analytics for project planning can be fed into PPM tools such as CA Clarity™ PPM, including detailed plans for effort by labor category, time period, and project size. Leveraging SLIM tools with CA Clarity™ PPM pushes project risk identification to the earlier proposal and feasibility stages of the project lifecycle, which can significantly reduce the risk of project failure.

The conference, which takes place April 21-24, 2013 in Las Vegas, NV, will showcase the latest and most innovative technologies for delivering optimal business results. Stop by QSM booth #115 to learn more about how SLIM enhances PPM tools.

For more details about this partnership, read the full press release.

Blog Post Categories 
QSM News

Estimating for the Business Plan

Having worked in sales and customer service at QSM for over 17 years, I speak to hundreds of professionals each year who are directly or indirectly involved with software development projects using many different development processes. One of the things I hear from time to time is that estimating is not as important when working with more iterative development methodologies. Some of the reasons I hear most often are that “team sizes are smaller,” “work can be deferred until the next iteration,” “we are different,” and “we are agile.”

As I dig deeper, though, I find that the fundamental questions software estimates answer are relevant no matter what development methodology is being used. Before committing to a project, executives and managers need to determine a reasonable cost and schedule and how much they can deliver, at a point when very little is known and before any detailed planning has occurred. Estimating helps mitigate risk early in the project lifecycle. Companies also need reliable information in order to negotiate with clients. How can we negotiate a schedule and a budget on any project without a defensible estimate?

QSM research based on our database of over 10,000 industry projects shows a common theme in failed projects: development team performance is often not the issue. When it comes to missed schedules and budgets, many of the problems occur when expectations are set too high and estimation is not made a priority. Without a reliable estimate up front, before the project starts, it’s tough to plan ahead.

Blog Post Categories 
Estimation

Haste Is Expensive

Large companies often seem to have a few people in key positions with extra time on their hands. Occasionally, this time is used to invent acronyms that are supposed to embody corporate ideals. Mercifully, these usually fade away in time. A former employer of mine had two beauties: LOCOPRO (Low Cost Provider) and BEGOR (Best Guaranteer of Results). Unfortunately, besides grating on the ear, LOCOPRO and BEGOR don’t always march in tandem. LOCOPRO deals with cost and the effort required to deliver something. BEGOR is a bit more amorphous, dealing with quality and an organization’s efficiency and consistency in meeting requirements.

What are the normal requirements for a software project? Here’s my short list.

  • Cost. What is being created and delivered has to be worth the expense in the mind of the person or organization that is funding it. (LOCOPRO is good)
  • Schedule. The timeframe in which a project creates and delivers its software is frequently a key constraint, and meeting it is important, as are consistency and predictability. (BEGOR is good)
  • Quality. In Business IT systems this is often an implicit requirement that is most noticed when it is absent. Real time, telecommunications, military, and life support systems are more frequently developed and tested to explicit quality standards.

The mantra of Faster/Better/Cheaper captures most organizations’ desires for Cost, Schedule, and Quality - all at the same time. If only the laws of software would cooperate! But they don’t. Software is like a balloon: constrict it in one place (schedule, for instance) and it expands in another (cost). The problem isn’t going to disappear, but by prioritizing requirements, conscious and realistic tradeoffs can be made.

Blog Post Categories 
Schedule