Software Estimation Best Practices

Doug Putnam's blog

How Machine Learning Algorithms Can Dramatically Improve your Estimation Predictions

At QSM, we have been on the leading edge of software estimation technology for 40 years. One of our recent innovations is to incorporate machine learning into our SLIM-Suite of estimation and measurement tools. If you are not familiar with it, the whole concept of machine learning is to "train" your algorithms with data so that their predictions become more accurate. Simple in concept, but the devil is in the details. In software project estimation, we are always asked to provide timely predictions to support decision making, usually based on skimpy information. Depending on the situation, our analysis will typically focus on one or several of the following criteria:

  1. Schedule (Time to market)
  2. Effort (Cost to develop)
  3. Staffing and Resources Required
  4. Required Reliability at Delivery
  5. Minimum-Maximum Capability or Functionality Tradeoffs

We start the training process with data from completed projects, captured in these five core metrics. The data usually resides in tools like Jira or PPM products. Once it is obtained, we run statistical analysis to determine typical behaviors and variability.

Figure 1. Project data used in the SLIM machine learning training process. Triangles represent completed projects. Lines are curve fits of the average behavior and of the statistical variation in the positive and negative directions. These charts show how time, effort, and staffing change with the size of the product to be developed.
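To make this concrete, here is a minimal sketch (in Python, and not the actual SLIM algorithm) of the kind of trend fitting shown in Figure 1: a power-law curve of effort versus size fit in log-log space, with the residual scatter used as a variability band. The project values are invented for illustration.

```python
import numpy as np

# Illustrative completed-project data (assumed values, not QSM data):
# size in function points or story points, effort in person-months.
size   = np.array([ 50, 120, 200, 350, 500, 800, 1200])
effort = np.array([ 20,  55,  80, 160, 210, 380,  600])

# Fit effort = a * size^b by least squares in log-log space, a common way
# to model the nonlinear scaling of effort with product size.
b, log_a = np.polyfit(np.log(size), np.log(effort), 1)
a = np.exp(log_a)

# Residual scatter around the trend gives the +/- variability band.
residuals = np.log(effort) - (log_a + b * np.log(size))
sigma = residuals.std(ddof=2)

def predict_effort(new_size, z=1.0):
    """Average effort and a +/- z-sigma range for a project of new_size."""
    avg = a * new_size ** b
    return avg, avg * np.exp(-z * sigma), avg * np.exp(z * sigma)

avg, low, high = predict_effort(300)
print(f"Effort ~ {avg:.0f} person-months (range {low:.0f}-{high:.0f})")
```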


Bringing Transparency into Project Contingency Buffers for Schedule and Cost

Applying a contingency buffer, more commonly known as "padding" or "management reserve," is the final step in any project estimation process. The most common practice is for the estimator to apply an intuitive multiplier to the base estimate. Unfortunately, everyone has a different multiplier, shaped by their personal bias about risk and hidden in their head. This creates a fundamental problem with transparency and consistency within most organizations.

Fortunately, there is a better way. One solution is to define standards that are matched to specific business risk situations and agreed to collaboratively by all the stakeholders in the organization. These standards can then be codified into a configuration that is selected at the point when contingency is applied to an estimate. This solves the consistency issue.
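As an illustration of what such a codified configuration might look like, here is a minimal sketch; the risk situations and multiplier values are hypothetical and would in practice be set by the organization's stakeholders.

```python
from dataclasses import dataclass

# Hypothetical, organization-agreed contingency standards keyed by
# business risk situation (values are illustrative, not recommendations).
CONTINGENCY_STANDARDS = {
    "routine_enhancement": {"schedule": 0.05, "cost": 0.10},
    "new_product_line":    {"schedule": 0.15, "cost": 0.25},
    "regulatory_deadline": {"schedule": 0.25, "cost": 0.35},
}

@dataclass
class Estimate:
    schedule_months: float
    cost: float

def apply_contingency(base: Estimate, risk_situation: str) -> Estimate:
    """Apply the agreed buffer for the selected risk situation to a base estimate."""
    buffer = CONTINGENCY_STANDARDS[risk_situation]
    return Estimate(
        schedule_months=base.schedule_months * (1 + buffer["schedule"]),
        cost=base.cost * (1 + buffer["cost"]),
    )

base = Estimate(schedule_months=6.0, cost=500_000)
buffered = apply_contingency(base, "new_product_line")
print(f"Base: {base}  ->  With contingency: {buffered}")
```

Because the multipliers live in a shared configuration rather than in any one estimator's head, every estimate built for the same risk situation carries the same, visible buffer.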


To attack the transparency issue, you can use overlays to visualize the contingency alongside the base estimate.


Using Business Analytics to Set Realistic Customer Expectations

I was recently reading an article by Moira Alexander titled "Why Planning Is the Most Critical Step in Project Management" and was struck by her observation that one of the primary reasons projects fail is that they commit to unrealistic expectations. In my 35 years of experience, I believe this is the number one reason projects fail. Yet it is a competence that few organizations or product owners ever develop.

Today there are good simulation tools that make it simple to establish realistic project boundaries. The results can be used to communicate and negotiate expectations with clients.

For example, imagine that you are a product owner planning out your next release.  Your team of 10 people has been working on a 5-month release cadence.   A backlog refinement has shown that there are approximately 100 story points to be completed in this release.  The project plan is shown in the figure below.

Agile Uncertainty

However, there are some uncertainties, and we need to deal with them in a realistic way. Since the schedule and the team size are fixed, the only area that can give is the functionality. Simulations are a great way to quantify that uncertainty. In our case, we are confident in our team's productivity and labor cost, but we are somewhat more uncertain about the new capabilities in this release. It is easy to adjust the uncertainty settings and run a business simulation; the uncertainty slider bars are shown in the image below.
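For readers who want to see the mechanics, here is a small Monte Carlo sketch of the kind of simulation described. It is not the SLIM simulator, and the velocity distribution is an invented assumption used only to show how uncertainty translates into a probability of delivering the backlog.

```python
import numpy as np

rng = np.random.default_rng(42)

# Fixed constraints from the example: a 5-month release with a 10-person team.
months = 5
backlog = 100  # story points refined for this release

# Assumed uncertainty: team velocity in story points per month, modeled as a
# normal distribution (mean and spread are illustrative, not real data).
velocity_mean, velocity_sd = 21.0, 4.0

trials = 10_000
monthly = rng.normal(velocity_mean, velocity_sd, size=(trials, months)).clip(min=0)
delivered = monthly.sum(axis=1)

p_complete = (delivered >= backlog).mean()
p50, p10 = np.percentile(delivered, [50, 10])

print(f"Probability of finishing all {backlog} points: {p_complete:.0%}")
print(f"Median delivery: {p50:.0f} points; 90%-confident floor: {p10:.0f} points")
```

Rather than promising the full backlog, the product owner can negotiate which stories are committed and which are stretch goals based on the confidence levels the simulation produces.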


Getting Staffing Right is the Key to Software Development Nirvana

Enterprise IT teams have been searching for years for the Holy Grail of software development: the greatest possible efficiency, at the least possible cost, without sacrificing quality.

This endless search has taken many forms over the years. Twenty years ago, development teams turned to waterfall methodologies as a saving grace. Waterfall then gave way to object-oriented, incremental, and spiral approaches, and to Rational Unified Process (RUP) practices.

Today, it’s agile development’s turn in the spotlight. C-suite executives are investing huge sums of money to develop their organizations’ agile methodologies. They’re also committing significant resources to train employees to work within agile frameworks.

Yet many projects are still failing, clients remain unsatisfied, and IT departments are often unable to meet scheduling deadlines. Why?

It’s the staff, not the method.

Whenever a project falls behind schedule, the natural inclination is to add more staff. There’s a belief that doing so will accelerate development and, ultimately, help the team hit their deadlines.

That’s not always the case, however. In fact, throwing more people at a project often results in slowing things down even more. Sure, your team might get a little bit of a short-term boost, but in the long run, you’ll have more connection points to manage (which can increase the potential for mistakes or defects) and higher costs.
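A quick way to see why is to count the communication paths a team must maintain, which grow as n(n-1)/2 with team size. The sketch below simply tabulates that growth for a few arbitrary team sizes.

```python
# Communication paths between n team members grow quadratically: n*(n-1)/2.
# Adding people to a late project multiplies the coordination overhead.
def communication_paths(n: int) -> int:
    return n * (n - 1) // 2

for team_size in (5, 10, 15, 20):
    print(f"{team_size:>2} people -> {communication_paths(team_size):>3} connection points")
```

Doubling a 10-person team does not double the connection points; it more than quadruples them, which is why the short-term boost is so often swallowed by coordination cost.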

The 2017 Software Almanac: Development Research Series


Software plays an increasingly vital role in our everyday lives. It powers everything from autonomous cars and aircraft to life-saving medical equipment and the data that allows the government to protect our country. When companies develop software, there's no room for error.

That’s why software predictive analysis and estimation are still extremely important. Last year, with the release of the 2016 Software Almanac, we learned that the predictive analytics and estimation principles developed over the last 35 years were still highly relevant for providing reliable, applicable business intelligence for delivering successful software projects.

This year’s version of QSM’s annual Software Almanac further strengthens those findings. The 2017 Software Almanac builds on the principles identified in last year’s publication and highlights the dangers of not applying predictive analysis and estimation processes.   As stated by Angela Maria Lungu, Almanac Editor and Managing Director at QSM, these principles can be a “double-edged rearview mirror.” If you move forward without applying the historical principles of estimation and analysis correctly, their value is diminished.   Here’s what else you can expect from this year’s Almanac:


Assessing Project Portfolio Risk in IT Budgeting

No one said IT budgeting was easy. It seems like you just finished last year's budget and now it is time to start all over again. Not only is this task difficult, it is made worse by the fact that most organizations approach it in an overly simplistic way. The result is often that up to 40% of the projects grossly miss the mark, which wreaks havoc on enterprise resource plans and leaves business stakeholders disappointed.

A large part of successful IT budget planning is identifying grossly unrealistic projects: the ones that are likely to fail and the ones that are ultra-conservative and wasteful. Our solution is to perform a basic feasibility assessment on each project as it enters the budgeting process. Ultimately, we will want to adjust these projects to make them more reasonable and to improve overall project performance.

So how is this feasibility assessment done? Start by creating a set of historical trend lines for schedule, effort, and staffing versus the size of the functionality produced. The trend lines provide a basis for the average capability that could be expected, along with a measure of the typical variability. Next, position the initial budget requests against the trend lines. The intention is to identify whether the projects are outside the norm and typical variation, i.e., projects that are high risk or poor value. Figures 1 through 3 highlight some of the techniques used to identify those types of projects.
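Here is a minimal sketch of such a feasibility check against a historical trend line. The trend coefficients, scatter, and project values are invented for illustration; they are not QSM trend data.

```python
import numpy as np

# Hypothetical historical trend (see the earlier curve-fit sketch):
# expected schedule_months = a * size^b, with log-space scatter sigma.
a, b, sigma = 0.8, 0.4, 0.25

def feasibility_flag(size, proposed_schedule_months, z_limit=2.0):
    """Flag a budget request whose proposed schedule falls outside normal variation."""
    expected = a * size ** b
    z = (np.log(proposed_schedule_months) - np.log(expected)) / sigma
    if z < -z_limit:
        return "high risk: schedule far more aggressive than history"
    if z > z_limit:
        return "poor value: schedule far more conservative than history"
    return "within typical variation"

# Three hypothetical budget requests, all sized at 400 units of functionality.
for size, months in [(400, 4.0), (400, 18.0), (400, 9.0)]:
    print(f"size {size}, {months} months -> {feasibility_flag(size, months)}")
```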


The 2016 Software Almanac: A Look Back at 35 Years of Predictive Analytics for Business Intelligence


Let’s face it: times have changed since the initial principles of predictive analytics and software estimation were established. Today more than ever, we depend on software; it is the cornerstone of almost every business. Risk management and cyber vulnerabilities are now major concerns that weren’t even on the radar decades ago.

The 2016 version of the QSM Almanac, released earlier this week, takes a unique look at the last 35 years of predictive analytics and estimation for business intelligence to determine whether its previously developed principles are still applicable today and, if so, how they apply to the current state of software projects. The results are striking, and I thought I would share a few of the highlights from this year's resource as a preface to the full (and free) Almanac, which can be downloaded here:


The 2014 QSM Software Almanac: Seven Insights that Matter

It is no coincidence that this year’s release of the 2014 QSM Software Almanac has been dubbed the Research Edition. The data, research, insights, analysis, and trends packed into the 200+ page book truly make it the ultimate resource for software development and estimation. That said, I thought I’d share just a few of the highlights from this year’s Almanac as a teaser for what you’ll find when you download the full (and free) resource.


Announcing the QSM Software Almanac: 2014 Research Edition

After many months of research, I’m pleased to announce that QSM has today released the 2014 edition of its Software Almanac. A follow-up to the previous version released in 2006, this 200+ page book includes more than 20 articles on topics such as metrics, agile methodology, long-term planning, and trends in software development.

The Almanac is one of the few research compendiums that study how software development has evolved since 1980. The source of this research is the QSM Metrics Database, which contains data from over 10,000 completed software projects from North and South America, Australia, Europe, Africa, and Asia, representing over 740 million lines of code, 600+ development languages, and 105,833 person-years of effort.

The field of software development has long searched for predictable, repeatable processes that improve quality and productivity, which is why many organizations are taking an interest in agile methodology. This year’s Almanac therefore focuses on that topic, which has generated increased interest since the 2006 release. Specifically, it takes a close look at projects that were based on agile methodologies and successfully completed within the past five years.
