Today's fast-paced IT environments leave little room for
error. But increasingly, software managers are asked to function
in a climate of uncertainty. Project estimates must be prepared
with little notice and insufficient data. Customers keep changing
their minds - and the delivery date - upsetting even the most
carefully planned projects. Now more than ever, software managers
need answers:
· What will it cost?
· When can I deliver?
· What level of reliability should we commit to?
· How should we staff our maintenance phase?
Without facts, you can't make the right decisions. Your customers
won't be happy, and your project will suffer.
Unfortunately, the information you need hasn't always
been readily available. The IT Software Almanac was commissioned
to give managers, developers, and industry leaders an inside look
at the current state of software development. We designed it to
answer the critical need for information: a need that all too often
goes unmet.
Throughout the almanac, we approach each topic from a variety of
perspectives. We start by establishing industry benchmarks for the
typical small, medium, and large project. We go on to examine the
tradeoffs between time and effort and take an empirical look at
the cost (in money, time, and defects) of staffing up to achieve
schedule compression.
Becoming more productive is always a major concern for any organization,
so we study best- and worst-in-class performers to see what they're
doing right - and wrong. But there's another way to look at performance.
What kinds of factors affect productivity on in-progress projects?
What caveats should you consider when using different metrics to
assess progress? Our Size, Language, and Reuse section examines
these important questions and comes up with some surprising answers.
Finally we raise our sights to the long-term implications of advances
in the industry. What changes have we seen since the 1980s? What
do current trends indicate for the future of system size, productivity,
defects, and reuse? What lessons can we derive from the data, and
what predictions can we make about the future? The Conclusions section
ties together what we've learned from our short- and long-term views
of the industry. Are they telling us the same things? We hope to
provide accurate, timely, and focused insights that make your projects
and organizations more successful.
The software business is full of interesting questions. The answers
are in the data.
Measuring Success
We're living proof that measurement needn't be a painful
process. For almost thirty years QSM has helped Fortune 1000 firms
all over the world develop successful metrics programs. As a result,
our database is one of the most comprehensive repositories of modern
day software projects in existence. It allows us to draw on over
7100 completed software projects from North and South America, Australia,
Europe, Africa, and Asia, representing over 740 million lines of
code, 600+ development languages, and 105,833 person years of effort.
We began building the database in 1978. Since that time we've updated
it continuously, adding an average of 500 validated projects a year
over the last 6 years. Constantly refreshing the database keeps
our tools current and informs our clients as they deal with the
challenges of designing and developing quality software in a constantly
changing environment. Clients are our main source of project
metrics, but estimates, productivity assessments, and cost-to-complete
engagements provide additional opportunities to collect data.
When analyzing software metrics, we consider it vital to make only
valid comparisons, so we classify projects into complexity or application
domains. IT projects represent by far the largest segment of the
database, followed by engineering class, real time, and microcode
projects. For the past 27 years we've studied development productivity
with respect to cost reduction, time to delivery, and quality improvement,
but in 2001 Doug Putnam conducted a study that surprised us a bit.
He sorted IT applications completed between 1982 and 2000 into 3-year
bins and ran regression trends on project size, average Productivity
Index (an indexed measure of overall project efficiency), schedule,
effort, staff, mean time to defect (MTTD) and software reuse metrics.
We expected to see continuing improvement in all these measures
over time, but the study showed a marked decrease in several of these
important metrics.
The QSM Software Almanac takes a fresh look at the state of
the software industry with a new set of projects completed between
2001 and 2004. It expands and improves our earlier analysis by
taking both a snapshot of the current state of the industry and
a long term look at software trends over time. We let you benchmark
yourself against some of the sharpest folks in the industry. You'll
get a bird's-eye view of what's been going on for the past few decades:
what's changed, and what's stayed the same. You'll see what best-in-class
(and worst-in-class) projects look like with respect to
effort, duration, and defects and empirically assess tradeoffs when
projects staff up to meet a tight deadline. Finally, we'll sum up
what we've learned and share insights we think will make your projects
and organizations more successful.