Measuring Progress

"Working software is the primary measure of progress"

Fundamentally, there is no more valid measure of progress than the working software itself. The only question left open to discussion is the definition of "working software".

Defining "Working Software"

The criterion for defining working software is obviously open to debate. A common definition is:

Software can be called "working software" when it meets a defined set of business requirements and can be demonstrated to do so through testing.

This is one reason why Agile processes put so much value on unit testing: these tests show very early in the process that software is meeting business requirements, without the need to create a fully functional, user-testable application. Unit tests also demonstrate, at a fairly low level of granularity, that code is meeting requirements, whereas a user-facing application depends on too many factors for correctness to be easily established.
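To make this concrete, here is a minimal sketch of the idea, using a hypothetical business requirement and invented function names (none of this comes from a real codebase). Each test demonstrates one requirement directly against the code, with no full application needed:

```python
# Hypothetical requirement: orders over 100.00 receive a 10% discount.
# The function, threshold, and rate here are illustrative assumptions.

def apply_discount(order_total: float) -> float:
    """Return the payable amount after any volume discount."""
    if order_total > 100.00:
        return round(order_total * 0.90, 2)
    return order_total

def test_discount_applied_over_threshold():
    # The requirement is demonstrably met: 200.00 becomes 180.00.
    assert apply_discount(200.00) == 180.00

def test_no_discount_at_or_below_threshold():
    assert apply_discount(100.00) == 100.00

if __name__ == "__main__":
    test_discount_applied_over_threshold()
    test_no_discount_at_or_below_threshold()
```

A test runner such as pytest would discover and run these automatically; the point is that each passing test is a small, early demonstration that a specific requirement is met.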

So How Do We Measure Progress?

Based on this definition, the best way to measure progress and velocity on a project is to evaluate the defined business requirements against the code provided to meet those requirements.

Code that has been written but is not yet functional and passing its tests cannot be counted as progress until it has been completed to the standard defined above.
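The rule above can be sketched as a tiny calculation. This is purely illustrative, and the data structure and field names are assumptions, but it shows the key point: partially written code contributes nothing to measured progress until its tests pass.

```python
# Illustrative sketch: only requirements whose code is complete AND
# passing tests count towards progress. Names and data are hypothetical.

requirements = [
    {"name": "login",     "code_written": True,  "tests_passing": True},
    {"name": "checkout",  "code_written": True,  "tests_passing": False},
    {"name": "reporting", "code_written": False, "tests_passing": False},
]

def measured_progress(reqs):
    done = sum(1 for r in reqs if r["code_written"] and r["tests_passing"])
    return done / len(reqs)

# Only "login" counts, even though "checkout" has code written.
print(f"progress: {measured_progress(requirements):.0%}")
```

Note that "checkout" is written but not passing, so it is indistinguishable from "reporting" for the purposes of progress: both are simply not done yet.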

This then prioritises getting components of functionality completed early, rather than attempting to do all things simultaneously.

The traditional "waterfall" approach to software development is to spread large amounts of functional requirements out amongst large teams, for example by assigning each piece of functionality to a team member with an expected delivery date measured in weeks or months. This leads to very long cycles before anything correlating to "working software" is delivered.

An Agile approach to the same problem is to focus the team on only a small subset of that functionality, and to attempt to deliver it in a working and testable state in very short iterations. The degree of success with which they do this then becomes their "velocity". The velocity can then be used as a predictor of future success rates, and therefore of future timescales. This also allows a "fail fast" mentality, where it is better to hit problems early on and resolve them than to defer all the problems as far down the development path as possible.
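The use of velocity as a predictor can be sketched in a few lines. The numbers and the use of "story points" here are assumptions for illustration; the principle is just completed-work-per-iteration averaged and projected forward:

```python
import math

# Hypothetical data: story points completed in each past iteration,
# and the estimated work remaining.
completed_per_iteration = [8, 10, 9]
remaining_work = 45

# Velocity = average work completed per iteration.
velocity = sum(completed_per_iteration) / len(completed_per_iteration)

# Forecast: how many more iterations at this velocity?
iterations_left = math.ceil(remaining_work / velocity)

print(f"velocity: {velocity} points/iteration")
print(f"forecast: {iterations_left} iterations remaining")
```

Because velocity is measured from work that was actually completed to a working, tested state, the forecast reflects demonstrated capacity rather than optimistic plans.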

Therefore, the best way of measuring success is to do one thing at a time: do it well, ensure it works, ensure it meets its criteria, ensure it can be tested, and then replicate what went right on the next piece of functionality while eliminating what did not go so well.


Posted 05-15-2008 9:48 AM by Jak Charlton



