Tuesday, July 28, 2009

Metrics for agile projects

Yesterday's Agile Denver meeting was a panel discussion on metrics for agile projects. In the end most of the discussion was about software metrics in general, and it produced some good insights that apply to all software projects, not just agile ones. Some highlights:

Kevin Sheen from Perficient said you need to know what's important to your customers: deadline, budget, scope, requirements flexibility, product scalability, etc. Use that to decide what metrics are important.

Paul Rayner explained that diagnostic metrics such as code complexity should be initiated by the team, owned by the team, and not reported upward in the organization. He used the analogy of a hospital patient whose heart rate, temperature, and blood pressure are measured. Doctors and nurses use those measurements in combination as diagnostic tools, but they aren't reported up to the hospital's board of directors. Most managers don't understand such metrics, and generally they shouldn't need to care about them either. The metrics reported upward to management should be the ones needed to make management decisions - about budget, schedule, and scope. Kevin Sheen pointed out that when higher-level management wants to "open the hood" and look at these metrics, it's usually because they don't trust the development team to meet its estimates, and the additional metrics give them a false sense of control over the project.

Brian Boelsterli said that a project's reporting metrics (distinguished from diagnostic metrics) should be clearly rolled up to corporate KPIs. Each project should be able to demonstrate how it is contributing to those KPIs.

Several panelists and audience members spoke about the potential dysfunction that can result when diagnostic metrics are used as carrots or sticks to measure or evaluate individual team members. To avoid this, diagnostic metrics should be chosen and valued by the developers themselves. Even when measuring at the team level, compensating metrics should be used: for example, if you collect a productivity metric then you should also collect quality metrics, so that people don't sacrifice quality for apparent productivity.

Paul Rayner said that he likes the following diagnostic metrics at the development team level:
  1. Cyclomatic complexity - average per namespace and per method (a rough sketch of how this can be computed follows the list)
  2. Total lines of code - monitor the trend over time; with ongoing refactoring it should level off or decrease
  3. Coupling - to find classes that have too many dependencies
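As a rough illustration of the first metric, cyclomatic complexity can be approximated by counting a method's branch points. Below is a minimal Python sketch using the standard ast module; the counting rule (1 plus the number of decision points) is a common approximation rather than the exact algorithm of any particular tool, and example.py is just a placeholder file name.

  import ast

  # Branch-introducing nodes; counting each occurrence as +1 is a
  # common approximation of McCabe's cyclomatic complexity.
  BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.IfExp)

  def cyclomatic_complexity(func_node):
      """Approximate McCabe complexity of one function definition."""
      complexity = 1  # a straight-line function has complexity 1
      for node in ast.walk(func_node):
          if isinstance(node, ast.BoolOp):
              # 'a and b and c' contributes two decision points, not one
              complexity += len(node.values) - 1
          elif isinstance(node, BRANCH_NODES):
              complexity += 1
      return complexity

  source = open("example.py").read()
  for node in ast.walk(ast.parse(source)):
      if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
          print(node.name, cyclomatic_complexity(node))

Averaging these per-method scores per namespace (or per module) gives the kind of team-level trend Paul described.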
Richard Lawrence mentioned that he uses CodeRush's "maintainability" metric. Visual Studio has its own maintainability metric, which combines several other metrics into an overall score.
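For reference, Microsoft has published the formula behind Visual Studio's maintainability index: it rescales a combination of Halstead volume, cyclomatic complexity, and lines of code to a 0-100 range, with higher being more maintainable (scores of roughly 20 and above are treated as acceptable). A sketch of that formula:

  import math

  def maintainability_index(halstead_volume, cyclomatic_complexity, lines_of_code):
      """Visual Studio's published maintainability index, rescaled to
      0-100 (higher is better). The inputs come from other analyses."""
      raw = (171
             - 5.2 * math.log(halstead_volume)
             - 0.23 * cyclomatic_complexity
             - 16.2 * math.log(lines_of_code))
      return max(0, raw * 100 / 171)

For example, a method with a Halstead volume of 1000, complexity of 10, and 100 lines of code scores about 34 on this scale.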

In my view these technical metrics are very valuable indicators - they help you decide when and where a qualitative analysis is required. For example, if you want to know where to get the most bang for your buck on refactoring, look for methods with high cyclomatic complexity or classes with high coupling.
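To make that concrete, here is a sketch that builds on the complexity counter above: walk a source tree, score every function, and surface the worst offenders as refactoring candidates. The directory name src and the cutoff of ten results are placeholders.

  import ast
  from pathlib import Path

  def refactoring_hotspots(root_dir, top_n=10):
      """Rank functions under root_dir by approximate complexity,
      reusing cyclomatic_complexity() from the earlier sketch."""
      scores = []
      for path in Path(root_dir).rglob("*.py"):
          tree = ast.parse(path.read_text())
          for node in ast.walk(tree):
              if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                  scores.append((cyclomatic_complexity(node), f"{path}:{node.name}"))
      return sorted(scores, reverse=True)[:top_n]

  # The highest-scoring functions are the first candidates for a
  # closer qualitative look before any refactoring.
  for score, location in refactoring_hotspots("src"):
      print(f"{score:3d}  {location}")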

1 comment:

Paul Rayner said...

Thanks for the good summary Brad. I expanded on my own position and thoughts in a posting of my own. It was a good discussion on the night, and has given me much to think about.