Gaming the Metrics
One day in high school we had a substitute teacher. At the beginning of class, the substitute handed out a quiz and told us we'd be tested on it at the end of the period. During class, we searched the textbook in small groups to find the answers.
Unfortunately, some of us studied to pass the quiz instead of to learn. Rather than learning the material, we memorized the multiple-choice number sequence. And I "succeeded" — brilliantly. So brilliantly that I could ace that same quiz decades later. Yet, I don't remember the information I was supposed to learn.
I missed the point.
And the quiz? It failed too. It didn't encourage all of us to learn the subject matter, and it didn't measure whether we had learned it.
It would be easy to say that gaming metrics is limited to high school students, but it's not. I recall a 1:1 meeting with my manager where we talked about a stakeholder's metric and how people were redefining and relabeling their processes to meet the metric without achieving its true goal.
More missing the point.
Agile software development has its metrics, too. These metrics should be tools, not masters. The urge to game the metrics must be resisted.
For example, using team velocity to evaluate a team, with the expectation that velocity must increase over time, can lead to point inflation during backlog refinement and estimation. As a result, team velocity becomes less useful for planning sprints and predicting progress.
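The distortion above can be shown with a little arithmetic. This is a minimal sketch with hypothetical numbers, not a real team's data: a naive forecast divides remaining backlog points by average velocity, so inflated points make the forecast look better than reality.

```python
# A minimal sketch of how point inflation distorts velocity-based
# forecasting. All numbers here are hypothetical.

def sprints_to_finish(backlog_points: float, velocity: float) -> float:
    """Naive forecast: remaining work divided by average velocity."""
    return backlog_points / velocity

# A team that actually completes about 30 points of real work per sprint.
real_velocity = 30

# Under pressure to show rising velocity, the same stories get estimated
# 20% higher. Reported velocity rises, but nothing real changed.
reported_velocity = real_velocity * 1.2  # looks like "improvement"

# A backlog estimated before the inflation set in.
backlog = 300  # points

honest_forecast = sprints_to_finish(backlog, real_velocity)
inflated_forecast = sprints_to_finish(backlog, reported_velocity)

print(f"honest: {honest_forecast:.1f} sprints, "
      f"inflated: {inflated_forecast:.1f} sprints")
# The inflated velocity promises the backlog done almost two sprints
# earlier than the team can actually deliver.
```

The gap only widens as inflation compounds sprint over sprint, which is why the metric stops being useful for planning.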
When a metric fails to provide useful information, or worse, harms the very thing it is meant to measure, do something about it:
- Adjust the metric.
- Create a different metric.
- Talk about the value of metrics and the harm in gaming them.
Or risk missing the point.