Monday, June 6, 2016

Goodhart's Law

I enjoy listening to the Data Skeptic podcast. It's a data science view of statistics, machine learning, etc. They recently discussed Goodhart's Law on the podcast. Goodhart was an economist. The law that bears his name says that "when a measure becomes a target, it ceases to be a good measure." People try to find a way to "game" the situation. They maximize the indicator but produce poor quality on other dimensions as a consequence. The classic example is a rat-reduction program implemented by a government. The government wants to motivate the population to destroy rats, so it offers a fee for each rat that is killed. Rather than require the rat's body, it asks only for the tail. As a result, some people decide to breed rats and cut off their tails. The end result... more rats.

I have some mixed feelings about this issue. Many optimization procedures require a single measure that can be either maximized or minimized. I think these techniques could be very useful for surveys. Here is an example I've cited in the past: an optimization that finds a solution maximizing the R-Indicator might be helpful for choosing a survey design. These optimizations can also include other constraints that control for potential unintended consequences.
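As a toy sketch of what such a constrained optimization could look like, here is a small Python example. It uses one common formulation of the R-Indicator, R = 1 - 2*S(rho), where S is the standard deviation of estimated response propensities; the designs, subgroup propensities, and cost figures are all invented for illustration, and a cost ceiling stands in for a constraint that guards against an unintended consequence (here, runaway data collection costs).

```python
import statistics

def r_indicator(propensities):
    """One common formulation: R = 1 - 2 * S(rho), where S is the
    standard deviation of the estimated response propensities.
    R = 1 means a perfectly balanced response; lower values mean
    the response is less representative across subgroups."""
    return 1 - 2 * statistics.pstdev(propensities)

# Hypothetical designs: each has made-up subgroup response
# propensities and a made-up cost per completed case.
designs = {
    "mail only":        {"propensities": [0.30, 0.55, 0.70], "cost": 20},
    "mail + phone":     {"propensities": [0.45, 0.55, 0.65], "cost": 35},
    "mail + in-person": {"propensities": [0.50, 0.55, 0.60], "cost": 60},
}

budget_per_case = 40  # constraint limiting an unintended consequence

# Maximize the R-Indicator subject to the cost constraint.
feasible = {k: v for k, v in designs.items() if v["cost"] <= budget_per_case}
best = max(feasible, key=lambda k: r_indicator(feasible[k]["propensities"]))
print(best, round(r_indicator(feasible[best]["propensities"]), 3))
```

Note that the costliest design has the most balanced propensities and hence the highest R-Indicator, but the budget constraint rules it out; this is exactly the kind of trade-off between the target measure and other dimensions of quality that the single number alone would hide.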

On the other hand, I do worry that reliance on a single measure can lead to problems akin to those anticipated by Goodhart's Law. I've said it on this blog before -- what did the focus on the response rate distort about survey design or, more broadly, about research into survey methodology? I think we've moved forward a bit on that. Optimization techniques can be helpful in survey design even when they focus on a single target measure, but Goodhart's Law is a useful reminder that we also need to keep monitoring multiple aspects of the data collection process.