Goodhart’s Law
Goodhart's Law is a famous observation in economics and social science that captures a common pitfall in how we measure success and set goals. It is most frequently stated as:
"When a measure becomes a target, it ceases to be a good measure."
The law suggests that once a specific metric is used to reward or punish behavior, people will find ways to "game" the system to hit that target, often at the expense of the actual goal the metric was supposed to represent.
Origins
The principle is named after Charles Goodhart, a British economist and former advisor to the Bank of England. He first formulated the idea in a 1975 paper on monetary policy, observing that as soon as the government tried to target a specific component of the money supply, the statistical relationship it had been relying on broke down: banks and consumers changed their behavior to circumvent the new rules. (The popular one-line phrasing quoted above is actually a later paraphrase by the anthropologist Marilyn Strathern; Goodhart's original wording was "any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.")
How It Works
To understand Goodhart's Law, it helps to distinguish between the underlying goal and the proxy used to measure it:
The Goal: You want a high-quality outcome (e.g., healthy patients, productive employees).
The Proxy: Since "quality" is hard to measure directly, you pick a number that seems related to it (e.g., hospital wait times, number of lines of code written).
The Breakdown: Once people are incentivized to move that number, they focus entirely on the number. A hospital might "cheat" wait times by keeping patients in ambulances outside the door, or a programmer might write unnecessarily bloated code to hit a line-count quota. In both cases, the number looks great, but the actual quality of work has stayed the same or even declined.
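The goal/proxy/breakdown dynamic above can be sketched as a toy simulation. This is a minimal illustration, not a model from the literature: the functions, numbers, and the "lines of code" scenario are all invented assumptions chosen to make the decoupling visible.

```python
# Toy sketch of Goodhart's Law (illustrative assumptions throughout):
# an agent is scored on a proxy (total lines of code) while the real
# goal is useful functionality. Once the proxy becomes the target,
# the agent pads its output and the proxy decouples from the goal.

def true_value(useful_lines, padding_lines):
    """The real goal: only useful lines add value; bloat mildly hurts it
    (assumed penalty of 0.5 per padding line, for maintenance cost)."""
    return useful_lines - 0.5 * padding_lines

def proxy_metric(useful_lines, padding_lines):
    """The proxy target: raw line count, blind to usefulness."""
    return useful_lines + padding_lines

# Before the proxy becomes a target: no incentive to pad.
before = (100, 0)
# After: the same useful work, plus padding written to game the quota.
after = (100, 150)

assert proxy_metric(*after) > proxy_metric(*before)  # the metric "improves"...
assert true_value(*after) < true_value(*before)      # ...while the goal worsens
```

The point of the sketch is that both assertions hold at once: any measurable proxy that is cheaper to move than the goal itself will, under pressure, be moved instead of the goal.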
Real-World Examples
The Cobra Effect: In colonial India, the government wanted to reduce the cobra population, so they offered a bounty for every dead cobra. In response, people started breeding cobras in their homes to kill them and collect the reward. When the government found out and stopped the program, the breeders released the now-worthless snakes, leaving the cobra population higher than when they started.
Standardized Testing: When a school's funding is tied to test scores, teachers may feel pressured to "teach to the test" rather than providing a broad education. The scores go up, but the students' actual understanding of the subjects may not improve.
SEO and Content: Search engines used to rank pages based on how many keywords they contained. Website owners responded by "keyword stuffing"—filling pages with repetitive words until they were unreadable. The metric (keywords) no longer indicated a high-quality article.
Why It Matters
Goodhart's Law is a warning against "metric fixation." It reminds us that while data is useful for monitoring progress, using it as a primary incentive structure often leads to unintended consequences.