[BBIT] The Illusion of Accuracy

June 9, 2021 | Peter Cronin

Recently I was talking to a real estate agent who was working out a predicted price for an apartment. They used an average of an apartment that sold two floors below and one that sold two floors above. This is reasonable as a rough guide, and the agent knew there were many other variables: the condition of the units, the direction they were facing, how they might have been renovated, and of course whether a building is blocking the view of one of the apartments but not the other!


Despite all this, the customer wanted a rough idea of whether it was worth booking a viewing or registering for the auction. The issue came when the customer heard the price (and, I suspect, didn’t like it): they started questioning the calculation, wanting the agent to show their workings. Rather than focusing on the variables mentioned above, they tried to make a guide based on imperfect data more precise.


This is a perfect example of the ‘Illusion of Accuracy’ – where there is none.

People take comfort in the decimals; a thinking bias tells us that more precision means more accuracy.


At this point, a couple of quick definitions may help:

Accuracy definition is – freedom from mistake or error: correctness.
https://www.merriam-webster.com/dictionary/accuracy

Precision definition is – the quality or state of being precise: exactness.
https://www.merriam-webster.com/dictionary/precision

Accuracy is how correct something is, while precision is the level of detail, or, when dealing with multiple measurements, how tightly grouped the numbers are.
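A made-up numeric sketch makes the distinction concrete. Here two invented sets of readings measure a true value of 100: one is precise but inaccurate (tightly grouped around the wrong value), the other is accurate but imprecise (scattered, but centred near the truth).

```python
from statistics import mean, stdev

true_value = 100.0

# Precise but inaccurate: tightly grouped, but centred on the wrong value.
precise = [103.01, 103.02, 102.99, 103.00, 103.01]

# Accurate but imprecise: scattered, but centred near the true value.
accurate = [97.5, 102.8, 99.1, 101.4, 99.3]

for name, readings in [("precise", precise), ("accurate", accurate)]:
    bias = mean(readings) - true_value  # accuracy: distance from the truth
    spread = stdev(readings)            # precision: how tightly grouped
    print(f"{name}: bias={bias:+.2f}, spread={spread:.2f}")
```

The first set has a tiny spread but a large bias; the second has a larger spread but almost no bias. Extra decimal places would shrink neither the bias nor the spread.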


How this is relevant to management reporting

The most common place this is used is in management reporting. The sources of data are often dubious. They can be opinion-based, coming from either biased staff or busy customers who would rather tick a couple of easy boxes on a feedback form than think too hard about their answers. Data increasingly comes from computer-tracked metrics, which removes the bias but not the variation. A manager stuck on a decision may lean towards data that is more detailed or precise, even when that precision means nothing.


Think of the cheap scales at home you use to weigh yourself. Going to two decimal places isn’t going to help you gain or lose weight; again, the variables of your day, what your body is doing, and what you’ve consumed will influence the reading. The solution, as many accept, is to focus instead on the trends. If you weigh yourself every morning for a month, some days will be up or down, but the trend line will tell you whether you are gaining, losing, or maintaining weight.


Better yet, include other information: if you also measured yourself at a couple of points with a measuring tape, you would get far more useful data than by obsessing over a perfect tape measurement, or more decimals on the scales.


Let’s take this back to management reporting.


You have data that comes from multiple areas and can be of questionable quality, or subject to variability. Just as with the fitness data, you want to look at the trends in a report, not the accuracy of the individual points. Secondly, you want a few distinct measures, and a few will do. Most decisions we make are biased by so many other pressures that pretending to be scientific by having 30 ‘KPIs’ is just lying to ourselves. The diminishing returns of extra data make it essentially useless; our bias will simply cherry-pick the data we prefer. If you have 30 KPIs, reporting on 27 of them just wastes people’s time and blurs their focus away from the few that matter more.

Truly more accurate data is of course a good thing (provided the importance of the decision matches the effort to gather the data); the trick is not to mistake more detail for accuracy.


Whenever you are seeking more accurate data, or are shown highly precise data, question whether it matters.