Picture two urns stood on a table in front of you.

You’re given the opportunity to pick a marble from one of them, and drawing a red marble wins a prize.

The first urn has 10 marbles in it, 1 of which is red.

The second urn has 100 marbles in it, 8 of which are red.

Which urn would you choose? It doesn’t seem a tricky decision: your chances of drawing a red marble out of the first urn are greater (10%) than your chances of drawing a red marble out of the second urn (8%).
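The comparison is simple arithmetic. As a quick illustrative sketch (Python, with the urns from the example):

```python
# Chance of drawing a red marble from each urn: red / total.
urns = {"urn 1": (1, 10), "urn 2": (8, 100)}

for name, (red, total) in urns.items():
    print(f"{name}: {red}/{total} = {red / total:.0%}")
# urn 1: 1/10 = 10%
# urn 2: 8/100 = 8%
```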

And yet, as Daniel Kahneman describes in *Thinking, Fast and Slow*:

‘About 30-40% of students (the survey participants) choose the urn with the larger number of winning marbles, rather than the urn that provides a better chance of winning.’

This is an example of denominator neglect – when somebody focuses on the headline number indicating the occurrence of an event (picking the red marble – the numerator), with little consideration for the number of times that event *could* occur (all the marbles – the denominator).

In other words, our view of the probability of an event occurring is skewed by the disproportionate attention we give to the absolute number of *winning marbles*.

**Why does it happen?**

Denominator neglect is largely a problem of perception – and presentation. If we scale the first urn up so that it holds 100 marbles, it would contain 10 red ones.

Now the decision is clear – we simply choose between 8 and 10 red marbles per 100, because rescaling to a common denominator has stripped the denominator of its distorting effect.
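That rescaling is just multiplying the numerator and denominator by the same factor. A minimal sketch (the function name is mine, purely for illustration):

```python
def reds_per_100(red, total):
    """Rescale a red-marble count to a common denominator of 100."""
    return red * 100 / total

print(reds_per_100(1, 10))    # urn 1: 10.0 red per 100 marbles
print(reds_per_100(8, 100))   # urn 2: 8.0 red per 100 marbles
```

Once both urns are expressed per 100 marbles, the comparison is between two numerators alone.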

But the problems associated with denominator neglect do not end here.

**Presentation and deception**

It is easy to assume denominator neglect is simply a problem for those who don’t understand probability.

But this phenomenon highlights a broader issue in how we interpret data. In a sense, our understanding of probability is irrelevant because, as the following examples demonstrate, it is our subconscious that the presentation of the data works upon.

Take this example from Slovic, Monahan, and MacGregor, in their paper *Violence Risk Assessment and Risk Communication*.

A panel of *professionals* was given information to assess whether Mr Jones was safe to be released from a psychiatric hospital.

The data was presented to them in two ways:

*‘Patients similar to Mr. Jones are estimated to have a 10% probability of committing an act of violence against others during the first several months after discharge’*

*‘Of every 100 patients similar to Mr Jones, 10 are estimated to commit an act of violence against others during the first several months after discharge’*

21% of those shown the first format refused to release Mr Jones.

41% of those shown the second format refused to release Mr Jones. Nearly double – and these are trained professionals!

This is because the second statement calls to mind 10 violent patients, and 10 violent acts.

This is a form of denominator neglect, but, unlike the marbles, we are not dealing with two separate sets of data: instead, *we have two separate presentations of the same fact.* But one presentation *takes advantage* of our instinct to neglect the denominator.

**The business case: how denominator neglect can affect decision making**

You are managing a department. Two of your employees, Paul and Stacy, each have an idea for a new feature on a product, and you can choose only one of them.

You want to make a fair decision, and so ask them to run two surveys to get feedback on each feature. Each survey reaches 10,000 people.

You want to avoid any risks of the feature damaging the brand, so your focus is on who *dislikes* the new features.

Paul highlights that 400 people hate Stacy’s proposed feature.

Stacy follows up, highlighting that 5% of people hate Paul’s feature.

The headline figure of 400 does something strange to your mind. You begin to call to mind the 400 people who hate the feature. 400 real customers who might complain and stop buying your product.

On the other hand, 5% just doesn’t feel as concrete (or absolute), despite the fact that 5% of 10,000 is 500 people – a 25% increase on the 400 presented by Paul.

Even as a clear-headed executive, it’s easy to see how you might be more alarmed by the idea of hundreds of complaints than by a relatively small percentage, especially if you’re not careful to keep the denominator front of mind when evaluating two sets of data presented to you.
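Putting both headline figures over the same denominator makes the real gap obvious. A quick sketch using the survey numbers above (variable names are mine):

```python
survey_size = 10_000

# Paul's headline: an absolute count of people who dislike Stacy's feature.
dislike_stacy = 400                     # i.e. 400 / 10,000 = 4%
# Stacy's headline: a percentage of people who dislike Paul's feature.
dislike_paul = survey_size * 5 // 100   # 5% of 10,000 = 500 people

print(f"Stacy's feature: {dislike_stacy} people ({dislike_stacy / survey_size:.0%})")
print(f"Paul's feature:  {dislike_paul} people ({dislike_paul / survey_size:.0%})")
# Stacy's feature: 400 people (4%)
# Paul's feature:  500 people (5%)
```

Expressed either way – as counts (400 vs 500) or as percentages (4% vs 5%) – Paul’s feature is the riskier one, yet the raw count of 400 is what grabs the attention.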

**In conclusion**

When dealing with data, you have to make sure the presentation is equal and fair; and, if it is not, it’s essential to take the time to understand the truth behind the numbers and work against any insidious bias at play.

This is particularly important in situations involving risk aversion, and in more emotive subjects, where absolute numbers will weigh more heavily on our judgement than fractions, and all the rationality they seem to entail.