Five ways to spot misleading numbers
Numbers are everywhere in the news: percentages, probabilities, death rates, graphs. They all look so definite, so believable, but they can also be misleading. Politicians, advertisers and others trying to make a point often mention numbers that fit their argument and keep quiet about those that don’t. Worse still, they sometimes present those numbers in ways designed to give a false impression.
Their task is made easier by the fact that many reporters are arts specialists rather than mathematicians, so they don't spot the problems with the numbers they are given. I hope the tips in this article will stop you from falling into the same trap.
1: Beware of percentages
Percentages look impressive, but they are really just fractions with a hundred at the bottom that no one bothers to write. And like all fractions, they are meaningless unless you know what they are a fraction of. Just as half a mouse is much smaller than half an elephant, 5% of ten is much less than 5% of a million. So a company that gives everyone a 5% pay rise isn't being as fair as it sounds, because high earners will get a bigger rise in cash terms than those at the bottom of the pay scale.
Percentages are particularly useful to anyone trying to make numbers look more impressive than they really are. If an author tells you that sales of his book have increased by 200%, you’ll probably assume that his book is on its way to being a bestseller. But you wouldn’t think the same if he told you that he sold one book last week and three this week. The same technique can also be used to make a slight increase in Covid-19 cases look like a huge jump and to make big increases look less important than they really are. It's less worrying to hear that the national debt has increased by 1% than to be told that your country has borrowed another 16 billion pounds.
Be particularly wary of percentages in medical scare stories. Let’s imagine that researchers have discovered that regularly hopping on one leg increases your chances of contracting a very rare type of cancer by 100%. That sounds terrifying enough to make you keep both feet firmly on the floor until you realise that the normal chance of getting that rare cancer is 0.000001%. Increasing that by 100% gives regular hoppers the exceedingly low risk of 0.000002% - a number which would not have made such a good headline.
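If you want to check that arithmetic for yourself, here is a minimal Python sketch. The risk figures are the invented ones from the hopping example above, not real cancer statistics:

```python
# Hypothetical figures from the hopping example, not real statistics.
baseline_risk = 0.000001 / 100      # 0.000001% written as a probability
relative_increase = 1.00            # a scary-sounding "100% increase"

new_risk = baseline_risk * (1 + relative_increase)
absolute_increase = new_risk - baseline_risk

print(f"New risk:          {new_risk * 100:.6f}%")           # 0.000002%
print(f"Absolute increase: {absolute_increase * 100:.6f}%")  # 0.000001%
```

Doubling a tiny risk still leaves a tiny risk; quoting only the relative "100% increase" hides that.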
Whenever you meet a percentage, always check to see exactly what it’s a percentage of and, if anyone chooses not to tell you, be sceptical of everything else they say.
2: Beware of poll results
Suppose a newspaper article declares that 80% of people believe in fairies (or 8 out of 10 people, which means the same). At first glance, that looks okay because it tells you what the 80% is a percentage of. But what exactly does it mean when it says “people”? It’s highly unlikely that the researchers have asked all 7.8 billion people in the world, and if you weren’t asked, you know for sure that they didn’t. So they have done what all pollsters do and asked a sample of people instead.
As with all polls, how meaningful the results are will depend on how much that sample is like the total population. Suppose the pollsters only asked members of the We-Believe-In-Fairies Society. Then the surprising thing about this result isn’t that 80% of people said they believed - it’s that 20% didn’t.
Now let’s imagine they’ve tried hard to be fair by picking a thousand people at random and emailing them to ask if they believe in fairies. Those who do believe would be delighted to hear from someone who seemed to agree with them, so they would be more likely to reply. Those who don’t would think it a stupid question, so they would just hit the delete key without answering. If the researchers ignore those who don’t reply, the result isn’t 80% of people believe in fairies – it’s 80% of people-who-bothered-to-answer-our-question believe in fairies, which is far less impressive.
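The arithmetic behind that non-response effect is easy to check. This short Python sketch uses invented reply rates, chosen only to reproduce the 80% headline:

```python
# Invented figures: 1,000 people emailed, of whom only 20% actually believe.
population = 1000
believers = 200
non_believers = population - believers

# Believers are keen to reply; almost all non-believers hit delete.
believer_replies = believers   # all 200 believers reply
non_believer_replies = 50      # only 50 of the 800 non-believers bother

replies = believer_replies + non_believer_replies
reported_rate = believer_replies / replies

print(f"True belief rate:     {believers / population:.0%}")  # 20%
print(f"Reported belief rate: {reported_rate:.0%}")           # 80%
```

Ignoring the 750 people who never answered turns a 20% belief rate into an 80% headline.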
This sort of thing happens all the time, which explains why opinion polls often turn out to be wrong. Reputable pollsters try hard to make their samples as representative of the population as possible, but people reporting their results often ignore the “Don’t Knows” even if they make up the majority of replies. And less reputable pollsters can easily slant their choice of sample to help give the result that will please the organisation paying the bill. For example, a poll about voting intentions in an election won’t give a meaningful result if the sample is taken mainly from members of one particular party or readers of one particular newspaper.
Biased sampling is not the only way dodgy pollsters can fix their survey to give the results their customer wants. They can also affect people’s answers by the way they word the questions and by limiting the choice of answers. So for a question about voting intentions, they might provide an incomplete list of candidates or parties to choose from.
With polls, always try to look at the full results if you can rather than just relying on a newspaper article interpreting the results. Try to see the actual questions that were asked and check who is paying for the poll to see if they have vested interest in the result. All that information will help you spot problems so you can judge for yourself whether you want to believe the numbers.
3: Beware of unfair comparisons
Just because two numbers exist doesn’t mean you can compare them: you can only do that fairly if both numbers are counting the same thing in the same way. For example, you can’t compare the speeds of two cars if one speed is in metres per second and the other is in miles per hour, and you can’t compare the weights of two people if one is in pounds and the other is in kilograms. To make a fair comparison, you first need to convert both measurements to the same units.
Percentages don’t have units, but you can only compare them if they are all fractions of the same thing. Although 3% looks as if it is smaller than 6%, 3% of a million is much bigger than 6% of ten.
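A couple of lines of Python make that concrete, using the figures from the sentence above:

```python
# 3% of a million versus 6% of ten: the smaller-looking
# percentage is by far the bigger quantity.
three_percent_of_a_million = 1_000_000 * 3 // 100   # 30,000
six_percent_of_ten = 10 * 6 / 100                   # 0.6

print(three_percent_of_a_million)  # 30000
print(six_percent_of_ten)          # 0.6
```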
During the Covid-19 pandemic, there have been lots of people comparing numbers in an attempt to show that one country is doing better or worse than another. But any comparison based on the number of cases is very suspect because the availability of tests varies widely from country to country. Some only test people with symptoms so miss all the asymptomatic cases. Others test more widely so automatically pick up more cases. As a result, comparing the number of cases isn't meaningful because you’re not comparing numbers arrived at in the same way.
Comparing total number of deaths raises similar issues because not all countries count coronavirus deaths in the same way. Some count all deaths where Covid-19 is suspected while others only count deaths of people who have had a positive test. But let’s ignore that discrepancy for the moment and just look at the three main ways of doing the actual comparison.
- Comparing total deaths in each country.
It’s obvious that the more people you have, the more deaths you are likely to get. So this comparison naturally puts the countries with the largest population at the top of the list and doesn’t give a true picture of what’s happening.
- Comparing deaths as a percentage of cases.
This has the same problem as comparing numbers of cases, because countries that carry out fewer tests will find fewer cases. If they only test people in ICU, they will only pick up the most severely ill, so the death rate will look very high. If they pick up thousands of asymptomatic cases, the death rate will look lower even if the number of deaths is exactly the same.
- Comparing deaths per million people in the population
This is probably the fairest system, although it's still not perfect because it's affected by the problem with counting deaths I mentioned earlier. It’s also open to manipulation by changing the base number: 500 deaths per million sounds worse than 50 deaths per 100,000 to anyone who isn’t looking too hard, but the two figures describe exactly the same rate.
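To compare rates quoted against different base numbers, rescale them to a common base first. A minimal Python sketch, using the figures above:

```python
def per_million(deaths, base):
    """Rescale a 'deaths per <base> people' figure to deaths per million."""
    return deaths * 1_000_000 / base

print(per_million(500, 1_000_000))  # 500.0
print(per_million(50, 100_000))     # 500.0 - the very same rate
```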
4: Beware of clever use of words
When people want to influence you with numbers, they often accompany them with words designed to affect your emotional response. Prices creeping up by 1% sounds less worrying than prices soaring by 1%. In the same way, sales dipping slightly to 950 sounds less worrying than sales plummeting to less than 1000 even though both statements are describing the same change.
5: Beware of graphs and charts
It’s not surprising that we see graphs and charts all over the media. They are not just a convenient way to break up the text - they are also a brilliant way to make numbers easier to understand and interpret. However, in the wrong hands, they are also a very effective tool for misleading people.
The wiggly line on a graph or the bars on a chart mean nothing unless you know what they represent. So every graph or bar chart needs a scale on the left-hand side (the vertical axis) and another along the bottom (the horizontal axis). But scales without numbers can be very misleading. For example, in the one below, you can't tell whether the profits have gone up by £1 or £1,000,000.
Putting in the scales shows that profits have gone up by £50 in 25 days, which is better than a loss but nothing to get excited about.
In the example above, the left-hand scale starts at zero, but starting the scale at a higher number makes the line rise much more steeply.
The rise in profits is exactly the same, but the slope of the line is much steeper, so people who only glance at the graph will be fooled into thinking that profits have risen faster than they really have. To avoid being fooled yourself, always look at graphs and charts very carefully and check the numbers on the scales.
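You can even put a rough number on how much steeper a cropped axis makes a line look. This Python sketch uses hypothetical profit figures consistent with the £50 rise in the example (the exact start and end values are my own invention):

```python
# Hypothetical profits: 450 rising to 500 pounds, a 50-pound gain.
start, end = 450, 500
rise = end - start

full_axis = (0, 500)       # vertical scale starting at zero
cropped_axis = (440, 500)  # vertical scale starting just below the data

def apparent_steepness(rise, axis):
    """Fraction of the visible vertical range that the line climbs through."""
    low, high = axis
    return rise / (high - low)

print(f"{apparent_steepness(rise, full_axis):.0%}")     # 10%
print(f"{apparent_steepness(rise, cropped_axis):.0%}")  # 83%
```

The same £50 rise climbs through a tenth of a zero-based axis but more than four-fifths of the cropped one, which is why the second graph looks so dramatic.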