# Everybody Needs Some Maths and Basic Stats

Peter Massey explains why everybody needs some maths and basic stats, in the sixth of nine articles exploring things he has found surprising in business.

## 501

We all know people who say they are rubbish at maths and yet can calculate backwards from 501 faster than you can throw the darts. Or tell you the price of the round of drinks is wrong without going near the till.

Maths can be useful in everyday situations, but it doesn’t need to be all in Greek letters, like it is in my long-neglected physics!

## Literacy Is Expected

We expect reading and writing to a good level to be a basic skill.

But do we accept that we need to understand maths and statistics in the same way?

In work situations we expect literacy, but we don’t always expect numeracy.

## Maths Skills

We expect people to be literate, to be able to write to communicate, to clarify their ideas or findings. And to be able to read and understand what someone else has written.

Why don’t we expect people to be equally literate in numbers?

Not just numerate, but able to draw and probe understanding from someone else’s numbers, in a world of business cases, evidence-based decisions and arguments.

## Error 101

Sometimes it’s just mathematically wrong.

As Georgina Sturge, a statistician in the House of Commons Library, writes:

“…a key economic argument of the 2010 Con-Lib coalition government’s austerity agenda was revealed to have originated in a mistake in an Excel spreadsheet.

Economists Carmen Reinhart and Kenneth Rogoff had been recommending lowering the debt to GDP ratio armed with a study in which they claimed to have found that debt of 90% of GDP was bad for growth.

Years later, a PhD student discovered that this conclusion only held because the authors had failed to include the last five rows of their data. The authors admitted their mistake – but not before austerity had become a cornerstone of UK economic policy.”

## Spurious

So much of the data presented might be spurious, taken out of context, from unrepresentative samples. Or deliberately presented to back up messages, regardless of other data.

## Insignificance

I’ve seen many situations where arguments have been spuriously based on statistically insignificant samples, on percentages of percentages, and on partial or selective data.

For example, most opinion polls sample around 1,000 people. Some conclusions may be valid, but how those 1,000 people are selected makes a big difference.

Is a 1% or 2% movement a real change or a rounding error? There’s a simple calculator for sample sizes and error rates at calculator.net.
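The margin of error behind that question is a one-line formula. A minimal sketch in Python, using the standard 95% confidence calculation for a sampled proportion (the function name is mine, not from the calculator mentioned above):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p measured on a
    simple random sample of size n (z=1.96 for 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical opinion poll of 1,000 people:
moe = margin_of_error(1000)
print(f"{moe:.1%}")  # roughly 3.1%
```

So on a 1,000-person poll the margin of error is about ±3 percentage points, which is why a 1% or 2% movement may well be noise rather than a change.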

## Partial to Data

For example, drug company data on effectiveness is fraught with two problems.

They only have to publish the data which supports their argument, not the data that doesn’t.

And the data doesn’t have to be compared to the do-nothing and/or placebo options, which may be just as good.

## Bias

We do that all the time in business cases. We claim most customers want X or Y based on a sample size of diddly squat. We want to prove our hunch or our belief, so we are selective, ignoring contradictory data.

We don’t consider that we would have got the same improvement had we done nothing, or something much simpler, because we confuse correlation with causation.

## Laziness

There’s a quote popularised by Mark Twain (who attributed it to Disraeli): “There are three kinds of lies: lies, damn lies and statistics”. It behoves us all to educate ourselves to deal with all three.

Take this quote from the paper as I write this:

“Only 6% of burglaries a year are solved by police across England and Wales – a shockingly low figure that has decreased over the last few years. … Over the last 12 months, arrests related to burglaries have risen by 52%.”

Yet examining the arguments put forward, there is:

• No base number of burglaries or trend data across previous years.
• No definition of a burglary or burglaries reported or “related to burglaries”.
• No definition of solved vs. arrests vs. convictions.
• No data on number of burglaries per culprit or arrest.
• No assumptions are given or source data links provided.

This illustrates the laziness with which numbers are quoted in the press without challenge – but we aren’t any better at work. Precise use of language and specificity of data are sadly missing from most board reports I read.

Lack of mathematical savvy leads to a common business issue: lazy readers who accept weak arguments and lazy proposers who are not challenged on what’s argued.

The OECD defines mathematical literacy as:

“an individual’s capacity to formulate, employ and interpret mathematics in a variety of contexts. It includes reasoning mathematically and using mathematical concepts, procedures, facts and tools to describe, explain and predict phenomena.

It assists individuals to recognize the role that mathematics plays in the world and to make the well-founded judgements and decisions needed by constructive, engaged and reflective citizens.” (OECD, 2018, p. 67)

That’s quite a mouthful!

## Some Skills to Practise

So what should you check for in your skills with numbers at work?

Some simple favourites are:

1. Sample Size – ask for the statistical significance or error rate on any sampled data. It helps you see if changes or comparisons in the numbers are valid.
2. Base Number – “There are five times as many murders by fishermen as there are by hunters” makes fishermen sound dangerous. But if there are 1 million fishermen and 100,000 hunters, then fishermen are half as murderous!
3. Definition – check the use of the words around the data, of the scope of the data itself. Watch for differing definitions of words or data used as if they were the same.
4. Blind Proof – if the data relates to a test or pilot, check if and how it compares to doing nothing.
5. Do the Sniff Test – does it smell right? Does it pass a common-sense test or the back-of-an-envelope calculation?
6. What Is Missing – don’t accept one side of an argument or a sole source. Check for bias, for further sources and what the data say.
7. Sources – always ask for and check sources of data. Many times the statement made does not correlate with what the data actually said. Many times you’ll find multiple sources referring back to a single source, often with different or spurious data from that inferred in secondary sources.
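The base-number check in point 2 is simple arithmetic: divide each count by its population to compare rates, not raw totals. A minimal sketch in Python, with the murder counts assumed purely for illustration:

```python
def rate_per_100k(events, population):
    """Normalise a raw count to a rate per 100,000 people."""
    return events / population * 100_000

# Assumed figures matching the fishermen/hunters example:
# "five times as many murders by fishermen as by hunters"
hunter_murders = 100                     # hypothetical count
fisherman_murders = 5 * hunter_murders   # five times as many

fisherman_rate = rate_per_100k(fisherman_murders, 1_000_000)
hunter_rate = rate_per_100k(hunter_murders, 100_000)

print(fisherman_rate, hunter_rate)  # 50.0 vs 100.0
```

Per head of population, fishermen come out at half the hunters’ rate – the opposite of what the raw counts suggest.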

BBC Radio 4’s “More or Less” is great at debunking statements with scrutiny of the data and its use. It demonstrates the tactics and counter-arguments every week.

The presenter, the undercover economist Tim Harford, shares his top tips in his book “How to Make the World Add Up”.

Written by: Peter Massey at Budd