Tips and advice
Whether you’re a journalist, an activist, a business leader, a health worker or a regular citizen, how can you know when public figures tell the truth and when they distort it? How can you decide what claims are fair? Who can you trust?
We’ve used our experience as journalists, with help and advice from specialist experts in a range of fields, to draw up this checklist of fact-checking tips.
- Where is the evidence?
When a public figure makes a claim, big or small, first ask yourself if the claim is plausible and worth investigating.
Your next question should be, “Where’s the evidence?”
Officials may have a good reason to refuse to reveal the evidence for a claim they make. They may need, as journalists do, to protect their sources.
Sources do need protection, but we still require evidence. Another reason officials sometimes refuse to provide it is that the evidence is weak, partial or contradictory.
Ask for evidence, and if it isn’t provided you’ll know there is, or may be, a problem with the claim.
- Is the evidence verifiable?
The next step is to find out if the evidence can be verified. Can its accuracy be tested?
In the scientific community, a new finding is generally accepted only once other researchers have tested it and produced the same or similar results. As Thomas Huxley, a prominent 19th century biologist, put it: “The man of science has learned to believe in justification not by faith but by verification.”
It should be the same in public debate. When a public figure, in any field, makes a claim they want believed, they should be able to provide verifiable evidence.
If they can’t, can you take what they say on trust?
- Is the evidence sound?
We have looked, but no single checklist of fact-checking tips covers all the different types of evidence you might have to assess before deciding whether a claim is sound.
Listed below are the main questions we ask.
Could they know what they claim to know?
If the evidence is based on an eyewitness account, could the person know what they claim to know?
Were they there? Is it likely that they would have access to this sort of information? Is the information first-hand? Or is it second-hand, something they had heard and believed? Is it something that could be known?
If there is data, when was it gathered?
One trick public figures use is to present information collected many years before as if it were from today, with no mention of dates. But data ages.
To understand the data, you need to know when it was gathered and what the picture looked like before and after. Public figures may also present data with specific start and end dates, not because this reflects real conditions but to make the numbers look good, starting at the bottom of a regular cycle and ending at the top.
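To see how that trick works, here is a minimal sketch with invented quarterly figures (none of this is real data): the same flat, seasonal series can be made to look like strong growth simply by choosing a trough as the start date and a peak as the end date.

```python
# Invented quarterly figures that repeat a seasonal cycle -- no underlying growth.
quarterly = [120, 100, 80, 100, 120, 100, 80, 100]

# Picking the trough as the start date and the peak as the end date
# produces an impressive-sounding rise.
trough, peak = min(quarterly), max(quarterly)
print(f"Trough-to-peak: {(peak - trough) / trough:.0%} rise")     # 50%

# Comparing like with like (the same quarter a year apart) shows no change.
year_on_year = (quarterly[4] - quarterly[0]) / quarterly[0]
print(f"Same quarter, year on year: {year_on_year:.0%} change")   # 0%
```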
Was the sample large enough? Was it comprehensive?
An opinion poll that samples the views of a few dozen – or even a few hundred – people is unlikely to represent the views of a population of millions.
For most polling organisations, a well-chosen sample of around 1,000 people is the minimum needed to produce accurate results. But public figures often quote – and the media then report – surveys of a few hundred or few dozen people as representing wider views.
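As a rough illustration of why a sample of around 1,000 is a common benchmark, here is a minimal sketch using the textbook 95% margin-of-error approximation for a simple random sample (real polls involve more careful design; the figures are purely illustrative).

```python
import math

def margin_of_error(sample_size: int, proportion: float = 0.5) -> float:
    """Approximate 95% margin of error for a simple random sample."""
    return 1.96 * math.sqrt(proportion * (1 - proportion) / sample_size)

for n in (50, 200, 500, 1000, 2000):
    print(f"n = {n:>5}: +/- {margin_of_error(n) * 100:.1f} percentage points")

# A sample of 50 gives roughly +/- 14 points; 1,000 gives roughly +/- 3 points.
```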
And even large scale surveys can give an inaccurate picture if they don’t look in all the right places.
This is known as the “black swan problem”. For centuries, people in Europe assumed that all the world’s swans were white because that was the colour of the hundreds of thousands of swans found there. It was only when 17th century Dutch explorer Willem de Vlamingh returned to Europe after discovering black swans in western Australia that people learned otherwise. The previous “sample” had been large, but it was not comprehensive.
How was the data collected?
Sample size is not all that matters. Researchers taking an opinion poll must include people from all relevant social groups – both genders, all ages, and different races, regions, and social and economic backgrounds – and in the right proportions. This makes the poll more representative of society as a whole.
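As a small sketch of why the right proportions matter (all figures invented), an over-represented group can skew the headline result until the answers are re-weighted to match the population.

```python
# Invented example: the population is 50% urban and 50% rural,
# but 80% of the poll's respondents happened to be urban.
sample = {
    "urban": {"share_of_sample": 0.8, "support": 0.60},
    "rural": {"share_of_sample": 0.2, "support": 0.30},
}
population_share = {"urban": 0.5, "rural": 0.5}

unweighted = sum(g["share_of_sample"] * g["support"] for g in sample.values())
weighted = sum(population_share[name] * g["support"] for name, g in sample.items())

print(f"Unweighted support: {unweighted:.0%}")  # 54% -- skewed towards urban views
print(f"Weighted support:   {weighted:.0%}")    # 45% -- matched to the population
```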
How was the study done? Similar surveys done door-to-door can produce different results from those done on the phone, because people may respond differently when they are interviewed face-to-face than they do over the phone.
And studies that rely on people filling in forms themselves tend to show more errors than person-to-person interviews do, particularly if the respondents are less literate. If the claim is based on a survey like this, could that be a factor?
Meanwhile, what people taking part in a study or a trial know, or think they know, about it will also affect the outcome.
This is known as the “placebo effect” – when people’s belief that they are taking a medicine, even though they actually aren’t, affects their symptoms. It’s why medical trials often are, or should be, “blinded”, so that the patients being studied do not know which treatment, if any, they have been given.
Look at the wider picture
Once you know how the data was collected, assess the way it was presented. Did the person tell the truth, the whole truth and nothing but the truth? Public figures may choose what to tell you, and what not to, cherry-picking the juiciest evidence, favourable to their side in an argument, and leaving the less tasty morsels in the bowl.
Is the data presented in context, and would it still support the claim if other, unmentioned factors were taken into consideration?
Say a politician claims that they put “record sums” into the public health system, without mentioning inflation. The claim may be true in itself, but it is misleading if spending has actually fallen in real terms once inflation is taken into account. Always look at the other factors that make up the wider picture.
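A quick sketch of that inflation check, with invented figures rather than real budget data: a “record” nominal increase can still be a cut once rising prices are stripped out.

```python
nominal_year1 = 100.0        # spending in year 1, in millions (invented)
nominal_year5 = 110.0        # the "record sum" in year 5, in millions (invented)
cumulative_inflation = 0.25  # prices rose 25% over the same period (invented)

# Express year-5 spending in year-1 prices to compare like with like.
real_year5 = nominal_year5 / (1 + cumulative_inflation)
print(f"Nominal change: {nominal_year5 - nominal_year1:+.0f}m")  # +10m
print(f"Real change:    {real_year5 - nominal_year1:+.1f}m")     # -12.0m
```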
And remember to keep numbers in proportion. Spending $50 million on a health project may sound like a lot for a small community. But divide it among the population, note that the programme is set to run over 10 years, and it seems a lot less generous than it did at first.
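A back-of-the-envelope version of that check (the population figure below is an assumption, used purely for illustration):

```python
total_spending = 50_000_000  # the $50 million announced
years = 10                   # the programme runs over a decade
population = 2_000_000       # assumed number of people served (illustrative)

per_person_per_year = total_spending / years / population
print(f"${per_person_per_year:.2f} per person per year")  # $2.50
```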
- Data sources, experts and the crowd
If the person making a claim can’t or won’t offer evidence to back it up, this may make it harder to check – but doesn’t prove it wrong. To check it, you can turn to credible data sources, acknowledged experts and crowdsourced information.
Data sources
There are many useful sources of data for checking claims.
You can find information in government papers and official statistics, company records, scientific studies and health research databanks, as well as in school records, development charity accounts, religious orders’ papers and more.
Sri Lanka Check maintains a library of guides and factsheets that provide sources of reliable data on key questions. And our Info Finder tool offers useful sources of data on a wide range of topics for Sri Lanka, Kenya and Nigeria.