Data v evidence: What does it matter in evidence-based practice?



Data v evidence

Data v evidence: This is the first article in a series about evidence-based practice. In this post, I look at the critical difference between data and evidence and why it matters in evidence-based practice.

One of the cornerstones of evidence-based practice is understanding the difference between data and evidence. When people in an organisation grasp this distinction, we tend to find that evidence-based practice not only starts to become real, but that people make better decisions and judgements and, perhaps surprisingly, become much more adaptable and flexible as well.

Now, these are big claims for something apparently as simple as understanding the difference between a couple of terms. In this case, however, understanding the distinction between data and evidence genuinely changes the way people think and see things in organisations. This is why evidence-based practice has so much going for it: it does more than just improve practice.




Before we get to understanding what data is, think about this: data is data, right? Data is either right or wrong. Now imagine someone who believes data = facts, and facts are either right or wrong. What does this person think the truth is? Usually, for these people, the truth is just a collection of facts: collect enough facts and you get to the truth, the right way of seeing something.

This view assumes that how the data was constructed is also ‘correct’: that data just exists, immutable and unchangeable. The data is the data. You can’t challenge it.

So let’s look at some hypothetical organisational data. Let’s take something that looks like solid data, a ‘fact’ in fact: an organisation’s sickness data. The data for this organisation is:

- average days sickness per person last year: 3.2 days
- average days sickness per person the previous year: 2.9 days

On the face of it, there isn’t much to question here. The facts are the facts, the data are the data. Sickness is going up from 2.9 days per person to 3.2 days.

Because sickness is increasing we need to do something….


Not so fast

What if we find out that two years ago someone was only counted as sick if they didn’t turn up for work at all, so someone who came in first thing and later went home ill wasn’t counted, but this year HR decided to count people leaving work early as well? Or what if this year’s figures included people who were ill (had a cold or a headache) but stayed at work?
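To make this concrete, here is a minimal sketch, using entirely made-up absence events and a hypothetical two-person workforce, of how the same underlying absences produce different averages under an old and a new counting rule:

```python
# Hypothetical absence events: (full_days_absent, left_early, worked_while_ill)
events = [
    (2, False, False),  # off sick for 2 full days
    (0, True,  False),  # came in, then went home sick
    (0, False, True),   # stayed at work with a cold
    (3, False, False),  # off sick for 3 full days
]
employees = 2  # tiny illustrative workforce

def old_rule(event):
    """Only full days away from work count as sickness."""
    return event[0]

def new_rule(event):
    """Full days, plus one day for leaving early or working while ill."""
    days, left_early, worked_ill = event
    return days + (1 if left_early or worked_ill else 0)

old_avg = sum(old_rule(e) for e in events) / employees
new_avg = sum(new_rule(e) for e in events) / employees
print(old_avg)  # 2.5 days per person under the old rule
print(new_avg)  # 3.5 days per person under the new rule
```

Nothing about anyone's actual health changed between the two numbers; only the recording rule did.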

Hypothesis and evidence

What if not every manager records sickness in the same way, because they are busy, or because they know someone is having problems and want to cut them some slack? For example, Jamie is so busy he never records absence, while Flo, another manager, records even the slightest absence, including someone who goes for a walk to clear their head. Or they simply forget. As you can see, there are many, many ways variability is likely to creep into any data.

So data is just data, and on its own it has no guaranteed validity or reliability (the two tests for good data: validity means it measures or describes what it claims to; reliability means it measures or describes it consistently). There is no such thing as 100% validity or reliability. There is always variation in any measurement or description, and these variations can stack up: because of variation in components, two apparently identical aircraft built next to each other at the same time can differ in weight by as much as 10–20% due to the compound effect of the cumulative variation.

In other words, a figure, a number or a set of figures tells us very little about the context, or about whether what it measures is actually moving or changing. Data is just a number or an observation; it has no context or real meaning on its own. ‘24’ is data. ‘I am on the floor’ is data. ‘It is 5 am’ is data. On their own these data have little meaning and, as we have seen, without understanding the context (how the data was collected, what data was actually collected and, vitally, what it is going to be used for), the data has little meaning. Data only becomes right or wrong in context.


So whilst data can exist on its own, even though it is essentially meaningless without context, evidence, on the other hand, has to be evidence of or for something.

‘Evidence of or for something’

Evidence only exists when there is an opinion, a viewpoint or an argument. Academics and scientists call this the hypothesis. So if I am of the opinion that sickness has increased over the last 2 years in my organisation the question now becomes what is the evidence for my opinion? Well, that may well be the data I collected. So data only becomes evidence when there is an argument, a hypothesis or an opinion.

But surely that means data is evidence…. and why does this matter?

Does it matter?


Data is only evidence in the presence of an opinion or argument; otherwise it is just data and has no meaning on its own. The problem, and the reason this matters, is that the moment you form an opinion you start to select which data you are going to use to support your argument.

For example, if you are of the opinion that sickness is increasing, you most likely aren’t going to include data about how many goldfish are kept as pets in your town. You are, however, likely to use the data on a system showing how many sick days have been taken over the last, say, two years.

A test of evidence-based practice

So using these data we see:

- Last year there was an average of 3.2 sick days per employee, and
- the year before there were 2.9 sick days per employee.


So does this support your opinion that sickness is increasing?

On the face of it, someone might come to this conclusion. Sickness is increasing.

But what if I told you that one employee has been receiving cancer treatment and as a result has racked up about 150 days sickness last year? How might that have affected the figures?
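As a rough illustration (the workforce size here is an assumption, not the organisation's real headcount), a single long-term absence can move the average substantially while the typical employee's figure is unchanged:

```python
# Hypothetical workforce: 99 employees with 2 sick days each,
# plus one employee with ~150 days due to long-term treatment.
workforce = 100
sick_days = [2] * 99 + [150]

mean_days = sum(sick_days) / workforce            # (99*2 + 150) / 100 = 3.48
median_days = sorted(sick_days)[workforce // 2]   # middle value is still 2

print(mean_days)    # 3.48 - the average suggests sickness has jumped
print(median_days)  # 2    - the typical employee hasn't changed at all
```

One outlier accounts for the entire apparent "increase"; the question is whether the data supports the opinion, or the opinion is shaping how we read the data.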

Evidence has perspective – data doesn’t.


The crux of the data v evidence question is this: what happens if you look at the sickness rates over a ten- and twenty-year period?

If we do that we find that:

20 years ago the average sickness was 15 days per person per year.

Now what is your opinion/hypothesis/argument? Is sickness increasing or decreasing? (last year it was 3.2 days on average).

What if we found that 10 years ago the sickness rate was 1.5 days a year? Now what?

Is it increasing or decreasing?

Knowing all this now significantly changes your opinion and any actions you might take or propose.
Is sickness going up or down given all these data?

Or is there just some random natural variation in the data? You know, stuff happens: sometimes people just get sick, there is no real pattern, and it only looks like a pattern when seen through the narrow window of two years of data.
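A quick simulation makes the point (the distribution of sick days here is invented purely for illustration): even when every year is drawn from exactly the same underlying distribution, with no trend at all, the yearly averages drift, so any two adjacent years can look like an increase or a decrease.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible
workforce = 100

def yearly_average():
    # Each employee's sick days drawn from the SAME distribution every year:
    # no underlying trend exists by construction.
    return sum(random.choice([0, 0, 1, 2, 3, 5, 8]) for _ in range(workforce)) / workforce

averages = [yearly_average() for _ in range(10)]
print([round(a, 2) for a in averages])
# The "trend" between any two adjacent years is pure sampling noise.
```

Comparing just two of those years in isolation would often "show" sickness rising or falling, when in fact nothing has changed.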

Knowing the difference between data v evidence is very important, particularly for evidence-based practice.


When looking at the difference between data v evidence we find that:

Data is just data and has no intrinsic meaning on its own.

Evidence has to be evidence for or of something; an argument, an opinion, a viewpoint or a hypothesis.

The evidence you use depends on your argument. As we get more evidence or different types of evidence our argument might change.


In my next Thursday blog I will take this a bit further and look at the effects that believing in facts has in evidence-based organisations.



Disclaimer: This is a research review, expert interpretation and briefing. As such it contains other studies, expert comment and practitioner advice. It is not a copy of the original study – which is referenced. The original study should be consulted and referenced in all cases. This research briefing is for informational and educational purposes only. We do not accept any liability for the use to which this review and briefing is put, or for its accuracy, reliability or validity. This briefing is an original work in its own right and is copyright © Oxcognita LLC 2024. Any use made of this briefing is entirely at your own risk.


David Wilkinson

David Wilkinson is the Editor-in-Chief of the Oxford Review. He is also acknowledged to be one of the world's leading experts in dealing with ambiguity and uncertainty and developing emotional resilience. David teaches and conducts research at a number of universities including the University of Oxford, Medical Sciences Division, Cardiff University, Oxford Brookes University School of Business and many more. He has worked with many organisations as a consultant and executive coach, including Schroders, where he coaches and runs their leadership and management programmes, Royal Mail, Aimia, Hyundai, the RAF, the Pentagon, and the governments of the UK, US, Saudi Arabia, Oman and Yemen, for example. In 2010 he developed the world's first and only model and programme for developing emotional resilience across entire populations and organisations, which has since become known as the Fear to Flow model and is the subject of his next book. In 2012 he drove a 1973 VW across six countries in Southern Africa whilst collecting money for charity and conducting on-the-ground charity work, including developing emotional literacy in children and orphans in Africa, among a number of other activities. He is the author of The Ambiguity Advantage: What great leaders are great at, published by Palgrave Macmillan.
