Is evidence like petrol?
This was the unexpected question I was left with after our anniversary event on Monday 21st March.
It was prompted by something one of our speakers, Professor Jon Shepherd, said about the ‘users’ of evidence: decision makers just want to be told how best to achieve what they want to do. Like car users putting petrol in their cars, they don’t want to have to question whether it’s going to work.
Professor Shepherd’s analogy will be familiar to those who have seen the work he did for the Cabinet Office. Using the petrochemical industry as an analogy, he categorises the different bodies involved in the ‘evidence industry’ according to their role in the generation, distribution, storage, synthesis and consumption of evidence. Like petrol, evidence needs to be generated, refined and distributed to end users.
But is it right to assume that the end user can put evidence to use unquestioningly?
Another of our speakers, Professor Chris Taylor, talked about the Sutton Trust / EEF toolkit, which is widely held to be the gold standard for synthesising and disseminating evidence. The top-rated intervention, in terms of its potential impact on a pupil’s attainment, is providing feedback. As he highlighted, however, the toolkit makes clear that feedback can also have a negative impact. Local judgement and careful implementation will be needed if the desired outcome is to be achieved.
We also discussed the importance of context. Victoria Winckler made a passionate case for increasing our understanding of the lived experience of those in poverty, pointing out that what works for someone in their 20s won’t necessarily work for someone in their 80s. You could extend this to say, for example, that successful interventions in an urban environment might not transfer to rural contexts.
An audience member raised the question of objectives – policy makers have multiple objectives, which are not always complementary. Take one example given: schools in London have some of the best results, but their pupils have some of the lowest scores when it comes to measures of wellbeing. If schools are asked to promote wellbeing and improve exam results, what happens if evidence suggests that addressing one comes at the expense of the other?
So evidence is not like petrol – it’s complex and contested, needs to be tailored according to context, and before it can be used, you need to be clear about what you are trying to achieve.
Nonetheless, the point behind Jon’s analogy stands. Policy makers (particularly politicians) and practitioners want to be told what works. Faced with a decision, they don’t want to be told “it’s complicated…”.
Our experience has shown that this is part of the reason why there is a need for organisations (like the PPIW) that link policy makers with evidence. As Professor Laura McAllister said, you need someone to help shape the question so that it’s answerable. But it is also about finding the right sources of expertise, and playing a role in ‘refining’ the evidence. If you succeed in doing both, then you can use evidence to inform a decision.
This is only part of the solution, however. Across our portfolio of work we are increasingly seeing the importance of practice. Successful policy often depends on successful practice. As Chris’s example of using feedback shows, practitioners need to be intelligent users of evidence. We need to understand better how to foster this.
About the author: Dan Bristow is the Deputy Director of PPIW. He has particular responsibility for overseeing the assignments requested by Ministers and has experience of working in Government and the third sector. Prior to joining the Institute, he was a Senior Policy Advisor in the Cabinet Office and worked in the Treasury, Home Office, Sustrans and Independent Police Complaints Commission.