I have been grumping for at least the last 25 years about how little of the evidence that is developed by academic researchers, practitioner-researchers, consultants and others is effectively deployed in developing public policy and practice. We intuitively know, and politicians regularly proclaim, that evidence should inform policy. So why did it take over a decade to move from Dr. Barry Marshall vomiting in his lab, having taken the extreme measure of drinking an H. pylori bacteria cocktail to prove that this bug causes stomach ulcers (which can then be cured by antibiotics), to those antibiotics being routinely used in medical practice? And why is the Troubled Families Programme so unusual among the plethora of government mega-initiatives, simply by having a robust evaluation published showing where it works (and where not)?
A lot of the academic research in this area points to deficiencies in what you might call the “supply side” of evidence. Academic research is too slow, and therefore out of date before its results are seen. Journal articles are hidden behind expensive paywalls, written in an opaque language, packed with caveats, and conclude mainly that we need more research. Academics sometimes don’t know how to talk with policy makers, or even who they are.
There is truth in most of these points, and there has been some useful progress in addressing many of them in recent years, for example the very readable summaries of available evidence published by some What Works Centres on topics such as local economic growth and children’s social care. And it’s an immense privilege to have recently joined INLOGOV at the University of Birmingham, a department with a vast network of alumni across local government, and academics who are well used to speaking the sector’s language.
But I’m increasingly feeling that the real issue is on the “demand” side. Do we as practitioners and politicians really want to know what the evidence is telling us? What if the evidence highlights issues we’d rather not know about? How do we know evidence when we see it, and what if it is contradictory? Furthermore, how do we find this “evidence” anyway, and how can we know what’s reliable and what’s fake news? With American oil companies allegedly giving millions to fund research questioning climate change, who can we trust? Finally, how can we conduct small-scale local research – so important when trying to understand local difference – that provides geographically relevant evidence without being accused of providing limited and unreliable findings?
I’ve been involved as LARIA, the LGA, SOLACE and others have run some excellent programmes to support practitioners and policy makers in making use of evidence.
One of my favourites was the excellent “Evidence Masterclass” organised by SOLACE, which provided a one-day familiarisation course for chief executives. At the other extreme, universities provide extensive MSc courses on research methods for a variety of public health and social scientists. But not many of us can devote years to understanding how research works and can be used, and there’s a limit to what anyone can learn in a single day.
So in my new role as director of the Institute for Local Government Studies at the University of Birmingham I’ve been excited to help deliver our new “Public Policy and Evidence” module within our Public Management and Leadership MSc and Degree Apprenticeship programmes. This is a six-week module, involving around five hours distance learning a week followed by a two-day on-campus session, currently being taken by forty senior local public service managers from a number of English local authorities and the LGA.
It’s been fascinating to see these managers think through how evidence relates to their work areas, explore how rigorous research works and the different ways it can inform policy making and service design, and get to grips with the detail of the various techniques social science researchers use. We’re now moving to the on-campus days, where we’ll be looking at several live examples from local public services in Birmingham, Coventry and Manchester, and teeing up a significant practical supervised research project that each will undertake in their home organisation over the next several months.
It’s exciting to see the improvements in the “demand” and “supply” sides of evidence-informed policy making that are being delivered through this course and initiatives such as the What Works Centres and local Offices for Data Analytics. Who knows? In the decade or so before I retire, I may even be able to stop grumping about evidence-based policy.
This article was originally published in the newsletter of the Local Area Research and Intelligence Association (LARIA).
Jason Lowther is the Director of INLOGOV. His research focuses on public service reform and the use of “evidence” by public agencies. Previously he led Birmingham City Council’s corporate strategy function, worked for the Audit Commission as national value for money lead, for HSBC in credit and risk management, and for the Metropolitan Police as an internal management consultant. He tweets as @jasonlowther.