What works in learning what works?

Jason Lowther

I have been grumping for at least the last 25 years about how little of the evidence developed by academic researchers, practitioner-researchers, consultants and others is effectively deployed in developing public policy and practice. We intuitively know, and politicians regularly proclaim, that evidence should inform policy. So why did it take over a decade to move from Dr. Barry Marshall vomiting in his lab, having taken the extreme measure of drinking an H. pylori bacteria cocktail to prove that this bug causes stomach ulcers (which can then be cured by antibiotics), to those antibiotics being routinely used in medical practice? And why is the Troubled Families Programme so unusual among the plethora of government mega-initiatives, simply for having a robust published evaluation showing where it works (and where it doesn’t)?

A lot of the academic research in this area points to deficiencies in what you might call the “supply side” of evidence. Academic research is too slow, and therefore out of date before its results are seen. Journal articles are hidden behind expensive paywalls, written in opaque language, packed with caveats, and conclude mainly that we need more research. Academics sometimes don’t know how to talk with policy makers, or even who they are.

There is truth in most of these points, and there has been some useful progress in addressing many of them in recent years, for example the very readable summaries of available evidence published by some What Works Centres on topics such as local economic growth and children’s social care. And it’s an immense privilege to have recently joined INLOGOV at the University of Birmingham, a department with a vast network of alumni across local government, and academics who are well used to speaking the sector’s language.

But I’m increasingly feeling that the real issue is on the “demand” side. Do we as practitioners and politicians really want to know what the evidence is telling us? What if the evidence highlights issues we’d rather not know about? How do we recognise evidence when we see it, and what if it is contradictory? Furthermore, how do we find this “evidence” anyway, and how can we know what’s reliable and what’s fake news? With American oil companies allegedly giving millions to fund research questioning climate change, who can we trust? Finally, how can we conduct small-scale local research – so important when trying to understand local difference – that provides geographically relevant evidence without being accused of providing limited and unreliable findings?
I’ve been involved as LARIA, the LGA, SOLACE and others have run some excellent programmes to support practitioners and policy makers in making use of evidence.

One of my favourites was the excellent “Evidence Masterclass” organised by SOLACE, which provided a one-day familiarisation course for chief executives. At the other extreme, universities provide extensive MSc courses on research methods for a variety of public health and social scientists. But not many of us can devote years to understanding how research works and can be used, and there’s a limit to what anyone can learn in a single day.

So in my new role as director of the Institute for Local Government Studies at the University of Birmingham, I’ve been excited to help deliver our new “Public Policy and Evidence” module within our Public Management and Leadership MSc and Degree Apprenticeship programmes. This is a six-week module, involving around five hours of distance learning a week followed by a two-day on-campus session, currently being taken by forty senior local public service managers from a number of English local authorities and the LGA.

It’s been fascinating to see these managers think through how evidence relates to their work areas, explore how rigorous research works and the different ways it can inform policy making and service design, and get to grips with the detail of the various techniques social science researchers use. We’re now moving to the on-campus days, where we’ll be looking at several live examples from local public services in Birmingham, Coventry and Manchester, and teeing up a significant practical supervised research project that each manager will undertake in their home organisation over the next several months.

It’s exciting to see the improvements in the “demand” and “supply” sides of evidence-informed policy making that are being delivered through this course and initiatives such as the What Works Centres and local Offices for Data Analytics. Who knows, in the decade or so before I retire I may even be able to stop grumping about evidence-based policy.

This article was originally published in the newsletter of the Local Area Research and Intelligence Association (LARIA).


Jason Lowther is the Director of INLOGOV. His research focuses on public service reform and the use of “evidence” by public agencies.  Previously he led Birmingham City Council’s corporate strategy function, worked for the Audit Commission as national value for money lead, for HSBC in credit and risk management, and for the Metropolitan Police as an internal management consultant. He tweets as @jasonlowther

Unlocking research for local government

Jason Lowther


This post originally appeared on the Solace website.

Local government needs evidence, from the apparently mundane but nonetheless critical (‘What choice of cladding will minimise the risk of fire spreading?’) to extraordinary insights (‘How do people choose what to eat and whether to be active?’, ‘What skills will today’s youngsters need in the jobs market of 2050?’).

In 2014, Solace commissioned an initial Local Government Knowledge Navigator (LGKN) report, From Analysis to Action: Connecting Research and Local Government in an Age of Austerity, which demonstrated that:

1. Councils have a wide range of evidence needs;
2. There is relevant research and expertise in academia but local government doesn’t make the most of this;
3. There are some impressive examples of collaboration between academia and local authorities but engagement is inconsistent, and often depends on existing links between individual researchers and local government officers or politicians; and
4. There is a need for a change of culture in both communities, and the development of more systematic approaches to achieving connectivity between them.

The key issues identified around local authorities’ approach to successful engagement were:

– senior appreciation of and support for research evidence;
– experience of using research and data to inform decision-making;
– consortia, to spread the cost and reduce risks to reputation;
– support from brokers with the expertise and time to develop proposals;
– the ability and skills to successfully commission research (or access to them); and
– local authority research teams and service managers establishing relationships with local universities.

Following the LGKN work, Solace continued working with the ESRC and LGA through the Research Facilitator, and has established dedicated spokespeople on evidence-based policy.  Recently, it supported the Centre for Public Scrutiny and Nesta in producing the document “Using evidence: A practice guide for local government scrutiny” which launched last month, and which aims to help local government make better use of research evidence.

Recent research has highlighted that local government has particular ways of looking at research that differ from much of academia (including the traditional approaches of public health colleagues).  Local government recognises the importance of ‘place’, and the uniqueness of each area’s situation and background.  As a result, we are especially interested in evidence, and expertise, that relates directly to our place, and this can come across as only being interested in evidence that is ‘home grown’.

When Gemma Phillips and Judith Green recently looked at the transition of public health from the NHS to local government, they found that this different culture reflects local government’s more holistic view of health and wellbeing (rather than healthcare services), and our focus on practicality (rather than the provenance and methodological rigour of research studies).

Austerity has meant less government spending on research and evaluation, particularly at local level, although in 2015/16 national government still invested £5.6 billion in science and research, including £178m through the ESRC alone.  So if we want research that government (including local government) will practically use, we probably have to get smarter in terms of more targeted funding, and in particular by presenting local government as a key solution to the ‘Impact’ agenda which is vital to universities’ research funding.

This suggests to me that local government needs to take a much more active role in influencing the research agenda locally.  It’s not enough to rely on a kind of ‘Brownian motion’ in the hope that academics’ research interests will in some way coincide with the policy priorities of local government.  We need to let academics know what our policy priorities are, and to listen to them as they explain what is already known in the relevant fields, and how further research might help us address these priorities.

In the West Midlands, as part of a comprehensive partnership with local universities, the Combined Authority this week set out a clear agenda for research related to its policy priorities for the next three years.  Developed from its Strategic Economic Plan, this includes both economic and social (public service reform) policy priorities, and further development of information sharing and the use of evaluation.  The WMCA ‘Policy Research Plan’ has been developed with input from policy leads and academic experts identified across the local universities and agencies, who will now take forward the agreed activities in a common programme.

So, for example, around ‘connected autonomous vehicles’ we are interested in exploring how emerging technologies can be exploited to improve transport accessibility and reduce subsidy costs whilst supporting enhanced network performance.  Around ‘vulnerable offender pathways’, we need to understand where regional working can add most value, together with the offence profile and pathways for specific groups of offenders, such as young people and women.

Developing the Plan has compelled policy leads to be much more explicit about the questions they need answering in order to take forward the policy priorities, and has enabled academic experts to engage in developing these into researchable questions.  As this engagement continues, we expect further synergies to develop, giving us much more robust and ‘actionable’ research in future.

References

Local Government Knowledge Navigator reports:

http://www.solace.org.uk/knowledge/reports_guides/LGKN_Analysis_to_Action.pdf

http://www.solace.org.uk/knowledge/reports_guides/LGKN_LA_research_collaboration.pdf

Phillips, Gemma, and Judith Green (2015) “Working for the public health: politics, localism and epistemologies of practice”, Sociology of Health & Illness, 37(4): 491-505.

Using evidence: A practice guide for local government scrutiny (Centre for Public Scrutiny and Nesta).

WMCA Policy Research Plan:

https://governance.wmca.org.uk/documents/s287/Appendix.pdf


Jason Lowther is a senior fellow at INLOGOV. His research focuses on public service reform and the use of “evidence” by public agencies.  Previously he led Birmingham City Council’s corporate strategy function, worked for the Audit Commission as national value for money lead, for HSBC in credit and risk management, and for the Metropolitan Police as an internal management consultant. He tweets as @jasonlowther

HS2: the importance of evidence

Rebecca O’Neill

Large infrastructure planning projects are often met with controversy and debate, partly because of the risks involved and the conflicting views amongst actors. One such project is the proposed high-speed railway linking London with Birmingham, the North of England and potentially Scotland, better known as High Speed Two (HS2). After the project received an amber-red rating in May in the Major Projects Authority (MPA) annual report, there is every reason for people to be concerned. An amber-red project is defined as follows:

‘Successful delivery of the project is in doubt, with major risks or issues apparent in a number of key areas. Urgent action is needed to ensure these are addressed, and to assess whether resolution is feasible.’

So the questions that must be asked are: what evidence supports the project, and how should we analyse the debate? The evidence in favour of the project is largely based on predictive models and statistical data. One would think that after the financial crisis of 2008 people would not be so quick to base decisions on rational predictive models, or that after the cost overruns and benefit shortfalls of HS1 (the Channel Tunnel Rail Link) supporters of HS2 would be less optimistic in their forecasts. However, advocates believe that the project is both viable and necessary to tackle capacity problems on the West Coast Main Line.

There are a number of ways of analysing the debate. One is through an evidence-based policy making lens. This approach argues that once a policy problem is identified, research evidence will fill the knowledge gap, thus solving the problem. For advocates of evidence-based policy making, ‘the task of the researcher is to make accurate observations about objective reality, ensuring that error and bias are eliminated by isolating variables in order to be able to identify cause-effect relationships’. These experimental methods usually take the form of statistical analysis and rely heavily on quantitative data. So evidence must be about ‘facts’ that tend to prove or disprove a conclusion. Evidence-based policy making rests on the underlying positivist assumption that a value-free science is possible. It assumes that there is an objective truth ‘out there’ and that if researchers adopt a certain approach they will find the answer to the wicked issues and social problems we are facing.

If we utilise the evidence-based policy making approach, then we must come to these conclusions:

  • The actors within HS2 are rational actors who have systematically collected scientific, rigorous evidence to support their claims, and their decisions are rational and value-free.
  • If there is a conflict of evidence, this is because the actors have not behaved rationally, because they have allowed emotions and values to cloud their decisions, or because the evidence has flaws in terms of quality and methodology.
  • Those opposed to the project have an argument based on ideology and on less systematic and rigorous evidence.

However, I propose (along with many others) that the policy process is messy, that actors are rarely rational, that evidence is not necessarily ‘out there’ waiting to be found, and that it is wrong to assume more information will provide policy makers with the solution. The policy process is better viewed as an arena in which actors present claims and attempt to persuade their audience that these claims are true through the presentation of evidence and argument. The claims made by actors within the process rest on a variety of evidence, ranging from personal opinion to rigorous scientific research. A good claims-maker will have mastered the art of appealing to a range of audiences, shaping and presenting their evidence in the way that best suits each audience. The concept of evidence-based policy making does not acknowledge the role of humans in this sense.

In the case of HS2, advocates claimed that the West Coast Main Line (WCML) was almost at full capacity, and that the UK needed to modernise its railway infrastructure. They did not simply claim that the project was the right thing to do; rather, they tapped into existing discourses within society, such as modernisation and economic growth. The claims-making framework enables us to explain why unfounded anecdotes can easily override rigorous scientific effort and investment. It also explains why some evidence is accepted over other evidence.

For a long time, supporters of the project dismissed counter-claims and evidence, arguing that ‘NIMBYs’ were being selfish, that the project was for the greater good and that opponents were blocking much-needed modernisation. However, more and more people are questioning the claims presented by HS2 Limited and its supporters.  In practice, the philosophy of ‘what works’ often takes second place to what Russell and Greenhalgh describe as ‘experiential evidence, much of which was in the form of anecdotes or generalisations based on a person’s accumulated wisdom about the topic’. Claims-making theory therefore provides a robust theoretical framework for examining the process by which claims are made, received, denied through counter-claims, and reshaped. It also illustrates how claims, and those who make them, interact to formulate public policy.


Rebecca O’Neill is a doctoral student looking at the role of evidence within High Speed Two. She has an interest in the conceptualisation of evidence, evidence-based policy making, the claims-making framework and interpretive approaches to research.