Jon Bright
One of the most significant problems in public policy has been the persistent failure to draw on evidence of “what works”—and, perhaps more crucially, what doesn’t.
Despite a growing interest in evidence-based policy, we still have a long way to go in identifying and scaling up successful practice. Take, for example, the findings from Nesta’s 2013 report, which revealed that only three out of seventy programmes implemented by the Department for Education were well evaluated (1). Sadly, this gap in evidence was not confined to education and still applies today (2).
Public sector managers need to know what works, what doesn’t and where they should experiment intelligently. However, until the late 1990s, there was little emphasis on evidence as a basis for policy and we haven’t moved as fast as we should have since then. As a result, we have been slow to innovate, evaluate, and scale up new ideas that add value.
What have been the consequences?
That’s not to say there haven’t been some stellar examples of innovation. But these have usually been down to exceptional people or circumstances. In most public organisations, knowledge of best practice is either lacking or hard to access. Public sector managers, particularly outside of professional disciplines, often lack the skills to assess evidence or adapt successful policies to different contexts. In some cases, even when they are aware of evidence, politicians may override advice in favour of projects shaped by political pressure, ideology, or personal interest.
As a result, we keep reinventing policies rather than refining and improving them over the longer term. This makes it much harder to tackle persistent social problems. What’s worse, some policies have been introduced despite evidence that they probably wouldn’t work (3). And even when successful programmes are found, we struggle to replicate or scale them up in different contexts.
The Challenges of Policy Transfer and Scaling
This is the core of the problem. A good example is the attempt by English police forces in the early 2000s to replicate a successful gang violence reduction programme from the US. Unfortunately, they ignored the detail underpinning the most important components of the US model and the results were largely unsuccessful (4).
In contrast, Strathclyde Police in Scotland carefully adapted the model and successfully reduced gang-related violence. Between 2004 and 2017, the murder rate in Strathclyde halved, and the rate of knife crime dropped by 65%. This example underscores the importance of understanding not only what works, but why it works and how it can be adapted to local contexts (5).
Scaling up successful interventions presents additional difficulties. Long-term success depends on increased funding, which is rarely guaranteed. Family Nurse Partnerships (FNPs), for example, have been shown to be effective but have only benefited a small fraction of eligible children in the UK, despite their positive impact on school readiness and early education outcomes (6). There must be a better way.
Why Is This Still a Problem?
There are several reasons why doing ‘what works’ is a difficult nut to crack, not least of which is the political environment in which decisions are made. Politicians may also reject evidence-based proposals for understandable reasons: cost, public opposition or concern about how they will land with colleagues and the media. Sometimes the timing’s just not right.
Moreover, public sector organisations are often risk-averse. Innovation requires a supportive culture, special funding, expertise, and incentives—elements that are frequently absent. On the plus side, the requirement to produce a business case for new policies does encourage the search for evidence.
The most common objection to evidence-based policy is that we often don’t have the evidence. I deal with this below.
Finally, until recently, there have been too few organisations charged with bringing evidence to decision-makers.
The What Works Centres
The good news is there has been some progress. Ten independent ‘what works’ centres have been set up in recent years to provide evidence-based guidance to policymakers. These centres, covering areas such as health, education, crime, homelessness, ageing and children’s social care, help to bridge the gap between research and practice (7). Their role is to provide unbiased, rigorous, and practical advice to help public services become more effective (8).
However, the work is far from complete. While the centres have made significant strides, there is no agreed, systematic way of incorporating ‘what works’ into the development of policy and delivery of services. Additionally, there has been no independent review of the centres’ overall impact on public policy in the 10+ years since they were founded.
What next for What Works?
The Centre for Public Impact (CPI) argues that a lot of evidence simply isn’t robust enough as the sole basis for social policy (9). It suggests we should use the term ‘evidence-informed’ alongside ‘evidence-based’ and proposes a combination of evidence, expertise, and experience as the best bet for designing policies that will work in most places.
To progress the evidence-based policy agenda, five points need to be addressed:
- Government Commitment: Government should invest more in research and development. While private companies like Volkswagen allocate a substantial portion of their budget to R&D, most government departments spend less than 1%. Senior civil servants must also be better equipped to understand and apply evidence-based policies (10).
- Local Government Involvement: Much of the ‘what works’ conversation takes place at the national level. Local government and civil society must be more involved to ensure better policy and bigger impact. The Welsh Centre for Public Policy is considered effective largely because of its close working relationship with the devolved government.
- The Limits of Evidence: Often, evidence is incomplete or not easily applied to specific contexts. Furthermore, while the Centres are good at synthesising evidence, they don’t take account of the politics of policy making. Local policy makers often query the relevance of evidence when it doesn’t address their main policy questions (11). Evidence often needs to be combined with professional expertise and local experience to tailor policies to local needs.
- Scaling Up Good Practice: Public sector organisations need better systems for integrating successful new approaches into their mainstream services. This reduces the need for special funding. Similarly, successful programmes should be repackaged in a form that makes them easier to replicate at scale (12).
- Support for Local Managers and Practitioners: User-friendly, evidence-based information is crucial. For example, the Education Endowment Foundation assesses interventions based on evidence strength, cost and impact. This helps schools make good decisions. Other centres also provide ‘what works’ toolkits (13).
During 2024/25, there have been developments in the Network. For example, the Centre for Local Economic Growth has advised local authorities and emphasised tailored interventions that consider local contexts and needs. The Centre for Children’s Social Care has been recommending practices to improve outcomes for children in care. There has been greater collaboration among the Centres including a unified digital platform to disseminate findings. Looking ahead, new centres on climate resilience and digital inclusion are anticipated. The UK government has renewed its funding to the Network.
The ‘what works’ movement is a major step forward in improving public policy. To maximize its impact, its leadership needs to be refreshed, local government and civil society better engaged, and systems created to incorporate successful practice into mainstream services.

Jon Bright is a former civil servant who worked in the Cabinet Office and Department of Communities and Local Government between 1998 and 2014.
References
1. Cited in The What Works Network (2018) The What Works Movement Five Years On, p.15.
2. Mulgan, G. and Puddick, R. (2013) Making Evidence Useful: The Case for New Institutions, National Endowment for Science, Technology and the Arts (NESTA). See also the Institute for Government event in October 2022, ‘What Works’ in Government: 10 Years of Using Evidence to Make Better Policy, at which David Halpern commented that only 8% of a sample of government programmes had evaluation plans in place.
3. Wolchover, N. (2012) Was DARE Effective? Live Science, 27.3.2012; and College of Policing (2015) Scared Straight Programmes, Crime Prevention Toolkit.
4. Knight, G. (2009) How to Really Hug a Hoodie, Prospect, November 2009. See also Tita, G., Riley, J., Ridgeway, G. and Greenwood, P. (2005) Reducing Gun Violence: Operation Ceasefire, National Institute of Justice (USA); and Braga, A., Kennedy, D., Waring, E. and Morrison Piehl, A. (2001) Problem-Oriented Policing, Deterrence, and Youth Violence: An Evaluation of Boston’s Operation Ceasefire, National Institute of Justice.
5. Big Issue (2020) How Scotland’s Violence Reduction Unit Breaks the Cycle of Crime, Big Issue, 11.9.2020; Craston, M. et al. (2020) Process Evaluation of the Violence Reduction Units, Home Office Research Report 116, August 2020; O’Hare, P. (2019) How Scotland Stemmed the Tide of Knife Crime, BBC Scotland news website, 4 March 2019; and Batchelor, S., Armstrong, S. and MacLellan, D. (2019) Taking Stock of Violence in Scotland, Scottish Centre for Crime and Justice Research, August 2019.
6. National Institute for Health Research (2021) Family Nurse Partnership Building Blocks 5–6 study.
7. Gov.UK (2013, updated 2022) What Works Network, Evaluation Task Force. https://www.whatworksnetwork.org.uk/
8. The What Works Network (2018) The What Works Movement Five Years On. See also Breckon, J. and Mulgan, G. (2018) Celebrating Five Years of the UK What Works Centres, NESTA.
9. Snow, T. and Brown, A. (2021) Why Evidence Should Be the Servant, Not the Master of Good Policy, Centre for Public Impact, 10.8.2021.
10. Halpern, D., presentation at the Institute for Government (2022) event, op cit.
11. Private correspondence with Jason Lowther, Head of INLOGOV.
12. Little, M. (2010) Improving Children’s Outcomes Depends on Systemising Evidence-Based Practice…, Demos.
13. Education Endowment Foundation, Teaching and Learning Toolkit: an accessible summary of education evidence. https://educationendowmentfoundation.org.uk/




