Jason Lowther
Following the previous blog on homelessness and rough sleeping, this piece turns to another major area of local government activity: local growth and skills programmes. Here too, evaluation activity has expanded rapidly, with a mix of national frameworks, programme‑level syntheses and place‑based studies. Taken together, these evaluations offer a valuable, and still evolving, picture of what is working, what is proving harder, and what local systems actually need to deliver economic outcomes.
Four strands of evidence stand out.
MHCLG local growth evaluation programme
The MHCLG local growth evaluation programme is significant not just for its findings, but for its approach to evaluation itself. Rather than focusing on single programmes, it introduces a portfolio‑level strategy covering multiple funds aimed at improving sub‑national economic performance.
Recent work, including the process evaluation of the Local Growth Fund and the Getting Building Fund, highlights both strengths and tensions in the model. Decentralised decision‑making and the “single pot” approach enabled locally tailored investment and stronger alignment with local strategies. Private sector involvement and local prioritisation were widely valued. However, delivery was shaped by pressure to bring forward “shovel‑ready” projects quickly, particularly in the Getting Building Fund, which sometimes limited strategic coherence and innovation. Governance arrangements, while locally responsive, were often complex, and approaches to monitoring and evaluation were variable. More broadly, the evaluation underlines the difficulty of measuring long‑term economic impact, particularly where interventions are diverse and outcomes unfold over many years.
Multiply deep dives (Scotland, Wales, Northern Ireland)
The Multiply deep dives bring a skills and employability perspective, focusing on adult numeracy provision across the devolved nations. Multiply was a £559 million UK‑wide programme designed to improve functional numeracy, with flexible, locally designed delivery models.
The deep dives use qualitative case studies, interviews with delivery partners and analysis of monitoring data, focusing on one area in each nation and drawing on wider place‑level evidence. A central finding is that local flexibility enabled innovation, particularly in embedding numeracy in real‑world contexts such as employment, parenting or financial capability.
At the same time, the evaluations highlight familiar delivery challenges. Short delivery timescales, in some cases just a year, created pressure to scale quickly, often leading to adaptation of existing provision rather than genuinely new approaches. Partnership working across councils, colleges and the voluntary sector was essential but time‑consuming to establish. Engagement with target groups remained difficult, particularly where low confidence rather than low skill was the primary barrier.
Overall, the evidence suggests that contextualised, learner‑centred approaches are promising, but require time, trust and sustained funding to embed.
UK Shared Prosperity Fund (UKSPF) interim evaluation synthesis
The UKSPF interim synthesis report provides perhaps the most comprehensive current view, drawing together 34 place‑based evaluations across the UK. It focuses on process learning rather than impact, reflecting the relatively early stage of delivery.
A clear headline is the importance of local autonomy. Across almost all areas, the ability of Lead Local Authorities to design interventions around local needs was strongly valued, particularly compared to the perceived rigidity of previous EU funds. This flexibility supported alignment with local strategies, more responsive delivery, and better integration across policy areas.
Other success factors included strong local programme management teams, continuity of provision (using UKSPF to sustain previously funded services), and the ability to combine funding streams to create coherent local offers. However, challenges were equally consistent. Tight central government timelines constrained planning and procurement, limited consultation, and created recruitment difficulties. As with other programmes, evaluation and outcome measurement remained underdeveloped.
The synthesis highlights a key tension: local freedom within central constraints. While devolution of decision‑making was real, the operating environment still imposed significant limits on what places could achieve.
UKSPF place‑based evaluations
The place‑based evaluations add depth to this picture by examining how UKSPF worked in specific localities. Using mixed‑methods approaches – including contribution analysis, surveys, interviews and case studies – across 34 areas, they explore how combinations of interventions interact within local systems.
These studies show that outcomes are highly context‑dependent. In some areas, UKSPF supported visible improvements in community facilities, local business support, and employability outcomes. In others, impacts were harder to detect, reflecting both the early stage of delivery and the complexity of local economies. What emerges clearly is that programme success depends less on individual projects than on how they are aligned and sequenced locally.
The evaluations also reinforce the importance of existing capacity and partnerships. Areas with mature governance arrangements, strong voluntary sector links, and prior experience of managing regeneration funding were better able to mobilise quickly and deliver coherent programmes.
What does this mean for local authorities?
Across these evaluations, several consistent lessons emerge.
First, local flexibility works, particularly when supported by capacity and stability. Both UKSPF and Multiply demonstrate the value of devolved decision‑making. However, the benefits are uneven, depending on local capability, existing partnerships, and the time available to plan and deliver.
Second, time is the missing ingredient in local growth policy. Tight delivery timescales appear across all programmes, driving a focus on “shovel‑ready” activity, limiting innovation, and constraining partnership development. Economic change, skills development and behaviour change all take longer than funding cycles typically allow.
Third, integration matters more than individual interventions. The strongest evidence, particularly from the place‑based evaluations, is that impact depends on how interventions fit together. Skills, business support and community investment are interdependent, yet funding streams and evaluation frameworks often treat them separately.
Fourth, measurement remains a weak spot. Across the local growth portfolio, there are persistent challenges in demonstrating impact and value for money. This is partly methodological, but also reflects the reality that many outcomes (productivity, employment, resilience) are long‑term and influenced by wider factors.
Finally, these evaluations underline a familiar but important point: local systems deliver national priorities. Where programmes align with local strategies, build on existing partnerships and allow room for adaptation, they show promise. Where they are constrained by short timescales, fragmented funding or complex governance, delivery becomes more transactional.
The conclusions from the local growth and skills evaluations strongly align with, and are reinforced by, last month’s excellent report from the Institute for Government, Designing and delivering employment support. The IfG goes further in diagnosing why these issues persist and what structural reform is needed. Both emphasise the value of local flexibility, integration and tailoring to place, with the IfG explicitly arguing that strategic authorities are best placed to design joined‑up employment support aligned to local labour markets and services. Likewise, both bodies of evidence highlight fragmentation and poor coordination across programmes as major barriers. The IfG notes longstanding failures to “shift the dial” despite multiple national schemes, echoing the local growth evaluations on disjointed funding and siloed interventions, and it places significant emphasis on the limits of centralised systems and the need for multi‑year funding, capability and accountability frameworks.
In short, the local growth evaluations provide grounded evidence of what works in practice, while the IfG report offers a more explicit systems diagnosis: that without sustained devolution, integration and long‑term investment, the conditions needed for those “what works” approaches to succeed will remain constrained.
