Central Government, Evidence and Short-Term Strategies in the Support to Businesses and Local Economic Recovery in the Age of Covid

Tom Collinson

If there is one mantra by which government policy has claimed to live during the COVID-19 crisis, it is that it has been led by, guided by, or following the science. Intended to strike a reassuring tone, the claim to evidence was routinely emphasised by the government as the Prime Minister or a deputy was flanked by a member of SAGE. When questioned by Sky News at the beginning of the crisis about his previous disavowal of experts, the Chancellor of the Duchy of Lancaster, Michael Gove, noted that he had been referring to economists, not to established medical facts; the suggestion being that this time was different and that this science was different.

One article in the New York-based magazine The Atlantic even went so far as to claim that ‘Britain Just Got Pulled Back from the Edge’ as ‘the institutions and positions of state were…clicking into gear.’ While the picture appeared rosy at the start, with the government publishing the scientific advice online and gesturing that it would continue to do so, this tone quickly unravelled as the Guardian reported that non-scientists seemed to be advising government: there was no list of who exactly was in the SAGE group, no (publicly available) minutes of its meetings, and government advice was no longer transparent. Some of this has now changed.

All this has provoked an interesting question about the relationship between science, evidence and data analysis and policy-making in the UK. How does one affect the other? Is it possible to distinguish between various forms of evidence in the policy-making process and make a judgement on which is the most appropriate? To distinguish between mathematical modelling, so-called evidence-based policy-making (which traditionally elevates the role of Randomised Controlled Trials) and place-and-people contextualised policy? Is it possible to have what Kant called a constitutive judgement in public policy? (i.e. a judgement which is not based on any further assumptions, hypothetical conditions or suppositions, such as values, narratives and aesthetics.) For the past decade or so, there has been a growing literature on all of these questions, and the urgency of the current pandemic has enlivened them.

These questions are of increasing interest to academics, journalists and opposition parties in the Anglosphere. With regard to the United Kingdom, the establishment of an ‘Independent SAGE group’ has been indicative of some dissent from the government’s claim to scientific unity.

For local government, these issues have taken on another interesting dimension, one that examines the relationship between governance and the collection and application of evidence in policy responses. In a report on the global picture of city-governments, the OECD has distinguished between two types of evidence-led responses. The first discusses local governments as instruments or ‘implementation vehicles of national measures such as confinement’. The second acknowledges the experimentation of ‘more bottom-up, innovative responses while… building on their unique proximity to citizens.’

Building on this insight, we can begin to describe a temporal framework, adding further detail to the OECD’s report on the times when local governments have been able to articulate their own evidence-based response and when information and decision-making lie more in the hands of central government.

While it is still unclear where we are on the timescale of the virus or of the response to it – which indeed makes the articles in this post preliminary – this framework can be outlined on the basis of the short-, medium- and long-term response to the epidemic. Such an approach is based on how councils themselves are articulating a response (using language such as ‘rescue’, ‘recovery’, ‘rebuild’, or ‘hammer’, ‘dance’ and ‘reconstruction’ to mark distinct phases in their plans).

Categorising policy responses in this way has plenty of precedent in economics. With regard to the economics of a crisis, the same typology has been outlined by Professor Andy Pike, whose presentation to the ‘Major Economic Shocks Workshop’ at the What Works Centre for Local Economic Growth addresses the types of policy responses available for the local economy, businesses, supply chains and labour markets in the three different time periods. The important point here is that short-term responses are direct and contingent on the problem, whereas long-term responses are open-ended and rely on change. Short-term employment issues, for example, are addressed through subsistence allowances, while (re)training and entrepreneurship should be leveraged in the long term. The same applies to supply chains: the short-term goal is to secure capacity and jobs through, say, refinancing, while in the long term diversification and innovation are required.

Focussing on short-term strategies during the current epidemic and lockdown, the measures taken have exhibited the direct qualities that Pike describes. However, these have often been delivered on the basis of decisions and information gathered in the devolved governments and Downing Street. While there have been ongoing efforts by local authorities to assess immediate likely impacts – as seen in Cardiff and the West Midlands – the role of councils has largely been to act as something of a lightning rod (or courier, depending on how you judge their efficiency) for UK government policies. While there has been some contestation from local councils, for example over the early closure of parks, the wider picture has been one of convergence throughout the country in a number of areas of practice, including communication and awareness raising, social distancing, confinement and targeted measures to help vulnerable groups. In many cases, this has been guided by national government regulations and ‘dos and don’ts’ policy responses, financial backing of £3.2bn awarded to councils in England to ensure a continuation of services, as well as some financial restrictions or ring-fencing.

The reliance on central government publications and financial backing has also characterised the support for businesses and economic recovery, where councils are on the front line of delivering policies made primarily in London but also in Cardiff, Edinburgh and Belfast. While there may be some differences between England and the devolved administrations – for example in the administration of business support in Wales and England, or in the degree of discretion councils exercise over business support – the general theme of subsistence pay to employees, business relief and grant funding through councils has taken the same shape throughout the country, as the following examples show:

  • In England, the business relief announced by the Chancellor is being paid out by councils through the Small Business Grants Fund and the Retail, Hospitality and Leisure Grant Fund, with local authorities reimbursed provided the guidance published by the MHCLG is followed.
  • £6 billion of local authority payments of the Central Share of retained business rates, due to be made over the next three months, has been deferred.
  • A £500 million Hardship Fund ‘of new grant funding to support economically vulnerable people and households in their local area’ administered through existing ‘local council tax support schemes’.
  • In Wales, the Welsh Government is offering a year’s relief on business rates to shops, leisure and hospitality businesses, as well as small grants. Local councils are calculated to have distributed £508m to 41,000 businesses by the end of April.
  • In Scotland, Local Authorities are administering Small Business Support Grants as well as Retail, Hospitality, Leisure Support Grants of up to £10,000 and £25,000 respectively.
  • Similarly, in Northern Ireland a grant scheme of £25,000 for Retail, Hospitality, Tourism and Leisure has been offered, provided the criteria outlined by the Northern Ireland Executive are met.

In one of the foundational texts of modern political science, Alexis de Tocqueville describes the governance structure of the ancien régime, in which all administrative corridors in French political life led back to the King. Intendants hired by the King to administer a province in turn hired sub-delegates to administer cantons, where the happiness or misfortune of individuals depended entirely on ‘the whole operation of the central government’. The argument for arranging matters in this centralised manner was a financial one – to levy taxes in order to guarantee the State’s safety. But this ultimately led to the downfall of the regime itself. While I am not comparing the UK Government to the House of Bourbon, modernity offers a number of examples where centralisation – justified on grounds of finance and security – tends towards political and social disintegration. Further examination would do well to determine whether there is a different path forward in the long-run response to this crisis.

 

Tom is a postgraduate researcher with an MA in Political Thought and a BSc in Economics and Politics from the University of Exeter. His main research interests are in modern political thought, with particular expertise in the political philosophy of Hannah Arendt and Karl Jaspers, on whom he wrote his thesis, rethinking the concept of political participation and civic action in modernity. Tom is now researching the role of local and central government responses to the Covid-19 pandemic, inspecting how they complement and contrast one another. He tweets at @tzcll.

Covid-19: Is Government Really “Led By The Science”?

Jason Lowther, Director of the Institute of Local Government Studies, University of Birmingham (not representing the views of the university)

In the midst of the EU Referendum campaign, Michael Gove famously commented that “people in this country have had enough of experts”. No longer. Fast forward four years, and Gove (and every other minister) is sharing press conferences with professors and claiming to be “led by the science”. But with the UK topping the European tables of Covid-19 deaths, what does that actually mean? And is “science” the only type of knowledge we need to make life-saving policy in the Covid crisis?

Making policy is difficult and complex – particularly in a crisis, and especially one caused by a virus that didn’t exist in humans six months ago but has the potential to kill millions. The information we have is incomplete, inaccurate and difficult to interpret. Politicians (and experts) are under huge pressure, recognising that their inevitable mistakes may well cost lives. My research has shown that even in more modestly stressful and novel contexts, policy makers don’t just use experts to answer questions; their public claims to be listening to experts are also politically useful. Christina Boswell identified the ‘legitimising’ and ‘substantiating’ functions of experts. Listening (or at least appearing to listen) to experts can give the public confidence that politicians’ decisions are well founded, and lend authority to their policy positions (such as when to re-open golf courses).

Covid-19 is a global issue requiring local responses, so the spatial aspects of using experts and evidence are particularly important. Governments need to learn quickly from experiences in countries at later stages in the epidemic, including countries where historic relations may be difficult. Central governments also have to learn quickly what is practical and working (or not) on the ground in the specific contexts of local areas, avoiding the vain attempt to manage every aspect from Whitehall. My research shows that the careful use of evidence can help here, developing shared understandings which can overcome historic blocks and enable effective collaboration. But in Covid-19 it seems central government is too often opting out of building these shared understandings. Experience in other countries has sometimes been ignored. Vital knowledge from local areas has not been sought or used. Instead of transparently sharing the evidence as decisions are developed, evidence has been hidden or heavily redacted, breaking a basic principle of good science and sacrificing the opportunity to build shared understandings open to critical challenge.

What counts as “evidence” anyway? Different professional and organisational cultures value different kinds of knowledge as important and reliable. In my work with combined authorities, I found that bringing mental health practitioners into policy discussions had opened up a wide range of new sources of knowledge, such as the voices of people with lived experience. And, carefully managed, this wider range of types of knowledge can lead to better decisions. The Government’s network of scientific advisory committees, once we were finally told who was involved, seems to have missed some important voices. The editor of the Lancet, Richard Horton, argued that expertise around public health and intensive medical care should have been in the room. I would also argue that having practical knowledge from local councils and emergency planners could help avoid recommendations that prove impossible to implement effectively. As Kieron Flanagan has noted recently, we learned in the inquiry into the BSE crisis that esteemed experts can still make recommendations which are impossible to implement in practice.

Making a successful recovery will require government quickly to learn lessons from (their own and others’) mistakes so far. Expert advice and relevant data should be published, quickly and in full – treating the public and partners as adults. Key experts for this phase (including knowledge of local public health, economic development, schools, city centres and transport) should be brought into the discussions as equal partners – not simply the “hired help” to do a list of tasks ministers have dreamt up in a Whitehall basement. Then we can have plans that are well founded, widely supported, and have the best chance of practical success. Our future, in fact our very lives, depend on it.

This post was originally published in The Municipal Journal.

 


Jason Lowther is the Director of INLOGOV. His research focuses on public service reform and the use of “evidence” by public agencies.  Previously he led Birmingham City Council’s corporate strategy function, worked for the Audit Commission as national value for money lead, for HSBC in credit and risk management, and for the Metropolitan Police as an internal management consultant. He tweets as @jasonlowther

What works in learning what works?

Jason Lowther

I have been grumping for at least the last 25 years about how little of the evidence developed by academic researchers, practitioner-researchers, consultants and others is effectively deployed in developing public policy and practice. We intuitively know, and politicians regularly proclaim, that evidence should inform policy. So why did it take over a decade to move from Dr Barry Marshall vomiting in his lab, having taken the extreme measure of drinking an H. pylori cocktail to prove that the bug causes stomach ulcers (which can then be cured by antibiotics), to those antibiotics being routinely used in medical practice? And why is the Troubled Families Programme so unusual among the plethora of government mega-initiatives simply in having a robust evaluation published showing where it works (and where it doesn’t)?

A lot of the academic research in this area points to deficiencies in what you might call the “supply side” of evidence. Academic research is too slow, and therefore out of date before its results are seen. Journal articles are hidden behind expensive paywalls, written in opaque language, packed with caveats, and conclude mainly that we need more research. Academics sometimes don’t know how to talk with policy makers, or even who they are.

There is truth in most of these points, and there has been some useful progress in addressing many of them in recent years, for example the very readable summaries of available evidence published by some What Works Centres on topics such as local economic growth and children’s social care. And it’s an immense privilege to have recently joined INLOGOV at the University of Birmingham, a department with a vast network of alumni across local government, and academics who are well used to speaking the sector’s language.

But I’m increasingly feeling that the real issue is on the “demand” side. Do we as practitioners and politicians really want to know what the evidence is telling us? What if the evidence highlights issues we’d rather not know? How do we know evidence when we see it and what if it is contradictory? Furthermore, how do we find this “evidence” anyway, and how can we know what’s reliable and what’s fake news? With American oil companies allegedly giving millions to fund research questioning climate change, who can we trust? Finally, how can we conduct small scale local research – so important when trying to understand local difference – that provides geographically relevant evidence without being accused of providing limited and unreliable findings?

I’ve been involved as LARIA, the LGA, SOLACE and others have run some excellent programmes to support practitioners and policy makers in making use of evidence.

One of my favourites was the excellent “Evidence Masterclass” organised by SOLACE, which provided a one-day familiarisation course for chief executives. At the other extreme, universities provide extensive MSc courses on research methods for a variety of public health and social scientists. But not many of us can devote years to understanding how research works and can be used, and there’s a limit to what anyone can learn in a single day.

So in my new role as director of the Institute for Local Government Studies at the University of Birmingham I’ve been excited to help deliver our new “Public Policy and Evidence” module within our Public Management and Leadership MSc and Degree Apprenticeship programmes. This is a six-week module, involving around five hours distance learning a week followed by a two-day on-campus session, currently being taken by forty senior local public service managers from a number of English local authorities and the LGA.

It’s been fascinating to see these managers think through how evidence relates to their work areas, explore how rigorous research works and the different ways it can inform policy making and service design, and get to grips with the detail of the various techniques social science researchers use. We’re now moving to the on-campus days, where we’ll be looking at several live examples from local public services in Birmingham, Coventry and Manchester, and teeing up a significant practical supervised research project that each will undertake in their home organisation over the next several months.

It’s exciting to see the improvements in the “demand” and “supply” sides of evidence informed policy making that are being delivered through this course and initiatives such as the What Works Centres and local Offices for Data Analytics. Who knows, in the decade or so before I retire, I may even be able to stop grumping about evidence-based policy?

This article was originally published in the newsletter of the Local Area Research and Intelligence Association (LARIA).


Unlocking research for local government

Jason Lowther

 

This post was originally published on the Solace website.

Local government needs evidence, from the apparently mundane but nonetheless critical (‘What choice of cladding will minimise the risk of fire spreading?’) to extraordinary insights (‘How do people choose what to eat and whether to be active?’, ‘What skills will today’s youngsters need in the jobs market of 2050?’).

In 2014, Solace commissioned an initial Local Government Knowledge Navigator (LGKN) report, From Analysis to Action: Connecting Research and Local Government in an Age of Austerity, which demonstrated that:

1. Councils have a wide range of evidence needs;
2. There is relevant research and expertise in academia but local government doesn’t make the most of this;
3. There are some impressive examples of collaboration between academia and local authorities but engagement is inconsistent, and often depends on existing links between individual researchers and local government officers or politicians; and
4. There is a need for a change of culture in both communities, and the development of more systematic approaches to achieving connectivity between them.

The key issues identified around local authorities’ approach to successful engagement were:

– senior appreciation of and support for research evidence;
– experience of using research and data to inform decision-making;
– consortia, to spread the cost and reduce risks to reputation;
– support from brokers with the expertise and time to develop proposals;
– the ability and skills to successfully commission research (or access to them); and
– local authority research teams and service managers establishing relationships with local universities.

Following the LGKN work, Solace continued working with the ESRC and LGA through the Research Facilitator, and has established dedicated spokespeople on evidence-based policy.  Recently, it supported the Centre for Public Scrutiny and Nesta in producing the document “Using evidence: A practice guide for local government scrutiny” which launched last month, and which aims to help local government make better use of research evidence.

Recent research has highlighted that local government has particular ways of looking at research that differ from much of academia (including the traditional approaches of public health colleagues).  Local government recognises the importance of ‘place’, and the uniqueness of each area’s situation and background.  As a result, we are particularly interested in evidence, including particular expertise, which relates directly to our place, and this can come across as only being interested in evidence that is ‘home grown’.

When Gemma Phillips and Judith Green recently looked at the transition of public health from the NHS to local government, they found that this different culture reflects local government’s more holistic view of health and wellbeing (rather than healthcare services), and our focus on practicality (rather than the provenance and methodological rigour of research studies).

Austerity has meant less government spending on research and evaluation, particularly at local level, although in 2015/16 national government still invested £5.6 billion in science and research, including £178m in ESRC alone.  So if we want research that government (including local government) will practically use, we probably have to get smarter in terms of more targeted funding, and in particular presenting local government as a key solution to the ‘Impact’ agenda which is vital to universities’ research funding.

This suggests to me that local government needs to take a much more active role in influencing the research agenda locally.  It’s not enough to rely on a kind of ‘Brownian Motion’ in the hope that academics’ research interests will in some way coincide with the policy priorities of local government.  We need to let academics know what our policy priorities are, and to listen to them as they explain what is already known in the relevant fields, and how further research might help us address these priorities.

In the West Midlands, as part of a comprehensive partnership with local universities, the Combined Authority this week set out a clear agenda for research related to its policy priorities for the next three years.  Developed from its Strategic Economic Plan, this includes both economic and social (public service reform) policy priorities, and further development of information sharing and the use of evaluation.  The WMCA ‘Policy Research Plan’ has been developed with input from policy leads and academic experts identified across the local universities and agencies, who will now take forward the agreed activities in a common programme.

So, for example, around ‘connected autonomous vehicles’ we are interested in exploring how emerging technologies can be exploited to improve transport accessibility and reduce subsidy costs while supporting enhanced network performance. Around ‘vulnerable offender pathways’, we need to understand where regional working can add most value, together with the offence profile and pathways for specific groups, such as young and women offenders.

Developing the Plan has compelled policy leads to be much more explicit about the questions they need answering to take forward the policy priorities, and has enabled academic experts to engage in developing these into researchable questions. As this engagement continues, we expect further synergies to develop, giving us much more robust and ‘actionable’ research in future.

References

Local Government Knowledge Navigator reports

http://www.solace.org.uk/knowledge/reports_guides/LGKN_Analysis_to_Action.pdf

http://www.solace.org.uk/knowledge/reports_guides/LGKN_LA_research_collaboration.pdf

Phillips, Gemma, and Judith Green. “Working for the public health: politics, localism and epistemologies of practice”, Sociology of health & illness 37.4 (2015): 491-505.

Using evidence: A practice guide for local government scrutiny

WMCA Policy Research Plan

https://governance.wmca.org.uk/documents/s287/Appendix.pdf

 

Jason Lowther is a senior fellow at INLOGOV. His research focuses on public service reform and the use of “evidence” by public agencies. Previously he led Birmingham City Council’s corporate strategy function, worked for the Audit Commission as national value for money lead, for HSBC in credit and risk management, and for the Metropolitan Police as an internal management consultant. He tweets as @jasonlowther

Troubled Families: So what can we learn?

Jason Lowther

Over the last five blogs I have looked in some detail at the Troubled Families Programme and in particular its independent evaluation. I’ve argued that the evaluation shows some important impacts from the programme, but has so far missed valuable learning by failing to capture the local angle, covering too short a time horizon, and not designing in a theory-informed experimental approach. This week I want to reflect on four lessons from the experience.

The TFP has delivered real impacts. We know that the TFP has changed how services for these families are delivered. The independent evaluation finds it has mainstreamed “whole-family” approaches, stimulated local multi-agency working, opened up previously impossible data sharing and made employment support more responsive. Families on the programme feel (and told the researchers) that it’s made a big difference to their lives. And the figures local authorities submitted about the changes in families who were classified as “troubled” (out of school, out of work, committing crime, etc) are audited and truthful – they do represent actual changes in people’s circumstances.

The TFP evaluation questions whether these impacts would have occurred in any case, without the TFP itself. But the evaluation was hamstrung by being undertaken too early and for insufficient time, by limited data (for example because academy schools are not required to co-operate on sharing vital information), and by the lack of an experimental and theory-based approach.

So what can we learn from the TFP experience?

First, the TFP isn’t the panacea ministers claimed – trumpeting an incredible 99% success rate whilst delaying publication of the independent evaluation set up the department to face a storm of media criticism. But it has made a big difference: the TFP changed how these services are delivered, the families noticed a significant improvement, and councils have rightly claimed for progress made.

Secondly, the department and evaluators have done a good job at trying to rigorously assess whether the TFP worked better than “business as usual”. Next time, it would be best to build a rigorous experimental approach into the programme design up front – and to develop some testable theories of how the programme is supposed to effect change.

Thirdly, national summaries can only take us so far. The real diamonds of learning are at local level. Departments should fund and support local areas to learn quickly from the natural experiments that happen when different councils adopt and adapt national policy which is based on limited prior knowledge and evidence.

Fourthly, although challenging for politicians with an eye on their ministerial career, pilots need to be given the chance to bed in before being pulled up for evaluation, and evaluation needs to run long enough to know whether we are getting results. Evaluators can learn from past experience and “new” approaches such as theory-based evaluation.

As the TFP and other government programmes roll out in future, these four lessons can help make sure that we learn and improve outcomes as quickly as possible.

 


Troubled Families: How Experimenting Could Teach Us “What Works?”. Part 2.

Jason Lowther

In my last blog I looked at how building a more experimental approach into this and future programmes could yield insight into what works where. This week I would like to extend this thinking to look at how “theory-based” approaches could provide further intelligence, and then draw some overall conclusions from this series.

As well as rigorous analysis of quantitative impacts, theory-based approaches to evaluation can help to test ideas of how innovative interventions work in practice – the “how?” question as well as the “what works?” question[1].

For example the Troubled Families practitioners might have developed theories such as:

  • Having consistent engagement with a key worker, and working through a clear action plan, will increase families’ perception of their own agency and progress.
  • Having regular and close engagement with a key worker will enable informal supervision of parenting and reduce risk around child safeguarding concerns.
  • Having support from a key worker and, where needed, specialist health and employment support, will increase entry to employment for people currently on incapacity benefit.

Interestingly, each of these appears to be supported by the evaluation evidence, which showed much higher levels of families feeling in control; lower levels of children in need or in care; and reduced time on benefits alongside higher entry into employment (compared to controls).

  • Having consistent engagement with a key worker, and working through a clear action plan, will increase families’ perception of their own agency and progress. The evaluation showed almost 70% of TFP families said they felt “in control” and their worst problems were behind them, much higher than in the “control” group of families.
  • Having regular and close engagement with a key worker will enable informal supervision of parenting and reduce risk around child safeguarding concerns. The TFP “final synthesis report”[2] shows the number of children taken into care was a third lower for the TFP families than for the “control” group (p.64).
  • Having support from a key worker and, where needed, specialist health and employment support, will increase entry to employment for people currently on incapacity benefit. Again, the final synthesis report suggests that the number of weeks on incapacity benefit for TFP families was 8% lower than for the controls, and entry into employment 7% higher (pp.56-57).

 

The TFP evaluation probably rightly writes off these last few examples of apparent positive impacts because there is no consistent pattern of positive results across all those tested. Given that the evaluation didn’t attempt to test particular theoretical hypotheses like this, it is possible that they have occurred through natural random variation. But if a much more targeted search for evidence built on theory delivered these results consistently, that would be worth celebrating.

Next week I will conclude the series by reflecting on the four key lessons we can learn from the TFP evaluation experience.

[1] See Sanderson, I. (2002) ‘Evaluation, policy learning and evidence‐based policy making’, Public administration, 80(1), pp. 1-22. And White, M. (1999) ‘Evaluating the effectiveness of welfare-to-work: learning from cross-national evidence’, Evaluating Welfare to Work. Report, 67.

[2] https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/560499/Troubled_Families_Evaluation_Synthesis_Report.pdf

 
