Do we need yet another body to help local government harness the potential of AI?

Dr Caroline Webb, Dr Stephen Jeffares, and Dr Tarsem Singh Cooner

Does local government need a devolved AI service to help the sector successfully harness the transformative power of AI? A new paper from the Tony Blair Institute (TBI) thinks so.

In their recent report "Governing in the Age of AI: Reimagining Local Government", TBI make the case that local government faces several significant challenges: a growing backlog of people seeking support, dwindling resources, and a requirement that two thirds of funds be spent on social care. Satisfaction is declining. The sector spends over £1.8bn on technology, but innovation is stifled by a patchwork of legacy systems. The solution TBI suggest is the universal embrace of AI tools, orchestrated, curated, supported and (one day perhaps) exploited through the establishment of a Devolved AI Service (DIAS).

The adoption of AI by councils continues to accelerate. Vendors of these tools extol the positive outcomes of AI, and TBI suggest that "the day-to-day tasks of local government, whether related to the delivery of public services or planning for the local area, can all be performed faster, better and cheaper with the use of AI" (p3). The UK, they argue, could save £200 billion over five years through AI-related productivity improvements (p7).

Whilst AI undoubtedly has the potential to increase the efficiency of some public services, we must pause to ask: is it really the panacea it is marketed as, and are some public services being unfairly targeted by technology firms looking to promote their products and capitalise on an emerging market?

There is a clamour, for example, to develop AI tools for the social care sector, which accounts for 64.8 per cent of the total budget for local government in England. The rationale is that significant time savings will allow workers to spend more quality time with clients and less time on completing paperwork. TBI cite Beam's automated note-taking tool Magic Notes, which aims to transform social workers' productivity by saving them up to 'a day per week on admin tasks'. Yet without external scrutiny and verifiable evaluation, such figures are little more than marketing claims.

Because these technologies capture and summarise meetings involving some of the most vulnerable members of society, local councils need to interrogate such marketing claims critically before committing to these AI tools. Despite safeguards such as ensuring a 'human in the loop', if these claims are not thoroughly examined there is a danger that these technologies will reinforce and perpetuate existing biases, pose risks to clients' data privacy and safety, strengthen process-driven systems that undermine person-centred decision making, and erode the relational foundations on which these services are built.

Of course, AI has numerous applications beyond reducing the cost of resource-intensive casework. It has the potential to address some of the most despised and most intractable local policy problems (potholes, mould, chronic pain, mental health waiting lists). But the desirability of these innovations should not cause us to forget that the technologies themselves are not neutral. Just as they can lead to positive outcomes, if they are misused or implemented without proper ethical consideration, adverse effects are just as likely to emerge.

The proposed introduction of a Devolved AI Service may go some way to ensuring a set of standardised safeguards, allowing for a coordinated approach to AI adoption within public services. This collective approach could reduce duplication, provide practical support for the implementation and evaluation of sector-specific AI tools, and facilitate collaborative work with technology providers to improve their products. However, is it necessary to impose another central regulator on local government? Considerable piloting and evaluation of AI tools is already being conducted at the local level, and these sector-specific evaluations are creating opportunities for shared horizontal learning across organisations. But it is vital that results are shared, and that the evaluative measures and methods employed are determined not by the vendor of the tool but by the needs of the organisation and the people it serves. Such evaluation should also consider more than value for money or accuracy: the experiences of frontline staff and citizens matter too.

Ultimately, irrespective of how we choose to oversee the integration of AI tools, we must not lose sight of the fact that these tools should only be viewed as 'part' of the solution to providing effective public services, not the 'whole' answer as some technology companies may lead us to believe.

Dr Caroline Webb, Dr Stephen Jeffares, and Dr Tarsem Singh Cooner are academics at the University of Birmingham exploring how AI is reshaping frontline public service. Combining expertise in social work, public policy, and digital ethics, they develop training and research that support practitioners to engage critically and confidently with emerging technologies. Their work champions ethical, human-centred innovation in public services.

How digital policing may transform local relationships with the public: international perspectives from the Policing in the Digital Society Network Annual Conference 2025

Dr Elke Loeffler

The Policing in the Digital Society Network is a European network of academics and practitioners researching the changing nature of policing in the digital society. I was recently able to attend its annual conference, held at the University of Northumbria, for the first time and found this inspiring event brought together a vibrant community actively involved in exploring the impact of digital policing on local relationships with the public.

With the rapid increase in the availability of new digital technologies, including AI applications, together with ever-mounting staff and budget pressures, police forces in the UK, the Netherlands and Nordic countries are making increasing use of digital tools, e.g. automation of processes such as the transcription of interviews with victims and offenders and use of digital forensics to make investigations more effective. At the same time, the speed of technological innovation has given rise to new forms of cybercrime such as online forms of Violence against Women and Girls and has generated new policing tasks such as digital safeguarding.

Does this mean that the 'bobby on the beat' will be replaced by chatbots, so that relations between local people and the police will be dehumanised at neighbourhood level? The research presented at the conference at Northumbria University offered contrasting perspectives on this. Prof. Jan Terpstra's research on the impact of digitalisation on the policing of public protests suggests that policing has become more 'abstract', with increasing reliance on data-based systems, and more distant from and less personally knowledgeable about local community groups, while protesters have sought to become less traceable by disguising their physical appearance and avoiding the use of smart technology.

Interestingly, empirical research by Wendy Schreurs and Prof. Wouter Stol on intelligence-based neighbourhood policing in a selected district in the Netherlands has shown that officers get 54% of their information from citizens and 47% from digital police sources. This suggests that the police still need citizens as much as citizens need the police. At the same time, there is evidence that neighbourhood police officers still spend a lot of unassigned time in their cars without any contact with local people. This has given rise to an experiment providing police officers with intelligence-based notifications about priority issues at neighbourhood level, such as fly-tipping and local 'hot-spots', so that they can target these issues in a more structured way, provide feedback that is shared across police teams, and increase their visibility and dialogue with local people.

Moreover, a study by Prof. Kira Vrist Ronn from the University of Southern Denmark on digital police patrols in Norway suggests that the use of online platforms for genuine dialogue with local people on local issues (not necessarily related to policing), together with videos on social media platforms showing police officers in informal settings (such as the famous ‘dance videos’ by Norwegian police officers), may help to create ‘proximity at a distance’, as Kira termed it. In other words, the development of trust relationships does not necessarily have to start with ‘face-to-face’ meetings.

In the light of rapid technological advances and increasing (transborder) cybercrime, there is now clearly an urgent need for police forces to collaborate in order to share the risks of, and learning from, digital experimentation – something which is still underdeveloped across Europe. As one police force representative stated at the conference, "We have to speed up innovation processes".

This applies in particular to the UK, where there is a risk that severe austerity pressures will drive the 43 police forces to become more inward-looking and reactive, instead of breaking up silos and practising collaborative innovation (Hartley et al. 2013) and collaborative governance (Loeffler 2024) in order to achieve much-needed synergies. At European level, Europol has set up a secure infrastructure and innovation methodologies to enable the sharing of unlicensed tools and innovative projects between its members – but Brexit has excluded the UK from some of the key Europe-wide networks and partnerships. While a number of UK police forces, such as Thames Valley Police, have set up 'Innovation Hubs' which include behavioural scientists, leveraging this potential in policing faces particular ethical, legal and governance challenges. The responsible use of AI and other innovations in policing will require more public scrutiny and dialogue with the public, underpinned by robust practices at local level. The independent Data Ethics Committee of the West Midlands Police and Crime Commissioner (WMOPCC) and West Midlands Police (WMP) has already taken steps towards more public engagement. At the same time, there is also a need for national frameworks which learn quickly from emerging successful practice across Europe in this rapidly changing environment.

Furthermore, Wouter Stol et al. (2025) make the point that a more integrated approach to online crime is needed in terms of prevention, detection and disruption. This integrated approach will not be the responsibility of police forces alone but of other public services as well, particularly local government. From that point of view, the impact of austerity at local level, which has seen community safety given lower priority in UK local government, has been damaging and will need to be reversed – a topic which the Institute of Local Government Studies (INLOGOV) and the Centre for Crime, Justice and Policing at the University of Birmingham and, more widely, the Local Area Research and Intelligence Association (LARIA) and the Society for Evidence-Based Policing (SEBP) should urgently address.

The opportunities provided by the Policing in the Digital Society Network for learning from innovations and revealing practice across European police forces are likely to play an increasingly important role over the next few years. Its conference next year will be in Oslo – something to look out for!

Dr. Elke Loeffler is an Associate of INLOGOV and Director of Governance International. She undertakes applied research on local public services and has research interests in community engagement/co-production in a digital world. Elke is Vice-Chair for Doctoral Research in UKAPA and a Board Member of the European Group of Public Administration and the International Research Society for Public Management.