Essex Police Suspends Facial Recognition Deployments Amid Racial Bias Concerns

Essex Police has temporarily suspended the use of live facial recognition (LFR) technology in response to findings that highlighted potential racial bias in the system’s performance. The decision, which affects deployments from marked vans in locations such as Chelmsford, underscores ongoing debates about the responsible integration of artificial intelligence into law enforcement. The pause was triggered by an independent academic study commissioned by the force itself, and drew a swift response from civil liberties organisations including Liberty.

Liberty Raises Alarms Over Potential Discrimination

Liberty, a leading UK human rights organisation, has described the suspension as a necessary step and a broader wake-up call for police forces and government authorities. Akiko Hart, Director of Liberty, stated: “The fact that Essex Police has put its use of live facial recognition cameras on hold due to concerns over racial bias should serve as a wake-up call for the Government and police. The body of evidence continues to grow on the discrimination within the technology, which is in widespread use by police forces across the UK. As a result, members of the public will continue to be wrongly identified as they go about living their lives – something we know will have a lasting negative impact on them.”

Hart emphasised that robust safeguards, oversight, and transparency should have been established prior to any deployment. Liberty has called on the Government to immediately halt further rollout of the technology until comprehensive legal frameworks are in place to protect the rights of all citizens.

Findings from the University of Cambridge Study

The suspension followed a detailed evaluation conducted by researchers at the University of Cambridge. The study involved 188 volunteer actors who walked past cameras mounted on marked police vans during real-world deployments in Chelmsford. Results showed that the system correctly identified approximately half of the individuals on police watchlists, with false positive matches remaining extremely rare. However, the technology was, to a statistically significant degree, more likely to correctly identify Black participants than participants from other ethnic groups. It also demonstrated higher accuracy rates for men than for women.

Co-author Dr Matt Bland, a criminologist, observed that offenders passing the cameras would have a greater chance of identification if they were Black, a disparity that he described as warranting further investigation. A separate analysis of over 40 deployments between August 2024 and February 2025, which scanned more than 1.3 million faces and resulted in 48 arrests, reinforced the need for continued scrutiny. The Information Commissioner’s Office (ICO) confirmed that Essex Police acted promptly upon identifying potential accuracy and bias risks and has advised other forces to implement appropriate mitigations.

Essex Police’s Official Response

Essex Police has maintained a transparent and proactive stance. A force spokesperson explained: “Based on the fact there was potential bias the force decided to pause deployments while we worked with the algorithm software provider to review the results and seek to update the software. We then sought further academic assessment. As a result of this work we have revised our policies and procedures and are now confident that we can start deploying this important technology as part of policing operations to trace and arrest wanted criminals. We will continue to monitor all results to ensure there is no risk of bias against any one section of the community.”

The technology, supplied by Corsight, had been operational since summer 2024 and was credited with supporting targeted interventions to locate individuals wanted for serious offences. The force stressed that the pause was precautionary and that updated software and revised protocols would allow resumption under enhanced oversight.

Implications for Policing in Essex

The temporary halt carries notable implications for law enforcement across Essex, a county with diverse urban and rural communities. Proponents argue that LFR enhances operational efficiency by enabling rapid identification of suspects in public spaces, thereby contributing to public safety and the prevention of crime. The force has previously highlighted its value in operations targeting serious offenders, aligning with national efforts to expand such tools.

Nevertheless, concerns about perceived or actual bias risk eroding public confidence, particularly within minority ethnic communities. Civil liberties groups contend that any disproportionate impact could exacerbate existing tensions and deter cooperation with police. For Essex residents, the episode highlights the delicate balance between technological innovation and the principles of fairness, privacy, and equality. Resuming deployments with strengthened safeguards may restore trust and bolster crime-fighting capabilities; however, prolonged uncertainty or unresolved issues could invite legal challenges and necessitate greater investment in independent oversight.

In the wider context, this development reflects a maturing national conversation on artificial intelligence in policing. While the Home Office has signalled plans to increase LFR capacity across England and Wales, incidents such as this serve as a reminder that effective implementation demands rigorous testing, transparency, and accountability. As Essex Police refines its approach, the case may influence policy decisions not only locally but across other forces currently trialling similar systems. Ultimately, the responsible advancement of LFR will depend on sustained dialogue between law enforcement, regulators, civil society, and the communities they serve.

