Tuesday, March 24, 2026


Essex Police Halts Facial Recognition after Study Reveals Racial Bias



Live facial recognition technology shows algorithmic bias favoring identification of Black participants over other ethnic groups.

Essex Police has temporarily paused deployment of live facial recognition (LFR) cameras after research revealed potential racial bias in the system's identification accuracy. The force, which began using the technology in summer 2024, commissioned independent studies from the University of Cambridge that identified concerning gaps in how the algorithm performed across demographic groups. The findings revealed significant algorithmic bias: in a controlled field experiment involving 188 volunteers, the system correctly identified approximately 50% of individuals on watch lists who passed the cameras. The study also found that the system was statistically significantly more likely to correctly identify Black participants than participants from other ethnic groups, and that it was more accurate for men than for women.

A second study, analyzing more than 40 LFR deployments between August 2024 and February 2025, provided broader operational context. The technology scanned approximately 1.3 million faces in public spaces, resulting in 123 police interventions and 48 arrests, or roughly one arrest per 27,000 faces scanned. Only one confirmed mistaken intervention was recorded, suggesting operational effectiveness despite the identified bias concerns.

Essex Police responded by pausing operations and initiating software improvements. A spokesperson stated: "We decided to pause deployments while we worked with the algorithm software provider to review the results and seek to update the software." Following algorithmic updates and revised policies and procedures, the force expressed confidence in resuming LFR deployments, pledging continued monitoring "to ensure there is no risk of bias against any one section of the community."

The pause reflects broader tensions surrounding the deployment of facial recognition technology. In January, the government announced plans to expand LFR vans from 10 to 50 across the country, with Home Secretary Shabana Mahmood citing the 1,700 arrests resulting from the Metropolitan Police's use of LFR as justification for a nationwide rollout. However, campaign group Big Brother Watch criticized the technology as "authoritarian, inaccurate and ineffective", with spokesperson Jake Hurfurt emphasizing that "AI surveillance that is experimental, untested and inaccurate or potentially biased has no place on our streets." The organization warned that LFR deployment "could put the rights of thousands of people at risk", particularly with regard to racial discrimination.

