Article Details

Scrape Timestamp (UTC): 2025-12-08 12:24:14.188

Source: https://www.theregister.com/2025/12/08/ico_home_office_rfr/

Original Article Text


Home Office kept police facial recognition flaws to itself, UK data watchdog fumes

Regulator disappointed as soon-to-be-scrapped algo's problems remained a secret despite consistent engagement.

The UK's data protection watchdog has criticized the Home Office for failing to disclose significant biases in police facial recognition technology, despite regular engagement between the organizations.

Emily Keaney, deputy commissioner for the Information Commissioner's Office (ICO), said the regulator only learned last week about historical bias in the algorithm used by UK police forces for retrospective facial recognition (RFR) within the Police National Database (PND).

"It's disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services.

"While we appreciate the valuable role technology can play, public confidence in its use is paramount, and any perception of bias and discrimination can exacerbate mistrust. The ICO is here to support and assist the public sector to get this right."

The ICO has requested urgent clarity from the Home Office so it can assess the situation and determine next steps.

Keaney's comments follow updated accuracy tests published on December 4, conducted by the National Physical Laboratory and commissioned by the Home Office. The tests examined two algorithms: Cognitec FaceVACS-DBScan ID v5.5, currently used by the Police National Database, and Idemia MBSS FR, planned for future deployments.

While Idemia's test results were nearly perfect both in ideal testing conditions and in realistic operational deployments, Cognitec's algorithm showed significant weaknesses when identifying certain demographics under strict settings designed to eliminate false positives.

In Cognitec's case, when no restrictions were applied, it correctly matched an image of a suspect to an individual in the PND 99.9 percent of the time. However, when testers forced it to return matches only above a very high similarity-score threshold, effectively eliminating false positives, its accuracy dropped to 91.9 percent. At this strict setting the algorithm was best at identifying Asian subjects, with a 98 percent success rate; White subjects were correctly identified 91 percent of the time, and Black subjects in 87 percent of cases.

When the similarity-score threshold was lowered, though kept at a high level, false positive rates increased and disproportionately affected certain demographics. In these tests, Black females were more likely to be falsely matched to a reference image than Black males, returning false positive rates of 9.9 percent and 0.4 percent respectively. Removing gender from the equation, false positive rates for White subjects (0.04 percent) were far lower than those for Asian (4 percent) and Black (5.5 percent) subjects.
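To make that trade-off concrete, below is a minimal Python sketch of how raising a similarity-score threshold suppresses false positives at the cost of missed matches, and how false positive rates can diverge between groups. The score distributions, group labels, and every number in it are synthetic assumptions for illustration only; this is neither the NPL's methodology nor either vendor's code.

    import random

    random.seed(42)

    def simulate_scores(n, mean, spread):
        # Draw n synthetic similarity scores, clamped to [0, 1].
        return [min(1.0, max(0.0, random.gauss(mean, spread))) for _ in range(n)]

    # Hypothetical per-group score distributions: genuine pairs (same person)
    # cluster high, impostor pairs (different people) cluster lower, with the
    # impostor overlap varying by group to mimic the disparity reported above.
    groups = {
        # group: (genuine-pair mean, impostor-pair mean)
        "white": (0.92, 0.30),
        "asian": (0.90, 0.42),
        "black": (0.88, 0.45),
    }

    data = {
        g: {
            "genuine": simulate_scores(10_000, gm, 0.05),
            "impostor": simulate_scores(10_000, im, 0.08),
        }
        for g, (gm, im) in groups.items()
    }

    def rates_at(threshold):
        # True-match rate and false-positive rate per group at one threshold.
        out = {}
        for g, d in data.items():
            tmr = sum(s >= threshold for s in d["genuine"]) / len(d["genuine"])
            fpr = sum(s >= threshold for s in d["impostor"]) / len(d["impostor"])
            out[g] = (tmr, fpr)
        return out

    # Raising the threshold drives false positives toward zero for every group,
    # but true-match rates fall too, and unevenly across groups: the shape of
    # the 99.9 percent vs 91.9 percent gap described above.
    for t in (0.50, 0.65, 0.80):
        print(f"threshold={t:.2f}")
        for g, (tmr, fpr) in rates_at(t).items():
            print(f"  {g:>5}: true-match {tmr:6.1%}   false-positive {fpr:6.1%}")

The point of the sweep is that a single global threshold fixes one operating point for everyone, so any difference in the underlying score distributions surfaces as a per-group accuracy or false-positive gap.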
"A new algorithm has been independently tested and procured, which has no statistically significant bias. It will be tested early next year and will be subject to evaluation. "Our priority is protecting the public. This game-changing technology will support police to put criminals and rapists behind bars. There is human involvement in every step of the process and no further action would be taken without trained officers carefully reviewing results." The tests were published as the Home Office launched a consultation to expand police use of facial recognition, despite myriad criticisms of the technology across its varying types of deployment. The UK government spends tens of millions on facial recognition technology every year, and has consistently vouched for its efficacy since the PND launched in 2011.

Daily Brief Summary

VULNERABILITIES // UK Home Office Criticized for Concealing Facial Recognition Bias Issues

The UK's Information Commissioner's Office (ICO) criticized the Home Office for not disclosing biases in police facial recognition technology, despite regular engagement between the organizations.

The ICO learned of historical biases in the Police National Database's (PND) facial recognition algorithm only recently, raising concerns about transparency.

Tests revealed Cognitec's algorithm had significant weaknesses, particularly in accurately identifying Black subjects, with Black females facing a markedly higher false positive rate than Black males.

The Home Office has reissued training and guidance to police forces to mitigate risks and ensure manual reviews of facial recognition results.

A new algorithm, independently tested and procured, showed no statistically significant bias and is slated for further testing and evaluation early next year.

The UK government continues to invest heavily in facial recognition technology, emphasizing its role in law enforcement despite criticisms of its deployment.

The Inspectorate of Constabulary and the Forensic Science Regulator will review police use of facial recognition technology following the recent findings.