Article Details

Scrape Timestamp (UTC): 2025-08-08 10:52:13.420

Source: https://www.theregister.com/2025/08/08/uk_secretly_allows_facial_recognition/

Original Article Text


UK secretly allows facial recognition scans of passport, immigration databases

Campaigners brand Home Office's lack of transparency as 'astonishing' and 'dangerous'

Privacy groups report a surge in UK police facial recognition scans of databases secretly stocked with passport photos, conducted without parliamentary oversight. Big Brother Watch says the UK government has allowed images from the country's passport and immigration databases to be made available to facial recognition systems, without informing the public or parliament.

The group claims the passport database contains around 58 million headshots of Brits, plus a further 92 million images made available from sources such as the immigration database, visa applications, and more. By way of comparison, the Police National Database contains circa 20 million photos of those who have been arrested by, or are at least of interest to, the police.

In a joint statement, Big Brother Watch, its director Silkie Carlo, Privacy International, and its senior technologist Nuno Guerreiro de Sousa described the databases and the lack of transparency as "Orwellian." They have also written to both the Home Office and the Metropolitan Police, calling for a ban on the practice.

The comments come after Big Brother Watch submitted Freedom of Information requests, which revealed a significant uptick in police scanning of the databases in question as part of forces' increasing use of facial recognition. The number of searches by 31 police forces against the passport databases rose from two in 2020 to 417 by 2023, and scans using the immigration database photos rose from 16 in 2023 to 102 the following year.

Carlo said: "This astonishing revelation shows both our privacy and democracy are at risk from secretive AI policing, and that members of the public are now subject to the inevitable risk of misidentifications and injustice.
"Police officers can secretly take photos from protests, social media, or indeed anywhere, and seek to identify members of the public without suspecting us of having committed any crime.

"This is a historic breach of the right to privacy in Britain that must end. We've taken this legal action to defend the rights of tens of millions of innocent people in Britain."

The Register has approached the Home Office for comment.

Maligned technology

It's no secret that UK police forces have steadily increased their use of facial recognition technology in recent years, despite ardent pushback from the pro-privacy crowd. There are three types of facial recognition (FR) tech used across the UK: retrospective FR (RFR), live FR (LFR), and operator-initiated FR (OIFR).

RFR and OIFR are generally seen as the less intrusive uses of the technology, wheeled out only when officers are aware that a crime has been committed and used to scan images of specific people of interest. LFR is different: it involves setting up a camera in a location to scan every face it captures, which means the vast majority of its subjects will be innocent people.

The Home Office insisted in its LFR factsheet, which has not been updated since 2023, that LFR deployments are targeted, intelligence-led, time-bound, and geographically limited. Efforts are made to inform the public when a camera is due to be set up in any given location, and the government said the technology has been used to successfully arrest wanted sex offenders in densely populated crowds, as well as other violent offenders.

These kinds of examples are often used by the government to justify its controversial measures, much as it invokes the threat of terrorists and child sexual abuse offenders to defend its anti-encryption agenda and the Investigatory Powers Act. The technology is pitched as a way to make the routine work of police officers more efficient, freeing their time for other duties.
Officers are often briefed daily with images of people of interest; LFR simply lightens the load of this manual scouting, the Home Office says.

However the government positions LFR, it doesn't appear to be allaying the concerns many hold about the way it is used. Authorities insist that accuracy is improving and that the racial biases and false positives generated by FR scans are decreasing, despite early rollouts being plagued by such issues.

According to the Metropolitan Police, FR is only used as a real-time aid to locate people on a watchlist. However, the revelation that passport and immigration databases are being scanned as part of this process suggests the technology may not be used in the highly targeted way the government says it is. The Home Office's claim that the technology is time-bound also no longer holds true, after the announcement earlier this year that the UK's first permanent LFR camera will be installed in Croydon, South London.

Recent data from the Met attempted to instill confidence in facial recognition: the number of arrests the technology has facilitated passed the 1,000 mark, the force said in July. However, privacy campaigners were quick to point out that this accounted for just 0.15 percent of total arrests in London since 2020, and suggested that, despite the shiny 1,000 figure, this did not represent a valuable return on investment in the tech.

Alas, the UK has not given up on its pursuit of greater surveillance powers. Prime Minister Keir Starmer, a former human rights lawyer, is a big fan of FR, having said it was the answer to preventing future riots like those that broke out across the UK last year following the Southport murders.

Daily Brief Summary

MISCELLANEOUS // UK Government Faces Backlash Over Secret Facial Recognition Database Access

Privacy groups have criticized the UK government for secretly allowing police access to passport and immigration databases for facial recognition, raising significant privacy and transparency concerns.

The Home Office's lack of transparency has been labeled "astonishing" and "dangerous," with calls for a ban on the practice from organizations like Big Brother Watch and Privacy International.

The databases in question contain approximately 58 million passport photos and 92 million images from immigration and visa sources, far exceeding the 20 million photos in the Police National Database.

Police searches using these databases have dramatically increased, with passport database queries rising from two in 2020 to 417 by 2023, raising concerns about potential misuse.

Critics argue that the use of facial recognition technology risks misidentification and injustice, especially when deployed without public knowledge or parliamentary oversight.

Despite government claims of improved accuracy and reduced bias, privacy advocates question the technology's value for crime prevention, noting that FR-assisted arrests account for only 0.15% of total arrests in London since 2020.

The installation of the UK's first permanent live facial recognition camera in South London contradicts previous assurances of time-bound and targeted use, further fueling public distrust.