The Metropolitan Police is set to trial 100 handheld facial recognition devices for a period of six months, with around £763,000 allocated to the programme, according to multiple reports. Mayor of London Sadiq Khan confirmed the trial, which will allow officers to scan faces in real time using portable devices. The move marks a significant expansion of the Met's use of live facial recognition (LFR) technology, which has been deployed since 2020 via cameras mounted on vans and at fixed locations.
Met Commissioner Sir Mark Rowley has called for a major expansion in the use of live facial recognition, with the force planning 10 LFR operations a week and trialling the technology at new venues, including football matches, according to multiple reports. The government also plans to expand facial recognition across England and Wales, increasing the number of camera vans from 10 to 50, the Home Office said. The Met is meanwhile investing in more camera vans and plans to install additional fixed cameras after a trial in Croydon town centre led to a criminal being caught every 34 minutes.
The Met has been using facial recognition technology since 2020, deploying cameras on vans and in fixed locations. According to the Met Police, there have been over 1,400 LFR arrests since the technology was first introduced, with more than 1,000 people charged or cautioned. The force also won a High Court challenge over its use of LFR: Lord Justice Holgate and Mrs Justice Farbey dismissed a claim brought by Shaun Thompson and Silkie Carlo, ruling that the claimants' human rights had not been breached and rejecting arguments that police broke human rights and privacy laws.
The Met is also considering using AI to help identify victims of online child sexual abuse and to categorise imagery by severity, the force said. According to the Met Police, the force investigated more than 5,400 child sexual abuse offences over the past year, with more than 1,300 children requiring safeguarding.
Critics have raised concerns about the expansion of facial recognition technology. Green Party London Assembly Member Zoë Garbett called on the Met to halt all LFR use until proper safeguards are in place. Garbett noted that the Met's website still states it does not presently use operator-initiated facial recognition, despite the new trial, the Evening Standard reported, and described the trial as "an alarming change", according to the Guardian. A case of mistaken identity has also highlighted accuracy issues: police arrested a man for a burglary in a city 100 miles away that he had never visited after software confused him with another person of south Asian heritage, the Guardian reported.
Facial recognition technology is already used by other UK police forces. Face scanning has been deployed via cameras on vans and at fixed locations, including in Croydon, Manchester and South Wales, and retrospective facial recognition systems are widely in use across the UK. Operator-initiated facial recognition is already used by South Wales Police, whose officers run NEC's NeoFace algorithm on their smartphones. Separately, the Met signed a £490,000 three-month contract with the controversial American AI firm Palantir to try to detect rogue officers based on their wider conduct.
Speaking at a press conference, Sir Mark Rowley suggested facial recognition cameras could assist in supervising offenders in the community under Labour's justice reforms. Mayor Sadiq Khan explained that officers would not be able to walk up and scan people's faces on the device, saying it would be used during police stops and when officers were not persuaded a member of the public had identified themselves correctly. Policing minister Sarah Jones described the technology as "the biggest breakthrough for catching criminals since DNA matching." However, Mary-Ann Stephenson, Chair of the Equality and Human Rights Commission, warned: "There is a danger that these technologies can be inaccurate and falsely identify people. The data shows that there are racial disparities for false positive identification, causing human rights infringements and distress to those affected. That is why a strong legal framework is needed."
Several unknowns remain about the trial. It is not yet clear what specific safeguards or regulations will be put in place for the handheld facial recognition trial, nor what the accuracy rate of the technology is, particularly for people of colour. The Met has not detailed how it will ensure that the handheld devices are not used for indiscriminate scanning of the public. The timeline for the government's plan to expand facial recognition vans from 10 to 50 has also not been specified. Additionally, the specific AI tools being considered for child sexual abuse investigations, and their privacy implications, have not been disclosed.