Metropolitan Police to test facial recognition for identity verification, says mayor.

The Metropolitan Police are set to begin using automated facial recognition technology to verify people's identities by scanning their faces. The approach is backed by the mayor, Sadiq Khan, but opponents have labelled it "alarming."
The pilot program was announced on Thursday, with Khan revealing that 100 officers will utilize this mobile technology, typically seen on smartphones, over the next six months. This decision came in response to questions posed by an opposition politician in light of growing concerns regarding the proliferation of AI-driven policing tools. Currently, the Metropolitan Police’s official website states that it does “not presently use the so-called operator initiated facial recognition.”
The move by the UK's largest police force signals a broader shift toward integrating facial recognition into law enforcement. The technology has previously been deployed through fixed cameras in locations including Croydon, Manchester and South Wales, and retrospective facial recognition systems are already widely used across the UK.
The Guardian recently reported an instance in which police mistakenly arrested a man for a burglary that took place 100 miles away, after software misidentified him as another man of south Asian descent. It was also revealed that the Met has signed a £490,000 contract with the contentious US-based AI company Palantir to help identify rogue officers based on behavioural patterns.
Zoë Garbett, a Green party member of the London Assembly whose questions prompted Khan's announcement, described the latest initiative as "an alarming change."
During a City Hall meeting, she expressed her concerns, stating, “This technique fundamentally alters the relationship between the police and the public. Officers can now directly approach and scan individuals’ faces using mobile devices.”
Khan contested her claims, asserting that the technology would only be used during police stops or when an officer suspects that an individual has given false details about their identity.
“The alternative, if this tool is not employed, is to arrest the individual and transport them to a police station,” Khan explained. “This technology allows officers to verify if the person they are addressing matches someone in the custody record, thereby circumventing a significant inconvenience.”
[Image caption: The Metropolitan Police deploying live facial recognition technology in Croydon, south London.]
This pilot program launch coincides with calls from the Equality and Human Rights Commission for an independent oversight body to regulate the application of facial recognition technology within the UK. Sarah Jones, the policing minister, has described the tech as “the most significant breakthrough in crime detection since DNA matching.”
However, Mary Ann Stephenson, chair of the equalities watchdog, warned that such technologies can misidentify people, with racial disparities in false positive rates risking human rights violations and causing distress to those affected. She stressed the need for a robust legal framework.
Operator-initiated facial recognition is currently operational with South Wales police, utilizing NEC’s “NeoFace” algorithm on officers’ smartphones. This empowers officers to confirm the identity of individuals suspected of being missing, at imminent risk of serious harm, or wanted—especially when such persons are unable or unwilling to provide accurate information.
The technology can also help identify deceased or unconscious individuals, but it is not permitted for covert use. South Wales police says its use is permissible when there is intelligence suggesting that an individual may pose a risk to themselves or others. The civil liberties organisation Big Brother Watch has criticised this definition as so broad and vague that non-crime scenarios could be subject to surveillance.
Garbett expressed her frustration, saying, “It’s shocking that I had to compel the mayor to acknowledge that they are testing operator-initiated facial recognition technology. There is a lack of a clear legal framework regulating live facial recognition, and now it’s being expanded through handheld devices, enabling officers to scan anyone’s face. In this country, individuals are not obliged to identify themselves to the police without substantial justification, and this unregulated technology jeopardizes a fundamental right.”
In March 2024, Khan had remarked to the London Assembly that if the Metropolitan Police were to employ operator-initiated facial recognition, he would expect them to engage with stakeholders, including the London Policing Ethics Panel, and to thoroughly assess the legal, policy, community, data protection, and ethical ramifications.
In December, Jones launched a 10-week Home Office consultation on facial recognition technology. She said: "This technology has already aided in removing thousands of dangerous criminals from our streets and holds enormous potential to enhance policing practices and public safety."
The Metropolitan Police said more than 100 wanted criminals were apprehended within the first three months of the Croydon live facial recognition pilot, an initiative that involved mounting cameras on lampposts.
