Published Date: 11/3/2025
The Metropolitan Police plans to intensify its deployment of live facial recognition (LFR) technology across London, citing its effectiveness in identifying suspects and reducing violent crime.
This follows the release of the force’s annual report, which links more than 1,400 arrests to LFR use in total. Of those, 962 were made between September 2024 and September 2025, with more than a quarter involving offences against women and girls, including rape, strangulation, and domestic abuse.
Of those detained, 549 were wanted by the courts and 347 by the Met, while 85 were arrested for breaching conditions such as those imposed on registered sex offenders and stalkers. The technology was deployed more than 200 times across London boroughs.
The Met highlighted the success of LFR at public events, notably the Notting Hill Carnival, where 61 arrests were made and 30 registered sex offenders were stopped over the two-day event. Among those apprehended was Tabsart Abderahmen, 58, who had been wanted since October 2015 for harassment.
However, the technology has faced criticism over privacy and potential bias. The report acknowledged that 10 individuals were falsely flagged by the system, eight of whom were Black. While none were arrested, six were briefly stopped by officers.
The Met maintains that the false alert rate remains low, standing at 0.0003 percent from over three million scans, and that demographic imbalances are not statistically significant, although it promised ongoing review.
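As a rough check on that figure, the short sketch below assumes the rate is simply false alerts divided by total faces scanned; the Met’s exact methodology and scan total are not detailed here, so both are treated as approximations.

```python
# Rough check of the reported false alert rate, assuming it is calculated as
# false alerts divided by total faces scanned (an assumption; the report's
# exact methodology is not described in this article).
false_alerts = 10          # individuals falsely flagged, per the report
faces_scanned = 3_000_000  # "over three million scans", approximated

rate = false_alerts / faces_scanned
print(f"False alert rate: {rate:.6%}")  # ~0.000333%, consistent with the quoted 0.0003 percent
```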
The biometric software is supplied by NEC. In 2023 testing by the UK’s National Physical Laboratory (NPL), the NeoFace software was found to work well despite age changes in subjects, camera angles, headwear, and adverse lighting conditions. When matching against 10,000 reference images, roughly one in 6,000 people was falsely matched, and there was no statistically significant race or gender bias.
Civil liberties group Big Brother Watch has launched a legal challenge against the Met’s use of LFR, joined by Shaun Thompson, who was wrongly identified in February 2024. Jasleen Chaggar, the group’s legal and policy officer, described the technology as “Orwellian” and warned of its chilling effect on public life. She also criticised the lack of legislation governing LFR use, calling for government intervention.
The UK Home Office is drawing up a governance framework for police use of LFR, with Home Secretary Yvette Cooper planning to establish a foundation for facial recognition use by law enforcement. Policing Minister Diana Johnson has been tasked with consulting police forces and other stakeholders on that framework.
Meanwhile, public support for the technology appears robust. A survey by the Mayor’s Office for Policing and Crime found that 85 percent of respondents backed LFR to locate serious offenders and those at risk. The Met also reported a rise in public trust, with 74 percent of Londoners now trusting the force.
Lindsey Chiswick, the Met’s national lead for LFR, defended the technology’s role in enhancing safety and trust, calling it a “powerful and game-changing tool.” “We remain committed to transparency and fairness in its use,” she said.
The Met reiterated that biometric data of individuals not on watch lists is immediately and permanently deleted. It now plans to increase its weekly LFR deployments, aiming to further bolster its crime-fighting capabilities while addressing concerns over privacy and bias. “We are proud of the results achieved with LFR,” said Chiswick. “Our goal has always been to keep Londoners safe and improve the trust of our communities. Using this technology is helping us do exactly that.”
Outside London, LFR assisted in the arrest and charging of two people after police trialled the technology in Bolton town centre. A local publication reported Inspector Jon Middleton from the Live Facial Recognition Unit saying the technology allowed the police to be “more proactive” in identifying and locating individuals who are wanted or missing.
Previously, His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) urged UK police forces to “fully exploit” retrospective facial recognition, recommending that no investigation be closed before cross-checking images against available databases.
Q: What is live facial recognition (LFR) technology?
A: Live facial recognition (LFR) technology is a biometric system that uses cameras to identify individuals in real-time by comparing their facial features against a database of known faces. It is used by law enforcement to identify suspects and missing persons.
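For illustration only, the sketch below shows the general matching idea behind systems of this kind: a numeric embedding extracted from a camera frame is compared against embeddings of watchlist faces, and an alert is raised only when similarity clears a threshold. The embeddings, threshold, and names here are hypothetical stand-ins, not details of NEC’s NeoFace software or the Met’s deployment.

```python
import numpy as np

# Generic illustration of watchlist matching, not NEC's or the Met's actual
# algorithm: every face is reduced to an embedding vector (random stand-ins
# here), and a live face is flagged only if its cosine similarity to a
# watchlist entry clears a threshold.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
# Hypothetical watchlist of 10,000 entries (the size used in the NPL test),
# filled with random vectors purely for demonstration.
watchlist = {f"subject_{i}": rng.normal(size=128) for i in range(10_000)}
live_face = rng.normal(size=128)  # stand-in for an embedding from a camera frame

THRESHOLD = 0.6  # illustrative value; real systems tune this to limit false alerts

best_id, best_score = max(
    ((name, cosine_similarity(live_face, emb)) for name, emb in watchlist.items()),
    key=lambda pair: pair[1],
)

if best_score >= THRESHOLD:
    print(f"Alert: possible match with {best_id} (similarity {best_score:.2f})")
else:
    print("No match above threshold; the scanned face data would be discarded")
```

In practice, the embeddings come from a trained face recognition model rather than random vectors, and the threshold is tuned to balance missed matches against false alerts.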
Q: How many arrests has the Metropolitan Police made using LFR?
A: The Metropolitan Police has linked more than 1,400 arrests to LFR use in total, 962 of which were made between September 2024 and September 2025.
Q: What are the main criticisms of LFR technology?
A: The main criticisms of LFR technology include concerns over privacy, potential bias, and the lack of comprehensive legislation governing its use. Civil liberties groups argue that it can lead to false identifications and has a chilling effect on public life.
Q: What is the false alert rate of the LFR system used by the Met?
A: The false alert rate of the LFR system used by the Met is 0.0003 percent from over three million scans, according to the Met's report.
Q: What is the UK Home Office doing regarding LFR use by police?
A: The UK Home Office is drawing up a governance framework for police use of LFR, with Home Secretary Yvette Cooper planning to establish a foundation for facial recognition use by law enforcement. Policing Minister Diana Johnson is consulting police forces and other stakeholders on that framework.