In 2025, more than 3.5 million Londoners’ faces were scanned by the Metropolitan Police, resulting in just 1,010 arrests: roughly three for every 10,000 people scanned.
Live facial recognition (LFR) is a biometric technology that scans faces in real time, identifying individuals from characteristics such as the distance between the eyes and nose.
In practice, the Met pairs this technology with surveillance cameras to scan faces against police watchlists.
The technology was first tested in 2016 at the Notting Hill Carnival, but has been used on an operational basis by the Met since 2020.
New data from the Met shows a sharp increase in deployments. In August, the force announced it would double the number of deployments, citing a £260 million budget shortfall that includes cuts of 1,400 officers and 300 staff.
Hackney had the highest arrest rate of any borough, with 62% of alerts resulting in an arrest.
Zoë Garbett, a Green councillor for Hackney who has consistently campaigned against LFR, likens its borough-by-borough use to a postcode lottery.
Islington has had zero deployments this year after its council voted in July 2024, in light of the Casey Report, to call on the Met to suspend them.
Hackney brought a similar motion forward, but it ultimately stalled.
The motion argued that the technology was intrusive, unreliable and could be used disproportionately against Black, Asian and minority ethnic communities.
Garbett attributes the motion’s stalling to a lack of deeper work between the local borough commander, the police and council teams, and is sceptical of LFR being presented as a success story when there has been little cost-benefit analysis or formal evaluation.
She said: “I come from an NHS background, where that is never enough to decide whether to continue doing something. You need a well-rounded view of the harms, costs and implications rather than output numbers.”
In 2023, Newham, the most ethnically diverse borough in the UK, voted unanimously to suspend the use of LFR technology until sufficient biometric regulations and anti-discrimination safeguards were applied.
However, the decision was ignored by the Home Office and the Met.
The Met’s annual report attributes demographic imbalances in watchlist alerts to the location of deployments, which are focused on crime hotspots. It claims that these areas often overlap with communities experiencing higher levels of deprivation.
However, the geographic distribution of arrests from LFR does not reflect the distribution of offences across London.
Some Hackney residents are concerned about how the technology fits into the Met’s history of racial bias in policing.
A 2019 study comparing 189 different algorithms found that they were between 10 and 100 times more likely to misidentify Black and Asian faces than white faces.
Black women had the highest rate of false positive matches.
Scarlet, 27, who grew up in Hackney, said: “I think it’s interesting how the technology was first trialled at the Notting Hill Carnival, an event specifically for Caribbean people.
“It’s that preconception that Black people are predisposed to criminal behaviour, which is completely false and a misguided notion, and these technologies keep reinforcing that.”
Likening LFR to the current stop and search tactics of the Met Police, she said: “I don’t agree with it, I would need to see more evidence showing its efficacy and usefulness, that it’s not just a further tool of control and threat.
“There’s a clear racial bias within the Met police. It’s very unchecked power at this point, I don’t think it even matters what statutes there are, because if they’re gonna do it, they’ll find a way, they’ll forge a reason if you fit a particular description.”
Home Office data for 2022/23 shows Black people are stopped and searched by police 4.5 times more often than white people.
In August, knife-crime activist Shaun Thompson brought a case to the High Court against the Metropolitan Police after he was mistakenly identified by live facial recognition cameras.
Thompson was held for 30 minutes by officers who demanded scans of his fingerprints and threatened him with arrest, despite him showing multiple identity documents to prove he was not the individual on the watchlist.
In a statement to Big Brother Watch, he said: “I’m bringing this legal challenge because I don’t want this to happen to other people.
“Facial recognition is like stop and search on steroids and doesn’t make communities any safer. It needs to be stopped.”
Deployment records show the rate of false alerts fell from 87.5% in 2020 to 0.56% in 2025, a relative decrease of 99.36%.
However, the Metropolitan Police’s annual review found that 80% of this year’s false alerts involved people from a Black background.
Jake Hurfurt, investigations lead at Big Brother Watch, said: “It’s not just bias in the technology, it’s how it’s used.
“The watchlist is disproportionately targeted towards ethnic minorities, so you get different outcomes.”
Almost half of the deployments in 2025 were in wards with a higher Black population than the London average.
A recent study published in Data & Policy, comparing LFR trials in London, Wales, Berlin and Nice, found that under British law a correct LFR match alert does not by itself constitute reasonable grounds for suspicion, the precondition for police to lawfully stop and detain an individual for questioning about a suspected crime.
Any attempt by a police officer to stop and question an individual whose face is matched to a watchlist must therefore be undertaken on the basis that the individual is not legally obliged to cooperate for that reason alone.
In practice, 46% of alerts in the 2025 deployments resulted in arrests.
Hurfurt stressed that there is no explicit legal basis for police use of facial recognition.
He said: “It’s never been voted on in the House of Commons, and there’s no law authorising it. The Met relies on a hodgepodge of various laws; it operates in a legal grey area that allows police officers to do what they want.”
The Data & Policy study highlighted a lack of transparency, informed consent, and weak data protection and privacy rights as consistent shortcomings.
It warned that the Met Police’s live facial recognition trials are little more than public performances to legitimise the use of powerful and invasive digital technologies before public debate and regulation have occurred.
A representative of the human rights organisation Liberty said: “The law on facial recognition is being far outpaced by police use, which is expanding all the time.
“The UK should follow the example of other countries that have put laws in place around how facial recognition is used by police, including robust safeguards to protect our rights.”
The Metropolitan Police have been approached for comment.