T4K3.news
EHRC challenges Met facial recognition use
The EHRC says the Met's use of live facial recognition breaches human rights law; the force faces a judicial review in January 2026.

Regulator raises concerns over Met facial recognition camera use
The Equality and Human Rights Commission has criticised the Metropolitan Police's use of live facial recognition technology (LFRT), arguing that the deployment breaches human rights law and falls short of the required standard of necessity and proportionality. The EHRC has been granted permission to intervene in an upcoming judicial review of the force's use of the surveillance tool.

John Kirkpatrick, chief executive of the EHRC, stressed that there must be clear rules to ensure the technology is used only where it is necessary, proportionate and subject to proper safeguards. The Met Police says the judicial review hearing is set for January 2026 and maintains that its use of live facial recognition is lawful and in line with its policy.
The Met notes it has made more than 1,000 arrests using LFRT since January 2024 and continues to defend its policing approach, while the EHRC highlights the need for robust oversight and concrete safeguards to prevent abuses.
Key Takeaways
"There must be clear rules which guarantee that live facial recognition technology is used only where necessary, proportionate and constrained by appropriate safeguards."
EHRC chief executive John Kirkpatrick articulating the regulator's position
"We are confident that our use of live facial recognition is lawful and follows the policy."
Met Police spokesperson on the current policy
"A judicial review hearing is scheduled for January 2026 and we are fully engaged in this process."
Met Police spokesperson on upcoming legal proceedings
"The tech could be used to combat serious crime and keep people safe."
John Kirkpatrick, acknowledging the technology's potential benefits while insisting on safeguards
This confrontation spotlights a broader debate about how modern policing should deploy facial recognition. The court challenge could shape how quickly the technology is adopted in public spaces and how tightly it is regulated. The case raises questions about transparency, independent review and the balance between public safety and civil liberties. As courts weigh technical effectiveness against rights, trust in policing may hinge on proven safeguards and enforceable limits rather than on promises of safety alone.
Highlights
- Tech must serve rights, not erode them
- Public trust hinges on real safeguards
- Clear rules protect safety and liberty
- If a tool outpaces law, the public pays the price
Privacy and oversight concerns over LFRT
The use of live facial recognition raises privacy and civil liberties concerns, with the potential for chilling effects and uneven enforcement. The EHRC's intervention underlines the political sensitivity of the technology and the risk of public backlash if safeguards prove inadequate.
The outcome of the legal challenge may help define how crime prevention and civil liberties are balanced in future policing.