The Metropolitan Police are facing a High Court battle over the use of live recognition cameras after an anti-knife crime campaigner’s ‘human rights were violated’ when he was wrongly identified as a suspect.

Shaun Thompson, 39, a respected black community worker, was wrongly flagged up as a criminal after being filmed at London Bridge station.

Mr Thompson was held for 30 minutes under threat of arrest; he had in fact been returning home from a voluntary anti-knife crime shift in Croydon.

Big Brother Watch, a UK civil liberties campaign group, is bringing a High Court case on behalf of Mr Thompson, arguing that the force's use of the technology breached his right to privacy under the European Convention on Human Rights (ECHR).

The group has argued that the deployment of the cameras is so ‘permissive’ that it breaches Article 8 of the ECHR and is ‘not in accordance with the law’.

The Metropolitan Police is one of the first forces to pioneer the technology, and is now one of 13 forces which have used or are currently using live facial recognition (LFR) cameras. 

Sir Keir Starmer is said to be keen to scale up the use of live cameras, which work by taking digital images of passing pedestrians and feeding them into a computer using biometric software to measure facial features.

The image is compared with a watchlist and if a match is detected, an alert is sent to officers to consider an arrest. If a member of the public is not wanted by police, their biometrics are immediately deleted.


What is live facial recognition?

Live facial recognition allows the police to recognise wanted individuals among a large crowd in real time.

Police use a series of cameras to record the faces of anyone who passes through a set zone.

An algorithm compares the faces of those walking in front of the camera to a ‘watchlist’ of wanted criminals and an alert is generated if there’s a match.

The watchlist includes individuals who are wanted for committing crime, who are banned from an area or who pose a risk to the public. 

The cameras look just like standard CCTV cameras, but do not record footage. In the event of a ‘no match’, the data will be deleted immediately.
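The match-or-delete flow described above can be sketched in a few lines. This is a purely illustrative model, not code from any real police system: the names (`check_face`, `MATCH_THRESHOLD`, the watchlist structure) and the cosine-similarity comparison are assumptions standing in for whatever biometric software the forces actually use.

```python
import numpy as np

MATCH_THRESHOLD = 0.6  # assumed similarity cutoff, purely illustrative


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two biometric templates (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_face(embedding: np.ndarray, watchlist: dict) -> "str | None":
    """Compare a face template against the watchlist.

    Returns the matched watchlist ID (an alert is sent to officers),
    or None - in which case the biometric data must be deleted immediately.
    """
    for person_id, reference in watchlist.items():
        if cosine_similarity(embedding, reference) >= MATCH_THRESHOLD:
            return person_id  # alert: officers consider an arrest
    return None  # no match: biometrics deleted


# Example: one wanted template on the watchlist, one innocent passer-by.
watchlist = {"W001": np.array([1.0, 0.0, 0.0])}
print(check_face(np.array([0.0, 1.0, 0.0]), watchlist))  # None (no match)
print(check_face(np.array([0.9, 0.1, 0.0]), watchlist))  # W001 (match)
```

The Thompson case turns on what happens when this comparison produces a false positive: the alert fires, but the person in front of the camera is not the person on the watchlist.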

The Met Police say that the cameras are lawful, necessary, proportionate and are targeted at crime hotspots. The force claims it has had just 10 ‘false alerts’ out of three million images.
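The Met's figure can be checked directly: 10 false alerts out of three million images works out at roughly 0.0003 per cent, the rate the force quotes elsewhere in this article.

```python
# Checking the Met's claimed false-alert rate:
# 10 false alerts out of three million images scanned.
false_alerts = 10
images_scanned = 3_000_000

rate_percent = false_alerts / images_scanned * 100
print(f"{rate_percent:.4f}%")  # 0.0003%
```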

However, Big Brother Watch claims the deployment of LFR extends well beyond crime hotspots, taking in critical national infrastructure, public events, and locations chosen on the basis of officers’ intelligence about crime.

Big Brother Watch and Mr Thompson have submitted expert evidence they claim shows the majority of the public spaces in London fall within the broad ‘crime hotspot’ definition.

They argue this means that there is ‘no meaningful constraint’ on the deployment of live facial recognition across the capital.

Mr Thompson was travelling through London Bridge in February 2024 when he was held by officers after he was wrongly flagged by the cameras.

He claims the officers demanded identity documents, fingerprint scans, and inspected him for scars and tattoos in an attempt to confirm he was the suspect.

Despite providing identification documents proving he had been falsely identified, Mr Thompson says he was threatened with arrest.

He described the police’s use of live facial recognition technology as ‘stop and search on steroids’. 

Silkie Carlo, the director of Big Brother Watch, said: ‘The possibility of being subjected to a digital identity check by police without our consent almost anywhere, at any time, is a serious infringement on our civil liberties that is transforming London.

‘When used as a mass surveillance tool, LFR reverses the presumption of innocence and destroys any notion of privacy in our capital.’

Ms Carlo also argued in the legal challenge that the Metropolitan Police’s use of LFR breached individuals’ rights to freedom of expression and freedom of assembly, protected by Articles 10 and 11 of the ECHR.

She said that ‘excessively broad discretion’ had a ‘chilling’ effect on individuals’ ability to protest.

‘This legal challenge is a landmark step towards protecting the public against intrusive monitoring,’ Ms Carlo added.

It comes as the Home Secretary, Shabana Mahmood, has defended plans for a rollout of live facial recognition to all 43 police forces in England and Wales.

‘Of course, it has to be used in a way that is in line with our values, doesn’t lead to innocent people being caught up in cases they shouldn’t have been involved in,’ she told LBC.

‘But this technology is what is working. It’s already led to 1,700 arrests in the Met alone. I think it’s got huge potential.’

Ms Mahmood also announced that the number of LFR vans will triple under the plan, with 50 vans being made available to every police force in England and Wales. 

British police forces will receive a high-tech upgrade as Home Secretary Shabana Mahmood announces more than £140 million in funding for technology, including 50 facial recognition vans per police force.

Under current rules, the technology can only be used to search for watchlists of wanted criminals, suspects, or individuals subject to bail or court order conditions.

The government says that this technology will be ‘governed by data protection, equality and human rights laws’ and that faces flagged by the facial recognition system must also be reviewed and confirmed by officers before action is taken.

Even so, rights groups have expressed concerns over the expansion of this surveillance technology.

Big Brother Watch advocacy manager Matthew Feeney said: ‘An expansion of facial recognition on this scale would be unprecedented in liberal democracies, and would represent the latest in a regrettable trend.

‘Police across the UK have already scanned the faces of millions of innocent people who have done nothing except go about their days on high streets across the country.’

The Government is also yet to finish its facial recognition consultation, which would provide a legal framework for deploying live facial recognition. 

A Metropolitan Police spokesperson told the Telegraph: ‘We have stringent safeguards in place to protect people’s rights and privacy.

‘Independent testing confirms that the technology performs consistently across demographic groups, and our operational data shows an exceptionally low false alert rate of just 0.0003 per cent.

‘We are committed to providing clear reassurance that rigorous checks, oversight, and governance are embedded at every stage to safeguard people’s rights and privacy.’


