London cops’ controversial public trial of facial recognition technology is coming to an end this week in Romford – and campaigners hope it will be the last time the force uses the kit.
The Metropolitan Police have been using automated facial recognition (AFR) technology since 2016, but their decision to do so in the absence of any legal framework or oversight has drawn criticism from privacy advocates, politicians and watchdogs.
Amid widespread complaints, the force said its use of the kit was simply a trial and that a full assessment would be completed this year.
Today and tomorrow mark the final time it will be used in the trial, with a van parked outside Romford Station in East London from 10am to 6pm – plenty of time to catch rush hour.
The Met has consistently said that the technology will be used “overtly”, but at the previous rollout in December the cameras were fitted to unmarked green vans.
Campaign group Liberty, on the scene, said that this time there is Met Police branding on the blue van – although the branding says nothing about AFR. Both cameras are focused on the entrance to the station.
The group also noted that one of the two public information notices is positioned so that anyone reading it would be caught on camera.
Met police have placed this sign outside #Romford Station, but #FacialRecognition cameras (on top of that blue van) are aimed at it, so your face is being scanned and mapped as you read it #ResistFacialRec pic.twitter.com/1luuvbdUK5
— Liberty (@libertyhq) January 31, 2019
“One of Liberty’s key concerns is that this is supposed to be a trial, but by the time you’re informed that it’s happening you’re probably already on camera,” said Liberty’s advocacy and policy officer Hannah Couchman.
She added that there was a “real lack of transparency” about the use of the tech, pointing to the fact the police only issued a press release revealing the location yesterday at 4pm.
Neither is there information about who is on the watch list – a database of images drawn up for each use of the tech that is scanned against the images collected by the cameras – or where those images are taken from, which can include social media.
In its press release, the Met said that those who decline to be scanned “will not necessarily be viewed as suspicious” but that “officers will use their judgement to identify any potential suspicious behaviour”.
Couchman told The Register this risked chilling people’s freedom of expression even further as those who felt it was a threat to their rights might then be treated as suspects simply for opting to cover their faces.
Underlying such concerns are question marks over the efficacy of the tech itself. A Freedom of Information request by Big Brother Watch found the Met’s use of the tech had a 98 per cent false positive rate. An academic study also poked holes in its abilities in low light and crowds.
“The police’s use of live facial recognition has been an expensive failure, costing hundreds of thousands of pounds, and with a misidentification rate of almost 100 per cent,” said Griff Ferris of Big Brother Watch.
“Despite this, the police have continued to subject Londoners to this lawless and intrusive technology, ignoring and infringing their rights.”
Once this week’s deployments have finished, the Met will have used the tech 10 times, which is what it committed to as part of its trial.
“Following the final deployments this week, a full independent evaluation of the deployments and the technology itself will commence,” said detective chief superintendent Ivan Balhatchet.
Liberty’s Couchman said that the group’s hope is that the force will choose not to keep using the technology in operations, warning that to extend its use “would be a significant infringement of our rights”.
In the meantime, the force’s use of AFR is also facing an independent probe from the Information Commissioner’s Office and a legal challenge from Big Brother Watch.
Balhatchet said: “We continue to engage with many different stakeholders, some who actively challenge our use of this technology. In order to show transparency and continue constructive debate, we have invited individuals and groups with varying views on our use of facial recognition technology to this deployment.” ®