Facial recognition furore: Met police chief backs technology that doesn’t work 98% of the time
The head of the London Metropolitan Police has said that she is “completely comfortable” with the ongoing trials of facial recognition technology in the city – despite legal challenges and reports that it fails most of the time.
Police Commissioner Cressida Dick made the comments on Wednesday at a London Assembly meeting aimed at addressing concerns about her force's trial and rollout of automated facial recognition (AFR) technology in parts of the city.
Insisting that people living in the city would approve of the controversial measures, Dick said:
"If there's a technology that we can use lawfully – which we can, this is one – and is available, which we are trialling with massive safeguards... [and there is] the notion that that technology might be used in limited circumstances to identify against a small list of wanted offenders for serious violence, [then] I think the public would expect us to be thinking about how we can use that technology, seeing if it’s effective or efficient for us. And that’s exactly what we’re doing,” according to a report of the meeting by The Register.
So far this year, the technology has been used four times, and plans are in place to deploy it a further five times by year's end.
Her defense of AFR comes just one month after a freedom of information request by civil liberties campaign group Big Brother Watch revealed that the Met's AFR had a 98 percent false positive rate and had made only two accurate matches.
Of the two correct matches the Met’s technology has made to date, there have been zero arrests. One match was for an individual on an out-of-date watch list; the other for a person with mental health issues who frequently contacts public figures, but is not a criminal and not wanted for arrest, according to technology news site, The Verge.
Referring to the technology as a "tool" and a "tactic," Dick defended its continued use in the field even while conceding: "I'm not expecting it to result in lots of arrests."
BBW is campaigning for UK public authorities "to immediately stop using automated facial recognition software with surveillance cameras," and is calling on the Home Office to "automatically remove the thousands of images of unconvicted individuals from the Police National Database."
Referring to the technology and collection of data on citizens as an “Orwellian surveillance tool,” BBW said AFR had “no place on our streets,” and along with Green Party peer Baroness Jenny Jones has written to the Home Office demanding its immediate cessation.
In June, Dick admitted to the Home Affairs Select Committee that facial recognition is moving very fast in the absence of legal frameworks. She dismissed these legal concerns on Wednesday, saying that she was "completely comfortable" with AFR's continued use.
Commissioner of the Met on facial recognition:

To Home Affairs Select Committee (5 June): "facial recognition [is] moving very fast… in the absence of legal frameworks"

To London Assembly (4 July): "I am completely comfortable [with its use] and we're going to carry on" 🤔

— Big Brother Watch (@bbw1984) July 5, 2018
“We’re going to carry on with the trial,” she added.