Facial Recognition Reportedly Falls Flat: Over 80% of Those Flagged Are Innocent
For those who want to outsource policing to technology, facial recognition is where it’s at. You’ve all heard the line before: If you’re not guilty, what do you have to fear?
Well, plenty, if a study out of the United Kingdom is any indication. In London, Albion’s largest city, over four out of five of those flagged by the facial recognition system as possibly being connected to a crime were actually innocent.
According to Sky News, a study conducted by University of Essex researchers of the facial recognition system used in tests by London’s Metropolitan Police found 81 percent of those identified by the system were tagged incorrectly.
The Met Police, for their part, say the technology makes only one error in every 1,000 faces scanned.
Sky News said that the police force “uses a different measurement to arrive at this conclusion” than the researchers did. I would say so.
The Metropolitan Police force’s facial recognition monitoring program dates back to 2016, when trials began at the Notting Hill Carnival, Sky News reported. Nine more trials of the technology have since been conducted.
“The first independent evaluation of the scheme was commissioned by Scotland Yard and conducted by academics from the University of Essex,” Sky News reported on Thursday.
“Professor Pete Fussey and Dr Daragh Murray evaluated the technology’s accuracy at six of the 10 police trials. They found that, of 42 matches, only eight were verified as correct — an error rate of 81%. Four of the 42 were people who were never found because they were absorbed into the crowd, so a match could not be verified.”
“Our report conducted a detailed, academic, legal analysis of the documentation the Met Police used as a basis for the face recognition trials,” Fussey said.
“There are some shortcomings and if [the Met] was taken to court there is a good chance that would be successfully challenged.”
“Ultimately, the impression is that human rights compliance was not built into the Metropolitan Police’s systems from the outset,” Murray said, “and was not an integral part of the process.”
David Davis, a former member of Parliament and shadow home secretary for the Conservative Party, said that London’s system “could lead to miscarriages of justice and wrongful arrests” and could have “massive issues for democracy,” the U.K. Guardian reported.
“All experiments like this should now be suspended until we have a proper chance to debate this and establish some laws and regulations,” he said. “Remember what these rights are: Freedom of association and freedom to protest; rights which we have assumed for centuries, which shouldn’t be intruded upon without a good reason.”
Silkie Carlo, director of the British privacy advocacy group Big Brother Watch, agreed.
“[This report] is absolutely definitive and I think there is really no recovery from this point,” Carlo told the Guardian.
“The only question now is when is the Met finally going to commit to stop using facial recognition.”
Here is how the Guardian described the test runs:
“Scotland Yard granted the University of Essex academics access to six deployments of the system between June 2018 and February 2019. It uses cameras fixed to posts or on a van, and software cross-checks face-scans of passersby against a ‘watchlist’.
“The research found that police were too hasty to stop people before matches could be properly checked, which led to mistakes; watchlists were sometimes out of date and included people wanted by the courts as well as those considered ‘at risk or vulnerable’; and officers viewed the technology as a way of detecting and deterring crime, which the report argued could have been achieved without biometric technology. They said it was ‘highly possible’ Scotland Yard’s use would be ruled unlawful if challenged in court.”
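NEC hasn’t published NeoFace’s internals, so take this as a rough illustration only: a minimal Python sketch of the kind of threshold-based watchlist check the Guardian’s description implies. The embedding size, threshold, and function names below are assumptions made for the sake of the example, not the Met’s actual configuration.

```python
import numpy as np

EMBEDDING_DIM = 128      # size of each face "embedding" vector (assumed)
MATCH_THRESHOLD = 0.6    # similarity cutoff for raising an alert (assumed)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face: np.ndarray,
                            watchlist: dict[str, np.ndarray]) -> str | None:
    """Return the ID of the strongest watchlist match above threshold, else None."""
    best_id, best_score = None, MATCH_THRESHOLD
    for person_id, listed_face in watchlist.items():
        score = cosine_similarity(face, listed_face)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy demo with random vectors standing in for real face scans.
rng = np.random.default_rng(0)
watchlist = {f"suspect-{i}": rng.normal(size=EMBEDDING_DIM) for i in range(100)}
passerby = rng.normal(size=EMBEDDING_DIM)
print(check_against_watchlist(passerby, watchlist))  # almost always None here
```

The design choice that matters is the threshold: set it low enough to catch disguised or poorly lit faces and the system will also raise alerts on innocent lookalikes, which is why a human is supposed to verify every match before a stop is made. That is precisely the step the researchers found was too often skipped.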
Amazingly, even though this study was commissioned by Scotland Yard, the police force continued to maintain that its tests of the technology weren’t just legal but also successful.
Mind you, the Metropolitan Police measures the system’s success by comparing the number of misidentifications against the total number of faces scanned; the share of innocent people among those the system actually flags doesn’t factor into it at all.
That, by the way, is the “different measurement” we talked about before.
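To see how the Met and the researchers can pull such different numbers from the same trials, it helps to write both calculations out. Here is a minimal sketch using the study’s reported figures (42 flags, eight verified matches); the article never says how many faces were scanned in total, so the 50,000 below is a purely hypothetical stand-in.

```python
# Figures reported from the University of Essex study.
flags = 42                # people the system flagged as possible matches
verified_correct = 8      # flags confirmed to be the right person
wrong = flags - verified_correct        # 34 mistaken flags

# HYPOTHETICAL: the article does not report the total number of faces scanned.
faces_scanned = 50_000

# Researchers' measure: of the people flagged, how many were wrong?
researcher_error_rate = wrong / flags           # 34 / 42 ≈ 81%

# Met-style measure: of everyone scanned, how many were misidentified?
met_style_error_rate = wrong / faces_scanned    # 34 / 50,000 ≈ 0.07%

print(f"Researchers: {researcher_error_rate:.0%} of flags were wrong")
print(f"Met-style:   {met_style_error_rate:.2%} of all scans were errors")
```

Divide the same 34 mistakes by the number of people flagged and the system looks broken; divide them by the tens of thousands of faces that streamed past the camera and the errors all but vanish.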
(When an entity, in this case the London Metropolitan Police, practically invites you to ask, “Exactly what kind of idiot do you take me for?” you can guess at the purity of its intentions.)
The deputy assistant commissioner of the Metropolitan Police, meanwhile, attacked the report because it wasn’t nice enough to the police.
“We are extremely disappointed with the negative and unbalanced tone of this report … We have a legal basis for this pilot period and have taken legal advice throughout,” Duncan Ball said.
“We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer.”
Except that if it’s misfiring 81 percent of the time, the facial recognition system isn’t protecting anyone except the people with jobs working on the NeoFace system at its Japanese manufacturer, NEC.
That’s the setup the Metropolitan Police force is using. A NeoFace software deployment in California cost the state $1.8 million, according to Vice — which isn’t exactly exorbitant, but not exactly small change, either.
There are also deeper issues with London’s use of the system, including the apparent fining of a man who didn’t wish to take part in a facial recognition trial.
“Treating LFR [live facial recognition] camera avoidance as suspicious behavior undermines the premise of informed consent,” the researchers wrote.
“The arrest of LFR camera-avoiding individuals for more minor offenses than those used to justify the test deployments raise clear issues regarding the extension of police powers and of ‘surveillance creep.’”
I mention all of this because it makes for a compelling argument against deploying these cameras here in the United States. London’s plan isn’t happening in a vacuum. It isn’t just the government, either: The Guardian reports that NEC is selling “the same technology to retailers and casinos to spot regular customers, and to stadium and concert operators to scan crowds for ‘potential troublemakers.’”
Doesn’t that make you feel comfortable? Are you not swaddled in the warm blanket of impersonal digital safety?
At a surface level, this plan blows money on Big Brother-invoking technology that doesn’t work, money that could be better directed toward human policing that does. But even if NeoFace eventually works as intended and somehow manages to identify actual suspects more than 19 percent of the time, there are obvious, if abstract, issues involving basic human freedoms at work here.
Mass surveillance is antithetical to the concept of a free society. To foist it upon us is basically China-lite: Behave, we’re told, and nothing will happen to us. Unless we opt out and get fined or get misidentified and get dragged in.
Some of us are probably willing to countenance this because we believe our government organs, law enforcement included, are more benign than China’s. That’s a low bar to clear when the competition is Xi Jinping’s brutal state apparatus. I’d also posit (rather safely, given where I’m writing this) that most of our readership agrees the government already has too much power over us.
We shouldn’t make an intellectual carve-out for law enforcement, particularly when it’s outsourcing important human functions to an indifferent computer software suite that, as near as we can tell, doesn’t work, all at the expense of our right to privacy. Scanning our faces in public places and checking them against massive government photo databases shouldn’t factor into anyone’s concept of limited government.
Surveillance creep ought to frighten every one of us. London’s trials should be ended immediately and the deployment of NeoFace software by officialdom anywhere in the free world ought to be closely examined.
This is a rare issue in world politics today: One where liberals and conservatives can find a certain comity and fight in a common cause. Facial recognition software is immoral, in opposition to the spirit of a free society and of dubious legality in many of the areas it’s deployed.
Perhaps most importantly, the software seems to be failing at a job that an increased police presence could accomplish without the same ethical concerns. If we want to improve public safety, that’s the road we should be traveling down.
It’s worth remembering that a benignly Orwellian system is still Orwellian — and how long it stays benign depends entirely on who’s in charge.