Documents obtained by Big Brother Watch in 2018 showed the Met’s deployments had rung up a 98% false positive rate in May of that year. Nothing improved as time went on. Subsequent documents showed a false positive rate of 100%. Every "match" was wrong. Not exactly the sort of thing you want to hear about tech capable of scanning 300 faces per second.
This followed an earlier report covering a test run by the South Wales Police at a handful of public events. In comparison, the South Wales tests were a success: a mere 92% of its matches were false positives.
The Met’s tech showed some slight improvement in 2019, moving up to a 96% false positive rate. This continued failure to recognize faces -- along with a number of privacy concerns -- prompted a UK Parliamentary Committee to call for an end to the use of facial recognition tech by UK government agencies. This advice was ignored by the Home Office, which apparently believed UK law enforcement would be able to fail upwards towards a brave new world of facial recognition tech worth the money being spent on it.
We’ve apparently reached that inflection point. Test runs are a thing of the past. It’s time for Londoners to put their best face forward.
British police are to start operational use of live facial recognition (LFR) cameras in London, despite warnings over privacy from rights groups and concerns expressed by the government’s own surveillance watchdog.
First used in the capital at the Notting Hill carnival in 2016, the cameras will alert police when they spot anyone already on "wanted" lists.
"The use of live facial recognition technology will be intelligence-led and deployed to specific locations in London," the city’s Metropolitan Police said in a statement.
"Intelligence-led," says the agency that has so far only managed to incorrectly identify people almost 100% of the time. There’s more "intelligence" further on in the article when the Met says the software that’s hardly managed to correctly identify people will help "identify and apprehend suspects." Gun and knife crime top the list of things expected to be curtailed by unproven tech, followed by the sexual abuse of children and "protecting the vulnerable."
Also lol a bit at this, which uses a trite phrase made even triter by the abysmal performance of the Met’s AI:
Metropolitan Police Assistant Commissioner Nick Ephgrave said in a statement: "We are using a tried-and-tested technology, and have taken a considered and transparent approach in order to arrive at this point."
He’s technically correct. It has been tried and tested. What it hasn’t been is accurate, and that’s what counts most when people’s rights and freedoms are on the line. But better an unknown number of innocent people be misidentified than allow a single suspect to go unscanned, I guess.