UK police use of facial recognition tests public’s tolerance
LONDON (AP) — When British police used facial recognition cameras to monitor crowds arriving for a soccer match in Wales, some fans protested by covering their faces. In a sign of the technology’s divisiveness, even the elected police and crime commissioner for a neighboring force said he opposed it.
The South Wales police deployed vans equipped with the technology outside Cardiff’s stadium this week as part of a long-running trial in which officers scanned people in real time and detained anyone banned from attending because of past misbehavior. Rights activists and team supporters staged a protest before the game between Cardiff City and Swansea City, wearing masks, balaclavas or scarves around their faces.
“It’s disproportionate to the risk,” said Vince Alm, chairman of the Football Supporters’ Association Wales, which helped organize the protest. “Football fans feel as if they’re being picked on” and used as guinea pigs to test new technology, he said.
The real-time surveillance being tested in Britain is among the more aggressive uses of facial recognition in Western democracies and raises questions about how the technology will enter people’s daily lives. Authorities and companies are eager to use it, but activists warn it threatens human rights.
The British have long been accustomed to video surveillance, living with one of the highest densities of CCTV cameras in the world. Cameras have been used in public spaces for decades, first by security forces confronting the threat from the Irish Republican Army and later, after Sept. 11, 2001, in response to domestic terror attacks.
The recent advances in surveillance technology mean a new wave of facial recognition systems will put the public’s acceptance to the test.
South Wales police have taken the lead in Britain. In 2017 the force began rolling out and testing face-scanning cameras after receiving a government funding grant. While a court last year ruled the trial lawful, regulators and lawmakers have yet to draw up statutory rules on the technology’s use.
The van-mounted cameras, using technology from Japan’s NEC, scan faces in crowds and match them against a “watchlist,” a database made up mainly of people wanted for or suspected of a crime. If the system flags someone passing by, officers stop that person to investigate further, according to the force’s website.
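In broad terms, real-time matching of this kind can be thought of as comparing a numerical “embedding” of each detected face against the embeddings of everyone on the watchlist and flagging only scores above a threshold. The sketch below is purely illustrative and is not NEC’s system: the 128-dimension embeddings, the threshold and the `check_passerby` helper are assumptions made up for the example, and random vectors stand in for the output of a real face encoder.

```python
# Illustrative sketch only, not NEC's system. Assumes a separate face encoder
# has already turned each detected face into a fixed-length embedding vector;
# random vectors stand in for those embeddings here.
import numpy as np

rng = np.random.default_rng(0)

def cosine_similarity(one, many):
    """Cosine similarity between one embedding and each row of a matrix."""
    one = one / np.linalg.norm(one)
    many = many / np.linalg.norm(many, axis=1, keepdims=True)
    return many @ one

# Hypothetical watchlist: 100 people, each represented by a 128-d embedding.
watchlist = rng.normal(size=(100, 128))

def check_passerby(face_embedding, threshold=0.6):
    """Return the watchlist index to flag, or None if no match clears the bar."""
    scores = cosine_similarity(face_embedding, watchlist)
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

# A passer-by who happens to resemble watchlist entry 42 (for illustration).
passerby = watchlist[42] + rng.normal(scale=0.1, size=128)
print(check_passerby(passerby))               # likely 42 -> officers investigate
print(check_passerby(rng.normal(size=128)))   # likely None -> data discarded
```

In practice the threshold is the crux: set it low and more wanted people are spotted at the cost of more wrongful stops; set it high and the reverse, which is why the accuracy figures reported later in this article matter.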
Rights groups say this kind of monitoring raises worries about privacy, consent and algorithmic accuracy, as well as questions about how faces are added to watchlists.
It’s “an alarming example of overpolicing,” said Silkie Carlo, director of privacy campaign group Big Brother Watch. “We’re deeply concerned about the undemocratic nature of it. This is a very controversial technology which has no explicit basis in law.”
Her group has scrutinized other British police trials, including one by the London Metropolitan force last year, when officers pulled aside a man who tried to hide his face. They ended up fining him for a public order offense, the group said.
The North Wales police commissioner, Arfon Jones, said using facial recognition to take pictures of soccer fans was a “fishing expedition.” He also raised concerns about false positives.
British police and crime commissioners are civilians elected to oversee and scrutinize the country’s dozens of forces. They were introduced in 2012 to improve accountability.
“I’m uncomfortable at this creeping interference with our privacy,” Jones, himself a former police officer, said in an interview. He said police would be more justified in using the technology if they had intelligence about a specific threat, such as an impending terrorist attack.
Jones clashed with his South Wales counterpart, Alun Michael, after raising similar concerns at a game-day deployment in October.
Michael said Jones’ criticism was based on a misunderstanding both of the technology and of the extensive scrutiny the force faces.
“It is incomprehensible that Arfon Jones should not support measures which keep football fans safe,” Michael said.
Facial recognition was used to spot fans banned from attending Sunday’s game because of previous misbehavior, and anyone else’s biometric data was automatically deleted, he said.
“There has not been one single wrongful arrest as a result of the use of facial recognition by South Wales Police,” Michael said. The force has been deploying the technology roughly twice a month at big events including rugby games, royal visits and yacht races; it scanned nearly 19,000 faces at a Spice Girls concert in May and identified 15 people on a watchlist, nine of them incorrectly. The other six were arrested.
“In laboratory conditions it’s really effective,” said University of Essex professor Pete Fussey, who monitored the London police trials, which also used NEC’s system, and found a different outcome on the streets. A report he co-authored last year said only eight of the system’s 42 matches were correct. The London program has since been suspended.
“The police tended to trust the algorithm most of the time, so if they trust the computational decision-making yet that decision-making is wrong, that raises all sorts of questions” about the accountability of the machine, he said.
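Taken together, the reported figures give a rough sense of how often a flag was right. The short calculation below is just arithmetic on the numbers quoted above; the thresholds and crowd conditions behind them were not disclosed.

```python
# Share of flagged matches that were correct, from the figures reported above.
spice_matches, spice_wrong = 15, 9        # Spice Girls concert, May
london_matches, london_correct = 42, 8    # Essex report on the London trials

print((spice_matches - spice_wrong) / spice_matches)  # 0.4   -> 40% of flags correct
print(london_correct / london_matches)                # ~0.19 -> 19% of flags correct
print(spice_wrong / 19_000)                           # ~0.0005 wrong flags per face scanned
```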
The debate is also playing out in the U.S., where real-time crowd surveillance is still rare and the technology is more commonly used to identify suspects by running their images through a pool of police mugshots or driver’s license photos.
Critics in the U.S., including politicians, want to ban or curtail facial recognition over fears of racial discrimination. Some point to China’s vast networks of street cameras, which are used to monitor ethnic minorities.
Britain is the world’s fourth most camera-dense country, with one security camera per 6.5 people, according to IHS Markit.
London is the fifth most surveilled city in the world, and one of only two non-Asian cities in the top 10, according to a report by Comparitech. The British capital has nearly 628,000 surveillance cameras.
Surveillance is so widespread that Britain even has a surveillance camera commissioner, Tony Porter.
He and the information commissioner, Elizabeth Denham, have urged police forces not to treat a British High Court ruling that found the South Wales trial lawful as a green light for the generic deployment of automated facial recognition.
Denham is investigating its use by police and private companies. Store owners and landlords are among those keen to use the technology to spot shoplifters and abusive customers.
British startup Facewatch sells a security system to retailers like convenience store chain Budgens that “matches faces against known offenders within seconds of them entering your premises” and sends instant alerts.
The developer of London’s King’s Cross estate said last year that it had deployed two facial recognition cameras from May 2016 to March 2018 to prevent and detect crime in the neighborhood. The disclosure sparked a backlash because the system had been used without the public’s knowledge or consent.
___
For all of AP’s tech coverage, visit https://apnews.com/apf-technology
___
Follow Kelvin Chan at www.twitter.com/chanman