11 August 2020
The UK Court of Appeal has unanimously reached a decision against a face-recognition system used by South Wales Police. The judgment, which called the use of automated face recognition (AFR) “unlawful”, could have ramifications for the widespread use of such technology across the UK. But there is disagreement about exactly what the consequences will be.
Ed Bridges, who launched the case after police cameras digitally analysed his face in the street, had appealed, with the support of civil rights campaign group Liberty, against the use of face recognition by police. The police force claimed in court that the technology was similar to the use of closed-circuit television (CCTV) cameras in cities. The Court of Appeal judges unanimously disagreed.
The system in question, AFR Locate, had been trialled across the South Wales police force since 2017 at big events such as concerts and sports matches. It compares images captured using the system against a database of images of people on a watch list, including criminal suspects and people of interest.
The Court of Appeal said that three of the five arguments put forward by Bridges and Liberty in their case were valid, including the lack of guidance and rules as to when the force could use AFR and who would be on the database of images in the watch list.
The court also said the police force hadn’t tried to understand whether the system being used was biased depending on a person’s gender or race – issues previously highlighted with face-recognition technology. Similar concerns have been raised about the Metropolitan Police’s use of AFR; in both cases, the forces have said that any match flagged up by such systems is double-checked by a human officer.
“The court held that the existing legal regime for facial-recognition technology is not robust enough to enable police to use the technology lawfully,” says Carly Kind at the Ada Lovelace Institute in London. South Wales police chief constable Matt Jukes said the force was “confident this is a judgment we can work with”.
Does this mean the end of AFR in the UK for now? It’s unclear, because experts have interpreted the ruling differently.
“It means that any use of AFR must be stopped until an appropriate legal basis is established,” says Daragh Murray at the Human Rights Centre of the University of Essex, UK. “The court was unequivocal that their use of AFR was not in accordance with the law.”
Angela Daly at the Centre for Internet Law and Policy at the University of Strathclyde, UK, believes otherwise. “The judgment doesn’t mean that police use of facial recognition technology in England and Wales is illegal, but it must be used in accordance with a very clear, detailed and proportionate legal framework, which was lacking in this case.”
Technology ethicist Stephanie Hare also worries that this sets a precedent only for South Wales Police, not for law enforcement across the country or the private sector. There is a strong need for “a nationwide decision on all uses of this technology, not just one police force in one nation of the United Kingdom”, she says. That’s something the UK’s Equality and Human Rights Commission called for in March 2020, and which the UK’s independent Surveillance Camera Commissioner, Tony Porter, has also called for today.
“Other police forces will be hard-pressed to justify using this technology following this decision, in addition to the decision by four of the biggest US technology companies – Google, IBM, Microsoft and Amazon – not to sell facial recognition technology to the police because of its problems with inaccuracy, bias and threat to civil liberties and human rights,” says Hare.