Published: Wed, May 16, 2018
People | By Neil Grant

Police must gain public trust on facial recognition tech

The report's authors submitted freedom of information requests to every United Kingdom police force, and both the Met and South Wales forces said they were testing facial recognition technology.

The UK's Information Commissioner has threatened to take legal action over the use of facial recognition in law enforcement if the police and government cannot prove the technology is being deployed legally.

Police attempts to use cameras linked to databases to recognise people's faces are failing, with the wrong person picked nine times out of 10, a report claims.

A Met police spokesperson said that all alerts against its watch list were deleted after 30 days, and that faces which do not generate an alert are immediately deleted.

Typically, people on this list have mental health issues, and Big Brother Watch expressed concern that the police said there had been no prior consultation with mental health professionals about cross-matching against people in this database.

South Wales Police defended its use of the facial recognition software, insisting that the system has improved over time. "When we first deployed, and we were learning how to use it, some of the digital images we used weren't of sufficient quality", Deputy Chief Constable Richard Lewis told the BBC.

But for now, the Big Brother Watch report says, the benefits are missing because the technology does not work.

Silkie Carlo, the director of Big Brother Watch, said: "Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK".

It's likely that the police's use of this intrusive and oppressive technology is not compatible with the UK's human rights laws, as it poses a significant threat to people's right to privacy and freedom of expression.

This is even more alarming in light of multiple studies showing that many high-profile and widely used facial recognition systems have much higher misidentification rates for people of colour and women (particularly women of colour).

South Wales Police's system has a false positive rate of 91 per cent, and its matches have led to 15 arrests, equivalent to 0.005 per cent of matches.
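As a rough guide to how such a figure is calculated, here is a minimal sketch in Python. The alert counts below are illustrative placeholders, not figures taken from the report; the rate is simply the share of system alerts that turned out to be wrong.

```python
# Illustrative sketch of how a "false positive rate" like the 91 per cent
# figure is derived. The counts below are placeholders, not figures from
# the Big Brother Watch report.

false_alerts = 910  # alerts where the flagged person was not the watch-list match
true_alerts = 90    # alerts confirmed as genuine matches
total_alerts = false_alerts + true_alerts

# Strictly speaking, this is the proportion of alerts that were wrong
# (sometimes called the false discovery rate), which is how the report
# uses the term "false positive rate".
false_positive_rate = false_alerts / total_alerts
print(f"False positive rate: {false_positive_rate:.0%}")  # -> "False positive rate: 91%"
```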

"Regarding "false" positive matches - we do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts", it said in a statement.

"Officers can quickly establish if the person has been correctly or incorrectly matched by traditional policing methods, either by looking at the person or through a brief conversation", a spokesperson said.

"On a much smaller number of occasions, officers went and spoke to the individual. realised it wasn't them, and offered them the opportunity to come and see the van".

Underlying the concerns about the poor accuracy of the kit are complaints about a lack of clear oversight - an issue that has been raised by a number of activists, politicians and independent commissioners in related areas.

Further details are expected in the long-awaited biometrics strategy, which is slated to appear in June. The use of images collected when individuals are taken into custody is also a concern, said the Information Commissioner, Elizabeth Denham; there are over 19 million images in the Police National Computer (PNC) database. The Scottish government commissioned and published a report into the use of biometrics in March of this year. How does the use of facial recognition technology in this way comply with the law?

Big Brother Watch is taking the report to Parliament today to launch a campaign calling for police to stop using the controversial technology, branded by the group as "dangerous and inaccurate".

In its report, Big Brother Watch said: "Automated facial recognition cameras are biometric identification checkpoints that risk making members of the public walking ID cards".
