Published: Wed, May 16, 2018
People | By Neil Grant

Facial recognition technology is "dangerously inaccurate"

Founded in New Zealand, Auror offers software platforms designed to "help police and retail businesses collaborate and fight crime", according to its website.

Big Brother Watch is launching its campaign against automatic facial recognition (AFR) today in Parliament.

It is extremely unlikely that police attempts to harness the power of facial recognition will cease, as officers believe the benefits are potentially massive.

The Metropolitan Police used facial recognition at London's Notting Hill Carnival in 2016 and 2017 and at a Remembrance Sunday event. The systems incorrectly flagged 102 people as potential suspects, though none were arrested.

Of the matches flagged by South Wales Police's facial recognition system, 91% were wrong.
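
A figure like that is simply the share of system alerts that turned out to be the wrong person. As a minimal illustration (the counts below are invented for the example, not the force's actual numbers):

```python
# Hypothetical counts, for illustration only - not South Wales Police's data.
total_matches = 1000   # faces the system flagged as watchlist matches
false_matches = 910    # flags that turned out to be the wrong person

false_match_rate = false_matches / total_matches
print(f"{false_match_rate:.0%} of matches were wrong")  # prints: 91% of matches were wrong
```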

One particular development is the use of biometric data, including databases of facial images, in conjunction with Automatic Facial Recognition Technology.

How have police forces responded?

"When we first deployed and we were learning how to use it. some of the digital images we used weren't of sufficient quality", said Deputy Chief Constable Richard Lewis.

The tests correctly identified a total of two people who appeared on police databases, but neither was a wanted criminal and no arrests were made, according to law enforcement documents obtained by the report's authors.

"If you've got a higher level of confidence in the accuracy of your technology than it is able to deliver, you can accuse somebody of being untrustworthy, you can exclude them from your store, you can take all sorts of actions that are completely unjustified".

Despite a 2012 High Court ruling that keeping images of presumed-innocent people on file was unlawful, the government has said it is not possible to automate their removal.

"The system requires store security to identify the individual as trespassed from the store or as a person known to have shoplifted in our stores, and manually input the information into the system - they will then be picked up by the CCTV on future visits".

"If an incorrect match has been made, officers will explain to the individual what has happened and invite them to see the equipment along with providing them with a Fair Processing Notice".

"Regarding "false" positive matches - we do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts", it said in a statement. "Faces in the video stream that do not generate an alert are deleted immediately". Article 8 of the Human Rights Act says that any interference with the right to private life must be both necessary and proportionate.

The report also raised concerns that photos of any "false alarms" were sometimes kept by police for weeks.

"Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK".

What does Big Brother Watch want?

Currently, no legislation in the United Kingdom regulates police use of facial recognition through CCTV cameras, nor is there any independent oversight of how the police deploy these systems.

A police spokesman confirmed the force worked with Auror to reduce retail crime, and that the information retailers provided to the company's software was also shared with police.

It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, for which they have no legal power, and which poses a major risk to our freedoms.

"Should my concerns not be addressed I will consider what legal action is needed to ensure the right protections are in place for the public", said Ms Denham.

The UK Home Office told the BBC it plans to publish its biometric strategy in June.
