Published: Wed, May 16, 2018
Business | By Kate Woods

Police facial recognition systems are 98 per cent inaccurate, says research

Dancers at a Caribbean carnival in a west London street, peaceful protestors at a lawful demonstration against an arms fair, and citizens and veterans paying their respects to war dead on Remembrance Sunday - these people have all been targeted by police's new authoritarian surveillance tool invading our public spaces: automated facial recognition.

Worryingly, 102 innocent members of the public were identified by the technology, although the force has yet to make an arrest using it.

The UK information commissioner, Elizabeth Denham, said that police must demonstrate that facial recognition technology is effective and that less intrusive methods are not available: "Should my concerns not be addressed, I will consider what legal action is needed to ensure the right protections are in place for the public".

Automated facial recognition (AFR) technology used by London's Metropolitan Police is designed to find persons of interest within large groups of people by comparing the biometrics of attendees caught on camera with information already stored on law enforcement databases.

The group described this as a "chilling example of function creep" and warned of the dangerous effect it could have on the rights of marginalised people.

Denham also expressed concern with both the transparency and proportionality aspects of the retention of the 19 million images in the Police National Database.

Big Brother Watch submitted freedom of information requests to every police force in the UK.

The Met Police's facial recognition system had the worst track record: only 2 per cent of its matches were accurate, while 98 per cent of the people it flagged were wrongly identified.

South Wales Police (SWP) - which has used AFR at 18 public places since it was first introduced in May 2017 - has fared only slightly better. The system led to 15 arrests, or 0.005% of the total matches.

"On a much smaller number of occasions, officers went and spoke to the individual, realised it wasn't them, and offered them the opportunity to come and see the van".

Underlying the concerns about the poor accuracy of the kit are complaints about a lack of clear oversight - an issue that has been raised by a number of activists, politicians and independent commissioners in related areas.

Further details are expected in the long-awaited biometrics strategy, which is slated to appear in June.

The Home Office has previously faced a barrage of criticism for failing to explain why it has failed to provide a framework for the use of biometric data such as facial recognition imagery in policing, more than four years after it was due to be published.

"If we move forward on this path, these systems will mistakenly identify innocent people as criminals or terrorists and will be used by unscrupulous governments to silence unwelcome voices". Images held on the database remain on the system unless a person asks for them to be removed. The rights group argues that police should delete such images, as they do fingerprints and DNA, once an individual is found to be innocent or released without charge. Despite this, the force is planning seven more deployments this year.

Civil liberties group Big Brother Watch today (15 May) published a report raising serious concerns about the accuracy of facial recognition tools employed by United Kingdom law enforcement bodies.

What does Big Brother Watch want?
