Facial recognition inaccurate in 98% of cases, according to new data

May 16

The supposedly high-tech policing tactic of facial recognition is currently being used by the Metropolitan Police, but newly released figures show that the software's matches have been inaccurate in 98% of cases.

Figures recently published by The Independent (based on data obtained under freedom of information laws) showed that out of 104 alerts generated by facial-recognition software used by the Metropolitan Police, only 2 were found to be accurate matches; the other 102 alerts, roughly 98%, were false positives. The Independent also reported that similar software used by the South Wales Police has returned more than 2,400 false positives since June 2017.

The software in use here scans video footage and identifies individual faces to match against a database of known faces (for example, wanted criminals). It reduces each face to a map of biometric identifiers, such as the length of the nose or the distance between the eyes, and can make a match in a fraction of a second.
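To illustrate the general idea, here is a minimal sketch of that matching step in Python, assuming each face has already been reduced to a numeric vector of biometric measurements. The match_face function, the watchlist data and the distance threshold are all illustrative assumptions, not the Met's actual system.

```python
import numpy as np

def match_face(probe_embedding, watchlist, threshold=0.6):
    """Return (name, distance) of the closest watchlist face, or None.

    probe_embedding: 1-D vector of biometric measurements for one face
        (e.g. normalised distances between facial landmarks).
    watchlist: dict mapping a person's name to their stored embedding.
    threshold: maximum distance still counted as a match; looser values
        produce more false positives, tighter values more missed matches.
    """
    best_name, best_distance = None, float("inf")
    for name, stored in watchlist.items():
        distance = np.linalg.norm(probe_embedding - stored)  # Euclidean distance
        if distance < best_distance:
            best_name, best_distance = name, distance
    return (best_name, best_distance) if best_distance < threshold else None

# Toy example: two "known faces" and a probe face resembling the first.
watchlist = {
    "suspect_a": np.array([0.42, 0.18, 0.77, 0.33]),
    "suspect_b": np.array([0.10, 0.95, 0.21, 0.60]),
}
probe = np.array([0.45, 0.20, 0.75, 0.30])
print(match_face(probe, watchlist))  # ('suspect_a', 0.05...)
```

The key parameter in any such sketch is the threshold: set it loosely enough to catch faces in grainy CCTV footage and it will also flag innocent passers-by, which is exactly the trade-off behind the false-positive figures above.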

The UK’s independent Biometrics Commissioner Paul Wiles said the figures showed the technology “is not yet fit for use.” He also said “In terms of governance, technical development and deployment is running ahead of legislation and these new biometrics urgently need a legislative framework, as already exists for DNA and fingerprints.”

Unfortunately, this all points to the fact that the technology currently in use in the UK is simply not ready for mainstream policing, and considerable work will be needed before criminals can be reliably identified by facial recognition software.
