Research in the United Kingdom shows that in more than four out of five cases, the police's facial recognition system flags an innocent person. The British police, however, state that the system makes an error in only 0.1 percent of cases.
The research was carried out by scientists at the University of Essex, who analysed data from six trials in which the police in London tested the facial recognition software. In 42 cases where the system flagged the face of a potential suspect, only eight turned out to be correct matches. In other words, the system was wrong in 81 percent of cases.
However, the British police use a different metric to measure success: they compare the number of incorrect matches to the total number of faces processed by the system, including all the faces for which no match was found and no alarm was raised. Calculated this way, the error rate is only 0.1 percent.
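The difference between the two figures comes down to the choice of denominator. A minimal sketch of both calculations, using the figures from the study (42 alerts, 8 correct); note that the total number of faces processed is not given in the article, so the figure of 34,000 below is a hypothetical value chosen only to illustrate how the police metric can land near 0.1 percent:

```python
alerts = 42                          # faces flagged as potential suspects
correct = 8                          # alerts that were true matches
false_positives = alerts - correct   # 34 innocent people flagged

# Researchers' metric: share of alerts that were wrong
researcher_error = false_positives / alerts
print(f"Researchers' error rate: {researcher_error:.0%}")  # 81%

# Police metric: wrong alerts relative to ALL faces processed,
# including every face that triggered no alarm at all.
# 34,000 is a hypothetical total, not a figure from the article.
faces_processed = 34_000
police_error = false_positives / faces_processed
print(f"Police error rate: {police_error:.1%}")  # 0.1%
```

Both numbers describe the same 34 wrong alerts; the police figure shrinks because the denominator includes the far larger pool of faces the system scanned without raising an alarm.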
In a response, a spokesman for the British police said he was unhappy with how the study, which has been seen by Sky News but has not been released publicly, was reported. The police call the research findings ‘negative and unbalanced’, while the authors of the study say they have serious objections to how the system currently operates.