Artificial Intelligence – Face-off

Facial recognition has to be nearly perfect.

  • The Metropolitan Police (the Met) is struggling with a report, commissioned by Scotland Yard, which strongly indicates that its facial recognition technology is nowhere near good enough to be acceptable as a crime-fighting tool.
  • The Met began trialling facial recognition technology in 2016 and has since conducted 10 trials at busy London locations such as Leicester Square and the Notting Hill Carnival.
  • At the same time, Scotland Yard commissioned a study by the University of Essex (see here) to assess the performance of the technology during these trials.
  • The results are not encouraging and, following their publication, the Met is attempting to refute the findings of its own study.
  • The main findings are as follows:
    • First, false positives: The system made 42 identifications of suspects during the study, but only eight were verified as correct, giving an error rate of 81%.
    • Four of the 42 identifications could not be verified as the persons in question were lost in the crowd before the police could reach them.
    • Second, legality: A trial system like this requires opt-in consent from the public, which was deemed to have been given through the use of signage in the areas where the technology was in use.
    • The study also made a legal assessment of this consent and found it to be on shaky ground, with a good probability of a successful legal challenge in court.
  • Despite commissioning the study, the Met has turned its back on the findings, choosing instead to measure the error rate as a percentage of all the faces that were scanned.
  • On this basis, the Met claims that the error rate is just 0.1%.
  • This is a deeply flawed position as:
    • First, false accusations: Identifying innocent people as crime suspects is clearly unacceptable and is likely to lead to public opinion swinging heavily against the use of this technology.
    • Consequently, this is the key measure of how viable this technology is and I would also argue that the 0.1% error rate cited above may not be accurate (see below).
    • Second, no verification: The Met did not verify every face that it scanned, and hence it is assuming that everyone who was not flagged as a suspect was correctly classified as a non-match.
    • In order for this measure to be accurate, the Met would need to verify the identity of all faces scanned, not just those thrown up as suspects.
    • This was clearly not the case and so the accuracy of the system at correctly identifying faces is completely unknown.
  • Consequently, I think that the Met would be better off working out how to make the technology work better rather than twisting the findings of its own study to save face.
  • To become an effective tool to combat crime, facial recognition has to be virtually flawless in its ability to identify suspects and the database upon which it is based must also be accurately updated in real time.
  • This combined with very tight control on how it is used is how facial recognition can become an acceptable technology to the general public.
  • This is yet another example of how artificial intelligence is simply not yet good enough to be meaningfully applied in real-world scenarios.
  • RFM research (see here) has identified that AI based on deep learning works best when the task at hand is both finite and stable.
  • Facial recognition is a task where the data set is neither stable nor finite, given the varying conditions under which the data may be collected and the ways in which faces themselves change over time.
  • This is yet another example (like autonomous driving) of how the capabilities of AI are being overhyped, which must eventually lead to a reset of expectations.
  • The net result will be disappointment, falling investment and lower valuations.
  • Winter is still coming.
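The gap between the study's 81% figure and the Met's 0.1% figure comes down to the choice of denominator, and can be sketched in a few lines of Python. The alert counts (42 flagged, 8 verified correct, 4 unverifiable) come from the article; the total number of faces scanned was not disclosed, so the figure below is a hypothetical one chosen purely to illustrate how a per-scan denominator produces a number in the region of 0.1%.

```python
# Two competing error-rate calculations for the same trial results.
alerts = 42              # faces the system flagged as suspects
verified_correct = 8     # alerts confirmed as genuine matches
unverifiable = 4         # people lost in the crowd before verification

# Study's measure: share of alerts that were NOT verified as correct.
per_alert_error = (alerts - verified_correct) / alerts
print(f"Per-alert error rate: {per_alert_error:.0%}")  # ~81%

# The Met's measure: false positives as a share of ALL faces scanned.
# Counting only the disproven alerts (42 - 8 - 4 = 30) as false positives.
faces_scanned = 30_000   # hypothetical figure, not from the article
false_positives = alerts - verified_correct - unverifiable
per_scan_error = false_positives / faces_scanned
print(f"Per-scan error rate: {per_scan_error:.1%}")    # 0.1%
```

The sketch also makes the article's second objection concrete: the per-scan figure is only meaningful if every one of the unflagged faces was verified as a true non-match, which the Met did not do.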

RICHARD WINDSOR

Richard is the founder and owner of the research company Radio Free Mobile. He has 16 years of experience in sell-side equity research. During his 11-year tenure at Nomura Securities, he focused on equity coverage of the Global Technology sector.