October 31, 2023 at 08:37AM
UK Minister of State for Crime, Policing, and Fire Chris Philp MP has called for police forces to double their use of algorithmic-assisted facial recognition. He wants both live and retrospective facial recognition deployed more widely, arguing that searching the whole Police National Database image set will maximize the chance of a match. Concerns have been raised, however, over potential racial bias in the system.
The UK minister for policing, Chris Philp MP, has called for increased use of algorithmic-assisted facial recognition technology by police forces. He has suggested that both live and retrospective facial recognition be used more extensively, following a commitment of £17.5 million ($21.3 million) to develop a “resilient and highly accurate system” capable of searching all police-accessible image databases.
Retrospective facial recognition (RFR) uses crime scene images, such as those captured by CCTV, police cameras, or phone footage, to search police databases for potential matches. Live facial recognition (LFR), by contrast, compares real-time footage from events against a predefined watch list of known criminals or suspects.
According to Philp, advances in the algorithms now make it possible to match even blurred or partially obscured images against custody images, resulting in arrests. He believes that searching the entire Police National Database (PND) image set, rather than just local force databases, will increase the chances of finding matches. Philp encourages routine use of RFR across a wide range of crimes, citing cases of murder, sex offences, domestic burglary, assault, car theft, and shoplifting in which RFR identified suspects who would otherwise have been difficult or time-consuming to identify.
Philp also highlights the potential of LFR to deter and detect crime in public settings with large crowds. He claims that there is a College of Policing Authorised Professional Practice in place, providing a solid legal basis for LFR.
The Metropolitan Police in London recently employed LFR during an Arsenal v Tottenham game, resulting in the arrest of three individuals: one charged with breaching a football banning order, one wanted for sexual offences, and one admitting to using threatening and abusive language in violation of a court order.
Philp asserts that the National Physical Laboratory has provided assurance on the accuracy of the algorithms used in LFR, finding no significant gender or racial bias. Biometric data from non-matches is deleted immediately to address privacy concerns.
However, Dr Tony Mansfield, a principal research scientist at the National Physical Laboratory, expressed concerns about bias against Black individuals in the system used by the Metropolitan Police during a parliamentary committee meeting. He stated that the system exhibited bias against Black males and females when operated at low and easy thresholds, although he believes the Met does not use the system at these thresholds.
In 2017, privacy groups including Big Brother Watch, Liberty, and Privacy International called on the Met to cancel its plans to deploy facial recognition software at Notting Hill Carnival, Europe’s largest street festival. In 2018, Big Brother Watch found that 91 percent of the people flagged by the Met’s facial recognition system were not on any watch list. The Met maintained that additional checks and balances were in place to confirm identification after system alerts, and on that basis did not regard these as false positives.