Is this the perfect use case for police facial recognition?

By Professor Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner

As another report on facial recognition technology is published, picture this: a man launches a terrorist attack on a crowded subway train. Detonating a smoke bomb while the carriage is between stations, he shoots ten commuters before fleeing. Police confirm the suspect’s identity after finding his rental van parked nearby, along with a semi-automatic handgun, more grenades and a hatchet. He has fired 32 rounds into a packed train and detonated multiple devices; he is armed, has several changes of clothing, and he is still in the subway system.

Following this attack on the New York subway in April 2022, I wrote an article asking whether these facts perhaps offered the perfect case for police use of live facial recognition (LFR).

The technology was not used on that occasion but, as the UK’s Biometrics and Surveillance Camera Commissioner, I thought the scenario offered a compelling case for LFR. A terrorist attack within a confined city transport system equipped with an extensive camera network, with reliable images of the forensically identified suspect – if the police had been able to feed Frank R James’s picture into the combined surveillance systems to find him, on what basis could they responsibly not have done so? James was eventually caught, but only after 36 hours and a huge multi-agency search operation. He subsequently received 10 life sentences.

Almost three years on, the technology has evolved, the debates and reports continue, and another exemplar has arisen. This time, LFR was both available and successfully deployed.

The location was Denmark Hill, London, and the facts were very different. A convicted paedophile, David Cheneler, had been placed on a police watchlist because he was under a court order preventing him from having contact with children under 14 years old. In January, a police LFR camera van picked out a man walking in a street with a 6-year-old girl and matched his face with the image it held of Cheneler. Having befriended the girl and her mother – who was unaware of his previous offending or the court order – Cheneler had collected the girl from school, and was in possession of a knife when arrested. The trial judge concluded there had been a sexual motivation in his actions and sentenced him to two years’ imprisonment.

There are three important features for the police use of LFR in this case. First, as the judge noted, “fortunately the technology available prevented physical contact going further”. Availability matters here, and not just in the sense of the equipment being accessible; it has a specific legal element too. Where the technological means to prevent inhumane or degrading treatment are reasonably available to the police, the law in England and Wales may not just permit the use of remote biometric technology; it may even require it. I am unaware of anyone relying on this human rights argument yet, and we cannot know whether the circumstances here would have met that threshold. One thing we can be sure of, however, is that the availability of AI-driven biometrics will bring a tipping point at which the police are challenged on their decisions not to use the available technology as often as they have to defend their use of it today.

Second, the person was on the watchlist because he was subject to a court order. This was not the public under ‘general surveillance’: a court had been satisfied on the evidence presented that an order was necessary to protect the public from sexual harm from him. He breached that order by insinuating himself into the life of a 6-year-old girl and was found alone with her. He was accurately matched with the watchlist image. The third feature is that the technology did its job.

It would be easy to celebrate this as a case of ‘thank goodness nothing happened’, but that would underestimate its significance and miss the legal areas where FRT will be challenged. Blanket police use of AI-enabled biometrics would probably stop something happening somewhere. But if they are to use the technology lawfully, accountably and with the support of their communities, the police must be able to demonstrate that every deployment is necessary, justified and proportionate, even when using it for slam-dunk cases like capturing a suspect during a terrorist attack or rescuing young children from the risk of sexual harm. The fortuitous avoidance of remote consequences is all well and good, but building trust and confidence, in the police and the technology, will need more than a collective sigh of relief.

In the article on the Brooklyn subway case I offered the view that facial recognition technology would be “unsurpassable” in tackling sex offending, illegal immigration and finding individuals in breach of prison licence conditions. These use cases cover some of the most challenging policy areas, and not only for the UK. Preventing harm by those who present significant risk is at the heart of the UK government’s planned justice reforms. What role facial recognition technology will have in those reforms remains to be seen but it undoubtedly has something new to offer.

In the meantime, while the UK remains in the vanguard of facial recognition in policing, internationally the picture is contradictory. French Justice Minister Gérald Darmanin said recently, “If you want a secure society, you need facial recognition”, but the current impasse in the United States suggests the technology will not become de rigueur in law enforcement for a while yet.

As police use cases go, Denmark Hill is near perfect: not so much low hanging fruit as a root vegetable. But not all cases will be so clear cut and we still have some way to go to ensure all police uses will balance what’s possible, what’s permissible and what’s acceptable.

Following James’s arrest for the Brooklyn attack, the New York City police commissioner said, “We were able to shrink his world quickly, so he had nowhere left to turn.” Facial recognition technology will empower the police to shrink the worlds of suspects at a scale, pace and cost unimaginable a few years ago; how far it should be allowed to shrink everyone else’s world in doing so remains the surveillance question of our time.

About the author

Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner, is Professor of Governance and National Security at CENTRIC (Centre for Excellence in Terrorism, Resilience, Intelligence & Organised Crime Research) and a non-executive director at Facewatch.

https://www.biometricupdate.com/202506/is-this-the-perfect-use-case-for-police-facial-recognition