Met Police director of intelligence defends facial recognition

The Metropolitan Police Service’s (MPS) director of intelligence has defended the force’s use of facial-recognition technology to a Parliamentary committee, as part of its inquiry into the UK’s governance of artificial intelligence (AI) technology.

The session follows reports that policing minister Chris Philp, in closed-door meetings with the biometrics commissioner of England and Wales, has been pushing for the technology to be rolled out nationally and will likely also push to integrate the tech with police body-worn video cameras.

Appearing before the Science and Technology Committee – which launched its AI governance inquiry in October 2022 – MPS director of intelligence Lindsey Chiswick said that while there is understandable “public concern” around facial recognition and AI, the force has attempted to deploy it in as “careful, proportionate and transparent” a way as possible.

Pointing to a recent study conducted by the National Physical Laboratory (NPL) – which found “no statistical significance between demographic performance” if certain settings are used in the Met’s live facial-recognition (LFR) system – Chiswick said this was commissioned by the force to better understand “levels of bias in the algorithm” and “how we can use AI in a proportionate, fair and equal way”.

She added that the force must also assess the necessity and proportionality of each individual facial-recognition deployment against the purpose for which it is being used.

“At the moment, there must be a solid use case for why we are deploying the technology…This is not a fishing expedition; we are targeting areas where there is public concern about high levels of crime – whether that is knife-enabled thefts on Oxford Street, where they operated before, or whether it is some of the gang-related violence and knife-enabled robbery going on in Camden,” she said.

“Carrying that through from why we are there in the first place, there is then the proportionality of the watchlist, following our policy as to who goes there and why.”

Asked by MP Stephen Metcalfe why not everyone is on police facial-recognition watchlists, Chiswick pointed out this kind of indiscriminate inclusion would be illegal, and reiterated the need for necessity and proportionality.

Elsewhere, she added that every bespoke watchlist is deleted after use because there is no lawful reason for the data to be retained: “Technically, we could keep the watchlist, but lawfully, we cannot.”

On whether the CCTV networks of entire UK cities or regions could be linked up to facial-recognition software, Chiswick again said while it is “technically…feasible”, she would question the proportionality of linking up all cameras into a single unified system.

However, there was no discussion of ongoing issues around the unlawful retention of custody images and other biometric material used to populate the watchlists, which were highlighted by biometrics commissioner Fraser Sampson to Parliament’s Joint Committee on Human Rights (JCHR) in February 2023.

These same concerns were raised to the Science and Technology Committee by Sampson’s predecessor, Paul Wiles, in March 2019, when he said there was “very poor understanding” of the retention period for custody images across police forces in England and Wales, despite a 2012 High Court ruling that found their retention to be unlawful.

Speaking to Computer Weekly about the Met’s previous deployments, Green London Assembly member Caroline Russell – who was elected to chair the Assembly’s police and crime committee at the start of May 2023 – said disproportionate policing practices mean people from certain demographics or backgrounds are the ones who ultimately end up populating police watchlists.

“If you think about the disproportionality in stop and search, the numbers of black and brown people, young people, who are being stopped, searched and arrested, starts to be really worrying because you get disproportionality built into your watchlists,” she said.

Policing benefits and results

In outlining the operational benefits of the technology, Chiswick told MPs that its use has already led to “a number of significant arrests”, including for conspiracy to supply class A drugs, assault on emergency workers, possession with the intent to supply class A drugs, grievous bodily harm, and being unlawfully at large having escaped from prison.

“Those are some of the examples that I have brought here today, but there is more benefit than just the number of arrests that the technology alerts police officers to carry out, there is much wider benefit. The coronation is one example of where deterrence was a benefit. You will have noticed that we publicised quite widely in advance that we were going to be there as part of that deterrence effect,” she said.

“If I recall my time up in Camden when I went to view one of the facial-recognition deployments, there was a wider benefit to the community in that area at the time. Actually, we got quite a lot of very positive reaction from shopkeepers and local people because of the impact it was having on crime in that area.”

According to the Met’s facial-recognition “deployment record document” on its website, two arrests have been made so far in 2023 across six deployments, with estimates that roughly 84,600 people’s biometric information was scanned.

Over the course of the MPS’ first six deployments of 2022, the force made eight arrests after scanning roughly 144,366 people’s biometric information, for offences including those outlined by Chiswick, as well as a failure to appear in court and an unspecified traffic offence.

Asked whether the Met can show an increase in arrests and convictions as a result of the technology, Chiswick said the tool is not simply about increasing arrest numbers: “This is a precision-based, community crime-fighting tool. To use the terrible analogy of a needle in a haystack, the technology enables us to pick out a potential match of someone who is wanted, usually for very serious crimes, and have the opportunity to go speak to that person.

“The results that I just read out to you are people who would still be at large if we had not used that technology. It is not a tool for mass arrests, it is not a tool that is going to give you huge numbers of arrests, it is a tool that is going to focus very precisely on individuals we are trying to identify.”

Despite the nature of the arrests made using facial recognition thus far, the Home Office and policing ministers have repeatedly justified using the technology on the basis it “plays a crucial role in helping the police tackle serious offences including murder, knife crime, rape, child sexual exploitation and terrorism”.

Computer Weekly has asked the Home Office for evidence to back up this claim on multiple occasions, but has never received a response. The Met, however, confirmed to Computer Weekly in January 2023 that no arrests had been made for these offences as a result of its LFR use.

Biometric oversight

The committee also questioned Chiswick on the government’s proposed Data Protection and Digital Information (DPDI) Bill, and in particular what she thought of measures to abolish the biometrics commissioner role and repeal the government’s duty to publish a surveillance camera code of practice, which governs how authorities use surveillance systems in public spaces.

While there has been debate around which existing regulators could absorb the responsibilities and functions of the biometrics commissioner – with suggestions, for example, that either the Information Commissioner’s Office (ICO) or the Investigatory Powers Commissioner’s Office (IPCO) could take on different aspects of the role – the question remains open and ongoing.

“I do not think that necessarily more oversight added on and built up is the best oversight. We run the risk of having siloed oversight – oversight for surveillance, oversight for biometrics, oversight for data – when, actually, it cuts across all of that. Currently, there is guidance out there that also crosses over and overlaps a little bit,” Chiswick told MPs.

“So rather than building additional layers of oversight, at a more superficial level, I think it would be great to have simplified oversight, but with the right questions. That is the key – having the right deep dive into how we are using that technology to ensure we are behaving in the way we should and the way we commit to in policy.”

She added: “From my point of view, fewer different bodies of surveillance and a more simplistic approach to get to the point of asking the right questions is probably helpful.”

Speaking to the same committee in the following session, Michael Birtwistle, associate director of AI, data law and policy at the Ada Lovelace Institute, said: “The claim that it [the DPDI Bill] simplifies the regulatory landscape may be true, in the sense that there will be fewer actors in it, but that does not mean that there will be more regulatory clarity for users of that technology…the removal of the surveillance camera code is one such thing.

“Our proposal on comprehensive biometrics regulation would centralise a lot of those functions within a specific regulatory function in the ICO that would have specific responsibility for…things like publishing a register of public sector use, requiring biometric technologies to meet scientifically based standards, and having a role in assessing the proportionality of their use. Having all those things happen in one place would be a simplification and would provide appropriate oversight.”

Marion Oswald, a senior research associate for safe and ethical AI at The Alan Turing Institute and associate professor in law at Northumbria University, agreed that simplification does not necessarily mean clarity: “Certainly, in the policing sector, we need much more clarity about where the responsibility really lies. We have lots of bodies with fingers in the pie, but not necessarily anyone responsible for the overall cooking of the pie.

“The regulatory structure needs to be very focused on how the police use data and the different ways AI can be deployed. Sometimes it is deployed in respect of coercive powers – stop and search and arrest – but sometimes it is deployed at the investigation stage and the intelligence stage, which brings on all sorts of different considerations. A regulator needs to understand that and needs to be able to set rules around those different processes and stages.”

Both Parliament and civil society have repeatedly called for new legal frameworks to govern law enforcement’s use of biometrics – including a House of Lords inquiry into police use of advanced algorithmic technologies; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; and the House of Commons Science and Technology Committee itself, which called for a moratorium on live facial recognition as far back as July 2019.

In February 2023, in his first annual report, Sampson also called for clear, comprehensive and coherent frameworks to regulate police use of AI and biometrics in the UK.

However, the government has maintained that there is “already a comprehensive framework” in place. 
