Business Daily Media

Facial recognition: why we shouldn't ban the police from using it altogether

  • Written by Asress Adimi Gikay, Senior Lecturer in AI, Disruptive Innovation and Law, Brunel University London

The UK police are being accused of breaking ethical standards by using live facial recognition technology to help fight crime. A recent report[1] by the University of Cambridge into trials of the technology by forces in London and south Wales was particularly concerned about the “lack of robust redress” for anyone suffering harm. It spoke of the need to “protect human rights and improve accountability” before facial recognition is used more widely.

The Cambridge team wants a broad ban on police using the technology, and they are not alone. UK civil liberties group Big Brother Watch has been running a “stop facial recognition[2]” campaign as the government mulls how to regulate AI technologies[3]. Meanwhile, 12 NGOs recently called on[4] EU legislators to completely ban it, along with various other forms of biometric identification, in their upcoming AI Act[5].

Simply banning this technology would be a mistake, however. In my view, there’s a good case for a more measured approach.

Growing police use

The police forces in London and south Wales appear to be the only two in the UK currently using live facial recognition, which uses artificial intelligence software[6] to compare an individual’s digital facial image with an existing facial image and estimate their similarity. Greater Manchester Police trialled it but were forced to pause[7] by the surveillance camera commissioner[8] in 2018 for not obtaining the necessary approvals.
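At its core, this comparison is a similarity check between numerical representations ("embeddings") of two faces. The sketch below is purely illustrative and does not reflect the actual software used by any police force; the toy vectors and the match threshold are invented for the example, and real systems use embeddings with hundreds of dimensions produced by trained neural networks.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(probe, watchlist_entry, threshold=0.6):
    """Flag a possible match when similarity exceeds a tunable threshold.

    The threshold trades false positives against false negatives:
    raising it reduces wrongful flags but risks missing genuine matches.
    """
    return cosine_similarity(probe, watchlist_entry) >= threshold

# Toy three-dimensional "embeddings" for illustration only.
probe = [0.9, 0.1, 0.2]   # face captured by the live camera
entry = [0.85, 0.15, 0.25]  # face stored on a watchlist
print(is_match(probe, entry))  # similar vectors -> True
```

The choice of threshold is where the accuracy debates discussed below play out: because the system only ever estimates similarity, every deployment implicitly decides how many false matches it will tolerate.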

In 2020 the Court of Appeal ruled against[9] South Wales Police’s use of the technology, concluding that the force’s legal framework for deployment effectively gave officers unlimited discretion. It made no difference to the court that the police had notified the public (known as overt operational deployment).

Despite this ruling, facial recognition can still broadly be used by police, although numerous other forces[10] have said they are not[11] doing so at present.

[Image: a woman on her phone while numerous people behind her are scanned by facial recognition technology.] Any UK police force can use facial recognition under the current legal framework. Trismegist San[12]

The London Metropolitan Police increasingly use facial recognition to locate missing persons, suspects, witnesses[13] and victims. They have scanned individuals’ faces in city squares and at public events, using a facial recognition camera[14] typically placed on a police vehicle or street pole. The public are alerted[15] to the deployment through notices as they enter the recognition zone – unless that compromises policing tactics or deployment is urgent.

Between February 2020 and July 2022, the Met deployed the technology in eight locations including Piccadilly Circus[16]. They are estimated[17] to have viewed more than 150,000 faces, leading to nine arrests but also eight occasions on which they targeted the wrong person.

The pros and cons

Facial recognition has evolved in recent years, for instance to work in real time, but inaccuracies and errors remain. In New Jersey, 228 wrongful arrests[18] were reportedly made using (non-real time) facial recognition between January 2019 and April 2021. One black American[19] spent 11 days in jail after being wrongly identified. False identifications can also lead to everything from missed flights to distressing police interrogations.

Specific groups are disproportionately affected. A 2019 US study[20] found that women are two to five times more likely to be falsely identified, while the risks are ten to 100 times greater for black and Asian faces than white ones. Given that police already disproportionately stop and search[21] ethnic minorities, this shortcoming in the technology could even be used to sustain such practices.

[Image: a crowd in London protesting about police stop and search.] Facial recognition is not necessarily part of the solution. BradleyStearn[22]

Another risk is that police covertly install facial recognition cameras permanently. This could help the state to crack down on public protests, for example. There is already a pending legal challenge against Russia[23] before the European Court of Human Rights over such practices, and fear of state surveillance is one reason why many want this technology banned.

Nonetheless, facial recognition has its benefits. It can help[24] police to find serious criminals, including terrorists, not to mention missing children[25] and people at risk of harming themselves or others.

Like it or not, we also live under colossal corporate surveillance capitalism already. The UK and US[26] are among the countries with the most CCTV cameras installed in the world. London residents are filmed 300 times[27] a day on average, and police can usually use the data without a search warrant. As if that wasn’t bad enough, big tech companies know almost everything[28] personal about us. Worrying about live facial recognition is inconsistent with our tolerance of all this surveillance.

A better approach

Instead of an outright ban, even of covert facial recognition, I’m in favour of a statutory law to clarify when this technology can be deployed. For one thing, police in the UK can currently use it to track people on their watchlists, but this can include even those charged with minor crimes. There are also no uniform criteria for deciding who can be listed.

Under the EU’s proposed law[29], facial recognition could only be deployed against those suspected of crimes carrying a maximum sentence of at least three years. That would appear to be a reasonable cut-off.

Secondly, a court or similar independent body should always have to authorise deployment, including assessing whether it would be proportionate to the police objective in question. In the Met, authorisation currently has to come from a police officer ranked superintendent or higher[30], and they do[31] have to make a call[32] on proportionality – but this should not be a police decision.

We also need clear, auditable ethical standards for what happens during and after the technology is deployed. Images of wrongly identified people should be deleted immediately, for instance. Unfortunately, Met policy on this is unclear at present. The Met is trying to use the technology responsibly in other respects, but this is not enough in itself.

Last but not least, the potential for discrimination[33] should be tackled by legally requiring developers to train the AI on a diverse enough range of communities to meet a minimum threshold. This sort of framework should allow society to enjoy the benefits of live facial recognition without the harms. Simply banning something that requires a delicate balancing of competing interests is the wrong move entirely.

References

  1. ^ A recent report (www.mctd.ac.uk)
  2. ^ stop facial recognition (bigbrotherwatch.org.uk)
  3. ^ regulate AI technologies (www.gov.uk)
  4. ^ recently called on (edri.org)
  5. ^ AI Act (eur-lex.europa.eu)
  6. ^ artificial intelligence software (ico.org.uk)
  7. ^ forced to pause (www.express.co.uk)
  8. ^ surveillance camera commissioner (www.gov.uk)
  9. ^ ruled against (www.judiciary.uk)
  10. ^ other forces (www.psni.police.uk)
  11. ^ they are not (www.gmp.police.uk)
  12. ^ Trismegist San (www.shutterstock.com)
  13. ^ witnesses (www.met.police.uk)
  14. ^ facial recognition camera (www.judiciary.uk)
  15. ^ public are alerted (www.met.police.uk)
  16. ^ Piccadilly Circus (www.met.police.uk)
  17. ^ are estimated (www.met.police.uk)
  18. ^ 228 wrongful arrests (incidentdatabase.ai)
  19. ^ black American (edition.cnn.com)
  20. ^ A 2019 US study (nvlpubs.nist.gov)
  21. ^ stop and search (www.theguardian.com)
  22. ^ BradleyStearn (www.shutterstock.com)
  23. ^ legal challenge against Russia (www.hrw.org)
  24. ^ It can help (www.securityindustry.org)
  25. ^ missing children (www.reuters.com)
  26. ^ UK and US (papltd.co.uk)
  27. ^ 300 times (www.theguardian.com)
  28. ^ know almost everything (guardian.ng)
  29. ^ proposed law (eur-lex.europa.eu)
  30. ^ superintendent or higher (www.met.police.uk)
  31. ^ they do (www.met.police.uk)
  32. ^ make a call (www.met.police.uk)
  33. ^ potential for discrimination (news.stanford.edu)

Read more https://theconversation.com/facial-recognition-why-we-shouldnt-ban-the-police-from-using-it-altogether-193895

