Facing The Future

Steven Porter, Mid Kent Chairman

1st September 2020

Love it or loathe it, technology is now embedded in our daily lives. It’s certainly not going away and we are all affected by technological developments. In many ways we all have a choice in how we use and engage with technology. Sometimes, however, we have no choice: technology is used on us as we go about our daily lives.

For example, we have all become accustomed to seeing CCTV cameras around public spaces, entertainment venues, shopping centres, high streets and private buildings. Many of us now have CCTV and other cameras installed at home for security purposes. We take comfort in knowing this technology is deployed and can be used by the police to find the perpetrators in the event of criminal or terrorist activity. It increases our safety and security so we generally take no exception to its use. We should also not forget that CCTV use is covered by various laws governing how it should be operated.

While we may be content to be filmed on CCTV, are we comfortable knowing that advances in technology mean we can now be identified by cameras using biometric data (Facial Recognition)? Are we still content to be captured and potentially identified by these cameras as we go about our daily lives?

An important test case has recently been concluded relating to the use of Facial Recognition Technology (FRT) by South Wales Police (SWP) [1-4].

The case was brought by a Mr Bridges (supported by Liberty, the civil rights group), who challenged the legality of South Wales Police’s use of AFR Locate (LAFR), a particular application of Facial Recognition Technology. He argued that its use was contrary to the requirements of the Human Rights Act 1998 and of data protection legislation (the Data Protection Act 1998 and its successor, the Data Protection Act 2018), and that the decision to implement or use it had not been taken in accordance with the public sector equality duty contained in the Equality Act 2010. The case was originally brought in 2019 and, on appeal in August 2020, Mr Bridges won on three of the five grounds raised.

It is perhaps worth briefly explaining how the technology works. FRT allows the automatic identification of an individual by matching two or more faces from digital images. It does this by detecting and measuring various facial features and extracting these from the image, then, in a second step, comparing them with features taken from other faces. The other faces are stored in what is referred to as a ‘watch list’: a database of facial images of people who are to be identified. These could be, for example, missing persons, persons wanted in connection with crimes, or persons banned or restricted from visiting certain locations. The ‘watch list’, and who is on it, was one of the areas of concern covered in the case referenced above.
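To make that two-step process concrete, here is a minimal sketch in Python. It is illustrative only and not drawn from any actual FRT system: the extract_features function is a naive stand-in (real systems use a trained deep neural network to produce a compact face ‘embedding’), and the names, structure and matching threshold are all assumptions made for the sake of the example.

```python
import numpy as np

# Naive stand-in for a real feature extractor. Production FRT systems use a
# deep neural network that maps a face image to a pose- and lighting-invariant
# embedding; here we just flatten and normalise the pixels so the sketch runs.
# Assumes all face crops are the same size.
def extract_features(face_image: np.ndarray) -> np.ndarray:
    v = face_image.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two feature vectors: 1.0 means identical direction."""
    return float(np.dot(a, b) / ((np.linalg.norm(a) * np.linalg.norm(b)) + 1e-9))

class WatchList:
    """A 'watch list': stored feature vectors for the people to be identified."""

    def __init__(self, threshold: float = 0.8):  # illustrative threshold
        self.entries: dict[str, np.ndarray] = {}
        self.threshold = threshold

    def add(self, name: str, face_image: np.ndarray) -> None:
        # Step 1: measure and extract features, store them against a name.
        self.entries[name] = extract_features(face_image)

    def match(self, face_image: np.ndarray) -> str | None:
        # Step 2: compare a captured face with every stored face and report
        # the best match above the threshold, or None if nobody matches.
        probe = extract_features(face_image)
        best_name, best_score = None, self.threshold
        for name, stored in self.entries.items():
            score = cosine_similarity(probe, stored)
            if score > best_score:
                best_name, best_score = name, score
        return best_name
```

In operation, a live camera feed would run face detection on each frame, crop each detected face and pass it to match(). Note that a watch-list member whose score falls below the threshold would go unreported, which is exactly the ‘false negative’ scenario raised later in this article.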

So, on the face of it (excuse the pun), should we be concerned about the use of this technology? After all, many of us now use Facial Recognition Technology to unlock our mobile phones. When passing through security at airports, facial recognition technology checks us against our passports. Social media platforms tag us in pictures and we can be identified quite easily across them. Does the law-abiding citizen have anything to worry about if, just like CCTV, FRT is actually helping the Police and Security Services keep us safe?

On appeal, SWP lost the case on three key points – a good summary of the case is provided in [5].

“In Mr Bridges’ case, the Court of Appeal gave a unanimous judgement, finding there to be three ways in which the South Wales Police’s use of LAFR technology was unlawful:

– It breaches Article 8 (the right to privacy) because it is not ‘in accordance with law’. In the Court of Appeal’s view, there are ‘fundamental deficiencies’ with the legal framework, which leaves too much discretion to individual police officers about how and where the technology is deployed.

– It breaches the Data Protection Act 2018 (DPA) because the data protection impact assessment (DPIA) conducted under s.64 DPA failed properly to grapple with the Article 8 implications of the deployment of LAFR. Specifically, the DPIA failed properly to assess the risks to the rights and freedoms of individuals, and failed to address the measures envisaged to respond to the risks arising from the deficiencies in the legal framework.

– It breaches the public sector equality duty (PSED), because the police have taken no steps to satisfy themselves that the underlying software doesn’t contain bias on the basis of race and sex. As the Court observed, there is no reason to think that the software used by the South Wales Police does contain any such bias – but the whole purpose of the PSED is ‘to ensure that a public authority does not inadvertently overlook information which it should take into account’ and so it was unlawful for the police to have failed to obtain evidence of whether the software might contain inherent bias.”

The use of FRT is the subject of much debate and raises many more questions than have been specifically addressed in the judgments in this case. For example, what happens if the technology is deployed to a location, someone on the ‘watch list’ is not identified (a false negative), and they go on to commit an act of terrorism?

In America, a number of states and cities have restricted or banned its use on civil liberties grounds [6]. The EU is actively reviewing the technology for migration and border control purposes [7]. The Met Police have also done extensive work on its use [8].

A YouGov survey of public attitudes to this technology shows that the British public are prepared to accept the use of facial recognition in some instances, where there is a clear public benefit and appropriate safeguards are in place, but they also want the government to impose restrictions on its use [9].

So what is our view in For Britain? Should we embrace the technology and see it as a good means of enforcing the law, enhancing security, and delivering strong immigration controls and border security? If a fit-for-purpose legal framework governing its use can be implemented, is this something we support?

Or do we take the view that this technology infringes our civil liberty to go about our business without being tracked and identified?

What policies might we derive from the above case?

For members reading this blog, I’ll pose these questions in our Members’ Blog so that views can be shared.

[1] https://www.judiciary.uk/wp-content/uploads/2019/09/Press-Summary-Bridges-v-Cheif-Constable-South-Wales-Police-CO-4085-2018FINAL_-1.pdf

[2] https://www.judiciary.uk/wp-content/uploads/2019/09/bridges-swp-judgment-Final03-09-19-1.pdf

[3] https://www.judiciary.uk/wp-content/uploads/2020/08/R-Bridges-v-CC-South-Wales-ors-Press-Summary-1.pdf

[4] https://www.judiciary.uk/wp-content/uploads/2020/08/R-Bridges-v-CC-South-Wales-ors-Judgment-1.pdf

[5] https://www.adalovelaceinstitute.org/facial-recognition-technology-needs-proper-regulation-court-of-appeal/

[6] https://eu.usatoday.com/story/tech/2019/11/19/police-technology-and-surveillance-politics-of-facial-recognition/4203720002/

[7] https://fra.europa.eu/sites/default/files/fra_uploads/fra-2019-facial-recognition-technology-focus-paper.pdf

[8] https://48ba3m4eh2bf2sksp43rq8kk-wpengine.netdna-ssl.com/wp-content/uploads/2019/07/London-Met-Police-Trial-of-Facial-Recognition-Tech-Report.pdf

[9] https://www.adalovelaceinstitute.org/beyond-face-value-public-attitudes-to-facial-recognition-technology/