August 11, 2021

ICO offers insight on its policy around the use of live facial recognition in the UK

On 18 June 2021, the UK Information Commissioner published a Commissioner's Opinion (the "Opinion") on the use of live facial recognition ("LFR") technology in public spaces. Public spaces are defined broadly and include any non-residential space. The Opinion sets out that there is "a high bar" for businesses to meet. Notably, the six organisations that the Information Commissioner's Office ("ICO") has investigated to date over the use of LFR all failed to fully justify that use and either stopped using LFR or chose not to proceed with it.

The Opinion also noted that there is generally weak support from the public for the use of LFR by businesses in public spaces. In a blog post about the Opinion, the Information Commissioner said that she is "deeply concerned about the potential for live facial recognition (LFR) technology to be used inappropriately, excessively or even recklessly".

Businesses that wish to use LFR technology in public spaces are advised to consider the central data protection principles of lawfulness, fairness and transparency, and to undertake a robust evaluation of the necessity and proportionality of the technology. In summary, businesses should "carefully evaluate their plans with a rigorous level of scrutiny".

While the Opinion is not binding on businesses, the ICO have said that they will take it into account in any enforcement action relating to LFR.

This Opinion builds on previous guidance published by the ICO on the use of LFR by law enforcement in public places.

What is LFR?

Facial recognition technology typically starts with a captured image of a person; software then automatically extracts biometric data and produces a biometric template of the individual. The ICO classes this biometric data as sensitive. The software then compares the degree of similarity between templates to verify someone's identity and/or to infer characteristics such as the person's age, sex, gender or ethnicity. This is typically a one-to-one process, for example an individual knowingly verifying their identity in a banking app. These one-to-one verifications are not the focus of the Opinion.

By contrast, LFR is often deployed in a similar way to CCTV, covering a large area, and individuals may not be aware of its use. LFR captures biometric information indiscriminately and automatically in real time, which can generate a large volume of data depending on the location.
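
To make the distinction concrete, the sketch below contrasts, in Python, a one-to-one verification with the one-to-many matching that LFR surveillance involves. It is a minimal illustration under stated assumptions: the biometric templates are modelled as numeric embedding vectors, and the similarity threshold is an invented placeholder, not a value from the Opinion or from any real system.

```python
import numpy as np

# Illustrative threshold only: real systems tune and justify this value;
# it is not taken from the Opinion or any vendor's documentation.
SIMILARITY_THRESHOLD = 0.80

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Degree of similarity between two biometric templates,
    modelled here as the cosine similarity of embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(live: np.ndarray, enrolled: np.ndarray) -> bool:
    """One-to-one verification, e.g. a banking app: does the live
    capture match the single template the user knowingly enrolled?"""
    return similarity(live, enrolled) >= SIMILARITY_THRESHOLD

def identify(live: np.ndarray, watchlist: dict[str, np.ndarray]) -> str | None:
    """One-to-many identification: the LFR surveillance pattern the
    Opinion targets. One live capture is compared against every
    watchlist entry; any hit is only a candidate match."""
    if not watchlist:
        return None
    best = max(watchlist, key=lambda k: similarity(live, watchlist[k]))
    return best if similarity(live, watchlist[best]) >= SIMILARITY_THRESHOLD else None
```

In the one-to-many case, a single live capture is tested against every watchlist entry, which is why LFR can process large numbers of people who never chose to interact with the system.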

Which businesses does the Opinion apply to?

The Opinion is very broad in scope, in particular because of the wide definition of public spaces used. Any business using LFR for surveillance may need to consider this guidance. Notable exclusions are the use of LFR by law enforcement agencies and one-to-one processes in which the subject is an active participant (e.g. identity verification).

The ICO has considered 14 examples of LFR being used, or proposed for use, and noted that businesses often wish to use LFR for surveillance, to prevent crime or other unwanted behaviour by matching the biometric template created against a "watchlist", which may be provided by a third party. This may occur in retail, museums, conference centres, other leisure venues and transport settings, as well as in other public spaces, which the Opinion defines as "any physical space outside a domestic setting, whether publicly or privately owned". Of these 14 examples, 11 involved some form of surveillance, typically in high-footfall areas capturing significant numbers of people. For example, a shopping centre might use LFR to identify a known shoplifter so that they can be removed from the premises.

Although surveillance is the most obvious use of LFR contemplated by the ICO, the Opinion also recognises marketing and advertising purposes such as estimating footfall, measuring engagement, targeting adverts and providing interactive experiences.

The Opinion identifies several key data protection issues arising from the use of LFR, namely:

  1. the governance of LFR systems, including why and how they are used;
  2. the automatic collection of biometric data at speed and scale without clear justification, including of the necessity and proportionality of the processing;
  3. a lack of choice and control for individuals;
  4. transparency and data subjects' rights;
  5. the effectiveness and the statistical accuracy of LFR systems;
  6. the potential for bias and discrimination;
  7. the governance of watchlists and escalation processes;
  8. the processing of children's and vulnerable adults' data; and
  9. the potential for wider, unanticipated impacts for individuals and their communities.

How can businesses use LFR?

Firstly, businesses must comply with the requirements of the UK GDPR and the Data Protection Act 2018 for special categories of personal data, including the data protection principles set out in Article 5 of the UK GDPR, among which the ICO have highlighted lawfulness, fairness and transparency. In practice, this includes evaluating the necessity and proportionality of the processing, eliminating bias in the LFR system and considering suitable alternatives such as CCTV. There are no specific provisions on the use of LFR by businesses as there are for its use in a law enforcement context.

Under Article 5(1) of the UK GDPR, a business using LFR must identify a specified, explicit and legitimate purpose for the processing, and this purpose must be sufficiently important to justify the use of LFR.

Secondly, under Article 6 of the UK GDPR, businesses must identify the lawful basis they are relying on for the use of LFR. Notably, the ICO found several examples of businesses relying on the consent of data subjects as the lawful basis. However, the ICO note that this is challenging in a surveillance context: consent requires a genuine choice, and an individual merely choosing to enter premises would not amount to valid consent under the UK GDPR.

The ICO also note several examples of businesses seeking to rely on legitimate interests (where processing is "necessary for the purposes of the legitimate interests pursued by the controller") as the lawful basis for the use of LFR. To do so, businesses must meet the requirements of Article 6(1)(f) of the UK GDPR, namely:

  1. The Purpose test – is there a legitimate interest behind the processing?
  2. The Necessity test – is the processing necessary for that purpose?
  3. The Balancing test – is the legitimate interest overridden by the individual's interests, rights or freedoms?

Businesses should consider performing a documented legitimate interests assessment covering these three tests, so that they can demonstrate that legitimate interests is a suitable lawful basis for the processing, again noting the high bar for the use of LFR.

Because biometric data is special category data, businesses must also comply with Article 9 of the UK GDPR, which sets out further conditions for the processing of special category data. In particular, the ICO note that businesses commonly rely on Article 9(2)(g), which requires a "substantial public interest"; the Data Protection Act 2018 sets out 23 substantial public interest conditions, including "preventing or detecting unlawful acts". Businesses relying on this condition must have an appropriate policy document in place.

Businesses must also be able to demonstrate their compliance with these rules through documentation. In particular, the ICO expect a data protection impact assessment to be undertaken before LFR is used, in compliance with Article 35(1) of the UK GDPR, assessing the risks and impacts of the processing on the rights and freedoms of individuals, noting the high bar for LFR to be lawful. A business must consult the ICO before using LFR if its impact assessment indicates that the use would result in a high risk that it cannot mitigate.

Additionally, given the fast pace at which LFR is developing, businesses should not rely solely on a third party provider's assurances that its technology complies with data protection law. In particular, for the use of LFR to be fair, the Opinion recommends that businesses satisfy themselves that the system is technically effective and statistically accurate. The Opinion suggests businesses may do this during the design and procurement processes, before the LFR is implemented, by engaging with vendors to understand the statistical accuracy of the system. However, the ICO note that a system does not need to be 100% statistically accurate, so long as businesses treat its outcomes as predictions or estimates rather than facts.
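
As a rough illustration of what treating outcomes as predictions rather than facts might look like operationally, the sketch below routes every candidate match to a human decision instead of triggering automatic action. The threshold value, the data structure and the routing rules are assumptions invented for illustration; they are not drawn from the Opinion.

```python
from dataclasses import dataclass

# Illustrative value only: a real deployment would tune and justify any
# threshold, and document that justification in its impact assessment.
REVIEW_THRESHOLD = 0.80

@dataclass
class CandidateMatch:
    watchlist_id: str
    score: float  # a statistical similarity estimate, never a fact

def route_match(match: CandidateMatch) -> str:
    """Treat the system's output as a prediction: weak estimates are
    discarded, and stronger ones are escalated to a trained human
    reviewer rather than triggering any automatic adverse action."""
    if match.score < REVIEW_THRESHOLD:
        return "discard"            # no action taken on a weak estimate
    return "escalate_to_human"      # a person, not the system, decides
```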

Finally, data subjects must be clearly informed about the use of LFR. The ICO suggest that this should include prominent signs explaining that LFR is in use, its purpose, that biometric data is being processed, and how data subjects can obtain further information or exercise their rights. The ICO also recommend additional steps such as leaflets, staff training and online information sources.

The European Union perspective

In contrast with the ICO's approach, the European Data Protection Board ("EDPB") and the European Data Protection Supervisor ("EDPS") have jointly called for an outright ban on LFR technology given the potential for unfair discrimination.

In the context of the European Commission's proposal for the regulation of artificial intelligence ("AI"), the EDPB and EDPS note that such regulation would not displace the requirements of the GDPR or other data protection legislation, and that the risk framework of any AI regulation should be aligned with the existing EU data protection framework. On that basis, they argue that AI for the automated recognition of human features in publicly accessible spaces, such as recognition of faces, should be banned, as should any systems categorising individuals by ethnicity, gender, or political or sexual orientation. For more information on the proposed regulation of AI, please see our client alert.

If this ban were to be implemented, it would be one of the first divergences in data protection legislation between the UK and EU following Brexit.

Conclusion

In summary, there is a very high bar for the use of LFR to be compliant with data protection legislation in the UK. The use of LFR in public spaces will only be lawful in very limited circumstances.

Before its implementation, businesses seeking to use LFR in public spaces should:

  • Identify a specified, explicit and legitimate purpose for using LFR;
  • Identify a valid lawful basis for the processing;
  • Identify a condition for the processing of special category data;
  • Limit the LFR to what is necessary and effective to achieve the specified purpose;
  • Consider whether alternative measures to LFR such as CCTV would be sufficient;
  • Ensure the use of LFR is proportionate to a sufficiently important purpose for the processing;
  • Ensure the LFR system is technically effective and sufficiently statistically accurate;
  • Address any risk of bias or discrimination in the LFR system;
  • Be transparent and provide information on the use of LFR to the affected individuals;
  • Undertake a data protection impact assessment; and
  • Comply with all other data protection principles.

Businesses should consider whether the high bar for the legality of LFR and its extensive documentation requirements make alternative measures, such as standard CCTV, comparatively more attractive.
