September 6, 2019
There’s much in the press at the moment about the use of facial recognition technology by the police and other law enforcement agencies for crime prevention and national security purposes. A person’s face, or more precisely their facial features, is one type of biometric data which, when used to uniquely identify an individual, falls within the definition of ‘special categories of personal data’ under the GDPR.
Another type of biometric data is a person’s fingerprint. Although fingerprint technology is nothing new, its prevalence in consumer devices as a replacement for passwords and for payment authorisation has normalised its use in other contexts: from building access control to point-of-sale technology and even checking books out of libraries. There are two main reasons why organisations consider using biometrics: security and convenience.
Although biometrics aren’t invulnerable to spoofing (Mark Twain once said, “Fingerprints cannot lie, but liars can make fingerprints”, and research has proved him right), they’re often perceived as being more secure than proximity/swipe card entry systems, where cards can be lost, stolen or ‘skimmed’ by intruders using devices which can read the typically unencrypted numbers stored on them. Similarly, biometrics are thought to be more secure than passwords and PINs, which are often obvious (as this 2015 clip from the Jimmy Kimmel show aptly demonstrates), shared between users or written down on pieces of paper. The convenience of biometric technology lies simply in the fact that users can’t forget to bring their biometric identifiers with them to work in the morning.
As is so often the case, convenience, whether in the name of security or some other objective, involves a trade-off when it comes to privacy. In the context of biometrics, the privacy risks are undoubtedly greater: there is nothing more personal or unique than an individual’s biometric data which, if compromised, could be used by an acquirer to access multiple systems (the same can of course be said about passwords, but compromised passwords can easily be changed while your DNA can’t). And with recent news that the fingerprints of one million people, together with facial recognition data, were compromised as a result of being held in a largely unencrypted and unprotected database, the risk of biometric data being compromised is very real.
The results of a study published by IBM in January 2018 also suggest that, contrary to popular belief, users rank privacy and security more highly than convenience: while 67% of 4,000 respondents said they were comfortable using biometrics, 55% said they would not trade privacy and security for convenience, and 74% favoured other security measures such as two-factor authentication (2FA). Organisations should therefore not assume that every user will embrace the convenience or novelty of biometrics.
Given the risks, the use of biometrics is not straightforward. The GDPR requires organisations that want to use biometric data to:
- establish a lawful basis for processing (remembering that consent has to be informed and legitimate interests requires a ‘careful assessment’ of the risks to individuals);
- satisfy one additional condition for its processing under Article 9 of the GDPR or Schedule 1 of the Data Protection Act 2018; and
- implement additional safeguards and, in some cases, have an ‘appropriate policy document’.
As the processing of biometric data is likely to result in a high risk to individuals, the GDPR also requires organisations to undertake a data protection impact assessment to establish what those risks are and how they can be eliminated or mitigated. Where the risks cannot be sufficiently mitigated, the GDPR requires the ICO to be consulted before the processing begins.
There are circumstances where the use of biometrics may be considered proportionate to the security risks the technology is intended to mitigate, for example, controlling access to highly sensitive records, systems or locations such as server rooms. However, it’s clear that if an organisation wants to use biometrics for more general purposes, it needs to tread very carefully.
And on that note, the Swedish Data Protection Authority announced its first fine under the GDPR in August, in the sum of SEK 200,000 (approx. EUR 20,000), against a school which had been trialling facial recognition to monitor the attendance of its pupils. The school claimed that the pupils had consented to taking part in the trial, but the Authority ruled that such consent was invalid given the imbalance in the relationship between the school and its pupils, and that attendance monitoring could be achieved in less intrusive ways. The school also failed to undertake a data protection impact assessment or consult with the Authority.
If you would like to discuss the use of biometrics in your organisation, drop us a line.
Technology is evolving, and so are your lawyers. Find out more about our technology offering here.