Time to face facts? New research reveals risks around biometric data collection

Biometric data: misuse, use and collation

ICB CDT student Vincent Saverat undertook a POST Fellowship earlier in 2024 as part of the UKRI Policy Internship Scheme.

The Parliamentary Office of Science and Technology (POST) has published new impartial research looking into the use, misuse and collection of biometric data, co-authored by ICB CDT student Vincent Saverat, who was awarded a POST Fellowship earlier this year.

Read the POST report: Biometric data: misuse, use and collation

Biometric data allows the unique identification of a person. Facial, voice or fingerprint data is commonly used to unlock personal devices, speed up passport checks or log in to online banking.

Research indicates that people prefer biometric verification to passwords or authenticator apps when logging in to devices. As such, the ‘passwordless verification’ market is projected to be worth £46 billion by 2032.

This new paper, co-authored by Vincent Saverat, an ICB CDT student, and Dr Simon Brawley from the Parliamentary Office of Science and Technology, is based on interviews with experts and brings together findings from over 170 sources to provide a unique overview of how biometric technology is advancing and where the risks might lie.

Professor Lord Robert Winston, Vice Chair of the POST board, said: “This fascinating POST research clearly demonstrates that biometric technology is developing faster than our understanding of its implications. Many of the technologies outlined here would have been unthinkable a few years ago. If we want to protect our most valued asset - what makes us unique, our very identities - we must ensure that we keep abreast of how this data is being used and ensure that it is effectively regulated. This research will be invaluable in helping parliament play its role in the debate.”

Vincent Saverat, POST fellow, said: “Writing this paper alongside the Parliamentary Office of Science and Technology has given me the opportunity to ensure that clear and impartial evidence on this quickly developing subject is shared with the people charged with regulating it. Technological progress can happen fast, and often brings with it unexpected technical and ethical considerations. Drawing together this research will help clarify some of the issues at the core of new biometric data uses.”

Report highlights include:

  • AI has made new uses of biometric data possible. It can be used to classify individuals into different demographic categories and to infer a person’s emotional or psychological state.
  • Some systems can determine the characteristics of users, such as age, and some even claim to be able to determine sexuality, although studies relating to sexuality have been widely criticised as ‘pseudoscience’.
  • Collection of biometric data is regulated differently in the UK, EU and US.
  • Successive governments have planned to legislate further around the collection and use of biometric data. The previous government’s Data Protection and Digital Information Bill fell at the general election, and the new government has proposed a Digital Information and Smart Data Bill to ‘ensure [public] data is well protected’. The Bill has not yet been published.
  • There have been several high-profile cases of legal action involving biometric data. US firm Clearview AI, which provides a ‘face search’ function drawing on a database of more than 40 billion images, was fined €20 million for breaching EU data regulations. In 2019 HMRC was found to have collected the voice records of 5 million users without consent as part of a new voice verification system, and was subsequently required to delete them under the GDPR.
  • Facial recognition is particularly controversial due to concerns about privacy and bias. UK police forces use facial recognition software to biometrically identify suspects, for example in dense crowds. However, there is evidence to suggest that multiple facial recognition systems give false positive identifications more frequently when the individual being identified is from a Black or Asian background. False positives are also more common for women, children and older people. This is likely because the AI algorithms within these systems were trained on datasets in which certain groups were under-represented.

Many congratulations to Vincent!

Reporter

Emma Pallett
Department of Chemistry

Contact details

Tel: +44 (0)20 7594 9098
Email: e.pallett@imperial.ac.uk
