The Government-backed Information Commissioner’s Office (ICO) is warning against the use of biometric technologies to conduct “emotion analysis” (EA).
The ICO has described the practice as “immature” and “pseudo-scientific”.
The body is unconvinced that the AI algorithms needed to power EA can detect emotions accurately, and warns they could lead to bias and discrimination.
EA relies on biometric data drawn from gaze tracking, sentiment analysis, facial movements, gait analysis and facial expressions.
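To make concrete what such a pipeline involves, below is a deliberately naive sketch that fuses scores from the modalities listed above into a single emotion label. Every feature name, weight and threshold is a hypothetical assumption invented for illustration, not any real system's method; the arbitrariness of the numbers mirrors the fragility the ICO is warning about.

```python
# Illustrative sketch only: a toy "emotion analysis" scorer combining the
# kinds of biometric signals named in the article. All feature names,
# weights and labels are hypothetical assumptions, not a real EA system.

def classify_emotion(features: dict) -> str:
    """Map a bundle of biometric-derived features to a coarse label."""
    # One hypothetical input per modality mentioned above.
    gaze = features.get("gaze_fixation_ratio", 0.5)   # gaze tracking
    smile = features.get("smile_intensity", 0.0)      # facial expressions
    brow = features.get("brow_furrow", 0.0)           # facial movements
    gait = features.get("gait_irregularity", 0.0)     # gait analysis

    # Naive hand-picked weights; changing them changes the "emotion",
    # which illustrates why the ICO calls such analysis fragile.
    scores = {
        "happy": 0.7 * smile + 0.3 * gaze,
        "stressed": 0.5 * brow + 0.5 * gait,
        "neutral": 0.4,
    }
    return max(scores, key=scores.get)

print(classify_emotion({"smile_intensity": 0.9, "gaze_fixation_ratio": 0.6}))
# -> "happy" on these made-up weights
```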
Deputy commissioner Stephen Bonner said: “Developments in the biometrics and emotion AI market are immature. They may not work yet or, indeed, ever.
“While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination.
“The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science. As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area.”
The gathering of biometric data is particularly fraught with concerns because the information is unique to the individual.
BIOMETRIC TECH NEEDS NEW LAW SAYS REPORT
Observers have long voiced worries that the technology is developed, and increasingly used, by private companies, yet effective legislation to control it is not keeping pace.
Earlier this summer, an independent legal review by Matthew Ryder KC found that legislation covering biometric technologies is not fit for purpose.
The Ryder Review, published in June, reflected the view that the law has not kept up with the rapid technological advancements or how the technology can be used. The KC said that the technology must not be allowed to “proliferate” without sufficient legal constraints.
Ryder said: “The current legal regime is fragmented, confused and failing to keep pace with technological advances.
“We urgently need an ambitious new legislative framework specific to biometrics.
“We must not allow the use of biometric data to proliferate under inadequate laws and insufficient regulation.”
The biometrics commissioner broadly welcomed the Ryder Review.
The Ada Lovelace Institute, which commissioned the review, cited as examples schools using facial-recognition technology to verify students’ identities when they pay for lunch, and supermarkets using it to mark out dangerous or criminal individuals.
The report suggests better laws and regulation would subject such uses to much greater scrutiny.
An institute spokesman told the BBC: “We can think of this as regulatory ‘whack-a-mole’, which we are arguing is inadequate.”
The institute wants comprehensive laws for biometric technology; oversight by a regulatory body; clear standards on accuracy, reliability, validity and proportionality; and a moratorium on mass identification or classification in the public sector until new laws are passed.
Police forces including the Metropolitan Police and South Wales Police use live facial recognition (LFR), where a camera system matches faces against a watch-list of offenders.
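To show what that matching step amounts to, here is a minimal sketch of comparing a face embedding from a camera frame against a watch-list of stored embeddings. The embedding model itself is out of scope; the vectors, names and similarity threshold below are illustrative assumptions, not any force’s actual system.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Standard cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match_against_watchlist(probe, watchlist, threshold=0.8):
    """Return the best watch-list hit above the threshold, or None."""
    best_name, best_score = None, threshold
    for name, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy three-dimensional "embeddings" purely for demonstration; real
# face embeddings have hundreds of dimensions.
watchlist = {"offender_42": [0.9, 0.1, 0.4], "offender_7": [0.2, 0.8, 0.5]}
print(match_against_watchlist([0.88, 0.12, 0.41], watchlist))  # offender_42
```

The threshold is the critical design choice: set it too low and innocent passers-by are flagged; set it too high and genuine matches are missed.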
South Wales Police’s use of the technology was later successfully challenged in court. In light of that, the review says there must be a legally binding police code of practice for LFR, and that all other public use should be suspended until there is one covering the private sector.
Laws which influence how biometric data can be used include:
Human Rights Act 1998
UK General Data Protection Regulation
Data Protection Act 2018
Police and Criminal Evidence Act 1984
Protection of Freedoms Act 2012
Terrorism Act 2000
Investigatory Powers Act 2016
Equality Act 2010
Biometrics have been defined as a range of methods whereby body measurements and calculations related to human characteristics are captured by sophisticated technology.
Biometric authentication (or realistic authentication) is used in computer science as a form of identification and access control. It is also used to identify individuals in groups that are under surveillance.
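The two uses mentioned above correspond to two distinct matching modes, sketched below: verification (a 1:1 check, typical of access control) and identification (a 1:N search, typical of surveillance). Templates are abstracted to plain feature vectors, and the distance metric and threshold are illustrative assumptions, not a standard implementation.

```python
import math

def verify(claimed_template, live_sample, threshold=0.3):
    """1:1 check: does the live sample match the claimed identity?"""
    return math.dist(claimed_template, live_sample) <= threshold

def identify(live_sample, enrolled, threshold=0.3):
    """1:N search: which enrolled identity, if any, does this match?"""
    uid, dist = min(
        ((u, math.dist(t, live_sample)) for u, t in enrolled.items()),
        key=lambda pair: pair[1],
    )
    return uid if dist <= threshold else None

# Toy enrolled templates; real systems store high-dimensional vectors.
enrolled = {"alice": [0.1, 0.9], "bob": [0.8, 0.2]}
print(verify(enrolled["alice"], [0.12, 0.88]))  # True: access granted
print(identify([0.79, 0.22], enrolled))         # "bob"
```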