UK government under pressure to legislate “Wild West” of biometrics

By Emma Olsson | 5 May 2020

The UK government needs to create rules governing how biometric technologies are used, said Hugh Milward, director of corporate, external and legal affairs at Microsoft UK, during a panel on developing governance for biometrics at a Westminster eForum policy conference today.

Market participants are urging regulators and parliamentarians to legislate the use of biometric technology such as facial identification and voice recognition.

“Given that this technology is developing fast, we really do believe that there needs to be laws in place to help protect citizens and individuals as we step forward,” said Milward.

According to Milward, legislation should require tech companies to provide documentation explaining the capabilities and limitations of the technology in simple terms, enable third-party testing and ensure meaningful human review.

“Some of these issues are contained within [the General Data Protection Regulation] (GDPR) – most are not – but we believe there is tremendous value in bringing these together into a single place, into a single clear set of laws that people can scrutinise. We believe that companies like ours need to take action now before the law catches up,” he said.

According to Silkie Carlo, director of Big Brother Watch, policymakers are long overdue in legislating for the fast-growing technology.

“There should be some alarm bells here. I’ve been in a number of policy forums with Microsoft and I think it’s really important that they’re taking an active interest in the policy area and making interesting contributions, but it does trouble me that there are more policy contributions from people who are involved in the conversation because they need to find a policy legal basis on which to sell their products, than democratically elected people. And I think that’s a real problem we need to grapple with,” she said on the panel.

In financial services alone, the market for biometrics is projected to grow by $6.9bn at a compound growth rate of 15.5 percent, according to March research from Research And Markets. Biometric data is currently protected under GDPR, but with a rapidly growing market and a plethora of use cases, some market participants do not believe the regulation is sufficient.

“It’s like so many amazing developments in technology, you create a knife and it can be used for tremendous good and it can be used for tremendous damage … it can’t just be left alone,” said Milward.

Much of the concern surrounding biometric technology focuses on the use of facial identification by the police and in public places. The King’s Cross Central Limited Partnership (KCCLP) faced flak in September 2019 after the Information Commissioner’s Office (ICO) revealed its use of facial recognition technology (FRT) to surveil London’s King’s Cross area. The KCCLP has since said it will not reintroduce FRT.

But the use cases for biometrics in financial services fall into a different category that should be addressed separately, said Andrew Bud, founder and CEO of face authentication company iProov, during a later panel on the UK’s commercial biometrics sector.

“I think we have to draw a very strong distinction between face recognition for surveillance and face verification for authentication. And the way I would distinguish that is – Does the user know that [facial recognition] is happening? Have they been given the opportunity to genuinely express or more importantly to refuse consent? And are they benefiting themselves directly from the application of the face verification technology?”

Remote identification through digital identities is currently booming in financial services due to coronavirus, as the pandemic prevents financial institutions from verifying customers in person. As digital identity grows in prevalence, its distinction from other biometric use cases must be clear, said Bud.

“Those are two profoundly different ethical spheres. I absolutely do not make light of the ethical issues of face matching for surveillance, but I think it’s a completely different ethical argument when one uses facial verification for authenticating.”
