From fingerprints to facial recognition, biometric identification is the gatekeeper for many a digital service. Even old-school telephone banking is getting a refresh, with the UK’s major financial institutions eschewing passwords to authenticate customers through their voices.
On the surface, people seem to be embracing these services. Over 1.5 million people are already signed up to HSBC’s voice authentication, which launched in 2016. Barclays is also offering this service, claiming that it makes telephone banking easier to use and more secure.
Meanwhile, Lloyds Banking Group, which includes Halifax and Bank of Scotland, has been offering customers the ability to authenticate via voice since October last year.
Using this type of service requires a banking customer to enroll their “voice print”. This is often created by the user repeating a certain phrase several times, such as: “My name is X and this is my password.”
When the person next phones their bank, they repeat the phrase and a statistical match against the voice print authenticates them.
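The matching step can be sketched as follows. The averaged feature vectors, cosine-similarity scoring and the 0.85 threshold are all illustrative assumptions for this sketch — real systems use far richer speaker models, and no bank has published these details:

```python
import math

THRESHOLD = 0.85  # illustrative acceptance threshold, not a real bank's value

def make_voice_print(samples: list[list[float]]) -> list[float]:
    """Enrollment: average feature vectors from several repetitions of the passphrase."""
    dims = len(samples[0])
    return [sum(s[i] for s in samples) / len(samples) for i in range(dims)]

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Statistical match score between a stored print and a new utterance."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def authenticate(voice_print: list[float], new_sample: list[float]) -> bool:
    """Accept the caller only if the new utterance scores above the threshold."""
    return cosine_similarity(voice_print, new_sample) >= THRESHOLD
```

Enrollment here would call `make_voice_print` with feature vectors from each repetition; each later call is scored against that stored print.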
Biometric services such as these are largely driven by convenience, as well as additional security at a time when financial fraud is widespread. But the rollout of voice authentication is also raising questions about how the data is collected and stored – and crucially, how customer consent is gained.
Explicit consent required
The EU General Data Protection Regulation (GDPR) states that biometric information is personal data and as such, a person’s consent to use it must be “explicit”. Yet compliance is apparently not so straightforward: last year, HM Revenue and Customs came under fire for collecting the voice prints of over five million taxpayers without their consent.
Consent under GDPR must be “informed, unequivocal and freely given”. In other words, it can’t be implied by conduct or a “soft opt-in”. At the same time, the burden is on the controller, the company that collects the data, to demonstrate this.
Of course, all the banks say they gain consent to collect users’ voice prints in a GDPR-compliant way. But transparency around how this data is collected and used varies depending on the organization.
Barclays, for example, creates a voice print by analyzing several of a customer’s telephone banking conversations – and it does this without alerting customers it is happening. Up until the voice print becomes strong enough to “uniquely identify” the customer, it is known as a “partial print”. The bank confirmed it does not explicitly ask for customer consent to collect these partial voice prints. Indeed, it says that consent is not required to store and analyze partial prints, because a partial print cannot be used to uniquely identify a particular customer. Barclays claims this therefore does not qualify as “special category personal data” under GDPR.
Only when the partial print becomes uniquely identifiable will Barclays ask for consent to store and use it. If the customer declines, does not respond, or ends the call, Barclays will delete the voice print.
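The decision rule Barclays describes could be sketched as a small state machine. The state names and the `on_consent_answer` helper are hypothetical, invented here purely to illustrate the flow:

```python
from enum import Enum, auto
from typing import Optional

class PrintState(Enum):
    PARTIAL = auto()           # not yet uniquely identifying; no consent sought
    AWAITING_CONSENT = auto()  # print now uniquely identifies the customer
    STORED = auto()            # customer gave explicit consent
    DELETED = auto()           # declined, did not respond, or ended the call

def on_consent_answer(answer: Optional[str]) -> PrintState:
    """Apply the rule as described: anything short of an explicit 'yes'
    (a decline, silence, or a hang-up) results in deletion."""
    return PrintState.STORED if answer == "yes" else PrintState.DELETED
```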
The use of voice biometrics in general raises some important questions around privacy and data protection, says Guy Cartwright, associate solicitor, commercial services, Coffin Mew. “Our voice is a unique and integral part of us. It doesn’t change like a password: you can’t change your voice if a company loses your data. Because of this, the risks to the individual are that much greater when companies want to process it – and that means firms need to work very hard to demonstrate compliance with the law.”
“If personal data, like voice biometrics, is collected without consent but with the knowledge that it will then be potentially used for other purposes, such as identification, my view is this is not in accordance with the principles of the GDPR,” Cartwright says.
Some banks have been more active than others in gaining customers’ consent: HSBC spent several months rolling out its voice authentication service while informing customers how it was doing so, says Anthony Stephenson, product director at Enghouse Interactive, which builds communications interfaces for companies.
HSBC customers must enroll for Voice ID through an automated line, where instructions advise the customer that by saying the phrase “my voice is my password”, they are giving their consent for the bank to store the voice print.
Similarly, Lloyds Banking Group says its customers opt into its program and are aware that voice biometrics are being used. Voice prints are encrypted and stored in a secure database behind a firewall, and customers can withdraw from the service at any time and have their voice print deleted.
So far, so compliant?
In fact, Cartwright thinks it’s very difficult to meet the requirements for consent when it is obtained over the phone. “There is little opportunity to consider why that data is being collected and how long it is stored for,” he says.
Voice print versus fingerprint
When it comes to the security of voice authentication compared to other biometric methods, fingerprints are easier to copy, says Stephenson, pointing out that it’s possible to gain access to a device’s fingerprint scanner using a gummy bear.
At the same time, voice authentication is not unbreakable. For example, an attacker could potentially record someone and play their voice down the phone. A fraudster might be able to successfully masquerade as someone else, as evidenced by the BBC reporter who was able to trick HSBC’s system by asking his non-identical twin to mimic his voice.
“No authentication method is 100% bullet proof,” admits Efrat Kanner-Nissimov, marketing director in charge of real-time authentication at customer analytics provider NICE. “But there are a lot of things that vendors add to solutions to ensure no one is mimicking the voice or playing it back.”
Robin Bortz, voice biometrics development manager at Nuance Communications, points to multiple security measures – voice print and speaker identity are encrypted so that voice prints cannot be reverse-engineered, while file names are audio hashed and so cannot be matched with a particular person if the bank’s system is breached.
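One way to implement the kind of file-name hashing Bortz describes is with a keyed hash, so that a stolen print store alone cannot be matched back to customers. The key, its storage location and the exact scheme below are assumptions for illustration, not Nuance’s actual design:

```python
import hashlib
import hmac

# In practice this key would live in a separate key-management system,
# not alongside the voice prints it protects.
SECRET_KEY = b"hypothetical-key-held-outside-the-print-store"

def storage_key(customer_id: str) -> str:
    """Derive an opaque, deterministic file name for a customer's voice print.

    Without the key, an attacker who breaches the print store sees only
    hex strings and cannot link any print to a particular customer.
    """
    return hmac.new(SECRET_KEY, customer_id.encode(), hashlib.sha256).hexdigest()
```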
Even so, there are still questions about exactly how accurate voice authentication is – and whether it can be easily tricked. This so-called “false acceptance” rate needs to be tested in an environment with a large sample set, says Matt Lewis, research director at NCC Group. So far, such studies have yet to be done.

Voice biometrics are also stored differently from fingerprints. Apple, for example, keeps fingerprint data on the iPhone itself, and users have full control over deleting it when they want to. “That stops Apple from having GDPR problems as [the biometric data] is under the full authority of the user,” says Lewis. In contrast, he says, the banks are likely to be storing biometrics on their own servers.
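The “false acceptance” rate Lewis refers to – and its counterpart, false rejection – can be measured from labelled test attempts. The scores and threshold below are illustrative:

```python
def false_acceptance_rate(impostor_scores: list[float], threshold: float) -> float:
    """Fraction of impostor attempts wrongly accepted at a given threshold."""
    accepted = sum(1 for s in impostor_scores if s >= threshold)
    return accepted / len(impostor_scores)

def false_rejection_rate(genuine_scores: list[float], threshold: float) -> float:
    """Fraction of genuine attempts wrongly rejected at the same threshold."""
    rejected = sum(1 for s in genuine_scores if s < threshold)
    return rejected / len(genuine_scores)
```

Raising the threshold lowers false acceptance but raises false rejection, which is why such testing needs a large sample set to be meaningful.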
Did you say yes?
Jamal Elmellas, CTO at cybersecurity firm Auriga Consulting, doesn’t think voice identification as a form of authentication presents a privacy problem, although he emphasizes the importance of reading a bank’s terms and conditions so users can ensure they are not unknowingly signed up to a service they don’t want.
And in general, the biometric storage methods are robust – voice data is encrypted and cannot be associated with any individual customer; in Barclays’ case, audio for both partial and completed voice prints is stored separately from its customer systems.
But with the exception of HSBC, it doesn’t seem banks are being all that transparent when it comes to getting consent for collecting people’s voices.