Whether it’s unlocking your phone, accessing your online bank account or government services, or verifying travellers at airports to put an end to passport queues, an increasing number of companies and governments are looking to facial recognition to verify and authenticate users.
This is because, in its current form, authenticating or verifying identities online isn’t working: digital identity is broken. In the physical world, we identify people instinctively by their characteristics – their face or voice, for example – but online, it’s harder to prove identity and easier for fraudsters to pretend to be someone they’re not.
We’ve seen frequent data breaches at brands such as BA, where customer details are stolen and then sold on the dark web, making it easy for fraudsters to acquire login credentials en masse. The rise in fraud and in the use of synthetic identities shows just how broken digital identity is.
Many businesses are looking to facial recognition to improve their digital identity practices, as a first step in improving verification and authentication strategies. However, as a standalone method, it has its own shortcomings and can become the single point of failure in the verification or authentication process.
First, it’s important to distinguish between verifying and authenticating a user. When facial recognition technology verifies a user, it detects an individual’s face, analyses it and then compares it against information provided, such as an identity card.
Once identity has been verified, authentication is the process of confirming a customer’s identity by requesting further credentials before allowing access to services – for example, personal information such as a password or PIN.
As a method of verification, facial recognition and biometrics more broadly work well. For example, when opening a new bank account, consumers understand that they will need to verify their identity, so using biometrics is an appropriate method to do that.
However, relying on facial recognition for authentication or authorisation purposes further along the online user journey may not be appropriate, for several reasons.
1. Inherent biases
One well-publicised challenge of facial recognition, when used as an authentication or authorisation method, is that it can exclude pockets of the population. The widely reported issues with Uber’s use of facial recognition for drivers to access its app had huge unintended consequences through racial and religious bias and technology elitism.
2. Added friction
Physical biometrics can also add friction to the customer journey. In some cases, friction is appropriate. For example, when customers are asked to reconfirm their identity before authorising the transfer of a large sum of money, this can be reassuring.
However, if a facial ID is required each time you buy something from an online retailer, you’ll likely take your business to another brand where it is easier and faster to make a purchase.
3. Security limitations
It may also surprise people to learn that facial biometrics have security limitations. When a simple photo is the single form of authentication, fraudsters can falsely claim that the biometric method is broken just to circumvent the authentication process. Fraudsters are also actively looking for ways to trick facial recognition systems.
4. Lack of privacy
Facial recognition is obtrusive and isn’t privacy-preserving. Your face is personally identifiable information (PII), and in many countries permission is required to collect, store and process it under the General Data Protection Regulation (GDPR), so many people will choose not to use facial recognition as a method to authenticate or authorise.
Our own research reveals that only 38% of UK consumers feel comfortable using static biometrics, such as fingerprint ID or facial recognition, to confirm their identity when using a service or buying a product.
5. Single point of failure
Finally, using facial recognition as an authenticator means asking a closed question – for example, “is this the user’s face?” It’s a yes-or-no answer, but if the computer or phone doesn’t recognise the user, what happens then? Quite often there is no other method to allow the user to authenticate and get on with their journey (as was the case with Uber), highlighting the risk of facial recognition becoming a single point of failure.
Unfortunately, when there is a backup plan, it’s often a reversion to passwords and pins which are weak and open to compromise – often the reason security was stepped up to the seemingly more robust method of facial recognition.
Fixing digital identity by layering security
So, what is the solution? While facial recognition has its place, it is vital that it is layered with other data points to ensure there isn’t a single point of failure in the authentication process. One way of doing this is through behavioural biometrics.
Behavioural biometrics, such as swiping or typing patterns, are extremely hard to mimic or compromise. Machine learning technology analyses how consumers physically interact with their devices: the angle at which they hold their mobile device, their typing cadence, the pressure they apply, and even their mouse movements.
These inputs build a unique model of a consumer’s “usual” behaviour and provide a baseline for comparison with subsequent interactions. If a consumer then displays behaviour that differs from the expected “norm”, the access attempt is flagged as potentially fraudulent and can be escalated, either requiring further authentication or being stopped completely.
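To make this concrete, here is a minimal, purely illustrative sketch of the idea: a per-user baseline is built from past typing-cadence samples, and new sessions are scored by how far they deviate from that baseline. The feature (mean inter-key interval), the z-score approach and the thresholds are all hypothetical simplifications, not a description of any vendor’s actual system.

```python
from statistics import mean, stdev

def build_profile(sessions):
    """Baseline of a user's 'usual' behaviour from past sessions.
    Each sample is a mean inter-key interval in milliseconds."""
    return {"mean": mean(sessions), "stdev": stdev(sessions)}

def risk_score(profile, observed):
    """Distance of a new session from the baseline, in standard
    deviations (a simple z-score)."""
    return abs(observed - profile["mean"]) / profile["stdev"]

def decide(profile, observed, step_up=2.0, block=4.0):
    """Escalate when behaviour differs from the expected norm:
    allow, require further authentication, or stop completely."""
    score = risk_score(profile, observed)
    if score >= block:
        return "block"
    if score >= step_up:
        return "step-up authentication"
    return "allow"

# A user who normally types with ~180ms between keystrokes:
profile = build_profile([175, 182, 178, 185, 180])
print(decide(profile, 181))  # close to the norm -> allow
print(decide(profile, 260))  # far from the norm -> block
```

A production system would use many features at once (pressure, device angle, mouse movement) and a learned model rather than a single z-score, but the shape of the decision – baseline, deviation, escalation – is the same.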
By layering behavioural biometrics with other intelligence, such as device and location data, businesses can ensure there is no single point of failure if something doesn’t look quite right.
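The layering principle can be sketched in a few lines: several independent signals each contribute to a combined risk score, so no one check decides the outcome on its own. The signal names, weights and thresholds below are invented for illustration only.

```python
def layered_decision(signals):
    """Combine independent risk signals (each scored 0.0 = normal
    to 1.0 = suspicious) so no single check is a point of failure.
    Weights and thresholds are hypothetical."""
    weights = {"behaviour": 0.5, "device": 0.3, "location": 0.2}
    score = sum(weights[k] * signals[k] for k in weights)
    if score >= 0.7:
        return "block"
    if score >= 0.4:
        return "step-up authentication"
    return "allow"

# Known device, usual location, typical behaviour:
print(layered_decision({"behaviour": 0.1, "device": 0.0, "location": 0.0}))
# Unrecognised device and unusual location raise the combined risk,
# even though behaviour alone looks broadly normal:
print(layered_decision({"behaviour": 0.3, "device": 1.0, "location": 1.0}))
```

The point of the design is that a failure or spoof of any one signal (including a face match) moves the score, but never decides the outcome by itself.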
We’ll only continue to see announcements from organisations and governments backing facial recognition and other static biometric technologies. But to securely authenticate users without compromising the user experience, organisations must not rely on facial recognition as a standalone method of authentication.
Amir Nooriala is chief commercial officer at Callsign.