Biometrics and Privacy

I spoke today at a Social Market Foundation event on biometrics. The keynote was Prof James Wayman, who was exceptionally fluent and interesting on the topic, and I was pleasantly surprised to see him talking openly about the abilities and limitations of biometric technologies.

Biometric technologies are one of those ‘lightning rod’ topics that quickly polarise people into the ‘for’ and ‘against’ camps. It’s difficult to say exactly why this is, but much of the problem probably rests in the dystopian science-fiction visions of the likes of Brazil or Minority Report that blur the reality of technology with the possibility of imagination.

I’m personally not too concerned about the application of biometric technologies in appropriate situations. What worries me are the processes and broader IT systems that depend on those technologies. Biometrics occasionally throw up false acceptances or false rejections. The problem is that the systems and officials that depend on those biometrics, and the databases of personal information to which they are linked, place too much dependence on them and then make ridiculous decisions as a result. The attitude of “there’s a biometric involved so it must be correct” is very dangerous indeed – ask the people who have suffered wrongful arrest, rendition and torture as a result of stupid decisions made on the back of biometric system errors (more on this in a forthcoming blog article).

The paradox is that used correctly, biometrics can offer great privacy benefits. An oft-quoted example is that of using fingerprints to determine school meal entitlement – all children provide the print to obtain their meal, but the system knows that those on meal subsidies should not be charged, and the children are not stigmatised by having to admit to that subsidy. That’s a great example of identity technologies delivering privacy through anonymisation.
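
The school-meal example can be sketched in a few lines of code. This is a hypothetical illustration, not any real system: the names, the account records and the exact-match `match()` function are all assumptions. The point it shows is that the till only ever receives the instruction “serve”, whatever the funding source, so serving staff never learn who is subsidised.

```python
# Hypothetical sketch of the school-meal example: the till looks up an
# entitlement by fingerprint template and gets back only an instruction,
# never the funding source. All names and data here are illustrative.

def match(template_a: bytes, template_b: bytes) -> bool:
    """Stand-in for a real biometric matcher; here, exact comparison."""
    return template_a == template_b

# Enrolment database: template -> account record.
ACCOUNTS = {
    b"template-alice": {"balance": 5.00, "subsidised": False},
    b"template-bob":   {"balance": 0.00, "subsidised": True},
}

def authorise_meal(presented_template: bytes, meal_price: float) -> str:
    for enrolled, account in ACCOUNTS.items():
        if match(presented_template, enrolled):
            if account["subsidised"]:
                return "serve"   # subsidy settled centrally, no charge made
            if account["balance"] >= meal_price:
                account["balance"] -= meal_price
                return "serve"   # charged to the pre-paid balance
            return "refer to office"
    return "not enrolled"

print(authorise_meal(b"template-bob", 2.50))    # subsidised child
print(authorise_meal(b"template-alice", 2.50))  # paying child: same outward result
```

Both children receive the identical response at the till, which is exactly the privacy property the paragraph above describes.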

The problem is that all too often the organisations implementing biometric systems have failed to be transparent about the purpose or operation of the system, and this has reinforced mistrust of the technologies. School implementations are once again an example, since local authorities have often refused to discuss details of their fingerprinting approaches, or even to seek valid consent to that use of personal information, believing it to be covered by statutory processing permissions.

Biometrics can reveal information about their subject. A photo can reveal age, gender, race, religion (if the subject is wearing religious clothing or jewellery), health and other information. A voice can reveal age, gender, class, region of origin, education. Even fingerprints can give away gender and in some cases ethnicity. These attributes must be protected appropriately in any biometric system.

Privacy issues arising from the use of biometric technologies appear to have coalesced around three key questions:

Are biometric technologies an appropriate and proportionate solution to the problem? Just because we can use a biometric solution, that doesn’t mean it’s right to do so. The Hong Kong Information Commissioner, for example, made it clear that he felt the fingerprinting of schoolchildren was not an acceptable application. Too often, biometrics are rolled out as a solution looking for a problem.

Are we trying to identify or authenticate? In the vast majority of cases, biometrics can be used to help authenticate an assertion by the individual: eg “I am the legitimate holder of this token”. However, sloppy system design, a desire to use every part of the technology system, or a misconceived desire to future-proof the technology investment means that organisations instead set out to identify the user: rather than test an assertion, they try to single out an individual from a database. An example is the IRIS immigration system, which picks the individual out from its database of enrolled users by the biometric alone, rather than asking them to present a machine-readable document and then confirming that the holder has the associated biometrics.
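
The distinction between the two modes can be made concrete. In this minimal sketch (the `score()` similarity function and the threshold are toy assumptions, not a real matcher), authentication is a single 1:1 comparison against a claimed identity, while identification is a 1:N search across the whole database:

```python
# Contrasting authentication (1:1) with identification (1:N).
# score() and THRESHOLD are illustrative stand-ins for a real matcher.

def score(a: str, b: str) -> float:
    """Toy similarity: fraction of positions where the samples agree."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

THRESHOLD = 0.8

def authenticate(claimed_id: str, live_sample: str, db: dict) -> bool:
    """1:1 - test the claim 'I am claimed_id' against one stored template."""
    stored = db.get(claimed_id)
    return stored is not None and score(live_sample, stored) >= THRESHOLD

def identify(live_sample: str, db: dict):
    """1:N - search the whole database for the best match above threshold."""
    best_id, best = None, 0.0
    for user_id, stored in db.items():
        s = score(live_sample, stored)
        if s > best:
            best_id, best = user_id, s
    return best_id if best >= THRESHOLD else None

db = {"alice": "AAAAABBBBB", "bob": "CCCCCDDDDD"}
print(authenticate("alice", "AAAAABBBBC", db))  # True: one comparison
print(identify("AAAAABBBBC", db))               # "alice": N comparisons
```

Note that identification touches every record, so its false-match risk grows with the size of the database – one reason the authentication mode is usually the more proportionate design.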

Should we gather biometric templates or biometric images? The most complex and expensive part of a biometric scheme is enrolment of the data subjects into the system. Algorithms and technologies are developing quickly, and to protect the investment it is tempting to capture images (a high-quality scan of the biometric, eg a digital photo or high-quality voice recording) so that templates (mathematical products derived from that image, which can be used to confirm a biometric but cannot be used to recover the original image) can be regenerated when needed. However, 9 times out of 10 organisations go for the image option since they believe that this will future-proof their investment. Templates have fewer privacy implications than images, since a stolen image can (in theory) be used to assist in attacks on the user’s identity, whilst the template is of far less use. Moreover, once a biometric image has been stolen and used for fraud, it can’t be revoked – you can’t change your fingerprints!
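
The privacy advantage of templates rests on their one-way nature. The sketch below illustrates only that property, under a loud assumption: real template generation uses feature extraction that tolerates noisy re-presentation of the biometric, not a cryptographic hash, which would demand a bit-identical sample. The hash simply makes the irreversibility easy to see:

```python
# Illustration of the one-way property of templates. NOTE: a hash is a
# deliberate simplification - real biometric templates are derived by
# feature extraction so that noisy re-captures still match.
import hashlib

def derive_template(image_bytes: bytes) -> bytes:
    """Toy one-way template derivation: irreversible by design."""
    return hashlib.sha256(image_bytes).digest()

image = b"high-quality fingerprint scan ..."
template = derive_template(image)

# Verification re-derives the template from a fresh capture and compares:
assert derive_template(image) == template

# But the stored template alone cannot be inverted to recover `image`,
# whereas a stolen image could be replayed against other systems.
```

This is why a breached template database is a far smaller disaster than a breached image database: the attacker gains verifiers, not the biometric itself.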

Not surprisingly, the answers to these key questions can be derived quickly, easily and at minimal cost. Every biometric application should have a Privacy Impact Assessment (PIA) as part of its business case, completed before any procurement or development commences. The PIA should consider whether biometric technologies are a proportionate and acceptable solution to the problem in hand; whether the application should seek to identify or merely authenticate its users; and whether it is really necessary to capture an image at time of enrolment, or whether a template alone will deliver the necessary functions.

None of this is particularly difficult, and there’s a lot to play for here. Public trust in biometric technologies must be nurtured and protected, and it will only take a single major privacy disaster relating to a biometric system to destroy confidence in all biometrics. Remember, we’ve only got one chance to get this right – because trust won’t come back, and stolen biometrics can’t be replaced.

Join the conversation



Added this post to - and I hope Tod Stevens and readers can contribute to that project. The final goal is to have a design (and hopefully even a prototype) of something an org can show a gov (or biz) and say "this is how you do it right" instead of "everybody's doing it wrong". About 100 years ago, one *could* have argued that hydrogen-filled zeppelins were wrong, but it took a helium-filled one to stop the nonsense. Let's develop our own helium before it's too late.
Current biometric authentication methods present a serious threat, to the point that many people regard them as demeaning. The biometric scheme represents the kind of closed-minded society that the Soviet Union created, and which the free world decried. According to the basic human dignity law, “There shall be no violation of the life, body or dignity of any person as such”: human dignity transcends any social order as the basis for rights, and is neither granted by society nor can it be legitimately violated by society. As free individuals, living in a free country, we have the right to control our own body identifiers and our own physical characteristics. "We are not animals, we are human beings - our body and its lineaments are NOT a blob of tissue" … ‘Biometric’ refers to the measurement of vital body characteristics, derived from the Greek words bio (life) and metric (to measure). From a democratic and legal point of view, an individual has the right to manage his own bodily identifiers (body, dignity, markers, privacy) as the conceptual basis for human rights. Biometrics should enhance rather than conflict with individual privacy and dignity. As the philosopher Immanuel Kant (1724-1804) stated, “Human beings should never be treated as merely means to an end” - namely, human beings are the purpose; they must not be sacrificed to fulfil other purposes. The Myth of Biometrics Enhanced Security:
Biometric Encryption - worth reading: