Settling the bill with your face – a safe method of payment?

Using your face as a bank card or access badge: recently, more and more stories have been appearing in the media about new payment methods and admission control systems based on face recognition technology. Stories about Moscow subway stations, where face scanners are supplementing the traditional ticket dispensers, or about schools in Scotland where pupils can now pay for their cafeteria meals by saying cheese – to the camera.

The main reason for implementing these systems is increased transaction speed, which in the cafeteria example above is targeted at an average of five seconds. It is the next step in an evolution in which cash transactions, still the predominant method of payment a mere fifteen years ago, have gradually given way to alternative ways of checking out. Because of their convenience, new options like contactless card payments and Apple Pay have become hugely popular in a very short time.

In this blog, we take a look at payment and access control systems based on face recognition technology in their privacy law context.

Legal basis for processing

In Article 4(14) of the GDPR, biometric data are defined as ‘personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data’. In other words, human faces in and of themselves do not qualify as biometric personal data; only when an image of the face undergoes specific technical processing are we dealing with biometric personal data. Such data, when processed for the purpose of uniquely identifying a natural person, fall under the special categories of personal data and, under Article 9(1) GDPR, may not be processed unless the data subject has given explicit consent (Article 9(2)(a)) or one of the other exceptions listed in Article 9(2)(b-j) applies.

Under Article 29 of the Dutch GDPR Implementation Act (UAVG), processing biometric data without consent is only allowed for authentication or security purposes, and only when a location or system requires a level of protection that badges or access codes cannot provide without leaving a relevant residual risk. Think of a nuclear power plant, where the risk of someone gaining access with a stolen badge is relevant and obviously unacceptable, which opens the door to access control based on biometric data.

Clearly, checking out at your local supermarket or catching a ride on the subway is not the kind of high-security situation Article 29 UAVG has in mind. In these cases, the use of biometric data can therefore only be justified on the one remaining ground: explicit consent.

New opportunities?

So far, the sensitive nature of biometric personal data does not seem to discourage the development of systems using them for everyday purposes. As we speak, creative entrepreneurs are devising solutions to reconcile the everyday use of face recognition systems with European privacy legislation. Entrepreneurs like Dick Fens, CEO of 20face, who claims that his company has developed software for privacy-proof face recognition applications. “We want to comply with privacy regulations and we are able to do so by keeping the user in full control of facial characteristics,” Fens says. Here is how this works: the user submits a picture, from which the system creates an encrypted code based on the unique properties of the face. This code is then stored in a secure digital vault, and the user decides which third parties, if any, have access to it. Their employer, for instance, could use the code to grant admission to an office location or a company-sponsored gym.
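
For the technically inclined, the following Python sketch shows what such a scheme could look like. It is our own illustration under stated assumptions, not 20face’s actual software: the face-embedding step is stubbed with a hash so the example runs end to end, the DigitalVault class and all other names are hypothetical, and encryption uses the Fernet scheme from the cryptography package as a stand-in.

```python
# Illustration only: a hypothetical enrolment flow in the spirit of the one
# described above. Not 20face's actual software; all names are ours.

import hashlib

from cryptography.fernet import Fernet  # pip install cryptography


def extract_face_template(image_bytes: bytes) -> bytes:
    """Stand-in for a real face-embedding model. A production system would
    derive a template from the unique properties of the face; here we just
    hash the image bytes so the sketch runs end to end."""
    return hashlib.sha256(image_bytes).digest()


class DigitalVault:
    """Stores only encrypted templates; plaintext never enters the vault."""

    def __init__(self) -> None:
        self._records: dict[str, bytes] = {}

    def store(self, user_id: str, encrypted_template: bytes) -> None:
        self._records[user_id] = encrypted_template

    def release(self, user_id: str, user_granted_access: bool) -> bytes | None:
        # The user, not the vault operator, decides who receives the code.
        return self._records[user_id] if user_granted_access else None


def enroll(user_id: str, image_bytes: bytes, vault: DigitalVault) -> bytes:
    """Derive a template, encrypt it, and store only the ciphertext.
    The returned key stays with the user, so only parties the user
    authorises can ever decrypt the template."""
    template = extract_face_template(image_bytes)
    key = Fernet.generate_key()
    vault.store(user_id, Fernet(key).encrypt(template))
    return key


if __name__ == "__main__":
    vault = DigitalVault()
    key = enroll("alice", b"<raw image bytes>", vault)

    # An authorised employer retrieves the code and decrypts it with the
    # key Alice shared; without her consent, release() returns nothing.
    code = vault.release("alice", user_granted_access=True)
    assert code is not None
    assert Fernet(key).decrypt(code) == extract_face_template(b"<raw image bytes>")
```

The design choice that matters here is that the vault holds only ciphertext while the decryption key stays with the user, which is what keeps the data subject in control of any onward disclosure.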

So, apparently the software pseudonymises biometric personal data: it is the code that is actually stored, not the picture on which it is based. In theory, this does indeed seem privacy-proof in the sense of a successful implementation of the ‘privacy by design’ principle. But however safe and watertight it may appear, there remains a possibility that the data are de-pseudonymised at a later stage and used for other purposes or by other parties.

One possible problem is that the line between pseudonymisation and anonymisation is inherently tricky to draw, as numerous practical cases have demonstrated. Take the one in which the City of Enschede received a hefty fine from the Dutch DPA after pseudonymised personal data from its Wi-Fi tracking system proved susceptible to after-the-fact de-pseudonymisation. Granted, in terms of the legal basis for processing, this case hardly compares to scenarios in which biometric data are processed for payment or admission control. Still, it shows how easily a trusted security system can unexpectedly turn out to be flawed after all.
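
To see why ‘hashed’ does not mean ‘anonymous’, consider the following self-contained Python sketch. It is our own toy example, not material from the Enschede case: it assumes identifiers (here, Wi-Fi MAC addresses) are pseudonymised by hashing, and shows how a small, structured identifier space lets anyone reverse the mapping by simple enumeration.

```python
# Toy example (not from the Enschede case file): why hashing identifiers
# drawn from a small, structured space is pseudonymisation, not anonymisation.

import hashlib


def pseudonymise(mac: str) -> str:
    """Replace a Wi-Fi MAC address with its SHA-256 hash."""
    return hashlib.sha256(mac.encode()).hexdigest()


# A tracking system stores only the hash, never the raw address...
stored_hash = pseudonymise("a4:5e:60:01:02:03")

# ...yet anyone who knows the scheme can enumerate candidate addresses and
# match them against the stored hash. Here we brute-force just the last
# byte; in practice, public vendor prefixes (OUIs) keep the realistic
# search space small enough to walk through.
prefix = "a4:5e:60:01:02:"
for last_byte in range(256):
    candidate = prefix + f"{last_byte:02x}"
    if pseudonymise(candidate) == stored_hash:
        print("re-identified:", candidate)  # prints a4:5e:60:01:02:03
        break
```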

In short, it is never a good idea to dismiss potential complications when biometric personal data are used for commercial applications like settling cafeteria bills or controlling admission to public transport or music festivals. The use of such data carries inherent, hidden risks. It is therefore important to always offer consumers freedom of choice and the option of a non-biometric alternative.

Darinka Zarić

Darinka Zarić is a legal counsel at The Privacy Factory. Legal issues surrounding the digital society appeal to her, especially in the fields of privacy law and the use of big data. She is currently pursuing the master’s programme Internet, Intellectual Property and IT-Law at the Vrije Universiteit Amsterdam.
