Privacy and workers in the platform economy

The market share of organisations in the platform economy continues to grow. Where two decades ago the sector was essentially limited to social media businesses and trade portals for second-hand goods, there is now a platform company for virtually every conceivable form of service, delivered at a moment’s notice and often at very competitive prices – companies like Booking.com, Uber and Picnic, to name but a few.

Admittedly, the rise of these platforms has made life easier in many ways for many people, but the added convenience comes with a downside. A recent report from Privacy International suggests that platform workers are highly exposed to the risk of illegitimate surveillance, with algorithms being used to predict fraudulent behaviour. Typically, and not surprisingly given their intended purpose, these algorithms are far from transparent, while a sound legal basis to justify their deployment is usually lacking.

Moreover, what happens when an algorithm does indicate a likelihood of fraudulent activity? Is it then up to the ‘suspect’ to prove his or her innocence? If so, would this not seriously weaken the individual’s legal position?

In this blog, we will try to answer the following question: are platform workers being subjected to automated decision-making within the meaning of the GDPR? To answer it, we will first look at the concept of automated decision-making itself, and then turn our attention to a recent judgment by the Amsterdam District Court.

Automated decision-making

Under Article 22 of the GDPR, data subjects have the right to human intervention when confronted with an automated decision that produces legal effects concerning them or similarly significantly affects them. In other words, data subjects may, in principle, not be subjected to solely automated decision-making, including profiling.

Profiling, as defined in Article 4(4) GDPR, is any form of automated processing of personal data used to evaluate personal aspects of a natural person, in particular to analyse or predict matters such as his or her behaviour or reliability. This definition may well cover the algorithms platform companies use to predict fraudulent behaviour, leaving it to the platform worker to disprove the resulting suspicion.
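
To make this concrete, here is a minimal sketch, in Python, of what such score-based profiling might look like. Every signal, weight and threshold below is invented for illustration and is not taken from any real platform’s system; the point is merely that an opaque weighted score can turn behavioural data into a decision with no human in the loop.

```python
from dataclasses import dataclass

@dataclass
class DriverActivity:
    """Hypothetical behavioural signals a platform might collect per driver."""
    cancelled_trips: int
    completed_trips: int
    route_deviation_km: float   # GPS deviation from expected routes
    duplicate_logins: int       # simultaneous logins from different devices

def fraud_risk_score(a: DriverActivity) -> float:
    """Toy weighted score in [0, 1]; the weights are entirely made up."""
    total_trips = max(a.cancelled_trips + a.completed_trips, 1)
    cancel_rate = a.cancelled_trips / total_trips
    return (0.5 * cancel_rate
            + 0.3 * min(a.route_deviation_km / 10.0, 1.0)
            + 0.2 * min(a.duplicate_logins / 3.0, 1.0))

def automated_decision(a: DriverActivity, threshold: float = 0.6) -> str:
    """Fully automated outcome: nobody reviews the score before the block."""
    return "block_account" if fraud_risk_score(a) >= threshold else "no_action"

# A driver who cancels often and shares an account crosses the threshold.
print(automated_decision(DriverActivity(40, 60, 8.0, 3)))  # block_account
```

Seen this way, the worker faces a decision whose inputs and weights are invisible to them, which is precisely the transparency problem Article 15(1)(h) GDPR addresses, as discussed below.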

There are, however, several exceptions under which automated decision-making is legitimate, listed in Article 22(2) GDPR: necessity for entering into or performing a contract, authorisation by Union or Member State law, or the data subject’s explicit consent.

Privacy International also points out that the algorithms used lack transparency, making it unclear what logic lies behind a specific automated decision. This too may conflict with the GDPR: under Article 15(1)(h), the data subject has a right of access to meaningful information about the logic involved in automated decision-making.

The Uber I judgment

Recently, the Amsterdam District Court has handed down several decisions touching on the doctrine of automated decision-making and the related right of access to information. Below, we will focus exclusively on specific details of the so-called Uber I case.

A number of taxi drivers working for the Dutch branch of Uber saw their contracts terminated because of alleged fraudulent activities. The termination took the form of blocking the drivers’ Uber accounts, a measure of which they were informed by email. In response, the drivers filed a civil suit against their (former) employer, demanding, among other things, reversal of the terminations and arguing that these were based on fully automated decisions, which do not constitute a legitimate ground for contract termination. The question before the court, in other words, was whether or not the terminations qualified as automated decision-making.

In deciding this question, the court considered the three criteria of Article 22(1) GDPR:

1. Has a decision actually been made? Yes: the decision to block specific accounts.
2. Was this decision based solely on automated processing? The court judged not, as Uber had, in its opinion, sufficiently demonstrated that fact-finding research was carried out by employees of the company prior to the ultimate decision.
3. Did the decision produce legal effects for the drivers, or similarly significantly affect them? Again, the court answered in the negative, reasoning that blocking the accounts carried no long-term, lasting effects, as the block could relatively easily be undone: affected drivers could simply contact an Uber employee and, if able to demonstrate the absence of fraud, would have their account access restored and could immediately go back to work.
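
For readers who prefer to see the logic spelled out, the court’s reasoning can be paraphrased as a simple conjunctive test: Article 22(1) is engaged only if all three answers point the same way. The sketch below is our own heavily simplified illustration of that chain of reasoning, not a statement of how the law must be applied; the field names and boolean simplifications are invented.

```python
from dataclasses import dataclass

@dataclass
class ContestedDecision:
    """Hypothetical summary of how a contested measure came about."""
    decision_made: bool        # 1) was a decision actually taken?
    human_fact_finding: bool   # 2) did employees investigate beforehand?
    easily_reversible: bool    # 3) can the data subject get it undone?

def article_22_engaged(d: ContestedDecision) -> bool:
    """All three Article 22(1) criteria must be satisfied at once."""
    solely_automated = not d.human_fact_finding
    significant_effects = not d.easily_reversible
    return d.decision_made and solely_automated and significant_effects

# The Uber I facts as the court assessed them: a decision was made, but
# employees investigated first and the block was reversible.
uber_i = ContestedDecision(decision_made=True,
                           human_fact_finding=True,
                           easily_reversible=True)
print(article_22_engaged(uber_i))  # False: Article 22 does not apply
```

Note how the conjunction makes the provision easy to escape: demonstrating either human involvement or reversibility is enough to take a decision outside Article 22 altogether.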

In its judgment, the court therefore ruled that no automated decision-making within the meaning of the GDPR had taken place and that, as a result, there was no basis for invoking the right of access to the information the decision was based on.

Conclusion

From the above, it seems clear that it takes quite a bit for a case to qualify as automated decision-making carrying ‘significant effects’ for the data subjects. The first two conditions, that a decision must actually have been made and that it must be based on automated processing with no human intervention, are relatively closed requirements with little room for debate. Almost any act can, in one way or another, be argued to qualify as a decision in the sense of the GDPR, while the question of whether there was human intervention in the decision-making process is simply a matter of demonstrating fact. If any form of human intervention can be shown, the applicability of automated decision-making is ruled out straight away.

In contrast, the final criterion, whether a decision produces legal effects or otherwise significantly affects the data subject, is more of an open question, leaving much to the court’s interpretation. Although blocking an account, and thereby preventing a driver from working, does in principle affect the person involved by carrying economic consequences, courts will, in assessing whether the effects are ‘significant’, primarily consider whether the decision has long-term or lasting effects. As the account block in this case could relatively easily be undone by contacting Uber, the court ruled that there were no long-term or lasting effects for the drivers, thereby effectively dismissing their claims.

From a GDPR perspective, the drivers were not subjected to automated decision-making, which means that Uber did not act unlawfully in this respect. The fact remains, however, that taxi drivers working for these types of organisations are subject to a higher degree of algorithm-based surveillance than their colleagues at conventional cab companies. On the other hand, those colleagues are in more direct contact with supervisors, who may monitor their behaviour just as closely. In the long term, the fact that many platform companies deploy algorithms to take over these managerial tasks may well have undesirable effects for workers in terms of the intensity of surveillance and the pressure it creates. Nevertheless, platforms like Uber seem to have little choice but to use algorithms in a managerial capacity.

Darinka Zarić

Darinka Zarić is a legal counsel at The Privacy Factory. She is drawn to legal issues surrounding the digital society, especially in the field of privacy law and the use of big data. She is currently pursuing the master’s programme Internet, Intellectual Property and IT-Law at the Vrije Universiteit Amsterdam.
