Depression detecting apps – marketing gold or privacy risk?


Apple, in cooperation with the University of California, Los Angeles (UCLA) and Biogen, a U.S.-based pharmaceutical company, is currently planning the development of software to detect depression and common forms of cognitive decline at an early stage.

Ultimately, the resulting algorithms should be able to assess the user’s mental health based on iPhone sensor data relating to, among other things, sleep cycle, typing behaviour, and levels of exercise and physical activity.

Over the past few years, we have seen an enormous increase in the use of health and lifestyle apps, but so far, they have usually been limited to measurable aspects of physical health, as in the case of heart rate monitors and pedometers.

The idea of a mental health measuring app will sound like pure marketing gold to many sales professionals, who will immediately be ready to exploit this obviously profitable gap in the market. What tends to be overlooked, however, is that these commercially compelling scenarios involve the processing and sharing of special categories of personal data. Several years ago, the Article 29 Working Party (predecessor of what is now the European Data Protection Board, EDPB) already advised on the privacy-friendly use of health apps, as careless handling of health data can pose major risks to the private lives of individual citizens. In short, very strict rules apply to the processing of health-related personal data.

In this blog, we will take a closer look at health apps from a privacy legal perspective, while exploring the risks that may be involved.

Processing health data

In principle, the processing of special categories of personal data is prohibited under Article 9(1) of the GDPR, unless one of two conditions is met: either the data subject has explicitly consented to the processing, or the processing is necessary for one of the grounds for exception listed in Article 9(2)(b-j) of the GDPR. In the latter case, in other words, consent for the processing is not required.

Where the processing of special categories of personal data in the context of health apps is concerned, none of the exceptions listed in Article 9(2)(b-j) of the GDPR apply, which means that processing is only allowed on the condition of explicit consent from the user.

Requirement of consent

Under the GDPR, consent must meet a number of requirements in order to be valid. First of all, it needs to be freely given, which means that withholding consent may not carry detrimental consequences for the data subject. Secondly, consent has to be unambiguous, meaning that it must be expressed by a clear affirmative action, such as ticking a checkbox to indicate agreement.

The third requirement is that consent has to be informed. In short, this means that users need to know what they are consenting to and to whom. It must also be clear which personal data are to be processed, for which purpose and for how long. Users must also be told that they are free to withdraw previously given consent at any time.

Finally, consent has to be specific, meaning that it always relates exclusively to one specific processing operation with one specific purpose. For any other purpose, separate consent will have to be obtained.
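For developers, these four requirements translate fairly directly into how consent is recorded and checked inside an app. The sketch below is a minimal, purely hypothetical illustration in Swift; none of the type or method names correspond to any real Apple API or to the software discussed above. It merely shows one way an app could keep consent specific (one purpose per record), unambiguous (recorded only after an explicit opt-in action), withdrawable at any time, and enforced before every processing operation.

```swift
import Foundation

// Hypothetical illustration only: these types are not a real Apple or
// GDPR-compliance API. They sketch how consent could be kept purpose-specific,
// explicit, and withdrawable, with processing gated on it.

/// One processing purpose, consented to separately (specificity requirement).
enum ProcessingPurpose: String {
    case sleepAnalysis = "Analyse sleep-cycle data to estimate mood trends"
    case typingAnalysis = "Analyse typing behaviour to estimate mood trends"
}

/// A single, purpose-bound consent given by the user.
struct ConsentRecord {
    let purpose: ProcessingPurpose
    let givenAt: Date
    var withdrawnAt: Date? = nil   // consent can be withdrawn at any time
}

/// Keeps track of which purposes the user has actively agreed to.
final class ConsentStore {
    private var records: [ProcessingPurpose: ConsentRecord] = [:]

    /// Called only after the user performs a clear affirmative action
    /// (e.g. ticking an unticked checkbox), never pre-selected.
    func recordExplicitConsent(for purpose: ProcessingPurpose) {
        records[purpose] = ConsentRecord(purpose: purpose, givenAt: Date())
    }

    /// Withdrawing consent must be as easy as giving it.
    func withdrawConsent(for purpose: ProcessingPurpose) {
        records[purpose]?.withdrawnAt = Date()
    }

    /// Processing is allowed only while an un-withdrawn consent exists.
    func mayProcess(for purpose: ProcessingPurpose) -> Bool {
        guard let record = records[purpose] else { return false }
        return record.withdrawnAt == nil
    }
}

// Usage: every processing operation is gated on the matching consent.
let consent = ConsentStore()
consent.recordExplicitConsent(for: .sleepAnalysis)

if consent.mayProcess(for: .sleepAnalysis) {
    // only here may sleep-related sensor data be analysed
}
if !consent.mayProcess(for: .typingAnalysis) {
    // typing data must not be touched: no consent was given for this purpose
}
```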

Risks?

All of this should imply that users are making deliberate, well-considered choices as to whether or not they agree to the processing of their health-related data. In practice, however, most users never even read the terms of agreement or privacy statements.

Furthermore, research by the Belgian consumer association shows that the majority of popular health apps actually transfer personal data to third parties, with some apps even openly sharing health-related data. The same study also shows that many of these apps are far from transparent about which third parties receive the user data and for what purposes. Not to mention the fact that many health-related apps seem to lack adequate security, as demonstrated by frequent data breaches.

If information on people’s physical condition is clearly of great interest to many different advertisers, the same will be true of mental health data, to an equal if not greater extent. People struggling with personal problems of whatever kind are generally more easily seduced into all sorts of impulsive or ill-advised behaviour. Think, for instance, of a provider of casino games targeting early-stage dementia patients. It is not hard to understand the sensitive nature of this kind of information, which, once in the wrong hands, can easily be misused in a multitude of ways, all at the expense of already vulnerable data subjects.

This is not to say that apps capable of measuring the state of a person’s mental health cannot be genuinely helpful to large groups of users, if only by making them aware of certain conditions and prompting them to seek timely professional help. What remains absolutely essential, however, is that users of health apps, in particular those focused on mental health, are fully aware of the risks involved. At this point, all we can do is hope that Apple will take the privacy-by-design principle to heart, and that its possible success will not lead to a flood of similar applications from potentially less conscientious developers.

Darinka Zarić

Darinka Zarić is a legal counsel at The Privacy Factory. She is drawn to legal issues surrounding the digital society, especially in the fields of privacy law and the use of big data. She is currently pursuing the master’s programme in Internet, Intellectual Property and IT-Law at the Vrije Universiteit Amsterdam.
