Kristen O’Loughlin

Kristen O’Loughlin is a former PsyberGuide Correspondent. She is currently at Virginia Commonwealth University pursuing her doctorate in clinical psychology. Her research interests include disparities in access to mental health care, behavioral health, and integrated behavioral healthcare.

Have you ever entered information into an app you downloaded to your phone and wondered where that information was going? If so, you’re not alone. Individuals commonly report privacy and security as important features when selecting mobile health apps, and in fact, concerns about privacy and security can be barriers to downloading apps in the first place. Confidence in data security is therefore a key factor in the use of mHealth apps. Some basic questions about data use should be easily answerable for users: Is your data being stored somewhere? If so, where? Is it secure? And who has access to it? Unfortunately, in practice, the answers to these questions are rarely readily available to users.

One type of rating that PsyberGuide provides is transparency of privacy policies. Transparency ratings assess how clearly and thoroughly an app describes its data-handling procedures and addresses the questions above. While completing these evaluations over the past couple of years, the PsyberGuide team identified a concerning trend: many mental health apps provided inadequate privacy policies or lacked one altogether. In response, we turned to the literature to see whether this was a common trend across mental health apps.

What we found is that little research has evaluated the availability and adequacy of developers’ security policies for health apps generally. Rosenfeld et al. (2017) evaluated apps specifically for dementia and found that two-thirds of them had no privacy policy of any kind. Critically, it was unknown whether these findings extended to mental health apps. We decided to focus first on apps for depression, given that depression has a prevalence of 7.1% in adults and was identified in 2017 as a health condition with great market potential for digital health apps. Sadly, but perhaps not surprisingly, our results mirrored the familiar trend. Of the 116 apps reviewed, over half provided no privacy policy at all. In many cases, the privacy policy was only accessible after users were asked to input information, meaning the apps had collected data before telling users how that data could be used and stored. Privacy policies tended to be more prevalent for apps collecting identifiable data. Yet even when they were available, they often did not address the key pieces of information we deemed important for users to know. Our results were published in Internet Interventions; for more details, such as what those key pieces of information were, you can read the full paper here.

Overall, our findings painted a bleak picture of what may be a larger pattern across mental health apps. This raises the question: why aren’t developers providing detailed information to users? One key reason so few mental health apps have comprehensive privacy policies is that they receive little regulatory oversight. Mental health apps are being developed at such a high rate that it is simply not feasible to regulate them all. But even without formal regulation, improving transparency around data security should be a top priority for developers, as it would benefit users, clinicians, and developers alike. Fear of lax data handling is a well-documented barrier both for users trying out new health apps and for clinicians recommending them to patients. Simply providing adequate privacy policies, so that users and clinicians can make educated decisions, may increase use of these apps. Furthermore, this increased confidence could translate into a commercial advantage for developers.

There are a number of ways that developers can improve the communication of their procedures for data security and privacy.

  1. Developers must provide a privacy policy that is accessible prior to downloading the app.
  2. The policy itself should be written in clear language that users can understand. Researchers have found that privacy policies for apps tend to require college-level literacy, which is well beyond the national average reading level. Many policies are full of jargon and vague language that can leave the reader with more questions than answers. Therefore, developers should provide explicit statements describing privacy procedures with appropriate detail, free of jargon and needlessly complex language.
  3. The policy should be a readable length or include a summary of key points. The average privacy policy is about the length of a scientific journal article. To improve user understanding of data security and practices, developers could provide a summary or TL;DR option that highlights key points.

For mHealth to reach its full potential, developers must help increase user confidence in these apps. One way to do this is to provide complete information on data security and privacy in a way that is digestible for the reader.