Health Care Privacy Concerns Around Mental Health Apps

Between 2019 and 2021, the market for mental health and wellness apps such as Calm, Headspace, and BetterHelp grew by 54.6%. In 2020 alone, the top 100 wellness apps were installed over 1.2 billion times. Much of this growth can be attributed to the COVID-19 pandemic: lockdowns and the rise of remote work led more people to turn to mobile apps to pursue their health and fitness goals from home. Studies have also shown that individuals with existing mental health conditions reported worsening symptoms during the pandemic, and the convenience of mental health apps has enabled these users to connect with therapists and mental health coaches from the comfort of their homes. Today, a growing number of physicians recommend mental health apps to their patients to reduce stress and improve sleep habits, and apps such as Calm allow patients to access content through their health insurance plans.

But as mental health apps continue to grow in popularity, researchers have begun raising privacy concerns about the collection of user data. In 2023, for instance, the Federal Trade Commission (FTC) issued a proposed order banning BetterHelp, an online counseling service, from sharing sensitive information about consumers' mental health challenges for advertising purposes, and requiring the company to pay $7.8 million in a settlement with consumers. Given these developments, it is worth examining what the current privacy concerns are and what policy solutions could address them. One of the main issues is the sheer volume of data that mental health apps collect. Data collection can include geolocation information that reveals clinic visits, as well as purchase histories for health care services. Mental health apps may also collect other sensitive information, such as a user's full name, date of birth, or home address, along with health data on a user's mood, sleep habits, or symptoms. In the case of BetterHelp, the app's privacy policy now states that the company is not paid for user data, but that it may share visitor data with third-party advertisers such as Meta, TikTok, Snap, and Reddit.

This leads to the question of what can be done to protect sensitive user data on mental health apps, and what solutions are available to developers and policymakers. Currently, HIPAA regulations do not apply to most mental health apps unless the arrangement involves a covered entity, a business associate relationship, and the disclosure of electronic protected health information (ePHI). There are, however, several steps stakeholders can take to safeguard user data. One study compiled a list of recommended actions, including conducting a privacy impact assessment and writing privacy policies in plain, accessible language. De-identification is another measure developers can take to protect user data, which can involve removing specific identifiers such as names and addresses; a minimal sketch of this approach appears at the end of this post. In addition, apps that do not fall under HIPAA must still follow the FTC's Health Breach Notification Rule, which the agency clarified in 2023 applies to apps providing health care services or supplies, and which requires them to notify consumers, the FTC, and often the media of data breaches. Overall, developers of both new and existing mental health apps can take these steps to better protect sensitive data and build trust among users in the digital space.
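To make the de-identification step concrete, the Python sketch below drops direct identifiers from a user record and coarsens quasi-identifiers such as date of birth and ZIP code before the record leaves the app. The field names (name, email, zip_code, mood_score, and so on) are hypothetical placeholders, not any real app's schema; a pipeline aiming for HIPAA's Safe Harbor standard would need to address all eighteen categories of identifiers that rule lists, not just the handful shown here.

```python
# Minimal de-identification sketch. Field names are hypothetical;
# a real app would map this to its own schema.
from datetime import date
from typing import Any

# Direct identifiers to drop entirely before a record is shared.
DIRECT_IDENTIFIERS = {
    "name", "email", "phone", "home_address", "device_id", "precise_location",
}

def deidentify(record: dict[str, Any]) -> dict[str, Any]:
    """Return a copy of a user record with direct identifiers removed
    and common quasi-identifiers generalized."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

    # Keep only the birth year: an exact date of birth is a strong
    # quasi-identifier even after names are removed.
    if "date_of_birth" in clean:
        clean["birth_year"] = clean.pop("date_of_birth").year

    # Truncate ZIP codes to the first three digits, a common generalization.
    if "zip_code" in clean:
        clean["zip3"] = str(clean.pop("zip_code"))[:3]

    return clean

if __name__ == "__main__":
    record = {
        "name": "Jane Doe",
        "email": "jane@example.com",
        "date_of_birth": date(1990, 4, 12),
        "zip_code": "94110",
        "mood_score": 6,      # self-reported health data stays; identifiers go
        "sleep_hours": 7.5,
    }
    print(deidentify(record))
    # {'mood_score': 6, 'sleep_hours': 7.5, 'birth_year': 1990, 'zip3': '941'}
```

Note that simply deleting names and addresses is often not enough on its own: combinations of quasi-identifiers, such as an exact birth date plus a full ZIP code, can re-identify individuals, which is why the sketch generalizes those fields rather than passing them through verbatim.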
