Rapid growth in the use of health and healthcare apps has transformed how individual users engage with their personal health. For example, MyFitnessPal allows users to track their nutrition and exercise. Calm and Headspace offer meditation and relaxation tools to support users’ mental well-being. MySugr helps users monitor diabetes, and Medisafe supports adherence to medication regimens. Teladoc provides virtual medical consultations to make healthcare more accessible.
However, these apps exist in a regulatory gray area. Unlike hospitals and doctors, which are covered entities regulated by the Health Insurance Portability and Accountability Act (HIPAA), many health and wellness apps fall outside the purview of federal health privacy laws. This loophole has enabled a boom in the data brokerage industry, in which personal health data is collected, sold, and sometimes even used against consumers.
This article explores the legal gaps that allow non-HIPAA-covered entities to monetize sensitive personal health information (PHI), the role of the data brokers who exploit those gaps, and potential legal solutions to close them.
Health Apps and the HIPAA Gap
Health apps are software applications designed for smartphones and mobile devices, offering services that range from symptom tracking and medical diagnosis to medication management and fitness monitoring. By providing tools for telemedicine consultations and mental health support, health apps enhance accessibility and convenience for users. Although these apps collect and process health information, most are not covered by HIPAA, which applies only to covered entities (healthcare providers such as hospitals and doctors, health plans, and healthcare clearinghouses) and to business associates that handle data on their behalf. Because the majority of health apps do not provide healthcare services on behalf of a covered entity, they fall outside HIPAA’s privacy and security requirements. Instead, many apps operate under minimal federal oversight, leaving room for unregulated data collection, weak security protocols, and monetization of sensitive user information. Apps created by or for covered entities (CEs), such as healthcare providers or insurers, must comply with HIPAA because they handle PHI. Apps designed for personal use without CE involvement, however, typically fall outside HIPAA’s scope.
Whether HIPAA applies ultimately depends on an app’s specific use case and on the relationships among the app developer, the users, and any covered entities or business associates involved. Developers of health-related apps should carefully assess their obligations under HIPAA and ensure that their apps comply where it applies.
The Data Broker Loophole
Data brokers collect and sell personal information about consumers from various sources, often without the consumers’ knowledge. In the United States, the lack of comprehensive federal privacy legislation allows these brokers to operate with minimal oversight. Health apps, which fall outside the purview of regulations like HIPAA, can legally share or sell user data to third parties such as advertisers, insurance companies, and even law enforcement. This practice creates a significant privacy gap, as users’ sensitive health information can be sold without their explicit consent or awareness. For example, a research team at the Organization for the Review of Care and Health Apps (ORCHA) reported that “84% of period tracker apps share data with third parties”, but “only one single app demonstrated best practice by explicitly asking users for permission” to share data with data brokers. The ORCHA report exemplifies how entities can sell users’ location data, reproductive health details, and mental health history without violating federal law.
The Federal Trade Commission (FTC) has taken steps to regulate data brokers and deceptive health data collection, though its enforcement has had limited effect. For example:
- The FTC has taken action against data brokers like Gravy Analytics and Mobilewalla for collecting and selling sensitive location data, including data revealing visits to healthcare facilities. Such data sales can lead to unauthorized profiling and potential misuse of personal health information.
- A 2023 FTC investigation found that BetterHelp shared users’ sensitive mental health data with advertisers despite promising confidentiality. BetterHelp agreed to pay $7.8 million to settle charges brought by the FTC. This case highlighted the lack of enforceable federal regulations that would prevent such practices in the future.
- In FTC v. Kochava (2023), the FTC sued Kochava, Inc., alleging that it sold precise geolocation data that could identify visits to clinics, mental health facilities and addiction treatment centers. As The Verge reported, “Precise location data from advertising IDs and mobile apps can be used for surveillance that, according to the FTC, puts millions of Americans at risk.”
At the state level, California has enacted the California Consumer Privacy Act (CCPA), Cal. Civ. Code § 1798.100 et seq. However, no single comprehensive federal law closes this regulatory loophole for data collection and sharing.
Privacy and Security Risks
Like any other software, health apps are vulnerable to privacy and security risks, and the data they collect can be used to violate an individual’s privacy. Without effective regulation, apps such as mental health trackers and period trackers pose especially serious risks to users.
Mental Health Apps Sharing Data
A 2022 study found that many mental health apps lacked encryption and transmitted data to third parties without user consent. Another study analyzing mental health applications similarly found inadequate encryption and undisclosed data sharing, practices that not only breach user trust but also expose individuals to potential discrimination or exploitation based on their mental health data. As researcher Joanne Kim highlights, “Health insurance providers… could buy mental health data to discriminately charge individuals for care or discriminately target vulnerable populations with advertisements…Scammers could…exploit and steal from individuals living with mental health conditions.”
Period-Tracking Apps and Reproductive Health Data
Period-tracking apps help users monitor their menstrual cycles, but they also expose sensitive data (PHI) to third parties, including data brokers. Following Dobbs v. Jackson Women’s Health Org. (2022), concerns over reproductive health data escalated. While some states have enacted shield laws to protect reproductive health data, law enforcement and prosecutors can sidestep subpoena requirements and shield laws by simply purchasing user data, such as location history, directly from brokers, creating a direct conflict between state-level privacy protections and what remains permissible on the unregulated data market.
The Need for Regulatory Reform
The exploitation of health data through these loopholes has prompted demand for stronger regulations. Advocates, including Health and Human Rights (HHR), argue for comprehensive privacy laws, suggesting provisions that would require apps to obtain informed consent before collecting user data and to disclose how PHI will be used. Entities that violate these privacy standards would then be subject to fines and/or sanctions to deter misuse.
Some states have begun to address these issues by following California’s example with the CCPA. For instance, New York’s proposed Health Information Privacy Act aims to limit tech companies’ control over consumer health data and protect individuals.
Legal Solutions to Protect Health Data Privacy
The unregulated sale of health data collected by health apps presents serious constitutional and consumer protection concerns. While HIPAA was designed for traditional healthcare entities, it fails to reach the growing health app industry, leaving millions of users vulnerable.
One proposed solution is federal legislation such as the American Data Privacy and Protection Act (ADPPA), which would restrict the sale of consumer health data at the federal level and prohibit data brokers from collecting and selling sensitive health information without explicit user consent.
Another approach involves amending HIPAA to extend its reach to consumer health applications and digital health platforms. This reform would ensure that the companies behind fitness trackers, mental health apps and telehealth services, all of which collect health-related data, are subject to the same privacy and security requirements as traditional healthcare entities. Expanding HIPAA’s scope in this way would create uniform standards for data protection and require app developers and tech companies to obtain user consent before sharing health data with third parties. It would also hold these tech entities accountable for breaches and misuse of sensitive information.
At the state level, legislative efforts have attempted to fill the gaps left by federal inaction. Laws like California’s CCPA and New York’s proposed Health Information Privacy Act grant consumers greater control over how their information is collected and shared, including the right to request data deletion and to opt out of its sale. However, without robust federal backing, enforcement is inconsistent across state lines, producing a fragmented national regulatory landscape. While some states have taken proactive measures to safeguard consumer health data, others lack comparable protections, leaving millions of Americans exposed to potential privacy violations.
Until these regulatory gaps are addressed, health tech companies are free to continue operating in a legal gray area, and as long as user data can be freely exploited, consumers remain at risk of harmful privacy breaches. A combination of federal legislation, state enforcement and corporate accountability is needed to bring health data privacy into the modern era.