Health Apps and Data Privacy: Best Practices for Developers
While health and wellness apps can be useful tools, they also raise important privacy issues.
The use of health apps has increased significantly in recent years as people look to take a more active role in their own health and wellness. However, while these apps can be useful tools, they also raise important privacy issues. Whether patients can trust developers with their health data is a major concern, and one developers must take seriously: mishandled data can lead to costly fines and reputational damage.
Imagine this scenario: you’ve built a health app that helps users determine potential diagnoses. One of your users, worried that her partner is struggling with depression, searches for symptoms. She browses around possible diagnoses and looks at treatments. She decides it’s a possibility, but doesn't move forward with anything right now.
The next day, she sees a targeted ad for treating depression. The timing is more than a little suspicious, so she does a little Googling and discovers that your health app sends her activity and information to over 100 third parties, including advertisers.
She deletes your app and writes a blog post about how your app tried to monetize her partner’s depression. Thousands of people read it and learn that you, like many other mHealth app creators, work with ad tech companies that collect information on users to help apps make their advertising “more valuable.”
This is a hypothetical scenario, but it should serve as a cautionary tale. Like many app developers before you, you may have thought users didn’t care about privacy, or that you wouldn’t fall foul of regulations because your app isn’t covered by HIPAA.
But neither of those things is true anymore. If you’re building a health app, privacy can’t be a secondary consideration. The entire app needs to put privacy at the forefront. Otherwise, you could face fines, lawsuits, and loss of trust from the very users you’re trying to help.
Developing a Privacy-First Health App
For a long time, online health privacy was treated like the Wild West. It was lawless, or at the very least, law-lite. It turns out HIPAA is less than perfect when it comes to protecting consumer health privacy. For instance, HIPAA doesn’t protect your online searches for health information.
Big tech companies seem constantly to be testing the boundaries of what they can and cannot do. Meta was recently caught receiving patient health data from hospitals. What’s more, activists worry that Google might share data on medical procedures with law enforcement, or that data companies might sell opioid “risk scores” to health practitioners.
It’s hardly surprising that consumers don’t always know who is collecting which data about them. According to research from the Pew Research Center, fewer than one in 10 consumers read privacy policies, and who can blame them? One journalist for the Washington Post found that the privacy policies for the various technologies he had used over time added up to more than one million words, an estimated 55 hours of reading.
Today, things are beginning to change, both with legislatures and the tech companies themselves. California Assembly Member Rebecca Bauer-Kahan recently filed a bill intended to protect consumers from apps using “an inferred or diagnosed mental health or substance use disorder” for purposes other than providing care, which Governor Gavin Newsom signed into law in September.
Organizations like the Mozilla Foundation are calling for more regulation to safeguard patient data outside the health care system, although critics argue such efforts are a disingenuous smoke screen.
Even so, privacy may be starting to become a competitive advantage for tech companies. Apple, in its cold war against Facebook and Google, handed consumers a privacy win with the company’s new ad-tracking policy. Now, when an iOS app wants to track a consumer, a notification pops up asking if the user is OK with that. The change has reportedly cut advertisers’ ROI by 38%, but the goodwill engendered with consumers could arguably make up for those losses.
That’s because public opinion is shifting. Consumers are more aware of the value of online digital privacy, and they trust companies more that make visible steps to safeguard digital privacy. If you’re developing a health app, it’s more important than ever to put data privacy first.
How to Prioritize Privacy
As the SVP of engineering at a health-tech company, I know that data is a double-edged sword. Your app has to collect enough to be useful to your user, but you have to guard that data as much as possible, because your company’s reputation, finances, and legal standing are all at risk.
There are two issues: first, many apps take liberties with the data of their consumers, collecting more than necessary, selling it to third parties, and not being transparent about what and when they’re collecting data. This leads to the erosion of consumer privacy, which can cause consumers to trust you less and may lead to lawsuits or fines in the future.
Second, bad actors love health data because it’s valuable. The more careless you are with consumer data, the larger the risk of a hack becomes. Here’s how to build a health app that collects data while still prioritizing privacy and security.
Stay on Top of Regulations
The legal landscape is changing fast, and as of today, there’s no single blanket rule to follow. Depending on the focus of your app, you may be subject to the FTC Act, the FTC’s Health Breach Notification Rule, HHS’s Health Insurance Portability and Accountability Act (HIPAA), or the FDA’s Federal Food, Drug & Cosmetic Act.
Before and after you build your health app, you should be monitoring the local and national regulations around health privacy. A great resource is the FTC’s Mobile Health Apps Interactive Tool. When we at Ilumivu worked through the FTC’s flowchart, we confirmed, as we already knew, that some of the apps we helped produce were covered by a patchwork of the acts and rules mentioned above.
Plus, because some of our apps deal with data collected from children under the age of 13, we had to comply with the Children’s Online Privacy Protection Rule and the Gramm-Leach-Bliley Act Safeguards and Privacy Rules.
Finally, along with federal laws, you’ll have to confirm your state laws. I recommend checking the National Conference of State Legislatures’ resources to see which laws apply in your state. I say “finally,” but there’s really no end to the research. Laws change. Stay up to date, because the FTC won’t accept ignorance as an excuse if they catch you breaching the rules.
I recommend you bring on a compliance officer as soon as the company can afford it — the risk of a violation merits the expense. Most tech folks will not be familiar enough with these regulations, and anyway, you’ll want to keep them free to do what they do best.
Be Transparent About Data Collection
Remember those voluminous, arcane privacy policies I mentioned earlier? Don’t write one of those. Instead, be as proactive and clear as possible about how you use and store user data.
Don’t just tell people that you’re collecting data — tell them why, too. For example, instead of telling users that your app wants to access their contacts, consider a message like, “MyApp wants to collect your contacts’ information so we can send them updates about your activity.” That gives users more informed consent over what they allow, and why they might allow it — or not allow it.
Finally, make sure users are aware of what data you’ll collect when they download the app, and then notify them again when you collect it. This is called a “just in time” notice. For example, if you need to know their location when they log their data, consider showing a notification when they open the app to do so.
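As a rough sketch of how this pairing of permission and purpose might look in app code (the app name, prompt copy, and helper names below are all hypothetical), you can refuse to request any permission you haven’t written a plain-language reason for, and re-notify at the moment of collection:

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical copy: every permission the app can request is paired with a
# plain-language "why" shown to the user.
PERMISSION_RATIONALE = {
    "contacts": "MyApp wants to collect your contacts' information so we can "
                "send them updates about your activity.",
    "location": "MyApp wants to use your location to tag where you log symptoms.",
}

def build_consent_prompt(permission: str) -> str:
    """Return the user-facing prompt for a permission, including the reason."""
    reason = PERMISSION_RATIONALE.get(permission)
    if reason is None:
        # No written rationale means we shouldn't be requesting it at all.
        raise ValueError(f"No rationale for '{permission}'; don't request it.")
    return reason

def needs_just_in_time_notice(last_notice: Optional[datetime], now: datetime) -> bool:
    """Show a fresh 'just in time' notice if the user hasn't seen one recently
    (approximated here as: not within the last 24 hours)."""
    return last_notice is None or now - last_notice > timedelta(hours=24)
```

The useful property of this shape is that a permission without a justification fails loudly at development time instead of silently shipping.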
Go Big on Security
If your users are trusting you with precious and sensitive health data, you need to build security at every level of your code. Test your security measures and controls before launching your mHealth app. Evaluate and update your security precautions as appropriate. Build security testing into your SDLC so the controls around data and code are tested with every release.
The authors of one study testing the security of mHealth apps created software called BProxy to check for seven common security issues. Of the 53 most popular health apps they tested, a full 40% had critical security issues. Working from a checklist like theirs is a great way to make sure you’re rising above the baseline.
You should also use strong encryption at rest and in transit: i.e., secure when you store it and secure when you move it. Encrypting data at rest means that when your customers’ data is stored, it’s not stored in plain text. Even if bad actors successfully hack into your database, or get data out, they can’t read it because it’s encrypted.
As for “in transit,” sometimes you have to send data between point A and point B. For example, maybe your app sends user symptoms to a separate data warehouse where those symptoms will be analyzed for a most-likely diagnosis. The data should be encrypted before it moves between your app and that potential third party.
This kind of encryption is tough to code, so if you’re struggling, don’t compromise. Use an off-the-shelf tool to secure the data, or bring in some outside help.
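To make the at-rest half concrete, here is a minimal sketch using one such off-the-shelf tool, the third-party `cryptography` package’s Fernet recipe (the record format and names are invented for illustration):

```python
# At-rest encryption sketch using the off-the-shelf `cryptography` package
# (pip install cryptography). For data in transit, use TLS rather than
# inventing your own transport encryption.
from cryptography.fernet import Fernet

def encrypt_record(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt a record before it touches disk or the database."""
    return Fernet(key).encrypt(plaintext)

def decrypt_record(key: bytes, token: bytes) -> bytes:
    """Decrypt a stored record for an authorized read."""
    return Fernet(key).decrypt(token)

key = Fernet.generate_key()   # in production, hold this in a key-management service
record = b'{"user": 42, "symptom": "insomnia"}'
token = encrypt_record(key, record)
assert token != record        # what's stored is ciphertext, not plain text
assert decrypt_record(key, token) == record
```

Even if an attacker dumps the database, the stored tokens are useless without the key, which is why the key must live somewhere other than the database itself.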
Don’t Sell Customer Data
Selling customer data these days is a dicey prospect. Seriously consider whether you need to sell the data at all. If you do decide to sell, at least give consumers the option to opt out. Many health apps funnel consumer data to advertisers to turn a profit. While that’s legal today, it may not be in the future. Plus, as mentioned above, consumers are increasingly focused on protecting their health privacy, so such a strategy might be shortsighted.
Limit Access and Permissions
What kind of health app are you building? What permissions does it really need in order to run? Many health apps request access to unrelated information, like contact numbers or location. If you genuinely need that access, build in what security experts call “least privilege”: grant every user and component only the minimum access needed to do its job.
For example, Android devices require location permissions for Bluetooth device access. We comply with that requirement while still prioritizing customer privacy by making the “why” very clear to users.
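One way to sketch least privilege in application code, with hypothetical component and permission names, is to declare each component’s minimum permission set up front and reject anything beyond it:

```python
# Hypothetical permission map: the minimum set each component needs, declared
# in one place so excess requests fail fast in development and review.
LEAST_PRIVILEGE = {
    "symptom_logger": {"read_symptoms", "write_symptoms"},
    "report_exporter": {"read_symptoms"},   # exporting never needs write access
}

def check_access(component: str, requested: set) -> None:
    """Raise if a component requests permissions beyond its declared minimum."""
    allowed = LEAST_PRIVILEGE.get(component, set())
    excess = requested - allowed
    if excess:
        raise PermissionError(
            f"{component} requested unneeded permissions: {sorted(excess)}"
        )

check_access("report_exporter", {"read_symptoms"})  # fine: within its minimum
```

The same idea applies to database roles and cloud IAM policies: an explicit allowlist makes “why does this component need that?” an answerable question.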
Build in a Delete Option
This one’s simple. Give users the option to delete all their data collected by your app, at any time, for any reason. This increases consumer confidence in you and is a best practice for prioritizing consumer privacy.
Beyond being best practice, data deletion is a legal requirement for apps covered by the GDPR (the right to erasure) and the CCPA (the right to delete). If you’re not already offering it, expect to need to soon.
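A deletion routine might look like this minimal sketch; the in-memory stores and field names are stand-ins for whatever your app actually uses:

```python
from datetime import datetime, timezone

def delete_user_data(user_id: str, records: dict, analytics_events: list) -> dict:
    """Remove a user's data from every store it lives in and return a receipt.
    The in-memory stores here are illustrative; a real app must also purge
    backups and notify any third-party processors holding copies."""
    records.pop(user_id, None)
    analytics_events[:] = [e for e in analytics_events if e.get("user_id") != user_id]
    return {"user_id": user_id, "deleted_at": datetime.now(timezone.utc).isoformat()}

# Example: user "u42" asks to be forgotten.
records = {"u42": {"symptoms": ["insomnia"]}, "u7": {"symptoms": ["cough"]}}
events = [{"user_id": "u42", "action": "search"}, {"user_id": "u7", "action": "log"}]
receipt = delete_user_data("u42", records, events)
assert "u42" not in records
assert all(e["user_id"] != "u42" for e in events)
```

The point of the receipt is that deletion should be observable to the user, not just an internal side effect.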
De-identify the Data
De-identifying data, or removing personal identifiers from health data, is a legal requirement if your app falls under HIPAA regulations. Protected information includes past, present, or possible future medical conditions, any health care services received, and common identifiers that could be linked to the individual. A standard example is a hospital bill.
Even if your app doesn’t fall under HIPAA regulations, data de-identification is a good practice to follow. There are two well-regarded methods, approved by the HHS, to de-identify data: expert determination and safe harbor.
Expert determination means you get an expert to attest that there’s only a very small risk of a third party identifying a person from the data you collect. The HHS somewhat wordily defines an expert as a person “with appropriate knowledge of and experience with generally accepted statistical and scientific principles and methods for rendering information not individually identifiable.” That expert will also need to document how they reached this determination.
Safe harbor is a little more straightforward. It means you strip your data of the 18 key identifiers the HHS has listed. I won’t list them all here, but a few examples include name, address, email, SSN, and biometric data.
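A minimal sketch of the safe-harbor approach, stripping an illustrative (deliberately incomplete) subset of those identifiers before a record is stored or analyzed:

```python
# Illustrative subset of safe-harbor identifiers; the full HHS list has 18
# categories, and safe harbor also requires generalizing dates and ZIP codes,
# which this sketch omits.
SAFE_HARBOR_FIELDS = {"name", "address", "email", "ssn", "phone", "biometric_id"}

def deidentify(record: dict) -> dict:
    """Strip direct identifiers from a record before storage or analysis."""
    return {k: v for k, v in record.items() if k not in SAFE_HARBOR_FIELDS}

raw = {"name": "Jane Doe", "email": "jane@example.com", "diagnosis": "depression"}
clean = deidentify(raw)
assert clean == {"diagnosis": "depression"}
```

Note that a field allowlist (keep only what you know is safe) is usually more robust than the denylist shown here, since new identifying fields won’t slip through by default.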
Go Above and Beyond
Regulations and consumer expectations are constantly changing. This article is a starting point but not a definitive guide. There’s a lot more you can do beyond these seven tips.
Check out the FTC’s guide to developing mobile health apps. I also suggest reading the California Department of Justice’s recommendations for privacy in mobile apps of all types, since, as of this writing, that state has the most restrictive measures in the nation. Also, avail yourself of the free and low-cost privacy and security resources experts have produced, like the Open Web Application Security Project’s guide on the top 10 mobile app security risks to avoid.
Finally, go above and beyond the bare minimum. If your app doesn’t fall under HIPAA’s regulations, you should still follow them. If your app does fall under HIPAA, don’t just follow them to the letter. Take additional security and privacy measures.
Consumers deserve access to better privacy while legislation and regulations catch up. It’s your job and responsibility to give it to them.