
Is AI Putting the Privacy of Health Data at Risk?


A central concern for many is ensuring that the sensitive, health-related data about each of us is kept safe and secure.


The tremendous potential for health data to feed AI systems and improve healthcare is a topic I've touched upon numerous times in the past few years, but a central concern for many is that the sensitive, health-related data about each of us is kept safe and secure.

Most of these privacy-related discussions revolve around explicit cybersecurity breaches, but a recent study from the University of California, Berkeley highlights how AI itself could present a threat.

The research found that AI allows us to easily identify individuals just by looking at various forms of health data, such as that recorded by wearable activity trackers and smartphones. The findings emerged after mining around two years' worth of data from 15,000 people.

"We wanted to use NHANES (the National Health and Nutrition Examination Survey) to look at privacy questions because this data is representative of the diverse population in the U.S.," the researchers explain. "The results point out a major problem. If you strip all the identifying information, it doesn't protect you as much as you'd think. Someone else can come back and put it all back together if they have the right kind of information."

They explain that it would be relatively easy for a company like Facebook to gather activity data from our smartphones and then match it with healthcare data bought from elsewhere, building a detailed database of health-related information on users that could be sold to advertisers.
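To make the linkage concrete, here is a minimal, purely illustrative sketch of the kind of matching the researchers describe: "anonymized" health records containing step counts are re-identified by comparing them against named activity-tracker traces. All names, record IDs, and step counts below are invented for illustration; real attacks would use far richer data and more sophisticated matching.

```python
# Toy linkage attack: re-identify "anonymized" step-count records by
# matching them against named activity-tracker data for the same week.
# All data here is fabricated for illustration.

anonymized_health = {            # record_id -> 7 days of step counts
    "rec_1": [4200, 5100, 3900, 8800, 7600, 2100, 4400],
    "rec_2": [12000, 11500, 9800, 13200, 12500, 10100, 11900],
    "rec_3": [800, 950, 700, 1200, 1100, 600, 900],
}

named_activity = {               # person -> same week of step counts
    "alice": [4210, 5080, 3910, 8790, 7610, 2090, 4410],
    "bob":   [790, 960, 710, 1190, 1110, 610, 890],
    "carol": [11980, 11520, 9810, 13190, 12510, 10090, 11910],
}

def distance(a, b):
    """Euclidean distance between two step-count sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def reidentify(anon, named):
    """Match each anonymized record to the closest named trace."""
    return {
        rec_id: min(named, key=lambda person: distance(steps, named[person]))
        for rec_id, steps in anon.items()
    }

matches = reidentify(anonymized_health, named_activity)
print(matches)  # rec_1 -> alice, rec_2 -> carol, rec_3 -> bob
```

Even this crude nearest-neighbor matching succeeds because activity traces are highly individual, which is exactly why stripping names from the records offers so little protection.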

The Misuse of Data

The authors are keen to stress that they don't believe the problem lies with the devices themselves, but rather with how the rich data they generate can be misused.

While the study focused specifically on step data, the researchers believe that similar health-related data could be misused to the same extent relatively easily. They believe their findings point to the need for much stronger regulations around how health data is used.

"HIPAA regulations make your health care private, but they don't cover as much as you think," they explain. "Many groups, like tech companies, are not covered by HIPAA, and only very specific pieces of information are not allowed to be shared by current HIPAA rules. There are companies buying health data. It's supposed to be anonymous data, but their whole business model is to find a way to attach names to this data and sell it."

Ideally, they believe that new rules and regulations are needed to better cover the technologies surrounding health data today. Alas, they don't believe such moves are likely, with a sizeable push at the moment to actually weaken rather than strengthen the regulations.

"For instance, the rule-making group for HIPAA has requested comments on increasing data sharing. The risk is that if people are not aware of what's happening, the rules we have will be weakened. And the fact is the risks of us losing control of our privacy when it comes to health care are actually increasing and not decreasing," they say.

Given the huge riches that access to data has delivered to technology companies in the past decade, it's perhaps not surprising that healthcare is attracting all of the tech giants. If the benefits from health data are to be distributed more equitably than those from so much of our other personal data, however, better regulation is needed. Sadly, I wouldn't be at all surprised if the horse bolts long before regulators get around to shutting the stable door.

Topics: artificial intelligence, machine learning, AI in healthcare, health data privacy


Opinions expressed by DZone contributors are their own.
