Wearable Health Tech: New Privacy Risks
Privacy Advocate Deborah Peel Discusses Emerging Concerns
Emerging Web-enabled health technologies, ranging from the upcoming Apple Watch to a Google "pill" that could potentially detect cancer in patients' bodies, pose troubling new privacy risks, says privacy advocate Deborah Peel, M.D.
The privacy risks posed by evolving health technologies are largely fueled by a hidden, third-party data broker industry and a regulatory landscape that hasn't kept up with innovation, says Peel, founder of advocacy group Patient Privacy Rights.
For instance, under the banner of the Internet of Things, new consumer-oriented health technologies, including wearable health and fitness devices, create a great deal of uncertainty over where sensitive patient data will end up, while giving individuals insufficient control to set their own privacy boundaries, she argues.
"None of the sensitive health information that these technologies handle gives us either copies or control over the use and sale of that information," Peel says in an interview with Information Security Media Group.
At an Oct. 28 technology conference hosted by the Wall Street Journal, Andrew Conrad, head of the Life Sciences team at the Google X research lab, revealed that the company is designing a nano-technology-based pill that, when swallowed by a patient, would release "tiny magnetic particles to patrol the human body for signs of cancer and other diseases." A wearable device containing a magnet would attract, count and monitor cells that could indicate signs of disease, Conrad explained.
Google's new pill technology - assuming it's feasible and eventually approved by regulators such as the Food and Drug Administration - isn't likely to become available for years. However, it fits into a broader trend of new Web-based health and fitness gadgets and applications that collect sensitive medical information about individuals.
"It's very mixed news that this pill has been invented by Google," Peel says. "Who doesn't want cancer detected? But the problem is that Google is known for using very sensitive information in ways that one would never want or expect. ... It's very hard to know what can be trusted in healthcare."
Even if a particular player in the healthcare sector, such as an application developer, says it will not look at any consumer data in a health app or use the data for marketing purposes, "there's no guarantee the app developer won't sell your information" to third parties who will, she says.
"The problem is that we have an entire ecosystem where even if one app, or one system, or one electronic health information exchange doesn't sell your information and puts you in control over your data, there are a million that are selling your data," she says.
The issues are complex and need to be addressed from several angles. "We have to have industry and manufacturers fixing this; healthcare has to demand that these technologies be fixed - and Congress needs to pass laws because none of the healthcare system can be trusted," Peel says.
Not only have state and federal privacy laws failed to keep up with evolving technologies that potentially put consumer data at risk, but the HIPAA privacy rule has "deliberately facilitated this massive hidden health data broker industry," she contends.
In the interview, Peel discusses:
- The kinds of third-party health data brokers that she sees as most troubling;
- The patient safety risks that emerge when individuals do not trust where their health data will end up, and who might see it; and
- Fair information practice principles that she says need to be observed in the healthcare ecosystem.
Peel, a practicing psychiatrist and psychoanalyst, is founder and chair of Patient Privacy Rights, an advocacy group. Peel was also chosen in 2013 by Information Security Media Group as a top 10 influencer in healthcare information security and privacy.