Health Apps, Data Sharing and the Trust Deficit

By SUSANNAH FOX

There has been a steady drip-drip-drip of articles documenting how health apps are sharing data with third parties:

"alt="" class="wp-image-976" width="240" height="267" srcset="https://i2.wp.com/thedeductible.com/wp-content/uploads/2019/05/Susannah-Fox.png?w=922&ssl=1 922w, https://i2.wp.com/thedeductible.com/wp-content/uploads/2019/05/Susannah-Fox.png?resize=269%2C300&ssl=1 269w, https://i2.wp.com/thedeductible.com/wp-content/uploads/2019/05/Susannah-Fox.png?resize=768%2C856&ssl=1 768w, https://i2.wp.com/thedeductible.com/wp-content/uploads/2019/05/Susannah-Fox.png?resize=918%2C1024&ssl=1 918w, https://i2.wp.com/thedeductible.com/wp-content/uploads/2019/05/Susannah-Fox.png?resize=772%2C861&ssl=1 772w, https://i2.wp.com/thedeductible.com/wp-content/uploads/2019/05/Susannah-Fox.png?resize=560%2C624&ssl=1 560w, https://i2.wp.com/thedeductible.com/wp-content/uploads/2019/05/Susannah-Fox.png?resize=600%2C669&ssl=1 600w" sizes="(max-width: 240px) 100vw, 240px">

Data sharing practices of medicines related apps and the mobile ecosystem: traffic, content, and network analysis, by Grundy et al. (British Medical Journal, Feb. 25, 2019)

Is your pregnancy app sharing your intimate data with your boss? As apps to help moms monitor their health proliferate, employers and insurers pay to keep tabs on the vast and valuable data, by Drew Harwell (Washington Post, April 10, 2019)

You Give Apps Sensitive Personal Information. Then They Tell Facebook. Wall Street Journal testing reveals how the social-media giant collects a wide range of private data from developers; ‘This is a big mess’, by Sam Schechner and Mark Secada (Wall Street Journal, Feb. 22, 2019)

Assessment of the Data Sharing and Privacy Practices of Smartphone Apps for Depression and Smoking Cessation, by Huckvale, Torous, and Larsen (JAMA Network Open, 2019)

This post is my chance to share some relevant data, add my perspective, and ask for your input.

First, the data from a 2018 Hopelab/Well Being Trust study I helped write:

  • 71% of female teens and young adults say they have tried mobile apps related to health, compared to 57% of males. Three in ten (30%) females say they currently use a health app, compared to two in ten (20%) males.
  • Fully 48% of females ages 18 to 22 and 25% of teen girls say they have used a period tracking app, compared with 2% of males.
  • Sixteen percent of females use a meditation app, compared with 5% of males.

From a 2015 Pew Research Center study:

  • Eight in ten smartphone owners said they have downloaded apps, and 60% of app downloaders surveyed said they have “chosen not to install an app when they discovered how much personal information it required in order to use it, while 43% had uninstalled an app after downloading it for the same reason.”
  • People appear to use popularity as a proxy for trustworthiness: 57% of app downloaders say it is important to know how many times an app has been downloaded when they are making their choice about which app to use.

This is a fraction of the data available about people’s use of apps. Bottom line: This is a big and growing market.

From my perspective, here are the two sentences that are the crux of the JAMA Network Open article:

Our data highlight that, without sustained and technical efforts to audit actual data transmissions, relying solely on either self-certification or policy audit may fail to detect important privacy risks. The emergence of a services landscape in which a small number of commercial entities broker data for large numbers of health apps underlines both the dynamic nature of app privacy issues and the need for continuing technical surveillance for novel privacy risks if users and health care professionals are to be offered timely and reliable guidance.

What does this mean in practice? Who is responsible for auditing these apps’ security practices?
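To make the authors’ point concrete, here is a minimal sketch of what “auditing actual data transmissions” can look like in practice: capturing an app’s outbound HTTP requests (for example, with an interception proxy such as mitmproxy) and flagging traffic to known third-party data recipients. The tracker list, log format, and app domain below are illustrative assumptions, not a real audit standard.

    # Minimal sketch: flag an app's outbound traffic to third-party hosts.
    from urllib.parse import urlparse

    # Hypothetical list of domains the auditor treats as known trackers.
    KNOWN_TRACKERS = {
        "graph.facebook.com",
        "app-measurement.com",
        "api.segment.io",
    }

    def audit_requests(requests, first_party_domain):
        """Sort a capture of HTTP requests into known trackers and
        other third parties, ignoring expected first-party traffic."""
        findings = {"trackers": [], "other_third_party": []}
        for req in requests:
            host = urlparse(req["url"]).hostname or ""
            if host == first_party_domain or host.endswith("." + first_party_domain):
                continue  # first-party traffic is expected
            bucket = "trackers" if host in KNOWN_TRACKERS else "other_third_party"
            findings[bucket].append(req)
        return findings

    # Two captured requests from a hypothetical period-tracking app:
    captured = [
        {"url": "https://api.exampletracker.app/v1/cycles", "method": "POST"},
        {"url": "https://graph.facebook.com/v3.2/activities", "method": "POST"},
    ]
    report = audit_requests(captured, "exampletracker.app")
    print(len(report["trackers"]), "request(s) went to known trackers")

A policy audit reads what the app says it does; a traffic audit like this watches what the app actually does. That gap between the two is exactly what the authors are pointing to.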

In England, the NHS features an apps library on their front page and NHS England takes responsibility for evaluating the apps featured (although it appears they rely heavily on self-certification). The existence of the library is emblematic of how the UK approaches health and health care — leaning in a bit more paternalistically to digital health than we in the U.S. have done.

I was the CTO at HHS when the NHS was launching their library, and I warned them to be cautious (which of course they already knew they needed to be). We had some spirited discussions about how to approach the creation of their recommendations list. I’m happy to see the library featured so prominently, since I suspect it means it is a popular feature.

HHS, by contrast, limits itself to recommending fact sheets about prevention and wellness. Nothing dynamic or personalized — just the basics of immunization schedules, physical activity guidelines, etc. Very much the “better safe than sorry” approach to digital health. HHS’s main regulatory arm, the Food and Drug Administration (FDA), has chosen to focus its oversight of digital health on medical apps, not wellness apps, but it has also created a cybersecurity oversight structure that provides guidance for developers. HHS’s Office for Civil Rights (OCR) maintains fact sheets about which entities are covered by HIPAA. (OCR also created a developer portal, but I’m not linking to it since a warning keeps popping up in my browser that it’s an insecure site. Oops.)

Meantime the Federal Trade Commission (FTC) has created a handy checklist for app developers who want to stay on the right side of all the laws that could apply to them.

The basic regulatory structures for consumer protection are being locked into place, but it’s still just scaffolding.

Here are my other take-aways:

  • We know people are hungry for guidance and are eagerly using health apps, with varying degrees of success and satisfaction.
  • We also know people are not fully aware of the data sharing that health apps are engaging in.
  • Public shaming by reporters and researchers is currently the main check on companies’ use of people’s personal data.
  • There is a big trust gap that could be filled by government agencies or by companies and organizations willing to do the work of continually testing and auditing health apps’ effectiveness and security practices.

Now: Your turn.

What are you seeing in the health apps marketplace? Which apps do you trust? Which apps have you stopped using or deleted because of data sharing concerns? If you are in a leadership role, either in the government or at an organization that could hold sway, what are you doing to build toward a vibrant ecosystem? Or are you in a protective crouch?

Susannah Fox served as the CTO of the Department of Health and Human Services during the Obama administration. Before that she worked at the Pew Research Center. She can be followed on Twitter at @susannahfox. Her personal blog, where this post first appeared, can be found at susannahfox.com.

17 thoughts on “Health Apps, Data Sharing and the Trust Deficit”

  1. Standardized labels would go a long way toward fixing this problem, even if their use were voluntary and enforced by the FTC under current laws.

    The platforms / data brokers that enable apps could list apps bearing a standardized label higher in their search. This would reduce the platform’s brand risk and provide an incentive for apps to compete on privacy (a toy version of that ranking is sketched below).

    Patient Privacy Rights proposes a standardized information governance label for all apps, not just healthcare: http://bit.ly/PPR-IGL

    Please comment here or on the Google doc.
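
    To make the idea concrete, here is a minimal sketch of a machine-readable label and the search-ranking incentive described above. The field names and scoring are hypothetical illustrations, not the actual Patient Privacy Rights specification.

        # Sketch: a machine-readable privacy label plus a toy ranking
        # heuristic a platform could apply to labeled apps.
        from dataclasses import dataclass, field

        @dataclass
        class PrivacyLabel:
            app_name: str
            data_collected: list              # e.g., ["cycle dates", "symptoms"]
            shared_with_third_parties: bool
            third_parties: list = field(default_factory=list)
            data_sold: bool = False
            user_can_delete: bool = False

        def label_score(label: PrivacyLabel) -> int:
            """Apps that share or sell less, and that let users delete
            their data, rank higher in search results."""
            score = 0
            score += 2 if not label.shared_with_third_parties else 0
            score += 2 if not label.data_sold else 0
            score += 1 if label.user_can_delete else 0
            return score

        example = PrivacyLabel(
            app_name="ExampleCycleTracker",
            data_collected=["cycle dates", "symptoms"],
            shared_with_third_parties=True,
            third_parties=["analytics vendor"],
            user_can_delete=True,
        )
        print(label_score(example))  # 3 of a possible 5: ranked accordingly

    Because the label is structured data rather than a privacy-policy PDF, a platform can check and rank on it automatically. That is the competition-on-privacy incentive.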

    1. Quick response: Whenever I’m tempted to use a health app, I’m totally frightened by the (sneaky) permission requests for all of my data now and forever. Thus, I almost never accept. That, say, a calorie counter needs all of my addresses, photos, every email, every website I’ve ever visited, all of my phone calls, all of my contacts, etc., is so off-putting to me.

      Note that for 9 years I worked on cybersecurity issues as an academic researcher funded by the NSA. I’m not a total moron about data access. I also note that all funding for my team’s cybersecurity research ended with the election of Mr Trump. Apparently there is no longer any federal government concern about cyber access and authentication.

  2. I use a constellation of health apps, from Cardiogram and Sonic Sleep to One Medical, MyBanner, Apple Health, and Healthmate. I’m confident they are sharing my data, and in fact I hope they are. I think if we don’t have data sharing we will never have improvements in health care, which I believe is terribly broken. Since I’m a very literate social media user, I share most of my own health experiences online, and so do my friends. There’s a sense that privacy is outmoded vis-à-vis health care, except for people who get insurance from their employers. That system has to stop so we can all take advantage of the data we are generating.

    1. I really like Francine’s comments. We are hugely concerned about privacy of healthcare information for a lot of reasons, chief among them stigma, discrimination, and perceptions: remnants of an era when people suffered because of various prejudices regarding handicap, mental illness, and of course substance abuse and such things as HIV. It would be great if we could societally get over that. The upside of data sharing from a population health perspective is enormous. I know, I know… maybe not in my lifetime, but it’s a view that has some real merit.

  3. Let me get this straight…

    We spent millions making data “private” with HIPAA regulations. The Office for Civil Rights actually gets paid a great deal of money to fine a physician $100K if they accidentally fax an immunization record to the wrong place.

    Yet the top 10 health and fitness “app” makers collect private data, can sell it to whomever they want or use it for other undisclosed purposes, and at the same time earn an estimated $327 million annually for this endeavor?

    I have never had time to use a health app, and don’t have the foggiest idea why it is necessary in the first place. Just go for a run or play tennis every morning and that should take care of everything.

    1. Niran, you no doubt overlooked Ovia — an app that tracks your sexual desire https://www.washingtonpost.com/technology/2019/04/10/tracking-your-pregnancy-an-app-may-be-more-public-than-you-think/?utm_term=.5e6dcca698c9

      Without an app, how would we know if we want sex or not? Actually, there’s a little bit more to this app, but I couldn’t resist.

      I’m like you, Niran, I have no desire for an app of any sort. Maybe some day if I am bed-ridden with six chronic ailments and an app will help me communicate urgently needed information to a hospital or doctor, then I might get one. If I participate in a trial of a new drug or exercise regimen and it helps the investigators track an outcome, I might wear an app then.

      But at the moment I can’t think of a single reason why I would want any of them.

      Big data from apps raises the classic question — if you put garbage into a computer, will you get anything but garbage out?

      Kip

        1. How old are you, grumpy old men? I am trying to use these apps to optimize my life, and they do. These apps are helpful for people trying to conceive, which has become very difficult, probably due to environmental pollution that goes far beyond TV.

  4. Doing a little research …

    what health apps do you trust?

    what health apps have you stopped using because of privacy concerns?

    and / or continue to use despite your reservations?

    1. via TweetBot / DM

      I trust Apple Health as a platform but I don’t use any apps and the Apple app store policies are not transparent enough.

    2. I trust CanImmunize but it’s too much of a pain to use if you’re not starting with an infant. I use Clue and delete/disable everything else from my phone b/c I don’t trust them.

  5. via TweetBot

    Hi John — hope all is well! Personally I use NO health apps as I have concluded that none offer me what I can’t get elsewhere without giving up data/privacy protections. How is that for a retro attitude?!

    1. I have only one app installed on my phone. It’s a mental health app that helps reduce workaholism, eye strain from excessive book reading and existential despair from overthinking. It’s very good at what it does… using it as we speak…

        1. From the STAT News health tech newsletter last week:

        A key study supporting a fertility app called Daysy just got retracted. The journal, Reproductive Health, pulled the paper because of “fundamental flaws in [the study’s] methodology,” as Buzzfeed reported yesterday. Apps that claim to help or hinder conception by predicting when a person is ovulating have been around for years — and Daysy is far from the only one that has claimed to be very effective. One app called Dot says its studies show the app is 95 percent effective at preventing pregnancy; the only FDA-cleared app for contraception, Natural Cycles, claims a 93 percent effectiveness rate with typical use.

  6. legitimately curious about the 2% of men who report using a period tracking app in this story

    what did they find?

    what were they thinking?

    are they forward thinking partners who wanted to better understand their mates?

    should I be doing this?

    1. Ha! I’m not answering this one publicly. I’ve done it. I used “Clue”. It was especially useful when our teenage daughter was still at home and I could predict when she and my wife were likely to get into a fight. Also, I got a lot better at understanding when it was a good idea to proposition my wife for sex. I think I started tracking after reading “The Female Brain”, but don’t remember for sure where I got the idea. Might have been from a Dave Asprey podcast.

  7. Yes, there has been a steady drip in the press. Unfortunately, press stories and even the JAMA analysis fail to note significant individual privacy protections that apply in some environments by law and do not apply in others. For example, the Flo story was about a retail app that was leaking data to a Facebook software development kit. This is not an app regulated by HIPAA; otherwise the leakage to Facebook would have been a breach (unless Facebook was the app developer’s business associate, which is doubtful, but possible).

    Which leads to the JAMA article. Unfortunately, the authors took no account of how apps are regulated (or not) in the US or Australia, nor did they compare the two. Privacy laws vary widely within the US and between countries. Here’s a U.S. summary: https://www.healthit.gov/sites/default/files/non-covered_entities_report_june_17_2016.pdf

    Second, the JAMA article tested the 35 most “popular” apps, potentially over-weighting free apps (free apps are very likely to be the most popular). It is well documented that “free” apps have a business model of monetizing the data they collect; apps from HIPAA-regulated entities are not allowed by law to do that, nor are apps regulated by EU rules (more below).

    Further, the authors assume that all “secondary” uses are nefarious without defining what “secondary” uses are. But an app that enables patients to knowingly participate in research may well disclose “secondary” uses that the individual wants to occur: for example, by both creating a longitudinal record for the individual (primary use) and allowing researchers to access elements of that record (secondary use).

    Finally, the JAMA study accessed the apps in early 2018, before the European General Data Protection Regulation (GDPR) took effect, and so may be significantly out of date, as apps that provide services in the EU are now required to adopt privacy policies that conform to specific rules and grant privacy rights to the individual.

    So, unfortunately, app privacy policies and practices vary widely. Yes, that makes it hard for individuals to choose, but let’s not be alarmist, especially based on unreliable research. Instead, let’s identify best practices and point consumers toward those. In the U.S., that will be apps that are regulated by HIPAA, and in retail, those that conform to the California Consumer Privacy Act. (All views are my own.)

    1. Everybody wants to own the app store, decide what to put on the shelves, and where. Brokers make money. Data brokers in healthcare make a lot of money. That includes hospitals, Surescripts, and a thousand others. It’s the patients and physicians that pay. Certification of apps is just another form of data brokerage. It’s indirect but amounts to the same rent-seeking behavior.

      The alternative to certification is a standardized labeling scheme, like nutrition labels on food. It costs almost nothing to set up under a Creative Commons license, can be voluntarily adopted by app vendors without new regulation, and can be enforced by the FTC regardless of HIPAA status. It doesn’t restrict app stores or data brokers per se, but it provides a huge amount of transparency to patients and physicians.
