Is my data protected? Questions to ask before downloading that health app

This is the first post in a two-part blog series on privacy. You can read the second post, How are health apps sharing my data with Facebook?

COVID-19 put data privacy concerns in the backseat as the world rushed to understand the virus and control its spread. However, with the proliferation of symptom tracking apps and government-backed contact tracing initiatives, privacy is once again in the spotlight.

Why should we care?

Most of us have accepted that tech companies make money off our data, and in exchange we are able to communicate, connect, discover, and create in ways that were previously impossible. And while it’s easy to avoid confronting the extent of the data collected and how it’s used, the Folia team believes strongly that your health data deserves special attention because of the unique ways that it can be used and misused.

News headlines are littered with examples: your medication search history is being leaked, your reproductive health data is being shared with tech companies and even your employer, and Facebook can use your mental state to target ads. It’s no wonder that proactive disclosure of privacy policies actually makes people lose trust. We would rather turn a blind eye and keep scrolling.

You’re in the driver’s seat now

Watch our video to understand how the rulings will impact your access to your data (3:38 min)

In March 2020, the Centers for Medicare & Medicaid Services (CMS) and the Office of the National Coordinator for Health IT (ONC) released two federal rules guaranteeing patients easy access to their own health data, electronically and at no cost.

The rules faced criticism from EMR developers and providers, who argued that patients might share their data with mobile apps that would use it in unintended ways.

Dr. Don Rucker, then the National Coordinator for Health IT, criticized this position as self-serving, since EMR developers and health systems benefit financially from holding patient data hostage: “All we’re saying is that patients have a right to choose as opposed to the right being denied them by forces of paternalism.”

And this is really the point. After waiting this long to be in the driver’s seat, let’s take control of the wheel instead of deferring to someone else’s autopilot.

We came up with eight key questions to help you think through your goals and values, so you can navigate the complex world of health apps and be confident in your control of your health data.

#1: What are your goals for using a health app?

First, think through your goals for using health technology; they will guide your decisions around privacy.

Generally speaking, health data can be applied at three levels. In COVID-19 contact tracing, for example, patient data is useful at the personal, public health, and societal levels.

At Folia, your knowledge can be used in the following ways: 

  • Personal: To manage your health at home by streamlining care and generating insights on what’s working.

  • Individual care: To help your care team understand what’s going on at home so they can refine your care plan.

  • Community: To help researchers and clinicians improve therapies, reduce treatment burden, and match patients to more precise treatments for their genotype or phenotype.

In our experience working with chronic disease communities for the past three years, people managing complex conditions tend to be more willing to share their health data since medical progress depends on it. We stand on the shoulders of other patients who were willing to participate in clinical trials and share what they learned about their conditions. 

While society needs some people to share their health data for community benefit, we realize not everyone wants to contribute in that way. And for various legitimate reasons, not all patients want to share their daily experiences of their condition with their personal doctor either. 

There is no wrong way to use Folia, which is why we give Folia members control over sharing their data with their doctor or researchers.

#2: What level of sharing are you comfortable with? 

After clarifying your goals, think about what level of sharing you are comfortable with. 

On one end of the spectrum, minimal sharing of just diagnosis information could help in conditions with unknown prevalence. And if you are already public about your condition (by raising money for a walkathon or posting about it on social media, for instance), then you are probably comfortable with your diagnosis being shared with researchers.

On the other end of the spectrum, some people are comfortable with full transparency and even upload their full genomes for the public to access. However, most of us lie somewhere in between.

Think through what level of information you are comfortable sharing, with the understanding that once your data is shared, it’s not always easy to “undo.” Would you be comfortable with other parties having access to your list of prescribed medications? The medications you’ve searched online? How frequently you use your medications? This type of data can be very useful to researchers in understanding variations in care as well as the reality of managing a condition, but there are valid concerns about sharing it as well.

Of course your answer depends on who is receiving what parts of your health data, which leads us to questions #3 and #4.

#3: What 3rd party partners do they use? How much data is being passed to them, and for what purposes? 

Companies often say your information is not sold. But did you know that it is often shared with 3rd parties for free?

All technology companies leverage 3rd parties so they can focus their research and development on building novel products and services. Common examples are email services (we use Gmail), chat and messaging services (we use Intercom), analytics services to understand web traffic (we use Google Analytics), advertising platforms to get the word out about your product (we have used Google AdWords and Facebook), or even common product features that don’t need to be built from scratch. 

Health apps should share only the minimum information required for the 3rd party to perform its function, and if personally identifiable information is shared, a Business Associate Agreement (BAA) should be put in place.

(Image: headlines from the last 6 months. All articles are linked in this blog post.)

As one example, we have a BAA in place with Intercom because patients email us and message us for support, exposing their names and contact information to this 3rd party. We also share some information, such as conditions and time zones, to help us provide customer support through their chat service. However, we don’t give them blanket access to all your data since there’s no need for them to have detailed lists of your medications, symptoms, doctors appointments, etc. 

More concerning, many health tech companies have been called out for sharing sensitive health data (such as medication search histories and use of mental health services) with ad partners such as Facebook, which can match that data to an individual’s identity. You experience this with retailers who send you Facebook ads to remind you what’s in your shopping cart, but most consumers don’t realize their activity within some health apps is also being used in this way. This topic is so complex and important that we will cover it in its own blog post next week. [Update: you can now read it here.]

#4: Who are their data customers? 

It’s safe to assume companies that offer products and services to you for free (and even many that charge) will use your data to generate revenue, which can include showing you targeted ads within the app or licensing your data to business customers. 

While sometimes they are contractually obligated to keep the names of their customers private, are you comfortable with the types of customers they have (pharma, insurance, employer, government)? You can get a sense of who their customers are through their website or in their media coverage.

It isn’t just tech companies. Even your hospital is guilty of monetizing your data. If you read the fine print on your patient portal (and let’s be honest, is the front desk staff really giving you the option to decline the terms in fine print, as they rush you off to your appointment?), you’ll see that your hospital reserves the right to use your health information for marketing and advertising purposes, and to share it with unnamed companies at any point in the future (which they have).

The terms for Christina’s patient portal (highlight is our own). If you are told that you have to use the patient portal to communicate with your doctor, request prescription refills, and pay your bills...are you really being given the choice to opt out of initiatives for “conducting, managing, and growing our business” with unnamed private companies? Shouldn’t there at least be some transparency around these initiatives?

#5: How are the data customers using your data?

Even if they cannot share the names of their data customers, they can share descriptions of what the partnerships seek to accomplish. When we launched Data Dividends, we committed to publishing descriptions of all research projects. If a company is unwilling to do this, how can you be confident that your data isn’t affecting your care, coverage, or employment in unintended ways?

Are there protections in place to prevent re-selling of your data? This question can be harder to investigate, but it is within your right to ask since their customers may have different policies and standards than the app itself.

#6: What is the mission of the company? 

Do you trust the leadership to be good stewards of your data?

While it can be difficult to cut through marketing polish, understanding the motivation of the company’s leadership and its advisors (who are typically listed on their website, or can be found via LinkedIn search if they list their affiliation in their profile) will help you understand whether the company’s values are aligned with yours.

Pay attention to their investors (which you can typically find on Crunchbase) and their media coverage. If they consistently talk about how poor medication adherence costs millions of dollars each year, then they are primarily trying to lower costs for organizations, which could come at the expense of patients.

Finally, most people don’t think 3-5 years into the future when they download a mobile app, but the reality is many apps are eventually acquired by larger organizations. Privacy policies typically include language around how ownership of data assets will transition in the event of a sale or merger, so it’s worthwhile to understand what motivated the founders to start the company, and what types of companies could eventually come to own your data.

#7: Have you reviewed your app settings?

Now that you’ve reflected on your goals for technology use and established your standards, make sure to review your settings in your health app.

At Folia, patients and caregivers always have the choice to share their data with their clinics or with researchers. Make sure to review your settings!

#8: Have you reviewed your settings for social media and other online profiles?

Facebook and Google have launched new transparency initiatives and - after pressure from regulators - are giving consumers more control over what data is captured and shared. Stay tuned for our next blog post, where we will share our recommendations on the first steps to take.

You deserve to be in control of how your health data is used. What other questions do you have? How can tech companies and app developers empower you to take control of the steering wheel? Leave a comment below to continue the conversation.