The Risk of Data Sharing

1st Talk Compliance features guest Iliana L. Peters, Shareholder at Polsinelli PC, on the topic of The Risk of Data Sharing. Iliana joins our host Catherine Short to discuss how, these days, health data is an incredibly valuable commodity. Companies of all types should consider the legal risks associated with data valuation, data ownership, and data sharing agreements. In this episode, we will discuss the scope and breadth of data sharing projects in development in the health care sector, examine contractual, state, federal, and international legal obligations for data privacy and security for such projects, and discuss issues related to data ownership that may also be part of such projects.

Catherine Short  0:03  

Welcome to 1st Talk Compliance. I’m Catherine Short, Marketing Manager for First Healthcare Compliance, a division of Panacea Healthcare Solutions. Thanks for tuning in. This show is brought to you by First Healthcare Compliance as part of our commitment to provide high quality complimentary educational resources. Please show your support by taking a moment to provide a review on Google, Facebook, or iTunes, and be sure to follow us on social media and subscribe to our YouTube channel.

On today’s episode, we are speaking with Iliana L. Peters, Shareholder at Polsinelli PC, on the topic of The Risk of Data Sharing. These days, health data is an incredibly valuable commodity. Companies of all types should consider the legal risks associated with data valuation, data ownership, and data sharing agreements. In this episode, we’ll discuss the scope and breadth of data sharing projects in development in the healthcare sector, examine contractual, state, federal, and international legal obligations for data privacy and security for such projects, and discuss issues related to data ownership that may also be part of such projects.

Before we begin, I would like to mention that at First Healthcare Compliance we strive to serve as a trusted resource for compliance professionals, and we celebrate their hard work and dedication with our Compliance Super Ninja recognition. For this episode, we’re spotlighting Super Ninja Mika Lantz, Front Office Manager at Mountain Ridge Pediatrics. Mika says what she enjoys most about working at Mountain Ridge Pediatrics is “interacting and forming relationships with our patients and their families.” Congratulations, Mika! Our team is honored to have the privilege of working with you.

So thank you, Iliana, for joining me on 1st Talk Compliance. It’s such a pleasure to have you on!

Iliana Peters  2:24

Thanks for having me.

Catherine Short  2:26

Why don’t we start with an overview of health data value proposition, and what some of the legal risks are with data sharing projects?

Iliana Peters  2:37

Absolutely. This is a new and evolving area of practice, particularly because we have many different entities that are very interested in engaging in innovative data sharing projects that result from the need to do research of all different types. That is, research with a small r, in terms of research and development within entities and development of new products and services, and research with a big R, that is, human subjects research as defined under the law, which may be used to develop new therapies, new drugs, and new devices for patients as well. So there are all kinds of research projects going on related to the use of data, and for many different and important reasons. As a result, we’re seeing a lot of questions about the legal requirements and risks associated with those types of projects, and particularly the agreements that are necessary and that are put in place between business partners related to those projects.

Catherine Short  3:48

Can you give us an example of some of these new and innovative projects or research?

Iliana Peters  3:55

Sure. For example, we have many different entities that are interested in developing new software applications that may help with treatment or billing or services in the healthcare sector to some extent, and they need data to really do evaluation and development of those prototypes and tools. We know that there are a lot of entities that are working on developing new drugs or new treatments just based on data, that is, what kinds of treatments are they seeing working for certain populations of patients over time, and can we implement that same kind of treatment in a larger population? There are entities that are looking at all different types of health disparities issues. So how do we get better treatment to better locations or better populations? What does that look like? And how can we help develop tools and technologies to help with those issues? There’s a variety of different projects in this space that have a lot of really important and practical implications for how we provide care in the healthcare sector.

Catherine Short  5:18

That’s really interesting. Some of the ones I would have thought of, and then some of the ones that you mentioned I never would have thought of, such as billing, and a few other things. How about a quick summary of the legal issues involved in these projects? I’m sure they’re myriad. But could you tell us some of the legal issues, and do you think there are serious legal risks associated with some of these issues and projects? What are your thoughts on all of that?

Iliana Peters  5:47

The short answer is, yes, there are serious legal risks. There are requirements at the state, federal, and international level in the law itself related to how we can use and disclose data. And that includes a general prohibition on the sale of data. Many of these innovative projects include some kind of benefit to the entity originating the data, because they are contributing data to an important project that’s going to arguably result in a new service or a new application or some kind of new invention, for lack of a better term. I don’t mean that in the legal sense, I just mean that in a general sense. As a result, these agreements contemplate what we call direct or indirect remuneration, that is, some kind of benefit to the entity that’s originating the data. That’s considered a sale of data, and so that would necessitate consent from the individuals whose data we’re using for these projects. It’s really important, I think, that entities understand what this looks like from a legal perspective, because of those risks. As a result, a lot of entities are anonymizing data so that we can use data for projects involving remuneration without implications for patient privacy, because the patients are arguably not identifiable, or we don’t know who those patients or consumers are as part of those projects, because we’ve anonymized the data. But obviously, if we’re going to do that, we have to make sure that we do it properly, in a way that in fact doesn’t allow for those individuals to be identified, and doesn’t allow business partners or downstream users of that data to re-identify it or recombine it with other data sets to figure out who those people are that are the subjects of the data. That’s not easy. It’s a hard issue.

Additionally, we have contractual requirements with our own clients and business partners that may significantly restrict how we can use data, how we can put data together into datasets, and how we can anonymize the data. For example, the Centers for Medicare and Medicaid Services (CMS) has significant prohibitions in agreements related to Medicare and Medicaid beneficiary data that we have to be aware of when we’re aggregating data or de-identifying it, because we generally can’t use CMS data in that way. That’s just one example from a contractual perspective. And then, of course, we have data breach issues. Anytime we’re putting lots of data together into a big data set, that becomes a target for a cybercriminal or threat actor, and we have to be very cognizant of the risks there, particularly if we’re providing that data outside our entity to another business partner, who’s then going to have our data and be subject to those risks.

Finally, there’s always a reputational issue here. Even if we do everything in a legal way, even if we protect the data from a data security perspective, individuals could still find out about how we’re using their data. Maybe it’s not identifiable, maybe it’s anonymized data, but it still came from them originally, and they could feel very strongly about how we’re proposing to use data for a particular project. Maybe they don’t agree with that particular project for whatever reason, and that could also create reputational risks for us. These are all risks that we have to consider: the underlying legal perspective, the contractual perspective, data ownership, data licensure, and all of those important controls that we put in place for data security purposes. Then there’s just considering how the consumer would feel about any particular data project, to make sure that we consider their viewpoint on these projects as well.

Catherine Short  10:01

That’s an interesting point, though. But let’s say data points came from, for example, a patient. I know that occasionally I’ve been in situations where, prior to speaking with a doctor, perhaps a resident has come in and said, “Do you mind signing this? We’re doing some research, if you’re okay with this,” and I read the paper and asked some questions, and I assume that I’m not the only one who’s done this. I ask, “What is this for? Is this anonymous?” and so on. It seems like other people would have done the same thing, and if they sign that, it seems like they are being informed, that they gave informed consent that whatever information they gave about themselves would go into this study. I guess that leads a little bit into this next question that I had: how should we consider addressing these issues and risks? I guess maybe one of them would be making human subjects, or otherwise people, aware of issues and risks.

Iliana Peters  11:06

Absolutely. At the end of the day, we could always get informed consent from the patient for any project that we want to move forward with, and that is the gold standard. So it’s a great point that you made: if what we really want is a well-informed consumer or research subject or patient, whoever that person is, it’s always good to have a conversation with that patient and get them to provide an informed consent or a HIPAA authorization for any particular project, because exactly as we say, then it’s clear that we had that conversation with the consumer, and that they’ve made the affirmative decision to share their data for whatever project we’re contemplating. That said, there are a lot of these projects where we can’t do that. Maybe we have the data, and it’s very old, and we can’t go back to the person and get their consent to use it for some of these projects. In those circumstances, or in other circumstances where arguably we don’t legally need informed consent, because again it’s anonymized data or de-identified data, that’s where those risks that we’ve been talking about come up, and that is where we need to make sure that we have very robust contractual protections in place that allow for these projects to proceed in a way that will protect the privacy of those consumers and the ownership of the data for the originating entity. At the end of the day, if we can’t get informed consent or HIPAA authorization for these projects from the individual, then we need to proceed in a way that is legal, that ensures that we’re not selling this data either directly or indirectly, and that provides for good contractual protections for the data, such that we don’t have these really important issues associated with patient privacy, with consumer privacy, and with data security.

Catherine Short  13:12

Should entities go it alone, do you think or get help on these types of data sharing projects?

Iliana Peters  13:19

It’s a great question. In my experience, it is always good to have outside counsel to consult on these. That doesn’t mean your outside counsel has to look at every single agreement. They certainly can, and it helps, but having a specialist in this area is often really helpful to understand the nuances, because these are quite complicated issues, and very rarely do entities have folks internally who have seen all of these issues and dealt with all of these issues in a way that allows for a really efficient and productive review, revision, and negotiation of these types of agreements with business partners.

Catherine Short  14:04

If you’re just tuning in, you’re listening to 1st Talk Compliance brought to you by First Healthcare Compliance as part of our commitment to provide high quality complimentary educational resources. We help create confidence among compliance professionals throughout the United States. My guest today is Iliana L. Peters, Shareholder at Polsinelli PC on the topic of The Risk of Data Sharing. Please show your support by taking a few minutes to provide a review of First Healthcare Compliance on Google or Facebook. You can also follow us and subscribe on all forms of social media.

Catherine Short  14:43

What are the most important risks to consider in innovative data sharing projects?

Iliana Peters  14:48

Great question. As we’ve been discussing, I think the most important risks are the risks associated with how the business partners that you’re working with are going to use your data. At the end of the day, it’s making sure that we understand the data ownership and licensure issues, particularly with regard to the type of data that we’re using for any particular project, so that we can ensure the right controls for that data. Again, we want to make sure that we appropriately take care of that data, but that’s really less of a risk in this context, because arguably we can control how we use our own data.

It’s really about when we share that data with business partners, how we do our best to make clear to those business partners how we expect them to use and share our data and how we expect them to protect it. It’s about making sure they understand our ownership of the data, what the license to the data looks like, for purposes of a particular project, and how they’re going to protect the data as they hold it.  I would say that’s the largest risk. It’s really when we share that data outside of our own institutions.

Catherine Short   16:13

Can you explain what some of those risks are? What are some of those various risks once data might start to go outside of your own entity?

Iliana Peters  16:22

I think one of the biggest is a data breach. Obviously, if we don’t have a good data partner, one that is as invested in protecting that data as we are and that has robust security controls for that data, we could very easily have a data breach, because it’s likely that they are a target for threat actors, since they probably do have a lot of data for a lot of different entities.

The other issue is they could also sell our data, and we could ultimately be liable for that, because we handed over our data to an entity that then sold it without the consent of the individual.

We could also have an issue with re-identification. If it’s anonymized data, they could sell it to an entity, which would arguably be permitted because it’s anonymized, and then that entity could re-identify it, because we can’t control how they do that. So these are all serious risks associated with working with these business partners if we don’t have good controls built into our contracts that say specifically how they can use the data, how they can disclose the data, and the data security controls that we expect them to put in place in their institution to protect the data.

Catherine Short  17:41

What do you mean when you’re saying that they re-identify it? What are they doing?

Iliana Peters  17:45

We could remove all 18 of those identifiers from the data, or we could create, for example, what’s called synthetic data, that is, data that is similar to an original data set but is not the original data set, or we could remove certain identifiers and not others in a way that we believe, based on an expert opinion, does not allow for an individual to be identified. We could give it to a vendor, and the vendor could negotiate, for example, with a very large internet-based data company, an Internet service provider, or someone else who has very large amounts of data. Based on the remaining items in that data set, let’s say a type of prescription drug that someone is taking, a certain provider, a certain type of service they’re getting for purposes of treatment, or a certain state that they’re located in, it’s possible that, in combination, someone else could have a dataset that includes enough identifiers overlapping with our de-identified data that they can re-identify it. That is, they can identify the individuals to whom it belongs.

When we disclose it, it may not be clear that it’s Iliana’s data. But when it goes to another entity, they may know enough about Iliana to re-identify it and make it clear to them that that’s actually Iliana.
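To make the mechanics of this concrete, here is a minimal, hypothetical Python sketch of the kind of linkage attack described above: a “de-identified” health data set still carries quasi-identifiers (state, drug, type of service) that overlap with a named auxiliary data set someone else holds, and joining on that overlap can single a person out. All records, names, and field choices below are invented purely for illustration.

```python
# Hypothetical linkage-attack sketch: direct identifiers are removed
# from the released data, but quasi-identifiers remain and can be
# matched against an auxiliary data set that still has names attached.

# "De-identified" release: names stripped, quasi-identifiers kept.
deidentified = [
    {"state": "DE", "drug": "metformin",  "service": "endocrinology"},
    {"state": "PA", "drug": "lisinopril", "service": "cardiology"},
    {"state": "DE", "drug": "lisinopril", "service": "cardiology"},
]

# Auxiliary data a downstream holder might already have, with names.
auxiliary = [
    {"name": "A. Jones", "state": "DE", "drug": "metformin",  "service": "endocrinology"},
    {"name": "B. Smith", "state": "PA", "drug": "lisinopril", "service": "cardiology"},
]

QUASI = ("state", "drug", "service")  # fields shared by both data sets


def reidentify(released, aux):
    """Link released records to named auxiliary records whenever the
    quasi-identifier combination matches exactly one known person."""
    index = {}
    for person in aux:
        key = tuple(person[k] for k in QUASI)
        index.setdefault(key, []).append(person["name"])
    hits = []
    for rec in released:
        names = index.get(tuple(rec[k] for k in QUASI), [])
        if len(names) == 1:  # a unique match singles the person out
            hits.append((names[0], rec))
    return hits


matches = reidentify(deidentified, auxiliary)
for name, rec in matches:
    print(name, "->", rec)
```

In this toy example, two of the three released records match exactly one named person, so they are re-identified despite having no names in them; the third record stays ambiguous. The real-world defense is exactly what the conversation describes: expert review of which quasi-identifier combinations remain unique, plus contractual prohibitions on recombining the data.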

A good example of this was when HHS was working on the Genetic Information Nondiscrimination Act. There was discussion about the identifiability of genetic data, and whether or not it could be de-identified. The National Institutes of Health (NIH) had genetic databases on its website, and it was not a HIPAA covered entity, to be clear, but it provided these genetic databases for purposes of research to different entities that were doing genetic research, and it believed the information it had online was not identifiable, because all identifiers for any particular individuals had been removed from it. It was purely genetic information. Unfortunately, it was discovered that a researcher, and this was obviously some time ago, had cross-referenced at least one of these NIH databases with a publicly available criminal database and was able to identify a convicted felon as a result of the data shared between the two databases. That is the kind of re-identification problem that we would really want to avoid.

Catherine Short  20:38

Okay, I understand exactly. One other question: should entities train their staff on these risks and issues?

Iliana Peters  20:49

I think the short answer is yes, but I don’t think this is the kind of training that everyone needs at this level. I think there are certain folks in every institution, legal and compliance, that need this level of training. Otherwise, we need our business folks, our marketing folks, really anyone who has contact with business partners and vendors who may propose these types of projects, to understand what these projects look like and where the risks are, in a general sense, so they can appropriately identify this type of project and bring it to the folks that really need to take a closer look at it. We wouldn’t expect someone who is out in the community, working with business partners, to really know the nuances here, but we would want them to be the kind of employee that says, “Oh, you know what? I think this is one of those complicated data sharing projects, I probably need to work with legal or compliance on this one,” so that these projects are escalated appropriately. Not so that everybody has to keep this information handy, but so that they have enough knowledge and understanding of how risky this is for organizations that they can say, “Oh yeah, this is one of those data sharing projects. I need to be sure to escalate this as soon as possible, so that we can have legal and compliance look at this and we can take advantage of this fantastic opportunity in the right way.”

Catherine Short  22:26

All right. I want to thank you so much, Iliana. Did you have any other words of advice or things that you thought of during the presentation that you didn’t bring up at the time?

Iliana Peters  22:36

I don’t think so. I just wanted to say thank you for having me, and that I absolutely understand this is a really complicated area of current legal issues and so I hope that individuals will take a little bit of time to walk this through with their teams, but of course, are free to get in touch if they have any additional questions.

Catherine Short  22:59

Okay, well, thank you so much. Iliana. I really loved having you today on 1st Talk Compliance. Can’t wait to have you back!

Iliana Peters  23:05

Thanks for having me!

Catherine Short  23:17

And thanks to our audience for tuning in to 1st Talk Compliance. You can learn more about the show on the program’s page on and lend your voice to the conversation on Twitter @1sthcc or #1stTalkCompliance. You can also email me at  I’m Catherine Short of First Healthcare Compliance. Remember, compliance is the key to achieving peace of mind.