Iliana L. Peters, Shareholder at Polsinelli PC, believes good data privacy and security is fundamental to ensuring patients’ trust in the health care system, and to helping health care clients succeed in an ever-changing landscape of threats to data security. She is recognized by the health care industry as a preeminent thinker and speaker on data privacy and security, particularly with regard to HIPAA, the HITECH Act, the 21st Century Cures Act, the Genetic Information Nondiscrimination Act (GINA), the Privacy Act, and emerging cyber threats to health data.
For many years, Iliana both developed health information privacy and security policy, including on emerging technologies and cyber threats, for the Department of Health and Human Services, and enforced HIPAA regulations by spearheading multi-million dollar settlement agreements and civil money penalties pursuant to HIPAA. Iliana also focused on training individuals in both the private and public sectors, including compliance investigators, auditors, and State Attorneys General, on HIPAA regulations and policy and on good data privacy and security practices.
As a member of the First Healthcare Compliance Editorial Council, Iliana is a frequent presenter at educational events. For more information regarding this topic, please view the related webinar for further discussion and learning.
Below, Iliana answers some common questions and provides explanations related to the education surrounding Health Data, A Value Proposition: Legal Risks with Innovative Data Sharing Projects.
Can you give us an overview of the health data value proposition and some of the legal risks associated with data sharing projects?
This is a new and evolving area of practice, particularly because we have many different entities that are very interested in engaging in innovative data sharing projects that result from the need to do research of all different types. This includes research with a small "r," in terms of research and development within entities and the development of new products and services. And this also includes Research with a big "R," that is, human subjects research, as defined under the law, which may be used to develop new therapies, new drugs, and new devices for patients as well. There are all kinds of research projects going on related to the use of data, and for many different important reasons. As a result, we're seeing a lot of questions about the legal requirements and risks associated with those types of projects, and particularly about the agreements that are necessary and being put in place between business partners for those projects.
Can you give us a summary of the legal issues involved in data sharing projects? Do you think there are serious legal risks associated with some of these issues and projects?
The short answer is, yes, there is serious legal risk. There are state, federal, and international legal requirements related to how we can use and disclose data, including a general prohibition on the sale of data. Many of these innovative projects include some kind of benefit to the entity originating the data, because the entity is, in fact, contributing data to an important project that's going to arguably result in a new service or a new application. As a result, these agreements contemplate direct or indirect remuneration, that is, some kind of benefit to the entity that's originating the data. This is considered a sale of data, which would necessitate consent from the individuals whose data is being used for these projects. It is important that entities understand what this looks like, from a legal perspective, because of those risks. As a result, a lot of entities are anonymizing data so that they can use it for projects involving remuneration without implications for patient privacy, because the patients are arguably not identifiable, or we don't know who those patients are as part of those projects, because we've anonymized the data.
Obviously, if we’re going to do that, we have to make sure that we do that properly, in a way that doesn’t allow for those individuals to be identified, doesn’t allow business partners or downstream users of that data, to re-identify or recombine data with other data sets to figure out who those people are. This is not easy; it’s a difficult issue.
Additionally, we have contractual requirements with our own clients and business partners that may significantly restrict how we can use data, how we can combine data into datasets, and how we can anonymize the data. For example, the Centers for Medicare and Medicaid Services (CMS) has significant prohibitions in its agreements related to Medicare and Medicaid beneficiary data that we have to be aware of when we're aggregating or de-identifying data. We generally can't use CMS data in that way. That's just one example from a contractual perspective. And then, of course, we have data breach issues. Anytime we're putting lots of data together into a big data set, that becomes a target for a cyber-criminal or threat actor. We have to be very cognizant of the risks there, particularly if we're providing that data outside our entity to another business partner, who's then going to hold our data and be subject to those risks.
Finally, there's always a reputational issue, even if we do everything in a legal way. Even if we protect the data from a data security perspective, individuals could still find out about how we're using their data; even if it's not identifiable, even if it's anonymized, it still came from them originally. And they can feel very strongly about how we're proposing to use data for a particular project. Maybe they don't agree with a particular project, for whatever reason. That could also create reputational risks for us.
These are all risks that we have to consider: from an underlying legal perspective; from a contractual perspective, regarding data ownership and data licensure; and with respect to all of the important controls that we put in place for data security purposes. We also have to consider how the consumer would feel about a particular data project, to make sure that we take their viewpoints into account as well.
What are the most important risks to consider in innovative data sharing projects?
I think the most important risks are those associated with how the business partners you're working with are going to use your data. At the end of the day, making sure that we understand the data ownership and licensure issues, particularly with regard to the type of data that we're using for any particular project, is critical, so that we can ensure the right controls for that data. We want to make sure that we appropriately take care of that data. Arguably, we can control how we use our own data. It's really about when we share that data with business partners: how we do our best to make clear to those business partners how we expect them to use and share our data and how we expect them to protect it. It's about making sure they understand our ownership of the data, what the license to the data looks like for purposes of a particular project, and how they're going to protect the data as they hold it. I would say that's the largest risk, when we share that data outside of our own institutions.