Chuck Christian, VP, Technology & Engagement, Indiana Health Information Exchange
I don’t seem to be able to get through a morning’s reading without seeing one or more instances of a data breach: records either locked up by ransomware or removed/copied by the thousands, and sometimes tens of thousands. Having spent a good portion of my career with the security officer role among my areas of responsibility at the various organizations I’ve served, it concerns me more each day that the bad actors are either getting better at finding the chinks in the armor or we’re not doing a great job of identifying and closing the gaps.
Don’t get me wrong, I’m not suggesting that healthcare is the primary offender. Let’s see, we’ve seen breaches and/or data leaks at Facebook, Google, Equifax, the NSA (Edward Snowden), Cambridge Analytica (a Facebook customer/partner), and a host of others.
However, if you check out the Experian site that lists the value of various data on the dark web, medical records rank right up there with US passports. The value of a medical record really depends upon how complete it is and whether it’s a single record or a complete database.
Are we giving our privacy away?
When was the last time you, or anyone you know, took the time to read the privacy policies, data use agreements, or end user license agreements (EULAs) when signing up for a rewards card or making a purchase on the web? I have friends who are attorneys who will readily admit that these documents (and others) are difficult to understand, cover a lot of bases, and are mostly slanted in favor of the site/company that wants your data.
My wife and I have several reward/shopper cards and we really don’t stop to think about what information is being gathered about our purchasing patterns or who else might be given the data.
I had the pleasure of working with one of the credit bureaus several years ago; they were thinking about getting into the business of helping hospitals clean up their medical record duplication rates (yep, they decided against it). I had the opportunity to get a glimpse of the different sources of data they gather about individuals, over time. A single source by itself seems somewhat benign, but when you start pulling in the data from the other sources, the image of someone’s life starts to appear, just like a Seurat painting.
When we add the social media data captured as a by-product of our online surfing and/or buying habits, plus what is captured and shared by our cell phones and other connected devices, our digital picture becomes very clear.
The Third-Party Doctrine
I’m concerned about our privacy for a variety of reasons. What happens to the data we share, and what protections, or lack of protections, do we have as individuals? If you follow some of the proposed approaches to data exchange, they’re based upon the patient being provided access to their data (as prescribed by HIPAA) and then the patient sharing it with a third party of their choice. With Blue Button 2.0, the patient gives a third party permission to access and download four years of Medicare claims data. Please don’t hear me saying that this is a bad idea; I’m not. However, I am concerned.
I was part of a discussion group with CMS and some of the developers of the API (application programming interface) that provides access to Medicare claim files. When I asked what rules and/or requirements CMS might be putting into place related to the secondary use of the data, to ensure a level of privacy and standardization, what I heard was that it’s the patient’s responsibility to ensure the data is shared with a third party they trust. Remember our earlier discussion about who reads the data use agreements.
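To make that flow concrete, here is a minimal sketch of what a third-party app does once a beneficiary has authorized it through Blue Button 2.0’s OAuth 2.0 flow and the app holds an access token. The sandbox base URL and the placeholder token below are illustrative assumptions on my part; check CMS’s developer documentation for the current endpoints and scopes.

```python
import requests

# Illustrative only: assume the beneficiary has already completed the
# OAuth 2.0 authorization flow and the app holds the resulting token.
ACCESS_TOKEN = "replace-with-token-issued-by-cms"

# Assumed CMS Blue Button 2.0 sandbox FHIR base; verify against the
# official developer docs before relying on it.
FHIR_BASE = "https://sandbox.bluebutton.cms.gov/v1/fhir"

def fetch_claims():
    """Pull the authorizing beneficiary's Medicare claims (ExplanationOfBenefit)."""
    resp = requests.get(
        f"{FHIR_BASE}/ExplanationOfBenefit/",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()  # a FHIR Bundle containing claim resources
    return bundle.get("entry", [])

if __name__ == "__main__":
    for entry in fetch_claims():
        resource = entry["resource"]
        print(resource["resourceType"], resource.get("id"))
```

Notice what is not in that sketch: nothing in the API itself constrains what the third party does with the bundle once it has it. That downstream use is exactly where my concern sits.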
So if the patient freely shares their information with a third party but fails to read the fine print, and that business either shares or sells the data, then, looking back at the details of the Third-Party Doctrine, there is no reasonable expectation of privacy.
I very much agree that all patients should have ready access to their medical records and they should be able to share them with whomever they desire; however, I also believe that these same patients need to be educated on the associated risks, and there should be some level of standard privacy protections related to those transactions.
Last December, a bill was introduced in the Senate, the Data Care Act of 2018, which outlined the duties of online service providers: Care, Loyalty, and Confidentiality. It’s hard for me to tell just by reading the bill, but it appears to be focused on online providers like Google, Facebook, Twitter, etc. One would think that if we’re going to specify how these entities should care for data, we would do the same for our Medicare-age patient population. But that’s just me thinking out loud, which has landed me in hot water once or twice.
Are APIs a data doorway with a lock that’s easy to pick?
I can’t write about privacy concerns and not poke at APIs just a little. Don’t get me wrong, it’s not that I have issues with APIs; my issue is with the sometimes weak security built into them. Let’s take a look at just a few of the more publicized data breaches.
Facebook recently announced a very large data breach that affected over 50 million accounts. The vulnerability in the code of the developer APIs was introduced in July of 2017, but the gap was not discovered until September of 2018. This was not the first time Facebook had a problem with the misuse of an API. Remember Cambridge Analytica? They used a loophole to collect data from over 80 million users between 2013 and 2015.
Panera Bread left an unauthenticated API endpoint exposed, which allowed anyone to view their customer information; more than 37 million records were leaked over an 8-month period. I chalk this one up to very sloppy security and system admin work; however, IMHO it should never have been deployed without the appropriate security requirements.
Finally, there’s PayPal’s Venmo subsidiary (a digital wallet that lets you make and share payments with friends). Venmo made its API publicly accessible by default, which allowed 207,984,218 transactions to be leaked in 2017. Again, I’ll chalk this one up to sloppy security and admin setup work. However, one would think that professionals working in the financial transaction space, which has some of the tightest security regulations around, would know better.
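To illustrate the failure mode behind both the Panera and Venmo incidents, here is a hedged sketch (not their actual code; the app, routes, data, and token check are all hypothetical) of the difference between an endpoint that hands out customer records to anyone who finds the URL and one that at least demands a valid token first.

```python
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Hypothetical stand-ins for a customer database and registered app tokens.
CUSTOMERS = {"1001": {"name": "Jane Doe", "email": "jane@example.com"}}
VALID_API_TOKENS = {"secret-token-issued-to-a-registered-app"}

# The unauthenticated-endpoint mistake: anyone who discovers the URL
# can enumerate customer IDs and walk away with the whole table.
@app.route("/api/unsafe/customers/<cust_id>")
def unsafe_lookup(cust_id):
    return jsonify(CUSTOMERS.get(cust_id, {}))

# The minimum bar: reject callers that don't present a known token.
@app.route("/api/customers/<cust_id>")
def safer_lookup(cust_id):
    auth = request.headers.get("Authorization", "")
    token = auth[len("Bearer "):] if auth.startswith("Bearer ") else ""
    if token not in VALID_API_TOKENS:
        abort(401)  # unauthenticated callers get nothing
    return jsonify(CUSTOMERS.get(cust_id, {}))

if __name__ == "__main__":
    app.run(port=5000)
```

Even the "safer" version only checks who is calling, not whether that caller should see that particular record, which is its own class of coding gap (more on that below).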
Why am I picking on APIs?
API security is a difficult problem to address, and it has huge data ramifications. If you imagine the massive number of connections occurring at the same time from a variety of devices and locations, the needle just got a lot smaller and the haystack more massive. Many security experts believe the next wave of enterprise-level hacking will be carried out by exploiting APIs. Based upon the examples I provided, we’re already seeing this as a lucrative attack vector.
If you follow the regulation processes around the 21st Century Cures Act, the ONC certification requirements, FHIR, the 2018 ONC Report to Congress, provider burden reduction, etc., you know APIs are poised to play a large role in how health data is shared and/or exchanged.
On December 11, 2018, Dr. Rucker, the National Coordinator for Health IT, provided written and oral testimony to the House Committee on Energy and Commerce, Subcommittee on Health. In his written comments, Dr. Rucker explained the importance of incorporating APIs into the data sharing/exchange toolset for healthcare. He wrote, “We (ONC) take cybersecurity threats and issues related to information security seriously. However, it is important to realize that APIs are not usually where these security vulnerabilities reside.” During Dr. Rucker’s oral remarks to the subcommittee, he acknowledged industry security and privacy concerns stemming from APIs, but said security concerns in healthcare often result from password issues or unpatched systems.
Please know that I have high regard and respect for Dr. Rucker and the ONC team; I’ve had the pleasure of working with the agency since its creation. I believe Dr. Rucker to be extremely qualified; however, I disagree with his assessment of API privacy and security concerns. Granted, I mentioned at least two examples of poor security implementation and admin setup; however, there are many other examples of coding gaps and/or bugs that allow unwanted access to what should be secure data.
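Here is a hypothetical example of the kind of coding gap I mean, often called an insecure direct object reference: the endpoint authenticates the caller but never checks that the requested record actually belongs to them. The function names and data are made up for illustration.

```python
# Hypothetical illustration of an insecure direct object reference (IDOR),
# a coding gap that passes a quick "is it authenticated?" test yet still
# leaks other people's data.

RECORDS = {
    "rec-1": {"owner": "patient-a", "diagnosis": "..."},
    "rec-2": {"owner": "patient-b", "diagnosis": "..."},
}

def get_record_insecure(record_id, authenticated_user):
    # Bug: the caller is authenticated, but nothing verifies that the
    # requested record belongs to them, so patient A can pull rec-2.
    return RECORDS.get(record_id)

def get_record_secure(record_id, authenticated_user):
    # Fix: enforce authorization, not just authentication.
    record = RECORDS.get(record_id)
    if record is None or record["owner"] != authenticated_user:
        raise PermissionError("caller is not authorized for this record")
    return record

if __name__ == "__main__":
    print(get_record_insecure("rec-2", "patient-a"))   # leaks patient B's record
    try:
        get_record_secure("rec-2", "patient-a")
    except PermissionError as exc:
        print("blocked:", exc)
```

Bugs like this aren’t password problems or missing patches; they live squarely in the API code itself.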
I’m concerned that we are rushing to provide a new pathway where data can be shared automatically without fully understanding the risks and ramifications. Last month, HL7 released the first normative version of the FHIR API standard. I don’t disagree that we need to make data available appropriately; however, I’m concerned that we may be running across the minefield rather than ensuring that we’re on solid footing. Yes, I’m probably overstating this a little bit, but that is the path to the hot water I mentioned earlier.
So what does this all have to do with privacy?
If you take into consideration the recent announcements from Apple, Amazon, and others, there appears to be a great deal of interest in collecting healthcare-related data. Okay, at the risk of sounding like a conspiracy theorist, I’ll go ahead and make the observation that these companies are already gathering data from your phone and your internet purchasing habits, and they can acquire other data about individuals from a variety of sources; now add in your healthcare data.
I was recently reminded that one of the very best personal identifiers is your cell phone number, something we readily provide for two-factor authentication, text notifications for order tracking, etc. If the company aggregating the information has a common data element, its job of associating the data with an individual becomes that much easier. Now I know that I’m sounding like a conspiracy theorist. Sorry. However, in talking with a lot of folks who work in the healthcare IT and security space, we’re not thinking enough about everything that can be, and is being, assembled. Nope, no need to get out the aluminum foil hats.
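To show how little work that aggregation actually takes, here is a toy sketch with entirely made-up data: three unrelated data sets, each fairly benign on its own, joined on nothing more than a shared cell phone number.

```python
# Toy illustration with fabricated records: a loyalty card file, a fitness
# app export, and a marketing list, linked only by a phone number.
loyalty_card = {"+1-317-555-0142": {"purchases": ["prenatal vitamins", "glucose meter"]}}
fitness_app  = {"+1-317-555-0142": {"avg_daily_steps": 2100}}
mailing_list = {"+1-317-555-0142": {"name": "J. Smith", "zip": "46204"}}

profile = {}
for source in (loyalty_card, fitness_app, mailing_list):
    for phone, attributes in source.items():
        profile.setdefault(phone, {}).update(attributes)

# One common identifier and the "Seurat painting" assembles itself.
print(profile)
```

A few dictionary merges are all it takes; the aggregators are doing the same thing at the scale of hundreds of data sources.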
My point is that awareness is very important, not only in our personal everyday interactions with an ever-expanding technological landscape, but also in our professional interactions, to ensure that we’re being as diligent as possible and not overlooking a default setting or coding gap that will expose large amounts of data for someone to stumble across. The bad actors are like crows: when they find a field full of food, they call out to the others to come and get it.
Thanks for reading. I’m always interested in different points of view, new information, and opportunities to learn.
This piece was originally published on Chuck Christian’s blog, The Irreverent CIO. After spending decades in the CIO role, Christian now serves as Vice President of Technology & Engagement with the Indiana Health Information Exchange.