A few weeks ago, HHS released the Health Industry Cybersecurity Practices (HICP): Managing Threats and Protecting Patients, a set of guidelines designed to help organizations of all types and sizes more effectively safeguard data. Though it was developed in response to a mandate set forth by the Cybersecurity Act of 2015 Section 405(d), the document that was born out of a two-year effort from more than 150 cybersecurity and healthcare experts is not a compliance framework. It’s not an HHS-produced document (although HHS did help facilitate the process).
What it is, according to Erik Decker, Chief Privacy and Security Officer at UChicago Medicine, is more of a “cookbook.” Yes, that’s right. In this interview, Decker likened the guidelines to a collection of recipes that are tailored to different types of facilities, and can be tweaked as needed. “You can substitute, and that’s perfectly fine. You might come up with a better dish, but you’ve got something to work off of, and that’s what this is,” he said. In our recent conversation, Decker also talked about the rigorous process of creating and editing the guidelines; how the group worked to incorporate feedback from so many different entities; what he believes was most important for both clinicians and IT professionals; and what the next steps are.
- Impetus for the guidelines: Cybersecurity Act of 2015
- “What does this mean to the industry?”
- Starting with an “industry-wide risk assessment”
- Email phishing – “Nothing is 100 percent preventable. It’s just not possible.”
- Creating “human sensors”
- Medical community baselining
- Key message from clinicians: “It needs to be concise.”
It came down to this: if we want to create guidelines that are practical, actionable, and implementable, we need to think about the threats we all face. Let’s come to an agreement as to what those are, and let’s build our practices based on that.
In essence, we did an industry-wide risk assessment. That was the cornerstone of how we focused ourselves throughout the next year and a half.
Because no system is perfect, you need to make sure you have good detective and response capabilities, as well as an educated workforce. That way, when something ‘phishy’ comes along that doesn’t smell right, you have a mechanism inside your organization to handle it, and you can take more responsive actions.
Cybersecurity hygiene should be like washing your hands. It should be as simple as that. That’s a very meaningful message which clearly has resonated.
It needs to be concise. They basically told us, ‘We’re not going to pick up a 250-page document and read it. It’s just not going to happen.’ And we knew we had to lead with a strong executive summary and really draw them in.
Gamble: Hi Erik, thanks so much for taking some time to speak with us. First off, congratulations on the release of the cybersecurity guidelines. Can you start by providing some history behind this initiative?
Decker: Sure. It came out of the Cybersecurity Act of 2015, which was a legislative mandate. Section 405 is specific to cybersecurity issues in healthcare. It started with an internal HHS report that had to be delivered back in 2016, followed by the Health Care Industry Cybersecurity Task Force report, which was released in June of 2017. Once that came out, the big question was, how do we go about resolving some of these issues? The mandate specifically called for a number of key elements: it needed to be voluntary, consensus-based, industry-led, cost-effective, practical, actionable, and implementable.
Gamble: So you definitely had your work cut out for you.
Decker: We did; it was a lot to work through. So when we came together as a team in May of 2017, we started under that premise: What does this mean to the industry? Of course, it also needs to align with NIST, HIPAA, and HITECH — although, it’s important to note that what we’ve created is not a regulation, nor is it intended to be in any way. What it means is there’s a lot of material that exists today, especially in the NIST world. We don’t want to create something new; we want to guide people to where great knowledge already exists, and we want to map it back to the NIST Cybersecurity Framework, because that’s the overarching framework that we operate off of.
Gamble: With so much to consider, how did you find the right starting point?
Decker: When we came together as a team, after we stewed and chewed on the problem a bit, it came down to this: if we want to create guidelines that are practical, actionable, and implementable, we need to think about the threats we all face. Let’s come to an agreement as to what those are, and let’s build our practices based on that. And of course, we need to consider all the other factors that come into play, like the nuances between small practices and large organizations with very different resources and very different types of challenges. It means considering the needs of the highly technical individuals who, when the rubber meets the road, are the ones who will need to change all the widgets and configurations and drive these forward, all the way to the executives who support these programs and try to raise a level of awareness. We talked about all of those factors — that’s why it ended up being a four-volume set of practices.
Gamble: There’s so much to consider. How did you work to identify the common threats?
Decker: Initially, we started by looking at the assets we have in healthcare. Right out of the gate, we recognized that the answer to that question depends on who you are. And so we stepped back and said, let’s start by identifying the threats, and from there, we’ll talk about assets, then vulnerabilities, then the impact associated with those. From there, we’ll look at how we can actually manage and mitigate the threats we’re facing: What are the most meaningful practices that will do that? In essence, we did an industry-wide risk assessment. That was the cornerstone of how we focused ourselves throughout the next year and a half.
Gamble: In the guidelines, you identified five major threats, the first being email phishing attacks. I imagine this can be frustrating because it’s seemingly preventable. What did you find in this area?
Decker: There are some things that can be avoided, but as is the case with all cybersecurity threats, nothing is 100 percent preventable. It’s just not possible. Think about it this way. If I send you an email that appears to be from the CEO of your organization — using a Gmail account that’s been created for that purpose, for instance — asking you to send me your W-2s, you’re most likely going to click on it. That’s nearly impossible to prevent.
You might be able to flag certain things like W-2s, but should you actually prevent that? You don’t want to overregulate; you don’t want to stop business. There are so many different types of challenges associated with that. Phishing, essentially, is social engineering. It’s tricking the user into divulging or clicking on things in order to get past the perimeter and into the environment, either through credential theft or through malware droppers — things along those lines.
There are a lot of technologies out there that take preventive action — the controls that should be in place. But beyond that, because no system is perfect, you need to make sure you have good detective and response capabilities, as well as an educated workforce. That way, when something ‘phishy’ comes along that doesn’t smell right, you have a mechanism inside your organization to handle it, and you can take more responsive actions.
And so, using phishing as an example of how the practices line themselves up, we talk in the guidelines about various preventive steps that can be taken. We also talk about the incident response components, including workforce education, and how to tie all of those things together so that you have a ‘human sensor’ type of culture, where individuals can inform you of suspicious things and your security teams can act appropriately. As with an infection or a disease, the earlier you find it, the better off you are. It’s the same thing in cybersecurity.
Gamble: Right. Now, as the group worked to develop these guidelines, I imagine it was important to make sure you were getting a good mix of perspectives and working to incorporate them. How did you go about doing that?
Decker: The process we followed was amazing. Everything went through a multi-faceted peer review. The first thing we did was get together as an industry and capture all of the feedback from facilitated discussions that happened in D.C. We obtained those perspectives and then went back to the table, asked more probing questions, and got more perspectives. We captured the contributions of the Task Force groups.
After we did that, we separated into subgroups — each of which had its own leader — that focused on a very particular, targeted area. With the five threats, for example, we had five different leaders to help really scope out what those threats look like and what the key factors are. We did this with physician practices, and we did it with small, medium, and large organizations so we could identify the tone that would be necessary to make the guidelines meaningful for different types of organizations. This way, the small practice guide (which falls under Technical Volume 1) reads very differently than Technical Volume 2. It’s specifically written so that a practice manager or a clinician in a small practice can read it and understand what they need to do, or hand it to the outsourced IT division and say, ‘here’s what you need to do.’ It’s written with that perspective in mind, whereas Technical Volume 2 is geared toward subject matter experts.
Gamble: And you said there were different rounds of review during this process?
Decker: Yes. We had a committee comprised of volunteers from the Task Force who actually wrote the guidelines, with support from editors and contributors. We went through several iterations: they would do some writing and then meet with the team, who would vet it, challenge it, come up with better language, focus it properly, get down to the right practices, etc.
After that was done, we shared it with a set of CISOs who had not been part of the process at all. We got their feedback, then took it back and got feedback again from key members inside the Task Group on the whole document.
The next thing we did was to create focus groups in seven different regions throughout the country. All of this was facilitated by HHS — they’ve been a fantastic partner throughout the entire process. From there, we divided each region into two groups: clinicians and IT professionals. We split them up and had them go through the documents and provide their critical feedback.
Gamble: It sounds like it was quite a rigorous process.
Decker: It definitely was. And in fact, in addition to reviewing the document, we did something called ‘medical community baselining,’ where we went to various industry stakeholders and asked them some more qualitative questions about cybersecurity to gauge their level of awareness and learn what they’re thinking about. We incorporated that feedback into the main document so that, from a tone and a style perspective, we knew we were really reaching out to that community.
We also had physicians in our Task Group. One of them in particular, Dr. Phil, called out the concept that cybersecurity hygiene should be like washing your hands. It should be as simple as that. That’s a very meaningful message which clearly has resonated. It’s a big win, in my opinion, when we can hit those key messaging marks and make the message feel intrinsic.
Gamble: In the conversations you had with the medical community, was there anything that surprised you, or really stood out?
Decker: There are always some surprises. Interestingly, when we spoke with the clinician community, one theme we heard was that the message wasn’t strong enough — it needed more ‘oomph’ to it. On the other hand, when we spoke with the IT community, they said, ‘You need to back off. Don’t lead with fear.’ So there was a dichotomy between what various stakeholders wanted to hear.
The other thing that became quite clear is that it needs to be concise. They basically told us, ‘We’re not going to pick up a 250-page document and read it. It’s just not going to happen.’ And we knew we had to lead with a strong executive summary and really draw them in, then let them know what they need to do. That was a big message we got.