Though sharing information with other health systems may not be a priority for leadership teams working through a breach, it is, ironically, one of the most important actions organizations can take for their peers, according to Denise Anderson, President of Health ISAC. That’s because, as she puts it, “one person’s defense will become everyone else’s offense.” But even with copious sharing, organizations will go down, and that’s why business continuity planning is so important. Anderson, who has worked as a firefighter/EMT, knows first-hand the importance of being ready to handle a disaster. She recommends CISOs liaise with emergency management and train copiously, so muscle memory kicks in when things go south. In this interview with healthsystemCIO Founder & Editor-in-Chief Anthony Guerra, Anderson covers these issues and many more.
It goes back to enterprise risk management and looking at what are the crown jewels of what I do and what does it take for me to continue to produce or do those crown jewels, and for how long.
That probably is what keeps me up at night. Imagine that a threat actor goes in and says, I've changed some records in your organization on blood type, randomly, 100 records of your patients. Somebody who is B+ is now an A-. Imagine what that would do.
I do consider myself to be an evangelist for information sharing – that we should be sharing information with each other, and shame on us for not doing it. There’s various reasons why people don’t do it but it’s so easy to do. It’s so beneficial.
Anthony: Welcome to healthsystemCIO’s interview with Denise Anderson, President of Health ISAC. I’m Anthony Guerra, Founder and Editor-in-Chief. Denise, thanks for joining me.
Denise: Thanks for having me, Anthony.
Anthony: Very good. Denise, you want to tell me a little bit about your organization and your role?
Denise: I’m Denise Anderson. I’m President and CEO of the Health ISAC which stands for Information Sharing and Analysis Center. Just a little bit of background on ISACs, the Information Sharing and Analysis Centers were formed under a presidential directive from President Bill Clinton in 1998.
The concern at that time was Y2K. The goal was to get industry to share with each other across the critical infrastructure sectors and with government, so the ISACs were formed and paired along critical infrastructure sector lines. The US has defined 16, and we are in the health sector.
The Health ISAC formed in 2010. I came on board in 2015 after a stint at the Financial Services ISAC which was one of the first ones to start. Basically, what we do is we’re a trusted community of organizations that touch the patient really in healthcare, and we’re focused on cybersecurity and physical security so that patients can receive care and get it safely.
Our membership is composed of the various subsectors within the sector: labs, medical device manufacturers, pharmaceutical manufacturers, of course hospitals and healthcare delivery organizations (anything from a large hospital system to a small clinic), and insurance companies. Basically, anyone that touches the patient can be a member of ours.
Anthony: Very good. I like to start with an open-ended question: what are some of the trends you're looking at, some of the main things you're watching, that you feel your members want content and direction around? I don't know if you're even allowed to lobby.
Denise: We don’t lobby.
Anthony: You don’t lobby.
Denise: No, we’re not a lobby organization. No. We’re very operational in nature. We’re, I like to say, where the rubber hits the road. We’re basically looking at the threat landscape that is out there, whether it’s cyber or physical, and then we’re alerting our members to the various threats that could potentially be something that they face.
For example, last year we provided 274 targeted alerts. Those were alerts specific to organizations where they either had a vulnerability that we were able to detect or that we saw their name on a list to be targeted. That’s an absolute value that our members would receive from getting those alerts. But we have general alerts as well. So as we’re seeing vulnerabilities or threat actors out there, we’re sharing that information with the membership. The members actually also share with each other as well. So when they’re seeing things, they’re sharing what they’re seeing, what they’re doing about it or if they’re having challenges with doing something about it. It’s all about members helping members and the community helping the community, and that’s really the strength of what we do.
Anthony: You are a specific source for a threat intelligence feed? They may have other vendors that provide them with other feeds for threat intelligence but you are also another source, correct?
Denise: We’re a source. I wouldn’t call us a feed, per se. We’re definitely a source, and I would say that we’re a highly vetted, highly trusted source, because a lot of our info is coming from the members themselves. It’s what they’re actually seeing. It’s not necessarily a threat intelligence feed, per se, although we do get indicators of compromise. Those are the email addresses, headers and file hashes that come in, and we do share those out. We do have high-fidelity indicators of compromise that get shared out.
Anthony: Any thoughts on the main things that you think CISOs at hospitals are looking at, thinking about from your conversations with them and interaction?
Denise: It’s interesting. I do think that those CISOs that understand the threat landscape are constantly watching for whatever could be out there. It doesn’t necessarily have to be something directly targeting them. It could be something that can impact the supply chain, for example. One of the things I talk about often is that you need to be thinking of every potential scenario and every potential situation that could impact you.
One of the examples I’ll give is that during the initial stages of the war between Ukraine and Russia, there was a satellite firm that got attacked, and it took down their operation. What people didn’t realize was that a lot of wind farms, those wind turbines in Germany, relied on those satellites to operate, so it took down electricity in Germany and some other areas in Europe, as well as internet communications, which impacted operations. Even though it wasn’t something directly targeting healthcare, you have to be very mindful of the landscape out there and how you can potentially be impacted by events that happen.
Anthony: Yes, mindful of things that can cause disruption downstream. I looked at the opening letter from your 2022 report, and you really focused on resilience. Huge, huge issue. Resilience, business continuity planning, recovering from or dealing with an outage is one of the things I’ve been focusing on a lot. The Joint Commission came out with a very interesting paper in August called Preserving Patient Safety After a Cyber Attack. It’s all part of the issue of continuing operations in the face of a disruption to electronic systems.
I’d just like your thoughts around that. To me, it’s a monumentally huge issue. When I read the Joint Commission paper, I thought, oh my god, it’s amazing what you have to do to prepare. The idea of going to paper, the change it requires, is astounding, almost incomprehensible that it can be done. It feels like this is a huge issue right now, because we’re seeing more and more outages, left and right. Your thoughts?
Denise: I think the reality is that at some point somebody will be attacked or be impacted by an attack, whether it’s on them directly or on a supplier or partner. This is back to my earlier comments: you have to be thinking of all the potential impacts to your organization and how you can continue to deliver what it is that you deliver.
It goes back to enterprise risk management and looking at what are the crown jewels of what I do and what does it take for me to continue to produce or do those crown jewels, and for how long. Because sometimes many of these incidents can go on for months. So you’ve got to understand what the potential impacts can be and how you can actually continue operations along those lines for a continued and sustained period of time.
It’s not easy. I was actually a firefighter/EMT for 20 years, and when I was coming up through the ranks, we used those old ADC map books; that’s how we found an address. Now it’s all automated, but we tested that skill continuously. We knew all the streets in our area of response, so when the call came, we knew where to go the moment we rolled out of the station. That skill is lost now that it’s all been made electronic.
And I think it’s the same thing in the hospital, right? For the old-school doctors who were used to pen and paper, it might not be as hard. It’s still an issue, I won’t deny that, but it would be a little easier for them. For the younger generation, who have been raised on electronic communications and electronic processes of patient care, it’s harder to adapt, because that experience has not been there for them. So it is a huge issue, and it’s something people need to be very mindful of when they’re looking at how to remain resilient and continue with patient care.
Anthony: You mentioned the timeframe we want to scenario-plan for; in the Joint Commission paper it was 4 weeks, which is astounding, because I think the inclination for the average employee, maybe the average clinician, is: okay, the systems are down, let’s go to lunch, and when we come back, things will be up again. It’s not to continue; it’s to just wait for them to come back up, because, again, it’s so inconceivable how to move forward. You mentioned the preparation that’s required.
Here’s a question and this is something again, I’ve been focusing on for a couple of years now as I have interviews with different CISOs – what is your role, Mr. or Mrs. CISO, in making sure your health system is prepared to deal with a cyber outage?
Denise: That goes back to incident response and incident response management. As I said, being a firefighter, they do that every day, right? It’s running crisis after crisis. They have a plan, they have a pre-plan, and everything is laid out. There is a division for finances. There’s a division for logistics. There’s a division for response. They’re all working the incident.
Likewise, an organization should be doing the same thing. Cyber, I would argue, is one component of the crisis. As you’re alluding to, they would be the division resource for the cyber aspects or impact of the incident. Absolutely, there should be incident response management. It should fall under the enterprise management plan, and people should be prepared.
I always say you should know your response, know your impact, write a plan, have a plan, exercise the plan and make sure that everybody knows the plan. Because when an incident happens, you don’t want to be scrambling to figure out who do I call for this, who do I call for that, what do I do? You need to know. You’re not going to be able to perfectly envision every scenario that could exist, but you can have plans in place that attempt to get you down the response path.
Anthony: Yes, and organizations can’t expect the CISO to manage everything about a cyber incident; at some point their responsibility stops and someone else has to take over. For example, they can deal with the cyber element. They can make the decision about systems having to be taken offline, but the CISO is not going to make sure the paper forms are there. They’re not going to make sure the printer ink is there.
Anthony: If that ball doesn’t get picked up by somebody else, you’re going to have a huge gap in that resiliency plan. There has to be enterprise level non-IT, non-security running the whole thing.
Denise: Right. Everybody needs to have a voice, right? Everyone needs to be at the table, because not everyone is going to understand all the consequences or impact of an incident on a particular aspect of the operation. Everyone’s voice needs to be there, especially in the planning process, and, again, I’m a fan of the checklists you’re alluding to. There would be a checkmark: did we have the paper forms in place, are they updated? All that stuff, so that it doesn’t get missed in the chaos of response.
Anthony: Well, you’re talking to the right guy. I’m a checklist guy. I’m trying to teach my kids: if there are 11 things you need to bring to school, don’t try to remember all 11 every day; make a checklist on your phone.
Denise: Right. That’s right.
Anthony: Beautiful. It’s so easy. Reminders, checklists, calendar appointments: this is how you run your life these days. Let’s use the tools. They use them on the clinical side; there are all kinds of checklists, in airlines, everywhere. The beauty of this stuff is that once you establish a checklist or a process, you can continually improve it. You don’t have to reinvent the wheel every time, but also know that it’s not set in stone. It’s never done; it’s always subject to improvement.
Anthony: Isn’t this how we move forward in life in almost every aspect?
Denise: That’s right. Absolutely. We do many of these things. We’ve done it as firefighters. Doctors have done it, where they have the steps in place for a procedure or an operation, whatever the case may be. But we don’t bridge that gap and bring it over to other processes. We tend to do things in silos and don’t think, hey, we do this over here, why can’t we bring it over to this side? It takes a mindset where you’re looking at the whole thing from above and being able to say, hey, why can’t we use this here? It’s absolutely something we can do. It’s a mindset.
Anthony: I would just get your thoughts on this. It’s not the CISO’s responsibility to make sure the paper forms are there and the printer ink is there, but it is your responsibility to bring that to emergency management and tell them. Because at that level, you want the business to continue to operate. You have to at least put the business in a position to be successful, which means, again, articulating this, voicing this, helping emergency management understand how a cyber incident might unfold.
Denise: Right. I would argue that CISOs don’t even know everything either. But it’s their job to articulate what the impact is. You can’t just say, “We’re taking the systems offline.” You’re saying, when we take the systems offline, this is what’s going to happen: “You’re not going to have access to patient records. You’re not going to have access to images.” They need to articulate clearly: here’s what the impact of this action is going to be. Then the response team figures out, okay, what do we need to do to address that impact? That’s the start of the plan.
Anthony: In an article I looked at this morning about an outage, someone at the health system said the biggest impacts were, number 1, our ability to do imaging studies and look at them in real time was gone, which is a huge part of diagnosing clinical issues.
Anthony: The second thing was that the EMR allowed us to look back at old records, and that had also been 100% taken away. Those are the two big impacts that have to be thought through on the clinical side, right?
Denise: Right. Right.
Anthony: That’s why they’re diverting patients. I made a joke the other day that if I knew my local health system was down or subject to ransomware, I’d divert myself; you wouldn’t have to worry about diverting me. But anyway, your thoughts on clinicians having to think through what they’re going to do. It’s interesting.
Denise: Right, exactly. Yes, you articulate what the impact is going to be and then they’ve got to figure out well, how do I continue my job with this impact, what do I need to do on my end to make sure that I can still treat a patient. Access to images is huge. I would argue when you’re doing operations or anything like that, lab results, all that stuff is very, very reliant on electronic records nowadays. I mean, that’s just the way it is.
Anthony: Let’s switch gears a little bit. I don’t know how much you’ve been thinking about this, but I’ve been reading a lot about generative AI and its security implications. Some articles I looked at recently said one of the big concerns is people in the health system doing queries. You do a generative AI query in ChatGPT or something, you dump some stuff in, hoping to get some response back. The concern is that what you’re dumping in could include PHI, and you don’t know if it’s going to pop up in somebody else’s query, because it’s now part of the AI, in its brain, so to speak. There are some issues there. They’re starting to roll this stuff out, and there’s a lot of thinking about it. Do you have any thoughts on the generative AI area?
Denise: Like anything, like any technology, when we made things electronic, there were great benefits to it. I think AI is going to bring a lot of advances, a lot of changes in medicine, really positive things. But it’s also going to create a lot of challenges, like anything else. One of my concerns would be more the disinformation side of the coin, which you just alluded to: people putting a query out and getting information back. Think about if someone manipulates that and gives bad information that is potentially used to treat patients. That would definitely be a concern of mine. I could see where that could be very malicious.
I think there are a lot of other ways too, obviously: creating personas, imitating personas to gain authorization to systems, those kinds of things. So there’s a lot we need to be mindful of. Over time, I think it’s the same things we need to be concerned about, confidentiality, integrity and availability, just in a different way.
Anthony: Interesting point. It makes me think of that data poisoning concept that you have to watch out about.
Anthony: I interviewed Christian Dameff, a doctor who works in cybersecurity; he’s well known for his paper about the regional impacts of a ransomware incident, so he’s done some great work. One of the really interesting things he brought up was this: somebody wants to do harm in a health system, and if they’re able to influence prescribing electronically, generative AI gives them the ability to come up with a harmful but not obviously harmful prescription. There are things that would be complete red flags, totally ridiculous, that no one would ever follow through on, but there are ways to do harm less obviously so it doesn’t get caught. Anyway, data poisoning is a very interesting concept. Any more thoughts about that?
Denise: That probably is what keeps me up at night. Imagine a threat actor goes in and says, I’ve changed some records in your organization on blood type, randomly, 100 records of your patients. Somebody who is B+ is now an A-. Imagine what that would do. Number 1, they don’t technically even have to do it; just the thought of them doing it, how can you trust your records again? You’re going to have to go back through and clean everything up. But if they really did do it, think of the consequences of that action. Or they could threaten ransom: hey, I can tell you which of those 100 records I changed, but you’ve got to pay me so much money to get that information. I could definitely see that, or it could have a terrorist aspect too, where they go in quietly to do it because they want to create harm. That’s their motivation.
Data integrity is probably the biggest issue for me, because I think it doesn’t take much to really decrease trust in the data.
Anthony: Great point. Great point. There are different ways they get in. There’s the technical way, really fancy, really advanced hacking, where they really know their stuff from an IT point of view. That requires a lot more skill. Then there’s basic phishing and social engineering: writing an email, making phone calls, or researching people on their social networks to find ways to make it seem as if you know more than you do. That’s something anybody can do.
There are arguments about which is more prevalent, but they’re both serious. At the very least, we want to stop people from clicking on things, creating a nice cyber culture where people are sensitive to these issues. That’s a big responsibility all CISOs take on: making sure they have a sophisticated cyber culture in their organizations where people aren’t doing silly things, and then dealing with the people who make mistakes. I’ve heard many CISOs say they really don’t want to take a punitive approach; it’s just not the way they want to do things. They prefer coaching. It’s been said you should see them as a victim, but there’s a balance.
There’s always a balance, because somebody who makes 5 mistakes, come on. At some point, we change from patting you on the back and saying you’ll do better next time to saying something has to change. They have to work with HR on dealing with these issues. But overall, your thoughts on how to create a positive cyber culture where people are thinking right? They say your people are sometimes your biggest vulnerability. Your thoughts?
Denise: That’s right. No, I absolutely agree. People are the low-hanging fruit. There’s a reason the Nigerian prince schemes are still around: people fall for them. You would think, what is wrong with people, how could you fall for this, but they do, because the schemes are so well engineered. So I agree with you. We see this argument all the time in some of our threads: are you punitive, or are you more educational in nature, when it comes to employees clicking on simulated phishes, those kinds of things?
We see the gamut, but most people do want to take the more positive approach. I believe that by getting people to understand the impact of clicking on that link, and educating them on why they shouldn’t, because this is what could potentially happen, people will hopefully think twice, or at least have some understanding of their role in helping the organization prevent harm so that patients can be treated effectively.
I think when you couch it that way, it helps. A lot of times you just get the same don’t click on the link, don’t click on the link, look for this, look for that. But if you understand, wait a minute, if I click on this link, this is what could potentially happen, and maybe my aunt who is on the operating table can’t be operated on, it puts context around why they play a very critical role in helping an organization stay safe.
Anthony: You just made me think about identity and access management. We can’t have 10,000 employees, each of whom can destroy the organization if they click on the wrong link. You can’t have that.
Anthony: That’s why we have rights and permissions, so that 99% of those 10,000, hopefully, theoretically, if they do click on the wrong link, cause a minor amount of harm, not catastrophic harm, because they have low-level rights and permissions. And this is a huge area, right, identity and access? It’s a huge undertaking, not easy, extremely complicated, always changing. But ideally, you want the minimum rights and permissions for your role.
Everyone should have the minimum, and it should change immediately when the role and the needs change. It should flex, and it should be super tight, but that’s way easier said than done when people are constantly changing roles, needing access, changing departments. It’s a massive, massive issue, but it sounds like getting your arms around it is going to be key to minimizing damage if somebody does click. Does that make sense?
Denise: That’s right. Absolutely. I would argue that education is one aspect of protecting identity and access management.
Anthony: Very good, Denise. We’re almost out of time, so I want to give you an opportunity for final thoughts, and I’ll couch it this way: anything more you want folks to know about Health ISAC? I know you and your organization are very well known and leveraged by a lot of folks, but there may be some who are new or just aren’t aware of what you offer and why they might want to become involved. Your thoughts around that, or, overall, any advice to the CISOs at health systems?
Denise: I would be remiss in not saying, because I do consider myself an evangelist for information sharing, that we should be sharing information with each other, and shame on us for not doing it. There are various reasons why people don’t do it, but it’s so easy to do, and so beneficial. We have so many examples of where information sharing has protected organizations and the community from harm.
I really, really encourage organizations to think about sharing information with each other, with an organization like Health ISAC, with law enforcement, with whoever you think it would be beneficial to share with. It doesn’t cost anything. It’s easy. Many people get hung up on the type of information that’s shared, but to be effective, it’s not as sensitive as people think it might be. Indicators of compromise can be very generic.
I really encourage people to share with each other, especially when they’re seeing incidents within their own organizations, and to see how they can benefit from that. We have this mantra: one person’s defense will become everyone else’s offense. That’s really the essence of Health ISAC, the community helping the community so that we can deliver effective patient care. Because we’ll all be patients one day, or know a patient, and we want to make sure they get treated.
Anthony: I think we’ll definitely all be patients one day.
Denise: Yes, yes.
Anthony: But I did interview John Riggi from the American Hospital Association who I’m sure you know, and he mentioned that there are certain legal protections when you share information.
Denise: There are.
Anthony: As you said, when it happens to you, sharing may not be the first thing on your mind, but you would want everyone else to share, right? Because you would want to know, so you could help.
Anthony: If we could figure out a way to do it and have those conversations beforehand, figure it all out, we could be more comfortable talking to legal and making sure privacy is on board, because sharing will absolutely help.
Denise: Exactly. That’s right. That’s part of the incident response plan. I would love to see that as a checkbox on the list: share information with Health ISAC, or whatever ISAC serves your sector, or whoever it is you want to share with. That would be one step in the process. Because, to your point, when it happens to a partner you’re connected to and they’re not sharing information with you, that doesn’t help anybody. So the more we can prepare and are willing to figure out ways to share with each other before an incident happens, the better off everyone will be.
Anthony: I love your point that it should be on the checklist. I absolutely love it, and I think it should be on every CISO’s checklist as they talk to emergency management and work through these things. It may not be the first thing you do, since you’re dealing with the incident, but when and how you can share for the benefit of other health systems absolutely should be on that checklist. Great point, great idea. Denise, a pleasure. Thank you so much for talking today.
Denise: Thank you, Anthony, for having me.