Email is the lifeblood of any organization, with thousands of messages coming in every day. It's also the number one attack vector. Unfortunately, even the best filtering tools miss between 7 and 10 percent of the malicious email that CISOs would love to see caught, which puts the onus on employees to catch the rest. It's for this reason, and others, that Rebecca Kennis, CISO at Arnot Health, is laser focused on creating a culture of cyber where security is everyone's job and nobody thinks of it as unnecessary interference in their work. In this interview with healthsystemCIO Founder & Editor-in-Chief Anthony Guerra, Kennis discusses these issues and many others.
Bold Statements
The first thing that I do when I’m talking with a new set of employees, when we’re introducing this to any group at all, is really to make sure that they’re understanding the why. Because if they don’t understand the why, they’re never going to hear the what.
These things need to be presented to them not just once a year when it's time to do the annual required trainings; it needs to happen throughout the year. You can't just take a once-and-done approach to security and expect anybody to retain it, or expect that to build any culture of security in the organization.
I always make sure I tell new employees that come in, if you click on something, if something happens, and you’re like, ‘Oh geez, I shouldn’t have done that,’ let us know right away. You will not get in trouble for doing that if you let us know right away. If you try to pretend like it didn’t happen, we can’t promise you will not get in trouble for that.
Anthony: Welcome to healthsystemCIO's interview with Rebecca Kennis, CISO with Arnot Health. I'm Anthony Guerra, founder and editor-in-chief. Rebecca, thanks for joining me.
Rebecca: Thanks for having me.
Anthony: Very good. Let’s start off – tell me a little bit about your organization and your role.
Rebecca: Arnot Health is a small to medium-sized healthcare organization located near the southern tier of New York and the very northern tier of Pennsylvania. We have three small hospitals. We have a skilled nursing facility. We have some graduate medical education and around 50 or so primary and specialty care practices throughout the region.
Anthony: I’m not too familiar with that specific area. Would you describe it as rural or not quite?
Rebecca: Parts of it are very rural, yes. I guess you could definitely classify it as rural.
Anthony: Very good. There are some issues we'll touch on that are specific to smaller rural facilities, which have had significant challenges around cyber that we've heard about recently. We'll talk a little bit about that.
But I want to start with an open-ended question and just ask you what's on your mind, what are the main things you're working on, looking at, thinking about, that type of thing, and we'll go from there.
Rebecca: The biggest thing that I have on my mind right now is building up our culture of security at Arnot. Any healthcare organization has all of the patient care issues they've got going on; they're trying to maintain the proper nursing ratios, and they've got all kinds of issues with all of that. The last thing that they necessarily have on their minds is how they're going to protect the data that they have.
It's my job and my team's job to make sure that's at the forefront of their minds. The way that we're looking to do that is to really build up the culture of security, make sure that it becomes part of the culture itself of the organization and not just a box they need to check or some regulation we need to meet.
It really needs to be all hands on deck with security, particularly of late with all of the ransomware attacks that are happening. The staff are really the front line of the organization, and no matter how many technical security controls we put in place, it only takes one person clicking on the wrong thing and we're in trouble. We're doing a lot of work on trying to build up that culture of security.
Anthony: Absolutely. They don’t want to practice medicine without their digital tools, just like carpenters don’t want to build houses without their tools. You have to tie it to patient safety, right?
Rebecca: Yes, absolutely. The first thing that I do when I’m talking with a new set of employees, when we’re introducing this to any group at all, is really to make sure that they’re understanding the why. Because if they don’t understand the why, they’re never going to hear the what. With security, of course, there’s the CIA triad – you have the confidentiality, the integrity, and the availability of the data.
Typically, when you're thinking of security, you're thinking about trying to protect the confidentiality of the data. But if you don't have a good sense of the integrity of the data and the availability of it, absolutely you're going to have patient issues, medical errors, all kinds of other things that could happen. That's really the first place I start: making sure that they're connecting those dots to why this is so important and why we need to be putting in some of these controls that maybe they're not going to like. Of course, we would much rather have it so that everybody can just go in and do whatever they want, but that's not realistic in the environment we're in.
We need to try to find that happy place where they can still do their jobs, where it's not going to prevent them from treating their patients and treating them quickly when they have to, but also making sure that everybody else can't get in there and have access to the data and either steal the data or corrupt it or otherwise make it unavailable. There's a dance that you have to do.
Anthony: They have to understand that it’s part of practicing medicine, just like putting on a seat belt is part of driving.
Rebecca: Right. Yes, absolutely. There are ways that we as security professionals can make that transition easier. You try to reduce the friction in their work. Obviously, you try to make sure that they understand it. One thing I try very hard at is just not using the word no. You really want to say, 'Okay, let's look at that. Let's talk about this,' and usually you're going to end up with a way that they can still accomplish what they want to accomplish but maybe it's not as risky as what they initially had proposed.
You need to make those numbers-based decisions. You look at what the end goal is, what you want to get to and how much risk the organization is willing to take in doing that, and you're going to find that place in between the extremes. Because usually, neither one of those extremes is the best choice. Of course, there are exceptions to some things, but usually you're not going to go to either one of those extremes.
Anthony: And it’s going to be a business owner making that decision, right? The CEO or some governance board. It’s not you, correct?
Rebecca: Exactly, yes. I mean, ultimately, it's their business decision. I make my recommendations based on the scenario, based on the risks that are presented to the organization by doing this, and then it becomes up to the business owner to say, 'Yes, I'm willing to accept this risk and here's why,' or, 'No, I guess I'm not willing to accept this risk and we're going to find a different solution for it.'
It’s the same thing with IT in general too. Security and IT really cannot be driving the business on this. The business needs to be driving it and then we support them in what it is that they want and need to do and try to find the best ways to get it done.
Anthony: Sometimes, I would imagine, the business owner in question shouldn't be the decision maker; it has to be someone above them, someone higher?
Rebecca: Oh, absolutely.
Anthony: Tell me about that.
Rebecca: Yes, absolutely. You’re always going to have the case where somebody is going to be presenting a scenario to you and you say, ‘these are the risks,’ and if it’s something that you feel strongly is really a big risk to the organization, it’s going to need to be escalated to somebody that is higher in the food chain, either a compliance committee or some other governance committee or, depending on the issue, directly up to the CFO, the CEO, somebody that is really ‘the buck stops with them’ as far as those types of decisions.
Anthony: You mentioned training new hires on cyber – is that easier than it was in the past or is there still a big learning curve?
Rebecca: Well, it is a big learning curve, but I think it is getting better, and it's all in how it's presented to them. If they see, if they hear examples, if they get to understand that 'why,' it gets back to the why. It also helps if they can draw that parallel to their own lives: say, you would not want your bank account, your money, wide open for anybody to have access to it. You make those parallels.
When you're in your private life and your bank is requiring multi-factor authentication or some other thing, you're not fighting against the bank. You just do it, and even though some of those steps feel like a nuisance, you just know that's your money and you want to protect it. It's really no different here, although it's actually a little bit stronger of a reason. These patients are coming to us and they are trusting that we are going to keep their information confidential, that we're going to keep it safe and secure and that we are going to do everything that we can do to protect it, and them, ultimately.
It's our duty really to be able to do whatever we can to protect them. It becomes a little bit more like you see some a-ha moments, to steal from Oprah, I guess. When you start to explain some of these things to them and how certain things present risks and could potentially expose a patient to harm, you see those light bulbs go off in their heads like, 'Oh, I never really thought about it like that.' Healthcare workers, ultimately, went into healthcare because they want to help. They want to be helping people.
Take something as simple as a staff member walking into a secure area and holding the door open for somebody coming behind them with an armful of stuff. They don't really think that maybe the person with the armful of stuff shouldn't be in there in the first place, and is using it as a way to get somebody to be the nice guy and let them in. I hate to say this, but they have to have a little bit of suspicion any time they're in those scenarios and think twice about it. Think before you click on any emails. Think before you're doing anything that is out of the ordinary.
These things need to be presented to them not just once a year when it's time to do the annual required trainings; it needs to happen throughout the year. You can't just take a once-and-done approach to security and expect anybody to retain it, or expect that to build any culture of security in the organization. It needs to be happening throughout the year, whether it's through monthly short trainings, whether it's through newsletters that you're sending out, whether it's through some other types of engagement that you have with the staff.
It needs to be something ongoing to keep it top of mind for them, and then it just becomes part of the culture. It's part of what they do. It's like putting your seatbelt on when you get in the car just because that's your habit now. You want them to not have to think, 'oh, what was it that I learned about this?' You want it just to become second nature, and then the security team does not end up being someone you have to work around; they're someone you can work with and someone you can trust, and all of that.
Anthony: That's excellent. You want to make it second nature. I know the term is muscle memory; people use things like that. You had an interesting example of holding the door. I'm not sure if you were thinking of that in the sense of just an unauthorized person gaining access, or of them gaining access and then engaging with a computer. It's almost where the physical meets the cyber, right? Perhaps someone didn't log out; it gets into a rather complex scenario of someone physically gaining access and then gaining access to the computer, but I guess it could happen.
Rebecca: Oh, absolutely.
Anthony: You're concerned about that too as a CISO. You're not only concerned about log-ins and clicking on things and all that. It also bleeds over for you, and you tell me, into the physical realm.
Rebecca: Absolutely, right. With information security, there are three different types of controls. You have the technical controls that everyone is thinking about: you're putting in the firewalls and so on to protect the perimeter. There are also the administrative controls, meaning you have the policies, the procedures, you're training people, you're doing all of your risk assessments, everything in there.
Then there are the physical controls. You want to make sure that data isn't lying out on a desk somewhere where someone is going to see it, and that people who shouldn't have access to certain areas aren't getting into them, because they could see things, inadvertently or on purpose, that they shouldn't see. Physical access can also potentially give people access to the servers, or to any ports that might be open. That's part of physical security as well.
There are all kinds of aspects that, again, most people aren't thinking about when they think about information security; they're not thinking about these physical aspects. We have to work very closely with our public safety team to make sure that they're thinking of all that as well. They're thinking about trying to keep people safe from fire and theft and all of that, unruly patients and visitors. But there's this other piece as well. You want to make sure that there's not easy access to data that people shouldn't have access to.
Anthony: Two other things come to mind. You tell me if these are relevant to your world. One is when you see passwords on Post-it notes on people's monitors. Let me mention these two and then you can take them one at a time. Passwords on Post-it notes. And then, I wonder if every person has experienced this; I would guess so. You're a patient, or perhaps even someone like yourself walking around the health system or one of the hospitals, and you hear this talk. I've been in a patient room where I could hear caregivers at a nurse station or something talking about a patient, all kinds of things I probably should not be hearing. Do those enter the CISO's mind, those kinds of things?
Rebecca: Absolutely, they do. Certain things, like having the rooms right next to each other and hearing that. Again, you can't take everything to the extreme, because otherwise what you'd have to put in place to prevent that, to mitigate that, would get into the unreasonable. No healthcare system would be able to afford putting in all of the soundproofing and all of that.
What you have to do is put mitigating controls in place, so you tell people, you train them to understand: don't have these conversations out in public areas. Don't put your passwords on sticky notes; never do that. It still gets me when you see that. It's getting less common, but yes, that does still happen from time to time. That's why we're building up that culture of security. We're a relatively small team. We're not everywhere. When you build up that culture of security you then have eyes and ears all around, where the staff are comfortable enough to have a conversation and say, 'hey, you really shouldn't be having that password on a sticky note. That needs to be gone. That's not okay. I heard you having this conversation in an elevator about these people. You really shouldn't be doing that.'
You want to build it up where you’re having a little bit of those uncomfortable conversations but it becomes the norm. You want that to become part of the culture of the organization.
Anthony: One of the main things we mean when we talk about a culture of security is not clicking on suspicious links and not downloading attachments, right? I think that’s the number one attack vector.
Rebecca: Absolutely.
Anthony: Tell me about that, and if you want to, touch on the idea that you can't over-tighten that knob because the legitimate stuff has to get through.
Rebecca: Well, you can’t. I saw a statistic yesterday that said 7 to 10% of bad emails get through – there’s a 7 to 10% failure rate on any of those email filters, so you could have hundreds of phishing emails that are coming through an organization at a time. You really need to rely on the staff to be able to recognize what’s a phishing email.
I'll say with the ChatGPTs and those other tools that are out now, they're great tools that can do a lot of good, but they're also making it so much easier for the bad actors to create their phishing emails. They're making them much more realistic and removing all those tell-tale signs that you might have seen before, like the bad grammar. That was one of the first things you would always look for.
What you really need to do then is move to, 'okay, more generically, what am I looking for?' Well, first thing: is this an unexpected email? Were you expecting to get a message from the CEO saying, 'go and buy me some gift cards because I need to do this'? Is that an expected thing? Second thing: the urgency in it. There's always urgency in a phishing email because they don't want you to think about it too much. 'This is something that you need to do and you need to take care of it quickly. We need this information, click on this link, download this thing, fill this out, send this information back to us,' whatever it is.
Third thing: is there a link, or are they asking for any of that information? Any one of those in and of itself might be enough to make you say, 'okay, let's stop and think.' If you get all three of them, be highly suspicious of that email.
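Those three red flags lend themselves to a quick mental checklist. As a purely illustrative sketch, and not a tool Kennis describes or anything Arnot Health uses, the same checks could be written as a tiny scoring heuristic; the keyword lists and the idea of counting flags are assumptions invented for the example.

# Illustrative only: a toy "gut check" mirroring the three red flags above.
# The keyword lists below are made-up examples, not a real email filter.
URGENCY_WORDS = {"urgent", "immediately", "right away", "asap", "act now"}
REQUEST_WORDS = {"gift card", "wire transfer", "password", "click this link", "verify your account"}

def phishing_signals(subject: str, body: str, sender_expected: bool) -> int:
    """Count how many of the three red flags an email trips."""
    text = f"{subject} {body}".lower()
    signals = 0
    if not sender_expected:                              # 1. unexpected message
        signals += 1
    if any(word in text for word in URGENCY_WORDS):      # 2. artificial urgency
        signals += 1
    if any(word in text for word in REQUEST_WORDS):      # 3. link or sensitive request
        signals += 1
    return signals

# One flag means stop and think; all three means be highly suspicious.
score = phishing_signals(
    subject="URGENT request from the CEO",
    body="Buy gift cards and click this link right away.",
    sender_expected=False,
)
print(f"{score}/3 red flags")  # -> 3/3 red flags

Real filters are far more sophisticated, but the habit is the same: any single flag is a reason to slow down, and all three together warrant high suspicion.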
Anthony: Right.
Rebecca: You really need to take those and make sure that staff are comfortable with assessing them, and that they're taking the time to stop and read every email as if it could contain a suspicious link or be a phishing attempt.
Anthony: Yes, super important. They even said yesterday that there's one organization where suspicious email goes into a quarantine folder, and twice a day people get a report or an email that lists the messages in their quarantine. They have to review them. Apparently, people are sometimes releasing malicious emails from the quarantine folder by accident. We've quarantined it, we want you to take a special look at this stuff, and they still occasionally make mistakes. Does that shock you?
Rebecca: No. Anytime you're relying on humans, humans make errors. You want to have those technical controls in place to reduce the risks and reduce the number of times that humans need to make those critical decisions, but there are still going to be errors. You have to expect it; it's not if it ends up happening but when. You do all you can to prevent things and give people information but, ultimately, there's a lot of finger crossing and hoping that people recognize it, and if they recognize it, that they recognize it quickly and that they trust you enough to report it.
That’s important – I always make sure I tell new employees that come in, if you click on something, if something happens, and you’re like, ‘Oh geez, I shouldn’t have done that,’ let us know right away. You will not get in trouble for doing that if you let us know right away. If you try to pretend like it didn’t happen, we can’t promise you will not get in trouble for that.
Anthony: You're absolutely right. Because I've admitted that my reaction, if I clicked on something and got some ransomware screen, would probably be to close my computer, run to lunch and just hope that it all went away. But that's absolutely the wrong response.
Rebecca: Absolutely, right. Because the sooner we can react to what happened, the more we can minimize it and potentially prevent something worse. Yes, you definitely need to make sure, first, that they have the information and the education to try to prevent them from clicking on something. But if they do, you want to make sure that they have that trust in you, that they can reach out and say, 'hey, I made this mistake and I'm sorry I made this mistake,' so that we can then go and try to mitigate the outcome.
Anthony: You're so right, and that's got to be done ahead of time. That messaging that you're not going to get in trouble has to happen ahead of time, so that they don't have that panic moment that I mentioned and run to lunch thinking maybe it'll all go away. That's great, great messaging.
Rebecca: Again, that's all part of that culture of security. It's that full message: security is woven into all the business and operations of what we're doing, it shouldn't be an afterthought, you should trust us, we're your team members. You don't want to think of us as just the security people off in the basement who are watching what we're doing, and they're going to get us in trouble, and they're going to prevent us from doing our work. We want to make sure that this is integrated into the full culture of the organization, that we're not trying to make it difficult for them. We're trying to make it easier and to prevent issues down the road.
Anthony: 100%. I have to let you go soon. As a security professional, you have to think all the way through, from educating them not to click, to what happens when they do click, especially what happens if they don't tell us. How do we contain a breach, and then how do we continue operations if the worst happens? That's the job, right? Thinking all the way through that.
Rebecca: Absolutely, yes. There's the piece that the security team and IT need to own, but then the business continuity plan, that's operations. They need to be pulled in on that, and there needs to be a full organizational plan, particularly for something like ransomware. There are some things that would be contained to security and IT, but for anything beyond a localized issue that can be mitigated quickly and thoroughly, the full organization needs to have representation and involvement. Then, it needs to be practiced regularly so that it's not something you're seeing for the first time when it counts.
It's like a fire drill, where you want to make sure that everybody knows how to get out of the building and where you're meeting and all of that. It's the same thing with this. Everyone has to know: this is my role, this is your role, this is what we need to do in this order. And it needs to be written down, because when you're in those urgent scenarios, you don't want to rely on your memory, particularly when there are multiple people and multiple areas involved. You want to make sure that there's a written plan, that there are written instructions for how to do it, and that it's been practiced, at least once or twice, and practiced regularly.
Anthony: Excellent, Rebecca. We're about out of time. I just want to give you an opportunity for a final thought, the best piece of advice for someone in your position at a comparably sized health system. What's your best piece of advice?
Rebecca: My best piece of advice is really to work on that culture of security. It's something that anybody who talks to me is going to keep hearing over and over: build that rapport with your peers, with leadership, with staff; make sure that they feel comfortable coming to you, that they trust you, and that they're getting regular training and awareness throughout the year, not just once.
Anthony: Great stuff, Rebecca. That was wonderful. Thank you so much for your time today.
Rebecca: Thank you very much. This was great. I absolutely enjoyed it.