Published October 2019
Industry Veterans at Healthlink Advisors Break Down Epic’s Toolset and Offer Advice on Making Data Meaningful
Combining data from disparate sources into something that’s meaningful isn’t easy. But that’s the challenge health systems face today as they shift from making sick people healthy to keeping healthy people out of the hospital. On the positive side of the ledger, there are many robust tools that can take data and turn it into reports. However, on the negative side is the fact that without sound data governance (garbage in) those reports won’t be worth the paper they’re printed on (garbage out). In this illuminating interview, Healthlink Advisors Tina Burbine and Mark Pasquale break down the Epic analytics and population health environment and offer advice on how IT departments can ensure they are not just working on interesting things, but the right things.
Podcast: Play in new window | Download (Duration: 29:22 — 7.2MB)
Guerra: Hi Mark and Tina, thanks so much for joining me today. I’m looking forward to having a nice chat about Epic customers and their work around population health and risk management analytics. Before we get started, do you want to talk a little bit about your organization and your roles there?
Pasquale: Sure and good morning. It’s a pleasure to have the opportunity to meet with you today. My name is Mark Pasquale, and I’m a vice president with Healthlink Advisors. I joined HLA earlier this year. As far as my background, I have had the privilege of serving as a chief information officer for two different health systems. I’ve been in information technology my entire career. At Healthlink Advisors, I’m responsible for working with our teams to support thought leadership and the overall quality of product deliverables.
Burbine: Thank you, Anthony. I appreciate being here with you today. I’m the director of analytics and I’ve got a background in engineering and am Hadoop certified. I’m always happy to be talking about analytics so I appreciate having this conversation.
Regarding Healthlink Advisors, we are an independent advisory firm and we focus on serving the CIO and other operational leaders in healthcare organizations.
Guerra: So today we’re going to talk about some of the trends in the marketplace that you’ve identified from the requests customers are making of your firm. These requests are specifically around Epic customers and their work on population health and risk management analytics. So my first question is what are the business needs underlying these technology-oriented requests?
Burbine: As health systems are continuing to expand their population health programs, they have this need to integrate patient-reported data, public data like social determinants, device streams and more with their existing EHR data, but as I rattle through different data types, I have to note that they are very different in nature, and so the ability to collect those and use them in a combined fashion is challenging for a lot of health organizations.
Something we hear a lot about are business stakeholders who, when trying to describe their organization’s analytics, claim it’s missing information or data, and so they end up spending a lot of time trying to extract the information they’re after and combine it for their own use on their desktops. And when individuals are doing that, it’s extremely time consuming, not to mention risky. A great example we hear all the time is, ‘We would love the ability to combine our cost data with our clinical information so we can have some interesting analytics around things like joint replacement, and make adjustments as needed.’
Another example is the growth that health systems are going through around their at-risk contracts. They want to continue to expand but, in order to do that, health systems need visibility into how to determine and negotiate acceptable levels of financial risk. Those types of things require contract modeling and the ability to monitor their financial performance overall as well.
And these examples all have one thing in common — which is using data to inform their decisions. These health systems are really excited to leverage new technologies that they’re hearing about in the industry like machine learning and artificial intelligence to help them accomplish this, but it’s complicated, and it’s really difficult to understand how to begin doing that.
Guerra: Are the requests that you get mostly coming from IT or business leaders?
Pasquale: As IT works with the organization to address the initiatives they have with clinical quality, patient safety, as well as operational management initiatives, the requests for leveraging the information contained in the EMR to help support those initiatives come from both sides.
Guerra: Do you want to tell me some more about Epic’s analytics environment?
Burbine: So within the Epic environment, there are three different data stores — there’s Chronicles, there’s Clarity and then there’s Caboodle — and there are different front-end reporting tools within the Epic ecosystem that enable teams to extract information, support dashboards, etc., from all of those. Caboodle is Epic’s cloud-based analytics environment that incorporates their predictive analytics models, and it can also be used to collect and store non-EHR data.
Epic has put a lot of focus into maturing this analytics platform offering over the past few years, and the industry is beginning to recognize that. What’s interesting is that understanding how to best use some of these predictive analytics models begins by understanding the strategic goals of an organization and staying deliberately focused on the measures that are aligned to those objectives. What I mean is that there are a lot of really fun and cool things that a data scientist or senior analyst can do in the Caboodle environment, but aligning their work to the measures and metrics that are meaningful for the strategic objectives is key, so that they are working on the right thing at the right time and creating meaningful analytics, from an enterprise-wide mindset, that serve the business in the best possible way.
Guerra: First off, what are some of the other Epic tools in this area and, secondly, can you give me more detail on what clients are asking you for help with?
Burbine: Sure. Let’s start with the Epic product names because they definitely are unique. Starting with Healthy Planet, that’s Epic’s population health tool and represents everything they incorporate for a holistic approach to care management, while their analytics platform is called Cogito — which, by the way, everyone seems to have a different pronunciation of, even within Epic — so we always enjoy listening to how everybody approaches that.
In the landscape of the analytics marketplace, Cogito has actually made more progress in capabilities, with the expansion of its roadmap over the last couple of years, than any other analytics platform. And what that means is that Epic has really expanded into leveraging cloud-based technologies such as Microsoft’s Azure platform.
I wanted to take a minute and mention it because — to our points earlier about the complexities around bringing different data sets together for the use of the business — it’s important for a health organization to be able to use the technology that they have and integrate it into their patient care. So in the example of Epic, Cogito and Healthy Planet are tools within the Epic ecosystem that enable cognitive machine learning algorithms to be leveraged accordingly within the patient coordination workflows. For instance, a patient with opioid abuse risk indicators is identified in Cogito and integrated into care management through Healthy Planet.
Another good example is AMI risk, with the Cogito algorithm identifying AMI risk for a patient in the ED that will appear in the chart view, so a clinician can easily see it, and it will also appear in a care coordinator’s workflow from Healthy Planet.
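[Editor's note] To make the risk-to-workflow handoff concrete, here is a minimal, purely illustrative sketch of the kind of threshold-based routing described above: a model score crossing a cutoff adds a patient to a care-coordination queue. The threshold, scores, and class names are hypothetical and this is not Epic's actual Cogito or Healthy Planet implementation.

```python
from dataclasses import dataclass

# Hypothetical cutoff; real models publish their own calibrated thresholds.
AMI_RISK_THRESHOLD = 0.7

@dataclass
class Patient:
    patient_id: str
    ami_risk_score: float  # output of some predictive model, 0.0 to 1.0

def care_management_queue(patients):
    """Route patients whose model score crosses the threshold to care coordination."""
    return [p.patient_id for p in patients if p.ami_risk_score >= AMI_RISK_THRESHOLD]

patients = [Patient("A1", 0.82), Patient("B2", 0.35), Patient("C3", 0.71)]
print(care_management_queue(patients))  # ['A1', 'C3']
```

In a production system the queue would feed a care coordinator's worklist and the same flag would surface in the clinician's chart view, as described above.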
Guerra: Are these tools challenging to use?
Burbine: Integrating technologies like machine learning and artificial intelligence is becoming more commonplace as organizations produce the types of analytics they require. And as they continue to mature their own data processes, they also have to ensure that the technology aligns to support them in accomplishing that. Determining how to integrate these types of advanced analytics capabilities is complicated for a health system: it’s hard to know where to begin and how to prioritize.
Pasquale: Just to add on to that, as Tina was mentioning, the toolsets continue to mature at a rapid rate. For example, Cogito has expanded tremendously over the last couple of years. They leverage Microsoft Azure as the cloud-based platform, but they’ve also opened it more, so if you wanted to develop your own algorithms, if you have your own internal data science or analytics development group, then you write in Python or you write in R or you write in SAS. You can actually integrate these data science technologies, these ML technologies, within Epic’s Cogito platform against the data. So they do have it broken into those different categories.
So if you want to leverage Internet of Things, if you want to leverage FHIR APIs for bringing in data from external sources, if you want to leverage standard CCDs — which were initially developed as part of Meaningful Use interoperability capabilities — you can do that. So there’s a variety of ways Cogito can acquire information. And then you organize it through their cognitive ML layer, which runs on the cloud-based Azure platform.
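[Editor's note] As a rough illustration of the data-acquisition step, the sketch below flattens a FHIR R4 Patient resource (field names follow the public FHIR Patient specification) into a single warehouse-style row. The sample resource and the `flatten_patient` helper are hypothetical; this is not Epic's ingestion code.

```python
import json

# Hypothetical FHIR R4 Patient resource, as might be returned by a FHIR API.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example-123",
  "name": [{"family": "Smith", "given": ["Jane"]}],
  "gender": "female",
  "birthDate": "1968-04-12"
}
"""

def flatten_patient(resource: dict) -> dict:
    """Flatten a FHIR Patient resource into one warehouse-style row."""
    name = resource.get("name", [{}])[0]
    return {
        "patient_id": resource.get("id"),
        "family_name": name.get("family"),
        "given_name": " ".join(name.get("given", [])),
        "gender": resource.get("gender"),
        "birth_date": resource.get("birthDate"),
    }

row = flatten_patient(json.loads(patient_json))
print(row["patient_id"])  # example-123
```

Real pipelines would add terminology mapping, deduplication, and provenance tracking before any such row lands in an analytics store.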
As Tina mentioned, there’s Caboodle, which is their enterprise data-management warehouse for bringing in claims data and other sources, where all the data that has been acquired is organized. And of course, you’ve got the Chronicles layer, which is the actual Caché core database. So that’s how they’ve organized it: through data acquisition, through data organization, through data governance — making sure, on the governance side, that you’re leveraging your catalog and your different algorithms — and then through different visualization tools.
Epic has several tools available already, like SlicerDicer; Radar, which is the real-time dashboard with metric alerts; and iOS dashboards called Canto. So that’s how they’ve organized it, and it continues to mature and grow, as it has significantly over the last couple of years. They continue to expand the capabilities and to make the architecture more open.
Burbine: I think it’s a challenge for organizations to understand — based on the different reporting tool types that are available within the Epic ecosystem — how to obtain data in a very usable way and where to go for the information that’s meaningful to them.
Guerra: I’m sure it’s very client specific but is there any general advice you give to those asking for your help in this area?
Burbine: I’ll start with the financial performance management of at-risk contracts, since we mentioned that a few minutes ago. This is an area that the EHR vendors currently do not do well, and so we always share with clients that it’s really important to evaluate what a health system’s business criteria are to determine if a niche tuck-in tool to support that financial performance management capability is going to be beneficial and complement the existing EHR.
So it really comes back to understanding what the strategic goals of an organization are and what that means for the business requirements that align to any possible niche tools that could be tucked into, and complement, their existing EHR system.
Pasquale: It’s also very important for health systems to understand the toolsets they’ve already invested in. There are often opportunities to leverage technology that’s already internal to the organization and potentially has not been maximized in its capabilities. For example, when you go live on Cogito, Epic provides a core set of algorithms you can leverage immediately. They have a library of over 40 different ML algorithms, and you can select the ones that are most important for what you’re trying to drive from a clinical quality and patient safety perspective, and go live on that subset.
You may want to leverage sepsis, for example, which a lot of organizations really focus on. So that is part of the Cogito ML algorithm library. You can leverage those and then you want to look to see where your program wants to go and what current tools and technologies can do to help you get there.
Burbine: There’s another factor as well; Mark alluded to it briefly when he used the term data governance. This is an important element that has to be factored in anytime we’re talking about analytic toolsets or platforms, because in order to produce meaningful analytics, there must be high-quality data in the source systems to begin with. So what’s critical overall is that effective analytic programs require a health system to create a data movement and shift the culture of their organization so that everyone on the team is treating data as an enterprise asset.
We like to share with our clients that there are three foundational analytic truths. One — analytics has to be a stated priority. Two — the organization itself has to be clear on data sources, data ownership and the intended use of that information. And three — that end-users trust the data and are engaged in helping to maintain its authenticity. So this data discipline that’s required to support these analytic truths is not going to be provided from an analytics platform solution.
Guerra: Right and the trust comes from the discipline, right? One flows from the other?
Burbine: Exactly, that’s exactly right.
Guerra: Regarding the Epic products that we went over, are there any scenarios that you’ve seen where the best product for an Epic customer is not the Epic product?
Burbine: What’s interesting is there’s such a wide variety of analytics platforms on the market that it’s important to understand the differences between them because the way a platform integrates and processes your data has a profound effect on whether or not it’s going to serve a health organization well. And every vendor has their own unique blend of architecture and integration capabilities along with areas that they’re still maturing. Epic is no exception to this.
One example is IBM — they acquired Phytel and Explorys and rebranded them as Watson Health, but that doesn’t necessarily mean that the databases behind those two systems have been integrated or can share data yet. And so while it might be on a product roadmap to do that in the future, it’s important to understand what’s available now and how that’s going to impact the health system’s workflows based on the types of things that are deemed a priority for that organization.
So it really comes back to understanding what are your clear strategic priorities at the executive level and what does that map to in terms of the types of metrics and, of course, the data domains that are needed to support those. And then comparing that to the capabilities of the current infrastructure and toolsets that an organization has in place, bearing in mind the type of functionality that Cogito and Epic can support, and then determining if any niche or tuck-in products might be needed as well.
Guerra: So organizations have a number of options — use the Epic products, not use them, or use them in combination with others, correct?
Burbine: I think what’s really important to keep in mind when we think about the Epic ecosystem and the functionality it’s serving to clients is that there are always going to be other data sources that are just as important as the information that lives in the EHR itself. Those could be anything from blood bank data, ACO claims and eligibility information, data from Press Ganey, scheduling information possibly from tools like Clairvia, and other sources as well — there’s Lawson, which accounts for some HR, accounting and supply chain data, and Kronos, for instance, for timekeeping information. And so it’s very important to evaluate and understand that driving this information into an analytics environment — whether that’s Epic’s Cogito environment or another — is going to require an invested effort in aligning data discipline across the organization, in addition to the toolsets themselves.
And so the way that an organization can accomplish that also depends on the type of staffing they’ve invested in. As Mark alluded to earlier, some organizations have the ability to have data scientists on staff to help support the type of algorithm development that’s needed to leverage some of the models within Cogito; and others require some professional services support in order to accomplish that. So those are some of the factors that need to be considered when outlining an analytic strategy that represents how the toolsets are going to come together to support the data that’s needed to drive better patient outcomes.
Guerra: Any more thoughts around Epic’s product development timeline? Do you have any more information about that you could share?
Pasquale: Last year they switched from an annual release to more of a rapid application development approach where they’re coming out with quarterly releases and updates. They have a defined roadmap for both their Cogito analytics as well as Healthy Planet platform and its capabilities. On the acute care side, there are things you can already report out on like, as I mentioned earlier, early detection of sepsis, unplanned readmissions, hospital fall risk, there’s ICU mortality benchmarking, those capabilities are already there.
They’re coming out with additional capabilities in the near future like acute kidney injury on the acute care side. On the pop health side, they’ve got things like notification for hospitalization for heart failure, negative outcomes of type-two diabetes, risk of hypertension, and they’re expanding that to have things like end-of-life care index and some other core features within pop health.
On the operations side, they are expanding that platform as well. They currently have things like no-show appointments, remaining length of stay, current dashboard reporting. So as they come out with these quarterly releases, they are continually enhancing the platforms.
Guerra: I’ve heard people debating the concept of subscription versus licensing pricing. Do you have any thoughts on how people can evaluate what’s right for them?
Pasquale: I think looking at the overall total cost of ownership is very important. As Tina mentioned, it’s very important to look at what you want to accomplish from an overall analytics capability perspective and apply that to the toolsets that are available, and then really do a thorough total cost of ownership to understand which model would be best for your organization. Some vendors are pure subscription-based while others are perpetual license with annual maintenance, so it comes down to the analysis of what product is best for your organization and where you want to go, where you want to take your program and then doing a good solid total cost of ownership.
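[Editor's note] The total-cost-of-ownership comparison Mark describes can be sketched with simple arithmetic. The figures below are made up for illustration; a real analysis would also fold in implementation, hosting, and staffing costs.

```python
def total_cost_of_ownership(years: int, upfront: int, annual: int) -> int:
    """Simple TCO: a one-time cost plus a recurring annual cost over a horizon."""
    return upfront + annual * years

# Illustrative figures: a perpetual license with 20% annual maintenance
# versus a pure subscription, compared over a 5-year horizon.
perpetual = total_cost_of_ownership(years=5, upfront=500_000, annual=100_000)
subscription = total_cost_of_ownership(years=5, upfront=0, annual=180_000)

print(perpetual)     # 1000000
print(subscription)  # 900000
```

On these made-up numbers the subscription wins over five years, but the crossover point shifts with the horizon, which is why the analysis has to be run against an organization's own roadmap.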
Guerra: Are there any other points you wanted to touch on?
Burbine: Oftentimes we talk about health organizations feeling that in order to produce better analytics, a newer, better platform is going to solve their challenge, but something that we talk a lot about with our clients is that toolsets don’t address an organization’s data maturity, so it’s really important to ensure that there are good data habits in place to align with the types of analytics an organization needs — data maturity is just not something that can be bought.
Guerra: Mark?
Pasquale: I agree. I also think it’s very, very important for an organization to thoroughly evaluate the tools and technologies they have currently invested in and align that to their overall organizational needs.
Guerra: It really sounds like what you’re seeing is people coming in and wanting tools to fix the problem but you have to direct them back to data governance. You can’t do much with bad data, right? I mean you can make it look pretty, you can print great reports but if it’s not good data to begin with, those reports are not worth much. So you have to redirect them back to sound data governance as a foundational step. Would that be correct Tina?
Burbine: Exactly, and I think what’s becoming commonplace is that these platform vendors understand the importance of data governance and will oftentimes market that their tool will take care of the data governance workflow. Some of them offer things like badges or approval status where somebody who is a named data steward can click that it’s been approved and, magically, that data is then of high quality, or should be considered high quality, for business use. But that functionality does not replace the workflow and the good data habits that must be followed in order to enable somebody to click that box that says, ‘this data is of high quality,’ and to have it mean something. I think that’s a really hard thing for health systems to understand because the workflow and the data discipline, and that data movement that represents a cultural shift, is not something that’s tangible like a toolset.
Guerra: It’s really fascinating because you could imagine everyone would love to have a tool to fix a thorny problem — so the business wants good data, they want to be able to just have IT get a tool that solves their problem but, as you said, that’s not the case. A data governance tool, a data analytics tool is not going to fix your problem. You can’t get away from that painful, hard, long, difficult work of data governance, which has to do with committees getting end users involved, clinicians, meetings, all kinds of things that are very difficult — all that human-to-human stuff, that’s the hard stuff and that’s what data governance is all about. Tina, is that correct?
Burbine: Yes, that is correct. We liken the coaching and the amount of effort that a data movement requires to happen successfully to the cultural shift that organizations experienced in the late ’90s, when everybody was focused on patient safety. All of a sudden, it became commonplace that everybody, regardless of what role they served in, had some ownership and an expectation that they were going to be involved in supporting patient safety and minimizing fall risk and all of the different things that go along with that, and there was a very concentrated effort around standardizing workflows to help make that happen.
All of a sudden, nurses were having input into, and communicating about, the things they were noticing around a patient that would help identify a fall risk, for instance. Different tools and alerts were put into place to help support that. For instance, alerts within a pharmacy, upon an opioid prescription being filled, would automatically notify a care team that somebody, an inpatient, was now considered a fall risk, so that bed rails could be put into place to help support that patient better. All of that reflects a concentrated effort by everybody, from the executive level throughout the organization, with ongoing coaching and mentoring over a lengthy period of time to make sure that everybody understood what their role was. That is a perfect analogy for the amount of effort and coaching that’s required to support an effective data movement when implementing data governance.
Guerra: Alright Tina and Mark, well I think that is fantastic, great information for our audience and I appreciate it. I want to thank you both so much for your time today.
Burbine: Thank you for having us.
Pasquale: Thank you.