Are we there yet?
It’s one of the most commonly asked questions among those seeking to reach a destination. In healthcare, however, that elusive destination isn’t a physical place, but rather the ability to provide actionable intelligence at the point of care.
And leaders, both from vendor organizations and health systems, are the ones being asked: are we there yet? When will clinical decision support become a reality? When will we achieve the utopian vision of enabling providers to access all of a patient’s relevant data at the right time and compare it against data from millions of similar patients to obtain helpful information?
Perhaps it’s less about when, and more about how, according to Steven Lane, MD, Clinical Informatics Director for Privacy, Information Security and Interoperability at Sutter Health. “We’re going to get there progressively,” he said during a recent panel discussion. However, there are multiple layers and foundational challenges that must be dealt with first. “We need to do a better job visualizing and presenting the most important data at the critical point in the workflow,” he noted.
During the webinar, Lane spoke with co-panelists Kirk Hanson (Director of Enterprise Data Governance, Geisinger Health System), and Todd Crosslin (Head of Healthcare and Life Sciences, Snowflake) about the roadblocks standing in the way, and how they can be navigated.
Quality improvement
It all starts with data quality – or more accurately, lack thereof. “There’s no shortage of data that could be better,” said Hanson, who believes the problem can often be traced back to poorly functioning operational processes.
At Geisinger, a 13-hospital system located in central Pennsylvania, his team is working to assess and improve those processes, focusing primarily on mastering the member, patient, and provider domains — data that is shared across enterprise transactional systems, and therefore plays a critical role in business operations. “Getting that right not only improves the quality of those individual platforms; it enables us to be able to join the different data sources and be able to clearly identify patients,” Hanson added. As Geisinger’s data mastering program has matured, it has become easier for users to move across data sources and “take advantage of the richness that comes with being able to combine those sources to gain insights.”
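To make the idea concrete, here is a minimal sketch of what joining sources on a mastered identifier can look like; the table names, columns, and crosswalk below are illustrative assumptions, not Geisinger’s actual implementation.

```python
import pandas as pd

# Hypothetical extracts from two transactional systems, each with its own local patient ID.
ehr_encounters = pd.DataFrame({
    "ehr_patient_id": ["E100", "E101"],
    "encounter_dx":   ["I10", "E11.9"],
})
claims = pd.DataFrame({
    "claim_member_id": ["M555", "M777"],
    "paid_amount":     [1250.00, 310.50],
})

# A master patient index (MPI) crosswalk maps each source-specific ID
# to a single enterprise identifier -- the "mastered" patient domain.
mpi_crosswalk = pd.DataFrame({
    "enterprise_id": ["P1", "P1", "P2", "P2"],
    "source_system": ["ehr", "claims", "ehr", "claims"],
    "source_id":     ["E100", "M555", "E101", "M777"],
})

# Resolve each source's local ID to the enterprise ID, then join the sources.
ehr_mastered = ehr_encounters.merge(
    mpi_crosswalk[mpi_crosswalk.source_system == "ehr"],
    left_on="ehr_patient_id", right_on="source_id",
)
claims_mastered = claims.merge(
    mpi_crosswalk[mpi_crosswalk.source_system == "claims"],
    left_on="claim_member_id", right_on="source_id",
)
unified = ehr_mastered.merge(claims_mastered, on="enterprise_id", suffixes=("_ehr", "_claims"))
print(unified[["enterprise_id", "encounter_dx", "paid_amount"]])
```

Once every source resolves to the same enterprise identifier, combining clinical and claims views of a patient becomes a straightforward join rather than a bespoke matching exercise.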
Another key component of improving data quality is providing, and responding to, feedback. It may seem overly simplistic, but the reality is that individuals who input poor-quality data usually don’t realize they’re doing it, which makes it very difficult to correct mistakes, said Crosslin. He believes implementing proper channels for giving and receiving input is “massively important in order to solve this problem.” He added, “Bringing large amounts of quality data to bear, and allowing people to have access to it to analyze its quality, is a big part of that.”
Further complicating matters is the lack of standardization as to how data is entered, whether it’s writing prescriptions or entering medical diagnoses, said Lane, whose team has focused a great deal of time and energy on enterprise data governance. “We’re looking at the key pieces of data that are used to run the organization and assuring that they are as standardized as possible.”
Not an easy feat, particularly at Sutter, where patients routinely seek care at outside facilities. “The challenge is to manage the quality of the data you generate internally, and then integrate and normalize it with data that you received from other sources,” he noted. “You’re beholden upon the quality of that data. How do you normalize that? How do you de-duplicate that?”
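As a rough illustration of the normalization and de-duplication problem Lane describes, the sketch below standardizes a few demographic fields and collapses records that match on them; the fields, sample data, and matching rule are simplified assumptions, not Sutter’s actual logic.

```python
import pandas as pd

# Hypothetical records received from internal and external sources for the same patients.
records = pd.DataFrame({
    "source":     ["internal", "external_a", "external_b"],
    "last_name":  ["Smith", "SMITH ", "Smith"],
    "first_name": ["Anna", "anna", "Anne"],
    "dob":        ["1980-03-02", "03/02/1980", "1980-03-02"],
})

# Normalize: trim whitespace, standardize case, and coerce dates to one representation.
records["last_name"]  = records["last_name"].str.strip().str.upper()
records["first_name"] = records["first_name"].str.strip().str.upper()
records["dob"]        = records["dob"].apply(lambda d: pd.to_datetime(d).date())

# De-duplicate on an exact match of the normalized demographic key.
# Real-world matching typically uses probabilistic or fuzzy rules rather than exact equality.
deduped = records.drop_duplicates(subset=["last_name", "first_name", "dob"])
print(deduped)
```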
The ultimate goal, of course, is to utilize the data being generated by different sources and bring it together to create a unified view of the patient – including clinical, financial and social data – to aid in decision-making.
Trust issues
And although Lane believes the industry is reaching a tipping point with interoperability, there are still hesitancies in dealing with external data. “One of our biggest challenges is finding opportunities to integrate external data automatically, whether it’s taking claims data and using that to inform the clinical process, or taking clinical data from an outside system and integrating that into analytics to do predictive modeling,” he said.
In his experience, which includes nearly 30 years with Sutter, Lane has found that the problem isn’t accessing or exchanging data from outside the system; it’s getting that data automatically integrated and de-duplicated.
While there are many reasons for this, one of the most significant is distrust of the data, something Hanson believes is deeply rooted in the culture of many organizations.
“There’s an automatic distrust of data that comes from another system,” he said, most of which comes from having a lack of visibility into the processes used by other organizations. “The folks associated with any given system aren’t naive enough to think their data is perfect, but they understand what the problems are, and they understand why the problems exist. So they’re a lot more comfortable with it. That, in my mind, is one of the greatest challenges.”
To the cloud
This is where cloud technologies can make an impact: by offering the scale and performance capabilities that enable organizations to bring in large quantities of data and experiment on it. “The ability to burst up to get what you need and then come back to zero” can be enormously helpful, Crosslin noted.
For Sutter, the appeal of the cloud is in leveraging tools like machine learning and artificial intelligence to identify outliers. Once they’ve been examined, leaders can figure out whether it’s bad data that needs to be cleaned up, or an important signal in the data that merits attention, Lane noted.
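A minimal sketch of the kind of outlier flagging Lane alludes to might use a robust z-score over a pooled lab-result column; the values, threshold, and single-column approach below are illustrative assumptions, and a production pipeline would rely on more sophisticated models.

```python
import numpy as np

# Hypothetical serum potassium results (mmol/L) pooled from several sources;
# 41.0 looks like a decimal-point entry error for 4.1.
values = np.array([4.1, 3.9, 4.4, 4.0, 41.0, 4.2, 3.8])

# Robust z-score based on the median and median absolute deviation (MAD),
# which are far less distorted by the outlier itself than mean and standard deviation.
median = np.median(values)
mad = np.median(np.abs(values - median))
robust_z = 0.6745 * (values - median) / mad

# Flag anything more than ~3.5 robust deviations from the median for human review:
# either bad data to clean up, or a real clinical signal that merits attention.
outliers = values[np.abs(robust_z) > 3.5]
print(outliers)  # -> [41.]
```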
Finally, it’s about flexibility and providing the opportunity to spin up environments quickly and spin them back down again. “The cloud promises to provide a degree of flexibility that I don’t think any of us have known, certainly those of us who have a very large and complex IT infrastructure,” he said.
But, like anything else, it’s a process – and one that can’t be entered into lightly. “There are layers we need to deal with. And so, while I’m excited about big data and the role of the cloud in supporting that, I think we have so many foundational challenges in documentation and leveraging data,” Lane added.
To view the archive of this webinar – Turning Disparate Streams of Data Into Actionable Intelligence at the Point of Care (Sponsored by Snowflake) – please click here.