Dan Kinsella, Consultant, Bow Tie Consulting Solutions (Former CIO, Cadence Health System)
The gold rush driven by Meaningful Use (MU) incentives to implement certified EHRs has disrupted what was already a fragile, labor-intensive data reporting environment in most healthcare organizations. Even the early adopters who spent millions on enterprise data warehouse (EDW) solutions prior to MU are affected. Tens of millions of enterprise capital dollars have been deployed for infrastructure, applications, and, to a limited degree in most cases, operational alignment of new workflows. As the cost of increased staffing and depreciation of these assets hits the operating budget (hospital operating margins are mostly in the low single digits anyway), we’re seeing an “MU Hangover” permeating the industry: “Wow, that was a tough grind… I am exhausted… It sure seemed like a good idea when we started… Was there really a Bengal tiger involved? How am I going to explain this facial tattoo (margin hit) to the health system board?”
On top of the MU hangover, consider the market forces driving consolidation and merger integration. Most provider organizations have more than 200 different software products in their application portfolio — prior to any merger — each with its own view of the enterprise data model. Post-merger, the collection of assets that are yours, mine, and ours can become staggering. We are, as a result, drowning in legacy applications and data while starving for information, even as we look for the value from our substantial EHR investment. One consequence of this frustration is the proliferation of departmental or “point solutions” for analytic purposes.
In their 2007 book Competing on Analytics, Tom Davenport and Jeanne Harris offer a framework for the evolution of the discipline from descriptive to predictive to prescriptive analytics.
Enterprise value (competitive advantage) increases at the higher levels, but so does the complexity of the overall solution architecture. In my opinion, while the evolution from basic to advanced need not move gradually through each stage, there is a hierarchy of needs involved here that is relevant to our MU Hangover situation. In other words, it’s hard to rationalize deploying advanced predictive and prescriptive tools when we struggle so mightily with the fundamental, descriptive capabilities — how much do you spend to attain 12-decimal-point precision if you have only half of the data you need and the quality of the data you do have is suspect?
How Can This Be That Difficult?
Capital Shortage — whereas MU incentives for EHRs got big budgets, consulting support, and board attention, analytics on its own tends to be forever stuck in “year 3” of the long-range systems plan — not because it isn’t important, but because of the difficulty of making the case. Few organizations will fund an EDW today just to say they have one, and many of the enterprise-class EHR vendors include reporting and analytic tools. The constraint is often around governance and the capacity of the enterprise’s strategic and technical talent to develop a compelling business case.
Data Rendering — the process by which data from multiple transaction systems (see the note above on application portfolios) is made available in the repository where it will be analyzed. In the absence of interoperability, we have only a few tools in our box for this task, all of which are imperfect.
- Re-keying data is a time-tested solution, but it is both labor-intensive and error-prone.
- Application Programming Interface (API) — think of this as “the easy button”: you are in the workflow of system A and need to get something from system B, and context-sensitive integration can get you to B without logging in and out.
- Real-time interfaces (HL7) are also well established, but the number and variety of point-to-point transactions make this option costly and unwieldy from a support perspective.
- ETL (extract, transform, and load) — periodically sweeping all data of interest from each transaction system repository into the central repository (operational data store) for analysis. Normalization of data elements, even those captured by different instances of the same software vendor’s product, is laborious (see the sketch after this list).
- “Stealth” — there are a few customized solutions that “stealthily monitor” all transactions for triggers and then perform a prescribed task. Given time and money, you can make these do nearly anything you want, but the result is a somewhat fragile ecosystem that carries technical debt into the future, given the unique skill sets required and the effort of keeping up with changes.
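To make the ETL option concrete, here is a minimal sketch in Python of the normalization work involved. The system names, field names, code maps, and date formats below are hypothetical illustrations, not drawn from any particular vendor product.

```python
# A minimal ETL sketch, illustrative only: the system names, field
# names, and code maps below are hypothetical assumptions.

from datetime import date

# Hand-maintained mapping tables: each source system encodes the same
# concept (patient sex) differently, so normalization is explicit work.
SEX_CODE_MAP = {
    "registration_a": {"M": "male", "F": "female", "U": "unknown"},
    "registration_b": {"1": "male", "2": "female", "9": "unknown"},
}

def extract(system):
    """Stand-in for pulling rows from a transaction system's repository."""
    if system == "registration_a":
        return [{"mrn": "000123", "sex": "F", "dob": "1980-04-02"}]
    return [{"mrn": "B-778", "sex": "2", "dob": "02/04/1980"}]

def transform(row, system):
    """Normalize one row into the operational data store's common model."""
    sex = SEX_CODE_MAP[system].get(row["sex"], "unknown")
    # Even date formats differ by source; normalize to ISO 8601.
    if system == "registration_b":
        month, day, year = row["dob"].split("/")
        dob = date(int(year), int(month), int(day)).isoformat()
    else:
        dob = row["dob"]
    return {"source": system, "mrn": row["mrn"], "sex": sex, "dob": dob}

def load(rows, store):
    """Append normalized rows to the central repository (here, a list)."""
    store.extend(rows)

ods = []
for system in ("registration_a", "registration_b"):
    load([transform(r, system) for r in extract(system)], ods)
print(ods)
```

Even this toy version needs two mapping tables and special-case date handling for just two source systems and three fields; multiply that by 200-plus applications and the “laborious” label starts to look generous.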
The challenge of data rendering is often underestimated, especially by analytic software vendors competing on total cost of ownership.
Data Stewardship — the assignment of accountability for data definition and quality within a cluster of applications. Generally, stewards are the primary users of an application: they collect data at its source, respond to real-time edits, and support a business process workflow. The lines of accountability are not always clear, which is why there needs to be an overarching discipline: data governance.
Data Governance — an enterprise-level function whereby the big picture is maintained across hundreds of application-specific data models. Common definitions of key measures, such as the patient-day calculation or the attribution of patients to a particular physician, are addressed via governance; attribution rules are one good use case (see the sketch below). Governance should also be the keeper of the flame for the road map guiding the evolution of analytic capabilities. Integrated governance models are rare, but this is not something that can be done by the IT department alone.
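To illustrate why attribution needs a single, governed rule rather than one rule per department, here is a minimal sketch. The policy shown (an explicit PCP assignment wins; otherwise the physician with the plurality of visits, ties broken by recency) is one plausible example, and all identifiers and field names are hypothetical.

```python
# A minimal sketch of a governed attribution rule. The rule below is
# a hypothetical example policy, not the article's or any standard's.

from collections import Counter

def attribute_patient(pcp_assignment, visits):
    """Return the physician a patient is attributed to.

    pcp_assignment: physician id from the registration system, or None.
    visits: list of (physician_id, visit_date) tuples for the period.
    """
    # Rule 1: an explicit PCP assignment from registration wins.
    if pcp_assignment is not None:
        return pcp_assignment
    # Rule 2: otherwise, plurality of visits in the period.
    if not visits:
        return None
    counts = Counter(doc for doc, _ in visits)
    top = max(counts.values())
    candidates = {doc for doc, n in counts.items() if n == top}
    # Ties are broken by the most recent visit among tied candidates.
    for doc, _ in sorted(visits, key=lambda v: v[1], reverse=True):
        if doc in candidates:
            return doc

print(attribute_patient(None, [("dr_smith", "2015-01-10"),
                               ("dr_jones", "2015-03-02"),
                               ("dr_smith", "2015-02-20")]))
# -> dr_smith (two visits beats one)
```

The point is less the specific rule than that it is written down once, applied everywhere, and owned by governance; two departments running different versions of this function will report different physician panels from the same data.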
Data Quality — the result of good governance and stewardship practices, such that data in a specific domain is reliable. This is an end state that is likely never 100 percent achieved, given the multitude of moving parts: changing regulations, acquisitions of new organizations, new payer contracts, and new suppliers. With clinical content squarely in the mix thanks to the EHR, the importance of data quality has never been higher. Simple automated checks, sketched below, are one way stewards can keep an eye on it.
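As one illustration of what that monitoring can look like in practice, here is a minimal sketch of automated completeness and validity checks. The specific checks, field names, thresholds, and the notion of a “clean row” are hypothetical assumptions, not a standard.

```python
# A minimal sketch of automated data-quality checks a stewardship
# program might run; the checks and field names are hypothetical.

import re

VALID_SEX = {"male", "female", "unknown"}

def profile(rows):
    """Report the clean-row rate and issues for a normalized extract."""
    issues = []
    for i, row in enumerate(rows):
        if not row.get("mrn"):
            issues.append((i, "missing mrn"))
        if row.get("sex") not in VALID_SEX:
            issues.append((i, "invalid sex code"))
        if not re.fullmatch(r"\d{4}-\d{2}-\d{2}", row.get("dob", "")):
            issues.append((i, "dob not ISO 8601"))
    clean = len(rows) - len({i for i, _ in issues})
    return clean / len(rows) if rows else 1.0, issues

rate, issues = profile([
    {"mrn": "000123", "sex": "female", "dob": "1980-04-02"},
    {"mrn": "", "sex": "F", "dob": "02/04/1980"},  # fails all three checks
])
print(f"{rate:.0%} of rows clean; issues: {issues}")
```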
What Can We Do?
So, where does one begin to improve on the current state? Buy a departmental point solution? Build a pristine EDW? Outsource analytics to a specialty firm? Hire more FTEs?
Validate the current state. There is so much swirling in this MU Hangover environment that it’s well worth your time to assess your current needs and capabilities (the Davenport/Harris model is a useful lens). You might get more near-term value out of a more structured intake process for basic reporting requests.
Keep it simple. Don’t overbuy tools prematurely; 12-decimal-point precision means little when the data is incomplete and of suspect quality — you can’t make chicken soup out of chicken feathers. Focus on a problem of high value to the enterprise, like population health or eMeasures. Be careful about the value of “comparative analytic solutions”: knowing what goes on in your own organization, how likely is it that any of the hundreds of benchmark organizations are creating their reported data exactly the same way you are?
Develop a road map. Getting from here to there will require discipline, commitment of resources, and likely multiple years of seven-figure capital budget allocations. Avoid rework by establishing a blueprint that supports incremental investment, and make sure the road map calls out the need to improve data governance, stewardship, and quality. Target self-service environments where your super users in high-data-use departments can “roll their own” without compromising enterprise security and provisioning standards. Getting a plan in place to rationalize the legacy application portfolio is critical.
In spite of the hype around Big Data, analytics is not a fad. We in healthcare are in the information business, and the increasingly clinical nature of the data in our environments requires that we embrace the challenge.
[This piece was originally published on LinkedIn Pulse by Dan Kinsella, consultant with Bow Tie Consulting Solutions and former CIO at Cadence Health.]
scottlinehealth says
Yes, healthcare analytics is so difficult because data lives in multiple places, data is both structured and unstructured, definitions are inconsistent and variable, the data itself is complex, and regulatory requirements keep changing.
Dan Kinsella says
Thoughts on what to do next? It’s hard to accept throwing up one’s hands.