I’m sure you noticed I didn’t mention EHRs in the title. That was on purpose, and maybe by the end of this post the rationale will be a little more apparent.
Most know that I’ve spent my entire adult life in the healthcare field. I was drawn to the career path while I was in high school and never looked back. That career has ranged from the clinical side of things as a Radiologic Technologist, to early systems implementation/integration (in the 1980s), to CIO for almost 30 years. I had the pleasure of being at the very genesis of the automation processes in healthcare; I’ve seen many successful and not-so-successful implementations of technology and approaches to automation.
It’s taken us a long time to get to where we are today; for good or bad. The wind in the sails of healthcare automation has blown from a couple of different directions. The early winds were created by changes in the insurance and reimbursement process; anyone remember when DRGs were the new-new thing? The more recent and much stronger winds have been blowing from Capitol Hill for the last 10 years. I’m sure that we can agree that this has not been a gentle breeze.
But let’s give credit where credit is due. I say this a little tongue-in-cheek, but our leaders in Washington finally put “our” money where “their” mouth was. I believe HITECH and the related regulatory programs have helped us move faster and the funding was helpful; however, the sticks that came along with the carrots continue to loom over healthcare’s head like the Sword of Damocles.
We find ourselves in the middle of the comment periods for NPRMs (Notices of Proposed Rulemaking) from both CMS and ONC. I’m sure you’ve noticed there is a much tighter focus on interoperability than ever before. Everyone agrees that we need to share as much information, appropriately, as possible. However, not everyone agrees with how that information could or should be shared. There are several methods; however, that discussion is out of scope for this post.
An Old Notion
During my tenure as part of the leadership team of a busy radiology department, we worked to find effective ways of getting exam results/reports back to the ordering physicians. Yes, we had fax machines, the US mail, etc. — remember, this was long before the internet or any type of networking technology. When RSS (Really Simple Syndication) first came out, I thought it could work by having physicians or their practice subscribe to a specific RSS feed for patient results. Of course, there was some work required in the order management/results reporting applications to serve up the physician-specific feeds, and then there were security/privacy issues that needed to be addressed. You might consider this a type of interoperability, but the technology wasn’t ready yet — neither were the physician practice workflows.
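To make the idea concrete, here is a minimal sketch of what serving up a physician-specific results feed might have looked like. The physician ID, feed URL, and result fields are all invented for illustration; a real implementation would also need authentication, access control, and careful handling of PHI before anything left the order management system.

```python
import xml.etree.ElementTree as ET

def build_results_feed(physician_id, results):
    """Build a minimal RSS 2.0 feed of exam results for one physician.

    `physician_id`, the feed URL, and the result dicts are hypothetical;
    this only illustrates the subscription concept, not a secure design.
    """
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"Exam results for physician {physician_id}"
    ET.SubElement(channel, "link").text = f"https://results.example.org/feeds/{physician_id}"
    ET.SubElement(channel, "description").text = "Newly finalized radiology results"
    for r in results:
        # One <item> per finalized report; the accession number serves as a stable GUID.
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = f"{r['exam']} - {r['accession']}"
        ET.SubElement(item, "guid").text = r["accession"]
        ET.SubElement(item, "description").text = r["impression"]
    return ET.tostring(rss, encoding="unicode")

feed = build_results_feed("dr-1001", [
    {"exam": "Chest X-ray", "accession": "ACC-42", "impression": "No acute findings."},
])
```

A physician’s feed reader would then poll that URL, which is exactly the subscription workflow described above.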
The Winds of Regulation
The HITECH Act and the associated incentive program gave us Meaningful Use and the prescriptive regulations (IMHO) that pushed the healthcare industry to move more rapidly toward the automation horizon. I am thankful the Federal agencies listened to the many experts who offered thoughtful comments on all the NPRMs that eventually became the final regulations.
We saw a modulation of the regulations as they separated into different streams to address various segments of the provider space. Later work by the regulators harmonized these different regulatory streams closer to a single set of rules, but not completely.
There are times that it almost appears the agencies are stacking regulations in an effort to correct issues caused by previous regulations. This reminds me of the time my physician prescribed a medication, then had to prescribe another to treat the effects of the first one.
Albert Einstein is credited with saying, “The definition of insanity is doing the same thing over and over again, but expecting different results.” Well, the regulators aren’t really doing the same thing over and over again, but they do continue to be overly prescriptive about the features/functions that EHRs and evolving interoperability technologies need to have, IMHO.
The Future Vision (But first, a little background info)
I don’t believe I can take credit for the concept that I’m going to describe below, but it’s one that I’ve been thinking about as healthcare has been on this journey for the last ten years.
We now have the majority of providers (hospitals and physicians) using EHRs, we have a library of data standards (albeit somewhat incomplete), and we also have several methods for moving and sharing data, with new ones now maturing. All this automation has created clinical data repositories that are siloed at the health system, physician practice, and other levels. We also have several large clinical data repositories that are patient-matched, normalized, and curated by the higher functioning HIEs.
Now let’s go back to the early days of healthcare automation. Many of the HIT vendor solutions started as department specific applications and then grew into what they are today (Oh, BTW that took 30+ years). There was not a lot of emphasis on integration; hospital departments were allowed to purchase the applications with the features/functions that best fit their specific workflows. This was referred to as a “best of breed” approach to automation. Thankfully the healthcare industry came to its senses and realized the cost associated with this approach and the impact upon productivity and patient care processes.
Now for the Future Vision
One of the core reasons I started this blog was to hopefully offer thoughts and commentary on topics that could generate other conversations. I’m hoping this one accomplishes that aim.
Imagine a time when the healthcare data landscape looks more like a collection of large, regional, standardized clinical repositories, where data lives in those repositories rather than in the local, individual hospital and physician practice EHRs. These would be large, normalized data lakes where data lives and is fed as part of the care process, regardless of where the patient receives care.
Now that we have centralized, normalized, and identified data, things could get simpler and potentially less expensive for the healthcare industry. In full disclosure, I do not have any data to back that statement up. However, I’m thinking about the economies of scale. Every health system, physician practice, etc. has to maintain their own data stores. They have to secure the data using the same standards, and they have similar server and data storage requirements. These are duplicated to the “nth” power, and now that we’re interoperating, the same data resides in multiple locations, with multiple versions of the truth.
Okay, let’s look at the application and workflow level of things. In this future we would have specific and very tight standards about how applications read from and write to a data repository (think the FHIR standard on steroids). This would move the integration and standardization tasks to the data repository level, allowing for a greater variety of applications/modules that best fit the defined clinical workflows, which looks a little like the “best of breed” solution approach.
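One way to picture that tight contract is a repository that refuses any write that doesn’t conform to the shared standard. The sketch below is a toy, assuming resource shapes that loosely follow FHIR (a `resourceType` plus an `id`); the class name and validation rule are invented for illustration, and real conformance checking would involve full profile and terminology validation.

```python
class RegionalRepository:
    """Toy regional clinical repository with a strict read/write contract.

    The contract here (every resource must carry a resourceType and an id)
    is a stand-in for real FHIR profile validation.
    """
    REQUIRED = {"resourceType", "id"}

    def __init__(self):
        self._store = {}  # (resourceType, id) -> resource

    def write(self, resource):
        # Reject anything that does not conform to the shared contract;
        # this is where "FHIR on steroids" validation would live.
        missing = self.REQUIRED - resource.keys()
        if missing:
            raise ValueError(f"non-conformant resource, missing: {sorted(missing)}")
        self._store[(resource["resourceType"], resource["id"])] = resource

    def read(self, resource_type, resource_id):
        return self._store[(resource_type, resource_id)]

repo = RegionalRepository()
repo.write({"resourceType": "Patient", "id": "p1", "name": "Doe, Jane"})
patient = repo.read("Patient", "p1")
```

Because every application must pass the same gate on the way in, normalization happens once, at the repository, rather than in every pairwise interface.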
I know that this sounds a lot like the “service bus” approach that was considered several years ago; however, that approach was designed to work within a specific EHR.
A Few Additional Thoughts
One thought: at least right now, a large portion of the cost of every EHR implementation is around the data, its storage, backup, etc. If the data were centralized, maintenance costs might be shared. Another is that with all of a patient’s data in one centralized database, it would reduce the need to move the data around (at least within the defined referral region), and it would provide that “single source of truth” that so many people talk about.
Another thought: if the applications interact with the centralized data store in a strict and standardized manner, they could become commodities based upon functionality. That is to say, if you find another application that has more appropriate features/functions for your specific workflows, unplug the old one and plug in the new one. This approach would open up another level of industry where we can get back to concentrating on innovation and creating applications that help, rather than hinder.
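The “unplug the old one, plug in the new one” idea can be sketched as workflow code that depends only on a shared contract, never on a particular vendor. Everything here (the interface, the vendor classes, the report fields) is hypothetical, just to show the swap:

```python
from typing import Protocol

class ResultsViewer(Protocol):
    """The shared contract every vendor application must satisfy."""
    def render(self, report: dict) -> str: ...

class VendorAViewer:
    def render(self, report: dict) -> str:
        return f"[A] {report['exam']}: {report['impression']}"

class VendorBViewer:
    def render(self, report: dict) -> str:
        return f"{report['exam']} -- {report['impression']} (Vendor B)"

def show_report(viewer: ResultsViewer, report: dict) -> str:
    # The workflow code calls the contract, not the vendor, so swapping
    # applications is a configuration change, not a data migration.
    return viewer.render(report)

report = {"exam": "CT Head", "impression": "Normal study."}
out_a = show_report(VendorAViewer(), report)
out_b = show_report(VendorBViewer(), report)
```

If a practice prefers Vendor B’s features, only the object passed to `show_report` changes; the data and the workflow code stay put.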
The centralized nature of the data could also potentially assist with some of the research initiatives that are trying to gain traction by building large data sets.
At present, we continue to move the current reality forward, where the market is locked up because participation in many of the Federal reimbursement programs requires the use of Certified EHR Technology. So the next question is, where will the next new-new thing come from? What vendor can afford to build a next-generation system that can meet the certification standards before anyone purchases it? As I mentioned above, most of the major EHR systems in use today were designed and built over a 30+ year period. Having said that, I’m not sure where the next innovative platform will appear.
Thanks for reading.
This piece was originally published on Chuck Christian’s blog, The Irreverent CIO. Christian, who recently took on the chief technology officer role at Franciscan Health, has 30 years of experience in healthcare IT, both as a hospital CIO and as VP of Technology & Engagement with the Indiana Health Information Exchange.