You often hear ill-informed keynote speakers, healthcare leaders, and others say that the healthcare industry lags behind other industries, often drastically, when it comes to the adoption of IT. Those who make these statements base them on limited facts and anecdotal information. I, however, will suggest the opposite.
Having spent many days with CIOs from almost every other major industry over the last few months, I found that we are not only not behind but are, in many cases, adopting new technologies at a much faster pace and with significantly wider areas of responsibility.
At a recent executive IT summit, the only leaders who could honestly attest to pursuing full-spectrum virtualization were the two healthcare CIOs. Additionally, the leaders best equipped to discuss organizational strategy — and the ability to bring about transformational process improvement through advanced IT solutions and process optimization — were healthcare CIOs.
In the past, we have talked about the need for healthcare CIOs to work at the 2.0 level – that of the innovative, transformational leader. Do we have significant opportunities to improve, and do we have organizations that are way behind? Yes. But please take solace in the fact that in many cases healthcare organizations in the United States are often better at innovative IT adoption/utilization than Fortune 500 or even Fortune 100 companies.
In other words… You ROCK! Keep it up!
blhayes says
Greetings all,
Having been in and out of clinical computing for over 20 years, I believe that Healthcare IT is very much behind other industries.
I think that there are several reasons for this. I hope that I don’t annoy too many with my personal list of reasons. I am curious to see how many strongly disagree with my assessment. I’m always happy to learn!
I would state the reasons as:
– Until very recently, docs didn’t like to type. That’s a massive reason that includes some sub-issues:
— Docs tended to be rather touchy-feely, not so data-involved (info yes, data no)
— Clinical systems just could not handle or display the data required to make effective clinical judgements, such as radiology/dermatology/*ology images. Clinical sites were very late to get off RS-232 and text-only terminals.
— Outside of research settings (all bow before MGH and the VA) tracking clinical conditions was not really valued. What WAS really valued was billing, which was text, fax, and Facets. Money Talks.
An eye-opening experience for me was implementing IHE radiology workflows in many clinical settings. A lot of MD/RN/RT/CIO types just didn’t see the value in things like MPPS (Modality Performed Procedure Step) – fat-fingering demographics into every system was just fine. RSNA and HIMSS did a massive job of education in the InfoRAD area for years, showing folks the real benefits that come from integrated network systems. People in discrete and process manufacturing understood this when installing Ethernet (and yes, sometimes Token Ring) and then actually INTEGRATING SYSTEMS’ INFORMATION FLOWS during the early 1980s. In my experience, outside of research institutions one didn’t see LAN technology much used in clinical settings until about 2005 or so.
That meant no imaging or rich data, which meant minimal clinical value until widespread adoption of LAN technology in clinical settings.
There has been an explosion since about 2005. What happened?
In My Humble Opinion:
– Docs started shooting/saving digital movies of their kids, teaching them that rich media was producible and viewable outside of Hollywood.
– Younger docs entering the field from places like the VA with experience from VistA (hi, Peter K! Dr. Siegel!)
More recently, Meaningful Use has given clinical folks the incentive to learn about clinical computing and then spend the money to implement the systems and the integration. It does help that we now have a body of people in the field who understand the implications of rich imagery being widely available. (thank you Dr. Horii!)
In conclusion, I feel that many things had to come together to make more effective use of the rich data potentials now available in clinical computing. And without the hard work of so many over many decades (yes, I know that I missed naming so very many) we’d still be focused on using computers to track reimbursements, which has been the lion’s share of clinical computing efforts until very, very recently.
I look forward to criticisms of the thoughts presented here. Thanks for reading, sorry it’s as long as it was.
garystafford says
I agree with Russ that healthcare IT is far from behind other industries in its adoption of IT. I also concur that most industries are too big and varied to support blanket statements about their level of technical integration.
Like Russ, I have been exposed to both the business and IT side of many industries, as a long-time executive in the digital imaging and printing industries. I have worked closely with business and IT leaders in many B2C, B2B, and B2E industries, producing and marketing such things as juvenile products, private label food and beverage, clothing and apparel, gifts and collectibles, sporting goods, and educational products.
In my experience, each industry has its leaders and its laggards in adopting IT for the betterment of the customer, the employee, and the bottom line. Certainly, some industries are better suited to adopt new technologies quicker than others, but all industries, from agriculture to aerospace, continue to evolve their use of technology at a break-neck pace to remain competitive.
Being directly associated with the healthcare industry for only a short time, I am amazed at the technical sophistication and complexity of solutions employed by healthcare providers, developed both by the providers themselves and by their suppliers. I believe the specialized nature of healthcare – its stringent regulatory requirements, the importance of absolute patient information security, the need for tight process control, and multidimensional resource management – actually fosters accelerated adoption of leading-edge, albeit safe and well-tested, technologies. I further believe the changes healthcare must undergo under the newly enacted Affordable Care Act will only further accelerate IT advancements, in order to maintain patient satisfaction as well as the bottom line.
HMShankwiler says
I would like to add another perspective that I think we are missing in this discussion. It is not so much that the industry is ahead or behind, but that we have tools at our disposal and a great many different users and variations that make the adoption-and-change question one that cannot be universally answered. If we look at healthcare and say that we are ahead or behind, we are essentially saying that every injury can be addressed with a band-aid and aspirin (and we all know that is not the case). I think a more effective way to look at the adoption of technology into medical areas is to look at what the tools and new processes are and how they address EACH SPECIFIC AREA of care. If we are not able to stop looking at the industry as one large monolith, we will not be able to find a solution that is suitable for anyone but a few. However, if we look at it as a means of addressing specialties and specific needs, there will be a greater possibility of blending it into a flexible solution that ALL can benefit from (different starting points, universal design). If we can take this approach, then we have a better means of making progress.
Doctors, dentists, and other medical specialists have chosen their areas of specialty for a great number of reasons. Taking away the ability of intelligent, highly educated professionals to decide for themselves what is applicable to their practice by prescribing a one-size-fits-all approach, and then measuring an entire industry with the same yardstick, really does them a disservice.
Can we change the discussion from what serves ALL of healthcare to what we can identify as common threads and where practices differ? If we can work from that direction, we will most likely find more similarities that we can exploit (or provide to professionals to incorporate). This takes it away from the direction of what “must” be done (which most people will resist anyway), and more into the collaborative space that is able to establish baseline options and determine where the variations can help and hinder.
Thoughts?
flpoggio says
Great topic, good points.
There are some very strong arguments as to why healthcare is behind other industries in IT, such as:
1- In what other industry does the buyer tell you what content/format/process you need to follow to get paid? (Then change it every few years!)
2- In what other industry is the key salesperson also the key product-definition person, not reporting or accountable to a CEO?
3- What other industry can’t clearly define its end product? E.g., please define ‘good’ health care. I know what a good airline is, a good bank, a good retailer… Health/medical care is amorphous, and so must its IT systems be.
On the other hand, there are some strong reasons why in some places healthcare is ahead of the IT curve, such as:
1. Digitizing images.
2. Medical diagnostic tools. The delivery and practice of medicine is data- and information-intensive, and today it is readily automated.
3. Integration/interface standards (there is no tool comparable to HL7 in private industry).
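[Editor's aside: for readers unfamiliar with HL7, the v2.x standard mentioned above exchanges clinical data as plain pipe-delimited text, which is part of why it spread so widely across hospital systems. The sketch below, using a hypothetical sample message, shows roughly what that looks like; it is an illustration, not production interface code.]

```python
# Minimal sketch of reading an HL7 v2.x message: segments are separated by
# carriage returns, fields by "|", and field components by "^".
# The sample ADT (admit/discharge/transfer) message is entirely made up.
SAMPLE_ADT = "\r".join([
    "MSH|^~\\&|SENDER|HOSP|RECEIVER|LAB|202301011200||ADT^A01|MSG0001|P|2.3",
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19800101|F",
    "PV1|1|I|WARD1^101^A",
])

def parse_hl7(message: str) -> dict:
    """Split an HL7 v2 message into {segment_id: [list of field lists]}."""
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

segments = parse_hl7(SAMPLE_ADT)
pid = segments["PID"][0]
# PID-5 holds the patient name; components are caret-separated.
family, given = pid[5].split("^")[:2]
print(family, given)  # prints: DOE JANE
```

This is exactly the kind of demographic data that, absent integration, got "fat-fingered" into every downstream system by hand.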
In a nutshell, I believe the hands-on elements of health care /medicine are way ahead of private industry, yet on the production and operational management aspects they are way behind.
Let me raise a related question. In terms of production management, is healthcare behind or ahead of commercial industry? And why so?
Frank Poggio