While a number of HIT-related patient safety issues can be traced to system usability, network stability, and end-user competence with the application in question, few are attributable to traditional software “bugs,” according to Marc Probst and Paul Egerman, HIT Policy Committee Members and co-chairs of a special hearing on such issues.
“It’s amazing what end users can do to a system to make it not work,” said Probst at the full committee’s March 17 meeting. Regarding how software vendors handled reports of bugs, Probst said there was significant inconsistency — some vendors were quite responsive and others less so. “The majority of issues we deal with (at Intermountain Healthcare) are local issues around local conditions and the configuration of the systems themselves,” said Probst, who serves as CIO of that organization.
Power, he continued, is another major issue, with potential points of failure at many places along the line. “You could have a failure, for example, on the screen in the OR, at the data center or even on the network — the breadth of possibilities is huge, so you need to have the ability to quickly disseminate that information.”
Paul Egerman, software entrepreneur and co-chair of the Policy Committee Workgroup on Certification and Adoption, said he was unable to find many formal studies on HIT-related patient safety issues. From the information Egerman was able to locate, he determined some of the most important problems occurred not at the software level — in terms of bugs — but around “the complex interactions of people and technology.”
For example, is an application so user-unfriendly that errors of interpretation are almost inevitable? “When a provider is looking at the screen, does it make any sense or is it misleading? Sometimes things make sense in isolation, but how do those applications work while in real-world operation? Is the user receiving meaningful views?”
There is a popular misconception in the HIT world, Egerman said, that if the software simply “worked,” there would be no errors. “That’s simply not correct,” he said. “There are tons of issues that are completely independent of the technology.”
The solution? Egerman suggested creating an FAA-like atmosphere of continuous learning and process improvement, fostered by a no-blame or immunity approach (within reason). He suggested providers be encouraged to report “near misses,” hoping there would be less fear of reprisals if no harm was ultimately done. While the FAA model seemed to be embraced by the committee, the FDA model didn’t receive such enthusiastic support.
Michael Klag — Dean of the Johns Hopkins University Bloomberg School of Public Health — said the FDA focused on finding individuals who make errors, an approach that would not create the learning environment Egerman was supporting.
Neal Calman, M.D. — President and CEO of The Institute for Family Health — made the point that users had to shoulder some responsibility for navigating the systems properly. “Sometimes the systems do great and are designed well, but people are not appropriately trained and tested to use them. Think of the FAA model — when a new plane comes out, pilots don’t just get to fly them, they have to be trained and tested on every single plane they want to fly. Before we require people to report on things that don’t work, we have to make sure, up front, they actually know how to use the systems.”
If — as Probst, Egerman and Calman argued — HIT safety is largely dependent on solid technology infrastructure and sound end-user training, what does that say for the HITECH program’s chances of success? Probst offered an anecdote cautioning against an over-ambitious approach.
“Thirty years ago, I was a missionary in South America working in remote villages. The streets were small and muddy with many ruts. When it rained, the villages were largely inaccessible. There were no traffic signs or lights, no police. Many pieces of traditional infrastructure were missing. I was thinking, ‘I’m sure people in those villages would love red Ferraris, but if they got them without first putting in the proper infrastructure, without auto repairmen, tow trucks, police to keep people under the speed limit, you’d have villages full of red Ferraris stuck in the street,’” he said. “The biggest thing we heard was that if we implement these systems too quickly and without the right infrastructure and knowledge, we could introduce many HIT safety issues.”