The healthcare IT industry has experienced a handful of pivotal moments in the last few decades, from the passage of the HITECH Act to the start of the Covid lockdown, to presidential elections. Each one has had a lasting impact on the way care is delivered.
The most recent of those seminal events happened in November of 2022 when generative artificial intelligence (AI) became publicly accessible. “Our landscape changed dramatically overnight,” said Chad Jones, SVP of Information Systems at Baylor Scott & White Health, during a panel discussion. An enormous amount of buzz was generated, and before long, “everyone was selling some form of automation that was going to revolutionize everything.”
It also generated misconceptions around how AI is defined and the impact it can have on healthcare organizations, noted Patrick Woodard, MD (Chief Healthcare Information Officer, Monument Health), who also served as a panelist, along with Tina Burbine (VP, Care Innovation & Enterprise Analytics, Healthlink Advisors). “With the advent of generative AI, I think a lot of people forgot that there are other types of AI and algorithms” that have aided in decision-making for quite some time.
“The reality is that EKGs have been read by machines using an algorithm since the 1970s. Is that AI or is that just an algorithm?” Woodard asked. “When you start to think about the way we’re making clinical decisions and the way that it impacts patients, it deserves to be looked at with the same critical eye.”
Burbine agreed, noting that although machine learning tools have demonstrated great potential, they’re merely one piece in a larger puzzle. “We have to think about how we define AI,” she said. “It isn’t just generative AI. There’s a lot of this technology in use, and it’s going to continue to change how we work.”
And that, of course, means organizations must have the right pieces in place to be able to accommodate (and perhaps even anticipate) those changes. The first piece, according to Burbine, is ensuring solid governance is in place to help guide decision-making. “Leadership teams need to get together and work through the questions of, how are we going to define AI as a team, and how are we going to measure the value of what we invest?”
That could mean creating an AI-first guiding principle to help inform decisions and ensure everyone is on the same page in terms of what’s happening in the market, how to invest in the right areas, and how to measure the value of investments.
Doing so, she noted, requires “open and transparent conversations so that everybody can be in alignment and stay on top of what’s happening and what they need to be aware of.”
Jones agreed, adding that those conversations should include “the usual suspects”: analytics, digital, and core IT teams, as well as clinical and revenue cycle leaders, among others. And, importantly, they need to “make sure we’re highly aligned with the organization’s strategy and priorities,” and that “our governance agrees collectively that these are the right investments to make.”
When it comes to governance, there’s no secret; rather it’s about establishing a solid foundation and relying on the same tried-and-true principles that govern any initiative, noted Woodard. “Good governance is good governance. Whether it’s an AI project, an infrastructure project, or a new application, we want to maintain the same level of repeatability.” Instead of changing strategies to meet the needs of AI — and, in effect, creating separate workflows that can lead to errors — his recommendation is to “follow the same process, follow the same governance, and treat it with the same level of scrutiny as you would any application.”
As important as governance is, there are several other factors that must be carefully considered with any AI strategy. Below, the panelists share key learnings based on their experiences.
- Don’t buy a strawberry huller. “There are a lot of single-use AI tools right now,” said Woodard, who compared those tools to the specialty items found at stores like Williams-Sonoma. “Health systems are buying strawberry hullers, grapefruit spoons, and soft-boiled egg openers. At some point, a knife will be able to do all of those things, but we don’t have an AI knife yet.” The technology will catch up and mature, he noted. In the meantime, he recommended finding “more ubiquitous tools that can solve a lot of things,” rather than investing too much in one particular capability.
- Think short-term. When leaders identify a tool that does meet their needs, it’s best to opt for a short-term commitment, said Burbine. “Keep in mind that we’re seeing news and announcements every day from our largest partners,” including EHR and ERP vendors, who are heavily focused on “creating new opportunities with generative AI.” But with the market growing – and changing – so quickly, she recommended keeping commitments within the 12-month mark.
- More partnering. Although it may be tempting for organizations to build their own large language models (LLMs), the better bet is to lean on vendor partners and leverage commercially available solutions, said Woodard, whose team is working with Epic and Microsoft to pilot autodraft email responses, in-basket prioritization, and ambient listening. “We don’t have a deep bench of data scientists,” he said. “For us, it’s more about hoping we make the right bets with our partners and being able to pivot quickly.”
- Less development. “Health systems don’t need to be development shops and build their own software,” said Woodard, pointing out that organizations that built EHRs in the ’90s ended up having to replace them with commercial products. “We have slim margins like everybody in healthcare,” he added. “Is it more valuable for us to spend resources building out a high-quality AI development team that can build out a model, or would we be better served opening a new building in a community that has been left behind? I’d much rather have a building for a patient who now doesn’t have to drive 200 miles to receive care.”
- The human factor. Whether it’s a homegrown tool or a vendor offering, it’s critical to keep in mind the needs of those who are using it, said Woodard. “It’s easy to think about AI as a magic bullet that can do all your documentation for you or is able to code twice as fast. But there’s still a human element in training that.” With ambient listening tools, for example, the note should still reflect the personality and preferences of the user, rather than just being a standard, out-of-the-box note. “If it’s AI writing the note, you lose some of that nuance, and some of the specifics might make all the difference in helping you make a diagnosis. The clinician needs to sit down and train it to be more of their style,” just as they would with a scribe.
- Don’t lose the art. Most of the information found in clinical notes is pulled in automatically, except for two key areas: the subjective part detailing the interaction between physician and patient, and the assessment and diagnosis. “Those are the only two places where humanity comes in,” said Woodard. “If you automate that and have a robot write the subjective part, you’ve lost half of it. And if you have a robot suggesting your diagnoses and plan, you’ve just lost the other half.” By reducing clinical variation, organizations risk losing the art aspect of medicine, he said. “If we lose that, I’m not sure what we have.”
It’s an important piece to remember, particularly as the industry hurtles forward in its use of AI and other tools, according to Burbine. “We continue to see teams taking advantage of the things they have at their fingertips,” such as leveraging predictive modeling and natural language processing for imaging. Other areas where AI is picking up steam include revenue cycle automation and clinical operations.
“I see a huge opportunity for teams to embrace prompt engineering training and put the knowledge into our clinical and operational users to be able to access information in different ways and take self-service to a whole different level,” noted Burbine. “I’m really excited to see how that gets integrated into the way that we work and hopefully reduces some of the burden. It’s going to be a game changer for our IT teams and for our clinical and operational end users.”
But the game has to be approached thoughtfully and with the right strategy, noted Jones. “The formula for effective governance doesn’t change just because it’s a new technology.” He urged leaders to be disciplined and “don’t get caught up in the hype. We’re still early in this evolution.”