HEALTHTECH: From a clinical informatics perspective, where should AI investments be focused?
BEENE: At Trinity Health, we are looking at this through the lens of a very large healthcare system, with many leaders across it. Essentially, we’re a “system of systems,” so we have to take a broader, more comprehensive view as we consider how we approach investments.
We are implementing and evaluating several tools and applications that focus on patients, automation, and clinician and operational efficiency — with an emphasis on the back-office area where there is lower risk, higher feasibility and the ability to deliver the most value.
Trinity Health has an AI governance group, which I am a member of, that includes stakeholders from across our system, such as CEOs and CMOs from our Regional Health Ministries, CIOs and others who can represent multiple perspectives on AI. We have to be thoughtful because AI is always evolving. While we’ve got to keep our finger on the pulse of it, we must also keep an ethical lens on it, aligning with our core values, always keeping our patients first and mitigating risk.
I’ll share a tactical example: We’re implementing a few AI tools for our revenue area — where we know there is lower risk to our organization — to improve efficiency and accuracy. We tend to be more cautious in the clinical areas because you’re impacting patients. We must ensure those tools deliver care to patients safely and ethically.
HEALTHTECH: What lessons could healthcare leaders take from the field of clinical informatics as they prepare their data for AI?
BEENE: Communicate the true possibilities of what the AI tool can do. That is one of the early lessons we learned about health IT overall: Be transparent about what it can and cannot do. I know there is a lot of pressure to push the promise of AI — and for some organizations, I totally respect that — but when you overpromise and underdeliver on a financial investment, you have something to answer for later. Communicate plainly, honestly and with transparency about what you know and what you don’t know.
The organization should also be prepared for the change that comes with that AI tool. It’s about making sure everyone understands exactly what is coming and how they’re going to prepare for it in their environment. That’s so important, because organizations will invest in something they think is going to deliver a certain benefit, but it ends up being underutilized. Why? Because there was a lack of understanding of the problem the organization was trying to solve. How do you make this tool valuable in my daily activities? There must be organizational buy-in.
Think about your daily life. If you don’t think something is valuable, you don’t go to that place, or you don’t invest your dollars there. People have to understand how the tool is valuable to them if they’re going to use it.
As informaticists, we have to make that connection real. What leaders can take from us, in the implementation of any health IT tool, is that they have to make it make sense to the people who are using it. They have to adopt it. They have to see that it brings value to them and is not a burden. That takes a lot more than just communication. So, be fluid. Be open. Be adaptive.
There’s so much fluidity here that if you dip your toe in, you have to make sure you’re only dipping your toe, not your entire foot, because something better is just around the bend. It may be delayed gratification, and you have to be OK with that. You have to be OK with having invested by dipping your toe rather than waiting to commit your whole foot.
Organizations can’t always pivot that quickly. These are multimillion-dollar investments. Sometimes you have to take the information you have and make a decision based on what you know now. It’s easy to overanalyze, which can put you in a space of analysis paralysis and become a barrier to advancing any type of investment in AI. It all moves really fast. You must know whether you’re a risk-taker, where you and your organization sit on the risk continuum, and where you feel it’s OK to take that risk. Then jump in.


