Newsday: Unsanctioned AI Risk, Governance, and the Path Forward with Dr. Holly Urban
Episode 109th March 2026 • UnHack with Drex DeFord • This Week Health
00:00:00 – 00:23:00

Transcripts

This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.

Newsday: Unsanctioned AI Risk, Governance, and the Path Forward with Dr. Holly Urban

Sarah Richardson: UpToDate is where trusted evidence meets real-time efficiency.

I'm Bill Russell, creator of This Week Health, where our mission is to transform healthcare one connection at a time. Welcome to Newsday, breaking down the health IT headlines that matter most. Let's jump into the news.

Bill Russell: All right. It's Newsday, and today I'm joined by Drex DeFord, in black today, and Holly Urban with Wolters Kluwer, a pediatrician and informaticist currently working at Wolters Kluwer. And today, hard to believe, we're gonna talk a little bit about AI. Is that hard to believe these days?

You rarely hear about it, so...

Bill Russell: Conference season's coming up, and I'm wondering. We used to say, oh my gosh, every booth has AI in it, and we were kind of surprised. This time I'd be surprised if every booth doesn't have AI in it. If they don't, I'm gonna wonder why they spent the money to be in the conference space at all.

Drex DeFord | This Week Health: I hope somebody goes around and does a tally of all the booths. You know, we've had this before with big data, and...

Holly Urban: Explosion health

Drex DeFord | This Week Health: The buzz with blockchain, the buzzwords of the day. Yeah, it'll be interesting to see. Ninety-five percent of the booths will be talking about their AI capability,

Bill Russell: And for the other five percent...

Holly Urban: If they're not talking about their AI capability, they're talking about why they're not talking about their AI capability.

Bill Russell: I'm just curious, have you seen The Pit?

Holly Urban: I have not. I've heard that it's amazing, very true to life, very realistic, but I have not seen the show.

Bill Russell: Well, they have a storyline right now about generative AI in the health system. If people aren't familiar, The Pit is on HBO, with Noah Wyle, and they took on a storyline about generative AI in healthcare. They have this whole thing where the documentation came up with stuff that didn't even happen. Some surgery that didn't happen, some person who isn't even a practicing physician at the place, and it got documented, and they're following that storyline. This is very real, and it's happening right now in our health systems. Obviously ambient listening is huge, but shadow AI to me is a really interesting concept.

We've had shadow IT for years, you know, Joe's email addresses, to the point where we had radiologists sharing data with their patients through Dropbox. And I was like, oh my gosh, what just happened? We have something similar going on today with AI. Holly, I'd love it if you could walk us through a little bit of the findings on this.

Holly Urban: Yeah, absolutely. And you're right, this isn't anything new. As healthcare leaders know, people are doing things that leadership maybe isn't quite aware of, but I think the risks are a little bit higher when we think about AI. Shadow AI is the use of AI by individuals through tools that haven't been vetted, that haven't been approved.

And if I were pushed, I would say that 17% is probably low. I think there are a lot of individuals going out and leveraging AI tools. Just one pertinent example: at a large healthcare system, somebody came into tumor board, which is where pathologists, oncologists, and other specialists review cases, with their own AI transcription tool.

Bill Russell: Forty percent have encountered unauthorized AI tools. Roughly twenty percent have used them. One in ten used unauthorized AI for direct patient care. These are just some of the numbers we're pulling out of this. And the example she just gave, the transcription of the meeting: Drex, you don't allow us, within our organization, to use those transcription services anymore, for exactly that reason.

Drex DeFord | This Week Health: Yeah, I've talked about it to our team, but I've talked to CIOs across the country too, and a lot of them have taken the attitude that these things are almost like viruses. Once you get them embedded into your calendar and into your Zoom calls, or whatever you're using, it's really hard to get them out. Once they've figured out how to invite themselves to your meetings, it's hard to remove them. And like you said, I don't think anybody does it maliciously. I think they're trying to figure out how to be more efficient, and to make sure they don't miss something in the meeting, or some to-do or task that gets assigned to them.

They just don't think about the risks that kind of go along with it.

Bill Russell: And that was the response from the clinicians when I said, hey, you're using this to share PHI across Dropbox; you can't do that. They said, give me something that I can use.

I mean, we can talk about governance, and we will, but to a certain extent, have we given them the tools they need to do the job, or have we given them substandard tools compared to what they can get themselves from a personal account?

Holly Urban: It's hard. Healthcare systems are slow-moving organizations, so trying to keep up with governance is really, really tough when you're talking about a technology where new tools are being introduced basically on a weekly basis.

Drex DeFord | This Week Health: We just did a CISO summit, and part of the conversation there with a few folks was about blocking, blocking sites so folks can't get to them. And that became okay. But blocking runs into exactly what you're saying: they obviously need it. Some redirect instead, so rather than taking you to the blocked site, it points you to the approved alternative.

We may do our best, but if you don't get the right message, at the right time, into the right pocket, they just won't read it, or they won't internalize that, no, we have something for that.

Holly Urban: , Even if your organization does have policies, not all do. Not broad awareness of it. So you know, maybe they don't know that there's an approved tool or there's a policy around the use of these ais. There's just not awareness,

Bill Russell: Well, here's my phone. There are five AI tools on it, and every single one of them, if I took a picture of this article and said, summarize it, could summarize the article. It could do the same thing with a screenshot of Epic. I know we're not supposed to do that. But, you know,

Holly Urban: Mm-hmm.

Bill Russell: Let me hit a couple more things in the article. Top concern: patient safety, at 25 to 26 percent, with privacy and security close behind at 23 percent. The bottom line is that unsanctioned tools are regulatory exposure and PHI risk, and we're not even talking about the regulated tools, where we still have drift, hallucinations, and all sorts of other things to be concerned about. I will get to governance in a minute. The other part of this story I found interesting was that AI optimism is high: 50 percent frequently use AI tools, and 90 percent believe AI will significantly improve healthcare in five years. Is that the reason this is so hard to rein in, because the optimism is so high?

Holly Urban: Clinicians will use an unsanctioned but very helpful tool because it's reducing their pajama time. I do think one of the primary reasons we're seeing this is that it's letting them be efficient. So how do we make sure, working with our staff, that we have tools that have been a hundred percent vetted, that have gone through a formal approval process, and that, as a health system, are going to be safe and provide high-quality care? How do you balance that? That's really the rub with what we're seeing in shadow AI today.

Bill Russell: The bigger question I'm asking, before we get into the details, is: how do we do this correctly? It feels a little different. We've had governance for years. We have policies and procedures, we have groups that meet, groups that oversee things and control the inflow of applications that have AI, those kinds of things. But this feels different. We have these other tools where people come in saying, hey, look, we could do this on imaging, and it's going to increase our quality and increase safety. And you sit there and look at these numbers, and it's like, well, we've gotta vet those things.

I don't know. Or maybe I'm overblowing this a little bit.

Holly Urban: To a degree. Like I said, there are new AI applications being introduced in healthcare on a weekly basis, really. So I think you're right: there's just so much out there that clinicians and other clinical folks can use that it's really hard to keep on top of.

That's why you need governance, and policies that everyone is aware of, so people can say, oh, I really shouldn't be using these tools until they've been sanctioned by my governance processes. The other piece that I think a lot about is the patient safety angle, because these models can pull information from, you know, just about anywhere.

One example I talk about is making sure the tool is grounded in good-quality evidence and has some guardrails. If you pull up your favorite LLM and ask it, okay, I have a patient with a complicated urinary tract infection, but I want to manage it in the outpatient setting of care, not inpatient, it may tell you to use a fluoroquinolone. Which is great, unless the patient is pregnant; then that's actually a big patient safety risk in terms of harm to the fetus. If the LLM doesn't have the context of whether or not the patient is pregnant, it's opening you up to all kinds of risk.

Those are the kinds of things that, as an industry, I think we really need to be cognizant of. We do not want our patients to be put at risk.

At one of our city tour dinners, somebody said they would just turn on a vendor's new AI features because they're a trusted vendor. And that created a pretty wild back and forth at the table, because there were some CMOs there, some clinicians, who were like, no, you can't do that. And there were other people who were like, where do you draw the line? This is a trusted partner, they're already in our system, let's go ahead and turn those things on. What's your response when you hear that statement?

Drex DeFord | This Week Health: All the way on the other end...

Bill Russell: His VC startup just walks in the door. You're gonna vet them a little more.

Drex DeFord | This Week Health: But also, all the way on the other end, the thing I was thinking about was the good old bad old days, when somebody would build their own database in the X, Y, Z department, the whole department would become completely reliant on this database that nobody in information services knew about, and then that person would move away, the database would break, and they would call somebody in information services. So I think you've got those AI agents and applications running inside of vendors that we know, and you've got these unsanctioned applications coming in from the outside that are very helpful.

And then I think you've got people inside the organization who are building their own things, which can easily turn into applications custom-built just for the pharmacy or just for RevCycle.

Bill Russell: Holly, as a clinician, I'm curious. ChatGPT is out there sort of presenting itself as an AI tool, or, I'm sorry, as a healthcare tool, a trusted healthcare tool.

Drex DeFord | This Week Health: GPT for health or GPT Healthcare.

Holly Urban: Yeah.

Bill Russell: And I'm wondering about that marketing. My experience with health system physicians and clinicians, as they look at that, is: that's insane. I can't believe they're out there doing that. Not that it can't come up with the right answer, but it doesn't have the complete context to come up with the right answer. Yet they're telling patients to go ahead and use it, and we probably have some clinicians using it as well on the other side. What do you tell clinicians who are seeing this?

Holly Urban: I will say, though, it's sort of the promise and then the risk. The promise is: could these AI models actually help patients understand their illnesses, understand their treatment courses, and be more empowered? If it's a means to a more empowered patient, I'm all in on that. At the same time, the information they're getting has to be legitimate, and it has to be based on reputable sources. If it's just going out to the wild world of the internet and finding gosh knows what from gosh knows where, then it's almost doing the patient a disservice, because they're not getting the information they need in a way that can be trusted. So I'm all in if it's going to empower patients; I just want to make sure it's grounded in a way that they're getting the right answers, and not just fluff, or misinformation at worst.

Bill Russell: I realize I'm part of the problem; I use it myself. Sometimes that's comforting. And actually, I'll give it credit: a fair number of times it will say you should talk to a physician. If your blood pressure's elevated, you should talk to your doctor. It's like every third sentence is you should, you should, you should.

Drex DeFord | This Week Health: If this is a medical emergency, please hang up and dial 911.

Bill Russell: Where do these tools fit? And the liability question is still open. Eventually we'll have some lawsuits going in the direction of health systems, some lawsuits going in the direction of maybe some of these foundation models, and we'll have the courts weigh in on where this goes. But until then, to be quite frank with you, I'm pretty happy I have this tool available to me as a patient. I don't know whether that's right or wrong. I do like having it.

Drex DeFord | This Week Health: I think patients are gonna pick this stuff up faster than a lot of health systems, too. And this is gonna be another one of those gaps where patients are gonna feel like, why is my health system so far behind? Is that part of what you think is the pressure health systems are feeling around all this?

Holly Urban: I do think there's a forcing function there.

Bill Russell: I mean, part of the protection in my using it is that if it says, hey, you should be on this drug, I can't act on that myself. I've gotta go talk to somebody: hey, I think I should be on this drug. And they go, well, why do you think you should be on this drug? Well, Dr. GPT told me this is the drug I should be on. And they go, well, let's take a look at some of your blood work first before we make that determination.

Holly Urban: That's right.

Bill Russell: It's good stuff. Holly, I want to thank you for your time. This was a good and very timely conversation. I appreciate you coming on the show.

Holly Urban: Thank you so much for the opportunity.

That's Newsday. Stay informed between episodes with our Daily Insights email. And remember, every healthcare leader needs a community they can lean on and learn from. Subscribe at thisweekhealth.com/subscribe. Thanks for listening. That's all for now.
