Protecting our Patients from Rogue Personal Health Applications
Episode 177 • 10th September 2021 • This Week Health: News • This Week Health
00:00:00 00:10:19

Transcripts

This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.

  Today in Health IT, I pose a question to Micky Tripathi about information sharing, and I think it's really relevant to you, so I wanted to share it. My name is Bill Russell. I'm a former CIO for a 16-hospital system and creator of This Week in Health IT, a channel dedicated to keeping health IT staff current and engaged.

We have a webinar coming up and wanted to make you aware of it. We're gonna be talking to the CIO for Sky Lakes Medical Center as well as Asante Health System. Sky Lakes was ransomed about six to nine months ago. They were down for 30 days. Their Community Connect partner was Asante Health, and I'm gonna have their two CIOs on the call as well as the CISO from Asante.

And also Matt Sickle, who is a cybersecurity first responder for Sirius Healthcare. We're gonna be talking through that event: what it's like to get that phone call in the middle of the night, the network slowing down, finally realizing it's ransomware and going down, and all the things that they went through, what worked, what didn't work, and what you should know as health system professionals to ensure you're best prepared for this type of event.

So we're gonna be doing that on October 7th. If you would like to register, just go to thisweekhealth.com/register. All right, here's today's story. There's a proposed rule change out there to HIPAA, and CHIME wrote a response. They had six or seven key points they were making, and one of the points was about this idea of providing information through APIs to personal health applications.

And I'm just gonna read it to you. Again, this is CHIME's response to the proposed rule change to HIPAA: "We are concerned about the implications of proposals involving personal health applications, calling for covered entities to transmit electronic health information to those personal health applications without requiring those PHAs to include privacy and security controls or sign a business associate agreement."

Okay, so that was in a document that I actually cover on Monday with Mari Savickis, who is in charge of public policy for CHIME; she works the Hill on behalf of the CHIME members, on our behalf. So as press, which is a title I reluctantly put on but am finding to be much more useful these days, I was invited to a Q&A with ONC.

On Thursday afternoon, I posed the question to Micky Tripathi, the National Coordinator for Health IT. I asked him: there's a common pushback that happens in healthcare around information sharing, the most recent being this response that I just read a few seconds ago from CHIME on the HIPAA proposed rule change. What, if any, progress are we making

on the privacy and security aspect around PHAs? I'm gonna read you his response and give you my two cents when I get through it. So this is what Micky had to say: That is a great question. It is an interesting conundrum. A truly personally controlled device would not fall under any of the parameters of becoming a business associate (which, by the way, I agree with wholeheartedly: if it is truly the person in control, meaning it is not something that the provider is providing but something that the patient has on their own, saying it falls under a business associate agreement doesn't fit).

And this is going back to even:

It was this question of how much control you allow a provider organization or vendor to have over the technologies that a patient themselves brings to the table. There was a real back and forth. On the one hand, you have some of the issues that we're raising: a patient can have an app that is known to have a security hole and is known to have practices behind it that you wouldn't want.

And on the other hand, there's a concern that part of the problem we have with patient access today is that providers are exerting too much control over the data; really, the individuals should be able to decide and make their own assessment. There was a lot of back and forth, and we ended up landing on the side of patients,

and at the end of the day saying that individuals should have the ability to make their own decision about that. And we did say providers can provide education, certainly for known issues with apps, to be able to educate patients. But we didn't want that to go too far, where they could dissuade someone from using an app that he or she would otherwise be able to use.

Okay, so that was the answer, and it really gives us some insight as to where this came from. This was not a fly-by-night, passed-at-the-last-minute kind of thing. This is part of the 21st Century Cures Act. It was bipartisan legislation, and it is very clear what the intention is: the patient should have control of their information, their health record.

What's my so-what on this? The battle's really over; that's what I'm hearing here. The battle is over. This is bipartisan. It is part of the law. It's going to be enforced with penalties. So the battle is over on this one. And as I'm reading some of the language, the language from the CHIME response talks about either having security controls or signing a BAA.

And I wanna go back to the BAA for a second. Requiring a BAA is really silly. You're gonna hear me say that on Monday's Newsday show with Mari. It's like inviting people into our bureaucracy and saying, hey, sit here while we take four months to get a BAA signed. It's just not gonna happen. In addition, do we really want to take responsibility

for auditing them as BAA partners? Probably not. I know I wouldn't. So that is, I think, a non-starter; I think that's right out. However, security controls on PHAs, that's a different conversation, and here's where I diverge a little bit from Micky. I think there should be a vetting process of some kind for the applications looking to extract information from the EHRs through the APIs, right?

So that's what they're doing: they're extracting information on behalf of the patient from the EHR through APIs, and I think there should be a vetting process. In the Apple world, Apple has taken responsibility for policing aspects of the applications that they put on the App Store, and I'm thinking something to that effect.
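To make the API piece concrete: the patient-access APIs the Cures Act rules point to generally follow the HL7 FHIR standard, with the personal health app authorized via SMART on FHIR (OAuth 2.0). Here's a minimal sketch of the request such an app would construct; the endpoint, patient ID, and token values are purely hypothetical:

```python
from urllib.parse import urljoin

def build_fhir_request(base_url: str, patient_id: str, access_token: str):
    """Build the URL and headers a personal health app would use to
    pull a patient's record over a standards-based FHIR API."""
    # FHIR read interaction: GET [base]/Patient/[id]
    url = urljoin(base_url.rstrip("/") + "/", f"Patient/{patient_id}")
    headers = {
        # OAuth 2.0 bearer token, obtained via a SMART on FHIR authorization flow
        "Authorization": f"Bearer {access_token}",
        "Accept": "application/fhir+json",
    }
    return url, headers

# Hypothetical endpoint and token, for illustration only
url, headers = build_fhir_request(
    "https://ehr.example-health.org/fhir/r4", "12345", "example-token"
)
print(url)  # https://ehr.example-health.org/fhir/r4/Patient/12345
```

The policy question in the episode isn't about the mechanics of this call; it's about what vetting, if any, happens before an app is allowed to hold that token.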

I think there should be a government mechanism to vet these apps, give them a seal of approval, and allow them to connect to the health systems as required, but I don't believe they have the budget for such a thing. So if I were still at St. Joe's, which as you heard earlier was a 16-hospital system with about seven and a half billion dollars in revenue,

I would be looking to partner with a few other health systems to develop a vetting process and to publish a list of applications. That may become unmanageable at some point, but it works today; there aren't that many apps to go through today. The vetting process may be as simple as reading the Ts and Cs, noting the use of patient data and how they intend to use the patient data, making the patient aware of that, and having them click:

"I'm giving my approval even after understanding how they're gonna use my data, even if that is to use it for things other than just helping me with my health." And once the warning's in place, they can select and move forward or not. But at that point, I feel like I've done my duty as the CIO to our community to make them aware of how their data's gonna be used.

I think that's the minimum; I think we can do that. Perhaps also a submission of a set of security documents, or maybe even an attestation to handle the data securely. I would build a process that could vet maybe 50 applications per month, getting through up to about 600 applications the first year. That's just off the top of my head.
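The lightweight vetting process sketched above could be captured in something as simple as a per-app checklist record. A hypothetical sketch of that idea in Python; the field names are illustrative, drawn only from the steps listed in the episode, not from any standard:

```python
from dataclasses import dataclass, field

@dataclass
class AppVettingRecord:
    """Hypothetical record for a lightweight PHA vetting process."""
    app_name: str
    terms_reviewed: bool = False                     # Ts and Cs read by a reviewer
    data_uses: list = field(default_factory=list)    # how the app says it uses patient data
    security_attestation: bool = False               # vendor attests to security best practices
    patient_warning_text: str = ""                   # disclosure shown before the patient clicks through

    def approved_for_listing(self) -> bool:
        # Minimum bar from the episode: review the terms, document data use,
        # and collect a security attestation before listing the app.
        return self.terms_reviewed and bool(self.data_uses) and self.security_attestation

# Throughput mentioned in the episode: 50 apps per month in year one
APPS_PER_MONTH = 50
print(APPS_PER_MONTH * 12)  # 600 applications in the first year
```

Even a record this thin would support publishing the kind of shared, multi-health-system list described above, with the patient warning text generated from the documented data uses.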

After hearing Micky's response, I would want to do my best to protect the patients, who often don't read the Ts and Cs, and get at least a cursory agreement from the health app that they intend to protect the patient's data with security best practices. This is all really fresh; I just read this this afternoon.

I had the conversation with Micky this afternoon, but this represents a process that I would follow. What I would normally do is start with an objective: protect the patients and adhere to the Cures Act. That's the objective here, right? So I'd postulate some solutions and start to run them by my advisors, a legal team, partners, other organizations that might be facing the same challenge, and see if what I'm thinking is viable. I'd refine that solution about 20 times over the next 30 days, and then I'd move on it.

So that's what I would do if I were CIO for St. Joe's today, and perhaps you can take the idea from here and let me know how it goes. Perhaps you run with it and let me know if it's viable or how it gets changed over the course of time and how you plan to vet these applications and protect your patients.

That's all for today. If you know of someone that might benefit from our channel, please forward them a note. They can subscribe on our website, thisweekhealth.com, or wherever you listen to podcasts: Apple, Google, Overcast, Spotify, Stitcher. You get the picture. We are everywhere. We want to thank our channel sponsors who are investing in our mission to develop the next generation of health leaders: VMware, Hillrom, StarBridge Advisors, McAfee, and Aruba Networks.

Thanks for listening. That's all for now.
