This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.
Hey everyone, I'm Drex, and this is the Two Minute Drill. But today I'm gonna skip some of the news stories and talk about something that's happening right now that I think we need to get ahead of, because there's a synthetic media tidal wave coming, and I don't know that we're really prepared for it. Honestly, I don't know exactly how to prepare for it. I just think we need to be thinking about it. I've talked about some of these synthetic media challenges from time to time in the past. I do a show called UnFake, and I can certainly reshare some of those links, but this past week or so has really been different.
You've probably seen demos of some of this video creation capability in your social media channels. Meta just dropped its answer to OpenAI's Sora. Meta's product is called Vibes, a new video generator that turns simple text prompts into vivid cinematic scenes. The videos are clean, they're smooth, they're surprisingly believable.
And yeah, the tech is incredible. But from a cybersecurity standpoint, it's also a little terrifying. We live in a world now where anyone from a bored teenager to a nation state can spin up hyperrealistic video content in just minutes. No studio, no camera, just a prompt like "hospital CEO addresses staff about a cyber incident," and then, boom, a fake video message that looks real enough to fool most people.
And while Meta and OpenAI are doing things like enforcing a watermark or a symbol on the video as an easy indicator of whether the video is real or AI generated, those marks, those watermarks, those labels are often really small, and they're placed in a spot that makes it easy for creators to just crop them out.
So think about what all this might mean for healthcare. Phishing campaigns that don't rely on bad grammar or sketchy emails; they show up as a video message that looks like your boss. Or deepfake patients or families sending fake pleas for help, asking for GoFundMe donations, or saying bad things about your facilities, your staff, or individuals. Or disinformation campaigns during a crisis that erode trust in leadership right when you need it the most.
We've already seen some of these uses of the technology in the past. Some of them may be targeted specifically at your individual teammates to distract them in the middle of a cyber event or another crisis where they're a key leader. Healthcare professionals are already dealing with burnout and staffing shortages and complex technology stacks, and I know you're doing your best to make improvements in all of that. But now you throw in this wave of AI generated noise and videos and fake alerts and fake patient stories, and it's one more cognitive load for people who are already hyper stressed and doing some of the hardest work there is in the world. This isn't about banning tech. I'm not saying that. It's about understanding what's coming and trying to get ahead of it.
Start thinking about creating a synthetic media policy that covers everything from verification to brand protection to employee training, and in the near term, especially employee training. It's the stuff coming from the outside that concerns me the most, but we also have to make sure our internal teams aren't accidentally becoming part of that problem in the process.
And look, I know you don't need another policy. So just consider the policy discussion as a way to get the right people in the room to talk about the issue and about the right next steps for you and your organization. Because if we don't start now, we'll spend half our time chasing what's real and what's fake, and that's a game the bad guys are already playing to win.
As always, thanks for being here. That's today's two minute drill. Stay a little paranoid. I'll see you around campus.