This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.
Today in Health IT, we're discussing humans feel pity for robots: ethical concerns emerge from new research. I'm Kate Gamble, Managing Editor at This Week Health, where we host a set of channels and events dedicated to transforming healthcare, one connection at a time.
I've spent the last 12 years interviewing healthcare leaders, and I'm excited to bring that knowledge into this community. So today we're discussing humans feel pity for robots, ethical concerns emerge from new research. And I'm joined by Sarah Richardson, President of 229 Executive Development Community.
Sarah, thanks for being here. Always a pleasure to join you, Kate.
Research from Marieke Wieringa at Radboud University indicates that humans exhibit feelings of pity toward robots displaying distress signals, such as sad sounds or trembling movements. In experiments, participants showed reluctance to mistreat robots that appeared to experience pain, highlighting a disconnect between their understanding of robots as inanimate objects and their instinctive empathy, drawing parallels to the Tamagotchi phenomenon. For one thing, it shows how far technology has come that we're even dealing with this, robots showing emotion. So I know there's a lot here, but what stood out to you most about this? What stood out most to me, other than still wanting to see C-3PO, let's be honest, everybody kind of wishes that he could just show up at your house and help you take care of things in the six million different languages he can speak and understand.
People are less likely to mistreat robots that emit distress signals, like sad sounds or trembling. And then the study talks about the ethical questions around designing robots to simulate emotions, which could be exploited for profit. And then there's the idea of emotional robots benefiting therapeutic environments, where human empathy could improve care outcomes.
You mentioned the past examples like Tamagotchis that show humans have a history of forming attachments to inanimate objects with personality traits. It's also important that calls for regulation emphasize the need to prevent companies from exploiting human empathy through emotionally expressive robots.
And yet I recall at ViVE, there was a golden retriever puppy therapy dog that had been developed by a gentleman during his mother's disease progression, when sundowning was affecting her. Sundowning is a term used to describe a state of confusion, agitation, or increased behavioral changes that some individuals with dementia experience later in the day. And this therapy dog, which was essentially a golden retriever puppy robot, helped with that.
And when you think about the components of being a healthcare CIO, understanding the human-robot interaction dynamic is essential, especially as emotional robots and robotics could play a future role in patient care and therapeutic settings. Consider an emotional robot enhancing the patient experience in long-term care settings by providing companionship and even easing emotional distress.
However, it's important for us to consider the ethical concerns and regulatory implications surrounding the deployment of technology like this in a healthcare environment. I remember when Furbies came out and they were considered a potential national security threat because they could act as a listening device.
Yeah, I remember that too. And that brings up a really good point: you and I say that cybersecurity has to fit into every single strategy, especially with something like this. When you think about the potential, unfortunately, for hackers to get into these systems, that has to be a top consideration.
It'll always be a top consideration, because whether it's at the edge, at your house, a listening device, etc., most home networks especially, unless you're in IT or have a security lens, are not secure per se. And if you're going to bring a robot either into a home or even into a long-term care setting, it's worth really discussing the balance between beneficial and manipulative uses of emotionally expressive robots in healthcare. It may reduce staff workload and enhance patient interaction, particularly in elder care.
And then you think about the ethical and regulatory landscape that helps guide responsible development and use of robots that simulate emotions, inclusive of those cyber elements. Exactly. And not to make light of it, but I do think about Rosie from The Jetsons, and Rosie had emotions. But in all seriousness, like you said, we've seen robots fill in gaps when you're talking about tasks that are somewhat menial, and the ability to have those done can make a big difference. So it's really going to be very interesting to see what happens here. Any final thoughts on what leaders need to know in this space?
Yes. If I were going to continue to, and I will, think about the implications of empathy-driven technology on patient care standards and the overall human experience, that's a space I will lean into for multiple reasons. A, we've often discussed: can AI, can robots, can technology have an empathy lens in how care is being delivered?
I'm going to need a robot at some point to help take care of me because I don't have kids. And I'm like, okay, I want Rosie or I want C-3PO or whatever that's going to look like. The thing about it is, if we have a reduction in staffing, and I mean a shortage of healthcare workers and people able to take care of a rapidly aging population, we're going to see the biggest influx in the next 20 years of eldercare being a component of our healthcare system.
And if the robotic technology can accelerate quickly enough for that to be a component of care that is reliable, that is safe, that has empathy, that gives patients the interaction that is so critical to their mental well-being as well as their physical well-being, we're on to something that could truly make a difference.
And all of it comes back to the technological implications of making that successful in the delivery of healthcare. Really great points. And we are pro-robot. Don't forget to share this podcast with a friend or colleague. Use it as a foundation for daily or weekly discussions on the topics that are relevant to you and the industry.
They can subscribe wherever they listen to podcasts. Sarah, thank you as always for joining me. Thank you. And I'm laughing because you said we're pro-robot, unless it's the Roomba, which our pets all hate. And so we've already established that the Roomba doesn't go over well in our homes, because all of our pets growl at the Roomba.
Yes. Good point. Always great to join you. And I appreciate the conversation. Thank you all for listening, and that's it.