387 - Are We Morally Breaking Our Technicians?
26th March 2026 • The Cone of Shame Veterinary Podcast • Dr. Andy Roark


Shownotes

Dr. Nathan Peterson, DVM, DACVECC, takes us straight into one of the most uncomfortable questions in veterinary medicine: are we sometimes prolonging suffering, and what does that do to our teams? In this episode, Dr. Andy Roark and Dr. Peterson unpack medical futility, moral distress, and veterinary burnout in a way that feels both honest and actionable. They explore what happens when technicians feel stuck providing care that conflicts with their values, and why that tension matters more than we think. You’ll hear practical ideas like creating psychological safety, building a “pause button” for team concerns, and even introducing ethical rounds to navigate tough cases together. If you’ve ever wrestled with end-of-life decisions, team conflict, or the emotional weight of patient care, this conversation will leave you thinking differently and leading better. Gang, let’s get into this episode!

Mentioned in this episode:

On Demand Team Training Bundle

Great medicine can still lead to tough conversations with clients. The Team Training Bundle gives your veterinary team practical tools to handle angry clients and communicate clearly in the exam room, so every interaction builds trust instead of tension. Flexible, on-demand training makes it easy to strengthen communication across your entire clinic.

Learn more about the Team Training Bundle here!

Learn more about Simparica Trio here!

Office Hours w/ Dr. Andy Roark

Inside the Uncharted Veterinary Community, Dr. Andy Roark hosts Office Hours where veterinary leaders can bring real-world challenges and get practical guidance from someone who understands the realities of practice life. These sessions give veterinarians, practice managers, and team leaders a chance to ask questions, workshop difficult situations, and gain perspective on issues like team dynamics, communication, burnout, and clinic operations. Instead of navigating leadership challenges alone, members get direct access to Andy’s insight along with the support of a community of veterinary professionals working through many of the same challenges.

Register for Office Hours here!

Uncharted Practice Owner Summit at NAVC SkillShop

Owning a veterinary practice comes with challenges no one teaches you in school. The Uncharted Practice Owner Summit, hosted at NAVC SkillShop in Orlando, is a hands-on leadership workshop designed to help owners strengthen their teams, improve operations, and make confident business decisions. This small-group, working session focuses on real-world tools and honest conversations, not passive lectures. Spots are limited. Register now!

Register for Uncharted Practice Owner Summit at NAVC SkillShop here!

Transcripts

Speaker: Welcome everybody to the Cone of Shame Veterinary Podcast. I am your host, Dr. Andy Roark. I got such a good one. I am back with my friend Dr. Nathan Peterson. If you don't know Dr. Peterson, he does great work. He is an emergency critical care specialist at the Cornell University College of Veterinary Medicine, and he also studies bioethics. He has done a lot of work around medical futility, which was one of the most popular, most talked-about episodes of the Cone of Shame last year, and so I wanted to have him back on the show. I wanted to talk some more. I've been rolling around in my head the idea of medical futility, and especially the technical staff who may feel like they're providing treatments that are doing nothing but prolonging suffering, and that there's nothing they can do about it. That has just stuck with me. We dive in a bit more into, well, what does this look like as a team conversation? What is the veto button? Is there a veto button? How does that work? We get into that. We start talking about ethics rounds; there are a lot of really great ideas here, and the work that he's doing, I think, is really important. We talk about the intersection of medical futility with moral distress. We talk about burnout, and how all those things really come together. So let's get into it.

Kelsey Beth Carpenter: This is your show. We're glad you're here. We want to help you in your veterinary career. Welcome to the Cone of Shame with Dr. Andy Roark.

Andy: Welcome to the podcast, Dr. Nathan Peterson. Thank you for being back, my friend.

Nathan: Yeah, thanks for having me back.

Andy: For those who don't know you, I'll pause here. You are an emergency critical care specialist. You're on the faculty at the Cornell College of Veterinary Medicine, and you hold a master's degree in bioethics from Harvard. Your research is around ethics, and specifically medical futility, which is what we want to talk about; some of your research was on that. You have this really fascinating perspective, and that's why I'm always so honored to get to sit down with you: you are a clinician, you're in the trenches, you are an emergency critical care specialist, yet your research interests are around ethics. I think that's fascinating. And just based on the conversations and the comments that I got after our last episode, I wanted to explore this a little bit more with you. Let's just start at a high level: talk to me a little bit about medical futility. People have told me, you know, I never heard that term before. Can you just start and lay out what medical futility is? And then also, what drew you to that area of interest in research?

Nathan: What is medical futility? That's a difficult question to answer. It sounds pretty straightforward, but it's really pretty difficult. I think there are different ways that we can conceptualize medical futility and futile treatments. Ultimately, I think there are two big ways to consider it. One is we might say a treatment is futile if there's no way that the proposed treatment could achieve some physiologic end; it's just impossible. That treatment would be futile. There's not a reason to do it. I think the stickier version of the definition is that a treatment might be considered futile if the expected outcome following the treatment is prolonged or continued suffering. So then we might say, you know what, this treatment is futile because it's going to lead to protracted suffering, and it's not ultimately going to change the outcome for the patient. Defining futility was part of both of the studies that I worked on, and I'm still no closer to a one-size-fits-all definition.

Andy: It makes sense that there's not a one-size-fits-all definition. When you say this, defining it as there's no chance of a successful outcome, I'm like, zero is a very small number. I think you and I sort of talked about it before; in our last conversation we mentioned, what if there's a 5% chance? Is that a compelling chance to everyone on the team? Are they on board with pursuing treatment if there's only a 5% chance that this patient is going to improve? When you think about defining futility, or looking at where this happens, do you see that? Do you have the kind of wrangling conversations of, I don't know, is it a 10% chance? Is it 5%? How do you even begin to decide what is officially futile? Because zero chance seems like such a low bar to me. I'm an optimist. I'm like, there's always a chance.

Nathan: When I think about zero chance, I sometimes use the example of treating a viral disease with antibiotics: that's not going to work no matter how many times I try it. That would be an easy example of a futile treatment. My own personal definition of futility is probably some combination of the two. I would ask, what are the chances of a proposed treatment achieving the desired goal, and what does that goal look like? Is it going to lead to protracted suffering? And when I say 0% chance of success of a treatment, maybe another example would be, because I think it's really related to the goals, right? So if, you know, I have a high-performance hunting dog, and the dog comes in and ends up having a forelimb amputation, and then we diagnose it with hip dysplasia and the owners are talking about a total hip. You know, if we do a total hip but you still have a three-legged dog, this dog is still not going to be a high-performance hunting dog. So I think futility is related not only to the proposed treatment, but also to the expected or proposed goal of treatment. That's how we get to those situations of, yeah, I'm never going to be able to return you a completely healthy animal under these circumstances. That might be an example.

Andy: Can you talk a little bit more and elaborate? You talk about goal-focused care, but then you also sometimes talk about "all possible options," I think is the phrase I've heard you use before. And as a criticalist, you are presented with a lot of different ways to take cases. Can you talk about all possible options versus goal-focused care, and your thoughts around that?

Nathan: Yeah. One of the questions we asked in the survey was, do you believe that veterinarians are obligated to present all possible options of treatment to a client? Or are we really obligated to provide only treatments that we think will be beneficial, or will have some value? And it's a tricky topic. I think we all feel this obligation to say, you know what, these are all of the choices here that are possible; what do you want to do? But I think that if we take a step back and say, well, what's going to be in the patient's best interest? What's going to be in the client's best interest? What are their goals, and what are my goals for the patient and for the client? If we really focus on what we're trying to achieve, then I think it's easier for us to tackle that question of what treatments or options I need to present. If I have a client whose goal is short term and I know they have financial constraints, I don't know that I feel an obligation to talk to them about long-term hemodialysis that's going to cost tens of thousands of dollars. So I think that idea of having a clear connection on what our goals are, both for the patient and the client, helps to inform the recommendations that we might make.

Andy: I really liked your example earlier about the high-performance hunting dog, around goals. What's the language that you tend to use when you talk to a pet owner or a client to try to understand what their goals are? I'm trying to think about how exactly you say that in the exam room to suss out the important context in which we're working. How do you help pet owners understand what you're talking about? Because if I say, what are your goals here, they're going to say, a healthy pet; fix my pet, you know, is the goal. What does that sound like when you explore it?

Nathan: Sometimes it just has to be explicit, where it has to be: this is what we've diagnosed, and this is what I expect the clinical course to be. What are your goals? What is it that you're hoping for at the end? And then really hearing that feedback from the client: this is what I'm hoping for. Because then I can say, hey, I don't think I'm going to be able to reach those goals, but let me tell you what I do think we'll be able to reach. So in those types of situations, I'm usually pretty explicit about it: what are you really hoping for at the end here?

Andy: Can you talk a little bit about the connection, or lack thereof, between medical futility, moral distress, and burnout? Those are all topics that get thrown around and kind of jumbled together, but tease those apart for me if you can.

Nathan: Moral distress and burnout are similar, but they're not entirely identical. Moral distress is something that we all experience. Really, the nuts and bolts of it: it's when you feel like you know what the right thing to do is, but you just can't do it. And the reason you can't do it doesn't always matter. There might be some external constraints; there might be some internal constraints. Whatever it is, I know what the right thing to do is, but I can't do it. That leads to that sort of gross feeling, and that's moral distress. Burnout is similar, but burnout, when we talk about it, is really a syndrome characterized by some specific experiences for the person suffering from it: things like separating or distancing themselves from patients or clients, or disinterest, or feelings of dread, those types of things. I think what's been shown is that repeated instances of moral distress certainly can lead to or contribute to burnout. And when we have clinicians and technicians experiencing burnout, that leads to turnover, and it is generally not good for the profession. So recognizing this idea that repeated exposures to moral distress can lead to burnout, I was curious about what it is that contributes to moral distress. What are the things that we find really morally distressing in the profession? And that sort of led me to futility, where it's that sense of, man, I feel quite certain I know what the right thing to do for this patient is, but for some reason the clients aren't on board, and I can't carry out what I think is right until they're on board. That led to moral distress, and to my pursuit of the topic.

Andy: In the survey research you put out, something that stuck with me is that you found 83.7% of technicians said they had been directed to deliver care that they felt was futile. I've thought a lot about that, and then looked at what we're talking about with moral distress as well. What does the stop button look like, I guess? What is the veto card that team members can reach for when they say, this doesn't feel right, or, I'm concerned that we're prolonging suffering? You know what I mean? Are we really doing the right thing for the patient here? In my mind, I don't necessarily see it as a technician veto card, but I definitely think there should be a button that people can push that says, I would like to discuss this; let's get on the same page. What does that mechanism look like?

Nathan: I think there are a variety of ways that we could provide that off-ramp for technicians in particular. The first step, really, is making sure that your practice culture empowers technicians to feel like they can raise concerns. It has to be a psychologically safe place for them, so that they feel like, you know what, not only can I raise concerns, but my concerns are taken seriously. It's not good enough just to pay lip service. And I think one of the interesting things in the survey about this is that there were these directions, or requests, to act against their conscience, to provide futile treatments, and if I recall right, about 80% of the technicians, a large number, actually raised concerns to somebody in the practice. Most of them raised concerns to the doctor, and still ended up carrying out the treatments. So I think the opportunity for people to raise these concerns is there. But beyond raising the concerns, what I would like to see is, first, a psychologically safe place to raise concerns. Second is inclusion in the process of decision making. There's some great research coming out of Colorado State where they involved licensed veterinary technicians in some of these conversations with clients, and they found that client acceptance of plans and engagement in the decision-making process went up when technicians were included in the conversations, because they have a valuable perspective. So create that space. Include them in the conversation. And then I think the last thing is to provide technicians some sort of consequence-free way to opt out of providing certain treatments, sort of like a conscientious objection: I've raised my concerns, I understand them, and I don't think I can in good faith do these treatments. And making that okay in your practice.

Andy: I've seen cases where the doctor and the technician will look at, say, a euthanasia case differently, and I've had technicians say, that's a convenience euthanasia. And I have looked at it and said, I don't see it that way. You know what I mean? I think there are legitimate medical reasons to do this, and I understand the ways they depend on her. And I'm not saying that person is wrong, you know? And I definitely don't want to force them and say, I don't see it that way, so now I need you to participate in this. So when you're talking about kind of an opt-out, is that the type of scenario in your mind? To me, that always seemed like the conscientious objector, and I would sort of say, okay, well, if the technician is not comfortable with this, then I'm not going to force them to participate in this case or do things like that. Is that kind of what you imagine in that way? But this would be more in palliative care, ongoing care, things like that.

Nathan: Obviously it has to be able to be tailored to the circumstances in a practice, right? I'm fortunate that here at Cornell we have a whole bunch of highly qualified, exceptional licensed veterinary technicians. And so if somebody says, you know what, I'm just not comfortable doing these treatments, I'll be able to find somebody who can do them, who maybe just has a slightly different value structure or feels more comfortable with the way that we've arrived at this decision. So yeah, I think what that might look like for me is somebody saying, for a specific procedure, you know what, I'm happy to keep taking care of this patient, but I just can't participate in this specific thing. But I think it could also include, you know what, I'm happy to take on another patient or a couple of other patients if I can swap with somebody to take care of this one. Really, what I want to avoid, or what I think would be better for the profession, is technicians feeling they would lose their job or suffer serious consequences for raising concerns. Because ultimately what happens is we just drive them out of the profession if we don't give them that off-ramp.

Andy: No, I completely agree with that. The idea that someone says, I'm morally opposed to this, and we're like, well, you're going to do it anyway: to me, that runs against the common decency of the type of place that I would want to work. Can you talk a little bit about implementing ethical rounds? That's an idea I think you've brought up or floated before: what it looks like for the team to have open conversations around cases or points of conflict. Is that an accurate representation of the idea?

Nathan: Yeah. It's one of the things that we don't spend a lot of time talking about, right? I think as veterinarians and technicians both, we're pretty comfortable with picking apart medical decision making, kind of like morbidity and mortality rounds: oh, hey, you know what, this is what went wrong, we've identified what went wrong, and this is what we would do differently the next time. So it's that idea of taking that same kind of approach, a morbidity and mortality approach, to how we made our decisions from an ethical or moral perspective. I might have made very sound medical decisions, but maybe the ethical decisions were the ones that were really, really challenging. So I think it's that idea of creating a space where we can have conversations that are open and judgment-free, and really where everybody's opinion carries equal weight. This is not an expert-versus-novice type of situation. We all have our own morals; we're all experts on our own morals. So really, we're entering these conversations as peers. But just creating that space to talk about the decision-making process. Did I identify the stakeholders? Ooh, maybe I attributed the wrong weight. Maybe I identified the wrong person to be making the decisions. Or, these are the values that underpinned my comfort with continuing treatments; can we talk about why you had reservations about it, or something like that? Really, again, it's just talking about it, getting it out there, and making sure that everybody feels like they are allowed to contribute to the conversation.

Andy: Yeah, this feels terrifying to me. I love the idea, you know what I mean? And I am definitely someone who's up for talking about, well, you know, I understand we made the choice this way, or this is kind of the value structure that I used. I generally really enjoy these sorts of philosophical, ethical conversations. But I do imagine myself as a young doctor, or even where I am now, and I feel like the staff is all looking at me and they're like, why did you think this was okay? And my immediate thought would be, I don't know, why did I think it was okay? I clearly made a terrible mistake. And then I'd have to work back through that. It would be very hard for me not to get defensive if I felt like the moral judgment that I made was being questioned. So have you seen best practices around conversations like this?

Nathan: I don't have them at my fingertips to say, you know what, this is a proven way to do it. But certainly there are structured conversations. There's some good research out of Europe, Austria in particular, around this idea of a structured debriefing for ethical challenges. I'll look for some of that, and I'll see if I can shoot it over to you...

Andy: Oh, that'd be great.

Nathan: ...in the reading. But those things exist, I would say, whether you have a formal structure or not. You're absolutely right: one of the hardest parts is the humility it takes to be able to say, hey, I might have messed up, I don't really know. And you have to be willing to be very vulnerable with your staff as you're doing it. I've held sort of informal ethics rounds with students sometimes, where I just say, hey, have you guys experienced any ethical challenges? And surprisingly, they're very comfortable saying, heck yeah, I actually have. And once the conversation gets going, usually it's really a productive conversation. I think uniformly everybody kind of leaves feeling like, yeah, that was good; even if I couldn't prove that my decision making was right, I feel better because I heard what other people were saying and how they were approaching it. Anecdotally, for me, it works pretty well.

Andy: Yeah, that's good to hear. It feels to me like, you show me a team that can have a good, productive conversation like this, and I'll show you a team that has really good psychological safety and a really great culture, because those things would have to be absolutely essential. Those types of conversations, and having the ability to open the floor and let people talk about it, to me that would be an absolutely amazing thing. It's a lot, I think, to make people not feel defensive about decisions that they made. And a lot of times you don't know if you made the right choice. You kind of tried to read the room a little bit on where the pet owners were and what was possible, and to understand the facts. I think that's fascinating. Yeah, I would love to continue to think about those sorts of things. The art of doing a good debrief meeting after either a medical safety incident or, in this case, a medical ethics incident: I'm really interested in what that looks like and how good practices do that.

Nathan: Yeah. And I would say, if you want to add structure to it, you might use something like the way we might conduct an ethical analysis of a case, right? First is to identify what the ethical problem was: okay, this, specifically, is the ethical and moral problem. And then, what options did I have? And you can use different moral theories: duty-based theories, utilitarian theories, principle-based theories, different ideas. And then we want to identify stakeholders, and we want to think about the consequences of our decisions. That allows us then to really lay it out: hey, you know what, besides just having a gut feeling that I did something right or wrong, I can point to a moral theory and say, this theory is kind of what I was making my decision on, from the perspective of benefiting this stakeholder, or something like that.

Andy: Yeah, I think that's fascinating. That's the type of stuff I sort of nerd out on. I do think there's great value in asking how we came to this choice from a really open standpoint. Like, I was looking at something recently, and I was kind of wrestling back and forth with, is this the right call? And I actually got kind of nerded out on it. And you sort of look and you say, from a utilitarian standpoint, I would say this is definitely the call. And from sort of a Buddhist standpoint, you know, where the goal is to reduce suffering above all else, I'd say, well, utilitarianism and Buddhism don't always take you to the exact same answer. It's not that one is right or wrong, but I think those are interesting, and I've found that there's great comfort for people if they at least understand why the other person made the choice that they did. I think a lot of times we tell ourselves stories about, well, this person wanted this, or they just didn't want to do that. And if you can actually talk through the decision process, you can say, that's not the path that I would've taken, but at least I can see it. I can see what they were trying to do and why they took this path.

Nathan: Yeah, absolutely. And I think that's borne out in the research, in both human nursing and the stuff coming out of Colorado State on involving the technicians, if they just feel empowered as part of the conversation. A lot of times it's, I heard straight from the owner's mouth why they're making this decision, and a lot of times that goes a long way to resolving some of the moral distress that they're experiencing.

Andy: I think you're very right. I think a lot of times some of us are in the room and hear exactly what the person says, and also how they say it, and another person may not be there for that. And, you know, they've had a previous experience with a case like this, and they draw heavily from that. And so you end up with these people who are really looking at these cases very differently. Dr. Nathan Peterson, thank you so much for being here. Where can people keep up with you when you have new research coming out? Where can they find it?

Nathan: I'm not big on social media, but I have a LinkedIn page. I'm out there on Facebook. They can also look for me at the Cornell website.

Andy: I will link everything up. We'll get direct links to the Cornell website, to your LinkedIn page, all that sort of stuff. Thank you for being here, guys. Thanks for listening and tuning in, everybody. Take care of yourselves, gang.

Speaker: And that's what I got, guys. Thanks for being here. Thanks to Dr. Peterson for being here. Guys, I hope this was helpful. Check out his LinkedIn page; he's definitely someone to keep up with. I just love his research. I think that he's doing really, really good stuff, for both the people and the pets that we care about. Gang, take care of yourselves, everybody. Be well. I'll talk to you later on.
