AI and the Evolution of Work: Machines Can Be the Machines
Episode 54 • 31st March 2026 • Reimagining Work From Within • Within People
00:00:00 – 00:35:00


Shownotes

We've had two conversations about AI on this podcast: whether it could write your values, and whether it could write your strategy. This one is more personal than that.

In this episode, Emily, Laurie, and Bev Attfield (joining the AI conversation for the first time) explore a question that's become impossible to avoid: what does the arrival of AI actually mean for the way we work, and for a company like Within?

The conversation centres on Within's evolving belief that the evolution of work is superhuman. That AI, used well, might finally liberate us from doing robotic work so we can focus on the things that are irreducibly human. Empathy. Relationships. Creativity. The discomfort of making something that actually matters.

It wouldn't be an honest conversation without the hard parts too. What does it mean for the people inside organizations who aren't ready? What do we lose when we make creativity too frictionless? These are questions we're genuinely sitting with, not ones we've figured out. If you're a leader navigating this right now, we think you'll find yourself in this conversation.

Highlights

  1. Why Within sees AI as an existential question, and an exciting one
  2. What machines can't do: love, dream, discern, and hold accountability
  3. Why the journey to an outcome still matters more than the output itself
  4. What's coming next: Within is writing an AI manifesto, and you're invited along

Transcript

Emily Shelton:

Hello everyone and welcome back to Reimagining Work From Within. I am Emily Shelton, and I'm here in conversation with Laurie Bennett and Bev Attfield. Today we are here to talk about AI. We have had two conversations about AI previously, in relation to purpose and strategy: can AI write your values, and can AI write your strategy? Today is more personal than that. It's about how we work now. So we've been evolving Within's vision on AI. We've been talking about what we believe the future of AI will look like, and we believe that the evolution of work is superhuman. That's what we want to get into today, and it's our first time inviting Bev Attfield in on this conversation with us. So, hi Bev. Hi Laurie. Are you guys ready to chat about AI?

Bev Attfield: Hi Emily. Hey, Laurie. Good to be here. Fun to be on the other side of the mic for a change. And yeah, excited to dig into this juicy topic of the day, which is, I guess, the topic of the day for most people right now.

Laurie Bennett: Yeah. I'm excited to get back into this. It's changed so much since our last couple of conversations, and it's great to have Bev's perspective in here too.

Emily Shelton: Absolutely. Well, I wanna get us right into the conversation. I think that we have two different perspectives to share here today: what we're seeing internally and what we're seeing externally.

Bev Attfield: So I'm gonna bounce this back and forth between those two different perspectives, and let's see what we learn together. Laurie, where does Within find itself right now in relation to AI? How has the conversation shifted over the last year?

Laurie Bennett: Well, I think it's shifted from being this interesting thing to explore that's kind of new and fun and not that scary, mostly exciting, into something that's really become all-consuming of workplace conversations around culture and the future. It feels impossible now to talk about what's happening with the evolution of work without somehow factoring in AI. Some of the deeper thinking that we've been doing over this time has been: what does the arrival of AI into this space mean for a company like Within? We've gone from seeing it as a quirk to deal with in our work to, possibly, and this sounds terribly dramatic as an opening statement in the podcast, an existential threat to what we do. Why would people need human Within people to come in and help them with aspects of their culture articulation and development, the tools we need to understand how things are going inside a culture, the ability to build strategy, when AI can seemingly do so much of that work without us? And so the thinking that we've been doing is kind of, well, what's Within's role in a future that has AI in it, and how do we continue to make that something that inspires us and that gives us a mandate to be doing great work in service of more human work out in the world?

Emily Shelton: Yeah, that's great. So would you say that we're no longer asking whether AI will work for us, but how it will be working for us? Is that a shift that you've seen in the tone?

Laurie Bennett: Yeah. I think we're resigned to the fact that it's gonna be here and it's gonna make a really big splash. And so I think, yeah, it's less about whether or not it's going to be a part of our future in a meaningful way, but how do we make it a part of our future in a way that's really meaningful to us? The big questions that sit out there right now are around AI replacing people in the workplace, AI taking on the kinds of things that people do at work, the big question of, well, what do people do at work and what work is left behind? And I think we're finding ourselves now having some really interesting thoughts, from Within's perspective, around how we can use this amazingly powerful tool not to replace people, but actually to make work more human than it's ever been before.

Emily Shelton: And what about you, Bev? What are you seeing out in the world of work? What are organizations and employees grappling with right now?

Bev Attfield: Yeah. I think we're in this interesting moment in time where, as we're having these thoughts and considering what this means for us as a business and a group of folks who are passionate about work and how to leverage that for people, we are in the context of the wider world, and there is this significant shift that is happening for society. I'm reading a book right now by a person who is interested in the economics of this shift that we're making. His name is Iad Mustique, and the book is The Last Economy: A Guide to the Age of Intelligent Economics. And I'll just read the quote, because this really made me sit up last night as I was thinking about this conversation. He says that we are the generation that will live through the discontinuity, the last humans to remember when human thought had economic value, and we are the first to discover what comes after. And that gave me chills, but also goosebumps of excitement around the opportunity that this could offer to us.

And I think right now, as a business, you can take a glass-half-full or a glass-half-empty approach to this, right? There is very much in the conversation right now this idea of the new technology potentially leading to the destruction of the human race. That's the very far end. Laurie, you were talking about being alarmist or dramatic, but that could be one outcome, right? There is another outcome, of many, that could be that we create this technology and we adapt and we use it to be more prosperous than we've ever been before as a species. Right. And for me, as I think about what Within's position on this is: we can't control or change the course that AI is going to take, but we are able to listen to our clients and take a position on what this could mean for people in the workplace, and maybe, once and for all, we can actually find a way to let humans do the things that they are really great at, right? Creating, innovating, planning, thinking, dreaming, right? Those are all the things.

However, there are some realities that are going to get in our way. There are a few things that I'm seeing out there in the world right now. There's a lot of optimism about what AI is gonna do and how it's gonna make us more efficient, and yes, it's probably going to lead to some jobs being displaced, doing things that humans used to do, right? Yes, that is reality. But I think that there is also this friction where we are not ready to actually capitalize on what this could mean for us as humans. So there's a lag between the appetite for change and the ability to actually structurally make the change in business. And a big part of that is this tension: the C-suite and stakeholders wanna jump on this AI bandwagon and get the productivity gains, but then we've got the folks who are advocating for the people, the workers, the doers of the things, the makers of the widgets in these businesses, and they aren't there yet. And so there's this real tension between the HR function and, generally speaking, the C-suite function. And what I think is fascinating about that is that this is not a technology problem to solve. This is a human problem to solve.

And I think we have to be aware, and I know that we already are thinking about this for our own clients and for ourselves, around how are we gonna bring people along with us? People who are fearful, anxious, don't have a great appetite for change. In the global talent trends report from Mercer for this year (we just had a conversation with Kate Bravery, who is one of the lead authors of that report), only 44% of employees are thriving right now. That's the lowest number in that report in 10 years, right? So when you've got this base of people who are already struggling, and you've now got this pressure to perform and adapt because of this existential crisis that we're feeling, that is a recipe for disaster, right? You can't bring disenfranchised people along with you. So I think there's a real human challenge, and I think it's going to rely on the leaders to now step in and lead in a different way. And I mean, that's a drum we've been beating for 10 years, isn't it? Around human-centered leadership. So there's a big swirl, like there's a lot in there that I just kind of threw into this conversation, but I think it all comes down, for me, to one thing. Again, it's a human problem. It's not a technology problem.

Emily Shelton: Absolutely. There's so much to dig into. I love that quote, and the ending of it was what stuck with me: that we have the opportunity to shape how this evolution moves forward, that we get to decide what it looks like on the other side of it. To me, that feels pretty empowering. I wanna focus in on the vision that we've been talking about internally, which is that the evolution of work is superhuman. I wanna get a little bit more into the risks and realities, but before that, I wanna dig into "the evolution of work is superhuman." I'd love to actually understand where that came from. Laurie, could you give us a little bit of insight into our thinking there?

Laurie Bennett: Yeah. The evolution of work is superhuman. As we've been thinking about not just AI, but what the future of work looks like, what's always been important to Within is that work is a place of meaning and purpose, and that the way to get to that, for the last 10 or 15 years, has been to tap into the humanity that people get to experience at work, the real things that make people connect to each other and connect to the impact that their work has. And so much of the time that's been about starting to move away from the structures and the systems that were developed through the Industrial Revolution in the name of efficiency and productivity, where workplaces became these very structured environments, very high on control: the way that you work, the way that you show up, the way that you dress, the way that you're managed, almost like a machine. Ironically, we invented machines to help people at work, and then we had people behave like machines alongside them.

One of the things that seems to be really exciting about AI, and the associated machine learning capabilities that it brings, is: well, maybe now, for the first time in a really long time, people don't have to be machines. Machines can be the machines, and people can be the people. There's something really powerful about that idea. When we think about the future of work and how we make it human, AI can feel like the single biggest threat to that: that it's going to come in and replace and reduce the value of humans in a workplace. But there's another side to that picture, which is that maybe AI is exactly the thing we need in order to finally liberate us from having to behave like robots at work. If AI can take on the work that has felt robotic and mechanical, and can bring knowledge, yes, and insight, this amazing ability to process information and data, what does that do to enable people and empower them in ways that we've never had access to before? Years of education and training and learning can be at our fingertips now, seconds away. That can be a really terrifying idea. Or it can be one which, I think, if we can leverage and channel it in the right ways, could actually help us make the work that we do as people more human than ever before. And that's really exciting.

Considering the evolution of work as being this superhuman place where we get to concentrate on the aspects of being a human that are most powerful, our capacities for empathy and imagination and connection and love, all these qualities that we know are the foundation of what makes work meaningful and that underpin ways of leading that make for way more human workplaces: that's the kind of stuff we get to really focus in on, knowing that the other pieces and aspects of the work that we do can be boosted by this incredible technology that can help us step into that space. So for us, I think it's an ambition, and it's a bit of a call to say: what can we do with this technology to make sure that it enhances the human experience of work rather than removes it? It's a huge challenge that I know we don't have the answers to yet, but that's the way of looking at it that I think makes sense for us as Within to hold. And I think it's the thing that a lot of leaders out there are struggling with right now, which is, as Bev said just now, how to balance the incredible opportunity for efficiency and productivity that AI brings with the high potential for a real toll on the human experience of work. How do we hold those two things together while we take that dark force, bring it over to the light side, and use it for good?

Bev Attfield: Yeah. I'm gonna hop on that real quick with a bit of a hot take. I think you're right, Laurie. I think that there is lots of room to feel optimistic, but I think that we are also at the mercy of a handful of actors here who are determining the course of human history, right? We've got just a handful of really powerful people who are leading the way with AI development, to get us to artificial general intelligence, and I've been quite skeptical around their motivations and what that means for the rest of us, right? We are downstream from their decisions. But there's a glimmer of hope, because I was listening to an interview with the co-founder of Anthropic, Daniela Amodei, who has had some quite vocal opinions about what AI should do and what humans should do. And I'm gonna read another quote, because I think this is really something for us to hold onto as a glimmer of hope for the course of the future here. She said: in a world where AI is very smart and capable of doing many things, the things that make us human will become much more important, not less. And I really felt like that aligned with where Within feels we need to go. I mean, that is about the future of work, or the evolution of work, being superhuman, right? It is about doubling down on the things that humans are able to do that we should not delegate or automate and allow machines to do. Right? And why wouldn't we want a future where we could get the best of both, right? Not the best of one at the expense of the other. So what I'm holding right now is a bit of helplessness, because we are all just in this simulation, as some people are calling it, depending on your philosophical view. But nonetheless, this is the reality, right? There are other people making really important decisions right now, and I think that's why it's really important for all of us to have an opinion and a voice around what we want this to look like for us and what it could be, rather than just having this decision thrust upon us and having to deal with the fallout of it.

Emily Shelton: Yeah. Really, really powerful, and it leads me to my next question, Bev, which is: we have this vision of what we see AI could be, the potential of AI. Is that vision something that you're seeing in other leaders? We've talked about the select five that are making some big choices, but in the other group of leaders that you're seeing day to day, do you think that they have similar ambitions?

Bev Attfield: Look, I think we're still grappling with what this actually means for us in the day to day, and I don't think anyone actually has the answer. I know that a number of our clients are really forward thinkers, and they are trying to solve this problem for their own business and adapt as they need to. But I think the reality is it's a moment in time where I don't think anybody really knows. I mean, I suppose you could say that about any big change in the course of history, but I just feel like the velocity of this change is what's curious and different for us. And I think leaders who can say, "I actually don't have the answer, and I'm just going to bring curiosity to this," and have a conversation with people around what they're feeling right now, that's the most important thing you can do, right? Because I just don't think that there is anybody out there who actually knows what this is truly gonna mean for us as we go forward. Laurie, what do you think?

Laurie Bennett: Yeah, I do think that that's true, and I think it's true as well, as you say, of so many aspects of what's going on. AI is one thing right now in a tumultuous world, and I think when you don't know what's happening in the future, you gotta try to shape and create it in some way. And whilst there are a few people who have a lot of power around this, I think there are a lot of people who have some power around this, especially within organizations, and leaders in organizations, to be able to make this something which adds something meaningful to the human experience of work. And I think we've said that for a long time; it's why we as Within work with business. Businesses are these amazing containers of society, and the leaders within businesses have a real capacity to shape the experience of a large number of people. And this is no different. The tool is the tool, and the way that we use it becomes the really important thing. In the wrong hands, it could become something that's really destructive to culture and, beyond that, to society. Used with the right intentions, and in a way that is designed to support more people rather than fewer, I think it can be an incredibly powerful thing. And I think there are so many opportunities for leaders, even now, even in the confusion of what all this is going to mean, to turn back to their own purpose and their own set of values and ask themselves: how does this new thing help us live into those things more? And as we think about integrating it and bringing it into our worlds, how do those aspects of what have always been true and important for us continue to shape and guide how we step into an unknown space around this? And so I think the culture of an organization has so much power to shape what AI looks like inside that organization. And AI has so much potential to enhance things for us if we use it and approach it in that way.

Emily Shelton: Yeah. Thank you. Oh, go ahead, Bev. What were you gonna...

Bev Attfield: Yeah, I was gonna jump in there. I was just thinking, Laurie, as you were talking, that before AI even arrived, humans had done a pretty good job of destroying culture, right? With toxic leadership, with practices and ways of being that have not served the wellbeing and the creativity and the joy of humans, right? I mean, historically, over the years, employee engagement has been very low, right? It's very hard to engage people, and that was before AI. So I think, hey, let's see what AI can do for us. Maybe we'll be able to move that needle a couple of points.

Laurie Bennett: And I think it's like anything new that comes in like this: it doesn't necessarily change everything, but it changes the volume of everything. So if you've been leading an inhuman culture, AI will accelerate that process for you. If you've been leading a really human culture, AI has the possibility to accelerate or amplify that for you. So the human is still sitting there, holding the controls for this. And I think one of the challenges in these moments is not to throw out everything because the system has changed or the paradigm has shifted, and imagine that it needs something completely different, when in fact so much of what has been developed inside of cultures up until now will be the very things that we need to draw on to be able to navigate this disruption too.

Emily Shelton: Well, I think that we've done a pretty good job chatting about the risks and the opportunities here for AI. Something I wanted to just touch on: we've talked a lot about what could be at risk with AI societally, but I think that there's also potential loss in creative processes for individuals as well. I think that if we find people relying too heavily on AI to create something, we miss the moments of discomfort that are required for making something, if that makes sense. I think that AI is really great at streamlining, and I think that streamlining isn't always needed in creative environments. Right? So, I don't know. I just think that that's something also to consider: that we humans confront struggle when we're creating or collaborating with individuals, but that struggle is what helps us get to the other side and make something meaningful. And while AI has the ability to make our lives easier, maybe there's risk in that too. So, just something that I've been chewing on lately. Is it better to make, or to make with intention? Right. And AI does not have the ability to discern like we do. So now I'm just on my...

Bev Attfield: Soapbox.

Emily Shelton: But...

Bev Attfield: That's an interesting perspective, and I would say, dot dot dot, yet. Right. I think for us to imagine that we will not get to a place where artificial intelligence can think and create like us, we need to be careful with that. Yeah.

I think the other force that we're gonna have to reckon with is the market tension of commoditizing something like creativity, right? In that quote at the beginning that I shared, around what is the value of human thought: when it's cents on the dollar for creating a new logo, versus spending $500 with the graphic artist, the market isn't going to be able to support the graphic artist for very long. So I think we're gonna have to reckon with some realities that there are market forces at play. Yes, we don't wanna lose what the essence of being human is, being able to have creative thought, but how do you combat the market force of that? Right? How do you protect that? Especially if it's your profession, right? If it's your intellectual pursuit to be an artist and to go into the forest and paint, well, that's a different conversation than if it's someone's profession, right? So...

Emily Shelton: Yeah.

Bev Attfield: Absolutely. I think there are more conversations to come, and that's where I think I'll leave that.

Laurie Bennett: This is such a juicy one. I love this. I feel like this is the debate that's out there right now. Back, Emily, to some of those previous conversations we had around can AI create the values or write your strategy, the executive summary of those being: it can give you a really convincing outcome.

Emily Shelton: Mm-hmm.

Laurie Bennett: But the journey to it has lost all its meaning, and there's something really important about those journeys and how things happen, not just the outcome that we arrive at. Yeah. And so I think that's gonna be the kind of stuff that we wrestle with and wrangle with. These are things which have been deeply human processes in the past, around creativity, around, as you mentioned there, the ability to discern and have wisdom. And there's something in that which is: yeah, AI is gonna get really convincing in those spaces, and ultimately it can't be in the same space as actual human connection in that world. AI cannot love you. It can sound like it does. It can tell you that it does, but it can't. And can it have empathy in the way that a human has empathy? I don't believe so. Can it dream about a future state in the way that a human can, driven by an emotion rather than an algorithm? It can give you a billion possible future states, but it can't tell you which one makes your insides... I think there's gonna be some things in that kind of space that become really important, and I don't think AI necessarily will remove the value of them. It might increase the value of those particular things, the things that are still inaccessible in a world where seemingly everything becomes accessible. And the thing that I've been really encouraged by, and I think this is probably mostly from creative agencies and people out there in the world, is the concept of slop, which seems to be a word that's arriving into the AI universe: it's fine, but it's slop. It'll do the job, but it doesn't have the magic somehow. And I think the value of not-slop will still hold something. And I think the human experience of creating together, the feeling, the empathy of dreaming together, that's the stuff that I want to see become the things that we get to do the most of, because that's where our value really lies. That's where the source of joy and meaning in our work sits. And if AI can tee us up to do those things with even more energy and insight and focus, then bring it on.

Bev Attfield: Yeah. Yes. I think what you're hitting on there for me, Laurie, is that it's relationships at the heart of it, right? And I have some real questions around: can a machine be a proxy for the relationships that we can build? Right. If we bring this down to our context, and maybe there are some listeners here who are also in consulting or B2B situations having the same existential crisis, it's the relationships that are the foundation of how we run our businesses, right? And our clients don't come to us simply for outputs. They come to us for transformation. And that transformation happens because we can build trust, we can listen in the room, we can co-create something, whether it be behind a Zoom screen or in person in a joyful workshop. Right. And I think it's the connection and the community and the touchpoint that we offer, and that we can build because we have that capacity as these organic beings. That's where we are most certainly doubling down, right? When we say that the evolution of work is superhuman, it's more human than it's ever been, right? Because we have the capacity to do the things that machines cannot. Right. And I mean, the future will tell us, right? We'll see how that unfolds. But I think for now, as I'm thinking about it, it's the relationships that are what we offer, and that's where we get joy in our work. Right. It's not just using a tool for the sake of using a tool. Right? Yes, we will rest on the tool to help us do the things that we want to do, most certainly. But it comes down to the connection and the humanness of it all.

Laurie Bennett: Yeah. It's in danger, isn't it, of sounding like work becomes this whimsical place where we dream and have empathy for each other. But I think it also touches on some really critical aspects of decisions and accountability. I don't think that AI is going to be well positioned to make decisions on behalf of people yet. It can inform them and provide insight, but ultimately somebody is holding the accountability for that, and that can't be devolved to a machine. And it's really difficult to imagine that sort of capability inside a business being replaced by AI.

And I think that's the danger of AI coming into organizations in the wrong ways: imagining that it is a replacement for creativity or empathy and relationship or decision making, that we can devolve those kinds of things to it. That the coach you speak to, or the mentor that you work with, is a bot, not a person. That the creative process becomes feeding a prompt and having something appear magically seconds later. That decisions become a quantitative assessment of logic by a bot. I think we get into some really dangerous territory around what happens inside organizations if we replace the wrong stuff as we go forward. Yeah.

Emily Shelton: Yeah, I agree. Well, I want to be aware of our time. This was going to be a short, snappy conversation, but it's impossible for us not to dig in on these juicy topics. So I do want to speak a little bit about what we're doing at Within. We're working on something currently, a way of articulating what we believe about AI and how we use it at Within, so we'll be bringing updates to you about that over the coming months. This definitely feels like a continued conversation, right? This is not the end, and it's not the beginning either; we started it a couple of years ago. So I would love to continue this conversation with you over the next year and see how it continues to evolve and develop.

I would like to ask you one more thing before we go: if we were to reopen this conversation in a month or two months, where do you think we should go next with it? What feels like a natural place for us to dig into or to look at as we are on this journey?

Bev Attfield: A lot can change in two months.

Emily Shelton: It's true.

Bev Attfield: I think we'll have to recalibrate in two months about what has actually changed since this conversation.

Laurie Bennett: Well, Bev will be a bot in two months, and so that will change everything.

Bev Attfield: Yes. Yes. The upload will be complete.

Laurie Bennett: I really love these tensions that we've started to talk about. I think there are some really interesting and important things to tease apart in conversations about creativity and decision making and empathy and coaching. And really, how do you start to use this power of discernment that we're saying is so fundamentally human? How can we start practicing that now to understand, really practically, what are the things that AI can really support us with and help us with, and what aren't? How do we tell the difference between those two? And how do we start to make sure that the kinds of systems we're building out inside our cultures and inside our businesses respect that boundary?

Bev Attfield: Yeah, I think there are so many rabbit holes we can go down as we continue this conversation, and I love this format. I hope that others will join us and be part of the conversation; that would be great. We'd love to hear from our audience about things that you would like to hear us riff on and bring you our thoughts on. But I'd like us to keep focused on this notion that this is a human problem, not a technology problem. I think there's much to solve and much to be hopeful about, but we have to have zesty, hard, challenging conversations around it, right? Because we still have the ability to shape our future. So let's lean into the humanness as we go forward from here, controlling what we can within our own business and for the people that we impact through our work.

Emily Shelton: Awesome. Amazing. Well, I'm excited to continue to dig into it with you and excited to witness this evolution. It does feel like we are in a very interesting spot as human people, getting to experience this alongside the evolution of technology. So thank you both for your time, thank you both for sharing your thoughts, and we will see you next time on our next AI pod conversation. So stay tuned, listeners. Thanks.

Reimagining Work from Within is available wherever you listen to podcasts.
