Emily Shelton:
00:00:17
Hello everyone and welcome back to Reimagining Work From Within. I am Emily Shelton, and I'm here in conversation with Laurie Bennett and Bev Attfield. Today we are here to talk about AI. We have had two conversations about AI previously, in relation to purpose and strategy: can AI write your values, and can AI write your strategy? Today is more personal than that. It's about how we work now. We've been evolving Within's vision on AI. We've been talking about what we believe the future of AI will look like, and we believe that the evolution of work is superhuman. That's what we want to get into today, and it's our first time inviting Bev Attfield in on this conversation with us. So, hi Bev. Hi Laurie. Are you guys ready to chat about AI?

Bev Attfield:
00:01:09
Hi Emily. Hey, Laurie. Good to be here. Fun to be on the other side of the mic for a change. And yeah, excited to dig into this juicy topic of the day, which is, I guess, the topic of the day for most people right now.

Laurie Bennett:
00:01:22
Yeah. I'm excited to get back into this. It's changed so much since our last couple of conversations, and great to have Bev's perspective in here too.

Emily Shelton:
00:01:32
Absolutely. Well, I wanna kick us right into the conversation. I think that we have two different perspectives to share here today: what we're seeing internally and what we're seeing externally.

Emily Shelton:
00:01:46
So I'm gonna bounce this back and forth between those two different perspectives, and let's see what we learn together. Laurie, where does Within find itself right now in relation to AI? How has the conversation shifted over the last year?

Laurie Bennett:
00:02:03
Well, I think it's shifted from being this interesting thing to explore, something kind of new and fun and not that scary, mostly exciting, into something that's really become all-consuming of workplace conversations around culture and the future. It feels impossible now to talk about what's happening with the evolution of work without somehow factoring in AI. Some of the deeper thinking that we've been doing over this time has been: what does the arrival of AI into this space mean for a company like Within? We've gone from seeing it as a quirk to deal with in our work to, possibly, and this sounds terribly dramatic as an opening statement in the podcast, an existential threat to what we do. Why would people need human Within people to come in and help them with aspects of their culture articulation and development, the tools that we need to understand how things are going inside a culture, the ability to build strategy, when AI can seemingly do so much of that work without us? And so the thinking that we've been doing is, well, what's Within's role in a future that has AI in it, and how do we continue to make that something that inspires us and that gives us a mandate to be doing great work in service of more human work out in the world?

Emily Shelton:
00:03:37
Yeah, that's great. So would you say that we're no longer asking whether AI will work for us, but how it will be working for us? Is that a shift that you've seen in the tone?

Laurie Bennett:
00:03:50
Yeah. I think we're resigned to the fact that it's gonna be here and it's gonna make a really big splash. And so I think, yeah, it's less about whether or not it's going to be a part of our future in a meaningful way, but how do we make it a part of our future in a way that's really meaningful to us? There are big questions that sit out there right now around AI replacing people in the workplace, AI taking on the kinds of things that people do at work, and the big question of, well, what do people do at work, and what work is left behind? And I think we're finding ourselves now having some really interesting thoughts from Within's perspective around how we can use this amazingly powerful tool not to replace people, but actually to make work more human than it's ever been before.

Emily Shelton:
00:04:46
And what about you, Bev? What are you seeing out in the world of work? What are organizations and employees grappling with right now?

Bev Attfield:
00:04:53
Yeah. I mean, we're in this interesting moment in time where, as we're having these thoughts and considering what this means for us as a business and a group of folks who are passionate about work and how to leverage that for people, we are in the context of the wider world, and there is this significant shift that is happening for society. I'm reading a book right now by a person who is interested in the economics of this shift that we're making. His name is Iad Mustique, and the book is The Last Economy: A Guide to the Age of Intelligent Economics. And I'll just read the quote, because this really made me sit up last night as I was thinking about this conversation. He says that we are the generation that will live through the discontinuity, the last humans to remember when human thought had economic value, and we are the first to discover what comes after.

And that gave me chills, but also gave me, like, goosebumps of excitement around the opportunity that this could offer to us. And I think, right now, as a business you can take a glass-half-full or a glass-half-empty approach to this, right? There is very much a conversation right now around this new technology potentially leading to the destruction of the human race. That's the very far end of it; Laurie, you were talking about being alarmist or dramatic, but that could be one outcome, right? There is another outcome, one of many, which is that we create this technology, and we adapt, and we use it to be more prosperous than we've ever been before as a species. Right. And I think, for me, as I think about what Within's position on this is: I mean, we can't control or change the process and the course that AI is going to take, but we are able to listen to our clients and take a position on what this could mean for people in the workplace, and maybe, once and for all, we can actually find a way to let humans do the things that they are really great at, right? Creating, innovating, planning, thinking, dreaming, right? Those are all the things.

However, there are some realities that are going to get in our way. There are a few things that I'm seeing out there in the world right now. There's a lot of optimism about what AI is gonna do and how it's gonna make us more efficient, and yes, it's probably going to lead to some jobs being displaced, with AI doing things that humans used to do, right? Yes, that is reality. But I think that there is also this friction where we are not ready to actually capitalize on what this could mean for us as humans. So there's a lag between the appetite for change and the ability to actually, structurally, make the change in business. And a big part of that is this tension: the C-suite and stakeholders wanna jump on this AI bandwagon and get the productivity gains, but then we've got the folks who are advocating for the people, the workers, the doers of the things, the makers of the widgets in these businesses, who aren't there yet. And so there's this real tension between the HR function and, generally speaking, the C-suite function. And what I think is fascinating about that is this: it is not a technology problem to solve. This is a human problem to solve.

And I think we have to be aware, and I know that we already are thinking about this for our own clients and for ourselves, around how we are gonna bring people along with us: people who are fearful, anxious, who don't have a great appetite for change. In the Global Talent Trends report from Mercer for this year, and we just had a conversation with Kate Bravery, who is one of the lead authors of that report, only 44% of employees are thriving right now. Right? That's the lowest number in that report in 10 years. So when you've got this base of people who are already struggling, and you've now got this pressure to perform and adapt because of this existential crisis that we're feeling, that is a recipe for disaster, right? You can't bring disenfranchised people along with you. So I think there's a real human challenge, and I think it's going to rely on the leaders to now step in and lead in a different way. And I mean, that's a drum we've been beating for 10 years, isn't it? Around human-centered leadership. So I think there's a big swirl, like there's a lot in there that I just kind of threw into this conversation, but I think it all comes down, for me, to one thing. Again, it's a human problem. It's not a technology problem.

Emily Shelton:
00:09:26
Absolutely. There's so much to dig into. I love that quote, and the ending of that quote was what stuck with me: we have the opportunity to shape how this evolution moves forward, and we get to decide what it looks like on the other side of it. That, to me, feels pretty empowering. I wanna focus in on the vision that we've been talking about internally, which is that the evolution of work is superhuman. I wanna get a little bit more into the risks and realities, but before that, I wanna dig into "the evolution of work is superhuman." I'd love to actually understand where that came from. Laurie, could you give us a little bit of insight into our thinking there?

Laurie Bennett:
00:10:06
Yeah. The evolution of work is superhuman. As we've been thinking about not just AI, but what the future of work looks like, what's always been important to Within is that work is a place of meaning and purpose, and that the way to get to that, for the last 10, 15 years, has been to tap into the humanity that people get to experience at work: the real things that make people connect to each other and connect to the impact that their work has. And so much of the time that's been about starting to move away from the structures and the systems that were developed through the Industrial Revolution in the name of efficiency and productivity, where workplaces became these very structured environments, very high on control: the way that you work, the way that you show up, the way that you dress, the way that you're managed, almost like a machine. Ironically, we kind of invented machines to help people at work, and then we had people behave like machines alongside them. One of the things that seems to be really exciting about AI, and the associated machine learning and capabilities that it brings, is: well, maybe now, for the first time in a really long time, people don't have to be machines.

Machines can be the machines and people can be the people, and there's something really powerful about that idea. When we think about the future of work and how we make it human, AI can feel like the single biggest threat to that: that it's going to come in and replace and reduce the value of humans in a workplace. But there's another side to that picture, which is that maybe AI is exactly the thing we need in order to finally liberate us from having to behave like robots at work. If AI can take on the work that has felt robotic and mechanical, and can bring knowledge, yes, and insight, this amazing ability to process information and data, what does that do to enable people and empower them in ways that we've never had access to before? Years of education and training and learning can be at our fingertips now, seconds away. And so that can be a really terrifying idea, or it can be one which, if we can leverage and channel it in the right ways, could actually help us make the work that we do as people more human than ever before.

And that's really exciting: considering the evolution of work as being this superhuman place, where we get to be super human, where we get to concentrate on the aspects of being a human that are most powerful, our capacities for empathy and imagination and connection and love. All these qualities that we know are the foundation of what makes work meaningful, and that underpin ways of leading that make for way more human workplaces, that kind of stuff we get to really focus in on, knowing that the other pieces and aspects of the work that we do can be boosted by this incredible technology that can help us step into that space. So for us, I think it's an ambition and a bit of a call to say: what can we do with this technology to make sure that it enhances the human experience of work rather than removes it? And it's a huge challenge that I know we don't have the answers to yet, but that's the way of looking at it that I think makes sense for us, as Within, to hold. And I think it's the thing that a lot of leaders out there are struggling with right now, which is, as Bev said just now, how to balance the incredible opportunity for efficiency and productivity that it brings with its high potential to take a real toll on the human experience of work. How do we hold those two things together while we take that dark force, bring it over to the light side, and use it for good?

Bev Attfield:
00:14:58
Yeah. I'm gonna hop on that real quick with a bit of a hot take. I think you're right, Laurie. I think that there is lots of room to feel optimistic, but I think that we are also at the mercy of a handful of actors here who are determining the course of human history, right? We've got just a handful of really powerful people who are leading the way with AI development to get us to artificial general intelligence, and I've been quite skeptical around their motivations and, like, what does that mean for the rest of us, right? Like, we are downstream from their decisions. But there's a glimmer of hope, because I was listening to an interview with the co-founder of Anthropic, Daniela Amodei, who has had some quite vocal opinions about what AI should do and what humans should do, and I'm gonna read another quote, because I think this is really something for us to hold onto as a glimmer of hope in the course of the future here. And she said: "In a world where AI is very smart and capable of doing many things, the things that make us human will become much more important, not less." And I really felt like that aligned with where Within feels we need to go.

I mean, that is about the future of work, or the evolution of work, being superhuman, right? It is about doubling down on the things that humans are able to do, the things that we should not delegate or automate and allow machines to do. Right? And why wouldn't we want a future where we could get the best of both, right? Not the best of one at the expense of another. So what I'm holding right now is a bit of, like, helplessness, because we are all just in this, a simulation is what some people are calling it, depending on your philosophical view, but nonetheless, this is the reality, right? Like, there are other people that are making really important decisions right now, and I think that's why it's really important for all of us to have an opinion and a voice around what we want this to look like for us and what it could be, rather than just having this decision thrust upon us and having to deal with the fallout of it.

Emily Shelton:
00:17:07
Yeah. Really, really powerful, and it kind of leads me to my next question, Bev, which is: we have this vision of what we see AI could be, the potential of AI. Is that vision something that you're seeing in other leaders? We've talked about the select five that are making some big choices, but among the other group of leaders that you're seeing day to day, do you think that they have similar ambitions?

Bev Attfield:
00:17:33
Look, I think we're still grappling with what this actually means for us, like, in the day to day, and I don't think anyone actually has the answer. I know that a number of our clients are really forward thinkers, and they are trying to solve this problem for their own business and adapt as they need to. But I think the reality is that it's a moment in time where I don't think anybody really knows. I mean, I suppose you could say that about any big change in the course of history, but I just feel like the velocity of this change is what's curious and different for us. And I think leaders who can say, "I actually don't have the answer, and I'm just going to bring curiosity to this," and have a conversation with people around what they're feeling right now, that's the most important thing you can do, right? Because I just don't think that there is anybody out there who actually knows what this is truly gonna mean for us as we go forward. Laurie, what do you think?

Laurie Bennett:
00:18:31
Yeah, I do think that's true, and I think that is true as well, as you say, of so many aspects of what's going on. AI is one thing right now in a tumultuous world, and I think when you don't know what's happening in the future, you gotta try to shape and create it in some way. And whilst there's a few people who have a lot of power around this, I think there's a lot of people who have some power around this, especially within organizations, and leaders in organizations, to be able to make this something which adds something meaningful to the human experience of work. And I think we've said that for a long time; it's why we, as Within, work with businesses: they're these amazing containers of society, and the leaders within businesses have a real capacity to shape the experience of a large number of people. And this is no different.

The tool is the tool, and the way that we use it becomes the really important thing. In the wrong hands, it could become something really destructive to culture and, beyond that, to society. When used with the right intentions, and in a way that is designed to support more people rather than fewer, I think it can be an incredibly powerful thing. And I think there are so many opportunities for leaders, even now, even in the confusion of what all this is going to mean, to turn back to their own purpose and their own set of values and ask themselves: how does this new thing help us live into those things more? And as we think about integrating it and bringing it into our worlds, how do those aspects of what have always been true and important for us continue to shape and guide how we step into an unknown space around this? And so I think the culture of an organization has so much power to shape what AI looks like inside that organization, and AI has so much potential to enhance stuff for us if we use it and approach it in that way.

Emily Shelton:
00:20:54
Yeah. Thank you. Oh, go ahead, Bev. What were you gonna...

Bev Attfield:
00:20:57
Yeah, I was gonna jump in there. I was just thinking, Laurie, as you were talking, that before AI even arrived, humans have done a pretty good job of destroying culture, right? With toxic leadership, with practices and ways of being that have not served the wellbeing and the creativity and the joy of humans, right? I mean, historically, over the years, employee engagement has been very low, right? It's very hard to engage people, and that was before AI. So I think, hey, let's see what AI can do for us. Maybe we'll be able to move that needle a couple of points.

Laurie Bennett:
00:21:33
And I think it's like anything new that comes in like this: it doesn't necessarily change everything, but it changes the volume of everything. So if you've been leading an inhuman culture, AI will accelerate that process for you. If you've been leading a really human culture, AI has the possibility to accelerate or amplify that for you. So the human is still sitting there, holding the controls for this. And I think one of the challenges in these moments is not to throw out everything because the system has changed or the paradigm has shifted, and imagine that it needs something completely different, when in fact so much of what has been developed inside of cultures up until now will be the very thing that we need to draw on to be able to navigate this disruption too.

Emily Shelton:
00:22:29
Well, I think that we've done a pretty good job chatting about the risks and the opportunities here for AI. Something I wanted to just touch on: we've talked a lot about what could be at risk societally with AI, and I think that there's also potential loss in creative processes for individuals as well. I think that if we find people relying too heavily on AI to create something, we miss the moments of discomfort that are required for making something, if that makes sense. I think that AI is really great at streamlining, and I think that streamlining isn't always needed in creative environments. Right? So, I don't know, I just think that that's something also to consider: as humans, we confront struggle when we're creating or collaborating with individuals, but that struggle is what helps us get to the other side and make something meaningful. And while AI has the ability to make our lives easier, maybe there's risk in that too. So, just something that I've been chewing on lately: is it better to make, or to make with intention? Right. And AI does not have the ability to discern like we do. So now I'm just on my...

Bev Attfield:
00:23:55
Soapbox. That's an interesting perspective, and I would say: yet. Right. I think for us to imagine that we will not get to a place where artificial intelligence can think and create like us, we need to be careful with that. Yeah. I think the other force that we're gonna have to reckon with is the market tension of commoditizing something like creativity, right? In that quote at the beginning that I shared: what is the value of human thought when it's cents on the dollar for creating a new logo, versus spending $500 with the graphic artist? The market isn't going to be able to support the graphic artist for very long. So I think we're gonna have to reckon with some realities, that there are market forces at play. Yes, we don't wanna lose what the essence of being human is, being able to have creative thought, but how do you combat the market force of that? Right? How do you protect that? Especially if it's your profession, right? If it's your intellectual pursuit to be an artist and to go into the forest and paint, well, that's a different conversation than if it's someone's profession, right?

Emily Shelton:
00:25:03
Yeah.

Bev Attfield:
00:25:04
Absolutely. I think there are more conversations to come, and that's where I'll leave that.

Laurie Bennett:
00:25:09
I love this, such a juicy one. I feel like this is the debate that's out there right now. It goes back, Emily, to some of those previous conversations we had around: can AI create the values or write your strategy? And the executive summary of those being: it can give you a really convincing outcome.

Emily Shelton:
00:25:28
Mm-hmm.

Laurie Bennett:
00:25:28
But the journey to it has lost all its meaning, and there's something really important about those journeys and how things happen, not just the outcome that we arrive at. And so I think that's gonna be the kind of stuff that we wrestle with and wrangle with. These are things which have been deeply human processes in the past, around creativity and, as you mentioned there, the ability to discern and have wisdom. And there's something in that which is, yeah, AI is gonna get really convincing in those spaces, but ultimately it can't be in the same space as actual human connection. AI cannot love you. It can sound like it does. It can tell you that it does, but it can't.

444
:
00:26:18
And can it have empathy in the
way that a human has empathy?
445
:
00:26:23
I don't believe so.
446
:
00:26:24
Can it dream about a future state in a way
that a human can do that that's driven by
447
:
00:26:32
an emotion rather than a, an algorithm?
448
:
00:26:36
it can give you a billion possible
future states, but it can't tell
449
:
00:26:41
you which one makes your insides.
450
:
00:26:44
Just, I think there's gonna be some
things in that kind of space that become
451
:
00:26:49
really important, and I don't think it
necessarily will remove the value of them.
452
:
00:26:54
It might increase the value of
those particular things, the
453
:
00:26:56
things that are still inaccessible.
454
:
00:26:59
A world where seemingly
everything becomes accessible.
455
:
00:27:03
And the thing that I've been really
encouraged by, and I don't, I think this
456
:
00:27:06
is probably mostly from creative agencies
and people out there in the world, is the
457
:
00:27:12
concept of slop, which seems to be in a
word that's arriving into the AI universe
458
:
00:27:18
of like, it's fine, but it's slop.
459
:
00:27:21
it'll do the job, but it's not.
460
:
00:27:24
It doesn't have the magic somehow.
461
:
00:27:25
And I think the, our, the, the value
of not slop will still hold something.
462
:
00:27:32
And I think the human experience of
creating together the feeling, empathy of
463
:
00:27:36
dreaming together, that's the stuff that
I want to, to see become the things that
464
:
00:27:44
we do, we get to do the most of, because
that's where our value really lies.
465
:
00:27:48
That's where the source of joy
and meaning in our work sits.
466
:
00:27:52
And
467
:
00:27:53
if AI can tee us up to do those
things with even more energy and
468
:
00:27:58
insight and focus, then bring it on.
469
:
00:28:02
Bev Attfield: Yeah.
470
:
00:28:02
Yes.
471
:
00:28:02
I think what you're hitting on there
for me, Laurie, is that it's
472
:
00:28:05
relationships at the heart of it, right?
473
:
00:28:08
And I have some real questions around this:
474
:
00:28:11
can a machine be a proxy for
the relationships that we can build?
475
:
00:28:15
Right.
476
:
00:28:15
And if we bring this down to our
context, and maybe there's some listeners
477
:
00:28:18
here who are also in consulting or
B2B situations where they're having
478
:
00:28:23
the same existential crisis, it's
the relationships that are the
479
:
00:28:29
foundation of how we run
our businesses, right?
480
:
00:28:32
And our clients don't come
to us simply for outputs.
481
:
00:28:35
They come to us for transformation.
482
:
00:28:37
And that transformation happens because
we can build trust, we can listen in
483
:
00:28:41
the room, we can co-create something.
484
:
00:28:45
Whether it's behind a Zoom screen
or in person in a joyful workshop.
485
:
00:28:49
Right.
486
:
00:28:49
And I think it's the
connection and the community and the
487
:
00:28:52
touchpoint that we offer and that
we can build because we have that
488
:
00:28:57
capacity as these organic beings.
489
:
00:28:59
That is, I think, what we are
most certainly doubling down on, right?
490
:
00:29:04
When we say that the evolution
of work is superhuman.
491
:
00:29:07
It's more human than
it's ever been, right?
492
:
00:29:10
Because we have the capacity
to do the things that.
493
:
00:29:13
Machines cannot.
494
:
00:29:15
Right.
495
:
00:29:15
And I mean, the future will tell us.
496
:
00:29:17
Right?
497
:
00:29:17
We'll see how that unfolds.
498
:
00:29:19
But I think for now, as I'm
thinking about it, it's the relationships
499
:
00:29:23
that are what we offer and that's
where we get joy in our work.
500
:
00:29:29
Right.
501
:
00:29:29
It's not just
502
:
00:29:31
using a tool for the sake of using a tool.
503
:
00:29:34
Right?
504
:
00:29:34
Yes, we will rest on the tool
to help us do the things that
505
:
00:29:37
we want to do, most certainly.
506
:
00:29:39
But it comes down to the connection
and the humanness of it all.
507
:
00:29:43
Laurie Bennett: Yeah.
508
:
00:29:43
I think it's in danger,
isn't it, of sounding like work becomes
509
:
00:29:48
this whimsical place where we dream
and have empathy for each other.
510
:
00:29:53
But I think
511
:
00:29:55
it does also touch on some really
critical aspects of decisions and
512
:
00:30:00
accountability and things as well.
513
:
00:30:03
I don't think that AI is going to be well
positioned to make decisions on behalf of
514
:
00:30:09
people yet. It can inform them and provide
insight, but ultimately, somebody is
515
:
00:30:15
holding the accountability for that,
and that can't be devolved to a machine.
516
:
00:30:19
And it's really difficult to imagine
that sort of capability inside
517
:
00:30:24
a business being replaced by AI.
518
:
00:30:27
And I think the danger
of AI coming into organizations in the
519
:
00:30:32
wrong ways is imagining that it is a
replacement for creativity or empathy and
520
:
00:30:37
relationship or decision making, that we
can devolve those kinds of things to it.
521
:
00:30:44
That the coach that you speak to, or the mentor
that you work with is a bot, not a person.
522
:
00:30:50
That the creative process becomes
feeding a prompt and having
523
:
00:30:57
something appear magically seconds
later. That decisions become
524
:
00:31:03
a quantitative assessment of logic by a bot.
525
:
00:31:06
Like I think we get into some really
dangerous territory around what happens
526
:
00:31:12
inside organizations if we replace
the wrong stuff as we go forward.
527
:
00:31:17
Yeah.
528
:
00:31:19
Emily Shelton: Yeah, I agree.
529
:
00:31:20
Well, I wanna be aware of our time.
530
:
00:31:22
This was gonna be a short, snappy
conversation, but it's impossible for
531
:
00:31:26
us not to dig in on these juicy topics.
532
:
00:31:29
So I do wanna speak to you a little
bit about what we're doing at Within.
533
:
00:31:33
We're working on something currently,
a way of articulating what we believe
534
:
00:31:38
about AI and how we use it at Within.
535
:
00:31:40
So we'll be bringing updates to you
about that over the coming months.
536
:
00:31:45
This definitely feels like a
continued conversation, right?
537
:
00:31:49
I think that this is not the end.
538
:
00:31:51
This is not the beginning either.
539
:
00:31:53
We started it a couple of years
ago, so I would love to continue
540
:
00:31:56
this conversation with you over
the next year and see how this
541
:
00:32:00
continues to evolve and develop.
542
:
00:32:02
I would like to ask you one more
thing before we go, which is this:
543
:
00:32:07
if we were to take this conversation,
544
:
00:32:10
if we were to reopen it
in a month or two, where
545
:
00:32:14
do you think we should go next with it?
546
:
00:32:16
What feels like a natural place
for us to dig into or to look
547
:
00:32:20
at as we are in this journey?
548
:
00:32:22
Bev Attfield: A lot can
change in two months.
549
:
00:32:25
Emily Shelton: It's true.
550
:
00:32:26
Bev Attfield: I think we'll have
to recalibrate in two months
551
:
00:32:28
about what has actually
changed since this conversation.
552
:
00:32:31
Laurie Bennett: Well, Bev will
be a bot in two months, and so
553
:
00:32:36
That will change everything.
554
:
00:32:37
Bev Attfield: Yes.
555
:
00:32:37
Yes.
556
:
00:32:38
The upload will be complete.
557
:
00:32:40
Laurie Bennett: I really love
these tensions that we've
558
:
00:32:42
started to talk about.
559
:
00:32:44
I think there's some really
interesting and important things to
560
:
00:32:48
tease apart in conversations about
creativity and decision making and
561
:
00:32:54
empathy and coaching and, well, really,
how do you start to use this
562
:
00:33:00
power of discernment that we're
saying is so fundamentally human?
563
:
00:33:03
How can we start practicing that now to
understand really practically what are
564
:
00:33:10
the things that AI can really support us
with and help us with and what aren't?
565
:
00:33:16
And how do we tell the
difference between those two?
566
:
00:33:19
And start to make sure that the kind of
systems that we're building out inside
567
:
00:33:22
our cultures and inside our businesses
568
:
00:33:26
respect that boundary.
569
:
00:33:27
Bev Attfield: Yeah, I think there are
so many rabbit holes we can go down
570
:
00:33:30
as we continue this conversation
and I love this format.
571
:
00:33:34
And I hope that others
will join us and be part of
572
:
00:33:37
the conversation like that.
573
:
00:33:38
That would be great.
574
:
00:33:39
We'd love to hear from our audience around
things that you would like to hear us
575
:
00:33:44
riff around and bring you our thoughts on.
576
:
00:33:46
But I think I'd like
us to keep focused on this
577
:
00:33:50
notion that this is a human
problem, not a technology problem.
578
:
00:33:53
And I think that there's much
to solve, and there's much to be
579
:
00:33:56
hopeful about, but I think we have
to have zesty, hard, challenging
580
:
00:34:03
conversations around it, right?
581
:
00:34:05
Because we still have the
ability to shape our future.
582
:
00:34:09
And let's lean into that, the
humanness, as we go forward from here.
583
:
00:34:14
Controlling what we can within
our own business and the people
584
:
00:34:17
that we impact through our work.
585
:
00:34:18
Emily Shelton: Awesome.
586
:
00:34:19
Amazing.
587
:
00:34:20
Well, I think, yeah, I'm
excited to continue to dig into
588
:
00:34:23
it with you and excited to see
or witness this evolution.
589
:
00:34:28
It does feel like we are in a very
interesting spot as human people.
590
:
00:34:33
Getting to experience this alongside
of the evolution of technology.
591
:
00:34:36
So thank you both for your time.
592
:
00:34:38
Thank you both for sharing your
thoughts, and we will see you next
593
:
00:34:42
time on our next AI pod conversation.
594
:
00:34:45
So stay tuned listeners.
595
:
00:34:47
Thanks.
596
:
00:34:48
Reimagining Work from Within is available
wherever you listen to podcasts.