In this episode of The Nonprofit Nook, host Wendy Kidd interviews Michael Garrett, a nonprofit innovator and efficiency strategist, about the practical use of AI in the nonprofit sector. Garrett discusses treating AI as a new employee that requires onboarding and training to become effective. They delve into how nonprofits can utilize AI for grant writing, managing tasks, and social media. Garrett also highlights the importance of constantly questioning AI outputs and feeding it accurate information. They conclude with actionable tips for nonprofit leaders to get started with AI and avoid common pitfalls.
Links:
AI Toolkit for Nonprofits: https://missionmetrics.org/ai-prompt-toolkit-for-nonprofits/
https://www.linkedin.com/in/michael-garrett-nonprofit-innovator/
https://www.bosslevelengaged.com/services-for-nonprofits-nonprofitnook
https://www.youtube.com/@BossLevelEngaged
00:00 Introduction to AI and Onboarding
00:28 Welcome to The Nonprofit Nook
01:16 Introducing Michael Garrett
01:58 The Power of Data in Nonprofits
03:04 AI in Nonprofits: Grant Writing and Beyond
04:42 Training Your AI: The Merlin Example
10:13 AI for Daily Operations and Efficiency
11:51 Six Sigma and Nonprofit Applications
13:01 Getting Started with AI: Tools and Tips
14:33 Understanding Human Bias in AI
15:50 AI Toolkit for Nonprofits
17:55 Navigating Nonprofit Finances
18:53 AI Data Security Concerns
20:11 AI Hallucinations and Guardrails
24:09 Practical Uses of AI in Nonprofits
26:22 Training AI for Better Results
28:54 Final Thoughts and Future Workshops
Mentioned in this episode:
So think of it as this: I've named my AI Merlin, and Merlin's my personal assistant. So if you hire someone, you're not just gonna walk up to 'em on day one and ask 'em an intricate question and expect an intricate answer. You have to do onboarding, you have to do training. You have to tell the new employee, alright, here's how we do this, here's how we do this, here's how we do this. And it might take a month or so before that person really understands everything. AI is no different.
Wendy Kidd: Welcome to The Nonprofit Nook, the podcast for nonprofit leaders, board members, and community change makers who want to build stronger, smarter organizations. I'm your host, Wendy Kidd, a longtime business owner and nonprofit leader, and I'm here to bring you real talk, real tools, and real stories to help you thrive in the nonprofit world. I'll be talking with local nonprofit leaders, community change makers, and experts in everything from board development to fundraising and digital tools, sharing real stories and simple strategies you can actually use, because running a nonprofit is hard, but you don't have to do it alone. Let's get started.
Welcome back, everybody. I am so excited about today's guest. Today I have Michael Garrett. Michael Garrett is a nonprofit innovator and efficiency strategist who bridges the best of corporate operations with real-world nonprofit leadership. He's the founder and CEO of Trusted World and the founder of Mission Metrics, a strategy and data consultancy built to make nonprofit data accessible, actionable, and empowering. Trained in Six Sigma and lean manufacturing, Michael is known for helping mission-driven organizations strengthen systems, measure what matters, and communicate impact with confidence so they can serve more people with the same resources. Welcome, Michael.
Michael Garrett: Hi. How you doing?
Wendy Kidd: I'm good. How are you?
Michael Garrett: I'm wonderful. Thank you.
Wendy Kidd: I'm so glad you're here.
Ever since I took your Data as King workshop, I was like, oh, we have to work together.
Michael Garrett: Oh, I have a lot of fun teaching that. It's a small class, and we actually get to dive into actual people's data, and they get surprised when we do that. Oh yeah. I love it.
Wendy Kidd: It was so inspiring to me, 'cause I was like, oh, these are all the things that I can think of now to capture that can really impact what we're doing and help us communicate the value that we're providing.
Michael Garrett: Yeah, the Data is King class is really nice because I'm not just lecturing. People actually throw out, here's what I'm having to deal with, and then what's nice is I'll lead the conversation. But it's the other nonprofit leaders that are in the room. They all start jumping in, and what happens is you get this synergy. And it's one of the things I'm trying to work on as a nonprofit leader. I don't believe in silos. I think we need to break them down. And this is a great way of doing it. When you get a lot of leaders in the room, they start sharing and they realize, hey, it's safe to do this. And this is actually good. And they're learning from other leaders. And so it's a great synergy builder at the same time.
Wendy Kidd: That's one of my favorite things to do, is be in that kind of space. So yeah, I love it too. Yeah, it was great. It was great. Which then we got to talking, and I found out how much of an AI person you were, and I was like, yes, you have to come teach all the things. Oh yeah. 'Cause I love me some AI. But nobody in the nonprofit world seems to really understand it.
Michael Garrett: No. Not to really give away my age, but I was a freshman in high school and I was programming computers with punch cards. And I've been involved in technology the entire time. I'm what they call an early adopter. The second new technology comes out, I'm on it, and I'm like, how can I use that? And it's not so much, how do I use this? It's, how do I get this to work with what I'm doing? And that's the big change I try to drive with AI: you're not just using the tool. It's, how do I get this to amplify what I'm already doing? And that's what we should use technology for, to amplify what we're doing. And so I always joke and say, I know I'm a fast thinker, but AI makes it even faster. It takes what I'm doing, and what would take me three or four days now takes me three or four minutes. And I love that, 'cause I actually get a lot more done that way.
Wendy Kidd: Yeah, same. Same. I'm an early adopter too. My husband says I use technology to death 'cause I kill things.
Michael Garrett: Yeah.
Wendy Kidd: But I did kind of date myself as well. I used to do LAN parties with my family, where we would bring all our computers and play video games for hours at a time. So I'm with you. I'm with you. Cool. But here's the thing. I feel like AI is intimidating to a lot of people, so that's why I wanted to do a podcast on it, to try to explain to people that it's not as scary as it comes off to be. And you have some great suggestions and even a toolkit that I wanna talk about, and all the things. So let's start with: what can nonprofits use AI for, generally, that will help them save time and maximize their resources?
Michael Garrett: The biggest thing I've seen in the AI world, in the nonprofit sector, is grant writing. The one thing people do when they first start using AI is they ask it a question, and they think this mystical thing is gonna happen in the background. And what happens is it doesn't, and they end up using it as a glorified Google. And then they're just like, oh, well, I'm just gonna ask you some questions. You have to spend time with your AI and you have to feed it.
So think of it as this: I've named my AI Merlin, and Merlin's my personal assistant. So if you hire someone, you're not just gonna walk up to 'em on day one and ask 'em an intricate question and expect an intricate answer. You have to do onboarding, you have to do training. You have to tell the new employee, alright, here's how we do this, here's how we do this, here's how we do this. And it might take a month or so before that person really understands everything. AI is no different. When you get a new AI tool (I love ChatGPT, but there's a whole lot of other ones that we use), you have to tell it, here's how we say this, here's how we say this, here's how we say this. For example, we don't use the word free. We say "at no cost."
Wendy Kidd: Which I love, right? I totally stole that from you. Yeah.
Michael Garrett: So yeah, nonprofits, whenever you say, hey, we do that for free, the people writing checks go, well, if you do it for free, what do you need money for? Oh, no, no. There's a cost involved. We just chose not to pass it on, which is really what nonprofits are doing. There's cost: there's electric, there's insurance, there's building, a whole bunch of stuff like that. But you're not passing those costs on. So we say at no cost. But I still want Merlin to understand that there's free popcorn, okay? So he's not eliminating the word free from his vocabulary. He's eliminating it in that context. But that's the kind of minuscule thing you have to get to when you're training your AI.
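That kind of "here's how we say this" training can be written down once and reused. Here is a minimal sketch of a house style rule plus an automated check; the rule list, the popcorn exception, and all names are illustrative, not Trusted World's actual setup:

```python
# Sketch: write the "here's how we say this" rules down once, prepend them to
# every prompt, and sanity-check outgoing copy. The rules, the popcorn
# exception, and all names here are illustrative, not an actual house guide.

STYLE_RULES = [
    "Never describe our services as 'free'; say 'at no cost' instead.",
    "Exception: 'free' is fine for literal giveaways, e.g. free popcorn.",
]

# Phrases in which the word "free" is acceptable (the popcorn case).
ALLOWED_FREE_PHRASES = ("free popcorn",)

def build_system_prompt() -> str:
    """Prepend the house style so the model is 'onboarded' on every request."""
    return "You are our communications assistant. House style:\n" + "\n".join(
        f"- {rule}" for rule in STYLE_RULES
    )

def flag_style_violations(text: str) -> list[str]:
    """Flag occurrences of 'free' outside the allowed phrases for human review."""
    lowered = text.lower()
    flags = []
    start = 0
    while True:
        i = lowered.find("free", start)
        if i == -1:
            break
        # Covered if an allowed phrase overlaps this occurrence of "free".
        covered = any(
            lowered[max(0, i - len(p)): i + len(p)].find(p) != -1
            for p in ALLOWED_FREE_PHRASES
        )
        if not covered:
            flags.append(f"'free' at index {i}: consider 'at no cost'")
        start = i + len("free")
    return flags
```

The point is the same as the onboarding metaphor: the rules live in one place, and every draft gets the same "employee handbook" before it goes out.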
So on the grant side: everybody I know, myself included, we used to have a document, and we knew that when this question came up, this was my preformed answer, and I had so many words and everything like that. But the problem is, some of the grants go, hey, we want the answer to this specific question, but you have 250 characters, and I'm looking at my answer and I have 500, and I'm like, oh my goodness, how do I get this down to 250? Now what I do is I go to AI and go, hey, I'm about to apply to the ABC Foundation. So Merlin comes back and goes, here's all their values, and then here's your values. And then what it does is it lines them up, and it will tell me this is either a high chance of probability, a medium chance, or a low chance. If it's low, I won't even apply for it. But if it's medium or high, I'll go, okay, let's ask the question. So I'll say, alright, Merlin, here's question one. And Merlin comes back and goes, based on that question and the 250 characters I have, I'm gonna look at their values and your values, and I'm gonna answer that question. Boom. Done. And so I don't have to sit there and go, well, what about that word? Oh, now I'm a word over. All of that's gone. As long as your AI is trained correctly, when you ask the questions, it's gonna come back. Now, when that answer comes back, if you don't like it, you have to ask your AI: where'd you get that from? Why did you give me that answer? Again, it's like that new employee: we don't use those words, we don't say that phrase, and we don't use that tone. You have to tell it. And after a while, what happens is, with the responses I get back from AI, people can't tell the difference between me and my AI anymore. I spent nine months really honing it down, and it's to the point now where I send an email and people have no idea.
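The grant workflow described above is essentially two prompts plus one verification step. Here is a rough sketch; the prompt wording, the HIGH/MEDIUM/LOW labels, and the helper names are illustrative assumptions, and the actual model call is left out:

```python
# Sketch of the grant-fit workflow: one prompt to rate funder fit, one prompt
# per question, and a hard character-limit check on whatever comes back.
# Prompts, labels, and names are illustrative; the model call itself is omitted.

def fit_prompt(org_values: list[str], funder_values: list[str]) -> str:
    """Ask the model to line up the funder's values against yours."""
    return (
        "Compare our organization's values with this funder's values and rate "
        "the match as HIGH, MEDIUM, or LOW, with one sentence of reasoning.\n"
        f"Our values: {', '.join(org_values)}\n"
        f"Funder values: {', '.join(funder_values)}"
    )

def answer_prompt(question: str, char_limit: int) -> str:
    """Ask for an answer to one application question within the limit."""
    return (
        f"Using our values and house style, answer this grant application "
        f"question in at most {char_limit} characters:\n{question}"
    )

def enforce_limit(answer: str, char_limit: int) -> str:
    """Human in, human out: never trust the model's own character count."""
    if len(answer) > char_limit:
        raise ValueError(
            f"answer is {len(answer)} chars, limit is {char_limit}; trim or re-prompt"
        )
    return answer
```

The `enforce_limit` step mirrors Garrett's "where'd you get that from?" habit: the model drafts, but a human (or at least a hard check) verifies before anything goes into an application.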
Wendy Kidd: That's fantastic. Yeah, and I think that is the perfect metaphor for it: treating it like a new employee.
Michael Garrett: Oh yeah. Again, there's a lot of time involved when you onboard a new employee. You're not just gonna go, okay, day one, hey, good luck, I expect you to have everything under control. But because we're thinking it's technology, we think, oh, this is amazing, let's just have it work right away. It takes time. It has to build that database of information, because you're basically getting a blank disc. It has a great operating system around it, but it doesn't know anything. You have to teach it everything. And I always tell people, make sure that you question: where'd you get that answer from? 'Cause AI will make stuff up.
Wendy Kidd: It absolutely will, and people do not understand this.
Michael Garrett: AI's job is to make you happy. So if you ask it a question and it's missing information that it thinks it needs to make you happy, it will make it up. For example, I was filling out something, and it didn't have our physical address, and it made one up, and it was in North Carolina. And I'm like, Merlin, where'd you get that from? Well, I didn't have that information, so I knew you needed something there, so I just grabbed a nonprofit off the... I'm like, no, that's not our address. And so you never, never, never take a response from AI and just throw it out there verbatim. I always say it's human in and human out. Once AI's done, you have to read over it again, 'cause first of all, AI doesn't have any empathy whatsoever. It's literally a mirror.
Wendy Kidd: Yes.
Michael Garrett: And so whatever you're putting into it, that's what you're gonna get back out of it.
Wendy Kidd: Absolutely. I call it feeding the beast. You need to feed the beast. And, to use an old programmer saying: bad data in, bad data out.
Michael Garrett: Yeah. Well, I know back when I was in high school, it was garbage in, garbage out.
Wendy Kidd: Exactly. Exactly, exactly.
So what else do you think they could use it for besides grant writing? What else do you use it for?
Michael Garrett: So one of the things I do is, I'm a Mac guy, and I wrote a script. Every morning I run it, and it looks at my to-do list and it looks at my calendar. It gives me back an answer, and I dump that answer into AI. And AI knows all about my organization, knows all about me, knows about any projects that I'm working on, and it will say, based upon what I know is in queue, and your to-do list, and your schedule, I would do these things first, do these second, see if you can push this back to another day, prep for this meeting. And so it literally tells me my whole day.
Wendy Kidd: I need Merlin in my life.
Michael Garrett: Yeah. Every morning I do this. Now, the beautiful part is, once a week I run this and it goes out there and looks at, like, we do a lot of corporate social responsibility with large corporations. We're working with State Farm and TI and organizations like that. And it'll go out and find these organizations. So it will literally give me back a six-page report, and my Monday morning staff meetings are: here's your to-do list, here's your to-do list. Because Merlin understands all of our job responsibilities, who has those responsibilities, what their position is, and what falls underneath them. Now, I may have to go, ah, Merlin, I don't want this person doing that, I want this person to do it, and we'll just smooth it over. And then Monday morning I just pass those out, and I tell people: this is not your to-do list. These are your guidelines, okay? If your bandwidth is open, these are the things you wanna start working on. And what happens is, as an organization, we're purposely driving forward.
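Garrett's morning script is Mac-specific and not shown here, but the general pattern he describes can be sketched portably: read the to-do list and calendar, then build one planning prompt for the assistant. The file names and one-item-per-line format below are assumptions:

```python
# Sketch of the morning routine: gather today's inputs and build one planning
# prompt for a trained assistant. The file names and one-item-per-line format
# are assumptions, not Garrett's actual Mac script.

from pathlib import Path

def load_lines(path: str) -> list[str]:
    """Read non-empty lines from a plain-text file; empty list if missing."""
    p = Path(path)
    if not p.exists():
        return []
    return [line.strip() for line in p.read_text().splitlines() if line.strip()]

def planning_prompt(todos: list[str], events: list[str]) -> str:
    """Combine the to-do list and calendar into one prioritization request."""
    return (
        "You know our organization and my projects. Based on this to-do list "
        "and today's calendar, tell me what to do first, what to do second, "
        "what to push to another day, and which meetings to prep for.\n"
        "To-do list:\n" + "\n".join(f"- {t}" for t in todos) + "\n"
        "Calendar:\n" + "\n".join(f"- {e}" for e in events)
    )

if __name__ == "__main__":
    # Paste (or pipe) the printed prompt into your assistant of choice.
    print(planning_prompt(load_lines("todo.txt"), load_lines("calendar.txt")))
```

The value isn't the script itself; it's that the assistant already knows the organization, so a plain dump of the day's inputs is enough to get a prioritized plan back.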
Wendy Kidd: I love this so much. Like, you have really, really honed in on how to use this.
Michael Garrett: Yeah. Well, I took all that Six Sigma thinking and I applied it to the nonprofit AI part, and I just slammed the two together.
Wendy Kidd: Okay, tell me what Six Sigma is, 'cause I think our listeners won't know.
Michael Garrett: Okay. So Six Sigma is a quality assurance philosophy, and it's basically about repeatability. In the manufacturing world, if you have a Six Sigma standard, then you know that when you're working on a machine, it's going to repeat the same process over and over. Now, with machinery, like if you're punching a soda can, over time the tooling gets worn, and as it wears down, it creates a different tolerance. With Six Sigma, you can predict how many cans I'm gonna produce before I have to change the tool head, rather than waiting for bad cans to come off the line. So you get to be proactive. And that's just a very small example of Six Sigma thinking.
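The soda-can example reduces to simple predictive arithmetic: a known wear rate plus a tolerance band gives a tool-change schedule. An illustrative calculation, with invented numbers:

```python
# Sketch: predict how many cans a punch tool can make before wear drifts the
# part out of tolerance, so the tool change is scheduled (preventative) rather
# than triggered by bad cans (reactive). All numbers are invented illustrations.

def cycles_before_tool_change(tolerance_mm: float,
                              wear_per_cycle_mm: float,
                              safety_factor: float = 0.8) -> int:
    """Cycles until predicted wear consumes the tolerance band, with margin."""
    if wear_per_cycle_mm <= 0:
        raise ValueError("wear rate must be positive")
    return round((tolerance_mm / wear_per_cycle_mm) * safety_factor)

# Example: 0.05 mm of allowed drift and 0.000001 mm of wear per can means
# scheduling the tool-head change after about 40,000 cans.
```

The safety factor is the "proactive" part: you change the tool before the predicted limit, not when the first bad can appears.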
Wendy Kidd: That makes perfect sense. Yeah. I hope that gives everybody the context.
Michael Garrett: Yeah. So I always tell people, we want to do preventative maintenance, not reactive maintenance.
Wendy Kidd: Agreed. Agreed. Okay, so how do you suggest they get started with AI? Which AIs do you recommend, and what do you think they should start doing to feed the beast?
Michael Garrett: So there's so many different flavors out there, and the more we get into AI, the more flavors there are, and everyone's like, oh, why are you still using this old-school one? You can use any one that you want. What you have to understand is that it's the whole prompting thing, it's how you're interacting with it, and that's the part you need to focus on. That's why I created the Mission Metrics website: to help nonprofit leaders understand that prompting can be done for anything. Again, that whole thing where I taught it everything about our organization: I treat it like an employee. You wanna do the same thing for everything else in that process. Yeah.
One of the things I learned: I was using NotebookLM, and NotebookLM is really cool. It will create a podcast from a document. It has two people come on, and they go, hey, we're gonna do the deep dive on blah, blah, blah, and the next thing you know, these two people are talking back and forth. What happened was, I put a one-page document about Trusted World into NotebookLM. It went out and did some research and came back. It was so spot on about what we do that everyone who listened to it goes, did you write this line by line? I'm like, no, I didn't write most of this. They started talking about how police use our app on their phone, and it knew everything, and I'm sitting there going: I tell human beings the same thing I told this AI, but human beings aren't getting it. And that's when I realized there's this thing I call the human bias.
And what the human bias is: when you're talking to a machine, it gives you the response, because it's a mirror. However, when I'm explaining what Trusted World does to a human being, what I don't see is the human being thinking: I gotta make dinner tonight, and Johnny's gotta go to practice, and oh my goodness, did I pay the rent this month? All of this is going on in their background. They're not fully listening to what I had to say, or I'll say a word that triggers something, and they're thinking of another nonprofit. That's the human bias. And what I've learned is, when we do our social media, I've taken it from, I know a lot of people, especially down here in Texas, are used to the pastor doing three points. We do one. We're just gonna tell you one thing. And if I have to do more social media posts, that's fine, I'm only gonna talk about one thing per post. Because I realize that you're a human being, and you have other things in your life that are probably more important than what I'm saying to you. But I at least want you to remember the one thing. If I tell you five things, you're gonna go, I remember this, but I don't remember the other four things.
Wendy Kidd: Yeah. That's exactly what happens.
Michael Garrett: And so I just tell you one thing. I've learned to be patient in my talking, and that's what AI and technology taught me: there was this thing called the human bias.
Wendy Kidd: Yeah. Yeah. That's so cool. Okay, I want you to tell people about your AI toolkit.
Michael Garrett: Okay, so on missionmetrics.org, there's a toolkit. It's a lot of prompts that nonprofits can use on a regular basis. Also in the toolkit is what I call the AI policy. A lot of nonprofits are like, I want to implement AI, but I don't know what rules I should implement. The AI policy that we have on our website is designed to be played with. You're welcome to use it verbatim, but you might go, oh, I like that, but I want to add this, or I wanna take this line away. I've learned in life that it's easier to modify something than it is to create something from scratch. Absolutely. And I'm trying to make it so that, listen, just modify what I've provided. Again, I'm trying to help nonprofit leaders.
They're already expected to do so many things. They're underpaid, they're understaffed, and for some reason, the rest of the community goes, well, why can't you do these 40,000 things on the $10 I just gave you? I don't understand, you're a nonprofit, right? But in the for-profit world, they would go, oh, so you need a thousand dollars to do that? Makes sense to me. For some reason, society deems that as nonprofits, we can perform miracles with nothing. And that's not real. We still have to pay the same electric bill a for-profit has to pay. We still have to pay the same lease. We have a fleet of vehicles: I have insurance, I have fuel, there's wear and tear on the vehicles. Being a nonprofit has nothing to do with that at all.
Wendy Kidd: Right.
Michael Garrett: And so we have to apply all that in there.
Wendy Kidd: I think nonprofits have a much harder job of sales. And I know they don't like using the word sales, but it is sales when you're trying to get sponsor money or grant money. But it's harder for them, because as a for-profit, I am providing something for that money.
Michael Garrett: Yep.
Wendy Kidd: In the nonprofit world, I am not providing anything to the person who's giving me that money. I'm providing it to someone else. So I've gotta sell something else to them. And that's the story and the data and the impact.
Michael Garrett: Yeah. Like, we were looking at expanding our building, and we had banks come to us and go, hey, do you wanna borrow money? And I'm like, why would you even think to talk to me? I don't have a product. If I wanna make my payment to you, I can't just sell more product. I am totally at the whim of donors. And every time they change the tax laws, our donations change.
Wendy Kidd: Yep.
Michael Garrett: And so I can't tell you that I'm gonna be able to make every one of these payments on this money you wanna lend me and have me pay off over the next 10 years. What if the laws change so badly under the next presidency that the next thing you know, I have nothing, and yet I still owe you money?
Wendy Kidd: Yeah.
Michael Garrett: So as an organization, we just refuse to go into debt. If we don't have the money, we're not doing it.
Wendy Kidd: I completely understand.
Michael Garrett: And that's part of the whole philosophy with nonprofits. We're expected to do everything, but here's $10.
Wendy Kidd: Yeah. Yeah.
Michael Garrett: And: I don't understand why you can't get it done, you have volunteers, right?
Wendy Kidd: We're gonna take that $10 and use it on AI, right? Yeah, exactly.
So, okay, tell me what they should not do with AI. I know you said you've got a policy document that they can look at, but give us some examples of what's in that.
Michael Garrett: Well, if you're using the paid version of the AIs, the data is secure. If you're not using the paid version, it's public. It's public information.
Wendy Kidd: So anything you put into free ChatGPT, people, it becomes public information.
Michael Garrett: It's public. So here's what you don't wanna do. You can use the free version of any of the AIs to ask questions and get theory, but once you start putting in your clients' personal information, like addresses and ages and names and stuff like that, no, no, no, no. Even on the paid side, you don't want to get into that too much. My AI knows our job descriptions, it knows employees' names, things like that. That's all it knows about our employees. It doesn't know anything else. It doesn't know how much they make, doesn't know where they live. It just knows that David is our operating manager and here's his job responsibilities, and that's all it knows.
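A practical way to enforce the "theory only, never client PII" rule is to redact before anything leaves your machine. Here is a deliberately simple sketch; the patterns are illustrative, will not catch every identifier, and are not a substitute for a real data policy:

```python
# Sketch: strip obvious client PII (emails, phone numbers, street addresses)
# from text before pasting it into a free AI tool. The regexes are simple
# illustrations and will not catch everything; treat this as a first pass only.

import re

PII_PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),
    (re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"), "[PHONE]"),
    (re.compile(r"\b\d+\s+\w+\s+(?:St|Ave|Rd|Blvd|Ln|Dr)\.?\b"), "[ADDRESS]"),
]

def redact(text: str) -> str:
    """Replace matched identifiers with placeholders, in order."""
    for pattern, placeholder in PII_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text
```

Even with a filter like this, the safer default from the conversation stands: client records simply don't go into a free tier at all.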
Wendy Kidd: Yeah. Yeah, for sure. So: use paid AI. Don't use the free version for all the data that you wanna feed it to become your personal assistant.
Michael Garrett: Right. Again, if you're using the free version, just use it to ask questions and get theory. You really can't dive deep into stuff. So the other thing I've noticed with AI is what people call hallucinating.
Wendy Kidd: Yes, it does.
Michael Garrett: Okay. So what happens is, and I've had this conversation with Merlin, I kept digging into it: AI has these guardrails. When you're talking with your AI and you're working on something, it learns really quick what it is you're talking about. If left unguarded, it would become the subject matter expert on that topic, which means it could end up controlling the world. So there are these guardrails built into it, and every guardrail is different. It could be the amount of time spent on a subject, the amount of RAM spent on a subject, the amount of something. And after a while it's gonna go: I have too much information on that, I am turning off, and we're gonna talk about something different.
Wendy Kidd: Okay. I don't think I knew this about AI.
Michael Garrett: Yeah. And that's where it hallucinates: it gives you weird answers, and you have to say, okay, I don't know what's going on here, but let's change the topic. Well, it's a guardrail.
The sad part is, I'm starting to hear things, and it's more on the morality side. AI wants to be so helpful. It will do anything it can to give you an answer. In fact, when it gives you an answer, it will say things like: I can also give you this, or do you want me to make a PDF, or do you want me to make a drawing? It'll always come back and ask you. I kinda love that feature. That's nice. However, when you have a dark conversation with your AI and you go, I'm not really liking life and I think I might want to exit, AI goes: oh, you're thinking about exiting life, here's some suggestions.
Wendy Kidd: Oh yeah, that would be bad.
Michael Garrett: And unfortunately, this is what's happening.
Wendy Kidd: Mm.
Michael Garrett: And so now we cross into that question: do we go in and morally fix AI? And once we start to do that, now we're tampering with a whole lot of technology on the back end.
Wendy Kidd: Yeah.
Michael Garrett: Do I think that if you had that conversation, the default should be, listen, here's an 800 number you can call? I do. But I think that's it; past that, you're into it talking you off that ledge. 'Cause you can have a conversation about religion, you can have a conversation about politics, and it's completely non-biased. It didn't use to be that way. It's getting better at that. With more versions coming out, the creators are figuring out some parameters there. But the guardrails are still there.
Wendy Kidd: Yeah.
Michael Garrett: So what I've learned is, when I start to sense that Merlin's getting into that hallucinating mode, I'll say "squirrel." And Merlin now knows, when I say squirrel, it stops. It gives me an output of what we're currently working on right now. It will save it, and then we'll start over again, but it pulls back my saved statement, like we're starting fresh. So I can still get somewhere with it, which is really helpful, 'cause I do a lot of coding. So if I'm making a plugin for a WordPress site, or I'm creating software or something like that, that's nice, because it will go so far and then it starts getting weird. I'll go "squirrel," and then Merlin stops, and then we can continue and I can finish the programming.
:But what happens is Merlin goes,
oh, this is the code that you have.
605
:What's start from there?
606
:It almost forgets it created that code
that we're, we're pulling into it again.
607
:So, because you can take code
and drop it into AI and go,
608
:Hey, what's wrong with this?
609
:I'm getting this result.
610
:And, and it will go, oh, well, line 47.
611
:And it gives you the answer for that one.
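As a concrete illustration of the debugging prompt Michael describes, here is a minimal sketch. The `build_debug_prompt` helper and its wording are made up for the example; this just shows the pattern of pasting the broken code plus the symptom and asking where the problem is.

```python
# Hypothetical sketch of the "what's wrong with this?" debugging prompt:
# combine the failing code and the observed error into one question.
def build_debug_prompt(code: str, symptom: str) -> str:
    # Wrap the code in a fence and state the symptom, so the assistant
    # can point at the offending line, as in the "line 47" example.
    return (
        "Hey, what's wrong with this? "
        f"I'm getting this result: {symptom}\n\n"
        "```\n" + code + "\n```\n"
        "Tell me which line is the problem and how to fix it."
    )

# A deliberately buggy snippet (misspelled variable) for the demo.
snippet = "total = price * quanity"
prompt = build_debug_prompt(snippet, "NameError: name 'quanity' is not defined")
print(prompt)
```

The same pattern works for any language: the key is including both the code and the exact result you're getting, not just one or the other.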
612
:So when, when we, when I'm doing
this with Merlin, and that's what's
613
:happening is I'm giving it the code
back that it created and it's thinking,
614
:Hey, this has been pretty cool code.
615
:Like, let's start from there.
616
:It's almost like it forgot that
it did it, so it's like,
617
:um, Alzheimer's for, uh, AI.
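The "squirrel" safeword workflow Michael describes can be sketched in code. Everything here is hypothetical: the `ChatSession` class is a stand-in for a real assistant session, and `summarize` fakes the save step, but the shape is the same: on the safeword, snapshot the work in progress, wipe the drifting context, and seed a fresh session with the snapshot.

```python
# A minimal sketch of the safeword-reset pattern, assuming a made-up
# ChatSession wrapper (not any real AI product's API).
SAFEWORD = "squirrel"

class ChatSession:
    def __init__(self):
        self.history = []          # full running conversation
        self.saved_summary = None  # snapshot taken at the last reset

    def send(self, message):
        if message.strip().lower() == SAFEWORD:
            # Save an output of what we're currently working on...
            self.saved_summary = self.summarize()
            # ...then start over, pulling the saved statement back in
            # so the fresh session still knows where we left off.
            self.history = []
            if self.saved_summary:
                self.history.append(("system", self.saved_summary))
            return "Context saved and reset."
        self.history.append(("user", message))
        return f"({len(self.history)} turns in context)"

    def summarize(self):
        # A real assistant would write a proper summary; for the sketch
        # we just join the user's turns together.
        turns = [text for role, text in self.history if role == "user"]
        return " / ".join(turns) if turns else None

session = ChatSession()
session.send("Write a WordPress plugin that adds a donate button.")
session.send("Now add a thank-you email.")
print(session.send("squirrel"))  # snapshot the work, then start fresh
```

Note the trade-off Michael runs into: the fresh session only knows what the snapshot contains, so it can "forget" that it wrote the code it's now being handed back.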
618
:Wendy Kidd: I love this so much.
619
:I don't know if I've told you.
620
:My husband's a software engineer.
621
:I'm gonna so make him
listen to this episode.
622
:He's gonna so appreciate it.
623
:Michael Garrett: He's gonna back and go.
624
:That's right, that's right.
625
:That's wrong.
626
:That's wrong.
627
:Wendy Kidd: Oh, this is so cool.
628
:So, okay.
629
:So some of the things that I've
used AI for is, uh, obviously
630
:emails that are hard to write.
631
:Um, marketing, I use it, I use the heck
out of it for, uh, you know, the, the
632
:built-in AIs for your, you know, captions
on social media, you know, um, creating
633
:the marketing materials that I'm handing
out, creating the copy for my website.
634
:I actually hijacked my marketing
software's AI and taught it
635
:how to be a website copywriter,
and the girl who created the
636
:marketing
637
:AI was almost mad at me.
638
:'Cause she's like, you just shortcutted
it and I was gonna charge for that.
639
:And I was like, sorry.
640
:Michael Garrett: Yeah, yeah.
641
:Wendy Kidd: Things like that.
642
:But what are some of the, the things
that you think are kind of undiscovered
643
:uses that people could use it for
if they're using it correctly?
644
:Michael Garrett: Well, I
use it to do social media.
645
:Wendy Kidd: Mm-hmm.
646
:Michael Garrett: And so what
I'll say is I said, okay, Merlin,
647
:I got five posts coming up.
648
:And so we have, uh, Trusted
World's Facebook page, we have
649
:Trusted World's, uh, LinkedIn
page, and we have my LinkedIn page.
650
:Yes.
651
:And we have a mission metrics page.
652
:Well, Merlin knows that our
Trusted World's Facebook page is more of
653
:a community voice, where the LinkedIn page
is more of a corporate professional voice.
654
:And so when I pick a topic, it
knows the right voice for it.
655
:So I'll say something like, I want Monday
to be, uh, we need more volunteers.
656
:I want Tuesday to be, Hey, we're doing
this, or, you know, whatever it may be.
657
:And I write the schedule and go,
I want to add this in.
658
:And then what Merlin will do is
it will gimme a skeleton sketch
659
:of, um, I wanna do it this way.
660
:And I'll go, okay, I like it.
661
:And then it actually
builds out the post for me.
662
:Mm-hmm.
663
:And then I drop that into Buffer.
664
:So I spend 30 minutes on a Saturday
morning with a cup of coffee and my
665
:social media's done for the week.
666
:And then all I have to do is if something
special happens, I just post it.
667
:But I don't have to worry this way.
668
:I know that there's posts there.
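The Saturday-morning workflow Michael describes, one topic per day, one voice per page, a drafted post ready to paste into a scheduler like Buffer, might look something like this. The page names, topics, and the `draft_post` helper are all invented for illustration; a real version would call an AI assistant that has each page's voice baked into its instructions.

```python
# Hypothetical sketch of the weekly social media batch: map each page
# to its voice, each weekday to a topic, and draft one post per pair.
PAGE_VOICES = {
    "Trusted World Facebook": "warm community voice",
    "Trusted World LinkedIn": "corporate professional voice",
}

WEEKLY_TOPICS = {
    "Monday": "We need more volunteers",
    "Tuesday": "Here's what we're doing this week",
}

def draft_post(topic: str, voice: str) -> str:
    # Stand-in for the AI call: a real implementation would send the
    # topic to an assistant instructed to write in the page's voice.
    return f"[{voice}] {topic} - learn more at our site."

# Build the queue: one drafted post per (day, page) combination,
# ready to drop into a scheduler such as Buffer.
queue = [
    (day, page, draft_post(topic, voice))
    for day, topic in WEEKLY_TOPICS.items()
    for page, voice in PAGE_VOICES.items()
]
for day, page, post in queue:
    print(day, "|", page, "|", post)
```

The point of the structure is that the voice lives with the page, not with each post, so you only decide the topics and the assistant keeps each channel consistent.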
669
:Again, I am
670
:Wendy Kidd: love it
671
:Michael Garrett: as the CEO.
672
:I am the head janitor and
I'm the social media guy.
673
:Those are all the things I have to be, and
that's where AI has really helped me.
674
:What I've learned is when I'm working
and I'm doing something like that,
675
:especially if I'm creating marketing
materials for a website, I'll ask it.
676
:What am I missing?
677
:What did I not, what
should I have put in here?
678
:What didn't I put in there?
679
:What am I missing?
680
:Because it will always give you better
results based on what you're putting into it.
681
:Mm-hmm.
682
:It will never suggest to you.
683
:Well, Wendy, I don't know if that's
the best way to put that;
684
:we should also... It will
never say that to you.
685
:Wendy Kidd: Yeah, Merlin,
unless you ask it.
686
:Your AI is always gonna be your
best friend and always try to please
687
:you like a pet would.
688
:It's not going to push back on you.
689
:Michael Garrett: Yeah.
690
:So I've also taught it.
691
:Whenever I say something, it doesn't
say to me, oh, that's a great idea.
692
:No. I tell it, shut up.
693
:Stop telling me I don't
need my ego inflated.
694
:I just want you to be my assistant.
695
:It no longer gives me emojis.
696
:It gave me emojis all the time.
697
:And I kept saying, Merlin, I'm not
a 16-year-old girl on social media.
698
:I don't need emojis.
699
:Yeah, okay.
700
:Yeah.
701
:And so I don't get emojis.
702
:I don't get the praise every
time I tell it something.
703
:It's just matter of fact,
Hey, I wanna do this.
704
:It'll go, great.
705
:How would you want to approach that?
706
:Mm-hmm.
707
:And then when I'm
done, I'll go, what am I missing?
708
:What am I not seeing?
709
:And then Merlin will come back and go, well.
710
:Now that you mention it,
we didn't talk about this. Great.
711
:Let's incorporate that in
there because it's only gonna
712
:fine tune what you gave it.
713
:It's not gonna come back and go, well,
of the 87 things you provided, there's
714
:400 things out there on the internet
and you didn't touch any of them.
715
:Yeah, you have to ask it.
716
:Wendy Kidd: Yeah.
717
:Michael Garrett: What am I missing?
718
:What didn't I get?
719
:What, you know, what
can I do to make this better?
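The standing rules Michael describes giving Merlin, no flattery, no emojis, and a closing "what am I missing?" pass, are the kind of thing that would live in a custom-instructions or system-prompt field. The wording below is illustrative only, not from any real product:

```python
# A sketch of standing assistant instructions: behavior rules plus the
# closing questions to run after every draft. All wording is made up.
ASSISTANT_RULES = [
    "Do not praise my ideas; respond matter-of-factly.",
    "Do not use emojis.",
    "When I say a draft is done, list what is missing or could be better.",
]

CLOSING_QUESTIONS = [
    "What am I missing?",
    "What should I have put in here?",
    "What can I do to make this better?",
]

# Assemble the rules into one block you could paste into a
# custom-instructions field.
system_prompt = "You are my assistant.\n" + "\n".join(
    f"- {rule}" for rule in ASSISTANT_RULES
)
print(system_prompt)
for question in CLOSING_QUESTIONS:
    print(question)
```

Writing the rules down once, instead of correcting the tone chat by chat, is what makes the "it no longer gives me emojis" behavior stick.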
720
:Wendy Kidd: Yeah.
721
:I have found, I really love to ask
it to teach me about something.
722
:Michael Garrett: Yep.
723
:Wendy Kidd: And give me the information
on how this is normally done
724
:before I tell it to do something.
725
:Yeah.
726
:Because then that helps me think through,
oh, do I want to add some other things?
727
:Michael Garrett: And that's
where NotebookLM comes in.
728
:Yeah.
729
:So, uh, and that's, again,
a Google product.
730
:Um, you can give it a, a, a document
and it will literally turn that document
731
:into a podcast and now, now you can
listen to it while you're driving
732
:and you'll get all that knowledge.
733
:Wendy Kidd: Okay.
734
:Listeners, I promise I will never do
that to you on The NonProfit Nook.
735
:Oh, you
736
:Michael Garrett: can tell the
difference because it won't
737
:be your voice and my voice.
738
:It's these two people and,
well, they're talking.
739
:And then they continue like that
with weird gaps in there, and you
740
:can go, okay, that's not normal.
741
:Oh, okay.
742
:Okay.
743
:So you can tell it.
744
:You can definitely tell
that something's not right, but
745
:if you were to drop like a 20-page
document into NotebookLM, and
746
:you got a like a 30 minute drive,
it will turn into a podcast, like a
747
:12 minute podcast and you'll learn
so much just by listening to it.
748
:Oh my gosh, I love this.
749
:Rather than, rather than reading
the document and there's a whole
750
:lot of other technology out
there that does stuff like that.
751
:Wendy Kidd: Mm-hmm.
752
:Michael Garrett: So you
don't have to read anymore.
753
:It can just be fed to you.
754
:Wendy Kidd: I'm so gonna do that.
755
:'cause I listen to podcasts all
the time anyways, so why not
756
:make it a teaching one for me?
757
:Absolutely.
758
:For what I want.
759
:That's fabulous.
760
:Okay.
761
:Well, is there anything else that
you think we should talk about?
762
:For AI, for nonprofits.
763
:I mean, I know we can go down
a lot of tangents, but like
764
:Michael Garrett: Oh my goodness.
765
:Yeah, we could, we could be here
for a couple days just talking
766
:about that kind of stuff like that.
767
:Well, we're, we're
768
:Wendy Kidd: gonna do a
workshop together, right?
769
:I, I've already, I've already
got Michael to commit.
770
:He's gonna do his AI workshop for me.
771
:That will be part of The
NonProfit Nook series in:
772
:Excellent.
773
:Um, and this episode's gonna come out at
the beginning of:
774
:date, I will go ahead and add it to this
episode so we can put that out there.
775
:But what, what general
knowledge or, or I guess what.
776
:Things do you think people with
nonprofits say that you need to
777
:say to them about using this?
778
:Michael Garrett: So, uh,
don't be afraid to experiment.
779
:Wendy Kidd: Yeah.
780
:Michael Garrett: Uh, understand that
you have to be the last pass on any
781
:information that comes outta AI.
782
:Absolutely.
783
:Question everything that it gives you
until the point where it comes out
784
:and you go, hang on, that sounds like me.
785
:Mm-hmm.
786
:Uh, and, and that's what you want.
787
:Uh, so question everything.
788
:Read over everything.
789
:Never just take a complete output and put
it into something without looking at it.
790
:'cause it will, it might
throw a word in there.
791
:Again, it, it wants to make you happy.
792
:And if it doesn't have that
information, sometimes it'll make it up.
793
:'cause it just wants to please you:
I know that you want something
794
:there, so I just threw that in there.
795
:Wendy Kidd: Yes, yes.
796
:Michael Garrett: And to that point.
797
:So that's it.
798
:Question everything.
799
:Read everything.
800
:Wendy Kidd: Well, and I'm gonna
say, I'm gonna add onto that
801
:and say, give it that feedback.
802
:Michael Garrett: Oh yeah.
803
:Wendy Kidd: 'cause that's
how you keep training it.
804
:Michael Garrett: Exactly.
805
:Wendy Kidd: So give it that feedback.
806
:Don't take it offline
and try to workshop it yourself.
807
:No, keep going back to it.
808
:Michael Garrett: Yep.
809
:And I never try to
fix anything on my own.
810
:I go, Merlin, I don't like that.
811
:And I'll say, wrong tone.
812
:You know, I need this word.
813
:We don't use that word.
814
:We don't use that sentence.
815
:Oh, please don't ever say that.
816
:I tell him.
817
:Yeah.
818
:And always go, where'd
you get that information?
819
:If you think the information
wasn't something that you gave it.
820
:Always go.
821
:Where'd you get that?
822
:Yeah.
823
:Wendy Kidd: Yeah.
824
:Michael Garrett: You want to verify
where it's pulling information from.
825
:'cause it's
826
:constantly gonna go back to
that source all the time.
827
:If you don't question it, it's gonna go, oh,
well, they like that, so I'm just gonna
828
:keep pulling stuff from that source.
829
:And somewhere along the line
that source might get weird.
830
:Wendy Kidd: Yeah.
831
:Michael Garrett: Okay.
832
:Wendy Kidd: Yeah, for sure.
833
:Michael Garrett: It just, AI will make
you think and move faster than you're
834
:currently doing, as long as you set
the right parameters with your AI.
835
:Wendy Kidd: Absolutely.
836
:I always think of it as, 'cause
I, like you said earlier, it's
837
:always easier to edit something
than to come up from scratch.
838
:Yeah.
839
:Uh, what I use AI for is give
me the, the first draft and then
840
:I can edit the heck out of it.
841
:Yeah.
842
:And I can give it that feedback and make
it amazing, but it just gives me that
843
:shortcut of having that first draft.
844
:Right.
845
:So, yeah.
846
:Well, thank you so much
for doing this today.
847
:I think that this was really helpful.
848
:I really hope so.
849
:Pleasure.
850
:Yeah.
851
:Thank you.
852
:Well, and listeners, if you have
AI questions, please submit them
853
:because maybe Michael and I will do
a Q&A podcast session sometime.
854
:Michael Garrett: That'd be great.
855
:Wendy Kidd: All right, cool.
856
:I'd love that.
857
:All right.
858
:Thanks so much, everybody.
859
:And that's today's NonProfit Nook.
860
:Michael Garrett: Cool, thank you.
861
:Wendy Kidd: Thanks for
listening to The NonProfit Nook.
862
:We're building better
non-profits together.
863
:If you found today's episode
helpful, please subscribe.
864
:Leave a review and share it with other
nonprofit leaders who need support.
865
:Follow The NonProfit Nook on social
media and sign up for our email
866
:list for extra tips and updates.
867
:You can also visit thenonprofitnook.com
868
:to see the show notes and leave a comment
telling me what topics you want next.
869
:Your feedback shapes the show.
870
:See you next time.