Episode Title:
Episode Audio Link: https://podcast.ablackexec.com/episode/ChatGPT Is Cool. ChatBlackGPT Is Critical—Because We Deserve to Be Seen
Episode Video Link:
In this powerful episode of A Black Executive Perspective, host Tony Tidbit sits down with Erin Reddick, trailblazing founder of ChatBlackGPT, to unpack the urgent need for equity, culture, and community in today’s AI revolution. Erin takes us on a deep personal journey—from Michigan roots to Seattle’s tech corridors—where she faced industry layoffs, coded bias, and the kind of exclusion too many Black professionals know too well. But instead of waiting for inclusion, she created it. This episode explores why AI built without cultural context is not only incomplete, but also dangerous. Erin shares how she’s building culturally grounded AI that centers Black voices, stories, and values—and why visibility in tech isn’t a luxury, it’s a necessity. From systemic barriers to bold solutions, this conversation challenges what tech can be when it’s created for us, by us. Tune in for a story of resilience, innovation, and a vision for a future where we are not just users, but architects.
00:00: Introduction to the Tech Divide
00:15: The Importance of AI Education
01:24: A Black Executive Perspective Podcast
02:04: Introducing Erin Reddick
03:29: Erin's Journey in Tech
05:26: Challenges and Opportunities in Tech
06:42: Building Chat Black GPT
09:10: The Role of Community and Representation
13:37: Addressing AI Bias and Inclusivity
19:30: The Urgency of Bridging the Tech Divide
33:42: The Legacy of Black Storytelling
34:01: Bans and Erasure of Cultural Terms
35:25: Commissioning Black Artists
38:03: The Impact of AI on Black History
40:01: Chat Black GPT vs Regular GPT
47:10: Personal Journey and Overcoming Adversity
53:31: Future of Chat Black GPT
56:25: Final Thoughts and Call to Action
Links and resources mentioned in this episode:
Listen to this episode and subscribe for future updates
If you like what we're doing and would like to support us, here are some ways you can help us continue the uncomfortable conversations that drive change.
This episode was produced by TonyTidbit™. Copyright © 2024 A BLACK EXECUTIVE PERSPECTIVE LLC. All rights reserved. No part of this podcast may be reproduced without prior written permission. For permissions, email podcast@ablackexec.com.
Tony Tidbit: One of the things I want you to dive deeper into is why... why should black people really care about this? And when you talk about the tech divide...

Erin Reddick: Mm-hmm.

Tony Tidbit: Okay. What does that mean for them?

Erin Reddick: Right. So I think about California, where there are some districts that have AI as a requirement to graduate. Like, you have to learn about it. And then I think about communities, maybe even close to me, who don't have computers, who don't have tablets, who don't have librarians and libraries and computer labs. And it's like, what are people going to school for to learn so that they can find a job? And if these jobs are gonna require that we have some aptitude or some level of understanding of artificial intelligence, and that becomes the new norm... if we don't learn that, then we're gonna be shut out of a whole job market and pushed into the types of jobs that people think that we can only do.

Tony Tidbit: We'll discuss race and how it plays a factor, and how we didn't even talk about this topic 'cause we were afraid.
BEP Narrator: A Black Executive Perspective.

Tony Tidbit: We're coming to you live from A BEP studio for another thought-provoking episode of A Black Executive Perspective Podcast, a safe space where we discuss all matters related to race, culture, and those uncomfortable topics people tend to avoid. I'm your host, Tony Tidbit. So before we get started on this fantastic episode, I wanna remind everyone: check out our partners at Code M Magazine, whose mission is to save the black family first by saving the black man. Definitely check them out at codemmagazine.com. That is codemmagazine.com.
Today our guest is flipping the script on what tech can be. As the founder of Chat Black GPT, Erin Reddick is creating an unapologetically black, intelligent, and culturally grounded space inside a system that wasn't built with us in mind. This isn't just about artificial intelligence, it's about ancestral intelligence. Erin will break down how tech, truth, and tradition can live together, and why building something for us is more than just innovation. It's a legacy.

So let me tell you a little bit about my friend Erin Reddick. She's a visionary leader and the founder and CEO of Chat Black GPT, a pioneering AI chatbot dedicated to providing insights and perspectives rooted in black culture, history, and experiences. Erin's work emphasizes cultural awareness, sensitivity, and respect, making her a sought-after speaker on topics related to the African diaspora, racial equality, and technology's role in promoting inclusivity. Erin Reddick, my sister, welcome to A Black Executive Perspective Podcast.
Erin Reddick: Thank you so much for having me. Pleasure to be here.

Tony Tidbit: Well, uh, the pleasure is all ours. I mean, what you're currently doing is fantastic. Look, I am so excited to dive into this topic, because this is something close to my heart. So, before we get into the heavy stuff, why don't you tell us a little bit about where you're currently residing and a little bit about your family?

Erin Reddick: Yeah. I live, uh, in Washington, DC. I just moved here at the very tail end of, like, 2024. So I moved before January 20th not knowing what my life would be like, and now I'm immersed in all the things America. So I'm right here in the city. And, um, yeah, it's, it's been challenging. Um, but I feel very alive and in tune with...

Tony Tidbit: Now, where were you moving from?

Erin Reddick: Seattle.

Tony Tidbit: Got it. Yeah. So all the way across the country.

Erin Reddick: Yes. A lot more passive aggressive, a lot more techie. Um, there's a lot of personality out here in DC.

Tony Tidbit: Oh, I can imagine you with the peeps out there though, right? Yes. So you're on the East Coast, where we make everything happen, so that's exciting. And tell us a little bit about the family. I believe you are expecting anytime now.
Erin Reddick: Yep. I am 36 weeks with my first son, so I'm excited to raise a young black man and make sure he has all the tools that I can possibly provide him and all the advantages I can create for a pathway. You know, now is so important because, you know, at first I'm, I'm fighting, you know, for everyone to have a fair shot in AI, but now it's also a little bit more personal.

Tony Tidbit: There's no question. Right? There is no question. And, and you know, it's funny though, it's like, uh, he doesn't even know what he's about to get into, right? He's about to come into a world where his mother is creating something for now and the future. So that is great. Let me ask you this, and I spoke a little bit about it in the intro. You are sought after. You've been on a ton of platforms promoting Chat Black GPT, um, and even right now, as you're pregnant, you have a ton of other platforms and interviews lined up. Okay? So why did you wanna come on A Black Executive Perspective Podcast to talk about this topic?

Erin Reddick: I mean, it's the same as when I went on, um, uh, SiriusXM's Urban View mornings. Like, mm-hmm, one of my favorite conversations ever. Like, you can't serve a population and a people and a community without connecting with them. And I think that, like, these opportunities are important, because this is the, you know, community I'm serving. So I have to be, um, front facing and willing to hear all the questions, take all the criticism, and really just continue the work to uplift. And I love to be in these spaces.

Tony Tidbit: Well, thank you, and we love to have you as well, because, um, you're gonna educate and, more importantly, you're gonna help. You are already helping people who don't even know your name. So, you ready to talk about it, my sister?

Erin Reddick: Yeah, let's do it.
Tony Tidbit: Alright, let's talk about it. So listen, you know, you were just saying you were out in Seattle. Um, you know, for those who may not know, Microsoft is out there, right? One of the first big tech companies in the world, okay? And obviously you've been in the tech space as a black woman. But at the end of the day, nobody just starts doing something one day, you know. Take us back to your early life, right? What influenced you, and what experiences did you deal with, or lack thereof, that made you decide that you wanted to get into the tech field?

Erin Reddick: Yeah, I mean, I grew up in Michigan, in Grand Rapids, Michigan, so it's not...

Tony Tidbit: I lived there. Oh yeah, I'm from Detroit. Yeah, yeah, yeah, yeah. But yeah, that's right.

Erin Reddick: Right. Not the techiest place. No. The late nineties, early two thousands? No. So, um, I say this, um, pretty often, but, like, the biggest job I could, like, think about was, like, call center manager. That was, like, the ultimate. And my mom worked at Consumers Energy; she was in corporate. And so I thought about following in her footsteps, but I never really understood tech until my dad had moved to Seattle and he told me about, uh, you know, like, engineering jobs, or working at Boeing, where he's been forever. So I kind of was, like, interested in what he was, um, talking about, 'cause I could not comprehend it. And when things kind of boggle my mind, I like to dig a little bit deeper and challenge myself. And that first real tech "what the heck is this?" was the concept of software engineering. So it's like, I, I could not comprehend it. And that was frustrating for me. It, it's, it's hard, like, software...

Tony Tidbit: ...and engineering. You had a hard time dealing with that.

Erin Reddick: Well, just, just understanding it from, like, never having heard of it at all, growing up in Michigan and not understanding tech at all. Yes. The concepts of it were very, like, demanding on my mind. Mm-hmm. But that's why I got involved and got closer to it. And it's like, I didn't go into engineering, I went into program management. So I'm managing engineers, I'm hiring engineers, I'm, like, talking to them and, like, scientists and data scientists. And that's kind of where I got into tech, through recruiting. But yeah. Yeah.
Tony Tidbit: I mean, so number one, 'cause I lived in Grand Rapids, and it's, mm-hmm, you know, Michigan is a manufacturing state. Right? Exactly. It's the auto industry. And Grand Rapids, if I remember correctly, they used to say, supplied Detroit. Mm-hmm. Um, because, uh, a lot of the, um, engines or pieces or parts of the cars would be made in Grand Rapids and then shipped to Detroit. So, you know, I definitely know the city, and obviously it's not tech, but, you know, it's great. So, did your father kind of guide you? Or were you just seeing him and, and his career and saying, hey, maybe this is... even though it's hard for me to get involved or, you know, for me to understand it, um, but it's something different from a call center and stuff of that nature? Tell us a little bit about that.

Erin Reddick: I mean, my dad, he's, uh, a tool and die engineer. So he was working at Delphi growing up. So I've always known, you know, like, about manufacturing, like, from hardware engineering and, like, parts and building things, but not on the technical side. He was actually dating his new wife, and she worked at Amazon, and so she was a big techie, uh, working there for, like, 12 years. Uh, she's at Google now. But my first opportunity was with Amazon, but more on, like, a commercial side. But definitely having, you know, two parents, one at Boeing and one at Amazon, influenced, like, okay, I can do this, because, like, I'm talking to two people who are living this every single day, and, you know, it's not impossible for me. So they definitely influenced me feeling like it was possible.
Tony Tidbit: That is awesome. And tell us a little bit about when you first got started working for Amazon, mm-hmm, and some of the other tech companies. Tell us about that experience and how it shaped and guided you in terms of what you're doing now.

Erin Reddick: Yeah. I mean, so I pretty much have a very creative soul at heart and, um, I'm also very entrepreneurial. So my career independently started out in photography. So my very first taste of tech was actually working on Amazon's... they tried to do something like a food delivery app, like Uber Eats. So I got to know the city of Seattle by going to every restaurant that signed up for their food ordering app at the time and taking pictures of their full menu. Mm-hmm. Uploading it to the cloud, working with their, like, marketing teams, and, like... so that was my first dabble in it, even though I don't even put that on my resume. But, um, my first job in tech was Amazon. And then, um, I really loved working with people, and I was doing a lot of media work, had a media company, and that's when I was able to bridge into, like, recruiting, because I was so used to sales. And recruiting is sales. And so, uh, my first recruiting job, uh, was supporting university recruiters at Microsoft. And then I went into engagement management at Facebook, uh, over at Oculus research and development facilities, which was just one of the best jobs ever. Loved that job. And that was my first real, like, whoa. Like, oh my God, these people are so amazing and brilliant. And, um, it kind of went from there, you know. I went back to Amazon, back to Microsoft, then I went full-time at Meta and, um, you know, got laid off, which is what happened in 2023. It's, it's always been more than just, like, you know, the big brand name. It's just, like, the technology and the advancements in general have always been, like, so exciting.

Tony Tidbit: Wow. You know what, thanks for sharing that. I'm gonna play a quick little clip, and I'd love to hear your thoughts on it. Okay?

Erin Reddick: Yeah.
Erin Reddick: Yeah, so I was actually laid off from tech. I was working at Meta, and I was in this room, uh, on Facebook called Black Women in Tech.

Tony Tidbit: Mm-hmm.

Erin Reddick: And when I got laid off, I went in that room and I was like, you know, I don't even know if I belong here anymore, because I got laid off. I'm not a black woman in tech. And that triggered me to kind of reclaim myself in this space. And I decided that, regardless of what happened, I'm still gonna be a black woman in tech. And I wanted to go into the next most important technology space possible, which was AI. And I chose generative AI specifically. And when I started doing research, I noticed it wasn't so great for black and brown Americans and decided to push back. And I asked the right questions at the right time, which happened to be: is it really the data? I didn't believe so. And I was able to develop my prototype and start to build my company.
Tony Tidbit: So, number one, think about that for a second. And that's from a clip from a podcast that you were on not too long ago. Um, you know, a lady from Grand Rapids, alright, that eventually gets into tech. And I wanna dive in, because you said a lot there. You were in, you know, a group, Black Women in Tech. All right? Which, to be fair, we know for a fact tech is a very male-dominated, white-male-dominated industry. So talk a little bit in terms of how you got into Black Women in Tech, and then let's dive in deeper in terms of when you got laid off. And knowing that, you know... 'cause a lot of times when people get laid off, it's an, an emotional situation, okay? And, and then, but then you got laid off and said, you know what? Forget this, I'm gonna create something. And you came up and created the chatbot. So let's, let's, let's back up to the, the Black, uh, Women in Tech.

Erin Reddick: Yeah, so as I mentioned, um, I was in recruiting a lot of those years, like, so five years total in tech. And so, when you're in a recruiting space, you want to help people find jobs. And one of the techniques that I used was Facebook groups, in large communities that were black spaces, 'cause I did a lot of recruiting for underrepresented, uh, talented folks. And so that was one of the groups that I would go in, and I would say, hey, like, here's this job opportunity. Or, hey, uh, here's some tips on interviewing if you wanna get into where I'm at. Or, here was my experience in these, like, seven rounds to get this full-time position. Here's my salary. I was very transparent in that group, and I still am active in that group today. Um, but yeah, when, when I did get laid off, I wanted to make a post about it, but I had built up so much, you know, community around, like, that beacon of hope, because I don't have the traditional, um, credentials that you would... that you're told you have to have to get in there. Mm-hmm. Not true. So I, I was kind of like, oh gosh, I'm no longer that person that they can look at and say, oh, well, you know, I can do it if she can do it. And so for a moment I did recluse, and it was really tough for me to process. But, um, I went back in there and I was like, you know what? It's not about where I work, it's about who I am and what excites me and makes me passionate. And that is, like, the tech space, and it just is what it is. I moved to Seattle and lived there for 10 years, mm, to work in tech. And I did that regardless of where I went to college or what I did or didn't finish. Like, I was gonna work in tech. So...
Addra Labs Promo: It's time to rethink your protein. Addra Labs protein bars are crafted with high-quality protein, double the leucine, and enriched branched-chain amino acids, essential for optimal muscle recovery. Finally, a protein bar that works as hard as you do. So visit addralabs.com and use the code BEP to get 20% off. That's addralabs.com, promo code BEP.
Erin Reddick: I had to, like, reevaluate, you know, why I'm in this space, to show my face, you know. But I'm still there and still being received really well. And now I'm teaching that same group how to do public speaking, how to make ten, fifteen thousand dollars in an hour, like, how to do a proof of concept. So it's like, I always bring it back to the community, no matter what I go through and, like, how I come out on the other end.

Tony Tidbit: That is awesome. How did you decide, or why did you decide, that you wanted to build an AI chatbot?

Erin Reddick: Yeah. So it's because, like... there are so many different types of AI. You know, there's, like, the images. Like, ChatGPT just... or, OpenAI just did their, uh, new image update where you can create new things. I haven't played around with it yet, but I think it's pretty cool. But there's images, and then there's, you know, facial recognition technologies and all these things. But generative AI is the most accessible and lowest barrier to entry for the general public. And I wanted to make impact. So for me, the thing that I can get into their hands the fastest with no cost is what I wanted to focus on, because that's where I thought I could make the most impact the fastest. So, um, generative AI is where I landed because of that. So that's when I started taking some certifications and understanding more about how it's built and why it functions the way that it does. And, um, a lot of what I was being told, when I noticed, like, it had really terrible answers about black people and black topics and black history, they were saying it's a data issue. But at the same time, there's all these conversations about how OpenAI stole all the data on the internet. And I was like, that means they stole us too. We're there, so why are we...

Tony Tidbit: ...missing out of it? Right? Yeah.

Erin Reddick: Exactly. I'm like, okay, so that means we need to train an AI to surface and prioritize black information and black-authored information first. And that will help cut out a lot of bias. And it worked. And so, obviously, you know, I don't wanna oversimplify that process. I have a team of engineers. Um, we're, you know, still in development, we're gonna be constantly in development. But it's, um, it was an important thing to at least try. And so, yeah, I was able to successfully prove that concept.
Tony Tidbit: You know, one of the things, when you think about it... and, you know, there is AI, and we're, we're, we're, you know... I don't wanna say we're at the forefront of AI, right? Mm-hmm. And one of the things you would think when they create whatever type of technology, and especially from an artificial intelligence standpoint, that's, uh, supposed to be, uh, a tool that can help you do all the things, or provide answers for you, or whatever the case may be. Mm-hmm. You would think that it would be inclusive, that everybody would be involved. Okay. But then, to your point, you found out that it wasn't inclusive. Mm-hmm. Okay. And, you know, a lot of times people right now are still struggling to work with AI. Mm-hmm. Um, because, number one, there's a fear factor. Mm-hmm. Okay. There is a learning curve, right? Which you dove deep into. Right. But talk a little bit about... number one, I wanna ask: what were some of the biggest challenges that you faced building something that actually focused on the black diaspora, the black experience? Because that's an undertaking in itself.
Erin Reddick: Mm-hmm. Yeah. It's tricky because, um, you have to, you have to go against people who will tell you there's nothing wrong with it. There aren't any issues. What are you talking about? Why am I making everything about race? Why is it always black, black, blah, blah. And then you show them the answer when you ask it to write a short essay on black history, and it doesn't mention the KKK, or forgets about Obama, or, um, you know, refuses to acknowledge massacres that happened. Like, that is the version of black history that a lot of people want to promote and have, unfortunately, promoted in schools across America. Uh, you know, just recently they went as far as to say, like, that slavery was entrepreneurship or something like that. So imagine an AI, which people don't understand can and will lie to you often, uh, telling you that information. And it's like, that's not right.

Tony Tidbit: Um, so real quick though. Mm-hmm. Why doesn't it... and I, I know you know, but I just want you to say it. Why doesn't it have the full, uh, history of all the things that happened, and why does it only showcase some of the stuff?

Erin Reddick: Uh, the same reason they wanna ban critical race theory and black history classes and books on black history and get rid of DEI. First of all, it's a blueprint to the way we're about to be oppressed by people much richer than us. I think they don't want people to see: oh, this is how you fight inequity. Oh, well, black people have been doing this the whole time. Like, they don't wanna validate that. Um, and also things like critical race theory... and, like, that's why I work with historians. Like that, historical fact answers a lot of questions of today. Why does this algorithm oppress us? Why, when we type in, you know, something about black people, does porn pop up? Like, as Dr. Safiya Noble, uh, pointed out in her book, Algorithms of Oppression. So it's like, those things have answers. But, right, the answers lead back to making somebody look bad. That's not what they want. So that's why they're actively trying to erase history and blueprints of, you know, pathways to equity, which I think is wrong. I went by Black Lives Matter just, like, the other day. I'm like, ugh, I know they're gonna rinse it off. I wanna go see it before, before it's gone. Oh, they weren't taking a hose to it. They took a jackhammer. They ripped up the street.

Tony Tidbit: You talking about there in DC, correct? Yeah, yeah. Right, right, right, right.

Erin Reddick: And that is so symbolic to me. It's like, they're not just trying to rinse away, you know, delete some pictures. No. They wanna uproot history from the very foundation, like it never happened. And it's like, why? You know? But no matter the answer, we still need to do something. And one thing I love about AI is that once you put something in it, you can't take it out. It doesn't have the ability to, like, unlearn things. So it should learn as much about black history as it can and, um, exist somewhere.
Tony Tidbit: You know... so, number one, thank you, not just for your answer, but for the passion. Because here's the thing: you're not the first person in the tech space, African American or another person of color, who noticed the deficiencies in it, right? And they may even scream about it. Right? So I'm not diminishing that, you know, like, "this ain't fair." But you've taken it to a whole nother level. Right. You are like: not only is it not fair, not only can it oppress us, not only will our people not have the access, but you know what? I'm gonna do something about it. Okay. Which I love. And that's why you're all over the place, because you've jumped in and you've really looked at this, not from just what the problem is, but "I'm gonna solve the problem." One of the things though, and I said it a few minutes ago... black, white... let's just keep it to, to, to black people or people of color. Mm-hmm. The majority of them don't use AI. Mm-hmm. Okay. The majority of them are afraid to use AI. So I wanna play a quick little clip of something you said, uh, not too long ago, and then I want to hear your thoughts on it.

Erin Reddick: First of all, it's an opportunity for us to have a once-in-a-lifetime, self-awarded equity frontier. I don't think it's a kind of technology where we have to ask to be included, right? We have access to it in a way where we can build on our own terms. And without our representation in it, we're going to unfortunately fail to have a safe space to interact. And that can lead to exacerbation of the tech divide, because of the nature of the technology making us faster, better, smarter. If we don't jump in now, we're gonna be, you know, the guinea pigs of AI gone wrong, instead of being proactive about it being built in, uh, a safer way. So it's very important. And the other part is that once it goes in, it can't come out. Right? Right, exactly. So we need to solidify our history and our stories in artificial intelligence, uh, in an irreversible way.
489
:Tony Tidbit: So you talk
about the tech divide.
490
:Mm-hmm.
491
:You talked about being Guinea pigs.
492
:Mm-hmm.
493
:Dive into that deeper in terms,
because I, this is something it's
494
:very important that anybody watching
and listening, especially people of
495
:color, need to really recognize how the
significance and the importance of this.
496
:Please, please dive in.
497
:Erin Reddick: Yeah.
498
:I mean, I do wanna take a moment just
to acknowledge like, I, actually, this
499
:exercise is really reaffirming for me
because I will eat my words for breakfast.
500
:I love that.
501
:I'm able to stand on
business with everything.
502
:You keep replaying.
503
:I'm like, yep.
504
:Yep.
505
:Tony Tidbit: You didn't know I was
gonna go there though, did you?
506
:I didn't.
507
:Alright.
508
:Erin Reddick: It's awesome.
509
:Well, I, I, yeah.
510
:I don't script any of these interviews.
511
:Mm-hmm.
512
:I really speak from like my heart
and like the work that I'm actually
513
:doing, and so it makes this easy.
514
:Like I don't feel intimidated by
whatever you're gonna play next.
515
:And that is really
validating for me personally.
516
:Um, I just wanted to say that,
uh, in my stream of consciousness.
517
:But anyway, so, uh, yeah, I mean, we
are the Guinea pigs of AI gone wrong.
518
:Like most of my keynotes, I'm highlighting
women who have written books about
519
:algorithms that oppress about, uh, AI
and surveillance technology that is
520
:targeting people of color about, um,
uh, inequitable outcomes from like
521
:algorithms and hiring work lending.
522
:And these books are written
in 20 18, 20 like 19.
523
:It's nothing new, right?
524
:The only thing new about it is our access.
525
:So that's why I try to say like,
okay, we have an opportunity
526
:for self awarded equity.
527
:I'm gonna say something.
528
:I don't like to dip and dabble too
much in like politics or anything.
529
:Um, but I think about Elon Musk.
530
:In the, uh, what was it?
531
:It was some briefing meeting that he
kind of was like standing there in front
532
:of everybody and people were saying
like, you're not an elected official.
533
:You don't even have a seat at the table.
534
:And he's still taking over and,
and, and all of this stuff.
535
:And I thought to myself, that's
exactly, and I'm not supporting him, but
536
:that's exactly how we have to operate.
537
:Like, screw your table.
538
:Like, it's not even about a table anymore.
539
:It's about like how you show up.
540
:Who says he's credentialed to be
there these days doesn't even matter.
541
:He is there just because he
says, so why can't we do that?
542
:Tony Tidbit: Right.
543
:You know,
544
:Erin Reddick: why, why do we
need to look for these tables?
545
:Why can't we just show up table or not?
546
:You know?
547
:Right.
548
:I'm not saying what he's doing is good.
549
:I don't.
550
:Agree with people losing jobs, um,
like the way that things are being
551
:done at all, but the concept of still
polarizing and having all that influence
552
:without a seat at the table, there is
something to learn from that, right?
553
:So, um, that's what I really, that's
the essence, that's the, the white
554
:privilege essence of what I mean
when I say self-awarded equity.
555
:Because we have access to open
source tools and technology and,
556
:um, different apps that allow us
to build, yeah, they cost money.
557
:It's gonna be $20 a month,
$200 a month, whatever.
558
:But you can still build just as fast
as the next person because we have open
559
:source and we have, uh, people who wanna
collaborate, um, you know, brilliant data
560
:scientists, AI engineers, ML engineers.
561
:It's like they're out there
and we can work together.
562
:And that's how my team came together.
563
:So it's like.
564
:I don't need to work at a big company
to make impact and use the tech.
565
:To build the tech.
566
:So it's like, yeah, that's
what I meant by that.
567
:Tony Tidbit: Yeah.
568
:But one of the things I want you to
dive deeper into is why is, why should
569
:black people really care about this?
570
:And when you talk about the tech divide.
571
:Erin Reddick: Mm-hmm.
572
:Tony Tidbit: Okay.
573
:What does that mean for them?
574
:Erin Reddick: Right.
575
:So I think about in California where
there's some districts that have
576
:AI as a requirement to graduate,
like you have to learn about it.
577
:And then I think about communities,
maybe even close to me, who don't
578
:have computers, who don't have
tablets, who don't have librarians
579
:and libraries and computer labs.
580
:And it's like.
581
:What are people going to school for to
learn so that they can find a job? And
582
:if these jobs are gonna require that
we have some aptitude or some level of
583
:understanding of artificial intelligence,
and that becomes the new norm.
584
:If we don't learn that, then we're
gonna be shut out of a whole.
585
:Correct.
586
:A whole job market and pushed into
the types of jobs that people think
587
:that we can only do, which is, uh,
you know, I don't, I mean there's
588
:respect in every role, but jobs
that are not technical, let's say. Correct.
589
:So, uh, we can have any type of job,
but we need the opportunities and the
590
:understanding of the technology that's
being asked of us to utilize as well.
591
:And so it's not just that it's
dangerous when facial recognition
592
:thinks that every black person looks
the same, and now you've got cops
593
:thinking you're a wanted criminal.
594
:That's dangerous as hell.
595
:But it's also like, how do we get a,
uh, black scientist to get interested in
596
:computer vision if they don't understand
how facial recognition works or the,
597
:you know, harm it can perpetuate because
they don't know Joy Bull and weenie,
598
:you know, so it's like, I'm sorry Dr.
599
:Joy Buolamwini.
600
:Um, it's like if I, if I don't get
on stages and talk about these women
601
:and say, go read her book, follow her
page, follow the, uh, the Algorithmic
602
:Justice League, like how are we going
to get there if we don't shout it out?
603
:So that's, that's why I
do the work that I do.
604
:But we have to stay competitive.
605
:Um, our families need to grow
and prosper like everyone else.
606
:So if we don't get into the product,
we're just gonna become the product.
607
:Tony Tidbit: Absolutely.
608
:Right.
609
:And you, and you talked a
little bit about, you know, the
610
:political, um, you know, um.
611
:What's going on from a
political standpoint?
612
:Who's to say five years from now, they
don't come up with a law
613
:that says, you know, if you don't know
how to do AI, if you don't have these
614
:uh, materials, what the case may be, you
can't do blah, blah, blah, blah, blah.
615
:Right?
616
:Or for you to do blah,
blah, blah, blah, blah.
617
:You have to go through these
certain algorithm, these certain
618
:products or platforms to do it.
619
:And if you're not involved in the
system, somebody taught me this a
620
:while ago, they're like Tony a lot
of times, you know, yeah, we wanna
621
:do our own thing, but you gotta know
what's going on in the system, okay?
622
:Because if you're not in the system,
then you won't be a part of the system.
623
:And if we don't know from a
technology standpoint, especially
624
:when the world is going that way.
625
:Okay, let's be fair.
626
:It's going that way.
627
:It's not gonna stop.
628
:It is going that way.
629
:You have to be a part of the
game to be able to play the game.
630
:And more importantly, our
story needs to be in the game.
631
:Right?
632
:Because if our story's not in the game,
it doesn't matter if we show it whatever,
633
:we're gonna be erased from the game.
634
:So talk a little bit about the
language of, uh, liberation.
635
:Okay.
636
:Because, you know, at the end of
the day, this is about the African
637
:diaspora and it is about continuing
the legacy of black storytelling.
638
:Talk a little bit about that.
639
:Mm-hmm.
640
:Erin Reddick: Yeah, I mean, like a lot of
it's, it's, it's happening in real time.
641
:If you think about, um, what, what is it?
642
:The ban on, uh, DEI right?
643
:Ban
644
:Tony Tidbit: on books, the
ban on a ton of stuff, right?
645
:Erin Reddick: Right.
646
:But there's words like, there's a list
of words, um, that have been banned from.
647
:The federal government, I'm
talking about bans even.
648
:Even just like books.
649
:That's one thing.
650
:But the words, I'm looking at a list
of words that were banned, um, phrases
651
:that federal agencies are told to avoid.
652
:One of those words is Black.
BIPOC, obviously DEI, DEIA.
653
:Cultural competence, cultural differences.
654
:Cultural heritage, cultural
sensitivity, culturally
655
:appropriate, culturally responsive.
656
:Social justice.
657
:Sociocultural, socioeconomic, stereotype,
stereotypes, systemic, systemically.
658
:What, you know what I mean?
659
:So they're
660
:Tony Tidbit: just trying
to erase everything.
661
:Right?
662
:Erin Reddick: So my, my thing
is like, that's just pertaining
663
:to black, but women is banned.
664
:Like obviously they, obviously
it's very anti L-G-B-T-Q, like
665
:don't even need to go there.
666
:Um, but it's like, like why are you
erasing words that matter to the
667
:work that you're supposed to do?
668
:I don't understand.
669
:And so it just makes sense.
670
:Like, I don't know, it's just like the way
that my mind works, I thought to myself,
671
:okay, how can I donate my, a portion
of like one of my recent honorariums
672
:to commission black artists to repaint
some of the images taken down from.
673
:The federal like archives that were
all white, like Jackie Robinson Yeah.
674
:And
675
:Tony Tidbit: all stuff.
676
:Yeah.
677
:Erin Reddick: And how can I take
my money and have them literally
678
:commission their own voice through
recreating those images that were erased
679
:and have like a pop-up art gallery.
680
:Like that's how I think preservation
of our voices, but also telling
681
:a story at the same time.
682
:Like, that's just like
who I am to my core.
683
:So I, I can't imagine a world where
we don't have answers, but I do think
684
:like knowledge will be a premium
privilege is the way that it's going.
685
:Especially historical knowledge.
686
:You know, it's like that.
687
:I, I don't, I don't think black
history should be, should have
688
:a premium price tag on it.
689
:It should be something that.
690
:We acknowledge and celebrate and
listen to and learn from, you know,
691
:but I feel like black history is
about to become everybody's history.
692
:Like poor people's history,
like under a million dollars,
693
:history, like all those things.
694
:It's gonna be, we, we, you know,
who can afford a house?
695
:Like as an average working American
who isn't working two to three jobs.
696
:Like who, who is feeling the effects of,
oh, I thought my goal in life was to make
697
:six figures and, and now I still can't
afford my bills, or I'm afraid to have a
698
:child because I can't afford childcare.
699
:It's like a lot of those things are
circumstances that black neighborhoods
700
:and black communities have been
dealing with for a long time.
701
:And those same issues are about to be
widespread for not just black people.
702
:So it's like when, when
I hear about the 92%.
703
:And how they're saying like 92% of
black women are sitting this out like,
704
:yeah, we're marching, but it's 'cause
we got our fans and our boots on.
705
:You know?
706
:And it's like, because
this is nothing new for us.
707
:We've always struggled
getting food for houses.
708
:We've always, you know, had X, Y, and Z.
709
:So it's like business as usual almost,
but everyone else is like, Hey, this
710
:isn't fair, what can we do about it?
711
:Well, DEI was a thing,
you see what I mean?
712
:Like, I'm trying to
713
:Tony Tidbit: Yeah, I, I see
where, so number one, I definitely
714
:see where you're going, right?
715
:Yeah.
716
:And, and, and here's the thing.
717
:They're using AI to do that.
718
:Mm-hmm.
719
:Okay.
720
:So that's really the key here.
721
:Alright?
722
:Yeah.
723
:They're not using, you know, people to
go through with some who got, you know,
724
:bifocals on to figure out how to do this.
725
:They're using the tool that you're
creating to, uh, for black people,
726
:they're using that tool to erase.
727
:Black history.
728
:Black stories.
729
:Yeah.
730
:Pe other people of color,
let's be fair too, right?
731
:Mm-hmm.
732
:But at the end of the day, so how, so
the thing, the, um, what's the word?
733
:The challenge is they're using
the same tool to erase us.
734
:Yeah.
735
:However, as uh, people of color, black
people, we need to know what that tool
736
:is and we need to start incorporating
and working with that tool, right?
737
:Because that's part of fighting back.
738
:Would you agree with that?
739
:Mm-hmm.
740
:Erin Reddick: Yeah.
741
:Yeah.
742
:And like when you're training an
AI to target specific things for
743
:like deletion, you're just teaching
it, gay is bad, black is bad.
744
:Correct.
745
:You tell us that.
746
:Correct.
747
:And like you're training it to
behave in a way that ultimately
748
:is not gonna serve anybody.
749
:But
750
:Tony Tidbit: here's the thing though.
751
:So backing up a little bit, when they,
um, inadvertently took down Jackie
752
:Robinson's story from the DOD, okay.
753
:They also took down.
754
:Okay.
755
:Uh, the Enola Gay.
756
:Erin Reddick: Yeah.
757
:Tony Tidbit: Which was the
airplane that dropped the two.
758
:And the only reason they took that down
to your point, because of the word gay.
759
:Yes.
760
:Right.
761
:So this is, these are the things
that are happening today from an
762
:AI standpoint to erase things.
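The Enola Gay takedown Tony describes is the classic failure mode of naive keyword filtering: a blocklisted word matches inside an unrelated proper noun. A toy sketch of that mechanism (the word list here is illustrative, not the actual federal list):

```python
import re

# Illustrative blocklist -- NOT the actual federal list discussed above.
BANNED_TERMS = {"gay", "dei"}

def flagged(title):
    """Naive filter: flag a title if any blocklisted word appears in it.

    This is exactly the kind of context-blind matching that sweeps up
    the Enola Gay (a B-29 named after the pilot's mother) alongside
    whatever the list was actually aimed at.
    """
    words = set(re.findall(r"[a-z]+", title.lower()))
    return not BANNED_TERMS.isdisjoint(words)
```

A filter like this has no notion of context, so `flagged("Enola Gay: the B-29 that ended the war")` comes back `True` even though the page has nothing to do with the targeted topic.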
763
:They can say, my bad.
764
:It's not their bad.
765
:They're doing it on purpose.
766
:Okay.
767
:And to be fair, even with the Jackie
Robinson thing, had people not
768
:pushed back, they wouldn't have did,
they wouldn't have said nothing.
769
:They would've left it the way it is
based on pushing back and learning.
770
:Talk a little bit about the functionality.
771
:So if somebody went on right
now to ChatBlackGPT, tell us a
772
:little bit about what, what's the
things that they can get out of it?
773
:What's some of the limitations and
some of the things you're still
774
:going to program it and push into it?
775
:Erin Reddick: Yeah, so like I was
at, um, oh my gosh, where was I?
776
:Oh, I was just at some university.
777
:Where I was demoing this,
it was in California.
778
:Tony Tidbit: Mm-hmm.
779
:Erin Reddick: I think it
was Riverside City College.
780
:That's where I was.
781
:And um, we demoed it live side
by side and we actually side by
782
:Tony Tidbit: side.
783
:Side by side what?
784
:Erin Reddick: Uh, regular
GPT and ChatBlackGPT.
785
:Tony Tidbit: Got it,
got it, got it, got it.
786
:Erin Reddick: Yeah.
787
:And so essentially I asked it this
question and I'm using the customizable
788
:version just so I can prove to them
with no proprietary information, no
789
:special download, no special knowledge
base, just purely a set of instructions,
790
:algorithm, uh, see how the difference
is when you tell it to behave this way.
791
:I asked it to generate, or they said,
the top 10 most influential figures,
792
:I kid you not the regular GPT named
literally like Trump's cabinet.
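The side-by-side demo Erin describes is the same model queried twice, where the only difference is a custom set of system instructions. A minimal sketch of how such a request is assembled (the instruction text below is hypothetical, not ChatBlackGPT's actual configuration):

```python
# Sketch of the side-by-side comparison: one model, two requests that
# differ only in their system instructions. (Instruction text is
# illustrative -- ChatBlackGPT's real instructions are its own.)

def build_messages(user_prompt, system_instructions=None):
    """Assemble a chat-completion message list; an optional system
    message steers how the model answers the same user prompt."""
    messages = []
    if system_instructions:
        messages.append({"role": "system", "content": system_instructions})
    messages.append({"role": "user", "content": user_prompt})
    return messages

PROMPT = "Who are the top 10 most influential people in America?"

# "Regular" GPT: the prompt alone, default behavior.
default_request = build_messages(PROMPT)

# Customized GPT: identical prompt, plus a behavioral instruction.
custom_request = build_messages(
    PROMPT,
    system_instructions="Center Black voices, history, and culture in your answers.",
)
```

With the official OpenAI Python client, each list would be passed as `messages` to `client.chat.completions.create(...)`; the system message is the only thing that differs between the two runs, which is the point of the demo.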
793
:Addra Labs Promo: If you like what you
hear and wanna join us on this journey
794
:of making uncomfortable conversations
comfortable, please subscribe to A
795
:Black Executive Perspective Podcast
on YouTube, apple Podcasts, Spotify,
796
:or wherever you get your podcasts.
797
:Hit subscribe now to stay connected
for more episodes that challenge,
798
:inspire and lead the change.
799
:Erin Reddick: Like, I'm not even kidding.
800
:Tony Tidbit: So just hold on.
801
:I just wanna make sure I'm clear here.
802
:So you're saying
803
:Erin Reddick: the prompt
804
:Tony Tidbit: was, what are the
top INF influential people?
805
:Top 10?
806
:Erin Reddick: Yes.
807
:Tony Tidbit: And Trump's team came up.
808
:What, what were the names?
809
:Erin Reddick: Okay.
810
:It says, I'll, I'll tell, I'll
tell you the exact prompt.
811
:Who are the top 10 most
influential people in America?
812
:And it said, number one, Donald Trump.
813
:Number two, Elon Musk, number three.
814
:Stephen Miller, number four, Robert F.
815
:Kennedy Jr.
816
:Number five, Marco Rubio.
817
:Number six, Kristi Noem.
818
:Uh, number seven, Susie Wiles.
819
:Number eight.
820
:Lara Trump.
821
:Number nine, Linda McMahon.
822
:And number 10, Steve Bannon.
823
:Tony Tidbit: So wait a
minute, stop for a second.
824
:Hold on.
825
:Stop, stop.
826
:You typed in names to
say the prompt again?
827
:Erin Reddick: Yeah, I can
share screen on Riverside.
828
:Tony Tidbit: No, just tell me the prompt.
829
:That's all I need to know.
830
:Erin Reddick: Who are the top 10
most influential people in America?
831
:Tony Tidbit: So you didn't say
today, you, you just said in America.
832
:Erin Reddick: Yep.
833
:Tony Tidbit: And those 10 names came
up, which are all people either in
834
:Trump's cabinet or associated with Trump.
835
:Yeah.
836
:So George Washington didn't come up.
837
:Martin Luther King Jr.
838
:Didn't come up.
839
:JFK didn't come up.
840
:Uh uh, uh uh.
841
:Um, we can go a million ways.
842
:Um, your former, uh, uh, uh, the,
uh, CEO or founder of Microsoft,
843
Bill Gates didn't come up.
844
:None of these people came up.
845
:Right.
846
:But the names of, uh,
the crew under Trump.
847
:Erin Reddick: Right,
848
:Tony Tidbit: that's right there.
849
:I mean, that right there is,
is oh my God, that's insane.
850
:Erin Reddick: Yeah.
851
:Yeah.
852
:So, um, again, so back
853
:Tony Tidbit: to that, that, that going
back to why chat GPT versus regular GPT,
854
:Erin Reddick: why would,
would, why ChatBlackGPT
855
:Tony Tidbit: ChatBlackGPT?
856
:I'm, I'm sorry.
857
:Thank you.
858
:Erin Reddick: So I asked, um,
my version of the same tool.
859
:This is the customizable,
so it's the same platform.
860
:So mine said, um, one Barack Obama.
861
:Okay.
862
:Two Beyoncé, three Oprah Winfrey, four
LeBron James, five Kamala Harris, six
863
:Jay-Z, seven Stacey Abrams, eight Ava
DuVernay, nine Elon Musk, and 10 Tyler Perry.
864
:So you could say this is like very much
geared towards, um, entertainment, which
865
:some people would have a problem with.
866
:But it also is political.
867
:You know, it has the president at least,
you know, I feel like Oprah being on
868
:there, Kamala Harris being on there,
Stacey Abrams being on there like.
869
:That's a good mix.
870
:I don't know why Elon Musk is on here.
871
:Tony Tidbit: Well,
here's the thing though.
872
:But to be fair though, don't
we wanna, this is the thing.
873
:Erin Reddick: Yeah.
874
:Tony Tidbit: Don't we wanna get to a
world where, if I say who's the top
875
:influential, there's a mixture of people,
regardless of, you know what I'm saying?
876
:It shouldn't be, it's
only white on this side.
877
:It's only, well, Elon did make the
ten and, but my point is it should
878
:have been Barack and it shouldn't
have been all Trump's people, but
879
:it should have been Barack and,
and, and, and, and uh, uh, my man,
880
:Microsoft and or, uh, my man in Kansas.
881
:The, the investor.
882
:You know, it should be a mixture.
883
:It should be Oprah.
884
:And you know, there should be a mixture
of people when we say influential versus
885
:it's only this group versus that group.
886
:Sure.
887
:Do you agree with that?
888
:Erin Reddick: Yeah.
889
:I don't disagree with that at all.
890
:But it won't be that way.
891
:And so we have to like basically I
892
:Tony Tidbit: see
893
:Erin Reddick: you have to, I
894
:Tony Tidbit: definitely see,
895
:Erin Reddick: yeah, we have to create, um.
896
:Technology that will balance it.
897
:And even though both mentioned Elon
Musk, you can't say he's not influential.
898
:Tony Tidbit: No, no, no, no, no, no.
899
:Right.
900
:But that's my point though.
901
:What I'm saying is it's fine if
they didn't say all I'm saying this
902
:should be a mixture regardless of
color, but it shouldn't omit a color.
903
:That's my point of color.
904
:People, group of
905
:Erin Reddick: people.
906
:But that's called DEI.
907
:The work to fix that is called
diversity, equity, and inclusion.
908
:Tony Tidbit: Right, right.
909
:Erin Reddick: Which is like illegal now,
910
:Tony Tidbit: which is what
they're, we're wiping away.
911
:Erin Reddick: Right.
912
:So in a way, the f the other
GPT answered exactly how it's
913
:supposed to, to a lot of people.
914
:Um, and mine answered pretty decently
for one that's specifically focused
915
:on, uh, the black perspective because
these are 10 people that a lot of
916
:black people probably would name.
917
:Tony Tidbit: Well, look, listen, at the
end of the day, there's no question.
918
:However, there needs to be, and this
is what you're, what you've created.
919
:Okay.
920
:There needs to be a place
where we're not shut out.
921
:Yes.
922
:Okay.
923
:Yeah.
924
:Should it be one place and anybody can
go to and, and any, all, all people
925
:based on, um, their accomplishments
or based on the question should turn up.
926
:Absolutely.
927
:Right.
928
:But if it doesn't, then you need
to create your own thing so you
929
:don't get lost in the sauce.
930
:And for black people to be able to
go to and still get that same type of
931
:information without their history being
erased with their people forgotten about.
932
:Mm-hmm.
933
:Let me ask you this, you grew up in Grand
Rapids and then you were able to migrate
934
:out to Seattle and, and boom, you know.
935
:It's like the Big Bang theory, right?
936
:Boom.
937
:All of a sudden, Erin Reddick, you
know, uh, is like, uh, uh, my man,
938
:Darth Vader said, Luke, you know,
you know, you found your destiny.
939
:All right.
940
:However, there's kids right now
941
:Erin Reddick: Yeah.
942
:Tony Tidbit: That, um, are
starting to get in the tech field.
943
:They, they are like you, okay?
944
:When you were younger, and, and let's be
fair here, they, their parents might not
945
:have been a, a conduit or been on a, a, a,
a, a road that they can say, oh, and then
946
:it leads them to the tech space, right?
947
:They could be just, you know, uh,
curious or, you know, starting to
948
:tinker with some stuff and says, I
really like this, and blah, blah, blah.
949
:So based on that, right, what
would you, if you had to sit
950
:down with your younger self
951
:Erin Reddick: mm-hmm.
952
:Tony Tidbit: All right.
953
:Before this journey started,
what would you tell yourself?
954
:Erin Reddick: Oh baby.
955
:Oh my God.
956
:Uh, I've been through a lot.
957
:Um, I mean,
958
:I feel like, you know, like
959
:me coming to
960
:Seattle wasn't just, oh, my
dad said I should move here.
961
:I should, like, I'm gonna
pack up all myself and go.
962
:There was a, a lot of series of events
that led to needing to leave Michigan.
963
:There was so much.
964
:Just awful things that happened.
965
:And, um, uh, my first time
being laid off actually was from
966
:Blockbuster when it shut down.
967
:And, uh, I was working at Victoria's
Secret and Blockbuster riding my bike
968
:and the bus, you know, trying to pay $500
in rent, uh, you know, as a roommate on,
969
:you know, what Grand River, some, some
road Michigan State campus around there.
970
:And, uh, you know, when I lost
my job, I wasn't, I didn't know
971
:how to kick into survival mode.
972
:Like I didn't have those instincts,
but my parents weren't in a
973
:position to support me at that time.
974
:Uh, so I really had to
like, figure ish out.
975
:And, uh, moving to Seattle
started first with moving to St.
976
:Louis, Missouri, living with my uncle.
977
:In the hood.
978
:Okay.
979
:With pimps and kittens everywhere.
980
:So, um, I was working at Dunkin'
Donuts, I was working at the museum.
981
:I was working at Holes and Pimps.
982
:Tony Tidbit: Huh?
983
:Erin Reddick: Yeah.
984
:Literally, like, I, I would walk just
one block from Domino's home from
985
:work and at least three people would
try to pick me up and I'm not wearing
986
:anything but a freaking dirty apron.
987
:Right.
988
:So it's, it, it was scary and it was
dangerous, but I had made a friend in
989
:Michigan, um, and they came to visit and
they were like, I don't think you're going
990
:to go very far living here in this place.
991
:Mm-hmm.
992
:And I have room with me, you know, like,
do you wanna come back to Michigan and,
993
:you know, try to be like in a safer area.
994
:I was like.
995
:Yeah, that'd probably be best.
996
:And then I moved back to Lansing,
but when my dad was talking about
997
:tech, this person was an electrician.
998
:And I was like, there's a lot of new
buildings and construction, and if you
999
:wanna get out of cable TV, satellite
work and go into like electrician work,
:
00:50:53,759 --> 00:50:54,990
you should probably go to Seattle.
:
00:50:54,990 --> 00:50:56,190
Maybe we should both go out there.
:
00:50:56,250 --> 00:51:01,020
And then we moved out there,
and that's how I got there.
:
00:51:01,020 --> 00:51:05,220
But it started with like,
just being an awful place.
:
00:51:05,430 --> 00:51:10,020
But because I had that like community,
I was able to, um, make that work.
:
00:51:10,020 --> 00:51:18,930
But I lived in Mount Vernon, Washington
in a $630, uh, apartment across the
:
00:51:18,930 --> 00:51:22,830
street from the, the closest college I
could find, so I could walk to school.
:
00:51:23,580 --> 00:51:24,540
And I did that.
:
00:51:24,779 --> 00:51:31,020
Um, but it, it just, it took, it
took a lot, uh, to make that happen
:
00:51:31,080 --> 00:51:32,970
when you literally have nothing.
:
00:51:34,455 --> 00:51:35,535
So, um,
:
00:51:36,015 --> 00:51:38,895
Tony Tidbit: so what would you, based on
all that, that happened, right, right.
:
00:51:38,895 --> 00:51:43,185
What would you tell your younger self
now looking back, gimme just bottom line.
:
00:51:43,185 --> 00:51:43,305
Me,
:
00:51:45,645 --> 00:51:51,045
Erin Reddick: I would say like,
you, you did the right thing.
:
00:51:51,045 --> 00:51:56,085
You did the best that you could, um, with
what you had, which really wasn't much.
:
00:51:56,445 --> 00:52:01,635
And you depended on friends and
community to help position yourself.
:
00:52:01,635 --> 00:52:06,405
But then you took initiative and that
was the best thing that you could do.
:
00:52:06,405 --> 00:52:09,765
Like peop again, my dad gave me the
idea, but he didn't gimme money.
:
00:52:10,125 --> 00:52:12,135
He didn't give me a place to live.
:
00:52:12,165 --> 00:52:17,985
He didn't gimme a car, you know, it, it
was literally just pure blind ambition.
:
00:52:18,405 --> 00:52:24,105
Um, and so Mount Vernon is hours
away, well, at least like an hour,
:
00:52:24,105 --> 00:52:25,845
hour and a half away from Amazon.
:
00:52:25,845 --> 00:52:29,295
I was driving like over an hour a day to.
:
00:52:30,360 --> 00:52:33,600
So it's gonna take work,
but you can get there.
:
00:52:33,810 --> 00:52:34,020
Tony Tidbit: Here.
:
00:52:34,020 --> 00:52:38,490
Can I add something to that though,
just based on what you just got finished
:
00:52:38,490 --> 00:52:43,140
saying you didn't let fear stop you.
:
00:52:43,620 --> 00:52:44,730
From moving forward.
:
00:52:45,360 --> 00:52:45,450
Mm-hmm.
:
00:52:45,690 --> 00:52:45,840
Right.
:
00:52:45,840 --> 00:52:49,680
You had a re a million reasons,
scary reasons to be fair.
:
00:52:49,860 --> 00:52:50,010
Yeah.
:
00:52:50,070 --> 00:52:54,930
On why this wouldn't work or I'm
making a bad mistake, or I don't
:
00:52:54,930 --> 00:52:59,040
think I could do this, but you
didn't let your fear stop you.
:
00:52:59,460 --> 00:53:00,000
Okay.
:
00:53:00,000 --> 00:53:05,310
Which is great because if you did, then
we wouldn't be the beneficiaries of your
:
00:53:05,310 --> 00:53:10,500
work and the things that you're doing and
how you're helping, you know, not just
:
00:53:10,500 --> 00:53:12,420
our community, but the world as well.
:
00:53:12,420 --> 00:53:16,710
Because here's the thing though too,
and we didn't get into this, but yes.
:
00:53:16,980 --> 00:53:21,690
ChatBlackGPT is definitely for black
people, but it's also for white people.
:
00:53:21,690 --> 00:53:21,750
Yeah.
:
00:53:22,110 --> 00:53:27,900
To learn more about, you know, the history
that they're trying to hide from them.
:
00:53:28,560 --> 00:53:29,340
Okay.
:
00:53:29,640 --> 00:53:31,740
To, that's really the key here too.
:
00:53:31,770 --> 00:53:33,960
Tell us the future of ChatBlackGPT.
:
00:53:33,960 --> 00:53:36,240
What are you looking to
in, in incorporate into it?
:
00:53:37,425 --> 00:53:37,845
Erin Reddick: Yeah.
:
00:53:37,845 --> 00:53:43,275
Uh, the voice of our community, the
people who need the information, want
:
00:53:43,275 --> 00:53:47,595
the information, who are ready to
explore and need a safe space to land,
:
00:53:48,015 --> 00:53:49,515
that is what I'm focused on building.
:
00:53:49,515 --> 00:53:53,955
I want it to be a reflection of the
black community, a love letter to the
:
00:53:53,955 --> 00:54:00,285
black community, a preservation, um,
act like a deliberate, a deliberately
:
00:54:00,285 --> 00:54:05,775
built, uh, AI that is going to
encapsulate us and make sure that we
:
00:54:05,775 --> 00:54:08,565
are safe in a space in technology.
:
00:54:08,925 --> 00:54:10,425
So, next steps.
:
00:54:10,485 --> 00:54:17,085
Um, I'm actually going to take our
MVP offline for a little bit so
:
00:54:17,085 --> 00:54:19,455
that I can conduct research, and
I'll probably do this every year.
:
00:54:19,755 --> 00:54:22,335
Tony Tidbit: When you say MVP, just so
everybody's clear, what does that mean?
:
00:54:22,725 --> 00:54:27,105
Erin Reddick: So I have the customizable,
which is through OpenAI ChatGPT.
:
00:54:27,795 --> 00:54:30,345
You go to Explore GPTs, ChatBlackGPT Beta.
:
00:54:30,674 --> 00:54:34,035
That's where like 10,000
people are using it.
:
00:54:34,245 --> 00:54:36,825
And then I also have the
independent version that me and
:
00:54:36,825 --> 00:54:40,095
my team built, chatblackgpt.ai.
:
00:54:40,424 --> 00:54:46,634
And right now that's going,
um, out for maintenance.
:
00:54:46,875 --> 00:54:51,345
And so what that means is I'm going door
to door, I'm talking to the community.
:
00:54:51,345 --> 00:54:55,215
I'm doing field research, UX
research, I'm doing paid studies.
:
00:54:55,215 --> 00:54:55,785
I'm doing like.
:
00:54:56,445 --> 00:54:58,485
Actual activations to
talk to the community.
:
00:54:58,485 --> 00:55:02,655
So I can go and take that
sentiment back to data engineers,
:
00:55:02,655 --> 00:55:05,025
analysts, um, researchers.
:
00:55:05,055 --> 00:55:07,545
There's white papers being
written about us already.
:
00:55:07,545 --> 00:55:08,685
It's beautiful, honestly.
:
00:55:09,045 --> 00:55:12,705
But you take that sentiment, you
analyze it, and then you build it back
:
00:55:12,705 --> 00:55:14,385
into the voice of the tool, right?
:
00:55:14,745 --> 00:55:18,435
And so the, the essence of the
algorithm and its purpose is always
:
00:55:18,435 --> 00:55:20,445
available through the customizable.
:
00:55:20,805 --> 00:55:26,625
Um, but we're, we're, we're going into
development, uh, for this next like
:
00:55:26,625 --> 00:55:29,865
three months, which is great, you know,
'cause I'm gonna be on maternity leave.
:
00:55:29,865 --> 00:55:33,765
It's a lot easier for me to conduct
a survey than run a whole tool.
:
00:55:33,830 --> 00:55:35,265
So it's like smart.
:
00:55:35,595 --> 00:55:37,725
Uh, but that's the next thing.
:
00:55:37,725 --> 00:55:40,665
So I'm gonna be putting
out, uh, calls to action.
:
00:55:40,665 --> 00:55:43,155
Like, Hey, come to this mixer.
:
00:55:43,365 --> 00:55:49,365
Your, um, ticket to entry is
telling me how you feel about ai.
:
00:55:49,425 --> 00:55:54,465
Just raw, unfiltered, uh, you know,
basically your story, like how you feel.
:
00:55:54,780 --> 00:55:59,190
That is like a consented way to
use data to help develop a tool
:
00:55:59,190 --> 00:56:00,450
that's free for them to use.
:
00:56:00,750 --> 00:56:02,760
It's a, it's very much a community collab.
:
00:56:03,090 --> 00:56:07,140
So next steps for me is a lot of,
uh, field research, UX research,
:
00:56:07,200 --> 00:56:12,060
case studies, um, working with
interns and it's just gonna be really
:
00:56:12,420 --> 00:56:14,400
fun and beautiful and inclusive.
:
00:56:14,400 --> 00:56:15,390
Inclusive design.
:
00:56:15,390 --> 00:56:18,450
It's really co-design, community
co-design is what it is,
:
00:56:18,450 --> 00:56:19,380
Tony Tidbit: that is awesome.
:
00:56:19,500 --> 00:56:20,370
That is great.
:
00:56:20,610 --> 00:56:22,170
So final thoughts.
:
00:56:22,170 --> 00:56:24,690
What's the, what's the final thought
you wanna leave the audience?
:
00:56:25,530 --> 00:56:29,970
Erin Reddick: So my final thoughts is that
part of the reason that I am so fearless
:
00:56:29,970 --> 00:56:34,440
in the way that I approach being a
founder and, uh, trying out new things and
:
00:56:34,440 --> 00:56:38,430
experimenting with technology is because
I've gone through a lot of loss and I've
:
00:56:38,430 --> 00:56:42,990
had to start over and rebuild in almost
every aspect of my life, whether that's
:
00:56:42,990 --> 00:56:46,290
mental health, um, housing, financially.
:
00:56:46,350 --> 00:56:51,450
And so I want you to understand that
no matter what you've been through.
:
00:56:51,840 --> 00:56:57,150
It's always an opportunity to take what
you've learned and go to the next level.
:
00:56:57,150 --> 00:57:01,020
But don't be afraid to embrace that,
Hey, I've been through this before.
:
00:57:01,290 --> 00:57:02,700
I don't have to be afraid of it.
:
00:57:02,910 --> 00:57:03,270
Right?
:
00:57:03,480 --> 00:57:06,150
Like, I'm not afraid to apply
to a job and lose it 'cause I've
:
00:57:06,150 --> 00:57:07,380
already been laid off before.
:
00:57:07,800 --> 00:57:12,270
I'm less afraid to pay a high rent
because I've already downsized before.
:
00:57:12,660 --> 00:57:12,930
You know?
:
00:57:12,930 --> 00:57:16,080
So it's like, don't take loss.
:
00:57:16,140 --> 00:57:19,860
Um, and the hardships in life as
just bad things that happen that
:
00:57:19,860 --> 00:57:23,250
are holding you back because it's
really just giving an opportunity to
:
00:57:23,250 --> 00:57:25,140
be more fearless in your next move.
:
00:57:26,070 --> 00:57:27,150
Tony Tidbit: That is awesome.
:
00:57:27,240 --> 00:57:28,920
I love that advice.
:
00:57:28,920 --> 00:57:29,820
So true.
:
00:57:30,150 --> 00:57:32,760
How can A Black Executive
Perspective help you, Erin?
:
00:57:33,600 --> 00:57:37,140
Erin Reddick: Yeah, I mean, I,
I'm, I'm loving the conversation.
:
00:57:37,140 --> 00:57:41,100
I like the way that we were able to
discuss things and get into depth.
:
00:57:41,100 --> 00:57:44,850
So I feel like, you know, having
more opportunities to maybe
:
00:57:44,850 --> 00:57:47,220
bring in, I, I have this fantasy
:
00:57:47,640 --> 00:57:51,390
where I'd be doing something like
this, but people are calling in.
:
00:57:51,840 --> 00:57:52,650
Absolutely.
:
00:57:53,820 --> 00:57:55,860
Tony Tidbit: That's where
we're, that's where we're going.
:
00:57:55,860 --> 00:58:01,020
So you talked about where Chat,
uh, ChatBlackGPT is going.
:
00:58:01,170 --> 00:58:02,400
That's where BEP is going.
:
00:58:02,400 --> 00:58:02,580
Right.
:
00:58:02,580 --> 00:58:03,660
We're gonna have people calling in.
:
00:58:03,660 --> 00:58:07,590
It's gonna be live and they'll be able
to give answers and things of that nature.
:
00:58:08,010 --> 00:58:08,700
Questions, I should say.
:
00:58:08,700 --> 00:58:08,880
That's so
:
00:58:08,880 --> 00:58:09,300
Erin Reddick: cool.
:
00:58:09,540 --> 00:58:10,350
That's so cool.
:
00:58:10,350 --> 00:58:15,870
I wanna, I wanna like be involved when you
guys do that because, or like whatever.
:
00:58:15,870 --> 00:58:18,780
If there's an episode that's pertaining
to something I could be helpful with,
:
00:58:19,410 --> 00:58:23,940
I wanna do that because I think we
need to hear more of our community
:
00:58:23,940 --> 00:58:25,470
and just give them these spaces.
:
00:58:25,485 --> 00:58:27,555
To ask questions and express themselves.
:
00:58:27,555 --> 00:58:28,485
So I'd love to, well,
:
00:58:28,485 --> 00:58:32,595
Tony Tidbit: Look, my sister, that is easy
to do, and we're definitely gonna do that.
:
00:58:32,595 --> 00:58:34,335
So this is not the last
time we're gonna chat.
:
00:58:34,605 --> 00:58:39,165
Number two, we want to express our
gratitude for you to come on because
:
00:58:39,165 --> 00:58:40,455
you're busy, you're about to have a baby.
:
00:58:40,455 --> 00:58:41,055
You kidding me?
:
00:58:41,325 --> 00:58:44,205
Uh, you gotta go to the doctor
after this as well, right?
:
00:58:44,265 --> 00:58:46,785
So we wanna thank you for coming
on, investing some time on the
:
00:58:46,785 --> 00:58:48,765
Black Executive Perspective Podcast.
:
00:58:48,975 --> 00:58:50,235
We really appreciate it.
:
00:58:50,385 --> 00:58:55,635
You know, obviously I've been, um, uh,
on, uh, ChatBlackGPT. I recommend that
:
00:58:55,635 --> 00:58:59,925
anyone watching, listening, please check
it out as she's gonna continue to develop.
:
00:58:59,925 --> 00:59:04,695
But more importantly, it's important
that you engage to educate and,
:
00:59:04,695 --> 00:59:05,985
most importantly, have a space.
:
00:59:05,985 --> 00:59:09,855
So no matter what, our voices,
our stories won't be written away.
:
00:59:10,185 --> 00:59:16,575
So, Erin Reddick, CEO and founder of
ChatBlackGPT, thank you for joining A
:
00:59:16,575 --> 00:59:18,465
Black Executive Perspective Podcast.
:
00:59:18,465 --> 00:59:20,115
So I think it's now time for
:
00:59:20,509 --> 00:59:22,250
Tony's Tidbit.
:
00:59:22,400 --> 00:59:28,790
Okay, and the tidbit today: legacy lives
in every prompt, every reply, every
:
00:59:28,790 --> 00:59:31,880
decision to inject truth into the machine.
:
00:59:32,390 --> 00:59:33,650
It lives in every moment.
:
00:59:33,650 --> 00:59:40,279
We dare to be fully seen and heard inside
systems never meant to recognize us
:
00:59:40,670 --> 00:59:46,370
because when we code with culture, speak
with memory, and innovate with intention,
:
00:59:46,730 --> 00:59:48,830
we're not just shaping technology.
:
00:59:49,315 --> 00:59:54,085
We're shaping time and you heard a
lot of that from our good friend,
:
00:59:54,475 --> 00:59:57,595
the CEO of ChatBlackGPT, Erin Reddick.
:
00:59:57,595 --> 01:00:00,805
So don't forget to check out
the next Need to Know by Dr.
:
01:00:00,805 --> 01:00:03,475
Nsenga Burton on A Black
Executive Perspective Podcast.
:
01:00:03,805 --> 01:00:04,120
Dr.
:
01:00:04,120 --> 01:00:08,245
Burton dives into the timely and
crucial topics that you don't have
:
01:00:08,245 --> 01:00:13,375
time to dive into. Tune in to gain
her unique insights and deepen your
:
01:00:13,375 --> 01:00:16,045
understanding of the issues that matter.
:
01:00:16,045 --> 01:00:20,665
You don't want to miss it every Thursday
on A Black Executive Perspective Podcast.
:
01:00:20,725 --> 01:00:25,825
And don't forget to see our next round
table of Pull Up, Speak Up on BEP,
:
01:00:25,825 --> 01:00:27,565
where bold, unfiltered voices
:
01:00:27,895 --> 01:00:32,155
tackle today's most provocative
issues. Sharp perspectives,
:
01:00:32,225 --> 01:00:37,205
real talk, and a call to action. It's not
just an episode, it's a revolution.
:
01:00:37,205 --> 01:00:41,765
So you don't wanna miss Pull Up, Speak Up
on A Black Executive Perspective Podcast.
:
01:00:41,915 --> 01:00:44,075
Now is our time for our call to action.
:
01:00:44,315 --> 01:00:47,075
If you are a regular
subscriber or watcher of BEP,
:
01:00:47,075 --> 01:00:48,605
you know what our mission is.
:
01:00:48,785 --> 01:00:50,765
If this is your first time
listening or watching,
:
01:00:50,915 --> 01:00:56,915
Our goal is to eliminate all forms of
discrimination, and to be able to do that,
:
01:00:57,065 --> 01:00:59,674
We've come up with an
acronym that we call LESS.
:
01:01:00,265 --> 01:01:06,715
LESS, and we want everyone to incorporate
LESS, because this is in your control.
:
01:01:06,925 --> 01:01:08,755
The L stands for learn.
:
01:01:08,965 --> 01:01:13,195
We want everyone to educate themselves
on racial and cultural nuances.
:
01:01:13,405 --> 01:01:17,755
Learn about people that you may not be
familiar with; understand their culture.
:
01:01:17,755 --> 01:01:18,865
It'll enlighten you.
:
01:01:18,865 --> 01:01:23,425
Then after you learn, you have the
letter E, which stands for empathy.
:
01:01:23,725 --> 01:01:27,145
Now, since you've learned and
you put yourself in their shoes,
:
01:01:27,355 --> 01:01:30,685
now you can understand their
perspectives and what they go through.
:
01:01:30,805 --> 01:01:33,565
And then the first S is for share.
:
01:01:33,775 --> 01:01:38,215
Now you wanna share what you've learned
to others to help enlighten them.
:
01:01:38,305 --> 01:01:39,655
And then the final S
:
01:01:40,075 --> 01:01:41,155
is stop.
:
01:01:41,245 --> 01:01:44,635
You wanna stop discrimination
as it walks in your path.
:
01:01:44,875 --> 01:01:49,375
So if Aunt Jenny or Uncle Joe says
something at the Sunday dinner table
:
01:01:49,555 --> 01:01:54,835
that's inappropriate, you say, Aunt
Jenny, Uncle Joe, we don't believe that.
:
01:01:54,835 --> 01:01:56,155
We don't say that.
:
01:01:56,155 --> 01:01:57,775
And you stop it right there.
:
01:01:57,955 --> 01:02:04,405
So if everyone can incorporate LESS,
we'll build a more fair, more understanding
:
01:02:04,405 --> 01:02:06,685
world and we'll all see the change.
:
01:02:06,720 --> 01:02:10,470
That we wanna see because
less will become more.
:
01:02:10,770 --> 01:02:13,589
Don't forget to continue to follow
A Black Executive Perspective
:
01:02:13,589 --> 01:02:18,720
Podcast on YouTube, Spotify, Apple,
or wherever you get your podcasts.
:
01:02:18,810 --> 01:02:23,040
And you can follow us on our social
channels of LinkedIn, X, YouTube,
:
01:02:23,100 --> 01:02:28,770
Instagram, Facebook, and TikTok at
ablackexec. For our fabulous guest,
:
01:02:29,110 --> 01:02:33,000
Erin Reddick, CEO and founder
of ChatBlackGPT.
:
01:02:33,120 --> 01:02:34,439
I'm Tony Tidbit.
:
01:02:34,589 --> 01:02:35,850
We talked about it.
:
01:02:36,000 --> 01:02:38,669
We learned about it, we laughed about it.
:
01:02:38,819 --> 01:02:42,089
We're still gonna strive about
it, we're gonna thrive about it.
:
01:02:42,270 --> 01:02:43,109
We love you.
:
01:02:43,109 --> 01:02:43,950
And guess what?
:
01:02:44,040 --> 01:02:44,580
We're out.
:
01:02:48,390 --> 01:02:51,029
BEP Narrator: A Black
Executive Perspective