Cyborgs, Robots and AI, OH MY!
Episode 516th December 2025 • User Friendly - The Podcast • User Friendly Media Group
Duration: 00:49:19


Shownotes

This week, the team dives into a set of questions that you have sent us throughout the year.

Guest: Luke Sanders

Hosts: Will Sikkens, Bill Snodgrass, and Gretchen Winkler

Transcripts

Speaker:

Welcome to We Are Technologies' User Friendly 2.0, with host Bill Sikkens, technology architect.

And this is User Friendly 2.0. I am your host, Bill Sikkens. Welcome to this week's show. Bill, Gretchen, welcome.

Hello there.

We are in the middle of the 2025 Season of Giving, and User Friendly is proud to support the William Temple House in Portland. This is a great charity. They've been around for 60 years. They provide mental health services to those that can't afford them, at either low or no cost depending on the situation, and they also provide a full food bank, which this year has been in demand. I wouldn't say it hasn't been in demand before, but more so this year. They're doing a fall fundraising campaign. Check it out if you want to see what they offer or if you can make a donation: go to williamtemple.org or userfriendly.show. Either one will take you to the right place to get some more information.

All right. Bill, Gretchen, how was your week?

All right. Yeah, I did some crazy shopping.

Well, that's good. Enjoy it, because, you know, with the onset of AI and everything else that's going on, there's been a lot of different things talked about. But according to Elon Musk, we're not going to have to work at all in 20 years. Now, I don't know what that means exactly, but, you know, by that prediction, I'll be dead.

Yeah, that's one possibility, right.

We're going to be talking this week about a lot of different things, including AI and where that's headed, cybernetics, robotics, and answering some of the questions you've been sending in that we haven't gotten to earlier in the year. And just kind of giving a quick once-over. I don't know how quick; it will be more of a deep dive, I guess, in the second segment, to look at what we might need to fear, what we don't need to fear, and all the rest of that. And the other big thing: is AI going to steal your job? We're going to talk about that a little bit too.

After the news in this segment, our guest is going to be on, talking about his e-book. I will leave it at that, and we'll get to it a little bit later, when we get closer to that part of the show. But before that, let's go ahead and jump into the news. What do we have this week?

All right. Shopify suffers an outage at a bad time.

So, what is a bad time? I would think any time would be.

Oh yeah. Any time's a bad time, I'll tell you.

And that's the question, though: hey, what's a bad time? What's a good time for your website not to work? Right.

So Shopify, for anybody that doesn't know, runs a lot of businesses, mainly small businesses that have a web presence. They offer credit card processing and the ability to load catalogs and inventory. And most of the time you probably don't know that you're on a Shopify site when you are; the sites are designed to look like whatever company they belong to, and Shopify handles the background. Well, they had an outage on Cyber Monday. That was the bad time.

Oh yeah.

And this is something that affected a lot of businesses, because of Black Friday weekend and Cyber Monday. The reason they call it that is because, for years, with most businesses, especially retail, the day after Thanksgiving, when the shopping starts, is when they get into the black, which is the term for "we are now in profit." So it's a very, very important time. And with all of that moving online as it has in recent years, if your website doesn't work during one of the main days, of which Cyber Monday is now a big part, people will go and find the product somewhere else. There goes brand loyalty and all that. They might want to come back, but this is just something that you don't want to have happen.

We've had a lot of outages, as we've talked about over the last few months, from Amazon to Microsoft to others, this being another one for a different reason. As of right now, the explanation that's been given is simply that they had a problem. Well, I could have told you that without reading their explanation. As we get more details on it, we'll certainly talk about it. But we at We Are Technologies have several clients that do use Shopify, and this was a major problem; there was no way to switch to a backup or anything else.

So, on the positive side for Shopify, I have not heard about this happening often. In fact, I think this is the first time I've ever heard about a real problem where it's actually gone down like that. There have been some quirks with it, but that was just not the day to have it happen.

And also, the term "in the red" meant that you were in debt.

Yeah, exactly.

I used to use red ink for when the money was leaving and black ink for when the money was coming in, in the books.

Yeah, exactly. So that's where that saying comes from: "in the black," because now you're in black ink. Red ink is bad; black ink is good. Right? So.

Right.

Cambridge scientists successfully reverse the age of human skin cells.

So this is an interesting thing, and they're talking about a "reprogramming" process, is what they're calling it. This is one of those things that I'm going to preface with: I'd like to actually see this before I completely believe it, although, looking at it, it looks like it is not an exact study. If it is, it's really kind of a cool thing, because (and this is the part of it that made me question it a little bit) they're claiming to be able to reverse the age of your skin cells by about 30 years.

Okay. That would put me at 20. That would be really nice.

Yeah. So they're saying that the approach could be an important step toward future anti-aging therapies, although it is still early and much more testing needs to be done.

So, you know, the thing of it is, talking about it from the standpoint of assuming all of this is as the study has presented it: I do know we're moving in the direction of having some technologies at the forefront that will do things like this, things that were science fiction not so long ago. So the idea that this is possible? Yeah, it's possible. Are they up to a point where they can reverse your aging by 30 years? In my opinion, probably not. But even if it's somewhere in the middle of that, or even ten years, that's still a big accomplishment.

Well, think of it this way: if humans are living longer, that means our skin needs to last longer. Think about it that way. And if you were able to do something to help the skin regenerate, that could be just useful, right?

Right. And, you know, all the way around, it could be useful. Now, they're not claiming they reverse your aging, but your skin's aging, by something like that. Whatever the reason, it would be nice to do so.

Yeah. Well, we'll see what happens with that. Like I say, it's in the early testing stages and all of that, and I'm sure if this becomes a thing where it really works and is available, we're going to be hearing a lot more about it. All the movie stars will use it.

Okay.

A Japanese helmet with a built-in visor display is created. Tell us.

So this is a model called the GT3 helmet, developed by the French company EyeLights. Basically, what it is: it is a motorcycle helmet, as you said, and in the visor you have a projector, and that projector sends up information like your speed, the music that's playing, or the phone call you're on, that kind of thing. I would imagine mapping and all that kind of stuff would be a part of this too.

Now, what's interesting about it is that this type of technology is something that would be very beneficial. I'm just a little surprised that they're just announcing it now, because we've had heads-up displays for a while. Even 15 years ago (I know it didn't market well) the beginning of all of this, which would arguably be Google Glass, had this type of technology. The fact that it's now being perfected and available on a consumer level, that's great, but we're seeing this type of thing already in glasses and other things that you can just go look at, wear, and buy. So I think it's a good idea. I think it's great for people that ride bikes; that type of thing makes things even safer, so you're not having to fumble around. You know, it's just like in a car, except when you do that on a motorcycle, it's even more dangerous, so having that available... I'm just surprised that it's only at this point that it's coming out.

A teen's innovative system eliminates 96% of microplastics from drinking water.

Yeah, and this is absolutely cool. I was looking into this because, you know, we try to confirm the stories we get. And I will say that a good portion of what comes in you don't cover, because it's like this: it might be something more for The Onion. But this one is not, and this is something that is really kind of cool. I am working on getting the inventor of this thing on the air next year, some time, to talk about this a little bit more, because he did a TED talk and some other things. I'll get into names and details here in a minute.

But basically, what we're talking about here is that microplastics are in everything, and that causes a lot of problems because they're toxic. When you have something that's made out of plastic, even biodegradable plastic, if it's not recycled properly, it will break down, break down, and eventually break down to a point where you don't see it anymore. But it is definitely still there. So what ends up happening is this has gotten into the water, into the air, into our bodies, into our animals' bodies, all of these different types of things, and it's something that is very hard, if not impossible at this point, to get away from.

So what he's come up with here is a system that is able to basically filter the microplastics out of drinking water. This is something that would work in conjunction with things like your water treatment plants and all of that. And then the other part of it that's really cool, and this has been an area that has been somewhat difficult to do, is that it actually can capture this stuff, and then they can get it out of the environment and recycle it. Meaning the plastic at that point could be reused. So if this can actually be automated and put into, you know, that type of a situation, I think this could be something that is a game changer. A game changer; yeah, that's the word I'm looking for. Really, it absolutely is.

So again, what I'm keeping back for the moment is the details, because I want to get our interview set up for next year. But we will put some information out; I'll make sure it gets up onto our socials so you're able to get to the TEDx talk.

What to know about the 3I/ATLAS comet, and why people keep talking about aliens.

Yes. So 3I/ATLAS, for anybody that doesn't know about this, is a comet, which is what NASA's calling it, that came from outside of the solar system. It's the third interstellar object that we've observed, ever, meaning simply that it did not originate within our solar system. But the thing of it is, there's stuff in space besides just what's here. I mean, you know, the Earth is not flat, and it's not the center of the universe. I'm sure I'm going to get some comments on that, but believe me, it's not. So just ignore them.

Yeah. So, in a quick aside on that, there was a meme that came up that says, "Well, this is embarrassing. Of all the planets in our solar system, ours is the flat one?" Well, that's embarrassing.

Yeah, it would be.

Anyway, the comet has been exhibiting some unusual behaviors that researchers have not seen in other objects, interstellar or otherwise. One of them is that it seems to be able to correct its own course. There seems to be exhaust that changes color. It went around the sun and used it to launch itself back out. And then the tail: with a normal comet, the ice melting is what creates the tail, and it goes away from the sun, in the opposite direction of where the comet is moving. This one had an anti-tail; it was going the other way.

And from that, there's been a lot of speculation around the idea: is this actually just a naturally occurring thing, or is it something like, an exotic example being, an alien spaceship, or even a probe like a Voyager?

Yeah. Yeah, exactly.

You know, or something of that nature. First of all, this wouldn't be impossible. But the thing of it is, there's a major researcher at Harvard, who we talked about earlier in the year, who has raised the idea. He's not saying, "Well, this is a spaceship." He's saying: look at the facts through scientific theory and the scientific method. There are some things here that require an open mind, to be able to look at this and see if maybe there is something else going on, something that we haven't seen before at all.

On that end, we're getting a lot of disinformation, I've noticed, on the internet, as well as AI-produced things that make it look like a spaceship being reported as, "Well, this is the picture NASA took." No, it isn't.

And if that was the case, there wouldn't be speculation. But the thing of it is, at the end of the day, it is also an odd size, which, again, is different from most comets. Is it impossible? No. But one of the other things that really has surprised people is that the emissions of this thing include elements, like arsenic, in forms that, up until now, we've only found in refined, you know, modern technology, where these things are made. So again, it's certainly possible this is all naturally occurring, which would be something very unusual. And on the other end of the spectrum, it's certainly possible that this is something that is completely new, and, you know, game-changing would be the word.

Now, people talk about this kind of stuff. Stephen Hawking said that meeting aliens could be very negative for humanity.

Yeah, that's one of the things that drives all this, a lot of this kind of speculation and stuff, with respect to the idea of what happened when the Europeans got to North America, which, for example, as we all know, went very badly for the Native Americans. And there are a lot of stories like this in our history. Now, if we're going to talk about aliens and look at this from a scientific approach, I personally would think that, number one, if this was possible, they'd already be here. Probably; we might not know it. But number two, I don't see where species that have the ability to create spaceships and have this type of technology would want to take over this planet. If I was one of them and figured out what's going on, I think I'd just point my ship in the opposite direction and go as fast as I could.

"These people are crazy," you know, to look at it. But the thing of it is, there probably isn't that much difference either. You know, Roswell, when all that happened: hey, maybe it was one of their kids got a hold of a saucer ship, took it on a joyride, and crashed it into the desert. We all know.

Oh, yeah. Oh, yeah, that would be... I'd make a big ol' bet. "Dad, I crashed the spaceship."

Yeah. You know, it's like, can you imagine? It's bad enough when it's "I crashed your brand-new BMW." I think this would be even worse. So, in fact, I'm quite sure it would.

But, you know, again, at the end of the day, it's very likely. I mean, you look at the numbers and everything, and it's very likely (I would almost say you can't say it with certainty, but 99%) that there are other species that live in this universe, and other intelligences, and all of that kind of stuff. And, you know, again, people are worried. Well, the other side of it is, having this technology and things, they might show up and try to help us. In fact, I think that would be much more likely, because if they were going to invade, we would have been invaded already, you know, in reality. Yeah. And so there might be a very positive direction that something like that could go. Now, is 3I/ATLAS an alien spaceship? Well, look at the evidence and decide for yourself what you think.

Generally speaking, the mainstream view is that it is just an unusual comet. But there are other people, credible scientists, that seem to be suggesting otherwise. And I'll tell you this: if it does turn out to be aliens, I don't know, would it be ancient aliens? I mean, the History Channel is going to have to do something with that.

All right. So I guess it's entirely possible.

Yeah, yeah. Well, anyway.

All right, well, that's our news. Let's go ahead and get over to our interview and find out about e-books and some things dealing with that. There we go.

Joining us now: Luke Sanders, author, podcaster, and many other things. Welcome to User Friendly.

Thank you. Thank you for having me. I'm glad to be here.

So instead of me trying to sit here and describe all the things you do, I'm just going to throw it to you. Luke, tell us a little bit about yourself, your podcast, and your upcoming e-book. I know those were some of the things on the agenda.

Yeah, absolutely.

So my name is Luke Sanders. I'm an Indianapolis native; if you don't know too much about Indianapolis, Babyface is from here, people like that. I got my start here, worked on a web series, and that's up on Amazon. Then I went into radio and transitioned into podcasting. With podcasting, I've been able to be a nine-time top-ranked Apple Podcasts host, and with radio I earned two certificates.

Thank you.

And now I'm interested in being an author. So I have my first book, my debut book, Luke's Dreamland: Superhero in a Movie, and I'm super excited about that. The story is not based on my life, but it is inspired by my life. Some things are different about the character, but he's a six-year-old boy, has a big imagination, is a lucid dreamer, and, yeah, the possibilities are endless with him.

All right, so let's dive into that a little bit more. Talk about the book: when are you going to be releasing it? And tell us a little bit about the story; don't give us any spoilers, but a little bit about it.

No, so the book came out on Tuesday; it's been out for a couple of days now. But about the book, not giving too much away: as you noticed, it's about a boy who dreams about being a superhero. The synopsis is, he goes to sleep after watching his favorite movie about a superhero, and he wakes up and he's that superhero in the dream. And he learns, very quickly, that it's not all about fun and games.

That is so true of superheroes. I think, you know... yeah, there's always that journey of discovering, "Oops, it's not as easy as it looks."

Yeah. But how do people find your book? Do you have a website, or do they go to Amazon as the access point?

Absolutely. So you can find me, Lukey Sanders, on social media platforms. You also can search on Apple Books, and you can buy it online at Barnes & Noble and other places online as well.

All right. So I will include that in our social media, so the link is out there. But thanks so much. I'm curious, what's up?

Yeah, I have a question: why just an e-book, and not a printed version?

Yeah, good question. I am self-published, so this is just a start. The audience is 1-to-5-year-olds, but it's getting started. First-time book, first-time author.

Oh, terrific.

Thank you. Congratulations.

Now I want to ask about some other things in your bio here. You say you're a filmmaker, and I've worked in that too. Tell us a little bit about that; that sounds interesting.

So I have a web series called Luke's IV Chronicles, which basically chronicled my life through college, and I'm bringing it back for a two-part special. It's really the third season, but it's put together in two parts, with a 21-minute format for each episode, and it's the same show that people have seen before, with much more of the love and editing that it deserved the first go-around. And, yeah, it's a great time. It's about me going back to college; it's also about real life and catastrophes happening. We try to take a podcast and go back to school days later. It's real. Life is fun; life is everything.

And so, you know, writing a book and doing the web series and everything else you do: there are so many things that go into something like this. What has it taken you to actually get to the point where you were ready, both to get the book out and to get your series ready to go out? I mean, I know there's a lot going on, but can you talk about that a little bit?

Great. Yeah, absolutely. The book: so I came up with the title and the idea two years ago. I was looking at when would be a good time to release it, and this was a good time for me to release it. But, yeah, I really held it back to make sure the manuscript was where I wanted it to be. I designed the cover, so I really wanted to work hard on that. And then with the web series, I'm kind of a one-man show when it comes to, like, the editing and the story of it all. And it's a docu web series; it's not a reality show, but, so people know, it's like a documentary in series format. But, yeah, it takes a lot of blood, sweat, and tears. Time, patience. Going back to the drawing board when, you know, different stuff happens. It takes time.

Yeah. I think it is absolutely amazing when you see content creators such as yourself being able to actually do this. As you say, a one-man show: in a bigger production you have all these people around to do all these different things, but when you do it yourself, you're wearing a lot of hats and have to really know a lot of different things, you know, to pull it off properly. So that's great.

Thank you.

Is there anything else you'd like to talk about?

Really just getting the word out about this book. I'm really excited about it. I made him kind of, like I said, inspired by me. Like, his hair is blond; my hair is not blond, but it is kinky. So there are different things that I put into him. I made him interracial so he can appeal to kids who are of color and who are not of color. So, yeah, just really excited about him, even though he's me and I'm him. Like, look, it's not the same person, but... but yeah. So, movie sounds.com.

I'll go ahead and hand it to Gretchen.

So I was going to ask you: do you see your target audience as mostly children, or do you think that adults could read it and enjoy it just as well? Like, if you're, you know, a mom and you're reading it to your kid, would the mom enjoy it too?

Yeah, absolutely, I do. I can say that humbly; I know I wrote it, but I do think adults would enjoy it as well, because it's a simple story, but it's a fun story. It's something you will really get immersed in: the story, seeing all that he does, the people that he saves, and how the story ends. And will there be a part two, with the cliffhanger? Because there's a cliffhanger at the end. So, yeah.

But, all the way around, we'll include where to find this in our social media loop. Thank you so much for joining us, and good luck with this.

Thank you so much. Thank you for having me.

All right.

Next segment, we are going to be talking about cybernetics, bionics, and artificial intelligence, answering some of your questions, and looking at the actual state of all this and where it's probably headed, given some of our opinions. This is User Friendly 2.0. We'll be back after the break.


Welcome back. This is User Friendly 2.0. Check out our show at userfriendly.show; that's your one stop for everything User Friendly: back episodes, submitting your questions and comments, and looking at our blog, which we promised to start writing again this year, and did, twice, and then stopped. So we're going to have that be a New Year's resolution. How's that, Gretchen?

Oh, goody.

I've been busy. I've been really busy. I have these things called novels that I've been fretting over.

Yeah, yeah, you got it. Speaking of which, before we get into our topic today, how's it going?

All right. The High School Journal is almost out now.

It is almost out, yes. And after, you know, monkeying around with formatting and all kinds of technicalities, we're getting, like, super duper close.

Yep, yep. So we're actually going to cover this in a little more detail next week, because it will be out by then, and we'll get you the details on it and how to find it. It's turned out to be an amazing book, too, so I think you're going to probably really enjoy it. Anyway, that being said:

Let's dive into the topic we're covering: robotics, cybernetics, AI, and so on. And your book actually goes into that a little bit too, so that's appropriate, sorta. So let's talk about some things that ten years ago were completely science fiction and today are an absolute reality. And one thing I can say on that front, with all honesty, is: I am a cyborg. And I will explain that a little bit later on. This is called your tease that you need to listen to this segment.

So let's start with AI. Okay? We've talked about this a lot this year, as any technology show has, because it is proving itself to be somewhat groundbreaking, a little bit more than I thought it would be a year ago. And there are both sides of the argument on this, but we'll get into how this is used here in just a second.

Now, Bill, I know you're concerned, and we've talked about it, about AI upstaging and replacing artists. Why don't you comment on that for a minute? Because this is a big thing people are asking questions about.

Well, when it first started out, it was easy to determine whether it was an AI picture or not. And now that it's been a year since, you know, those softwares have been out, the AI is getting better and better at it, and it becomes harder and harder to determine if something has been AI generated or not. And it has taken away a lot of jobs that were previously for artists.

And we're going to dive into the jobs thing in a minute here, because that is one area that is directly affected: artists, graphic designers, that whole area of expertise.

545

:

Now, Gretchen, you're an artist.

546

:

I mean, you write novels and are an artist

and these type of things.

547

:

And I know we're just talking

about High School Journal.

548

:

You've used AI not to write the book,

but for some of the editing process.

549

:

And this,

I think, would be a positive thing.

550

:

Can you talk about that a little bit?

551

:

Actually, it's really been helpful

as far as the editing process.

552

:

But you have to be mindful

that when you use one of these

553

:

AIs to help you edit, you don't

just accept everything that it suggests,

554

:

because I have seen a couple of instances

where it made a suggestion

555

:

that would change the meaning of

the sentence. You don't want to do that.

556

:

So this is a tool that you work closely

with.

557

:

Like, all right.

558

:

When I went to college,

they did not train me

559

:

how to write, like, a novel.

560

:

It was only expository.

561

:

And so I didn't learn the proper

punctuation

562

:

for conversations in a textbook.

563

:

Okay, I didn't learn that.

564

:

And so I'm starting

to learn the punctuation

565

:

from the software

because it keeps correcting me.

566

:

I keep making the same mistake.

567

:

And so I'm learning

while it's telling me, hey,

568

:

you need to do that,

but it's something you work with.

569

:

And,

you have to be careful because a lot of it

570

:

seems like the

AI that I'm using is business oriented.

571

:

Its mindset is focused on

572

:

helping business

people write factual documents.

573

:

So when I'm using it, I'm doing

fictional characters,

574

:

I have children,

I have adults who are maybe less educated.

575

:

I have to be careful that I don't let

576

:

this suggest the wrong things

for the characters.

577

:

So you have to use your brain.

578

:

So it's a

it's it's a companion, it's a helper.

579

:

But it does not replace the writer. Okay.

580

:

It doesn't come up with the ideas.

581

:

I do the writing.

582

:

Now, the thing of it is,

conceivably it could.

583

:

And I think that's

where Bill's coming from.

584

:

The fact that he's the writer,

you know. Right.

585

:

I tend to look at

AI as being like the backup camera.

586

:

Most newer cars have a backup

camera; you shift into reverse

587

:

and the camera turns

588

:

on. The original goal of that was

589

:

you would look in the mirror and

then have an additional view to look at.

590

:

A lot of people just look at the backup

camera.

591

:

It's it's not meant to replace

what you would normally do.

592

:

And I think that analogy

kind of connects here

593

:

because you look at something like art

now, not the monetization

594

:

of it, graphic artists

and that type of thing.

595

:

But true art was about sitting down

and creating something from nothing.

596

:

It's a skill

that not a lot of people have,

597

:

and those that do have it are unique,

and it really is something to look at.

598

:

I've often said before, I can't draw

a straight line, but I love art.

599

:

I know what I like, right?

600

:

So from that type of a standpoint,

601

:

you're going to still have artists

creating art.

602

:

But Bill, like you said, it's

getting to a point where you can't tell

603

:

what's made by a human,

what's made by machine anymore.

604

:

And even a year ago, you could.

Now there's still

605

:

some stuff in there right now, today,

I think if you really take a look at it,

606

:

you can pick it out.

607

:

But at the rate it's going another year,

that might not even be here either.

608

:

So now you look at industry

and things like graphic design

609

:

or commissioned artists or anything else

that is a moneymaking venture

610

:

of being an artist,

is that going to affect those jobs?

611

:

Well, yes, it certainly will.

612

:

And it certainly already has.

613

:

Now, I know from my own standpoint

614

:

that some of the stuff that we work in

for clients and that type of a thing,

615

:

a lot of the areas that I would

experience are things like logos

616

:

and other assets like that, things like,

like Jeremy would have produced back

617

:

when,

you know, he was around and doing that.

618

:

I know from

619

:

my own standpoint,

I have tried.

620

:

I have yet to go to an AI and say, hey,

this is what I want, make me a logo.

621

:

It does, but it lacks something.

622

:

It's just not the creative component

623

:

that would come from

"this is what I want it to be" is not there.

624

:

So what ends up happening?

625

:

Say you're wanting to build logos

626

:

for a lot of different stuff

and don't care about that.

627

:

And yeah, now you've just displaced jobs

628

:

from the graphic designers

that would normally have been doing it.

629

:

But there's another part of this

630

:

where, the way I've been using it,

I've been able to get it

631

:

to come up with concepts

I like that I wouldn't have been able

632

:

to create on my own.

633

:

And then we have been handing those off

to an actual graphic designer

634

:

that then takes it and makes the real logo

and that kind of stuff.

635

:

So again, this is something

that has a lot of moving parts

636

:

because it's changing so much.

637

:

So where we stand today will not be where

we're at a year, or even a month, from now.

638

:

And it's a good idea for these

professionals to keep an eye on that.

639

:

And I think that this is one area

where it would be nice if government

640

:

caught up a little bit not said, well,

you can't use AI for artwork, but

641

:

definitely said that you have to declare,

642

:

you know where this stuff originated.

643

:

I noticed Gretchen, in publishing your

book and stuff and working with Amazon,

644

:

there's a checkbox

now asking, you know, is this AI generated?

645

:

You have to answer it. Some of the bigger

companies are already doing this,

646

:

and it's nothing against it.

I think it's an amazing tool.

647

:

I think it is something that I have found

has really helped me with programing

648

:

and other things.

649

:

But there are hallucinations

and that kind of stuff,

650

:

and you can't take everything, as you say,

as just, here it is.

651

:

A lot of people do, like

looking at the backup camera and nothing else.

652

:

Knowledge.

653

:

You see articles and stuff that are

written by AI that have not been vetted.

654

:

You can tell.

You can tell most of the time.

655

:

And also

656

:

at the end of the day,

it is going to replace a lot of

657

:

outside commercial artists.

658

:

Yeah, a lot of the commercial people.

659

:

We're going to see that change,

660

:

and I think we're going to see a change,

just like when we went from,

661

:

you know, using Movable Type and Blue Line

and all of that kind of stuff

662

:

to having computers.

663

:

There was a huge change in the industry

at that point, too.

664

:

You used to have typesetters and people

who would create these layouts

665

:

and things, and then all of a sudden

the computer did it right.

666

:

And so that was a big change.

667

:

Now, some of the other areas. Just

668

:

doing some research for this discussion,

what I found is

669

:

it is actually adding, in many ways,

where it is

670

:

giving additional artistic endeavors

that wouldn't have existed before.

671

:

Things like: you list your house

on the market, and AI creates the rooms

672

:

so that you can see what it would look like

with furniture in it, that type of thing.

673

:

Now, this is not something

a graphic artist would have done.

674

:

It's not something that would have been

a thing before AI was here to do that.

675

:

So that's not replacing a job, that's

just adding on to what it can do.

676

:

So I think, to be perfectly honest,

are we going to see job loss?

677

:

Have we already, in artistic and graphic arts,

678

:

Things like architecture.

679

:

Yeah we are.

680

:

And at the very least,

it's going to change it.

681

:

There will still be commissioned artists.

682

:

There will still be people

that create art for art's sake.

683

:

But all through history,

it's been known that it's very rare

684

:

that an artist is rich.

685

:

Yeah, it tends to be, you know,

after their lifetime that things become valuable

686

:

and that kind of stuff.

687

:

And in the last 40 or 50 years, artists

have been able to monetize their work.

688

:

And I think that one thing

that will happen with

689

:

this is we're going to have what

I call a SAG moment.

690

:

What I'm talking about

there is about two years ago

691

:

now when the Screen Actors

Guild went on strike

692

:

because they were trying to use

AI to replace actors in movies.

693

:

And today,

there needs to be some rules

694

:

that we live by to make sure

that we aren't eliminating artists,

695

:

because that's not a thing that you want,

because art is important,

696

:

real art is very important,

and that's something that can't go away.

697

:

And if it does, that would

really diminish society in my opinion.

698

:

You know, I agree.

699

:

So that being said, I think definitely

700

:

what we've been talking about here

is that it is a tool.

701

:

It is a very useful tool

when used in the right way.

702

:

But you go to college and try to use

703

:

AI to write all of your papers,

even if you get away with it,

704

:

you're not accomplishing anything

because yeah, you got to

705

:

maybe get the degree,

but you're not going to know

706

:

what it is that you have a degree

in. Not really.

707

:

You're not going to understand

708

:

and be able to produce

when you're out in the real world.

709

:

Yeah, yeah, yeah, yeah.

710

:

Some of the dangerous parts of this

711

:

are talking about jobs

and all that kind of stuff.

712

:

Some of the more dangerous parts

713

:

are the bad guys, hackers

and so on, using AI for those sorts of things.

714

:

And one of the things is,

as with all these scams,

715

:

we've talked about it

many times: you get an email saying,

716

:

you know, there's a charge on your credit card

and it's going to post,

717

:

and if you didn't really make the charge,

you'd better call us.

718

:

And there's a phone number.

719

:

Now, even a year ago, you looked at this

and you could see some things

720

:

that were out of whack.

721

:

You know, the logo might be wrong or

in the wrong place, or something's misspelled.

722

:

That kind of thing just didn't

look right. Today, with AI being used,

723

:

It can look pretty much perfect.

724

:

You don't notice it anymore.

725

:

So from those type of standpoints,

you have to take a different approach

726

:

that if you need to call about your credit

card,

727

:

you never, ever use anything

that's in a message that you've received,

728

:

whether it's text or email or a website

you're sent to. Call the bank

729

:

with the number

730

:

that's on the back of the actual card

or a bank statement or something,

731

:

and check into it that way,

732

:

because the number they give you

is going to put you in touch

733

:

with the bad guys that sent you the email,

it's not going to do you much good.
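The rule being described here, trust only contact details from the physical card or a statement, never anything from the incoming message, can be stated as a simple check. This is a hedged toy sketch; the source labels and function name are made up for illustration, not any real fraud-prevention API.

```python
# Toy sketch of the advice above: a phone number is safe to call only
# if it came from a trusted source. All source labels here are
# hypothetical illustrations.

TRUSTED_SOURCES = {"card_back", "bank_statement", "official_app"}

def safe_to_call(phone_source: str) -> bool:
    # Numbers pulled from emails, texts, or linked sites never qualify.
    return phone_source in TRUSTED_SOURCES

print(safe_to_call("card_back"))      # True
print(safe_to_call("email_message"))  # False
```

The point of phrasing it as an allowlist is that anything not explicitly trusted is rejected by default, which matches the advice: the burden is on the source to be trustworthy.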

734

:

It's going to do you very bad, you know.

735

:

Yeah, yeah,

736

:

And it's interesting to me too,

because the banks aren't

737

:

completely getting this.

738

:

And this is a true story. About

six months ago, I was online

739

:

ordering some stuff for an arcade

cabinet, and I got on Facebook.

740

:

And there was this listing where there were,

you know, some accessories on sale.

741

:

It was 11:00 at night.

I was going through.

742

:

I did not check it. This was my fault.

743

:

But I went on to go,

okay, this is a good thing.

744

:

I'm going to go ahead and order it.

745

:

Turned out to be a fraudulent site.

746

:

Facebook Marketplace does not vet

their advertisements very well.

747

:

And, you know, fortunately,

it was only like $90.

748

:

So, I realized that the next morning.

749

:

Okay, when I looked at the credit card,

there's a pending charge from something

750

:

that was in Chinese characters,

which kind of gave it away.

751

:

Yeah.

752

:

And this is what really got me.

753

:

I called American Express to let them know

there's been a false charge.

754

:

You know, right away.

755

:

Well, we can't do anything about it

until the charge actually posts.

756

:

But you need to call them back

and see if they'll remove the charge.

757

:

Call them? Really? The scammers?

758

:

And that's what I was told to do.

I couldn't believe it. Yes.

759

:

So, Yeah.

760

:

So there's a lot to be learned.

761

:

Both business

and personal things on this kind of stuff.

762

:

And things that we're going to have to

deal with as this continues down the road.

763

:

Now, the other side of this

is robotics and cybernetics.

764

:

Okay.

765

:

So the definition of a cyborg is

you have at least one mechanical implant

766

:

that senses

767

:

something on your body and manipulates

your body based on what it senses.

768

:

Okay.

769

:

That's essentially the definition.

770

:

So by strict definition,

a pacemaker would make you a cyborg.

771

:

Yeah. In my case,

I have an Inspire implant.

772

:

That's what I was talking about.

773

:

And some others. The Inspire

implant senses when I'm having a moment

774

:

of sleep apnea

and stimulates muscles to eliminate that.

775

:

No more CPAP machine, which has been

quite nice. But that is cybernetics.
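The closed loop just described, a sensor watching for an event and an actuator responding to it, can be sketched in a few lines. This is purely a toy illustration; the function names and the airflow threshold are hypothetical, not any real implant's interface.

```python
# Toy sketch of a sense-then-actuate loop like the one described:
# detect an event from a sensor reading, then decide whether to act.
# The names and the 0.2 threshold are hypothetical illustrations.

def apnea_detected(airflow: float, threshold: float = 0.2) -> bool:
    # Treat airflow below the threshold as an apnea event.
    return airflow < threshold

def run_cycle(airflow_samples):
    # For each sample, decide whether to stimulate (True) or not.
    return [apnea_detected(sample) for sample in airflow_samples]

print(run_cycle([0.9, 0.15, 0.8]))  # [False, True, False]
```

The same sense-decide-actuate shape underlies a pacemaker too, which is why both fit the cyborg definition given above.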

776

:

Now, where this is going on down

the road is we're seeing

777

:

a lot of the technologies

we have in medical

778

:

that are essentially being upgraded.

779

:

Bionic knees, bionic hands.

780

:

Okay.

781

:

So arm and leg extremity

782

:

replacements, something

that's been around for a while now.

783

:

But they are mechanical.

784

:

They're getting to a point now where

the machine can interface with your brain

785

:

and be able to react

and interact in the exact

786

:

same way that your natural leg or arm

or whatever would have been able to do.

787

:

We're seeing this in

things like arm replacement

788

:

and other stuff of that nature,

and this is somewhere this can

789

:

really build quality of life,

that type of thing.

790

:

Now, of course, when you go there

you can add sensors

791

:

and different things that wouldn't

normally exist in the original bio,

792

:

you know, material.

793

:

And one of the things that I think is

interesting is a concept called smart

794

:

prosthetics: network-controlled,

adaptive, and future-ready.

795

:

They connect to the 5G network

on the device.

796

:

Cloud computing and machine-based

neural learning help

797

:

to improve control and adaptability.

798

:

On one side of it, that's great,

but on the other side,

799

:

what could possibly go wrong

when somebody gets hacked and goes,

800

:

oh no, I've got restless leg

801

:

syndrome. Seriously, seriously.

802

:

it's.

803

:

Yeah, I do think people

804

:

I think cyberpunk kind of describes

805

:

that because, you know,

you can hack somebody's cybernetics.

806

:

Yeah.

807

:

And it certainly would be possible,

you know, with true cybernetics.

808

:

You're going to have a control

unit of some kind built in.

809

:

What is it.

810

:

What is it?

811

:

In cyberpunk, a netrunner pulls out a cyberdeck,

and there's other names for it,

812

:

but it's kind of

your central control system

813

:

that interfaces between your biological

and your machine parts.

814

:

And yeah, you would want to protect that

at all costs that somebody can't get in

815

:

and hack you. And this possibly

is already a thing; it's getting there.

816

:

So then the next thing that comes up

from that kind of stuff is this argument

817

:

that's coming out.

818

:

That would have been something you would

have seen in a movie not so long ago,

819

:

is what is a prosthetic

and what is an upgrade.

820

:

So right now,

today, I can go have my wrist replaced.

821

:

There's a site online that does that.

822

:

My wrists are fine, but I could still go

have them replaced with something

823

:

that would have more power,

you know, that kind of thing to it.

824

:

And you can get them even where

they're skinned to look like Iron

825

:

Man or whatever you would want it to be.

826

:

Now, is that elective surgery?

827

:

Are you upgrading your body?

Is it a prosthetic?

828

:

You know, what are you actually doing?

829

:

I don't like using the cyberpunk analogy

with these types of things,

830

:

but used the wrong way.

831

:

You could go down a very dark road

with that.

832

:

And the technology is here,

833

:

you know, basically now it's either

being developed or it's already here.

834

:

So it's

a situation of like any technology, it's

835

:

not about the technology itself.

836

:

It's about what you do with it.

837

:

Talking about something

from the last century, nuclear energy,

838

:

this type of a thing. From one standpoint,

it's a bomb that can blow things up, destroy

839

:

life on Earth. From another one,

840

:

It can produce clean energy

until your reactor melts down

841

:

because you built it wrong or something.

842

:

But it can produce energy

843

:

and do some things

from that kind of a standpoint.

844

:

So there's, you know, uses for it. Some of

the technology itself is benign.

845

:

It's what you do with it.

846

:

And that's

I think that's a lot of the same thing

847

:

with implants

and all of these type of things. Now,

848

:

you know, I've been asked in the past,

but what do you think about it?

849

:

I'm all for it.

850

:

I'd be, not cyberpunk, but I'd be

the person that would go in and,

851

:

you know, change what I can change

and all of that.

852

:

And I wouldn't have a problem with it.

853

:

As long as number one, I have complete

control over it, I can turn it off.

854

:

I know where it was manufactured.

855

:

And my goodness,

if it's online, I have, you know, complete

856

:

control over that system as well,

my friend.

857

:

What you'd want is for the

items to be constructed

858

:

by rational people

who aren't trying to make a quick buck,

859

:

people who care about whether you're going

to have an allergic reaction,

860

:

whether it functions

well with the rest of the body.

861

:

These are all really important,

complicated factors to think about.

862

:

And so I kind of deal with some of this

in my cyber hock books.

863

:

Yeah.

864

:

And I have to think about what

kind of problems good and bad can occur.

865

:

And you know, at the end of the day,

I think it would be other cyborgs

866

:

that would maintain it,

because you'd also have to have someone

867

:

that knows what it is

to have that kind of equipment, you know?

868

:

Yeah.

869

:

And obviously people you can trust,

I mean, this is all stuff

870

:

that we are going to have to address

and deal with.

871

:

And one thing

872

:

that you could do right

now is something called biohacking.

873

:

And this is using implanted technology,

very basic implant technology

874

:

in some ways to do certain things

like start your Tesla,

875

:

you know, or things of that nature.

876

:

I think with some of the cybernetics

that are coming out,

877

:

you'll be able to be the key to your car

if you want.

878

:

You know, stuff like that.

879

:

Some of this is already possible.

880

:

Some of it is going to be here

very, very soon.

881

:

The technology to do it

actually already exists.

882

:

It may just not be in a way

that you can get to it just yet.

883

:

So, you know, you look into this

884

:

and you have to start talking

about the ethics,

885

:

the moral construct for different people,

because different people

886

:

are going to think of these

in different ways

887

:

what it can do,

what it can't do, and all of that.

888

:

And that's talking about cybernetics,

889

:

where you still have a human involved

in some capacity.

890

:

Now the next one is robotics.

891

:

There are a number of different companies

892

:

that are really doing

well at producing robotics.

893

:

I mean, you know, you've got,

:

like Spot the dog

895

:

if you've ever seen that one,

by Boston Dynamics.

896

:

Yeah. The Boston Dynamics. Yeah.

897

:

It's not really a dog,

but a creature that can run around, and

898

:

they're going and kicking it over

and all this kind of stuff.

899

:

And I'm thinking,

oh, that's not a nice thing to do.

900

:

And, you know, but no, it's not.

And then it gets back up.

901

:

Oh yeah. So

but these things are robots.

902

:

They have a program that's running them.

903

:

They're operating off of that.

904

:

And that's basically the end of it.

905

:

But what you're seeing is direction

where we're going to be essentially

906

:

combining cybernetics, bionics, robotics

and AI, and we're going to end up

907

:

with something very different,

you know when that happens.

908

:

So you know

from that kind of a standpoint,

909

:

you know, the question is, as we reach out:

where is this going to end?

910

:

Well, you know, Iron Man. I mean,

911

:

Gretchen, you are learning to fly

an actual real jetpack.

912

:

That does work. Yeah.

913

:

You know and it's usable.

914

:

Yep. The technology, before 2017,

when Richard Browning

915

:

actually started working on

this, was completely,

916

:

completely science fiction. With the astronauts

in the 1960s,

917

:

They did have a jetpack

for a little while.

918

:

They were working on a few problems.

It could only stay in the air for 30 seconds,

919

:

for one thing, and it was not very user

920

:

friendly in the sense of being able

to steer it and all the rest of it.

921

:

Gravity's packs.

922

:

And I can say from personal experience,

923

:

you know, it

is something. I'm not there yet.

924

:

Hopefully

someday I'll learn it. But at this point,

925

:

Those that can fly these things, well,

926

:

It looks just completely effortless.

927

:

Yeah, it does look effortless.

928

:

It's impressive.

929

:

Now watch some of those videos

and it makes you wonder, oh, is this real?

930

:

Just like with some of the robots

that I've seen, I wonder is that

931

:

I know the Boston Dynamics ones are real,

but then I see some of these other ones

932

:

from Asia and I go, are those real

or are they messing with us?

933

:

You know, if

934

:

and that's really what it comes down to

because you're, you're,

935

:

you know, you're talking about something

where

936

:

again, is what you're seeing propaganda

or is it something that's real?

937

:

And enough of this stuff exists

938

:

now that is real,

that you can't just write it off anymore.

939

:

And the second part of it is, okay,

you know what's going on with it.

940

:

What happens when the AI has to make

the decisions instead of the human?

941

:

Well, we're seeing this in what they're

calling self-driving cars.

942

:

I don't think we're completely there yet,

but we will be someday.

943

:

But when the car has to make the decision,

okay, I'm going to get in

944

:

an accident.

And then I don't have a choice here.

945

:

Do I hit the human in the crosswalk?

946

:

Or do I hit the guy in the wheelchair

on the sidewalk?

947

:

This kind of thing

has already started coming up.

948

:

And so you have a machine that's going

to have to make a decision like that,

949

:

and that could be a problem, you know?

950

:

And the reason I say that is because

that's one standpoint, where that's

951

:

something that would be accidental

and that you wouldn't want to have happen.

952

:

All right. Let's take another step.

953

:

We all know that drones are being used

more now.

954

:

And AI has started

to be used to run the drones.

955

:

And what they're doing is, the human,

or in some cases with an implant, a cyborg,

956

:

I'm going to just use that word

because that's what it is, is controlling,

957

:

a set, a group,

a unit of whatever you want

958

:

to call it, of robots, whether that's air

or ground pounders or whatever.

959

:

And then the AI actually controls

the individual unit.

960

:

So it's like, a commander controls the AI,

961

:

but the AI makes

the on the ground decisions

962

:

like something you've seen out of a movie.
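The two-layer structure just described, one commander broadcasting a high-level objective while a per-unit AI makes the on-the-ground moves, can be sketched as a toy. Everything below, the function names, the "advance"/"hold" orders, the one-dimensional positions, is a hypothetical illustration of the idea, not any real system.

```python
# Toy sketch of the command hierarchy described: the commander issues
# one objective, and each unit's AI turns it into a local action.
# All names and behaviors here are hypothetical illustrations.

def unit_ai(objective: str, position: int, target: int) -> str:
    # Local decision: advance toward the target, otherwise hold.
    if objective == "advance" and position < target:
        return "move_forward"
    return "hold"

def commander(objective: str, unit_positions, target: int):
    # High-level layer: one order goes out, local decisions come back.
    return [unit_ai(objective, p, target) for p in unit_positions]

print(commander("advance", [0, 5, 9], 5))  # ['move_forward', 'hold', 'hold']
```

The design point matches the discussion: the human (or cyborg) never steers individual units; they only change the objective, and the per-unit logic does the rest.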

963

:

Right? Exactly.

964

:

I, I don't know, I was having

a conversation about this with a think tank

965

:

a while back, and I said, well, you know,

in my case right now, I've got to decide.

966

:

I'm already a cyborg, okay, by definition.

967

:

So if this keeps going,

do I want to be a T-800 or a T-1000?

968

:

If you haven't seen Terminator yet,

969

:

you know,

and then they got into a big argument.

970

:

Is the Terminator

actually a cyborg, or an android?

971

:

You know, because it's,

the T-800 is the one with the, you know,

972

:

melting skin and all that.

973

:

I'm not sure there are

biological components.

974

:

Yeah, that's a good question.

975

:

So when are you going to

end up like General Grievous?

976

:

Which is that, you know,

977

:

Star Trek or Star Wars? Or Star Trek,

before they blew up the Enterprise

978

:

in every movie recently. Or Star

Wars. Star Trek is idealistic.

979

:

Star Wars is more probably

what the reality would be.

980

:

Yeah. You know, in my opinion.

981

:

All right.

982

:

So we're almost at the end here,

but this is a good conversation.

983

:

And again,

touching on a lot of the questions.

984

:

So, I mean, please continue sending them in.

985

:

We'd love to do a Q and A on this.

986

:

I'm going to try

to also get a guest on who makes

987

:

some of the biohacking products to talk

about that a little bit and what it is

988

:

and where you can go with that

and just see where it is.

989

:

It's going to be interesting

to see where we are

990

:

when we're closing the year next year,

see how much this has changed.

991

:

And if it's the three of us really here,

992

:

or if you're just going to be listening

to three AIs.

993

:

And until then, this is User Friendly

994

:

2.0, keeping you safe on the cutting

edge. User Friendly 2.0.

995

:

Copyright 2013 to 2025

996

:

by User Friendly Media Group incorporated.

997

:

All rights reserved.

998

:

Content is the opinion

of the show's participants

999

:

and not necessarily this station

or platform.

:

Requests for material use,

:

interviews, CCPA Privacy Notice

for California residents,

:

GDPR information for UK and EU residents

:

and any other feedback

may be submitted at userfriendly.show.

:

We welcome your input.

Thank you for listening.
