FanExpo Portland, AI & Online Tracking
Episode 5 • 24th January 2026 • User Friendly - The Podcast • User Friendly Media Group
Duration: 00:49:29


Shownotes

Join us this week as we talk Comic-Con and welcome guest Dr. Vin Mitty.

The team had its first experience at a FanExpo since Wizard World was replaced.

Meet AI Expert Mitty, who provides guidance on distinguishing what's real from what's hype.

Finally, we talk about common sense ways you can minimize your trackability.

William Sikkens, Bill Snodgrass, Gretchen Winkler

Transcripts

Speaker: Welcome to User Friendly 2.0 with host Bill Sikkens, technology architect.

And this is User Friendly 2.0. Thank you for joining us this week. Joining me are my co-hosts Gretchen and Bill. Welcome to this week's show.

Hello there. Hello.

So before we get going any further on this, I give my pledge that the captioning of the show will no longer spell my name wrong with different letters, and will call Gretchen "Gretchen." We got a little bit of... Yeah, we got a little bit of listener feedback on that. And, you know, we talk about this on the show all the time, so I have no excuse for it. We must not just let the AI transcribe the show; I guess it actually has to be reviewed by you.

So anyway, on that note, we do offer a transcript on all of our shows. We've been doing that for a while, and it's something that seems to be appreciated across the board. So, you know, let us know when something like that happens. Sometimes these things do get missed, and we try to do the best we can, but we're a small staff. At the end of the day, we rely a lot on you as our listeners to give us that type of information, and also what you want to hear us talk about. So that's where all this comes from.

And on that note, we have an interesting show for you this week. In the second segment we are going to be talking to an expert on AI, who is going to go into some detail on how to know what's hype and what's real, which is becoming more and more of an issue these days. We're also going to be doing a follow-up on our Tech Wednesday from this week, to talk about some ways that you can reduce online tracking. These are the ones where you don't have to be paranoid and go off the grid and live in the middle of a forest, although I think that might be the only way to eliminate tracking entirely. But there are some things you can do to minimize it. And finally, after our news and our first segment, we are going to be talking about an expo in Portland that we had the opportunity to go see this last week. So we've got a good show coming up for you. With no further ado, let's jump into the news. What do we have this week?

All right. Star Wars reveals a new look at Darth Maul's 2026 return.

Yeah, I thought it looked cool. He looks fiercer, in my opinion.

So yeah, I think so. And that's a cosplay I wouldn't mind doing either. So, you know, it's kind of good to see this.

Now, the more horns you have on your head, the better, right?

I guess. I like to play... in order to another thing. So anyway, depending on how difficult they make the clothing... I haven't gotten a chance to really look at it. It looks like it could be interesting, too; the cloak and stuff is kind of... I don't know, it just looks cool. Anyway, what this is coming up for is we're going to see Maul return this year in Maul: Shadow Lord, and, after the Clone Wars and the prequel era, they're bringing him back.

I said yes, yes.

Okay. Just wanted to make sure. I thought you said something else that I was gonna have to bleep out there for a minute, but no.

Well, you'll have to let us know what you think of it. Check out his picture, and, you know, obviously this news content is always very factual and unbiased. Except when I want to give my opinion. So in this case, I like it. But let us know what you think.

Oh yeah. None of us have any opinions.

Of course not. You know, not any more than the major news outlets. So I'll just... I'll leave it with that. And that's my excuse, and I'm sticking to it.

Sun releases S4 severe solar radiation storm, largest in decades, making northern lights visible across the world.

And I finally saw them last night.

You saw them? What time?

I saw them, yeah. I wasn't able to sleep, woke up about 3:00, and was just doing some stuff and looked out the window. It was really quite beautiful, too.

Oh, I missed it.

I bet it was about 3:00. Somebody else said around three as well.

Yeah, it was around 3 a.m.

I will add to it that we're in a unique window here up in Oregon. Normally this time of year it's raining and gray, and we've had some weather where it's been cold but sunny, with clear skies. So I think that was part of why I was able to see it, because it wasn't overcast. But it was interesting.

I mean, I see why people thought that was interesting. It would make, like, ancient cultures think that was, you know, an act of the gods, because it is really quite unusual and pretty. But watch your electronics. This definitely causes havoc with planes, radio transmissions, all that kind of stuff. Here on the planet, the more detailed, more intense electronic systems can have some complications, but this does affect satellite communication and that stuff a bit more. And I know, back when my dad used to be a ham radio operator, an amateur radio operator, when we'd have these solar storms, he used to have problems with that, too. So it does affect electronics, a little bit more sometimes than others. And we're supposed to be getting away from solar maximum, but this is the largest storm, as it says, in a very long time. I know back when they first started having electronics, such as teletypes or telegraphs, solar storms had quite an interesting effect. If you ever want to look up something really interesting, check that out. I think, you know, it's weird to think, but it is interesting how much the environment around us does directly affect everything.

And in ancient times, if you were kind of coming at it from ancient times, you would look at this and it was a beautiful display in the sky, but you didn't...

Yeah, at least I don't think the Vikings had, you know, digital communication or quantum computing.

Maybe they...

Yeah, I wouldn't put it past them. Like, you know, at that time, it doesn't work either way. And like you say, today it's interesting to look it up and see some of what this has created.

All right. Security risks and hardware advances start to matter for investors.

I'm surprised that wasn't mattering to them earlier. But explain this.

Yeah, I kind of have to agree with you on that, in the way that headlines work. What they're focused on here is specifically the onset of quantum computing. And while I do think it's important, security risks should have mattered all along, and they do, usually after the fact, unfortunately, in a lot of the bigger companies. But what they're looking at here is that we're going to be transitioning into quantum computing. It's here right now. It's extremely expensive and still a prototype, but that won't be the case forever, probably not even a long time. And more than just being kind of a really unusual, pretty chandelier, these kinds of devices are going to change, fundamentally change, the way that we interact with things. Now we just go to the internet, browse websites, that kind of a thing. But the ability with a computer like that to hack a password or something is going to be child's play. So security, you know, and that type of a thing, it's not just logging into your, you know, account at the grocery store and ordering food or something. It's the fact that back-end systems that maintain databases and all these other types of things classically are also set up on a username/password combination. Now, in recent years, this has become more and more of a problem. So adoption has been moving a lot more on the back end of things, and on the front end, the consumer side, what we're starting to see is encrypted ways to be able to log on, that kind of stuff. Multifactor authentication is a big one that we all use. Now, that does help. It's not the end-all, be-all, but it's better than not having it.
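For listeners who want to see what's under the hood of those authenticator codes: most of them use TOTP, time-based one-time passwords, where a shared secret plus the current clock yields a short-lived code. A minimal sketch, assuming Python and the third-party pyotp package (nothing specific to any service mentioned on the show):

```python
# TOTP sketch: the scheme behind most authenticator apps.
# Assumes the third-party pyotp package (pip install pyotp).
import pyotp

# The secret is normally provisioned once, e.g. by scanning a QR code.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)  # 6-digit codes in 30-second windows by default

code = totp.now()
print("Current one-time code:", code)

# The server holds the same secret and recomputes the code to check it.
# valid_window=1 tolerates one step of clock skew between the two sides.
print("Verifies:", totp.verify(code, valid_window=1))
```

Because the code rolls over every 30 seconds, a stolen password alone is no longer enough, which is exactly the point being made here.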

So, you know, as we move in that direction, I think, probably, if I was to make a guess on this, in the next two years, probably less, we won't be using passwords anymore, because they will be obsolete. They have been for years, but it's going to get to a point now where it's not workable anymore. So, you know, we look at other ways to do that. But investors are looking at this because the technology is getting out the door. And like I say, it's a prototype right now in a lot of ways, but it won't be for too much longer, given the potential capability of what these things can do. And they use less energy. You know, we've got the problems with the data centers and AI sucking down the power grids and that kind of stuff. So it's going to change a lot of that, if it's done well. But it is going to mean that we have to change the way that we do some of the things that we do online, if we want to stay secure.

In good news, Kathleen Kennedy's exit interview as she leaves Lucasfilm.

Yes, and my own biased opinion, too.

I would have to agree with the way you read that headline. So, okay, to be fair, actually, to go back to the beginning of this: Kathleen Kennedy has been the head of Lucasfilm since George Lucas sold it to Disney, back in, I believe, 2012. And there are two schools of thought on the way that she has been running the company. And this is actually a very divisive thing among some people, because there are people that love her. And, you know, those people are entitled to their opinion. Again, I'm not editorializing or giving my opinion here, but, you know, even though their opinion's wrong, they're entitled to it.

I thought maybe they just needed to see a mental health provider.

Yeah. So yeah, there you go. You know, now, in all honesty, she actually has done some incredible stuff. I mean, if you look at it from an investment standpoint, the films have grossed a lot of money. In some cases, I think one of them did the most in history, or close to it, or something like that.

But was that, like, Rogue One?

No, it wasn't Rogue One, but Rogue One was definitely a sleeper. It did a lot better than they were expecting.

Yeah, it was a good movie.

It was a very good movie, you know. So the thing of it is, it's not like she's all bad or anything. I don't think that. But it is definitely the case, in my opinion, that she's taken Star Wars in a direction that would have been different from what the founder of Star Wars would have wanted to do. Do you disagree with me on that, Gretchen?

No. And I think Dave Filoni being the choice to take over is appropriate, because it feels like Dave Filoni understands the spirit of the direction that George Lucas originally was going.

I don't know, do you have a different view, Bill, from your standpoint? Go ahead and weigh in on this, because, like I say, there's a lot of opinions out there about this.

I think she had a few good runs, but she's also had some pretty bad flops. Like, you know, The Acolyte? Please. We don't want it.

We have to be radio friendly here. Anyway. Swear words.

Yeah.

Now, one thing I will say, if you look...

Oh, go ahead, Bill.

But Filoni has definitely... What he has done, the works that he has touched, have definitely felt like Star Wars. For me, it has felt like Star Wars. And as Gretchen said, he's taking the reins. And I think it's gonna be interesting. We got to see him at Comic-Con in San Diego four years back. An amazing person, he just is. You know, I have the poster.

Yeah. Oh, yeah. And to Kathleen Kennedy's credit, I think as a producer she's done some good stuff. Schindler's List, The Sixth Sense; I mean, these are all films that are well liked and accepted. But she didn't write them. She produced them.

And I think that's the reason why she was picked, because she was a good producer. She's not really what I would call the leader of a franchise.

Yeah, I think she's just been producing stuff.

Yeah. And that's just... we all have the things that we excel at, and when you can, you know, focus on that, you do a very nice job. And she's done a very nice job in these things. She said in her exit interview that she's planning to go back to producing, and I'm looking forward to seeing some of the stuff that she's going to create. I also very much look forward to the direction Star Wars is going to go now with Dave Filoni at the helm.

Me too.

All right. Is the check I got from Amazon a scam? At least that's the headline.

Yeah, this article is a little bit out of date right now. This came into the news cycle back at the beginning of January, and we were still on our holiday break at the time, which is why we didn't cover it at that point. But it's worth talking about, because we've got a lot of questions that have been coming in on this. And as we all know, there's a lot of people out there that try to scam you. I mean, you know, I'm sorry to have to disclose that, but it is definitely the case here, and you want to be very careful. When you do get a random check in the mail, even if it purports to be from Amazon, it's good to take that with some idea of risk, and at least check and see what's going on.

But yes, these are real. And what's happening is it's part of a settlement between Amazon and the FTC, the Federal Trade Commission. It goes back to the way that they were marketing Prime: they were doing some stuff where the allegation was that it was a practice to try to trick people into joining Prime. You know, some of the things like that, making it a little bit obscure what you're getting. Again, those are the allegations that were made, and the Federal Trade Commission decided that that was not an appropriate thing to do. So the settlement includes paying this out, class-action style, which, you know, means the people that have been injured by it will probably get, you know, free fries at McDonald's or something.

There's something about that... I'm not a huge supporter of class actions. It seems like you never really get anything from them.

Yeah, but the positive side of these types of things with these big companies... Like, there's one that happened about two years ago with Apple that was a settlement for Siri eavesdropping on people. And I think the settlement on that was a couple of bucks, maybe. I don't know, I could be wrong. I do remember when the whole CD class action thing happened; the check I got was for $0.96, I think. But what ends up happening is it does make the issue public. And while, you know, $95 million at Apple is, you know, an accounting error, really, it does actually get them to maybe make some fundamental changes. Unfortunately, in the technology world, it seems like by the time these things come out and actually get adjudicated, they're long since on to something else, and they're just, you know, trying to close the door on whatever happened in the past.

But yeah, to that end, and I digress a little bit with these other things, the check itself: it's real. If you have any concerns about a check, check with your bank before you deposit it. And the other thing of it is, when using checks and other obsolete methods of payment like that, you do want to be a little bit careful, because if you do get a fake check, or a check without money behind it, a bounced check, all that kind of a thing, even as the recipient that can cost you money, because you can have overdraft fees, bounced-check fees, all that kind of thing. So it is definitely something to be concerned about. And if you ever sell something and they give you a check for more money than what you sold it for, just don't. Just don't. It doesn't matter what it is; that is a huge red flag. That's a big scam that also goes along with these things: you deposit the check...

Gretchen, I almost had this happen with a car we sold. We talked about it last year when it happened. But you go and deposit the check, and it's going to be probably a couple of months before it actually comes back. And by that time, what you sold is long gone. The extra money that the buyer wants back, to help with, you know, whatever it is they usually come up with, is long gone, and you're left holding the bag. So there's nothing wrong with being careful about these things.

And you know what? That car is now safely in Japan, in a museum.

So yeah. Yes. Right where it belongs.

New York City mayoral inauguration bans Flipper Zero and Raspberry Pi devices.

Yeah, this is an interesting thing. So when you go to these events, there's a list, and one that very well should be there, of things you can't bring in: giant backpacks, or obviously any kind of weapons, anything like that, you know. So at the end of the day, it does make sense. You want to keep things secure. And unfortunately, we live in a world where, if you didn't do that, you'd have a lot of problems. But what I thought was interesting about this is naming these devices by brand.

So a Flipper Zero is officially a device that you would get to test your network and make sure it's secure, and to test other resources you might have, to make sure they are secure. But I have heard some rumors that this has been used for hacking, and I don't know, but I guess someone somewhere used it for the wrong reason. That, coupled with the fact that the electronic pet that's built into the thing gets mad at you and leaves if you don't hack enough... I mean, if you don't test your network enough. And then Raspberry Pi is a reference to a small kind of computer that does a lot of different things; they're used as microcontrollers and that kind of thing. So for whatever reason, you can't have either one of those devices.

Now, there's a million clones of both of them that do the same type of thing. There are other devices like a Raspberry Pi, especially. There's, I wouldn't say knockoffs, but there's generic versions of it, and there are knockoffs, and all of them work, you know, that you could bring in and have set up, inevitably working the same way, but not actually be a Raspberry Pi. You know, this would be something along the lines of saying that you can't bring in a certain brand of knife, but if it's not that brand, now it's okay to bring it in. It's the exact same thing. You know, you can't bring a tanto knife into the event. Well, this one's not a tanto. Oh, then no problem. Yeah, help yourself.

Now, you want to sell them on the side? No, it is a little weird. And I think what happened is they were concerned about hacking devices, which I don't know how valid of a concern that needs to be at an event like this. But even if it is, you would have to ban the device class for this to make sense, not a brand, you know. So, yeah. And then your security people would need to all know what a hacking device looks like. And is that really something your regular security people are going to know? I would say probably not. And when you have a large quantity of people, are you going to be able to? These are also small devices, too, so you're going to have to go through stuff and try to find them. It's, you know, the same footprint as a cell phone, really, so it's not something that would even be that easy to do.

But if anybody listening knows more specifically why this was done, or why the brands, please go to the website userfriendly.show and let us know in the comments, because I'd love to know if there's something I'm missing here. It just seemed quite odd to handle it in this way. So anyway.

All right, well, to change topics: we had an opportunity to do something that we haven't done before, and that was to go to a Comic-Con. Or, we've done that before, but a FanExpo Comic-Con. Now, a little bit of history on this. We used to go to Wizard World, and these were events held across the country in different places. In fact, down in Reno, Bill, you had one there for a while when we were still there that we went to. Great event, you know; love Comic-Con and all that type of thing.

And yeah, Sacramento was another one.

Oh yeah, yeah, that's right, Sacramento, you know. So, by the way, don't walk in a fur suit, like a Wookiee suit, in Sacramento in July. Just don't. Don't ask me how I know. But anyway, even if you're going to the Comic-Con, get a cab, get an Uber or something with air conditioning, or you're going to be in an ambulance.

Anyway, outside of that, though, we got to see FanExpo. Now, the connection here is that FanExpo took over Wizard World a few years ago. Long story; you can go online if you want to dig into the reasons as to why, but one company acquired the other, and since then, just life circumstances and everything else, we have not had a chance to get to a FanExpo to actually see what it is. And we got there, and I enjoyed the afternoon. But Gretchen, I don't know, and tell me if you disagree with me, it was a very different experience than what we would have had at Wizard World.

And my, it was a very different experience. I think the layout of the event was very different from what I recall from Wizard World. Wizard World seemed like they had the stage in the middle of everything, so it kind of pulled everybody into this big happy location. And it was off to the side at this one, in the dark, kind of.

Yeah. I found it by accident, you know.

Yeah. So there were some things where it just felt very different. And the amount of energy that night, and some of these may have nothing to do with the production of the event, I don't know, but... the costumes.

Yeah, that's it. There was hardly any costumed people roaming around. And you didn't see any of the ultra-fabulous ones, unless they were hiding somewhere, waiting for the costume contest. I'm not even sure they had one. But I didn't see that costume that just, like, knocked you off your feet. And I remember seeing lots of those at these events. And so I wonder, you know, is it just because people are slowly recovering from Covid still, or has something changed? You know, it's hard to say.

Or the time of year, maybe, because I know when Rose City happens, we've seen costumes. I mean, that felt pretty good. And again, I hadn't thought about that; I don't know that they didn't have a contest, I just didn't look for one. And, you know, maybe that's all it is. But one of the other things that I would love to see there again is the esports; Wizard World had a focus on that, set up so you could come in and game and do all of those kinds of things.

Now, one thing I will say on this, and I know this is going to be a shock to anybody that knows me: although that was cool, the Portland Retro Gaming Expo had a room set up. And what this is, is they hold a convention in Portland in the fall that is all retro video games, cabinets, consoles, all that kind of stuff. Obviously, you know, I have no interest in that at all.

But yeah, sarcasm sign, right?

Just don't look at my garage. I'm trying to somehow figure out how to squeeze another cabinet in there, and so far the laws of physics are against me, but I'll win anyway. The thing of it is, though, they had a pretty cool setup in there, a pinball machine and, you know, some cabinets and people you could talk to, and they even had a Nintendo set up on a very 1970s console TV like you would have seen back then.

And yeah, I saw that.

Yeah, that was kind of cool, complete with a couch to sit on while you were playing it and everything like that. So, you know, there were some things like that that were pretty cool. And it's just interesting to see the difference in the look and feel of these things from event promoter, and time of year, and all the other stuff that goes into it. And, you know, we've produced special events ourselves in the past, not recently, but in the past, and these things are difficult to do; even being able to pull off a show, whatever it is, is amazing. It takes a lot of time, talent, and treasure to do it. And it's just, you know, the people that are involved, and the venue. Well, I can't say too much on that, because the Oregon Convention Center is the same place that Rose City was, but they were very different from each other. And this was very different from, yeah, you know, from Wizard World.

So anyway, good event. We had fun for the day, and we look forward to next year. We're about to take our break; we'll be back in the second segment to talk AI, and to talk about how you can minimize how many other people can track where you are. This is User Friendly 2.0. We'll be back after the break.

You see, he's from the future. He's got a really big computer, and he uses it every day. And he uses it, uses it, in every way. What do you... before you know, I'm not that sure, because he uses you.

Welcome back. This is User Friendly 2.0. Check out our website at userfriendly.show. It's your one stop for everything User Friendly. Send us your questions, your comments, check out our shows and some of the different things we're doing, and let us know what you think, at userfriendly.show.

All right, so, as promised in the beginning, we're going to be asking the expert here in just a moment about AI, fact and hype, and all the rest of that. And instead of me trying to give a bad introduction, let's just let the man himself give you his expertise. Let's go to our interview.

Joining us now: Vin, a data and AI leader with a PhD in the same field, and author of an upcoming book, The AI Decision Map. Welcome to User Friendly.

Thanks for having me.

So, let's go ahead and dive right in here, getting just kind of on top of all of this. Let's go ahead and talk AI, because, I mean, that's what we're going to be doing here. And I'm going to start here: why don't you separate the real value from the hype? Is AI actually in a bubble? This question comes up a lot. Or is this just a normal cycle of new technology?

Yeah, absolutely. So I think there is both. I mean, we see a lot of chat about AI, right? So I think it's both: there is a bubble, and also hype. But also there is real value. Right. So there's real value from AI that we can derive, but there are so many expectations that it's all inflated right now. We saw this during the dotcom bubble, right? Many companies, you know, failed during the dotcom bubble, but there was real value that we got from the dotcom era as well. So I think that's what we're going to see here. The bubble really doesn't kill all value; it just exposes companies that are doing AI for the sake of it.

Okay. So why do so many AI projects fail, even when companies invest a lot of money?

Yeah, I think, like we said, there's so much talk about AI, there's so much pressure on leaders about using AI, that they're all trying different things. But when it comes to the crux of it, AI adoption is human; it's not necessarily technical. Because people don't trust the data, people don't trust the tool, people don't see how it helps them. These are the cases where AI projects fail. You can have great algorithms and still fail if people don't feel confident in using them.

So, you know, generative AI as it is now is still a very new thing, just in the last couple of years, and a lot of people are only getting on board now. What is the biggest mistake people make when they start using AI?

Yeah, I think one of the big mistakes is people ask themselves, where can I use AI, how can I use AI, versus coming from a problem standpoint: hey, what are the things I'm trying to solve? And then understanding, is AI the right solution for the problem, or the right tool for the problem?

Where does AI actually work well today, in everyday life and work? So where does it work best?

Yeah, I think AI works best when there's a lot of mental overhead in a task. Right. So summarizing long documents, drafting first versions of a book, for example, or organizing information, taking notes. All of these things can help people when there's a lot of context to remember and you need to think fast. But it is important to note that we can use AI to synthesize all this information, but not necessarily to think for us.
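As a concrete version of the summarizing use case he describes, here is a minimal sketch, assuming the official openai Python package and an OPENAI_API_KEY set in the environment; the model name and file name are illustrative placeholders, not anything endorsed on the show:

```python
# Minimal sketch: using an LLM as a summarizing assistant.
# Assumes the official openai package (pip install openai) and an
# OPENAI_API_KEY environment variable; model/file names are examples.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("meeting_notes.txt") as f:
    long_document = f.read()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "Summarize the user's document in five bullet points."},
        {"role": "user", "content": long_document},
    ],
)
print(response.choices[0].message.content)
```

Any comparable chat API works the same way; the model does the synthesizing, and you still do the judging.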

You know, as technology marches along, I'm going to throw this out at you, because as people get into this, what are some of the ways regular people can use AI without feeling just totally overwhelmed?

Yeah, 100%. So I think the trick is to start small. You know, use it as a smart assistant for yourself, but not as a decision maker. Ask it to summarize, brainstorm, explain something in plain language. And you can, you know, you can ask it to explain 100 times, and it won't get mad at you, right? So I think, you know, you don't need to master it. Just use it where it saves you time. That way you can get slowly started and then, you know, go from there.

So why do people resist new technology, even when it's supposed to help them?

Yeah. I think people resist technology because it feels risky. People worry about losing relevance, right? People worry about making mistakes or looking uninformed. So resistance usually isn't about the technology itself; it's about the fear and uncertainty that the technology brings with it.

Along those lines, how important is trust? You know, where does that lead in here?

Yeah, I believe trust is everything. If people don't trust the tool or the outputs, they won't use the system. A lot of times, we've seen this in the corporate world: nobody comes in and says, hey, your tool is bad. They just stop using it, right? So if they don't trust it, you have to look for the silent signals. People just stop using it; you don't get loud feedback. That's kind of what I've learned over the years.

Okay. So what should leaders or organizations focus on before rolling AI tools out to people to use?

Yeah. I think the word that comes to my mind is clarity. Right. So we have to understand, come from a standpoint of what problems we're resolving, and is AI the right tool for it? You know, asking questions about: what decisions will this improve? Who will use it? What does success look like? These questions help you pick the right project and set you up for success.

What skills do people really need to stay relevant with AI, in an AI-powered world? Let's just call it that.

Yeah. I think we're already seeing AI democratizing a lot of technical things. Right. So coding, writing, all these different things that are very deeply technical have been democratized, or are being democratized, by AI. But I think the key skills that we all need to focus on are really the human skills, right? The soft skills. They're going to be more critical than ever: you know, the way you build relationships, the way you build consensus, being creative, and just being yourself. I think also knowing how to ask good questions matters now, because the technology will take care of how to do it. But you have to ask the right questions and get that clarity. That's going to be a good skill to have.

So what should listeners not worry about when it comes to AI?

Yeah, I think we don't need to worry about how these tools are built, or becoming very technical with AI, right? Like I said, it's democratizing a lot of things, which is great for the world. You don't need to use every new tool. I know there's a new tool popping up every day, a new "best" tool beyond OpenAI's ChatGPT. Don't worry about all of it. Pick the one that you like and just use it. We don't need to panic. I think AI rewards thoughtful, steady learning, and urgency is more of the bubble factor that we're talking about. But really, use any AI tool and it'll help you out.

But let's dive into something here. You're the author of the book The AI Decision Map. Now, it's either been released or is just about to be released. Can you tell us a little bit about what inspired you to write your book, and tell us a little bit about it?

Yeah, thank you. Yeah. We're planning to release it in January, mid to late January, just in the new year. The inspiration for the book was what I saw in the corporate world and in academia. Right. So every business leader is under so much pressure to do something with AI, but, like we discussed, the majority of AI projects fail. I worked in data and AI for decades and did my PhD in technology adoption, so this was the right time for me to do something, to share my thoughts. So I've gathered my thoughts into a framework so leaders can go beyond the hype, beat the bubble, and deliver real value with AI.

So if you wanted to make sure that our listeners walked away with one special thought, one something to hold on to, what would you want them to remember concerning AI?

Yeah. Again, don't worry about all the different AI tools. Fear of missing out is a real thing, and companies want us to feel that way. But I think just choosing a specific tool that you like and sticking with it, to help you with your day to day, would be super helpful. And for companies: don't jump into every project that you can think of. Ask yourself what kind of problems you're solving and what kind of risks, you know, that puts your company into, and then go from there.

So with your book coming right up here, tell our listeners, how do they find it? How do they find you? Is it at Amazon? What's the best way to get to it?

Yeah. We're going to have my book on Amazon. It's called The AI Decision Map. We're going to be launching on Amazon and in a few bookstores around the country. So yeah, check out my LinkedIn, Vin Mitty on LinkedIn, and on YouTube, Data Democracy; that's my podcast.

All right, that's great. We'll go ahead and get that link up on our social media, on Bluesky, LinkedIn, and Facebook. And I think we're doing Instagram now, at least that's what I've heard, so if it's there, then we'll do it there too. Thank you so much for joining us. Is there anything else you'd like us to know before we end today?

Thank you so much, William. This was wonderful. Yeah, I think AI is here to stay, and I think human skills are going to be more critical than ever going forward.

I agree 100%.

Me too.

You know, I have a feeling we're probably going to talk AI on our show again someday, too.

Yeah.

You know, you almost could have a podcast just about AI. I'm sure these things actually exist: shows that are just AI, and just what's new and what's changing and all that kind of stuff. But it's something that's here to stay, and definitely something that's good to learn about.

And on the other note of what we're going to be covering today: we talked about all of the different devices and other things that are set to track you right now. Right now, I'm pretty sure your toaster is probably preparing a report for somebody on you. And the thing of it is... okay, maybe not.

But anyway, Breville, the Breville wouldn't do that.

Of course not. You know, I guess it's lawful good, right? So, you know.

So anyway, we wanted to talk about some steps, though, on things that you can do to help make your life a little less trackable. Now, for anybody that uses the internet, there's the potential for this, and because so much stuff is designed to take information, it is very difficult to get away from it completely. And again, the motivation here, in a lot of cases, is money. These companies get this information and sell it, and they get a lot of money for selling it. In some cases, it's looking like they're making more than on the actual product they're selling. So it's a thing to think about: if you're using software, hardware, or whatever the case may be, that's cheap or free, it's not really. What you're spending is you, because it's taking information, or taking something else, and doing something with it. And all of these things come down to, (a), figuring out what something is doing, and, (b), thinking about how you want it to be handled.

First and foremost, think about your phone, because most of us carry a phone now; it's just the normal thing to do. And that phone knows where you are. It has GPS built into it, and the functionality of these things is something we use a lot: CarPlay and, you know, Android Auto, and these things for mapping and all that kind of stuff, that are not only up to date but can show you in real time where there's traffic, and, you know, all the different types of things like that. But in order to do that, it knows right where you are. And in the case of these things, what you want to do is look at app permissions and think about what you're doing when you're installing an app. So first of all, in the setup of both kinds of devices, you can go in and, you know, tell it not to track, or clear the data, or that type of thing, which is good to do. But if you're installing an app that wants to run your phone as a flashlight, and it wants permission to use your microphone, you probably say no. And don't just click "accept" all the time, because that can cause problems. You know, you do want to think it through. It's easy to do that.

Yeah. And "while using the app," if it wants a location, is probably a good choice.

So you'll usually get three choices when it wants location information: all the time; only when you're using that app, whatever the app is that needs the information; or not at all. And if you leave it on just when you're using the app, you're good to go, because if you're not using the app, it doesn't really need to know where you are. I mean, conceivably. I don't know, maybe that's oversimplifying it, but that's one thing where you do want to look at this, make sure that your controls are set the way you want, and be intentional with location.
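For the technically inclined, one way to audit this beyond the settings screens: a rough sketch that lists which permissions an installed Android app has actually been granted, assuming a device with USB debugging enabled and the adb tool on your PATH. The package name is a made-up example:

```python
# Rough sketch: list the runtime permissions an Android app has been
# granted, by parsing adb's package dump. Assumes USB debugging is on
# and adb is installed; "com.example.flashlight" is a fake package name.
import subprocess

package = "com.example.flashlight"  # hypothetical app to audit

dump = subprocess.run(
    ["adb", "shell", "dumpsys", "package", package],
    capture_output=True, text=True, check=True,
).stdout

# dumpsys prints permissions as lines like "<name>: granted=true".
for line in dump.splitlines():
    stripped = line.strip()
    if "permission" in stripped and "granted=true" in stripped:
        print(stripped)
```

On iOS there is no direct equivalent; reviewing the Privacy & Security section of the Settings app covers the same ground.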

And the reason to be intentional is that, with enough information from that, it tells the story of your life: where you work, where you go to school, where you do your grocery shopping, who your friends are, because it's monitoring all the different places you're going. And that type of a thing is something to consider. You don't need the location history saved forever. So, number one, know that your device is saving it. And in your Google or Apple settings, you can turn it off, or at the very least clear it out once in a while, you know, so it doesn't go back to the beginning of time, which they want it to do. And again, it comes down to money. They sell this stuff for money.

So I have a question. I don't like it when they want to access my contacts. They don't need to know who I have contact with.

Yeah, yeah. And, you know, one of the big ones that came out from this is the game Angry Birds. It was, and still is, from my understanding, a popular game. But then came Angry Birds 2, and all of a sudden it was asking for these kinds of permissions. And it's like, well, why do they need that? Well, they want to know who your friends are so they can, you know, tell them about playing their game or whatever. But it's, you know, that type of a thing. And that caused a lot of problems, and it actually affected something that was very much a worldwide sensation. It's still played a lot, but not like it once was, because people don't want to open themselves up to these things.

The other thing is to think about stuff that calls itself a smart device. Most of us have these things in one way or another: smart home and all the different things. But most of these like data, and the default settings are going to be to collect more data than they need to do whatever it is they're doing. So you go into your television or other smart device, and you're looking for an option that's labeled as something like "improve our services," "personalization," "diagnostics," those types of things, that say something like, you have a checkbox, "send diagnostic data" back to wherever it is that they send it. Well, that's information that they probably don't need, certainly not to make the app work. You can usually turn those things off, and you won't lose anything important. And they do try to guilt you into it: we're just doing this to make our product better, so, you know, we know what works and what doesn't. And for some reason, to do that, we need to know who all your friends are and your blood type, you know, kind of thing. It's not necessary. Not necessary to go along with that.

And the other thing of it is to reduce your exposure to data brokers. These are services that will come up again, you know, built around the idea of collecting information. Unfortunately, a lot of these don't ask; you know, the more reputable ones do. But if you've ever googled your name and all this information shows up, and a lot of times it's very accurate, including home address, phone numbers, and not just that, the addresses you've had for the past 15 years or something, and email addresses and all that: that's where this is coming from, data brokers. Now, with the more reputable ones, you can go on and there's a system where you can ask to be removed. There are ways to do that. California is spearheading a new law that's gonna allow a very easy way to do that; we're going to be talking about that in a future show, that you can go in and get this stuff removed. And I think we're going to be seeing more legislation come out.

But the other problem this has created is there's no guarantee that that information is correct, either. So if somehow it's got the idea that you've, you know, been convicted of a felony and escaped from prison, and you go apply for a job, and the people that are vetting you find that, that could be a bad thing. In fact, it would be, because you might not get the job, and you might never know why you didn't.

Yeah, because they never tell you why you don't get the job.

Very rarely.

Very rarely. I've had that, but it's very much the exception to the rule. And it's too bad, because it would help you, if you could know, you know, what they didn't like. But that's a whole other topic. It's just to be aware: Google yourself. See what comes up. You might be surprised.

Just be prepared. It might not be a pleasant surprise.

The other thing of it is public and work spaces, and you want to just assume, in this day and age, that you're being recorded. There are cameras everywhere: the front door of your house, you know, walking down the street. We've talked about Flock cameras, and we've all seen the cameras on the, you know, red lights and things like that that are keeping conditions safe and whatnot. You can go on most state websites and view those things. They are public; they're not trying to hide them or anything, but they are out there. And the other thing of it is, also just assume that your license plate is being scanned as you're driving. Whether that's by some of the people we've talked about in the past, or private companies that just drive around, do that, collect it, and sell it, or you're going through a photo radar light or something of that nature, it's going to scan your license plate. Just be prepared and act accordingly. We just don't have the privacy that we once did on these things.

So the question becomes, what can you do? Well, vigilance is one thing. If you really want to eliminate tracking, wherever you can, go through and turn it off, shut it down, don't return diagnostic data, and be aware of the terms and conditions of things that you're accepting when you install software. Or I guess it's apps now; we used to call that software in the olden days. And know what is going where, to the best of your ability, and don't just allow on autopilot. A lot of people do. We're busy, and, you know, it's like, well, otherwise that doesn't work, we can't use that app. One advantage right now is that there's probably going to be an alternative in most cases. And how badly do you need to do that? If you do need to use it badly, at least know what's happening to your information, and be aware that it's tracking or doing whatever it is doing.

You know, and the other thing of it is, and this is a big one: keep personal stuff off work devices, and work stuff off personal devices. A big reason for this, especially on the former, is most work devices can be audited. And again, this is not from any standpoint of legal advice, and check this out with a competent lawyer if you need that information, but my understanding of it is that in a lot of cases, work devices can be searched, and that doesn't limit it to just your company seeing what you're doing. They can be subpoenaed for information, that kind of thing. And if the information's out there, it's probably fair game. Just like, if you ask an AI like ChatGPT a legal question, that's not attorney-client privilege. Your chat logs can be subpoenaed, and they're most likely being saved, almost certainly being saved, and most likely accessible in one way or another to those that would want to get into it.

And the other thing to remember is that when you're dealing with things like airlines and all that kind of stuff, you don't really have the option to necessarily opt out of these things. And now, traveling internationally, I noticed this: they take your photograph now when you leave the country and when you return. And that's kind of interesting. I don't know if you can opt out of it. I mean, if you try to, you'd probably still be at the airport weeks later. You know, I don't even know if that's a thing you can do. So some of this stuff, it's out there, and there's not a whole lot, at least right now, until they get the laws changed and such, that you're going to be able to do about it.

Public Wi-Fi is another one, and this is not anything new, but if you're on a public Wi-Fi network, be careful of what you do. If you're going to do banking, first of all, don't, on a public network. But if for some reason you absolutely have to, use the app from your bank; don't go to a website that may or may not be secured, you know, that kind of a thing. And again, for data like that, just don't, unless it's absolutely necessary. In all reality, most of us have good 5G network coverage on our phones, and if you're going to need to do something like that, get off the Wi-Fi and use your phone. The connection that actually goes over the cellular network is a lot more secure.

And again, it's first of all being aware of what's going on, not accepting permissions carte blanche all the time, and going into your devices. And I know this is a lot of work. I did it on my televisions; it took me about an hour, and I don't have that many televisions, because each one was different. To go in and figure out what it was calling home, sending back information, or monitoring; getting into it; going through the fifteen prompts that say, "if you turn this off..." It seems like the LG one was the worst. It tried to guilt-trip me: you know, we just need this to make our stuff better. And for some reason, they need to know what you're watching to do that. Not "did the software in the TV crash," or "is the signal bad," or something; they want to know what you're watching at all times. Then the next part of it was prompts of, well, if you turn this off, you know, this isn't going to work. And yeah, that might be true. You might lose some smart features, and if you need them, you can't turn it off. But in most cases, I found that it didn't really create a problem. One objection I had was the fact that the television started disabling things but then wouldn't take them off the menu, so you'd still go to one and, oh, this is disabled, we can't do it because you've turned off this function. Now, whether or not that has to be the case notwithstanding, they want the data, or money. So these are just some things to think about.

Let us know what you think. Userfriendly.show is the place to send us your feedback. And until next week, this is User Friendly 2.0, keeping you safe on the cutting edge. User Friendly 2.0.

Copyright 2013 to 2026 by User Friendly Media Group Incorporated. All rights reserved. The content is the opinion of the show's participants and does not necessarily reflect this station or platform. Requests for material use, interviews, disclosures, and other correspondence may be viewed and submitted at userfriendly.show.
