₿HS017: AI for Homeschoolers Part 2 – Free Speech
Episode 17 • 11th January 2024 • Bitcoin Homeschoolers • Scott and Tali Lindberg
Duration: 00:43:53


Shownotes

SHOW TOPIC:

There’s more to understanding AI than its technical applications. Tali and Scott continue their AI for Homeschoolers series, exploring the connection between AI and free speech. They balance the Bitcoin Moms' point of view with the Bitcoin Dads' point of view. AI is just a tool, although a very powerful one. Our free speech rights are under attack from government censorship, e.g., Biden’s recent executive order. Free speech is not contingent upon being “correct” because at the heart of the matter is the question of who gets to decide what is correct.

IN THIS EPISODE, YOU'LL LEARN:

  • Tali and Scott take a moment to riff on how to make this podcast the most valuable for others.
  • Homeschooling is not limited to those who take 100% control of their kids' education.  Nights and weekends provide learning opportunities for dual-income families and public-schooled families as well.
  • While anyone at any stage can benefit from discussions like today’s topic on AI, the discussion focuses on families with middle-school-aged children and older.
  • You as a parent can prepare yourself well before your kids are ready for the material.
  • Consider yourself part of the home education process … you learn too.
  • Many subjects studied in the public school curriculum do not help kids prepare for post-graduation life.
  • Conversely, many useful subjects, e.g., AI, are not included but would be very valuable in preparing students for post-graduation life.
  • If you are not taking advantage of AI in your work, you are falling behind.  So don’t hold kids back from this tool called AI.
  • Bitcoin Moms’ points of view
  • Bitcoin Dads’ points of view
  • If you have the skills to use this tool, AI, then you are incredibly valuable to any organization.
  • “Biden’s AI Bill of Rights May Just Be Another Censorship Plan,” written by Jeremy Powell, is available on Mises.org.
  • Disinformation about disinformation … 1984’s “Ministry of Truth”
  • Free speech is not contingent upon being “correct” because at the heart of the matter is the question of who gets to decide what is correct.
  • The government continues to play Big Brother, deciding what citizens can or cannot be exposed to, what they can and cannot think.  If you disagree, they shut you down.
  • Stay positive.  Be careful to guard your emotions.  It’s hard to stay positive when there are so many negative things in the news, like Biden’s executive order.
  • AI is bringing great benefits, including new types of jobs.  Focus on those good things.
  • Kids could be turned off if we communicate in a harsh way, even if our passion is coming from a place of love.
  • Social media street smarts: we are all susceptible to being manipulated by AI, e.g., by social media, chatbots, etc.
  • Creatives who are embracing AI and learning how to use it are advancing their skills and thriving.

RESOURCES MENTIONED IN THE SHOW:

HAPPY TO HELP:

  • Tali's Twitter @OrangeHatterPod
  • Scott's Twitter @ScottLindberg93
  • Scott's nostr npub19jkuyl0wgrj8kccqzh2vnseeql9v98ptrx407ca9qjsrr4x5j9tsnxx0q6
  • Free Market Kids' Twitter @FreeMarketKids
  • Orange Pill App @FreeMarketKids
  • Free Market Kids' games including HODL UP https://www.freemarketkids.com/collections/games

WAYS TO SUPPORT:

We are essentially our own sponsors and are so grateful for all of you who support this show.  Thank you!

STANDING RESOURCE RECOMMENDATIONS:

Tali’s “Quick Start” checklist  https://www.freemarketkids.com/blogs/i-want-to-start-homeschooling/i-want-to-start-homeschooling-quick-start-checklist

Mentioned in this episode:

Aleia Free Market Kids Full

Transcripts

Speaker:

Whether you are a creative or just running a business in any capacity, if you are not taking advantage of the AI tools that are out there, you're really behind the times.

So the essence of the article is that they are trying to continue to censor, and the AI Bill of Rights isn't really protecting you from AI.

I just want to caution parents who are just as passionate as you are, especially probably Bitcoin dads in general, to just be very, very aware of creating emotional triggers that actually turn your kids away from what you want them to pay attention to.

Free speech is not contingent on someone being correct, because at the heart of the matter is who gets to decide.

Hey everybody. Welcome to Bitcoin Homeschoolers, today's podcast. We're taking a little bit of a different format. Scott and I have had some discussions about how to proceed in 2024. So we'll start by just sharing a little bit of our discussion with you, and then we'll launch into a topic that we are very, very passionate about most recently because of the developments in the AI industry, and the fact that it suddenly looks like it is absolutely everywhere, and the lawmakers are not happy and the big techs are not happy. So we're going to launch into that a little bit and talk from both a man's point of view and a woman's point of view. So welcome.

Hey, everybody. Welcome to Bitcoin Homeschoolers. Scott, I'm going to start today's podcast a little bit differently. We were having just a slight disagreement before we started this episode about how we should approach our podcast episodes going forward in 2024. We want to make sure that we're bringing the most useful information to people who would like to hear it. And we were discussing who we're directly talking to, actually. Who are we trying to talk to?

So I don't think the word is "talk to," and let me just back up. First of all, when a couple is not in agreement, it's a feature, not a bug, of the relationship, because you're working things out. Right? So we were just working things out. I don't look at this as "we're talking to people." I'm looking at this as: we are trying to figure out how we take our experience and our knowledge and basically give back to people who might want to take some more steps. Maybe they can learn from us and avoid some of the pain, or maybe they can pick up a couple of good things that we wish we had known previously. So really, we're trying to figure out how to add the most value.

Right. To our listeners.

Right. And because we're early on, we're still figuring out how we do that.

So, the question of who Bitcoin Homeschoolers is for: there are a couple of different points. The first point that I think we should talk on is that you don't have to actually be a homeschooler to homeschool. And what I mean by that is, let's say you're a single parent, or maybe you're a couple and you're both working. This is still for you. The things we're going to talk about are things that you can work on with your kids at night; you can work on them on the weekends. If you are fortunate enough that you can actually survive on one income and have one of the parents stay home and homeschool, that's fantastic. But I think the first thing we should address is that this is not only for people who are 100% full-time homeschoolers. We call it that, but it's meant to be much more inclusive of parents in general. We're really just trying to talk to Bitcoin parents who have the intention to home educate their kids, whether they're full-time homeschoolers or not. The most important piece of this is just that parents get involved and don't fully delegate their children's education to whatever institution their kids happen to be in, whether it's public or private or even those charter schools.

So we want to just hear from our listeners as well. Give us feedback. Let us know what is helpful, what is not helpful, what topics you would like to discuss going forward, what questions you have: all of which are welcome. For feedback and comments, how should they reach us, Scott?

It's like a newscast. Over to you, Tali.

Well, how should they reach us? Any way you want. We have all of the contact info out there: you can do Nostr, you can do Twitter, you can go to the website and email us. Whatever it is, I really don't care how you reach out.

Well, the show notes have several different options. But yeah, so we're looking for feedback. We just want to be as helpful as possible, because we have so many things we want to share, but we should be strategic about it. So anyway, we had this discussion and thought we should bring it to you. Listeners, attention!

Well, there's more to it. Yeah.

So let's talk, let's talk about ages.

Yeah. Let's talk about ages.

All right. So, I'm looking at this as if... A lot of the people that we've met at conferences, at meetups, and just talked to in general are really early on. Maybe they don't even have kids yet, or their kids are super young. However, a lot of the things, like today's subject, AI (because we're doing a series on AI): if you have a one-year-old or a two-year-old, figuring out how to work that into the curriculum is probably not the most important thing to you right now. So in terms of the age of the kids that the content we're talking about is aimed at, it should be middle school and above. Now, if you have a kid who's not that old yet, you can still gain from this, because there is a part of this where you have to teach yourself first. This could be something where you say, okay, I'm going to work this into my homeschool curriculum. I want to teach about AI even if my kids aren't ready for it. There are things I can do as a parent to prepare myself. I can learn things, I can try it for myself, I can do research. However, purely in the context of how we are talking to one another, it's as if we had someone who was at least 13 years of age or older in the family, and you could then start to use some of this stuff pretty much immediately and work it into your curriculum. You agree with that?

I do. I think if your kids are much younger than that... One of the lessons that I learned homeschooling our kids was that a lot of what I taught wasn't direct lessons. A lot of what the kids learned from me and you was just through observation. And so if you consider yourself part of this home education process, not just as a teacher but as a student, then even as you learn things for yourself, your kids are going to benefit by observing you, even if they're too young for you to directly tell them, "Hey, sit here and let me show you this AI thing." So the value is really twofold. It's not only the fact that we need to teach our kids, but that we need to really stay ahead of the game and teach ourselves. So that's part of the discussion as well, something to keep in mind.

Yeah.

I mean, there's a bigger theme there, and that is: if you look at the public school system and how the curriculum is set up, and you compare that to today's world, you really would have to question whether the things that a lot of people (including us, having gone through school) go through and study really help them in preparing for adulthood. So then, sort of, what's the point, right? If somebody has an interest in it, then great. But you also leave out other things that you would probably benefit from. I'm trying to think of what some of the subjects are, but if you have chemistry in high school, and it's mandatory, and you intend to go into something with the arts or otherwise, that doesn't really help you. However, I can't think of any teenager today who would not benefit from learning about AI. So AI as a subject, which is today's topic (that's why I picked it), is absolutely something that is worthwhile to put into the curriculum. Money and money history, obviously, since we're Bitcoin Homeschoolers, should be in there too. That's not taught in traditional school. So to me, that's part of what we're talking about: you get the opportunity to decide as a parent. Am I going to include this or not? And if I am going to include it, what am I going to learn about it? Like, should I learn about it first, which I think is where you were going, right? You decide.

I think our kids are going to be exposed to AI regardless of whether or not we teach it to them. It's out there. They're probably already using it without knowing they're using it. The thing about the institutions, whether your kids are in public or private school, is that, at least as far as I know, there's a lot of focus on "don't let the kids take the shortcut," "don't let the kids plagiarize," et cetera. And so they really want to heavily control whether or not the kids are exposed to AI for the purpose of academic studies. I was just having a conversation with our daughter the other day. She's in college, and on her college campus the use of AI is strictly forbidden. The students would get a visit from the faculty if there was any chance they were using AI to help them with their papers or research or anything like that. But that's really backwards, and that kind of goes back to why we decided to homeschool in the first place. Schools don't reflect real life, because in any profession now, if you are a professional, whether you are a creative or just running a business in any capacity, if you are not taking advantage of the AI tools that are out there, you're really behind the times. And we're keeping a generation of kids back so that they don't plagiarize and they don't take shortcuts in their learning. I feel like that's really backwards. And so it's up to the parents to step forward and say: Hey, it is a tool. It is advanced. And it doesn't just have the capability of allowing you to cheat, which is what a lot of people are focused on in terms of AI in the schools. It also gives you tremendous speed and the ability to create that you never had before. And why would we want to keep the kids away from that?

Right. Okay.

Let me back up so I can wrap up the subject before that, though. You and I were having a discussion about where this podcast is best suited to help people, and there was one last point that I wanted to get on the table before we go further with the AI. And that is: we can add value to other people, but there are certain people who are going to want to listen to the Bitcoin dad's point of view, and there are people who are going to be better served hearing the Bitcoin mom's point of view. So as we go into this AI discussion today, I think we should test that out and make sure that you give the Bitcoin mom's point of view and I give the Bitcoin dad's point of view, and that that's part of it every time we bring up a topic and say, "Hey, this is something we recommend that you put in your curriculum." So maybe we don't even call it a curriculum. Maybe we just call it a project that you have fun with.

Sure. "Curriculum" sounds so standardized.

Yeah. Okay.

So I wanted to get that in; I didn't want to lose that point. I thought that was an important part of our discussion earlier, and I just wanted to close it out. Now, going back to AI: you don't just hand a firearm to a person who's never handled one before and then let them go learn. You want to be there to tell them what's safe, what's not safe, how do you use it correctly. And I go to that kind of extreme just to say it could be anything. It could be my kitchen knife. It could be the internet. I mean, you can go on the internet and find really bad things that can hurt you, and you can find things that can help teach you. I'm putting AI in that same bucket, where (I agree with you) we should teach them how to use it, because that's where the jobs are going to go in the future. And I don't think it's that distant in the future. Let's say you are a manager, responsible for whatever department. Maybe you're an engineer, or you're a creative, and you have a team of people. For the last decade or more, people have talked about virtual assistants, and they say, "I'm going to outsource the editing of my podcast," or "I'm going to outsource the writing of my blog," that type of thing. Where we're going pretty quickly is that AI is like this extension. It's almost like a team member. And if you have the skills and you know how to use this tool, you are incredibly valuable to any organization. Whether you're creating video or images, or you're doing coding, I don't care what it is: if AI can help in that, you learning how to use AI makes you the person who adds the value. You're sort of like, instead of a dog whisperer, the AI whisperer. Whereas if you have avoided it, and you have been told you can't use it, then you are at a disadvantage, because now you've limited what your skill sets are in this new environment, in the new economy. And I think that really, really does put any kid, or I guess any young person, behind the eight ball. So if you use it, it can help you add value. If you don't learn how to use it, you are now at a disadvantage to everybody else who does know how to use it.

Yeah.

And I just want to add: some of the moms that I've spoken to are voicing concern that AI is basically going to take over all of these jobs, you know, the things that people had to study hard to achieve, like being a professional copywriter or a professional artist. For example, Midjourney, with the ability to create incredible images so quickly. But if we look at it from the point of view of a tool: AI itself is not intelligence. It will never replace a human brain. It's not like a genie in the bottle, where you just pose a question, all of these assumptions are all calculated, and it spits out the best answer. You really need to learn how to interact with AI strategically so that you can get the best output. It is still just an output machine; you need to put in the right input. And so (I don't know if that's going to be picked up) there are specific skill sets in interacting with different AI programs to make your job productive. So it's not just like a switch that you flip on, and you go, oh, now I'm using AI and I'm going to be incredible. No, there's a learning curve for the person to use this new tool. So what we want to emphasize is: don't be afraid of it. Stay ahead of it, and know that the human brain trumps these AI tools. Right?

All right. So let me introduce today's topic. We're well into this thing now; we've introduced AI again. What we didn't say up front (maybe I can do this in the intro; I'll do a top-of-show intro) is that today's subject is a continuation of our AI series. The first time we talked about this, we covered a lot of these concepts, and we're repeating some of the themes, I think, on why you want to do it. In that other episode, we also talked about a lot of the FUD around AI. But today, the thing that I thought would be a good next step in the continuation of learning about AI is actually its connection with free speech.

Here's where I'm going with this. One of the things I do to continue learning (and this gets back to what you want to teach your kids too: you always want them to keep learning) is that there are several Austrian economics podcasts in my library of shows, and I'll listen to those when I'm working out. It just helps me; it's my way of continuing to learn while I'm doing something else. And there's a podcast called the Audio Mises Wire podcast. In episode 2131, they essentially read an article by a gentleman named Jeremy Powell, who wrote it on Mises.org. The title is "Biden's AI Bill of Rights May Just Be Another Censorship Plan." So that's the subject: it's basically AI and free speech. And there were a few parts of this that I thought we could break down.

So first of all, I think we should take a couple minutes and try to recap it. We'll have the links to that podcast and the article in the show notes. The benefit of that particular podcast is that the episodes are pretty short, 10 to 20 minutes. You can get a little bit of a dose and you're done, on a lot of different subjects. So my proposal is we take a few minutes to talk about the essence of what that article is. Then we can link that back to understanding AI: what do we need to learn for ourselves, as parents, as citizens, about AI, and in this case its connection with free speech? And then we can get to, okay, what kinds of things might we want to teach our kids with that? What are some approaches? We can get into the Bitcoin dad, Bitcoin mom point-of-view discussion on that. So with that, unless you object...

I'm good.

You're good. Okay.

So, I am definitely not going to read the article. Here's the idea: there's disinformation about disinformation. The theme of the article is that the government has proven, especially recently, that they are willing to weaponize government agencies, and even third parties that are paid with government funds, in order to censor speech. If you go and look at COVID, you weren't allowed to say certain things, even if they were based on medical history or facts or whatnot; you weren't even allowed to voice an opinion. Right? And you can label someone with hate; you can do other things. A political example is Biden's laptop: the censorship to basically shut down a story that would have impacted that election. Now we know the story is true. But here's the point: it doesn't even matter. I mean, it matters, obviously it matters, but free speech is not contingent on someone being correct, because at the heart of the matter is who gets to decide. That's what I take out of it. Who gets to decide? They don't say this in the article, but what it reminds me of is 1984's Ministry of Truth. Who gets to decide what is okay to speak and not speak, or say and not say? And you can do a lot of things with this. I mean, if Biden can use TikTok to try to get people to think that the economy is in a positive state right now... Just think about the manipulation that's going on. We need, more than ever, that First Amendment, to be able to speak freely without being shut down, without being censored.

And the thing that led to the article was when Biden comes out and says, we have this AI Bill of Rights (I think that's the title of it), and it's essentially an executive order saying: we want to make sure AI is not discriminating; we want to make sure AI doesn't have things that are hateful, or have any of these qualities in it. And then it lists out all these things, for safety reasons, that sound really important. The National Institute of Standards and Technology... Okay, well, okay. So that's the entity that gets to decide what kind of speech is okay. And then there's what they call red-team testing before something can be publicly released. They talked about an AI safety board. So the essence of the article is that they are trying to continue to censor, and the AI Bill of Rights isn't really protecting you from AI. We covered that last time: AI is going to be whatever tool, good or bad, that it is. And the idea that we have to surrender our First Amendment rights and trust another agency, a government, some other person, to tell us what we're allowed to say or not say... It's not okay. The reason I thought this was important to add to our AI discussion is that we need to understand more than just the technology of AI, which is usually where the discussion largely is: just discussing what the technology is. There's a lot of disinformation; there's a lot of FUD. And right now there is a movement that is basically going to continue to take away our rights, and I find it deeply, deeply disturbing. That's why I thought this particular podcast, the one I mentioned from the Audio Mises Wire, and this particular article were so important to add to our AI discussion. So that's the overview of what they cover. Before we go on into the next section of the discussion, Tali, is there anything that you think I left out of the essence of what that article is?

No, it's just very disturbing that the government continues to play Big Brother and think that they should decide what their citizens should be able to be exposed to. I mean, that is basically what China's doing in controlling communication. And it just bugs me so much that there's no trust at all that people can be adults and form their own opinions.

Right. They want to control the narrative. They want to tell you how to think. And if you don't agree with them, they want to shut you down.

So.

568

:

This is, this is not, we're not gonna talk

about Nasr today, but I think this is.

569

:

There's another example of it.

570

:

Like you have the.

571

:

Of the sec is out there to

supposed to safeguard us and they

572

:

just, they just screwed up the.

573

:

The communication about the, uh, the ETF.

574

:

So anyway, we can do

nostril on a different day.

575

:

So I propose we move into, we break it into sections: what should you be learning about it for yourself, as a parent, as adults? And then we go into the kids' point of view, what we should be doing with the kids. And that's where we could do the Bitcoin Mom, Bitcoin Dad thing.

Yep.

All right. So, um, it's kind of a natural lead-in: you need to understand our rights. I added this, the analogy of putting your oxygen mask on before you help the person next to you.

It's just the standard airline safety thing: make sure you put your oxygen mask on first, because if you pass out, you can't help other people, including your kids. So we need to put on our oxygen masks first, and we need to be aware of what's going on. AI is a lot more than just creating cool pictures or videos or writing blog posts and things like that.

Yes, it can do all that kind of stuff. But we need to understand a little bit about what the technology is. We fundamentally need to understand what our rights are. We need to understand the significance of the First Amendment in this case, and what the clowns are trying to do. They are actively trying to take away those rights.

The government is involved in it. It's been proven. It's come out that they're trying to influence things, whether it's Twitter or Google or Facebook. It's not a question that the connection is there. So I'm not saying you have to go read Atlas Shrugged or The Mandibles. However, when you hear about the AI Bill of Rights, and that supposedly it's for your own safety, man, you've gotta get a, um, a refresher in our rights. That would be my, that's what I took away from it: understand what our rights are, and then you can get into other things.

ChatGPT is biased towards a progressive narrative, if anything. If you've ever tried to go there and ask it something that's conservative or right-leaning or libertarian, it's clearly already progressive-leaning.

And I mean, AI is, it's all hallucinations anyway; it's a hundred percent made up. But this idea that somehow it's causing racial discrimination and other inequalities, that's the wrong point. That's not the point. The point is that it is just a technology, and it can be an incredibly powerful technology for our future. And yeah, so that's what I took away for ourselves.

What, what was your point of view in terms of, yeah, maybe it's what you could do, the mom thing. What is it that you think you should learn for yourself, or do, on this subject?

I kind of want to go into the energy flow a little bit, even though it has nothing to do with AI. What I have learned is that there are a lot of things going on in the world that are moving really fast. And for people who are busy, it can feel like you can't ever catch up.

There are all of these, um, well, they call them watchdogs, watchdog influencers, talking about the government trying to take away your First Amendment rights, and more and more regulations, and basically this whole civilian surveillance kind of development in society. And it can seem very overwhelming. I don't know about most people, but for me, I almost get to the point where I feel like, where is the hope? Because everything is happening so fast, I can't keep up.

And so, for the moms out there who, uh, feel that way, I want to just say: there are also a lot of really wonderful things happening, very fast.

And in discussing homeschooling, in discussing AI development, in discussing what's going on in the monetary system and the political landscape, et cetera, it's important for us to focus more energy and more time on the good stuff, so that we don't get bogged down energetically by the negative things.

And when we are talking to our kids, they will pick up on those things. And if we get, I'm going to say overtly, but obviously that's relative, if we get overly passionate talking about the bad stuff that's going on, it really turns the kids away, because they just want to have a carefree childhood. They just want to be able to enjoy funny videos on TikTok without their parents saying you're being watched by the Chinese government or something. You know, if we get to that point where everything is a conspiracy, it's very difficult for a mom to know what to do.

And I just want to call attention to the fact that energy flows where your attention goes. So give your attention to the good stuff. Give your attention to the good stuff that AI is bringing, and guide your kids there, instead of focusing so much on the negative side of it. That's, that's my take from a mom.

Yeah, I sound so negative right now. I'm talking about this dystopian future in your life.

Yeah. But you can still focus on the good.

All right. Well, that's a good point. All right. Well, let's, let's, uh, why don't you just keep going then? So, again, from the Bitcoin Mom point of view.

The aspect of, is there anything else about, 'cause I think you've kind of already done this. What else would you teach the kids? So you're talking positive things: it's a tool for good, here's how you can use it. Um, I don't know what else.

I mean, I don't know.

So for the moms out there who are, uh, thinking about, you know, preparing your kids for, uh, high school, preparing your kids for college, preparing our kids for the job market, et cetera: there's actually, uh, a lot of job creation. Like, as we're talking about AI taking away jobs, there is also job creation because of AI.

So I read an article the other day. I think it was on CNBC or something like that, one of the bigger publications. And they were talking about how artists, creative people, um, people on Fiverr, that kind of creative people, whether it's physical art or digital art or writing or entertainment art, it doesn't matter. In art and media, there are AI specialists popping up now that are getting really wonderful gigs because they know how to use AI to advance their own skills. And so they are ahead of the curve and they are thriving.

And so it's just a reminder to the moms out there who might be concerned because their kids are kind of artistic-leaning and, you know, not necessarily getting ready to become a doctor and that kind of stuff. Those people tend to be more worried about the invasion of AI technology. And so I'm just calling attention to the fact that AI technology will actually create jobs, and it will take the work that in the past could have been very tedious and free the creative to actually just focus on creating ideas and, um, artistic expression. You know what I'm saying?

Yeah. Yeah, I'm actually really happy you're taking this angle, because I, I went down the negative path. I was brought in by the hook bait, if you want to call it that, or clickbait or whatever, like the idea of the censorship and what's going on. And I do think we have to teach people to think critically, and I do think we have to teach people to understand their rights. But this is a really wonderful point.

It's a continuation of your point of focusing on the positive. And I'm just, um, I'm just thinking to myself, I didn't think about that at all when I, when I first heard this, and I think it's a really cool outcome of this discussion, of going deep on this. Um, otherwise you can get pretty down yourself on things.

And so, to me, the things I was thinking about were things like teaching kids about the First Amendment, teaching kids about, uh, thinking critically and understanding that if you have a chatbot, or you have TikTok, or you have whatever, any type of social media, you are susceptible to being manipulated. I mean, it's not even, it's not a question. It's almost a science with those that have a lot of data.

Can I kind of just address that really quick? I'm sorry.

Yeah. Yeah.

So I want to just go into the psychology of, of just people in general for a second.

So the reason that social media has become so powerful is because people get that, um, what I call a dopamine hit. People want to lean into things that make them feel good. So if it's a video of a puppy, if it's a video of somebody doing something silly, if it's a video of somebody doing a dance, what they're going after is how they feel.

So as a parent, we have to walk that line very carefully, because we simultaneously need to teach them and keep their attention. So if we only just go into the intellectual, the "our First Amendment is being violated," et cetera, it doesn't feel good. You feel threatened. I feel threatened when I read things about that. And naturally, a natural response for a human being when they feel threatened is to lean away. And so even as we're teaching the kids, we have to be really careful that we're not pushing them away in the way that we express these ideas. Does that make sense?

Yeah, we're hitting different points on it. Where I was going with it is this.

If you're Elon Musk and you want advertising dollars, or whatever, eyeballs and engagement, you could use AI and see what triggers an emotional response. And it's usually going to be someone who's more extreme. So you come on there and you give a level-headed example, a response to somebody. That's not going to generate a whole lot of buzz. But I come on and I say something that's incendiary? Guess what, I'm going to use that algorithm to try to boost that one, because it gets the engagement going on the platform. That's what I mean by manipulated.

Like, it's not just the message of what comes across in the words. It's also what's even shown and what's not shown. You can be manipulated in other ways.

That's right.

And all I'm saying is you should make your kids aware. I didn't mean to go down the negative rabbit hole again. I'm just saying, let the kids know that anytime you're interacting through social media or other things, you just have to have, like, a natural, it's almost like being street smart, except it's social media smart. You just need to have an awareness. And you're not gonna be able to block it all, but at least if you are aware that you are susceptible to it, because you're human, then you have at least a chance of applying those critical thinking skills and defending yourself from a lot of these negative types of things. Maybe it's just pulling yourself off the media. I mean, whatever it is.

I agree with everything you're saying. From the point of view of a middle schooler or a high schooler, based on my experience interacting with them, the timing of raising that topic is going to be the most important thing, because kids, they just don't care right now. That's not the most important thing in their life. They don't really care that their rights are being violated yet. You know, it's not relevant to them yet.

So when, when a parent is trying to raise awareness, my suggestion from a mom's point of view is to make it really brief, very circumstantial, very in passing. Because if you launch into a lengthy lecture, which we tend to do in the kitchen, they shut you off.

Not only do they shut you off, we have just created an emotional trigger for them to turn away. So physically they might still be standing there; mentally, they've gone. And emotionally, they've gone further.

And so I just want to caution parents who are just as passionate as you are, especially, probably, Bitcoin Dads in general, to just be very, very aware of creating emotional triggers that actually turn your kids away from what you want them to pay attention to. And that's all I have to say.

I think that's a pretty, okay, that's pretty fair. So, Dads, Bitcoin Dads out there, that is something I think we should, we should heed that advice. And I am guilty of lecturing. I am guilty of getting passionate about things. And I certainly, I certainly don't want to push our kids away from those things just because of the way that I expressed them. And that's, uh, that's a really hard thing too, because you care, right? I mean, that's, that's at the heart of it. You care. That's the reason you're passionate. And you're not aware that you're doing it.

So, great. That's some great advice. I think we should wrap up there. We're going to do some more with AI; this is, this is a series. We do think this is a very important topic, and we will continue to refine how we bring subjects on this show, or guests on this show, to be most beneficial to, to those who are listening.

So, to all the listeners out there: thank you so much. Uh, we do look forward to hearing some feedback from you. What things you like, don't like, things that could be more helpful. Just let us know and we will grow. We will grow with you on this. So, anything else, Tali, as we wrap up?

Nope. That's it.

All right, everybody, have a great week. We'll talk to you next week. Bye.
