It's mission-critical for Bitcoin homeschoolers to overcome the fear, uncertainty, and doubt (FUD) about artificial intelligence (AI). Tali and Scott kick off this series with the "why" behind this powerful type of tool. The decision parents must make is whether to learn about AI and be the master of the new technology, or be the victim who fears it.
· Preston Pysh’s interview with Jeff Booth https://www.theinvestorspodcast.com/bitcoin-fundamentals/personal-ai-models-and-bitcoin-jeff-booth/
· Guy Swann's interview with Jeff Booth https://youtu.be/s8XoMg3tb0s
· Spirit of Satoshi https://www.spiritofsatoshi.ai
We are essentially our own sponsors and are so grateful for all of you who support this show. Thank you!
Mentioned in this episode:
Free Market Kids
Understanding artificial intelligence. Let's start by getting a clearer picture of what it is. I think there are a lot of people out there talking about artificial intelligence as if it's a living thing, as if we're talking about the movie Terminator, where machines take over the world, are more intelligent than human beings, and enslave human beings in human farms, like in The Matrix or something. So let's just first and foremost clear up what artificial intelligence actually is.
Scott?

It's a newscast. Over to you, Tali.

So the thought is this: this is going to be a topic that we're going to have to hit on multiple times. Part of being able to figure out what to include in your curriculum (or not include), or the framework you give your kids, like your framework for politics or money or anything else, is that you have to study a little bit yourself to understand it.
So Tali and I have started to go down the artificial intelligence rabbit hole, and it's been eye-opening. Actually, the slides that we're looking at now, and the ones coming up, we've edited, but the initial framework for them came from one of the tools that Tali was playing with. The main idea of this entire episode is why this is a critical subject to be taught. So if you're a Bitcoin homeschooler, our opinion is: teach AI now. What level you do that at, we can get into, but part of this is that there's a lot of misunderstanding out there.
I didn't actually read the article, but supposedly some of the things that were coming out of the White House in terms of guidance happened after Joe Biden watched Mission: Impossible. And he clearly has no clue about anything in AI. You have tech giants that have a lot of political clout because of the money they can bring, and they want to create a whole moat around this. There's a tremendous amount of lack of understanding of what it really is, and there's a lot of fear. Hollywood is going to do that; there's a ton of movies we could get into as examples.
And the thing about it is: let's peel this onion slowly. Let's dig into it a little bit. It's just like any other tool, right? When electricity first came out, people worried about everyone dying, being burned to death, electrocuted, and everything else. Could you imagine a life without electricity now? So this is a technology, it's advancing quickly, and let's just start with teaching yourselves a little bit about what AI is, and then we can get into it more. But the premise is: let's at least understand what this is.
Yeah. And as you can see on the slide, we're going to address not only what AI is, but also how to view it as just another advanced tool. For example, when cars came out, people who owned horses were very upset. It was going to take over jobs like the blacksmith's job, the horse-carriage drivers, the people who make wagons, all that type of stuff. And what they focused on was what they were going to lose, rather than seeing it as just a new tool that would save them time and energy. So the picture that you see there, even though we're talking about AI: if you look at it, it's a ruler, a pair of scissors, and some thread. If we look at AI like it's anything more than a tool, then of course it brings a lot of fear. But as Scott said before, when electricity first came out, people were very fearful of it because they did not understand it. That's the way we're going to approach this presentation, as well as future presentations: AI is nothing more than a tool. And the decision you have to make is, are you going to teach yourself how to use the tool so that you are the master of the tool, or are you going to become a victim of this tool and somehow give it more power than it actually has?
All right, well, let's get into it then. AI is obviously tremendously powerful. It's amazing how quickly it has developed. We only started hearing about AI in everyday conversations a year or two ago, and suddenly it is absolutely everywhere. Every software that you use is now using AI to help make content better. It's changing how we interact in almost every single way. But this is where my first contention is: it's not actually intelligent.
As I've started to study this: the idea is that when you're using ChatGPT, or whatever it is (what you're talking about, Tali, where you have something you're interacting with and there's AI behind it, so it can be anything from a Google search to who knows what; like you said, it's really touching everything), it's more like an autofill. It's a giant autofill. It's not as if the machine is actually thinking. So the first problem I have with "AI" is that there's no intelligence there that I can really point to. The LSAT, or whatever the test is that you take for law school (maybe this is the wrong example), there was a case where a test was given, and ChatGPT or some other AI was able to pass this graduate-level exam. And this implies, we're kind of reading into this, that it is intelligent. And that's not true at all.
It's more like if you started typing a word and the rest of the word popped up. I'm texting, I start writing something, and the rest of the word pops up. This is just the probability of what the next word is, and the next word, and the next word. It's a whole bunch of probabilities. It's not reading your mind. It's not deducing logically; there's no deduction. It's a whole bunch of vectors and probabilities. Take "once upon a" as an example: the probability that the next word is "time" is higher than the next word being "dog." And it does this with phrases and other things as well.
:I'm not a mathematician.
163
:I certainly not sorta can program AI,
but what, what I took away from my.
164
:Uh, initial.
165
:Reading and podcasts on AI is.
166
:It's more like an auto-fill.
167
:And if you think about that, it,
you should relax a little bit.
168
:We are not on the verge of Ultron.
169
:Taking over the world, or if you're,
if you're older, like me, how from
170
:2001 space Odyssey, like we're.
171
:We're not to the point where this
has any kind of general intelligence.
172
:This is.
173
:It's literally a probability.
174
:Tool and it.
175
:looks to us like it's really smart
because it's, it's able to look at.
176
:Millions or billions or
however many pieces of input.
177
:To get us out.
178
:And when people say, well, you have to
be careful because AI makes up stuff.
179
:Like it would make up case
study in that example.
180
:Well, Technically, it makes up
a hundred percent of this stuff.
181
:It's not just making
up some of this stuff.
182
:Everything you get is made up.
183
:And.
184
:Relax.
185
:We're, we're not, this is, this is
literally not a thinking machine coming
186
:at you and trying to answer this, that
it understands what is giving you.
187
:It's just the probability.
188
:So, um, to me that, that's the,
that's the amazing thing about it.
189
:It has so much potential.
190
:And we literally only just started
down this, this path, but that's what
191
:I thought of when I first started.
192
:When you, when you're talking
about defining what is AI.
193
:Well, a part of me is
saying, well, what is it not?
194
:It is not actually a thinking.
195
:Capable type of entity.
196
:That's going to take over the world.
197
:But anyway, that's my take.
198
:I think that was definitely one of the
things that we are sure of me in the
199
:beginning when I first started hearing
about AI, because my first knee-jerk
200
:reaction was also fear like, oh my
gosh, You know, we have kids in college.
201
:What are they going to do?
202
:Coming out?
203
:What we have one daughter
who's very interested in art.
204
:If AI is taking over art, then
that means you have no job.
205
:We have another one
that, that writes songs.
206
:And of course we're hearing
people talk about how AI's
207
:writing songs and they're winning.
208
:Grammy's and things like that.
209
:Like, what does that.
210
:What does that leave?
211
:For the human students.
212
:And that was so that,
wasn't my first fear.
213
:And when I came across this
definition that it is nothing
214
:more than a probability generator.
215
:And may me relax, because if it's
just a probability generator,
216
:then that means you've got to have
somebody interpreting the data and
217
:that human person cannot be replaced.
218
:Right.
219
:So it's totally my thought in this.
220
:This is the most.
221
:Just rip style in terms of
how we're addressing this.
222
:We should look at.
223
:This would be a good.
224
:A good episode would be just breaking
down what the different levels of the
225
:intelligence are as a been defined.
226
:And where we're at.
227
:Going forward.
228
:Yeah.
229
:No, no.
230
:I mean, as a, as a future,
Installment in the series.
231
:Yeah.
232
:Well, I really quick look at the
compounding effect of AI and.
233
:Really the chip.
234
:It's it's really, it really
comes back to the chip.
235
:Uh, development.
236
:The speed's development because
AI works off of the speed
237
:of processing information.
238
:Partially.
239
:Because of its slow.
240
:Then AI just wouldn't
be everywhere right now.
241
:Okay.
242
:Yes.
243
:I think you need that as
a foundation, but yeah.
244
:I mean, this timeline shows
you how you should read that.
245
:And 56, the term artificial
intelligence is coined.
246
:So that wasn't the 1950s.
247
:And by 1997, IBM's deep blue
beats world chess champion, Gary.
248
:Casper off.
249
:So I remember when that story came out,
I remember that was such a big deal.
250
:And the time between the first, you know,
artificial intelligence term came out.
251
:To when.
252
:Artificial intelligence, beat the world.
253
:Chess wish.
254
:I think in my mind, when you
think smart people, you really
255
:think people like chess champions.
256
:And so it took about 40 years.
257
:Yeah.
258
:So, this is where again, I want.
259
:W we're going to, we're going to have
to do an episode, just some what.
260
:What is AI?
261
:When you're, when you're in the world.
262
:Of chess.
263
:The, uh, computer is really good
at running a lot of calculations.
264
:And keeping track of them.
265
:So in chess, there's a, it
may be a log, but there is a
266
:finite number of, of options.
267
:And if player X.
268
:Chooses an option.
269
:That reduced, that changes.
270
:The the possibilities each time.
271
:Right.
272
:So the, the chest.
273
:Like a chess program, could theoretically
look at all the possibilities, make it.
274
:And so this is the one
that's going to go with.
275
:Whereas it's not really.
276
:Intelligent.
277
:Well, again, I think again,
the intelligence thing throws
278
:me, and this is not the same.
279
:The, that that example is interesting.
280
:But it's not the same as what
we're seeing with Chet GPT.
281
:It's not the same as what
we're seeing with Dali.
282
:There.
283
:But that was the beginning.
284
:That was 1997 button.
285
:But what I want to say is, and I
don't know if they actually did.
286
:Did this, but.
287
:When a human.
288
:Chess player is.
289
:Against artificial intelligence
or I'll just call it a computer.
290
:For, for simplification and they test and
you're like, oh, The computer beat, Gary.
291
:Moving on the computer is that smart,
but what, what I think would have
292
:been really interesting is if you
gave Gary a chance to learn how.
293
:That program.
294
:The blue works, whether or not
Gary was some work could have
295
:beaten the, the AI program.
296
:I think that would have
been an interesting thought.
297
:But I'm not sure that they did that.
298
:I disagree.
299
:I think the more interesting thing is.
300
:The, this is work.
301
:This is why AI is
actually a tool for good.
302
:If you wanted to be a chess
champion 30 years ago.
303
:And you were trying to get in your,
your reps, if you will, to learn.
304
:And you were fortunate enough that
your parents could afford to hire
305
:a, you know, a grand champion for
you to spend a lot of time with them
306
:and study, you could get in these.
307
:Repetitions yourself to learn how to play.
308
:Today.
309
:I can download an app on my phone.
310
:And.
311
:Because we have these programs.
312
:I can actually be taught.
313
:I can actually get the repetitions
in at a, and this is the Jeff Booth.
314
:As technology is deflationary at a much
lower cost than it would have been.
315
:30 years ago.
316
:So to me, the more interesting
thing of that is is.
317
:The potential of AI as a tool.
318
:And what it means for learning.
319
:Yes, definitely as a tool.
320
:And I want to reference the book
that we talked about in the, in
321
:last week's episode of the art
of learning by Josh Watkins.
322
:When he speaks about, his
experience facing off at the
323
:world, chess championships.
324
:A lot of it comes down to my games.
325
:And when you're working against
the computer, it is neutral,
326
:always emotionally neutral.
327
:And you are honing your
technical abilities for sure.
328
:But there's an element that can never
be replaced by machine, which is that.
329
:Human reaction.
330
:Element.
331
:So he said that when the top chess
players face off, they both, like, when
332
:you're talking about they're playing
for first place, you know, everything
333
:they've gone through every round when
they're playing for first place, was that.
334
:That champion apart from.
335
:The loser is just their ability to
keep their cool or to detect minute
336
:emotional shifts in their opponent.
337
:And that's how they beat.
338
:The other person, not necessarily because
they are technically more superior.
339
:So going back to what you said, AI is
still just a tool and if you use it to
340
:your advantage, Then you have the power,
but if you give it some, um, some like.
341
:Unreal expectation that
it can somehow replace.
342
:Actual human being, you know,
honing your trust skills then
343
:I think we miss the point.
344
:Yeah.
345
:So is there anything else on this?
346
:The, the milestone discussion.
347
:It went from.
348
:Yeah, so it went from 1997 to 2011.
349
:So that's what like 14 years later.
350
:And now they're talking about IBM Watson
wins jeopardy against human champions.
351
:Um, so again, the AI programming is
doing the data analysis and in those
352
:two cases in 1997 and 2011, they.
353
:Basically, we're used to demonstrate how
powerful they are and how smart they are.
354
:But again, don't lose sight
that they are still just.
355
:Data.
356
:And analyzing right.
357
:Yeah.
358
:Yeah.
359
:And then the.
360
:Oh, well, Okay.
361
:So let me jump here because
the timeline also includes this
362
:stuff with where Google's getting
into recognition algorithms and.
363
:And another big company,
alpha is alpha go alpha.
364
:What is the one that beat the champion?
365
:I can't.
366
:Oh, so.
367
:One of the things that's on my
mind, as you think about this, this
368
:history is the Reese, the intensive
amount of resources necessary.
369
:To build this, right?
370
:This wasn't somebody in their
garage that could build deep blue.
371
:We're talking about IDM.
372
:And then you get you fast forward to.
373
:The the.
374
:Recognition stuff that
the Google was doing.
375
:Like you're talking big tech.
376
:And one of the things that
I think we talk about.
377
:We might bring this up again later, but.
378
:As technology.
379
:Continues to develop at
the pace it's developing.
380
:It's also getting cheaper.
381
:So not only are the chips faster.
382
:And the computers can hold more your
servers or whatever you're using.
383
:But actually.
384
:There is a trend that is
towards decentralization.
385
:And one of the episodes that I listened
to at a it's a resource I recommend
386
:to those that want to go deep on this.
387
:Guys Swan now has a
separate podcast just on AI.
388
:It's called AI on chained.
389
:And one of the very interesting
points that he made.
390
:Was that AI is actually, it
tends towards decentralization.
391
:And now you have programs that you could.
392
:We're the once you've done the
hard work of going through a lot of
393
:data and creating whatever these to
create to all that, the vectors and
394
:probabilities and whatever else.
395
:You can now get something down to 200 gigs
or 250 gigs, or, you know, whatever it
396
:is, it's something that could be on a home
server, a home computer, even your phone.
397
:And now you can be offline.
398
:And you would have access to
the equivalent of like this.
399
:This, uh, this, this language
model that can answer things.
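As a hypothetical sketch of what running a model entirely on your own machine can look like, here is the open-source llama-cpp-python library loading a local weights file (the file name is a placeholder for whatever model you have downloaded); nothing here touches the network.

```python
# Sketch of offline, local inference (pip install llama-cpp-python).
# The .gguf path is a placeholder for model weights you have already
# downloaded to your own computer; no internet connection is needed.
from llama_cpp import Llama

llm = Llama(model_path="./models/my-local-model.gguf")  # hypothetical file

out = llm(
    "Q: In one sentence, what is Bitcoin?\nA:",
    max_tokens=64,     # cap the length of the completion
    temperature=0.7,   # how much randomness to mix into the probabilities
    stop=["Q:"],       # stop before the model invents another question
)
print(out["choices"][0]["text"])
```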
And again, we're still early in this process, but what comes to mind in this discussion of the milestones is that it started out where you really needed to be resource-rich in order to start to build this, but we're trending towards more and more decentralized abilities around these things. And I was actually very relieved to hear that, because otherwise the gatekeepers, you know, the Facebooks, the Amazons, the Googles of the world, even the government, would be able to control this. You would think it takes massive resources, like a data center, to run these things, and the reality is it's actually going the other way; it's actually very decentralized. That was pretty reassuring. But all right, I think from a history standpoint I'm not sure how much more we really need to go into that. So let's just jump into some other aspects of this. Tali, where do you want to go next?
These are just a few bullet points about how AI works. It can extract and analyze information from images and videos. They are obviously building robots that can perform tasks by themselves. There are models that were inspired by biological neural networks, and they can recognize patterns very quickly. Other things, like natural language processing, enable computers to analyze and generate human language. I was playing around with an AI program where, if I uploaded a short video of myself speaking English, I could within minutes change my spoken language in the video, lip-synced to my lips, into 150 different languages. Very, very cool.

Very cool.
Yeah. The actual way that AI works is, to me, beyond the scope of what we personally know. But I think it's almost like: do kids really know how a dishwasher works? Does someone really know how their microwave works? Do you actually know how your car works, or your phone? You may not need to be an expert down to the point where you could do the programming. This is another section that later on, I think, we're going to need to go deeper on. If you're looking at a Bitcoin homeschool curriculum, some portion of that needs to be able to explain: okay, here's what computer vision is, being able to analyze images and videos, here's kind of what's going on with that. And then say, okay, here's what language processing is like. And then say, okay, here's how something like Midjourney actually creates images. The way I heard it explained is: it's got all these different images that have been labeled and described in whatever language. Then you go in with your language model and use that, and it goes back and says, well, here are all the probabilities around what pixel color is next to what pixel color. Based on that, you throw in some randomness, and then it starts to put these things together, and it comes out looking like the computer knows what a cat is or something. So I think we need to put in plain English something we could explain at an elementary school level on how AI works, in the same way that you might explain to kids how the body works, how a car works, how a phone works. We need to do the same thing with AI.
Yeah. This episode is an introduction. I think there's so much information out there; obviously we can't possibly cover all of it. But this is a call to action for homeschooling parents to dig a little deeper, not be afraid of it, and figure out how to incorporate that information into homeschooling. One of the things I hear (our kids are in college right now): when they have to write papers, somehow their professors can or cannot tell that somebody had ChatGPT write their papers. So there's a whole lot of rules and regulations around how you shouldn't use ChatGPT or other tools to perform your work. But I feel like that's a very dinosaur-age way of looking at it, because yes, you should know how to write a paper, but then how can you use the AI tools that are available to you to become even more productive? The world's moving very fast, and if we can up our productivity, that also frees up time to learn other things more deeply, or work on connecting data points versus having to spew out, right, regurgitate things.
Yeah. If you're a professor today, I think you could probably tell if the student just took ChatGPT output and literally dumped it out there; a kid who's just trying to get by faster could take that and edit it, and you would probably have a hard time. As the technology increases, though, it's going to get harder and harder to tell. So the bigger thing to me is: I want our kids to be able to use the tools available to them. We're going to get into jobs and things later, but I want them to know how to use it for good. The question of whether you use something for good or bad, that's more of a moral teaching, right? It's a tool. I can use my car to drive safely from one place to another, or I can transport drugs. Right.
I can use a gun for hunting, or I could commit a crime. Same with a knife: the knife can make a gourmet meal or hurt someone. You don't say "don't use a knife because you could hurt somebody with it." No, you still use a knife, because it's a very useful tool. The meme I like: when someone's trying to attack Bitcoin, they say we need to stop Bitcoin because terrorists use it. Well, it's permissionless, right? If a terrorist uses a road, we're not taking away all the roads. If a terrorist uses a cell phone, we're not taking away all the cell phones. This technology is here. It is increasing in terms of its effectiveness and abilities at a speed that we have a hard time keeping up with. So the question is, what do you do with it? All right, so next.
Next we want to go into two different groups of AI: one is narrow, one is broad. An example of a narrow AI is a program that was trained specifically to spot something out of a group of things. For example, if you had a picture of Waldo and you trained your AI to spot Waldo, and that is the only information you put into that algorithm, then that algorithm can spot Waldo faster than people, and faster than a general, broad algorithm can, because that's all it knows. Or if you program it to identify (you were mentioning before) cancer cells: if that's the only thing it knows, and that's the only information it's processing, it can do that very, very quickly.
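Here is a minimal sketch of that narrow-AI idea using scikit-learn, with made-up toy numbers (purely illustrative, not a real medical or Where's-Waldo model): a classifier trained on exactly one question answers only that question, but answers it fast.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy "narrow AI": each sample is two made-up measurements of a cell,
# and the label marks whether that (fictional) cell is abnormal.
rng = np.random.default_rng(0)
normal   = rng.normal(loc=[1.0, 1.0], scale=0.3, size=(200, 2))
abnormal = rng.normal(loc=[2.5, 2.5], scale=0.3, size=(200, 2))

X = np.vstack([normal, abnormal])
y = np.array([0] * 200 + [1] * 200)  # 0 = normal, 1 = abnormal

clf = LogisticRegression().fit(X, y)

# Very good at its one narrow question...
print(clf.predict([[2.4, 2.6]]))  # [1] -- flags the abnormal-looking sample
# ...but that one question is all it knows; ask it anything else
# and it has nothing to say.
```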
Now, that doesn't take into account anything else. So, the example we were talking about before, about cancer: my rebuttal to you about using something like AI to identify that, or as a diagnosis tool or whatever in the medical field, is that we are realizing more and more that the human body requires a holistic approach to the treatment of different ailments. So if you have a bot that just identifies it, that doesn't mean the algorithm then has the answer to how to make you better. But it does make the diagnosis much faster, and then you can move on to the other, holistic stuff that you can do to treat it. Two different things. So the point is that narrow and general, or narrow and broad, are two different ways of kind of categorizing AI tools.
The way I would summarize what you're saying is this: the fact that AI can spot cancer, maybe from X-rays or scans or whatever, more reliably and faster than a human can, that's great, but now people can move on and start working on the treatment. You've leveraged AI to do something better and faster than humans can, and now you can get to the next stage. So that's an example of using AI as a tool for your benefit.
If we were to use a broad AI, like ChatGPT, or Gemini or whatever (Google's new AI tool coming out), and say, hey, look at this picture, is it cancerous? It's going to have to process through a lot of information, most of it irrelevant, to make that determination, and it may or may not be correct. So in that sense, the more narrowly programmed AI would be a more efficient tool versus a broad one, because we always think, oh, bigger data is better, but that's not necessarily so, right?

Okay. So the key point is there are different types of AI. They can be adjusted to very specific needs, or they can be adjusted to be very broad, and they both have different uses. It depends on the data set that is put in there.

Yes, how you train it. And I think it has to do with what the use case is, what you're trying to use it for.
Okay, great. Which is a great lead-in to the kinds of things you can do with this. Being able to detect fraud, that's great. The healthcare stuff we've already talked about. Virtual assistants: AI powers virtual assistants like Siri and Alexa, and I must say they are very useful, but they make a lot of mistakes as well. So it's a tool. Self-driving cars: I know there are lots of places testing it. Tesla has a self-driving mode, but it only works in certain types of environments. You have the self-parking car, like parallel parking, where you just push a button and take your hands off the wheel. So they can be very, very useful, very practical.
Yeah, and they're basically there, but I would put them in the narrow category. You have, I don't know how many, billions of images of cars and trucks and bikes and roads and things. And then all around a Tesla you have all kinds of inputs, and it says: oh, based on the way this other car is moving, the chance of a collision is high, therefore apply the brakes. And every person out there driving is adding to that database, training it to get better and better. It's not like you just said "drive" and it thought of how to drive. It's just a narrow application.
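A cartoon version of that narrow decision rule might look like the sketch below; the numbers and names are purely illustrative and have nothing to do with Tesla's actual software, which fuses far more sensor inputs.

```python
def should_brake(gap_m: float, closing_speed_mps: float) -> bool:
    """Toy collision heuristic: brake when time-to-contact gets too short.

    gap_m: distance to the car ahead, in meters.
    closing_speed_mps: how fast that gap is shrinking, in meters/second.
    (Illustrative threshold only; real systems use many more signals.)
    """
    if closing_speed_mps <= 0:      # gap is steady or growing: no risk
        return False
    time_to_contact = gap_m / closing_speed_mps
    return time_to_contact < 2.0    # under ~2 seconds to impact: brake

print(should_brake(30.0, 5.0))  # False: six seconds of margin
print(should_brake(8.0, 5.0))   # True: only 1.6 seconds of margin
```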
And then you have facial recognition, and you have so many other things; the number of applications is huge. I know the one that's popular in the Bitcoin community, the one that comes up, is using AI to help write code. You don't have to be an expert code writer to do some basic coding. You could use AI to help build websites.

You can use AI to check for mistakes in your coding, too, right?

They do that, and some of the more advanced folks will actually be able to use AI to generate the code to do things and save hours and days. What may have required someone a week of work to do can now be done in, I don't know, 20 minutes or whatever it is, if someone knows how to use the application. So that's programming. That's pretty cool.
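For instance, here is a minimal sketch of asking a model to review a snippet through the OpenAI Python client; the model name is a placeholder, and any chat-capable model (including a local one) would work the same way.

```python
# Sketch: asking a language model to spot bugs in a snippet.
# pip install openai; assumes an OPENAI_API_KEY in your environment.
from openai import OpenAI

client = OpenAI()

buggy_snippet = '''
def average(numbers):
    total = 0
    for n in numbers:
        total += n
    return total / len(numbers)   # crashes on an empty list
'''

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder -- substitute whatever model you use
    messages=[
        {"role": "system", "content": "You are a careful code reviewer."},
        {"role": "user", "content": f"Find the bugs:\n{buggy_snippet}"},
    ],
)
print(response.choices[0].message.content)
```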
I know that content creators use it on the web, for everything from writing up your Amazon descriptions to advertisements and things like that. So it's pretty interesting; lots of applications.

I don't think we need to go too deep on this. I think most people already get this part; I don't think there's much more to add on this particular part of the discussion.
Well, okay. So these are examples of how AI can be interpreted as threatening human jobs. For example, content creators used to have to hire somebody, a copywriter, to write their copy, or hire an editor to edit their writing, and suddenly they don't need to do that anymore; they're using AI to help them. But I would challenge that notion in this way: if you are a copywriter and other people are using AI to write their copy, then you can get ahead of the curve by saying, well, I use these tools, so instead of charging you for 10 hours of work, I only have to charge you for one hour, because I still have human discernment, which AI doesn't necessarily have, to apply to a certain situation.

Okay. That makes you more productive too.
Two comments. The thing that I'm taking from that is: if our kids were younger and I wanted to teach them about AI, I would want to give them a framework for how to think about this. One of the things that is powerful is to compare it to the person who was making the buggy whips, or whatever it was you needed when there were horses and carriages, and along come the cars. If the buggy-whip guy went to Congress and said, you've got to pass laws so that people can't make cars because you're going to put people out of work: well, you're trying to fight where technology is going. And this is why (I know we've already talked about it in other episodes) The Price of Tomorrow is just so brilliant; it talks in simple terms about how technology is deflationary and what it means for you. If you're a young student being homeschooled, it's not that you're going to be out of work; the nature of the work is just going to change. So if you want to add value, then, if you learn how to use these tools better than other people, you are insanely valuable in the next stage of where we're going with the economy and the types of jobs. In other words, don't be the one worried about how to protect the buggy whip; be the one who's learning how to be a mechanic on the car.
Right. I'm trying to figure out the right way of saying it: to me it's the framework for thinking about what this means. When the FUD comes along about everybody losing their jobs, and I'm an artist, or you're a copywriter, or whatever it is: you still need someone who knows how to use these tools.

Right. And if you do decide to at least understand what AI is, there are so many different ways you can go into it, so many different applications, like we were just talking about. It is a world of opportunity for you. It should not be feared; it should be looked at as, how do I use this tool for myself and what I want to do? What's going to take me further? As opposed to trying to fight the technology, because you're not going to stop technology from improving. Government regulators and the like might be able to put up some artificial moat temporarily, but ultimately they can't stop the advancement of where the technology is going. It's a much healthier approach to say: well, in the free market, when you have something new, it's going to open up new types of jobs and new things for people to work on that we can't even imagine today.

Yeah. It's actually a really good thing.
So, oh, look at that: the next slide in our notes actually talks about that. There are going to be a lot of jobs that really benefit from people who know how to use it. So I'll just give another example. We had a friend who worked for Ford. His job was on the assembly line, and his shoulder was constantly injured because of the repetitiveness of the job. It was causing him so much pain. If that part of the job were automated by AI somehow, and he was able to do something different that the AI couldn't do, he would be a happy employee. And it's just like the cars: when people say, oh, you're going to take away the jobs of the buggy drivers. Well, buggy drivers had a really tough life. They were out there in rain and snow and cold and wind, and it was miserable. If they had a car, they got to sit on the inside; maybe they became car drivers. It's just a different way of looking at it. And also, as you mentioned, new AI-related professions will be created that we can't even begin to imagine. For example, 10 or 15 years ago, when our boys were really interested in Minecraft and all the little boys were on YouTube recording themselves playing, I was like, what are you doing? Your brain is going to melt into butter. This is a complete waste of your time; you need to go study something more important so you can get a better job. Then you get older, and suddenly you have all these young millionaires, and what were they doing? They were playing Minecraft on YouTube. Those professions, there was no way we could have anticipated that possibility.
So this is the double-edged sword of the impact in terms of society. Because if you're China and you want to monitor where people are spending their money... Like, I know I use a Garmin because I want to monitor my sleep. Well, if they have access to your sleep data, they know when you go to bed. They have your GPS, so they know where you drive. They have your financial records, so they know what you eat. It's really scary. And if you tie in license plate recognition and facial recognition, if you use all of that together, these tools of AI could be used in a 1984-type scenario. So this is actually another reason, one we really didn't even intend to get to in this episode, at least I didn't: we actually need people to be aware of these things so they can respond appropriately.
The government saying that they're going to protect us and make sure there's no harmful speech and other things: it's always going to lead to the opposite, just like the Patriot Act, where now we hear about all these different abuses. So there are risks of this tool being used in very bad ways, because it's just a powerful tool. And another thing that needs to be taught to our kids is this context; this gets back to the more freedom-oriented ideas of Bitcoin, and the Constitution, the different amendments, right? When we talk about the First Amendment, the Fourth Amendment, and other things like this. We just need to understand that this is a really powerful tool. It can do a lot of good, and we need to be aware that in the hands of someone who has different motives, different incentives, it could be used in bad ways. In that case, you need to be aware enough to protect yourself as best you can, and ideally speak up and stop that from happening.
Like, I don't want to see the same things in the US as in China.

It feels like they already know a lot about us anyway; we're already there. So I think it's good to have some level of concern. I don't think it should paralyze you, and I don't think it should make you freak out, but I think it's good to have a little concern.

Well, the thing is, as Jeff Booth mentioned in a podcast I listened to with Preston: even if you don't participate, they still know about you and they can infer who you are.

Yeah. So you not participating is only hurting yourself, and not being educated on it is hurting yourself and your children.

Yes. So we might as well stay ahead of the curve and stay informed.

A hundred percent.
And speaking of that, let's go to the next slide. For those that are listening, this is the exponential growth of AI. The technology is advancing so fast, and what we mentioned earlier in our show, that it's actually a decentralizing type of technology, is just mind-blowing. I just can't think of any other word to describe it. And as you get into the Bitcoin space, free and open-source software is a huge deal. If you look at the Nostr development, for example, and how fast that's going: well, the AI development is just exploding. And you can't get that genie back in the bottle. You just can't.

You can't hide from it.

You can't hide from it. And one of the ones that I'm really excited about: when we were at Adopting Bitcoin, I had the opportunity to listen to Aleks, Aleks Svetski, the guy who's developing Spirit of Satoshi.
You're going to have people who take the initiative, and they're going to use it in really good, powerful ways. And if you're not yet familiar with Spirit of Satoshi, please check it out; I think spiritofsatoshi.ai is the actual link. Essentially, what they're doing is building a language model that is based on things like libertarian ideas, Austrian economic ideas, Bitcoin ideas, so that when you go and ask it a question, you're not going to get an answer that talks about cryptocurrency in general. If you go to ChatGPT and ask that same general question, it's pulling from a lot of other things that are biased. So some would say, well, that's biased too. Well, yeah: whatever you feed your model, in terms of how you train it, it's going to reflect that. I don't think that's a bad thing.
And I am so excited that we are part of this history, where we're hitting this inflection point and this AI revolution is just getting started. Holy mackerel: it can pass law exams now, it can be a Go champion, it can do all these other things, and as far as we can tell, we're still at the most basic level of what you would call intelligence. It's not actually thinking; these are just the base models. It's really hard to imagine what this technology will be a year from now, or five years from now, or certainly 15 or 20 years from now.
Yeah, who was it that I heard this from? I think I was in a workshop with a marketing expert, and he was talking about AI. He said, we are so early, in terms of the people who were in the workshop using AI in their marketing campaigns. And he said, the thing is, because AI is evolving so quickly, if you don't catch up to the movement, the gap is going to widen at a speed that you can't even imagine. So stay ahead of the curve: that's basically our call to action today. And not just for you, obviously, but for the sake of your children: you really need to take the helm in how they're exposed, and not be afraid that, you know, if you were to introduce them to AI, they're not going to learn how to read and write because they'll just speak into the computers. Well, what if they didn't read and write, but were able to do something that we can't even begin to fathom? So just be aware of the fear, and know that whatever you decide to do, you're still in control of this tool.
Yeah, right. I agree with that call to action. You should start; you need to learn for yourself, though, and learn along with them. I remember an example from about, I don't know, maybe six months ago, on a different podcast, where Preston was talking about using AI to help build an automatic dog feeder with his son, or whatever they were doing at home. And I was like, what? I was just blown away. And it wasn't like you have Jarvis there, but if you're at home... I'm starting to do this now, and I'm only just starting: if I have an issue with something, instead of going to Google, I'll go to ChatGPT and ask. It could be a home project, it could be a programming project, it could be, I don't know, it's just fascinating.
So the call to action, as you were saying: keep your framework open-minded and try to learn this stuff. You don't have to go and get a degree or take formal classes; you just pick something and go play with it. So to me, the way I interpret your call to action would be like this: there are dozens of image-creation tools, basically language-to-image tools, out there now.
There are tools for videos. There are just general language models that can answer questions. Just pick one. Just pick one and go try it out, and actually just get some experience, have fun with it. Maybe align it with somebody's interests. If your kid is interested in a certain thing, if your kid is very visual, maybe that child would be better off with, you know, DALL-E or Midjourney or whatever the others are now. If it's someone who is very technical, maybe very engineering-oriented, well, pick a technical project: go build something like an automatic dog feeder and use AI to be your coach on it. What do I need to account for in the display, what do I need to account for in the design, et cetera. I mean, that's it, to me. I think it's going to sound overwhelming when you say how much stuff there is and how fast it's going.

This is just an intro.

I know: follow-up presentations. Right. But you're saying the call to action is, you need to get started on this, and what I'm trying to say, in a long-winded way, is it doesn't have to be the whole thing. Just pick something small and go play with it. There are a lot of free versions of things out there.

Are we going to list them here, or are we going to go into examples down the road? Because we're already at 50 minutes. Are we going to tell them to go use ChatGPT? Are we telling them to go use Beautiful.ai? There are so many out there.

I don't think so. I think today's purpose was the why: why to include it. Right.

I think we're done.

Okay.