Artwork for podcast Learning Bayesian Statistics
#145 Career Advice in the Age of AI, with Jordan Thibodeau
Business & Data Science • Episode 145 • 12th November 2025 • Learning Bayesian Statistics • Alexandre Andorra
Duration: 01:52:17


Shownotes

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Joshua Meehl, Javier Sabio, Kristian Higgins, Matt Rosinski, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire, Mike Loncaric, David McCormick, Ronald Legere, Sergio Dolia, Michael Cao, Yiğit Aşık, Suyog Chandramouli and Guillaume Berthon.

Takeaways:

  • AI is reshaping the workplace, but we're still in early stages.
  • Networking is crucial for job applications in top firms.
  • AI tools can augment work but are not replacements for skilled labor.
  • Understanding the tech landscape requires continuous learning.
  • Timing and cultural readiness are key for tech innovations.
  • Expertise can be gained without formal education.
  • Bayesian statistics is a valuable skill for tech professionals.
  • The importance of personal branding in the job market.
  • You just need to know 1% more than the person you're talking to.
  • Sharing knowledge can elevate your status within a company.
  • Embracing chaos in tech can create new opportunities.
  • Investing in people leads to a more engaged workforce.
  • Navigating corporate culture requires understanding your role and relationships.
  • M&A trends in AI reflect the evolving landscape of technology.
  • High compensation packages are not a new phenomenon in tech.
  • Career growth often requires stepping outside your comfort zone.
  • Soft skills are essential for effective communication in the workplace.
  • Understanding the dynamics of M&A can provide insights into industry trends.
  • AI is creating real economic value in customer service.
  • Speculative activity often overshadows real economic activity in tech.
  • Memorable M&A experiences can have a profound impact on people's lives.

Chapters:

10:39 The Impact of AI on Work and Culture

18:11 Understanding the AI Revolution

30:05 Career Advice in the Age of AI

38:08 Innovative Company Culture and Experimentation

41:04 Interview Dynamics and Performance Bias

42:29 Augmenting Work with AI and Learning

46:46 Navigating Organizational Boundaries

51:33 The Importance of Soft Skills in Tech

01:01:21 Mergers and Acquisitions in the Tech Industry

01:15:08 The Reality of Tech Salaries

01:18:18 The Impact of AI on Customer Service

01:20:36 Speculative vs. Real Economic Activity in Tech

01:22:33 The Cycle of Tech Booms and Busts

01:24:22 Heartfelt Stories from the M&A World

01:31:36 A Personal Vendetta Against Cancer

Links from the show:

Transcript

This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.


Speaker:

This episode is a journey through the modern tech landscape.

2

:

My guest, Jordan Thibodeau, went from being a passionate gamer to building a remarkable

career at Google and Slack, where he's helped shape teams, culture, and learning

3

:

communities inside some of the world's most influential companies.

4

:

Jordan is one of those rare people who can talk about AI, mergers and acquisitions, and

the future of capitalism, and somehow…

5

:

make it all feel deeply human.

6

:

In this conversation, we explore how AI is transforming work, what it really means to

adapt to constant change, and why investing in people is still the most powerful growth

7

:

strategy in tech.

8

:

We also dive into the chaos of the AI boom, the opportunities hidden in it, the massive &A

trends reshaping the industry and the reality behind those eye-popping compensation

9

:

packages.

10

:

Jordan shares how knowledge sharing, soft skills, and personal branding can accelerate a

career far more than a perfect resume ever could.

11

:

Whether you're navigating your first job in tech or rethinking your path in an AI-driven

world, Jordan's story will remind you that progress in work and in life always comes from

12

:

curiosity, courage, and community.

13

:

This is Learning Bayesian Statistics, episode 145,

14

:

recorded October 14, 2025.

15

:

Welcome to Learning Bayesian Statistics, a podcast about Bayesian inference, the methods,

the projects and the people who make it possible.

16

:

I'm your host.

17

:

Alex Andorra.

18

:

You can follow me on Twitter at Alex underscore Andorra, like the country.

19

:

For any info about the show, LearnBasedStats.com is Laplace to be.

20

:

Show notes, becoming a corporate sponsor, unlocking Bayesian merch, supporting the show on

Patreon, everything is in there.

21

:

That's LearnBasedStats.com.

22

:

If you're interested in one-on-one mentorship, online courses, or statistical consulting,

feel free to reach out and book a call at topmate.io slash Alex underscore Andorra.

23

:

See you around, folks, and best Bayesian wishes to you all.

24

:

And if today's discussion sparked ideas for your business, well, our team at PyMC Labs can

help bring them to life.

25

:

Check us out at pymc-labs.com.

26

:

Hello my dear Bayesians!

27

:

Today I wanted to thank Guillaume Berthon.

28

:

Yes, I can say the real pronunciation because Guillaume too is a French man.

29

:

So welcome Guillaume, bienvenue à Learning Bayesian Statistics.

30

:

I'll see you in the Learning Bayesian Statistics Discord.

31

:

Thank you so much for supporting the show on Patreon.

32

:

If you too want some Bayesian perks like access to the Discord, some monthly Q&A when you

need them,

33

:

and from time to time some books giveaways.

34

:

Well, you can get into that too at patreon.com slash learn based stats.

35

:

This is always extremely helpful and I'm always very grateful to anybody who can chip in

because I literally use this money to fund the show, contacting the guest, editing,

36

:

recording, hosting, then marketing and publishing the episode.

37

:

All of that is done.

38

:

Thanks.

39

:

you patrons.

40

:

thank you so much again Guillaume.

41

:

See you in the discord and in the meantime let's talk about career advice in the age of AI

with Jordan.

42

:

Jordan Thibodeau, welcome to Learning Bayesian Statistics.

43

:

Dude, happy to be here, and I couldn't even spell Bayesian, and I'm really impressed I was

able to like understand the words.

44

:

I'm really excited about that.

45

:

So happy to have you here. Shout out to Jesse, I've known this guy since I was 13

years old, and I always say Grabowski, then we screwed his name up, but he's an

46

:

awesome bro.

47

:

We go way back.

48

:

I met him on the internet

49

:

Um

50

:

We were playing Warcraft 3 on the forums.

51

:

I won't give our usernames, and I won't get us doxxed, and there's a lot of just crap out

there.

52

:

We were very nice though.

53

:

We talked shit to people.

54

:

And then he was just like, hey, I'm super smart, and I'm going to teach you all these

things in political science.

55

:

this is before I went to college.

56

:

was like, because we're the same age, never forget Campbell's Law.

57

:

Never forget that any statistic that you focus on as a goal, by doing that, you

undermine it.

58

:

And just remember that, Jordan, because you're going to see it in corporate a lot.

59

:

And I was like, what?

60

:

Never man, stop smoking crack.

61

:

And then when I came to Google, I would see people say, this is the metric we're being

measured on, hit this, and people would just do these squirrely things on the

62

:

side and undermine themselves.

63

:

So anyways, Jesse connected me to you.

64

:

So happy to be here.

65

:

Thank you for having me on the show.

66

:

Yeah, no, you bet.

67

:

I mean, I am super happy and excited to have you on today.

68

:

It's going to be uh an original episode that I think is going to be super helpful and

actionable for my audience who

69

:

as you know, is probably more technical than yours.

70

:

And we'll get to your show in a, in a few minutes.

71

:

But as I'm guessing, people can already hear Jordan is used to making podcasts.

72

:

He talks very well as you can hear, I really encourage people to go check out his podcast,

the SVIC podcast.

73

:

Because you talk about a lot of very interesting topics at the intersection of tech, AI,

74

:

uh HR advice, and this is really practical and actionable for the audience, I really

recommend it. So we'll put that in the show notes. If you're on YouTube, if you're

75

:

watching the episode on YouTube, we have the collab, a new feature, activated, so you will

see Jordan's YouTube channel on there. So yeah, welcome. Can you also, can you

76

:

also shout out the OnlyFans account?

77

:

It's called, it's called Feet in AI.

78

:

We have one supporter, it's Quentin Tarantino.

79

:

It gives me about $20,000 a month and all I gotta do is show my feet and read AI research

papers. It's fantastic.

80

:

No, it doesn't exist.

81

:

But it can exist if your users want it whatever the users want.

82

:

So thank you very much for introducing us and I appreciate it.

83

:

So where do we start?

84

:

Now you bet you bet.

85

:

And and yeah, thanks.

86

:

Thanks, Jesse for introducing us completely agree.

87

:

Jesse's

88

:

I think, the closest to a genius I've ever worked with and just been friends with.

89

:

you know, like each time you work and talk with him, he raises the bar for anybody

you have around you.

90

:

that's a blessing.

91

:

That can be a curse too, but that's mostly a blessing.

92

:

It's true.

93

:

And the thing with Jesse is he also, he's one of those weird...

94

:

uh Quants that like also has amazing people skills and could be in politics So he knows

how to like he knows where you are and he knows reference points of things that click in

95

:

your mind, and he can take the complex and simplify it to make sure you learn, and you're

also not put down by him because he has like no ego or anything. And he used to

96

:

He was mentoring me and he's both bro.

97

:

We're the same age, but in college he was trying to help me learn some concept and he basically

was like giving me Star Wars references and basically like, you know, you're in the Dagobah

98

:

system right now, Yoda

99

:

and I believe in you and you just need to lift the X-wing out of the swamp.

100

:

I believe in you, Jordan. And it was like super cool.

101

:

anyways, I usually talk a lot of shit about him in person, so I don't want to overly fluff

him up all the time.

102

:

So anyways, he's kind of a douche though too, but people have to learn that personally.

103

:

No, we're just kidding.

104

:

So where do we begin, my friend?

105

:

Yeah.

106

:

So last thing about Jesse, he was on the show, episode 124, people, and that's going to be

in the show notes.

107

:

Probably the lowest rated episode he ever had, right?

108

:

Did you like lose like 10,000 subs when he had him on there?

109

:

Yeah, it's tough.

110

:

Yes, yes, yeah, But it's got a huge half-life.

111

:

So actually I lost a lot of people because they weren't smart enough to understand JC.

112

:

then the smart people just flooded in.

113

:

then I earned even more subs, but just later so that when people had listened to JC and

understood, then they subscribed en masse.

114

:

Fantastic.

115

:

you basically lost the All In podcast.

116

:

uh

117

:

100 IQ people and then you ended up gaining like the uh Lex Fridman PhDs in there.

118

:

like uh excellent excellent trade in audience.

119

:

Well done.

120

:

Exactly.

121

:

Yeah, that was perfect for me.

122

:

So yeah, so today we're going to mainly talk about you, Jordan. Oh, enough about Jesse.

123

:

If you folks haven't heard about this episode, do listen to it because Jesse has a very

interesting background first and also he works on state space models, which are

124

:

a very powerful branch of models.

125

:

So let's keep it here.

126

:

What about you, Jordan?

127

:

So first, yeah, maybe tell us where are you joining from right now?

128

:

What are you doing nowadays?

129

:

how do you join?

130

:

Yeah, joining from meth lab actually.

131

:

So this is actually a Ghibli of my dad and his favorite dog.

132

:

and everything.

133

:

Her name is Josie.

134

:

When she ruffs, she rolls her Rs.

135

:

Like she's Mexican, even though she's Australian Shepherd.

136

:

She's got a bit of a complex.

137

:

She's like, he's part Mexican, so I'm going to appeal to him.

138

:

I'm a good executive.

139

:

I'm like, stop it.

140

:

So I was born and raised in the Bay Area and eventually got a job at Google.

141

:

Spent about 10 years at Google.

142

:

I worked eight years in mergers and acquisitions.

143

:

And then after that, Slack reached out.

144

:

was like, hey, do you want to rebuild?

145

:

like, what do you pay?

146

:

And I got out like what?

147

:

use emojis like a 13 year old girl on here.

148

:

I'm like a man, but pays good.

149

:

So then I joined and I was like, I'm never going to a small company.

150

:

I'm never going to a big company again, because I joined Google at 30,000 heads and they

grew to like 200,000 heads and it would became, it was like a fast moving like big

151

:

company.

152

:

And by the time I left, it was like a fast moving government, which is not very good. Still a

great company, fantastic experience, great people.

153

:

10 out of 10, would do it again, but I was looking for something smaller where I could get the

impact.

154

:

and I wanted to build, so Slack was about 4,000 heads.

155

:

And for people who are not in the Valley, 4,000 is pretty big.

156

:

Way before I worked there, I used to work at a factory in Silicon Valley from the age of 12,

working weekends every now and then, and then once I was over 18, I started working there

157

:

full-time, and that was like 10 or 15 people, so I went from blue collar to white collar, and

that was really cool experience.

158

:

But 4,000 is considered small, I guess, kind of in the Valley, and so it was nice, and then

within 90 days,

159

:

mark uh...

160

:

Marc Benioff was like, haha, no you don't, and acquired us. And so I was like, no, back in the beast, but

we got a twenty-seven billion dollar valuation, so in that case, stock goes up, I like it. And then

161

:

uh...

162

:

you know, the acquisition process, where you're waiting for the federal government to

approve your deal, you can be sitting there for a year, but at the same time I was still doing

163

:

acquisitions for Slack but couldn't tell Salesforce, because that's called gun

jumping, we don't want to go to jail, until the deal closes. And eventually

164

:

You know when you find your first true love?

165

:

I found mine October 30th, 2022 and that was when ChatGPT came out.

166

:

I saw that and I was like, oh my god, like this thing can do script kiddie work for me.

167

:

Most of your audience is like, I invented Johnny-Five and I do all this coding and I

manage a database of a hundred thousand lines of code.

168

:

And most of the rest of us are like, we're trying to get a spreadsheet to connect to a Doc and

produce a PDF.

169

:

And so like.

170

:

I learned how to do that myself manually and then it took me months to learn how to do

that before ChatGPT and then I went to ChatGPT and I was like, this is what I'm trying to

171

:

do, can you build something?

172

:

And it just spit out the code in like 15 seconds.

173

:

I was like, God damn it.

174

:

So I got excited about that.

175

:

And then my cohost, he's a VP of Eng, he used to work at Carta and Wealthfront and he's an

engineering director at Google.

176

:

And he's one of those like Mensa level brain people like Jesse who like reads everything

from biology to tech.

177

:

to AI, to science, to history, to psychology, and you can just have a conversation about

whatever with him, but he's also a really cool person, and he also knows about hip-hop, so

178

:

it's like perfect fit.

179

:

He was like, yeah, it's ChatGPT, let me teach you how it works, and I'm just gonna send

every AI research paper to you, and then eventually we started our podcast.

180

:

In the meantime, I said, hey, is there a way that we can connect ChatGPT to HR databases

and basically answer customer service HR questions of like, how do I even log into my 401k?

181

:

Hi, I'm from France, I've come to America, like.

182

:

what the hell is wrong with your healthcare system?

183

:

And so those type of questions.

184

:

And so we built a proof of concept that I wrote a white paper in Salesforce, basically

explained like, we're using this thing called MoveWorks, which was a semantic search bot,

185

:

which was better than keyword search, but it was not as good as ChatGPT with embeddings,

God tier.

186

:

And so uh I wrote a white paper on how we could save millions of dollars if, just

internally at Salesforce, they would just build this bot out.
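(A quick aside for readers: the following is a minimal sketch, not code from the episode, of the kind of embeddings-plus-retrieval HR bot Jordan is describing, assuming the OpenAI Python client. The model names and the toy HR snippets are illustrative placeholders.)

    import numpy as np
    from openai import OpenAI

    client = OpenAI()  # expects OPENAI_API_KEY in the environment

    # Toy internal HR snippets (made up for illustration).
    hr_docs = [
        "401(k): enroll and log in through the benefits portal; contributions are pre-tax.",
        "PTO: submit time-off requests in the HR system at least two weeks ahead.",
        "Expenses: file reports within 30 days of the purchase date.",
    ]

    def embed(texts):
        # One embedding vector per input string.
        resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
        return np.array([d.embedding for d in resp.data])

    doc_vecs = embed(hr_docs)

    def answer(question):
        # Retrieve the closest snippet by cosine similarity, then answer from it.
        q = embed([question])[0]
        sims = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
        context = hr_docs[int(np.argmax(sims))]
        chat = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system", "content": "Answer the HR question using only the context."},
                {"role": "user", "content": f"Context: {context}\n\nQuestion: {question}"},
            ],
        )
        return chat.choices[0].message.content

    print(answer("How do I log into my 401k?"))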

187

:

And at the time when ChatGPT came out,

188

:

We literally had the VP of AI at Salesforce like, this thing sucks, it's overrated, it makes

mistakes, and blah, blah.

189

:

And a lot of engineers too were up in arms, like yeah, this thing has privacy concerns,

and blah, blah.

190

:

But a lot of it was like, there's a degree of envy and fear.

191

:

People never cop to fear or envy.

192

:

It's very easy.

193

:

Like when I worked at the machine shop, I was also going to college at Santa Clara

University and community college at the ends before then.

194

:

and I wanted to become a hedge fund manager at the time.

195

:

That's another long story.

196

:

But I read a lot about free market stuff, and I'm kind of a free market guy.

197

:

But I would always hear people say, oh yeah, people in the factories lose their jobs

because of trade, free trade, also technological disruption.

198

:

That's a part of it too.

199

:

And those moms who work at the factory and they have two kids, all they gotta do is

retrain and go work at Goldman Sachs and become a partner and become a quant.

200

:

It's like, that's not how real world works.

201

:

And so a lot of those engineers who would read Thomas Friedman's The World Is Flat and crap like

that would say to those people, yeah, just go retrain yourself.

202

:

But then when ChatGPT came out and started producing some scripts and some code, a lot of

them were like, that's coming for me, to a degree.

203

:

To be clear, our show, there's three things that we, three groups we don't really like.

204

:

The AI dreamers who are like, AGI tomorrow, post-labor economics.

205

:

The doomers.

206

:

AI doomers, like, this P(doom) score's gonna kill you all, like Yud and all those other

kooks back in...

207

:

What we do is like triangulate to people who are operators, who are in the field, who want to know

what is real and what is good and what can be deployed, and so by doing that you end up

208

:

getting all three groups to hate you so yeah eventually you know I wrote my white paper

and I tried to push to get this thing launched

209

:

And what happened was that the organization wasn't ready yet, but eventually that became, in a

different form, Agentforce, which is a little bit overhyped.

210

:

But uh one thing that your viewers probably can relate to and maybe mention in the

comments section, how many times did you see fresh tech that you actually saw that could

211

:

do something better for your organization, but your organization was not ready yet

culturally or for whatever reason?

212

:

So it's like that timing aspect and luck.

213

:

So anyways.

214

:

uh

215

:

I did that for about three years.

216

:

We were able to get ChatGPT approved for all of HR and Salesforce.

217

:

I led that initiative and then we got the semantic search bot going through all of

Salesforce and I met the CEO, guys, this company's called MoveWorks, and I was advising

218

:

him, like, can you just integrate ChatGPT, because this thing makes your bot look pretty bad.

219

:

He was like, sure, but we're gonna double the price of what we charge you from X millions

to X millions more.

220

:

And I was like, that's a first class ticket for your startup to go to nothing.

221

:

And eventually they,

222

:

had to sell to ServiceNow for not a good price.

223

:

So anyways, uh Joe and I started a podcast and that was a good opportunity for me to take

a break from corporate, take care of my dad for two years, who's in remission from cancer

224

:

for knock on wood 10 years.

225

:

And so doing the podcast on the side for two years has been great.

226

:

And that's kind of where I am.

227

:

So hopefully people will start listening.

228

:

probably, hopefully if they are, like and subscribe to Alex's show, it's really good.

229

:

So there you go.

230

:

Yeah, yeah, yeah, no for sure.

231

:

definitely encourage people to give it a listen because um you have a bunch of insiders

from Silicon Valley, from the tech world and at a time where this world seems to evolve so

232

:

fast I think it's extremely valuable to have this kind of outlook and people who really

know what they are doing very deeply and

233

:

I really like what you were saying that you are trying to thread the needle in between the

three extremes of this topic and I think it's the most valuable position to be in at least

234

:

for me you know to gain insights as the most scientifically as I can do so and maybe we'll

get back a bit more into your origin story a bit later but I'm sure since we're already

235

:

talking about that a bit

236

:

What's your current vision right now from making the show, from knowing everything you

know about the tech world?

237

:

What's your current vision of where the AI revolution currently is?

238

:

How do you see it blended with the different jobs, especially in comparison to the

numerous fears that we were seeing at the beginning, which were like programmers and...

239

:

scientific people fearing for their jobs?

240

:

What a great question.

241

:

also shout out to your question, and I shout out to your show too, because I feel like if

you listen to enough episodes of your show, you actually get a marketable skill that can

242

:

actually get you a good job and make money.

243

:

like, people from my show who are listening, like, please sub this guy, he's super duper

smart.

244

:

And so, what a great question.

245

:

I think where I could start with it first is I still think we're in the first inning of

this whole entire thing.

246

:

People have attention spans that are, and myself included, that are going shorter and

shorter as time goes on.

247

:

I find myself, it's harder for me to sit down and read books.

248

:

I used to be able to sit and just bang through books, and now it's like, what's on

Twitter?

249

:

What's this blog post say?

250

:

Do I get too long to even read?

251

:

And so, um, the internet right now is directing the news cycle.

252

:

in various platforms, millennials now have arrived and they're reporters and we get most

of our information from the internet and that then sparks what our conversations are going

253

:

be.

254

:

And then also most startup founders, most captains of industry, politicians are there.

255

:

And the thing with the internet though is it suffers from the issue of the silent

majority.

256

:

So what happens is you have the 1 % on either spectrum that are basically framing the

conversation and you'll get 1 %ers, not just on political spectrum, that can just be super

257

:

hardcore tech nerds.

258

:

and they can be so ahead of, on tip of the spear, bleeding edge, that they already assume,

like right now, everyone's using LLMs and everyone's using ChatGPT and everyone knows

259

:

about RAG search and everyone knows about Perplexity and everyone knows about AI agents.

260

:

When I was talking to my friend who works at a startup, 50 % of the employees at a startup

don't actually use AI in their work.

261

:

And just like, what?

262

:

What's going on here?

263

:

It's like, you're full of, like, Gen Z's and millennials.

264

:

Again, people just don't use it.

265

:

When I was...

266

:

teaching people how to use ChatGPT in Salesforce.

267

:

This is almost seven or eight months after ChatGPT was launched.

268

:

I was going around, people were having me speak to teams.

269

:

VPs were like, you know what's going on, can you talk to our teams about this?

270

:

And people, I'd ask them, have you used ChatGPT or have you used Gemini or anything?

271

:

And they'd just be like, no.

272

:

And I'm like, well, wait a minute, your whole entire work is writing projects and either

sending memos and emails and things like that, and you haven't even touched this thing

273

:

that can automatically generate those?

274

:

I know.

275

:

Delve, delve, delve, dash, dash.

276

:

It isn't this, it's actually that.

277

:

But there still is a lot of things that can help you out with your comms.

278

:

And so I still think we're in the first inning.

279

:

I also think that uh people are looking at what's going on in the hardware cycle and

they're trying to read the tea leaves and say, well, NVIDIA now is doing dice here and

280

:

dice here financing regimes to sell more GPUs, which makes sense.

281

:

They could use all their money to say, you know what, let's go get into the foundry

business with TSMC.

282

:

But Jensen Huang knows that's a stupid idea because that's hundreds of billions of capital

expenditure and it's going to take years to see a benefit.

283

:

And by the time he actually gets his chips out, the whole entire market could cave in and

now he's wasted a lot of money.

284

:

So instead what you do, you invest in other companies that are buying your GPUs so that you

can keep incremental sales increasing and that you can start getting more GPUs retired so

285

:

you can sell newer GPUs.

286

:

But here's the thing.

287

:

The problem with his business is

288

:

It's always a hardware cycle, so there's only so many physical products that you can put in

people's hands.

289

:

Eventually you just say enough.

290

:

Like my grandmother, like, I used to come to the house looking like

Mowgli from Jungle Book, very emaciated, and I would stay the summer with her, and she was

291

:

typical Latina grandmother who was like, you're so skinny, you're eating five meals a day.

292

:

You're gonna get tamales, enchiladas, you're gonna get homemade tacos, we're gonna make Taco

Bell cinnamon twists the real way, buñuelos, but better, then we're gonna do churros,

293

:

and then you have horchata.

294

:

and I would leave the house looking like Winnie the Pooh with big old man boobs and my

stomach coming out.

295

:

And eventually it's like, no más, no más grandma, no more.

296

:

And the same thing with GPUs at a certain point.

297

:

It's like, people are going to have a good enough amount where they could buy more, but

it's like, do I really need to buy more at a certain price point?

298

:

Now I'm not saying it's the end of the future where tech gets good, we don't need new

tech, but it's just more people don't need to buy as many at certain volumes.

299

:

And Jensen saw that happen when the crypto boom happened.

300

:

Everyone needed GPUs and then the crypto miners figured out a way to use ASICs, like just

basically dedicated chips specifically for Bitcoin and then it just crashed his market

301

:

out.

302

:

And then he literally said, we got lucky with ChatGPT, it just pulled us out of nowhere.

303

:

So he's thinking to himself, how do I keep that, the demand side, going and not focus so

much on the supply. And TSMC is thinking the same thing too, they didn't want to put

304

:

their neck out there, increase supply and then it turns out we go into a brutal.

305

:

hardware oversupply which leads to deteriorating margins.

306

:

So Jordan, where am I going with all this?

307

:

What I'm trying to say is, once the hardware bubble hits capacity, which always happens in every

technological bubble since like the steam engine to rail to infrastructure in the United

308

:

States for like skyscrapers and highways and things like that, we then are going to see

the input prices for AI go down, which then will benefit the software stacks such as

309

:

OpenAI.

310

:

and that will lead to them giving an opportunity of having higher margins that they can

then uh give deeper discounts to different enterprise customers and expand wall-to-wall

311

:

integration of their um software so they can expand more.

312

:

Or, uh, now it's gonna lead to fewer AI labs being out in the market and less

competition for labor, which then means they can pay lower labor costs and no more of

313

:

these 50, $60 million pay packages.

314

:

That was the market landscape.

315

:

Now the product landscape, when the stuff came out, everyone was selling, overselling AI

agent this, AI agent that, blah, blah, blah, and it's gonna be autonomous agents taking

316

:

your job.

317

:

But what I always like to do is look at basic use cases in corporate America.

318

:

Like, can you file an expense report?

319

:

Can you book a meeting for us regularly?

320

:

um Can you set up a project for us?

321

:

Can you organize an event for us and not hallucinate?

322

:

And for my metrics and use cases and people I'm speaking to internally at companies,

323

:

The autonomous agent story wasn't there.

324

:

The way I looked at it is as the intern story.

325

:

An AI LLM is basically you're getting an intern, 23 years old from Harvard who's very

sheltered, has helicopter parents, gets nervous about saying anything inappropriate, can

326

:

sometimes do amazing things, but when it comes to the last 5 %, the last mile you need

done, it's not there yet.

327

:

I was on a plane ride with a guy who works at an autonomous car company.

328

:

He said, you know, I will use these things to code all day long and it's fantastic.

329

:

And it will come out with amazing ideas and pull the right libraries for me for certain

things.

330

:

And then some things are just super easy layups.

331

:

Instead of giving me an elegant 30-lines-of-code pull request, which is great,

332

:

it gives me 400 to 700 lines of some monstrosity.

333

:

And so that's okay for him because he knows how to look at this and what real code is.

334

:

But he also knows we're not there at the point of like set it and forget it for uh robust

jobs engineers are seeing in their corporate jobs.

335

:

Maybe for us script kiddies, set it and forget it for like things I've mentioned, but

still it has hiccups.

336

:

So where I see things going is, it's going to continue to augment labor as like an Iron

Man suit, uh but we're not in the situation of like post-labor economy.

337

:

Now there are certain sectors that get hit harder by it than others.

338

:

And one sector that's definitely gonna get hit hard is customer service.

339

:

That's why I was pushing so hard for the integration of ChatGPT, because I could see that

it could increase our case deflection rates.

340

:

Case deflection rates are just fancy ways for HR people to sound important.

341

:

But anytime Alex is like, hey, I'm getting paid so much money in my job, I don't know what

to do, and how do I log into my 401k?

342

:

He files a ticket, which is a case.

343

:

And you can use ChatGPT to kind of answer some of those questions and lead to maybe 50 %

of those tickets not actually hitting a human.
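(Aside: here is the back-of-the-envelope arithmetic behind a case deflection rate, with made-up numbers; the ticket volume and per-ticket cost below are hypothetical, not figures from the episode.)

    # Case deflection rate = tickets resolved by the bot / total tickets filed.
    total_tickets = 10_000        # HR tickets filed per quarter (hypothetical)
    deflected = 5_000             # answered by the bot, never reach a human (hypothetical)
    cost_per_human_ticket = 15.0  # fully loaded handling cost per ticket, in dollars (hypothetical)

    deflection_rate = deflected / total_tickets
    savings = deflected * cost_per_human_ticket
    print(f"Deflection rate: {deflection_rate:.0%}, quarterly savings: ${savings:,.0f}")
    # Prints: Deflection rate: 50%, quarterly savings: $75,000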

344

:

So anyways, I've rambled for very long.

345

:

time.

346

:

I hope I answered your question.

347

:

Don't forget to like and subscribe.

348

:

Bayesian, Bayesian learn Bayesian statistics.

349

:

It's fantastic.

350

:

Go for it.

351

:

No, yeah, that's that's perfect.

352

:

I think it's great.

353

:

It gives me a lot of doors that I want to open.

354

:

Yeah.

355

:

And so on the last point you made, basically, on how to use, um, how to use

GenAI products in general to

356

:

augment productivity, um, you had lately a very good episode that I'll put in the show notes,

I don't remember the name of the guest, I always forget it, but I talked to you about that

357

:

on WhatsApp. Yes, Suly Omer, that's the episode with Suly Omer, I'll put that in

the show notes, I have the link here. Um, I think this episode is great because you guys lay

358

:

out the, you know, the state of AI and also how Suly is using that

himself in his work.

359

:

That was interesting because that was resonating with a lot of my personal experience, um

where most of the time it will get me the first 80 to 95 % and then I'll need to take back

360

:

control and basically uh knock that out of the park.

361

:

But that's amazing because that allows me to to ship things much faster than I used to.

362

:

be able to because before I had to do everything almost from scratch.

363

:

Whereas now there is a lot of repetitive aspect or like, yeah, low intellectual value at

the very, very beginning or mostly in the middle actually of modeling work that you can

364

:

just automate with these products.

365

:

And then you'll knock that out of the park, which is great.

366

:

An interesting point I found in your episode with Sully was that it's actually very good

if you are an expert on what you're using already.

367

:

And so mostly if you're senior in one dimension of your work, that's going to be awesome

because that's going to give you the Iron Man suit package basically that you were talking

368

:

about.

369

:

The problem is when

370

:

you're very junior and you're fresh in all the dimensions, then it can be harder.

371

:

And probably my advice here would be get really good at some dimension of what you're

doing.

372

:

And then you'll use these tools to get even better, but you need to get very good at some

dimension so that you're attractive to a business in the first place.

373

:

And so I think it's also a good place for us to transition here where

374

:

Like I'm also curious to hear what your thoughts are.

375

:

For instance, if people in my audience, m you know, they are working in companies where

either AI is arriving.

376

:

So that means either they are using like off-the-shelf tools, like they get access to Claude

Code, Claude, Gemini, uh ChatGPT, whatever.

377

:

They need to be able to use that internally or

378

:

they need to come up with their own fine-tuned LLMs or they really like these new models,

this new way of doing things and they want to get into the core of the reactor and so get

379

:

a job at Google, at OpenAI, things like that.

380

:

What would you recommend to these people in my audience that they do from an HR advice

perspective, which is a hat you've worn uh during...

381

:

a lot of years.

382

:

Oh, great question.

383

:

And to, like, reframe: if they wanted to get into a top-tier firm to work on

either training models, or work at Google and become a product manager and deploy these

384

:

type of systems.

385

:

Yeah, let's start with that.

386

:

And then I think there is an interesting question that is going to be: what if they

want to stay in their company, but, you know, augment the work with

387

:

with these new tools.

388

:

But I think it's kind of two different use cases.

389

:

So let's focus on the first one you just talked about and then we'll dive into the other

one.

390

:

Excellent.

391

:

So uh great question.

392

:

So getting into these top tier firms, um these companies get a ridiculous amount of media

attention and applications coming in.

393

:

And so it's kind of like trying to get into Harvard or Stanford or something like that

everyone wants in.

394

:

Now, the first tip I have for folks is

395

:

You might have the rare bird who applies directly through the front door and goes through the

screening process and goes through sourcers, and a recruiter sees their resume and gets

396

:

plucked out and has a conversation and goes to a hiring manager, goes through interviews

to get a job.

397

:

There's rare birds who get that.

398

:

Most mortals, the way they get in is they know someone who works there and they say, can

you look at my resume?

399

:

And, sorry, hey, can you refer me to this job here?

400

:

uh And here's a link to it.

401

:

And that...

402

:

That increases the odds at least that you're gonna skip some of the steps of going through a

sourcer, or at least it forces that recruiter to actually look at the resume that's

403

:

sent to them. And the person that you've asked for a referral, if they know you well and you've worked

you've worked with them They can write a nice little note saying, you know, I've worked

404

:

with this person for X Y & Z you should really consider looking at them and

405

:

That will put you in the front of the line.

406

:

Doesn't guarantee you'll get a job there, but definitely helps.

407

:

Now that's for just rank and file.

408

:

I want to work in engineering and product.

409

:

I'm not building AI models or doing anything super bleeding edge.

410

:

uh Skills that are well known and seen in medium sized uh companies and small companies,

which relatively would be easier to get into.

411

:

Then the next thing, which is AI research.

412

:

There's only so many people who can be in that sphere.

413

:

met some people who are elite in that area who are even who are AI researchers at Google

and Facebook and they even have trouble getting into the right jobs in other organizations

414

:

or top-tier startups like OpenAI right now because there's only so many people who can

work at such a level have the track record and have the ability to do so.

415

:

I will say though, if you want to get referred into jobs at Google, Facebook, the rest of

these larger tech companies, for more standard roles, it always helps when you go

416

:

through referral.

417

:

Send that LinkedIn message, but have it set up so easily that the person you're asking to

refer you, all they have to do is Ctrl-C, Ctrl-V, and get the referral into their

418

:

company's internal HR database.

419

:

If you're like, oh, let's get a coffee chat, and blah, blah, blah, and beat around the bush,

no, let's catch up, you're wasting that person's time.

420

:

It's best to just go straight for the heart of the matter, like, this is where you can help me,

well, this is what I'm looking for.

421

:

I'm looking for this role, can you copy and paste this message you can send to the

recruiter?

422

:

Here's my resume.

423

:

in link form, not a PDF, so they don't have to upload anything.

424

:

They can just put the link in wherever and person can click it and see your resume.

425

:

And then also I'm putting my name and my email, my phone number and my uh area code or my

time zone.

426

:

Just make it as easy as possible for them to get into the applicant tracking system.
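(Aside: a hypothetical example of the copy-paste-ready referral request Jordan is describing; every name, link, and detail below is a placeholder.)

    Subject: Referral request, Senior Data Scientist (job ID 12345)

    Hi Maria, could you refer me for the Senior Data Scientist role?
    Job link: https://careers.example.com/jobs/12345
    Resume (link, nothing to upload): https://example.com/alex-resume
    Contact: Alex Example, alex@example.com, +1 555 010 0000, CET time zone
    Note you can paste for the recruiter: "I worked with Alex for two years on data
    projects at Acme and recommend considering him for this role."
    Thanks so much!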

427

:

Those things are very little but they mean a lot, because the person receiving it is

constantly getting these types of requests and a lot of people don't know how to actually

428

:

formulate their thoughts properly or actually communicate properly.

429

:

And I've been on the receiving end of this a lot and I love helping people but sometimes it's like,

what do you want me to do here for you?

430

:

I'm confused and you have my attention now you're wasting it.

431

:

Whereas an assassin-like salesperson or a top-tier referral, that's exactly what you need

to see to get through there.

432

:

regarding your other, and another thing is even if you get into these interviews, these

jobs, like there's always like these different boot camps and things like that.

433

:

I don't wanna get into that.

434

:

What I will say is,

435

:

If you've gotten into the phase where recruiters say, let's get a phone call

going, or a hiring manager wants to do a phone screen, it's basically them saying to you

436

:

implicitly, you have all the qualifications to get this job.

437

:

And the rest of it is, once you go through an interview loop, are you going to say

something stupid or just completely have your head explode, which leads to them giving you a

438

:

reason of not giving you a job.

439

:

So, and also there's an aspect of...

440

:

there is randomness to the interview process.

441

:

We like to act like there's a science of interviewing.

442

:

We actually had a team at Google who would just go through people who were not given

offers or were rejected, and really dig into the feedback and say, wait, actually this person has

443

:

a very awesome track record, maybe there's something we did or they're just having a bad

day and get the right interviewers in there.

444

:

People would come back, get a job from Google, and then they would track their performance

compared to people who originally got accepted.

445

:

And the people who originally were rejected ended up outperforming the people that we

originally thought were good.

446

:

which then showed us that actually, this whole interview thing is, if anything, like

performative work or maybe at best making sure we don't have like a complete fraud or ax

447

:

murderer in there, like what really matters is like what is your track record and what

have you done in the past and maybe who's vetting you.

448

:

That's why, and go ahead and jump in, I can ramble on more about it.

449

:

no, I think...

450

:

Yeah, so, and you'll continue for sure on this train of thought, but yeah, just

wanted to say first.

451

:

Yeah, thanks.

452

:

I think it's super valuable and actionable to hear you say that for people.

453

:

uh Because yeah, as a candidate, you can get completely lost in all these

applications and these processes.

454

:

so knowing how it works from the inside, what people like you in

455

:

inside are looking for is extremely valuable because uh it makes the candidates not waste

time both for the recruiter and for themselves which is very important and I am also super

456

:

impressed that you guys did that at Google this is great I think it's a great signal from

a company to be able to look into what they've done who they rejected and then be able to

457

:

say hmm actually we didn't have a really

458

:

good legitimate reason to um reject that person.

459

:

um Let's reach out to them again.

460

:

Right.

461

:

Yeah.

462

:

I never heard of that and that's amazing.

463

:

Yeah, it was great because Google, I mean, every company goes through like a golden age, and

you only know you were in a golden age in your life or at a company after it passes,

464

:

because you start re-looking at it and say, wow, like I was in a really good time, but at the time

I was so focused on minutiae that I didn't realize I was in a golden age.

465

:

We might look at this past, the last 10, 15 years in stock market returns.

466

:

It's like a golden age where it was like you just invested in anything that had cash flow and

you were winning.

467

:

And then we might go through what my great grandpa and grandpa went through, a depression

where, like, for 20 some odd years, 30 years, the stock market was just nothing.

468

:

And we'll look back and be like, man, we had it really good.

469

:

And so that was a golden age of Google where you had a lot of people who just didn't

have egos, didn't think they were captains of industry.

470

:

understood the culture of what Google is.

471

:

a gigantic experiment on that page.

472

:

We're running all these experiments.

473

:

We don't know the answers.

474

:

We're going see what fits and go from there.

475

:

And so why don't we run our company the same way?

476

:

And we're doing that for interviews.

477

:

Now another thing is I was in M&A, Mergers and Acquisitions.

478

:

I didn't go deep enough into that.

479

:

But I would have situations where we have a team of 300 engineers and product managers,

top tier people, genius people, working on a certain product and it wasn't working.

480

:

And the VP would say, hey, uh Jordan and team, go out there and find a company that's

doing this better.

481

:

and we would be bringing in a five person team of like these people who had like just random

backgrounds, some didn't go to college or whatnot.

482

:

And they built a product that was just cracked.

483

:

That was just like crushing it for barely any payroll.

484

:

And they had the market and we would then bring them in for interviews.

485

:

Like they want to do the deal and we had to do interviews to make sure they're good.

486

:

And I would have to sit with just like these people.

487

:

Did you ever play Warhammer 40K at all by any chance?

488

:

You know what Warhammer?

489

:

The video game, the little miniatures, it's a miniatures game and it's a computer game.

490

:

There's Warhammer and there's Warcraft, I love Warcraft, me and Jesse played RTSs all the time. I was

actually ranked globally twenty-two in Warcraft 3, 2 versus 2 random team, to the

491

:

point where, if you're in the West Coast ladder, that's where the real men play. I'm sorry, I don't want

to talk to you about East Coast or Europe, but like, when I would play East Coast or Europe

492

:

it looks like the JV league, like, okay, I'm just gonna like maybe play with one hand behind my

back and play against people. West Coast was cracked. You know why it was cracked? Because at

493

:

around um

494

:

4 o'clock 5 p.m.

495

:

is when the Koreans came online and they were the first in the 90s to make gaming a

professional sport you can make money off of and they used to fill stadiums of kids

496

:

watching top players like SlayerS BoxeR and things like that in the 90s play StarCraft and

these people were great and they made money, but unfortunately some of them, well, not

497

:

unfortunately there's mandatory military service in South Korea and so they would have to

end their career to defend their country, which is fine and whatnot. So anyways,

498

:

the Koreans still have the same ethos in the gaming world, and so all the Koreans logged

onto Warcraft 3, and then they would accept me to play with them once I got high

499

:

enough, and they would call me gosu, like I was pro, and I had, like, some of my old

replays on the website, it's down now, but I have my replay files of me playing the Koreans,

500

:

and it was like fantastic. And because I'm an idiot, all hindsight bias, I was like, who knew,

if I'd truly pursued this instead of quitting to go get a political science major, like a few years later

501

:

what

502

:

RTS gaming became big and becoming pro became big and people were making good money.

503

:

anyways, uh I digress on all that and where I was going.

504

:

But what would happen is we would get this guy who would come in through a deal, come in

through an acquisition, who was super good.

505

:

And then I'd get this person who was like a Warhammer 40K uh psyker, who are these people

who have like, they have the telepathy skills and they're geniuses, but they're like

506

:

demented because they're trying to prevent themselves from chaos coming in.

507

:

And that's what like a senior engineer at Google is.

508

:

He's like, my mind bears a great pain because I'm like just so smart and they're just

cracked and they can do great things.

509

:

And I have to sit and tell them like, look, I'm in M&A, I'm a script kiddie.

510

:

But just in this interview, just because this person can't just like do MapReduce or

understand perfectly different auditing tables in their mind or code things from scratch

511

:

and know how to do logarithms instantaneously and matrix multiplication like

instantaneously doesn't mean they're a bad engineer.

512

:

We brought him in here because your VP knows that this product, if it gets into Google, is

going to be fantastic.

513

:

Can you more focus on their experience of what they've done and judge them on that?

514

:

And within two interviews, we would know if the person was good or not and decide to hire

them.

515

:

And so to bring this all back to what I started with in the beginning, a lot of interviews

is performative work.

516

:

And another thing is we have main character syndrome.

517

:

So someone who's really good in your audience might go deep into Google interviews and

then not get the offer.

518

:

And not think, wait a minute, you crushed it, but there were three other people who were

interviewing for the same job, and maybe the guy who got it over you had just 10 more

519

:

years of experience, or maybe the guy, the girl who got it over you had less experience,

but was really good friends with people in the interview loop.

520

:

Those are things you just can't control.

521

:

And I think from your audience focusing on statistics, hopefully they don't fall, I mean,

it's easy for them to see when other people fall prey to statistical...

522

:

uh

523

:

mistakes of thinking that, like, the edge case represents the whole entire population.

But when you're in the driver's seat and focusing on something that you want, all that

524

:

logic goes out the window and you focus more on, like, your own emotions. Like the whole

saying of, uh, if a lawyer defends himself, he has an idiot for a lawyer and also a fool for

525

:

a client. And it's kind of what happens when it comes to statistics. Uh, now connecting

it to

526

:

People saying in their company, how do I augment myself with AI and like what should I be

doing?

527

:

How do I get started?

528

:

I think the first thing is Learning Bayesian Statistics, watch the show, see who he has on, and

learn, read about it. Then, people think that like to be an expert you have to have a PhD in

529

:

some topic or whatnot. No, to be an expert, you just need to know 1 % more than the person

you're talking to. It's just, do I know a little bit more about the subject than you, relative to

530

:

this person in this room? Not,

531

:

Do I know relative to the world?

532

:

No.

533

:

Relative to the world, my knowledge of AI, maybe I'd say middle of the pack, maybe.

534

:

I'm not gonna say top tier, but I'll say middle of the pack.

535

:

But then if you get me in a group of HR people, I would say I'm probably higher in the

pack.

536

:

And then if you get me in a room of people who are PhDs in AI, I would say I'm a novice.

537

:

So when it comes to your own company of whatever size it is, if you want to learn more

about these tools and things like that, the first thing you do is stay up

538

:

on the topic, read about what other people are doing, hear about stories, and then share

those resources inside your company.

539

:

And what starts happening is, this was happening at Salesforce, I was just a random HR

guy, and before I knew it, I was talking to the presidents of the company and got mentioned,

540

:

and my project got mentioned in one of our Wall Street quarterly calls.

541

:

It all happened because I was just known as a guy in Slack who was just sharing stories

about AI all the time.

542

:

And then even to engineers at Salesforce who were principal engineers,

543

:

who would then come to me and ask me questions.

544

:

And I was just like, what's going on with this world right now?

545

:

this age, I'm an age, I'm the M &A guy.

546

:

You should be talking about tech.

547

:

No, no, no, like what you saying?

548

:

Embeddings?

549

:

And you said something about alignment and pre-training and post?

550

:

I was like, I stayed at a Holiday Inn Express.

551

:

But that's also a sign of when a new novel, important tech happens, it shakes up the

social order.

552

:

And I love that chaos because what it does is it jolts people.

553

:

And what happens is some people put their head in the sand and they can't handle it.

554

:

Some people reject it and fight it.

555

:

And because those two groups are doing their drama, it then opens up the whole entire

playing field and there's less competition.

556

:

So someone who's watching Learning Bayesian Statistics might hear your good podcast and say,

wait a minute.

557

:

I just learned something valuable that I could use for an advantage to get my promotion

and do better.

558

:

So by you just being on top of it and sharing resources in your company, eventually people

will start coming to you and looking at you as a thought leader.

559

:

And just the fact that you might not even be able to answer their questions, but you might

know someone who can in your organization, you get the reputation of, this person gets

560

:

stuff done.

561

:

And you then get an aura of plus 10, get stuff done.

562

:

And then you get an aura of plus 20, where people are sharing, not only did you connect me to

someone, I want to come back to you because you helped me solve my problem.

563

:

And this is what I learned about what happened.

564

:

And this is with this technology, where it worked and where it didn't work.

565

:

And then you're on your path of becoming an expert.

566

:

And you go from there.

567

:

I could go into, like, you should probably go learn, like, um, read these papers here

and don't use Pinecone as much anymore because it's not important.

568

:

And you should be focusing on this aspect, ChatGPT agents.

569

:

It's like, no, no, no.

570

:

Everyone has their own unique quest they're on.

571

:

It's more of where can you plug yourself in so you can get information flow coming to you

so that you can use that information for your own unique benefit.

572

:

So I'll end my, in my, my, my, my, uh

573

:

sermon, and don't forget to like and subscribe, Learning Bayesian Statistics, with our Python

and other awesome stuff.

574

:

Okay, good.

575

:

Yeah, and actually how, how did you do that?

576

:

You know, when you were, for instance, to take the, your Salesforce example, you were

sharing, you're sharing resources with people.

577

:

how did that work?

578

:

Because I'm guessing also there are, so it sounds like Salesforce is not like that.

579

:

And that's great.

580

:

But I know

581

:

people who are in companies where it's expected that people stay much more in their line,

you know, and so that's great that you didn't have anybody tell you, hey, you're in HR,

582

:

you should leave this technical stuff to the technical people, Jordan.

583

:

Okay.

584

:

So yeah, like how did it work concretely?

585

:

How did you end up being able to sharing that?

586

:

And what, what advice would you give to people to, to do that?

587

:

Because I do agree, this is very important to do.

588

:

And I do the same always share proactively.

589

:

But I know some people have difficulties because they have pushback from, from other

management.

First thing I would say is, my dad used to be a critical care trauma nurse. He worked at Valley Med and Kaiser, and Valley Med was a high-level trauma center where he would get gunshot victims coming in. There were other nurses who would go to other ERs in nice suburbs and never see a gunshot victim. After he did that for a couple of years... some nurses would see it for the first time and just freeze. For him, he'd see a gunshot victim crash, and those are always gnarly injuries, and the first thing he focused on was: what's the biggest problem I can triage right now? That's the 80-20 of keeping this person alive. Then I'll move on to the broken leg and stuff like that. Which sounds really weird, but you know.

So for your question, I would say: if you're in an organization and they're telling you to stay in your line, the first question you've got to ask is, why am I in this organization in the first place? And I know it's like, well, I've got a mortgage, and that's okay. If you've got golden handcuffs and you're getting hundreds of millions of dollars, keep them on. But this audience, based on who you are, Alex, and how smart you are, you're attracting very smart people that a lot of other companies would love to have in their organization, running around asking questions and sharing information. So the first thing I'd work on is finding the right place.

And I tell people: you should always be monogamous with your partner, and in an open relationship with your employer. If you are not, at least once a month or every 60 days, just seeing what's around, talking to recruiters, talking to your friends to see what opportunities are out there, you're killing yourself. Because you don't know: this company might be giving you grief right now, but there could be a company down the street that has a better culture, a 30% pay bump, wants you to be you, and would love to have you. So again: open relationship with your employer.

Going back to your question about being in a company that keeps people in their lines: my role in M&A was basically, my job is to acquire a company that has a good culture, because the company I'm working in is looking for something our culture can't produce, and to bring that culture into our company to resuscitate it and bring back more energy, without my company's culture squashing it.

So my job is, and please tell me, Alex, you play video games, you played the original Legend of Zelda games, the sprite versions? You know when you have the sword and you're trying to figure out which walls to bomb, and you hear "doonk, doonk, doonk": can't put a bomb there. Then you walk up somewhere else, you go "tink, tink, tink," and you put a bomb there. My job doing M&A was that I'd have people throwing brick walls at me saying, no, you can't do this, no, you can't do that, blah, blah, blah. And I'd be like, doonk, doonk, doonk: okay, that's a federal law, can't do that. Then I'd go tink, tink, tink: oh, that's just because someone, at some point, was told to do it this way. So I'm going to go bomb that wall and push through it.

So I default towards: I'm going to do things that are legal and culturally acceptable, but I'm going to keep pushing until someone tells me, okay, you can't do that. It doesn't mean I'm going to be a jerk or anything like that, but the whole saying is: seek forgiveness, don't seek permission. Because if you're constantly asking, oh, can I do this, can I do that, you're never going to be able to learn and grow and take advantage of a moment where the social order has been shaken up by AI and you now have an opportunity to step up in the social order and have more opportunity.

If anything, for people hearing this on the show: anytime you see a moment of chaos in tech, where things are shifting, companies are going bankrupt, money's being thrown around, new ideas are happening, that should be a signal to you saying, okay, it's now time for me to put more risk on the table in my career. Not meaning go YOLO on options, but meaning be a little bit more aggressive about what I'm pushing to learn and what I'm trying to do in the company, because you don't know where that's going to set you up.

Yeah. Yeah, no, totally. And then if you start, you know, poking a bit and you see you're getting too much pushback for your taste, that's a good signal that, okay, this relationship is not for me. They don't want me for myself; they just want me to behave exactly like they want everybody to behave. They don't really want me, they just want generic people. And so that's not where I'll give the best of myself.

And, sorry to cut you off, I agree with you, and to be more concrete, because I want to give a better answer: my first answer was not up to my standards, so I want to add to it. I want to self-deprecate: don't forget to like and subscribe, Learning Bayesian Statistics.

What you should be focusing on is, number one, your relationship with your immediate manager. That's not one where you're going tink, tink, tink and blowing up walls. That's the person who butters your bread, and your skip-level is the person who butters that person's bread. So you want to make sure that's aligned. But when you're working cross-functionally in matrix organizations, with different teams and things like that, it doesn't mean you come out there and be a jerk. When I'm poking the sword around and looking for the false wall, I'm always putting in a lot of charisma, because I like joking and laughing with people. So at least, if I'm making a request, people don't want to turn me down, but at the same time, if they do turn me down, it doesn't make the whole situation awkward. So I'm not a jerk to people.

And I also don't believe in, well, there are a lot of scientists who think, I want to go into science and math because there's no politics involved, and all that matters is scientific truth. It's like, no: if there are humans, there is politics. Like Jonas Salk, I think he was the one who created the polio vaccine. There was some drama around when he got to the cure: he didn't stroke the right egos and mention the right people when he eventually won some award, he didn't do the politics properly, and it hurt him. And there was a guy who helped establish, I think, germ theory, well, not germs exactly: he realized that the reason people were getting sick in the hospital was that doctors weren't washing their hands. So he was like, just wash your hands and people will live. But he was a jerk to people, and brash. And because he was a jerk and had no charisma about him, he pissed off so many doctors that no one listened to him, and people died because of it. It was only later, after he passed on, that people were like, you know that guy no one really liked? He was actually right about that. And then people were like, okay, I guess I'll listen to you.

So knowing the softer skills is important there, and knowing which battles to fight or not. If it's on your own turf and you want to do AI, and this is what you own and control, then you have some leeway to go there. But if you're jumping onto someone else's turf, you'd better come correct, and make sure you're not ruffling feathers and things are aligned properly before you decide to put that bomb against the wall and push forward. And also, when I say bomb, for the record, FBI, I'm not saying you should blow up anyone's home. It's a Zelda reference.

So, where should we go next? Are there any other topics you think we should talk about, or things you want me to double down on?

Yeah. So yeah, just to piggyback on what you just said: the way you say things is indeed very important. If you can add some humor to it, it's always better. And if in the end you see that's not working out for you, just, you know, look around, as Jordan was saying.

Yeah, all of that made me think about a friend of mine. He has some managerial position, but he still has a manager, and I remember he told me he was talking with his boss to convince him to let some people go to tech conferences, you know, to learn new skills. And his boss told him something like, he didn't want people to go, because he was saying, what if they learn something new and then leave? And my friend just answered: what if they don't learn anything and stay? That's much worse. So yeah, I think the second mindset is, to me, much more interesting and a much better signal of a better company than the first one.

So true. There was a really good point from the head of HR, Laszlo Bock, who was the SVP of HR when I was at Google. Great dude. Fantastic. He wrote a book called Work Rules!, all about how Google does HR, and he had a saying on his internal employee page that everyone could see. The saying was basically: invest in people and grow them to the point that they can leave, but treat them so well that they don't want to leave. And it was just... right.

And my dad had the same ethos in his company. He ran an HVAC company with 150 employees. He started out doing gutter cleaning and grew to the point where, for some of the skyscrapers in San Francisco, he used to call in a team that would bring commercial-grade HVAC rooftop systems in from Chicago on rail, and then call in helicopters to pick them up and set them on top of commercial high-rises, plus other HVAC work. And he used to have this ethos with his people: either you keep growing here, or I'm going to give you a check and you're going to go start your own HVAC business so you can grow somewhere else. I mean, keep growing in the organization if you can. But if you're so good that I'm holding you back, I want you to become a partner and create your own company. There are people who still reach out to him in retirement saying thank you for getting me started.

So if you see people who have this zero-sum mindset of, I need to keep people imprisoned in this organization and prevent them from growing, you've got to get out of there immediately. And I hope your buddy decided to move on or something.

Should we go into M&A? Should we go into the crazy offers we're seeing in AI? What else should we talk about?

Yeah, I had that for next. Just wanted to add, so yeah, to wrap up that part of the conversation: basically, sure, always keep in touch with the hard-skills part, which for my audience is the math, the stats, the algorithms. That's what you do when you listen to this show, when you read a book, when you learn new skills in an online course. That's great, and I know most of my audience already does that. But don't only do that: also learn how to communicate your ideas, how to make sure the models you're building with your sweat and blood are actually understood and used by people who don't have technical backgrounds. And that's where, as we were saying, the soft skills come in.

I think one of the best ways to do that is to try to work in an open-garage way, you know. Of course I'd say that, because I'm an open-source developer, but the more of your work you have out in the open, publicly, where people can see what you do, the code you write, and the way you communicate about it, whether that's through a podcast or a blog or anything else, the more that's going to help you tremendously.

And finally, I have a great book recommendation for people who want to learn how to communicate a bit more efficiently, if that doesn't come, let's say, naturally to you, even though I think everybody needs to learn it. It's Charles Duhigg's Supercommunicators. I'll put it in the show notes. Great one. He's a great writer, so it's very easy to read, and you can also listen to it, of course. If that's something you feel you want to progress on, I think it's a good one.

Send me the affiliate link, and you'd better have Amazon affiliates. If you don't, I'm going to have to slap you; show yourself a little love. But send me the Amazon affiliate link, because I want to make sure you get the 50 cents when I click it, because I'm going to go buy it right now. So I appreciate that.

Also, to follow on regarding charisma and everything: I think it's a learned skill, and people actually have it innately. It's a seed inside them. Some people have fully bloomed into a gigantic, beautiful tree; some have just never watered it. There's a book called The Charisma Myth: How Anyone Can Master the Art and Science of Personal Magnetism, by Olivia Fox Cabane. I have the audio version; I've listened to it about four or five times, back when I was riding my bike to the Google bus, just to get it into my brain. Just like I listened to The 48 Laws of Power multiple times, not to become a sociopath and Machiavellian, but I consider it like in Harry Potter: to protect yourself from the evil ones, you need to actually understand the dark arts. So that's always a good one for corporate life, for understanding the games people play.

Another point, and the books again: The Charisma Myth: How Anyone Can Master the Art and Science of Personal Magnetism by Olivia Fox Cabane, and then The 48 Laws of Power by Robert Greene. For that one I highly suggest everyone listen to the audio version; it is the best-narrated book I've ever heard in my life. Just listen to the opening. I mean, you should listen to the whole thing, but I guarantee that if you do, you're going to reflect back on your career and say, yeah, there are places where I've violated certain laws and it's hurt me. I think it's recommended reading for anyone going into any corporation, especially a large corporation. If I had read that book before I went to Google, my career would have been even better because of it.

Good to know. Yeah, I'll put these two books in the show notes.

The 48 Laws of Power, audiobook version, highly recommended. I had another point about something you mentioned, about work, that was cycling through my mind just now. I'm sorry, a mind is a terrible thing to waste. Let's see here... yeah, it'll come back to me later, sorry about that.

No, I mean, we're touching on a lot of other things at the same time. So yeah.

Something I'm also curious to hear you talk about is, yeah, the crazy offers we're seeing right now.

Oh, yeah. Deal making. Oh, God, I love M&A. I worked on about 150 deals in my career at Google and at Slack. Gosh, right now people are looking at all these deals and saying, my God, this is crazy, it's destroying the foundations of the Valley, blah, blah. It's like, please. You just weren't focusing on M&A. This stuff has been going on for decades. It's always been going on; it's just that now the media spotlight is on it.

Give us a bit of an elevator pitch of what you mean by the offers that are going on right now in the Valley, because I'm sure a lot of people outside are not as versed.

Exactly. Yeah. Great question, and you're a great host; again, like and subscribe to his channel. So people see Alexander Wang, from Scale AI, the data-labeling company. Super smart dude. He worked early on at Quora, when a friend of mine was an executive there, and they brought him in at seventeen, eighteen years old. He was performing better than Stanford engineers with master's degrees, and when they did layoffs they kept the seventeen-year-old over those people, because he was a critical contributor. The guy is cracked, and anyone who says, well, he's not an AI researcher, blah blah: you're smoking crack. The guy's good, and he plays his cards properly.

Zuck is forty-something, so he's got at least a minimum of fifteen more years leading if he wants to. But if he's looking for an heir apparent, and things work out with Alexander, and they could, Alexander could be an heir apparent to take over Facebook, and that would be amazing.

Now, he had Scale AI, the data-labeling company that made a lot of these AI models possible; they did work for even DeepMind's models, and work for OpenAI. People attack him and arrogantly say, he didn't do the AI research, blah, blah. Well, he was the guy giving you the gasoline so your models would work. So I value him more than I value anything you did with your PhD.

And so they bought into his company: Facebook got 49% of the company for $14.9 billion, well, it's actually $14.3 billion. They didn't get the whole entire company, and they got board seats. And it was a whole plan, which was basically to free Alexander from Scale AI so he could come in and lead Facebook's new AI team. And Alexander was able to pick the most cracked people from Scale AI, fifty folks. One of my friends actually made it onto that list, and I was like, yes, my friend made it, hell yeah, we're in, boys. And it turned out people were like, why are they paying this much money for him? They're not even buying the company. And here's how you think of it:

Because I was a typical American back in the day when Facebook bought WhatsApp for 10% of Facebook's value. I was like, what the hell is this crap? And then, because I love Indian people, they're fantastic, and even though I look like I'm Indian, I'm not. The reason I say I look like it is that when I'm on Twitter and I defend H-1Bs, I get racists going, you stupid Indian, you support H-1Bs. I'm like, I'm not even Indian, what's going on here? Jeez. You know, and so, yeah.

You know the TV series, Arrested Development?

No, no, tell me.

You should watch that. It's amazing. Link in the show notes; it's a series from the 2000s. In one of the episodes, one of the guys says, yeah, we're going to get some help from our Mexican friends from Colombia. And, uh, I think they're just called Colombians.

Exactly. It's so good. Please put a link in there, I love good comedy. The way I was raised, most of my life was: get my elementary school homework done, then get home in time at 5:30 and 6 to watch Simpsons reruns. I just watched '90s Simpsons 24/7, to the point where when I see Conan O'Brien, well, I'm just a genuine fan of Conan O'Brien; he's the greatest thing ever. But I love fantastic comedy.

So anyway, people would say whatever, and to connect it back to Alexander Wang: Facebook bought WhatsApp for 10% of its value, and I was like, well, this is crap. And eventually, Indian people are fantastic, and I've got a lot of awesome Indian friends, and they would say, oh, you're going to text-message me? No. You stupid American, we use WhatsApp. And thank God for Indian Americans: now my whole entire life is on WhatsApp. You know, you're part of all my groups. It's fantastic. And I was doubting Zuckerberg then, and now I'm like, I'm an idiot.

Now, with the Scale AI deal, people say he paid so much. Dog, back then, for WhatsApp, he put down 10% of Facebook's value. Ten percent of the value, okay? Now, do you know what that's worth today? I mean, I think Facebook's worth what, $3 trillion last I checked, or something? So you're talking at least $300 billion, for 40 people. That makes this whole Scale AI deal look like peasant field work. I did the calculation, and you're all stats people, so please, I'm an M&A person, not a stats person, be easy on me, and it worked out to less than about 1% of Facebook's current market cap. So it's a rounding error.

Now look, I like investing, and investing is about taking risk: you have to do risk-reward analysis. When you buy a certain stock, it's like, okay, I can buy a stock at $100 a share and it might go up five bucks over the next year, but at the same time, since I'm long it, it could also go down to $70 or $50 a share. That's not a good risk-reward. Whereas I could buy a $10 stock where there's a good chance it goes through phase one, phase two, phase three FDA trials and gets approved, because I've been doing the research, it's a good molecule, a good team, and this thing could be a hundred-bagger. So $10 to $100, a 10x, at the risk of it going to nothing and me losing my $10. Ten to one, that's a good risk.
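To make that asymmetry concrete, here is a minimal sketch in Python using the illustrative numbers from the example above; the prices and outcomes are hypothetical, not real market data:

```python
# Rough reward-to-risk sketch for the two hypothetical stocks described above.
# All figures are illustrative assumptions, not quotes.

def reward_to_risk(entry, upside, downside):
    """Ratio of the potential gain per share to the potential loss per share."""
    return (upside - entry) / (entry - downside)

# Big, stable stock: buy at $100, might reach $105, might fall to $50.
print(reward_to_risk(entry=100, upside=105, downside=50))   # 0.1 -> poor asymmetry

# Small speculative stock: buy at $10, could run to $100, could go to $0.
print(reward_to_risk(entry=10, upside=100, downside=0))     # 9.0 -> roughly the "ten to one" framing
```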

So Zuckerberg is like, you know what? This could go nowhere and Alexander could do nothing for me, and I've paid, what, $14 billion, which is well under 1% of our market cap, something ridiculous; it's basically a rounding error for us. But if he nails it, he could take our market cap from 3 trillion to 4 trillion to 5 trillion. So that's a good risk.
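For the back-of-the-envelope version of that argument, here is a small sketch of the deal price as a share of the acquirer's market cap. The $14.3 billion figure is the one discussed above; the market-cap values are assumptions spanning a plausible range, since the exact number moves around:

```python
# Deal price as a fraction of acquirer market cap -- back-of-envelope only.
# deal_usd comes from the discussion above; the market caps are illustrative assumptions.
deal_usd = 14.3e9

for market_cap in (1.5e12, 2.0e12, 3.0e12):
    pct = 100 * deal_usd / market_cap
    print(f"market cap ${market_cap / 1e12:.1f}T -> deal is {pct:.2f}% of it")

# Prints roughly 0.95%, 0.72%, 0.48%: under 1% in every case, which is
# the "rounding error" framing used in the conversation.
```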

And another thing: they've already shown that as the LLMs get integrated, it's increasing their ad conversion on Instagram and also on Facebook. So it's already paying dividends. But my main point is, everyone's losing their mind over these compensation packages, and they've been going on forever in the Valley; it's just that people weren't paying attention. Nor does it mean it's destroying the social contract in the Valley.

I've heard a stupid argument that says, well, what's going on now is they're just stealing the good talent from a company, leaving the rest behind, and screwing everyone over. It's like, no. If Zuckerberg had his way... let's say you and I are at Facebook, on the M&A team. I get to work with you, which is fantastic: the fantastic Bayesian guy is here, like and subscribe, this is great. We're going to go buy a company for $10 billion. Do we want to buy the whole entire company, get all the engineers, all the IP, or do we want to pay $10 billion, get only a few people, and leave the company standing so it can keep competing against us and work with our competitors? What would you prefer: taking the company out for the $10 billion, or leaving a zombie company behind that could actually cause you problems down the road? What would you do?

You would go for the former, right? Well, the reason they didn't go for the former is a thing called the FTC and antitrust, and regulators trying to block deals. That's why they went for a 49% stake rather than a controlling one. If they went for a 51% stake, the FTC would go: investigation, antitrust, let's go. And that's why you're seeing these weird deals. It's not because Silicon Valley wants to do it this way. No, they'd rather just buy you outright, have you, and get rid of all the skeletons. They don't want this zombie thing hanging around that could cause them problems down the road, just like we saw with the Windsurf deal.

The Windsurf deal: Windsurf is basically like Cursor. It's not that big in the market; it was doing decently. OpenAI wanted to buy them, because here's what's going on right now: Cursor helps you write code and it taps into OpenAI's models. When you use Cursor, it sends a request to OpenAI's models, and OpenAI sees the code you're requesting, but it does not see whether Alex accepted that completion and said it was good or not. And OpenAI needs that, because if they get that signal, dog, they can say, Alex didn't like all that, that sucked, let's try a new model. And then Alex is like, this is great, here's more money. That was the missing piece of the pie. So they went to do that deal.
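As a rough illustration of the kind of signal being described here, which model produced a completion and whether the user kept it, here is a hypothetical sketch. The event shape and field names are invented for illustration; this is not Cursor's or OpenAI's actual telemetry API:

```python
# Hypothetical sketch of per-completion feedback: which model answered and
# whether the user accepted the suggestion. All names are illustrative.
from dataclasses import dataclass

@dataclass
class CompletionFeedback:
    model: str        # label for whichever model served the completion (illustrative)
    accepted: bool    # did the user keep the suggested code?

events = [
    CompletionFeedback("model_a", accepted=False),
    CompletionFeedback("model_b", accepted=True),
    CompletionFeedback("model_a", accepted=True),
]

# The editor vendor can aggregate acceptance per model; the model API alone cannot.
models = {e.model for e in events}
acceptance = {
    m: sum(e.accepted for e in events if e.model == m) / sum(1 for e in events if e.model == m)
    for m in models
}
print(acceptance)  # e.g. {'model_a': 0.5, 'model_b': 1.0}
```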

And then it turned out that... also, I think your heater is clicking on, real quick, sorry. I'm probably on the spectrum; your heater or AC in the background, that's probably the, how do you say it in English... oh, there you go. Okay, I'm good.

I'm sorry. It's weird, it does that sometimes.

Okay, so you're cool. So what happened was... I know, right? Don't forget: like and subscribe.

What happened was, the deal was announced; the rumors were out and it was going to happen. And then all of a sudden, because Microsoft invested in OpenAI and has this really good contract that says, we get rights to your IP, Microsoft was like, hey, if you're going to spend $3 billion on Windsurf, we also want all that IP so we can make our own competing product. And Sam Altman was like, hey, go jump in a lake, I'm not going to do that. And this was delaying the deal, and Sam wanted to buy the whole entire company.

Just off the word that OpenAI could get Windsurf, which would mean they'd see completion data for when Alex uses GPT-4o for completions versus Claude for completions, and that's the holy grail of information, Anthropic said, because of even the rumor of that, we're blocking Windsurf from using Claude. Just on the rumor. So then OpenAI walks away from the deal. Now Windsurf, which was second or third in the market compared to Cursor, is getting one of the best models, Claude, cut off from access, which is like getting gut-punched.

You know more about this than I do; I'm just M&A, I don't know what the hell I'm talking about, okay? I slept at a Holiday Inn Express last night. So all this is going on, and Windsurf's like, what are we going to do? Google reaches out and says, hey, we'll give you like $2.4 billion, but we only want the top engineers. We would take the whole entire company, but we're under antitrust scrutiny too. So do you want this $2.4 billion exit to get broken up over that, or do you want to deal with the situation? So they offered them the same kind of deal: do you want to make this happen, even if it's not ideal, because it's the best way of making it land?

Because this is the same type of deal that Google gave the people from Character AI when they took Noam and crew: they basically said, we want to buy your whole entire company, but because of antitrust we'll give you all a lot of money, and we'll just take Noam and his team. So they did the same deal here. And because of this, again, people were like, oh my God, Google's screwing people over, this is the end of the Valley. It's like, no, they're playing a game that's slanted against them in the M&A direction. And I'm not playing a violin for a multi-trillion-dollar company; they have money fights up there, they have the Willy Wonka chocolate river and all the other good stuff. But this is what they have to deal with to make a deal go through.

And what Google did that was nice is they said, okay, we're going to do this, take the people we want, but we're also going to put an extra $100 million into Windsurf's treasury so you can pay out retention and stay bonuses to the people who remain, so they get paid pretty well. But then Devin came in and said, we want to buy what's left, because we're not dealing with antitrust scrutiny; we can do that. And that's how it all landed.

So, connecting all this back: these gigantic pay packages aren't anything new. And the reason they're so huge is just that you engineers are beautiful; you can clickety-clack on a keyboard. You said you do open-source stuff: you're clickety-clacking there, and then Jesse's like, this is great, I'm just going to grab some of that and put it into my stack, and now I'm making money. And because you're so great, you can get paid a lot, because of scale. So that's how this all connects.

And another thing: people like to take the edge case and say it's representative of the whole sample population. Hey, look, I know about stats; I'm on the Bayesian stats show with Alex, this is great. So they look at that one M&A deal and say it's indicative of the rest of what's going on in M&A. And it's like, no, that's like me looking at what Taylor Swift's plans are for getting married and saying that represents all women in America. You're taking someone worth billions of dollars and saying she represents all women in America. Stop smoking crack.

And when you deal with acquisitions in the billions of dollars, you're dealing with savior economics. What is savior economics? You have CEOs with trillions of dollars in market cap who are worried about losing that power. So if they see something they want, something that's going to be their savior, oh Jesus Christ, I feel the spirit, they're willing to say, YOLO, dog, whatever you need, I'm just going to throw ridiculous amounts of money at it. So people see these gigantic offers, and now that they're finally paying attention to M&A, they say it's indicative of what a nice mid-tier engineer is going to get or something. A mid-tier engineer is never going to get that. Those are one-in-a-trillion odds. So that's why I tell people: don't focus on that; focus on your core job instead.

And if you're working in tech right now... the median income in America, let's check right now, I think it's $40,000 or $50,000. Median income, and we're not talking about San Francisco and the Bay Area; it's about $40,000 as of 2023. That's what junior engineers, I mean year-two or year-three engineers, at Google or a similar firm might get just as a bonus, and that's at the low end. So if you're in tech right now and you're employed, working at a medium-tier or top-tier company, that's why I was saying you're winning.

Don't look at these $50 million, $100 million deals, because that's a lottery ticket, you know? And so anyway, don't forget to like and subscribe.

Go ahead.

I say that as a joke. I mean, yes, do like and subscribe, but I just throw it in there because everyone always says it all the time, and buy Skillshare, and get AG1, and then go invest in Masterworks. Is it Masterworks? Did Masterworks ever sponsor you? No? Okay, I'll make a Masterworks joke later on. Go ahead, continue, you have the floor.

Yeah. No, yeah, I mean, that makes me think a bit of the way athletes are getting compensated, you know. At some point money doesn't mean anything anymore, especially, whether that's baseball or soccer or football, for the ones at the very, very high tier; it almost doesn't mean anything. What's ten more million per year?

Yeah, Ronaldo's like, just throw it on the pile, I guess. I guess... what Latin American or European country should I buy with my salary now?

Yeah, exactly, or Messi getting 10% of the Apple TV contract with MLS, you know.

I wish this was my show right now. What you do is so much more fascinating than the crap I do, because you're doing sports and stats, and that's the holy grail. Did you grow up watching ESPN? Or, you're in France, did you have ESPN in France? Like, all I did was just...

We do, if you have cable, or you can always get ESPN through something.

But we never covered, ESPN never covered, God's sport, which is European football. Back in the '90s it was mostly American football and things like that. So probably not as much. But from elementary school to high school it was always: after the Simpsons, time to switch to ESPN and just watch sports for a couple of hours. And I never thought you could have a career in that, and to see you have a career in that, I'm just like, my boy has made it. Hell yeah.

Yeah, so, where to next?

Yeah, I mean, I think I want to close this out, because you've already been very generous with your time and I still need to ask you the last two questions. But first, I'm curious, and we can go in either direction. I think there's an interesting thread about all the GPU stuff you mentioned a bit earlier: lately we've seen a lot of deals, Oracle, OpenAI, GPU cards, data centers, blah blah blah, and I know you have some thoughts about that, maybe to make it clearer to people, and mainly to bring it back, as always, to why that would matter to a technical audience that's either working in tech, tech-adjacent, or looking into getting into tech. So either that, or I'm also curious: what was your craziest or funniest deal ever?

Alex, this show needs a million subs, dawg; those are some good questions, dude. Y'all, I'm serious: if you're listening to this, this guy's working a full-time job and putting this on for you for free, so just do me a favor and reshare it, share it with your homies; it means a lot. Forget these algorithms; the algorithms are like, he didn't do a booty-shake video, where are the cats in this video, so we're not going to pump it. So your shares and retweets are huge.

Oh God, the first question, then the second question. First question: when it comes to any tech boom, there is real economic activity and there is speculative activity. When it came to crypto, the last tech boom, there was very little real economic activity being generated outside of people using Bitcoin as a store of value, and that was a wet dream for libertarians, for people worried about US debt, for people who wanted an easy way to move money across borders, things like that. I will say that stablecoins, what the Collison brothers are doing, are going to lead to real economic value. But before the stablecoin stuff, and I'm not a big crypto guy, it was mostly just speculative money running around.

Now, with AI: when I saw ChatGPT, I reorganized my whole entire career, and then I saw that I could answer customer-support tickets with it. And now there's, I mean, simple statistics, research showing that if you give a customer-service employee who's new to your product and to the job access to ChatGPT, and that ChatGPT has access to just how the product works, they will perform as well as a customer-service agent who has had five to six months of experience. That's real. And especially in customer service, which has high attrition and turnover: if it means you can onboard people faster, your training costs are going to go down, human training costs, not AI training costs. And they've also shown that people using AI as customer-service agents have less stress and are willing to stay in their jobs longer. So that's real economic activity.

Now, here's always the issue: for every $1 of real economic activity, you have about $10 of speculative activity chasing it, because everyone's fighting to get to that $1. They're fighting to get the real revenue that Jordan is paying; they're fighting to get that Moveworks contract. And the speculative money is all the VC money coming in, money from pensions and whatnot, doing a thousand-flowers strategy of writing checks to many startups and seeing if one hits and returns 1,000x or 10,000x, which will make up for all the failures. And so that's what we're seeing happen in real time.

01:22:09,058 --> 01:22:19,078

we're starting to see, and so there's not been a tech boom, substantial tech boom in

modern financial history without a bubble.

:

01:22:19,078 --> 01:22:20,658

It has to happen.

:

01:22:20,658 --> 01:22:25,578

And that's why I'm not trying to be like some libertarian pushing whatever.

:

01:22:25,578 --> 01:22:27,918

kind of, this is matter of more liberal in my politics or whatnot.

:

01:22:27,918 --> 01:22:30,858

And I think people in the elderly should have teeth and people should be able to eat.

:

01:22:30,858 --> 01:22:33,230

But other than that, for us able-bodied, we should probably work.

:

01:22:33,230 --> 01:22:35,770

It sucks though, I hate working, I just wanna play video games.

:

01:22:36,150 --> 01:22:45,090

And so, you have a centralized planned economy, you don't experiment with stuff because

everything's centralized planned, and then if you do plan something, you're using tons of

:

01:22:45,090 --> 01:22:51,610

taxpayers' resources, and when it does explode, because central planning doesn't work that

much, you can make some arguments for China, it's pretty bad.

:

01:22:51,610 --> 01:22:59,310

Whereas in our system, it's like, no, we're just gonna have Andreessen Horowitz, we're

gonna have Sequoia and the rest of them just throw trillions of dollars at things.

:

01:22:59,310 --> 01:23:01,730

And if it fails, because it does fail,

:

01:23:01,730 --> 01:23:09,903

They're going to eat it on their balance sheet and they're not going to be doing what we

did in the financial crisis and using federal subsidized loans and then ask the federal

:

01:23:09,903 --> 01:23:10,814

government to bail you out.

:

01:23:10,814 --> 01:23:13,436

uh So it's messy.

:

01:23:13,436 --> 01:23:14,556

That's the way it has to be.

:

01:23:14,556 --> 01:23:16,357

But also it's like a forest fire too.

:

01:23:16,357 --> 01:23:18,628

When the bubble does happen, it's going to send...

:

01:23:18,628 --> 01:23:25,021

I'm in California, there's always a fire going on somewhere constantly, either creed in

the forest or whatnot.

:

01:23:25,021 --> 01:23:27,150

And when these forest fires do happen...

:

01:23:27,150 --> 01:23:32,310

It takes out the little startups and different companies, which is a tragedy and I hate it

because a lot of people get hurt.

:

01:23:32,310 --> 01:23:40,410

But when that does happen, it clears out the undergrowth, which then leads to cheaper

input prices for the companies that actually are generating real economic value.

:

01:23:40,410 --> 01:23:44,030

And it allows them to then deliver services cheaper to the end user.

:

01:23:44,030 --> 01:23:46,830

But it's a brutal process, but it always happens.

:

01:23:46,830 --> 01:23:51,370

And just because it does happen, it's not representative of any value actually in AI.

:

01:23:51,370 --> 01:23:52,590

There is value.

:

01:23:52,590 --> 01:23:55,734

It's just people are using it as an excuse to say, see, I told you so.

:

01:23:55,734 --> 01:23:57,615

it's going to crack you're to happen.

:

01:23:57,655 --> 01:23:58,836

So there's basically the haters.

Now, the craziest thing I've seen in M&A... there are stories I just can't tell, because I eventually took a two-year career break and I'm going to go back into the business. But I'll tell you the story that makes me look back and feel like I've done good things in my life.

We had one deal where a company had about 20, 30 employees. They were like the Uber of home cleaning, and the CEO of the company was fantastic. I forget her name, which is terrible; I'd have to look it up in my notes. The company was called Homejoy, and I don't know if you remember it back in the day. Hold on real quick... that's my dad, Gary. Say hello. How you doing? Gary says hello. Gary, we're going to finish up in 15 minutes, I love you. I take care of my dad, and that's just to prove he's real. Actually, no, it's all AI, it's Sora. None of this is real. The only thing that's real is: like and subscribe.

Okay, so, she was an awesome person who ran it, but she didn't have the Uber kind of money to say, hey, what we're doing is technically illegal under current rules, but I'll just see you in court, because I have unlimited money. She was just about to raise her next round, and the state of California sent her a letter saying, what you're doing is not up to the rules on how you're classifying employees. So every venture capitalist said, we're not giving you any more money. It was terrible, and she had about two weeks of payroll left. She came to us within a week and was like, Google, these are awesome people, but we just went tits up, and they have mortgages and everything, and I don't want to see them go unemployed. I feel terrible as a founder; can you please help my people? I'm not going to get any money out of this; I'm here to work as long as possible to make sure they land. And I was like, god damn it, that was so heartfelt and sincere, and you're such a great person: give me the information on your people. And so we spent two weeks just grinding through due diligence on them and everything, and we got them in and got them jobs, and they all landed. It felt so good that all of them got jobs; it made me feel good that I did something well. And then I went back to my job hating my life... I'm just kidding.

The second one: I did deals for Google X, and there was this guy, a typical engineer, who was just like, I want to solve problems, and I know that people with Alzheimer's have shaky hands and can't eat soup, and that sucks, so I'm going to make a spoon that auto-corrects based on your hand's movements, and it'll let you eat soup even though your hands are shaky.

And yeah, dude, this is why I love tech, dawg. How can you not be excited about technology? There are a lot of things that are bad in society; I majored in political science, I know them, I can go on forever. But tech is always a glimmer of hope. It is always an opportunity to give people more, pull people out of poverty, give people something better. My dad, for instance, has terrible arthritis right now. He can't use a computer anymore. I gave him ChatGPT Plus, and he talks to that thing every day: search for news on this, what's going on here and there. It's giving him part of his life back, and it's fantastic. I had the NotebookLM CEO on my channel, she's fantastic, and she was saying that the person who uses the most tokens on her new app, Hux, which basically looks at all of your Google Calendar and Gmail, acts like an admin, and in the morning gives you an audio recording of what's going on in your life and what's coming up (it's fantastic, I use it), is one guy who just uses it to have conversations while he's in his truck. Not because he's replacing his wife or his kids, but because he just wants someone to talk to on an eight-hour drive, and it's making him feel less lonely, which is cool. So, tech's fantastic.

So anyway, going back to the deal: he eventually gets the spoon working, and his girlfriend's like, yeah, I know a family, a grandmother who makes soup all the time, and because her hands shake, now she can't even eat it. Let's try the spoon. So the grandmother's there, the grandmother's kids are there, the whole family's there, and the engineer's there, and his girlfriend is there, and they surprise her with the spoon. They show her the soup and she's like, I can't eat this because of my hands. And they're like, no, try this spoon. They turn it on and she's like, wait a minute, it's steady. And she goes in and gets a little bit of soup, and she gets happy again, and there's a teardrop, and then her kids start crying, and then everyone else starts crying, and then his girlfriend starts crying. And the engineer is looking around like, fuck, something's wrong, truly something's wrong. What's wrong? So he goes to his girlfriend: can I talk to you on the side? I'm so sorry this got messed up. What's wrong? Tell me how to fix it. And his girlfriend, crying again, is like, you dummy. You gave them their grandmother back. She has her life back. And she gave him a kiss and said thank you, and he's like, oh.

And so I was just like, that's why I love M&A, that's why I love tech. That's what you all should be excited about, what's going on in our lives right now. Okay, let's get to real work; I know this is your show. You had a question about me going back in time, right? Or what was it?

Yeah, but first, that was great. Thank you, I really love these answers.

You're welcome.

You know what, the Alzheimer's spoon... I should have known much more about it. And it's not like Alzheimer's is going to go down over the years; we're all getting older and older.

Exactly. Exactly. And that's another story: I have a VC friend who works with Peter Thiel, he's all in biotech, and he has stories for days, dog. We didn't really get into it, but we talk about this stuff. I like the solution that engineer was going for, but I'd like a better solution even more. I'm just like, can we get something that figures out which genes in concert, or whatever is going on in our body, lead to you getting Alzheimer's, and just fix it? Just get that out of our system. Rebuild.

I've got it... you used to be able to get the spoon on Amazon, I think, but Google bought the technology, and then reorgs and things like that happened. So they were selling it, but I think they stopped selling it for some odd reason. I'd have to look more into that.

:

01:30:09,234 --> 01:30:10,744

So yeah, thanks man.

:

01:30:10,744 --> 01:30:11,445

Good.

:

01:30:11,445 --> 01:30:20,457

I'm impressed by your ability to do digressions in digressions in digressions and then

still be able to get back your train of thought.

:

01:30:20,457 --> 01:30:21,998

This is impressive.

:

01:30:21,998 --> 01:30:28,469

I think you're like a like I think if I remember correctly, Victor Hugo, the writer was

doing that in his book.

:

01:30:28,469 --> 01:30:38,318

when you read it's like, I don't remember if it's him or Emile Zola, but like it's like

one parentheses in Stein, one parentheses side one and then you know, it gets closed.

:

01:30:38,318 --> 01:30:41,538

So like the modern version of Victor Ego.

:

01:30:41,538 --> 01:30:42,638

Thank you.

:

01:30:43,078 --> 01:30:44,578

You're making this brown man blush.

:

01:30:44,578 --> 01:30:45,918

No, I appreciate that, dogs.

:

01:30:45,918 --> 01:30:47,718

I want to make sure the thread is connecting.

:

01:30:47,718 --> 01:30:51,878

The thread's connecting like fantastic because it's just like my mind will kind of like

it's like branching.

:

01:30:51,878 --> 01:30:52,918

It's like decision tree branches.

:

01:30:52,918 --> 01:30:55,738

I'm oh, I got to touch that real quick, but I to bring it back up to this.

:

01:30:55,738 --> 01:31:02,298

It's kind of like when the Gemini does tree search and it's like, okay, how do I exploit?

:

01:31:02,298 --> 01:31:04,118

But then when do I stop exploiting?

:

01:31:04,118 --> 01:31:05,438

It's Monte Carlo, almost like Monte Carlo.

:

01:31:05,438 --> 01:31:06,898

I'm going to stop, but you know what I mean.

:

01:31:06,898 --> 01:31:07,838

Okay, cool.

:
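For anyone curious about the explore-versus-exploit idea Jordan gestures at with the Gemini tree search analogy, the standard rule in Monte Carlo tree search is UCB1: keep returning to branches with a high average payoff, but add a bonus for branches you have barely tried. Here is a minimal, hypothetical Python sketch; the branch names and numbers are invented for illustration and are not from the episode.

```python
import math

def ucb1_score(value_sum, visits, parent_visits, c=1.4):
    """UCB1: balance exploiting a branch with a high average reward
    against exploring branches that have rarely been visited."""
    if visits == 0:
        return float("inf")  # always try an unvisited branch first
    exploit = value_sum / visits  # average reward observed so far
    explore = c * math.sqrt(math.log(parent_visits) / visits)  # uncertainty bonus
    return exploit + explore

# Hypothetical branches: (total reward, visit count) -- illustrative numbers only.
branches = {"tangent": (9.0, 10), "main thread": (0.5, 1)}
parent_visits = sum(v for _, v in branches.values())
best = max(branches, key=lambda name: ucb1_score(*branches[name], parent_visits))
print(best)  # "main thread": the barely-visited branch wins via the exploration bonus
```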

01:31:08,320 --> 01:31:09,750

Let's go to your awesome question.

:

01:31:09,750 --> 01:31:10,931

had a question.

:

01:31:11,172 --> 01:31:11,572

Yeah.

:

01:31:11,572 --> 01:31:12,422

Last two questions.

:

01:31:12,422 --> 01:31:17,835

I ask every guest at the end of the show because I feel like we could do a three hour

episode.

:

01:31:17,835 --> 01:31:19,356

Like no, no problem.

:

01:31:19,356 --> 01:31:21,748

But let's be respectful of your time.

:

01:31:21,748 --> 01:31:29,732

So uh first question, uh if you could have, no, no, actually you make me, you make me lose

my train of thought.

:

01:31:29,732 --> 01:31:31,023

This is not good.

:

01:31:31,263 --> 01:31:32,064

First question.

:

01:31:32,064 --> 01:31:36,806

If you had unlimited time and resources, which problem would you try to solve?

:

01:31:36,942 --> 01:31:40,462

Oh, so I have a personal vendetta against cancer, like many people here do.

:

01:31:40,462 --> 01:31:41,742

They all know someone.

:

01:31:42,542 --> 01:31:45,242

My mother, God bless her, she's in heaven right now.

:

01:31:45,242 --> 01:31:46,702

She fought cancer for 14 years.

:

01:31:46,702 --> 01:31:50,482

She passed away during my first year of high school.

:

01:31:50,482 --> 01:31:56,582

And when we went to the church service, it was standing room only. The priest said, I've

never seen this place so packed.

:

01:31:56,582 --> 01:31:58,842

There were people, like, outside there.

:

01:31:59,582 --> 01:32:07,042

And I remember them saying, you know, there's a lot of great rich, wealthy

people, but

:

01:32:07,352 --> 01:32:10,743

for those people, sometimes you come to the... you come to the funeral for, uh...

:

01:32:10,743 --> 01:32:19,887

your pocketbook, to see if you can get at the estate. But at my mother's funeral, she

was on welfare and had nothing, and people were there from their hearts. And so we lost an amazing

:

01:32:19,887 --> 01:32:30,112

person in the community, which is always in my heart. And she... she was on welfare, had

nothing, living in Silicon Valley, had two kids. My dad wasn't... my biological dad

:

01:32:30,112 --> 01:32:34,634

wasn't there. But every day she was like, what good can I focus on?

:

01:32:35,214 --> 01:32:43,934

I still, as time goes on, I have more respect for her, because I couldn't imagine having

kids and knowing that you're gonna die before they even cross the stage at 18.

:

01:32:43,934 --> 01:32:44,734

Like that pressure.

:

01:32:44,734 --> 01:32:50,954

But she was like, I can't change that, but what I can change is the smile on my face and

laughter and making people feel good.

:

01:32:51,654 --> 01:32:58,574

My non-biological dad, my adopted dad, in:

:

01:32:58,574 --> 01:33:00,694

And the doctor said, we'll do chemo.

:

01:33:00,774 --> 01:33:03,854

And chemo only shrunk the tumor by 30%.

:

01:33:03,854 --> 01:33:11,114

And my co-host, Joe Tronaski, who's a friend of mine and I love him like a brother, he was

reading research in immunotherapy.

:

01:33:11,114 --> 01:33:13,934

He was like, ask your dad's doctor about this type of immunotherapy.

:

01:33:13,934 --> 01:33:15,074

It went through phase three trials.

:

01:33:15,074 --> 01:33:16,354

They probably can get it.

:

01:33:16,354 --> 01:33:20,034

And I was just like, God damn it, Joe, why are you so just awesome?

:

01:33:20,034 --> 01:33:26,214

So eventually I asked about that, and he got the right type of... he got Keytruda or

something.

:

01:33:26,214 --> 01:33:28,914

And now knock on wood, he's been in remission for a decade.

:

01:33:28,914 --> 01:33:30,334

He was over there, right there.

:

01:33:30,334 --> 01:33:33,134

Doctor gave him a few treatments and was like, yep, well.

:

01:33:33,134 --> 01:33:36,434

looks like it's gone, all gone, you're not stage four anymore.

:

01:33:37,094 --> 01:33:45,854

Come back once a year for like an MRI and I was like, and so I was like crying tears of

joy at the same time, then like, did you ever see that movie, There Will Be Blood by

:

01:33:45,854 --> 01:33:47,034

any chance?

:

01:33:47,034 --> 01:33:47,714

Yeah.

:

01:33:47,714 --> 01:33:56,654

I see your eye expressions, I love that, because it's like, you know, there's a, I watched

it, I've seen it like three times, there's a scene when he discovers there's oil, and then

:

01:33:56,654 --> 01:34:02,574

all of a sudden you see him go to the assessor's office, and he hears music going in

the background, his mind goes dun dun dun.

:

01:34:02,574 --> 01:34:04,414

because he realized he makes so much money.

:

01:34:04,414 --> 01:34:06,374

He's like, give me all the land, give me it all.

:

01:34:06,374 --> 01:34:12,914

And so when that happened, like back then, I was like, okay, I'm going on the

internet looking for different types of stocks.

:

01:34:12,914 --> 01:34:16,874

And I YOLO'd into one CAR-T pharma stock.

:

01:34:16,994 --> 01:34:19,014

And it was going through the phase trials.

:

01:34:19,014 --> 01:34:21,474

And it was like ups and downs and eventually got acquired.

:

01:34:21,474 --> 01:34:24,154

I was like, yes, my dad lives, and I got a little bit of money.

:

01:34:24,154 --> 01:34:24,914

Yay.

:

01:34:25,294 --> 01:34:28,734

So cancer, I'd spend all my time on cancer research.

:

01:34:28,734 --> 01:34:32,854

I feel like I just, I'm seeing certain cancers get cured.

:

01:34:32,914 --> 01:34:39,457

And it's just we're at the right time now with either immunotherapy, CAR-T therapy, uh

just different ways of doing it.

:

01:34:39,457 --> 01:34:41,718

uh I feel like there's progress being made.

:

01:34:41,718 --> 01:34:47,541

Before, my mother's generation was just like, okay, cancer is like a terrorist inside of a

city.

:

01:34:47,541 --> 01:34:55,004

And so the way that chemotherapy works is we're just gonna drop a nuke on that city and

hope we get the terrorist.

:

01:34:55,004 --> 01:35:00,238

Whereas with CAR-T therapy and whatnot, it's like we're sending in SEAL Team 6,

:

01:35:00,238 --> 01:35:04,198

and they're gonna do one shot, one kill on that cancer and everyone else is gonna live.

:

01:35:04,198 --> 01:35:07,118

And that's what I think we're seeing in our lifetime, and it's beautiful.

:

01:35:07,118 --> 01:35:14,458

So, all my resources would go to investing in researchers who are doing cancer research, bringing cancer

drugs to phase trials.

:

01:35:14,458 --> 01:35:17,018

Now, your second question, why don't you go for that one?

:

01:35:17,338 --> 01:35:23,878

Yeah, yeah, but first, yeah, damn, fantastic answer, Jordan, thanks.

:

01:35:24,638 --> 01:35:27,278

And yeah, completely agree with you, that's also weird.

:

01:35:27,278 --> 01:35:28,332

That's why...

:

01:35:28,332 --> 01:35:29,783

the leaps we take in science are amazing.

:

01:35:29,783 --> 01:35:33,464

Like, yeah, because now completely agree with you.

:

01:35:33,464 --> 01:35:41,888

And I feel like... so, I've never known my grandmother on my mother's side, she died

a few years before I was born.

:

01:35:41,888 --> 01:35:42,498

Sorry.

:

01:35:42,498 --> 01:35:50,911

So that's always been something, you know, like, I would have loved to know her, and

that was, of course, because of cancer.

:

01:35:50,911 --> 01:35:57,734

And I feel like in her time, it was really... it was like, like HIV

:

01:35:57,888 --> 01:36:00,900

at the beginning where it was a death sentence.

:

01:36:00,900 --> 01:36:12,428

And now, HIV is pretty much something people can live with, and they just take pills without a

lot of big side effects, which is just already incredible.

:

01:36:12,609 --> 01:36:25,838

And then cancer now, more and more different cancers are curable and people can still live

comfortably without having too many troubles and that's...

:

01:36:25,912 --> 01:36:26,612

That's fantastic.

:

01:36:26,612 --> 01:36:30,243

And yeah, that's research I always, always follow closely.

:

01:36:30,363 --> 01:36:41,406

And so, like you, I... I feel like, I mean, I hope so, I'm obviously biased, but I feel

like we're at a tipping point where probably in the coming years, hopefully we'll find

:

01:36:41,406 --> 01:36:50,589

something that will, that will hopefully obliterate these, um, these terrible, uh... I mean,

is that even a disease?

:

01:36:50,589 --> 01:36:50,999

I don't know.

:

01:36:50,999 --> 01:36:55,360

Even the, the scientific word, I don't think it's a disease, but yeah, like these.

:

01:36:55,374 --> 01:36:57,994

Right, it's true because you think about like what is it doing?

:

01:36:57,994 --> 01:37:05,734

It's replicating, it's doing like mitosis, meiosis on steroids, which cells should be

doing, but it's doing way too much and there's a little bit of damage there.

:

01:37:05,734 --> 01:37:14,534

So it's like you're trying to tell your body, okay, there's a way you can, like, allow it to

have the meiosis, mitosis, I'm sorry, I'm not a biology person, all throughout the system.

:

01:37:14,534 --> 01:37:16,934

But then this cluster of cells is doing way too much.

:

01:37:16,934 --> 01:37:19,674

Can you like do apoptosis and tell it to stop and die?

:

01:37:19,674 --> 01:37:24,534

And I think like immunotherapy, what it was basically doing was going through

:

01:37:24,854 --> 01:37:34,987

my dad's system and saying, okay, the white blood cells are looking at these cancer cells

and saying, no, these are just his cells, they're fine, leave them alone. But then the immunotherapy

:

01:37:34,987 --> 01:37:43,239

was like, no, we're gonna flip a protein on these because these are damaged, and then let the

white blood cells just kill these things and leave everything else. Compared to

:

01:37:43,239 --> 01:37:50,951

chemotherapy, which was like, we're just gonna blow everything up. And no shade against

chemotherapy and the people that created it, because it was the best at the time, it was better

:

01:37:50,951 --> 01:37:53,422

than what we had before. But to finally see

:

01:37:53,422 --> 01:37:56,642

that we're moving the frontier in cancer treatments, that is positive.

:

01:37:56,642 --> 01:38:00,402

Compared with my mom, when she passed in:

:

01:38:00,942 --> 01:38:02,262

Another thing, your grandmother.

:

01:38:02,262 --> 01:38:03,722

Still, sorry to hear about that.

:

01:38:03,862 --> 01:38:05,482

I want to do a side tangent.

:

01:38:05,542 --> 01:38:09,262

Right now there's a big thing, people on Twitter, Bryan Johnson, all of them want to live

forever.

:

01:38:09,262 --> 01:38:11,342

And I think it's fantastic.

:

01:38:11,342 --> 01:38:13,002

I'm glad he open-sources his research.

:

01:38:13,002 --> 01:38:17,002

You can hate the man all you want, but he opens up what he's trying to do, and he loves life.

:

01:38:17,002 --> 01:38:17,822

Great.

:

01:38:17,982 --> 01:38:22,286

Now, I was joking with Jesse, because I was telling Jesse, I was like, I think it's great

that we're going to

:

01:38:22,286 --> 01:38:23,666

that people are gonna be able to live forever.

:

01:38:23,666 --> 01:38:26,186

Jesse's like, does this mean immortality?

:

01:38:26,186 --> 01:38:35,366

I'm like, f off Jesse, what I'm talking about is elf, Lord of the Rings elves-type

immortality, where yes, you can still get shot or stabbed by a dagger and die, but you're

:

01:38:35,366 --> 01:38:42,626

not gonna die of cancer, you're not gonna die of the sniffles, you're not gonna die of

diseases, and you could live forever so long as you don't get hit by a chariot or a

:

01:38:42,626 --> 01:38:43,226

catapult.

:

01:38:43,226 --> 01:38:44,126

That's what I want.

:

01:38:44,126 --> 01:38:47,846

And I was thinking, for people who are against that, who say, I don't ever want to live forever.

:

01:38:47,966 --> 01:38:50,542

Okay, would you want your grandma to live forever?

:

01:38:50,542 --> 01:38:56,662

If you say you wouldn't want your grandmother to live forever in good health, then, well, there

might be some screws loose.

:

01:38:56,662 --> 01:39:04,142

And I was thinking about your story, your grandmother, how awesome it would be if she was here today,

seeing you, how awesome you are, where you're going with your life. You'd just get a call

:

01:39:04,142 --> 01:39:07,302

from your grandmother once a week: I just want you to know I love you, I'm proud of you.

:

01:39:07,302 --> 01:39:09,262

Like, people need that encouragement.

:

01:39:09,262 --> 01:39:16,162

And then the fact that your grandmother would be living longer and healthier, and then

you're taking care of the kids, maybe she's like, you know what, I'm feeling good enough,

:

01:39:16,162 --> 01:39:19,566

I wanna go adopt some kids who are not getting love and encouragement too.

:

01:39:19,566 --> 01:39:21,226

and give them grandma love.

:

01:39:21,226 --> 01:39:26,806

So more people who are struggling and going through depression or thinking about hurting

themselves or others would have grandma love.

:

01:39:26,806 --> 01:39:28,806

And that's something that gets me excited.

:

01:39:28,806 --> 01:39:32,046

I still think longevity though is going through a hype cycle.

:

01:39:32,046 --> 01:39:34,826

Everyone thinks they're gonna, like... it's always... it's like self-driving cars.

:

01:39:34,826 --> 01:39:39,486

Back in:

gonna have them all.

:

01:39:39,486 --> 01:39:43,446

10 years later, we still haven't gotten the whole Bay Area approved for Waymo yet.

:

01:39:43,446 --> 01:39:44,146

Anyway, I digress.

:

01:39:44,146 --> 01:39:46,166

I wanna get to your main question, cause I have notes.

:

01:39:46,166 --> 01:39:47,846

What's the second question about?

:

01:39:49,358 --> 01:39:51,638

Yeah, second question.

:

01:39:51,938 --> 01:39:53,778

But yeah, let's go.

:

01:39:53,778 --> 01:40:03,718

So second question is, if you could have dinner with any great scientific mind, dead,

alive, or fictional, who would it be?

:

01:40:03,718 --> 01:40:08,138

And could I time travel to their time, back in their time?

:

01:40:08,138 --> 01:40:10,258

And also, how do I interact with them?

:

01:40:10,258 --> 01:40:12,578

Will that change... will that allow me to change the timeline?

:

01:40:13,538 --> 01:40:15,718

So the question is open.

:

01:40:15,758 --> 01:40:16,258

So sure.

:

01:40:16,258 --> 01:40:16,658

Okay, great.

:

01:40:16,658 --> 01:40:17,118

No, I love it.

:

01:40:17,118 --> 01:40:18,418

This is a really good question.

:

01:40:18,722 --> 01:40:19,863

Okay, I have some notes on this.

:

01:40:19,863 --> 01:40:21,904

um I love history.

:

01:40:21,904 --> 01:40:22,965

I love history a lot.

:

01:40:22,965 --> 01:40:31,491

I think, if I could go back... once I hit my number and whatever, I think I want to

go back and become a community college professor, either in political science or history.

:

01:40:31,491 --> 01:40:35,243

I used to think political science, but things are too divided now and I don't want to deal with

"you're the Antichrist."

:

01:40:35,243 --> 01:40:38,836

I want to focus on, like, history: medieval history, Roman history, whatnot.

:

01:40:38,836 --> 01:40:47,914

So what I would do is, there's a guy named, uh, Hero of Alexandria, or Heron of Alexandria,

who lived around the first century AD in, uh, Alexandria.

:

01:40:47,914 --> 01:40:59,122

Shocker. And he came up with this, this device called an aeolipile. Let me show you what

it looks like here. You probably know this because you're a super duper smart dude. Um, aeolipile...

:

01:40:59,122 --> 01:41:16,033

hero... do do do do... I'm gonna allow this time, I'm gonna, if I can share my screen real

quick. Um, here it is. Uh, he basically made this, uh, this little device, which was like, this is

:

01:41:16,033 --> 01:41:17,198

a cute little thing

:

01:41:17,198 --> 01:41:22,438

It's a simple bladeless radial steam turbine which spins when the central water container

is heated.

:

01:41:22,438 --> 01:41:25,538

Torque is produced as steam jets exit the turbine.

:

01:41:25,758 --> 01:41:31,338

The Greek-Egyptian mathematician Hero of Alexandria described the device in the 1st century AD, and

many sources give him credit for the invention.

:

01:41:31,338 --> 01:41:36,318

However, Vitruvius was the first to describe this appliance in his De architectura.

:

01:41:36,318 --> 01:41:39,258

So I'm going to say, Hero created it.

:

01:41:39,258 --> 01:41:42,538

Now if you look at this, this is like a proto-steam engine.

:

01:41:42,538 --> 01:41:44,098

It doesn't have the piston.

:

01:41:44,498 --> 01:41:46,542

It doesn't have the right gears.

:

01:41:46,542 --> 01:41:56,082

It doesn't have the cooling mechanism for the coil so it can drive a piston and suck

water in and push it back out and whatnot, but this is 70-80% of the way there.

:

01:41:56,082 --> 01:41:58,402

Yes, the metallurgy isn't correct and blah blah blah.

:

01:41:58,402 --> 01:42:07,722

What I would do, when I took my notes here, is I would meet him in his time, and he'd

probably see me and be like, oh interesting, an Egyptian's talking to me, even though I'm not

:

01:42:07,722 --> 01:42:10,562

Egyptian. Or he might see an Indian, but I'm, like, not even that.

:

01:42:10,642 --> 01:42:12,382

And I'm just like, I'm just-

:

01:42:13,110 --> 01:42:17,433

Exactly, I get deported now!

:

01:42:17,433 --> 01:42:26,870

uh But I'm disguised as uh a Greek in Alexandria and I tell him, this is awesome what

you're doing, but could you adapt this with gears?

:

01:42:26,870 --> 01:42:34,986

I tell him, gears and a piston, and I'll show him what this is, to make it so it can

open temple doors and get people closer to God.

:

01:42:34,986 --> 01:42:41,154

Uh, but in order for us to do this, we need to create Tech Priests, a Warhammer 40K reference,

:

01:42:41,154 --> 01:42:45,626

to maintain this, and eventually the tech priests get supported by the temples and they can go

to other temples.

:

01:42:45,626 --> 01:42:49,618

So now you have an order connected to God to protect this technology.

:

01:42:49,618 --> 01:42:56,022

I didn't say, hey, now what you need to do is connect it to a water wheel and then we can

start doing, like, the Industrial Revolution.

:

01:42:56,022 --> 01:42:56,972

Why not?

:

01:42:56,972 --> 01:42:59,204

because slavery was still a gigantic thing.

:

01:42:59,204 --> 01:43:02,386

Europeans were enslaved, everyone was enslaved, and you have free labor.

:

01:43:02,386 --> 01:43:04,238

why uh mechanize?

:

01:43:04,238 --> 01:43:07,150

You saw this when my ancestors were enslaved in the South.

:

01:43:07,150 --> 01:43:15,056

The South in America never wanted to go past an agrarian economy and never go into

industrialization, because you had free bodies, you know?

:

01:43:15,137 --> 01:43:24,684

And so anyways, this leads to tech priests now, all over these temples who know this art

of how to make doors magically open because he created a system where you could put a coin

:

01:43:24,684 --> 01:43:26,796

into a little house and it would

:

01:43:26,796 --> 01:43:32,188

dispense uh holy water when you're in a shrine and people were like, wow, this is like

God.

:

01:43:32,188 --> 01:43:37,970

If he had these doors opening up, and the priest would say, God, come visit us, Zeus, and the

doors would open.

:

01:43:39,251 --> 01:43:44,609

People would flip and they'd want to become tech priests too and become a whole guild

protecting the secrets of how this works.

:

01:43:44,609 --> 01:43:48,555

Not to protect it, but, like, making sure that it's not economically focused.

:

01:43:48,555 --> 01:43:50,406

It's focused on spirituality.

:

01:43:50,406 --> 01:43:53,837

So money is coming to it, even though this doesn't make economic sense, since we still have

slavery.

:

01:43:53,837 --> 01:43:56,758

uh And then what happens is uh

:

01:43:56,758 --> 01:44:01,939

the kids start seeing their parents learning this and they start going to random trades

and they say, you know what?

:

01:44:01,939 --> 01:44:07,581

I'm a miner now, and I realized there's so much water down here, and I have bucket brigades

of slaves doing this, and it's not working.

:

01:44:07,861 --> 01:44:10,202

yeah, I remember that steam technology.

:

01:44:10,202 --> 01:44:12,562

My dad was using it for the doors.

:

01:44:12,563 --> 01:44:21,125

We might be able to use it to pump water out of these, uh, mining rigs, because slaves are

dying in there, but now we can go deeper and get more resources.

:

01:44:21,125 --> 01:44:22,866

So now you have economic use cases.

:

01:44:22,866 --> 01:44:24,766

uh And then...

:

01:44:24,766 --> 01:44:29,770

Rome starts seeing more profit from this, and they get prestige from doing this with great

temples.

:

01:44:29,770 --> 01:44:35,675

We have our mine shafts being cleared by this, and now people are seeing these tech priests

have a lot of uses, and we're going to start expanding all this.

:

01:44:35,675 --> 01:44:45,364

uh Then you're going to start seeing temples adopting, sorry, bathhouses adopting this and

things like that and people start replicating this.

:

01:44:45,364 --> 01:44:52,910

Then as time goes on, it becomes so efficient that the institution of slavery, maybe in

200, 300 AD,

:

01:44:52,910 --> 01:45:01,970

starts seeing some cracks because the metallurgy gets better, the efficiency, the work

these steam engines can do makes better sense, and you see the Roman Empire actually extend

:

01:45:01,970 --> 01:45:05,630

itself and not collapse in the West by 300, 400 AD.

:

01:45:06,450 --> 01:45:12,690

Technically, Rome didn't collapse until: the

Byzantines, they thought they were Romans.

:

01:45:12,690 --> 01:45:15,830

And they'd be like, Byzantine? That's what some person called this later, in the future.

:

01:45:15,830 --> 01:45:18,250

But let's just say the West doesn't collapse.

:

01:45:18,330 --> 01:45:22,670

So then what happens is you start getting better machines, you get a freed labor force

:

01:45:22,670 --> 01:45:27,450

of educated artisans and engineers, a middle class starts developing.

:

01:45:27,950 --> 01:45:33,250

You then see proto democracy maybe start happening in 300 AD or so, like a true democracy.

:

01:45:33,250 --> 01:45:38,290

Rome had an oligarchy at best, and you had to be of noble birth.

:

01:45:38,590 --> 01:45:42,490

The emperor's powers fade, and then you see a political economy shift.

:

01:45:42,990 --> 01:45:47,566

You see capitalism start happening:

:

01:45:47,566 --> 01:45:53,006

us having capitalism in like:

500.

:

01:45:53,486 --> 01:45:57,566

And then we see more education, the scientific method starts happening.

:

01:45:57,866 --> 01:46:01,826

Global trade network is powered by steam fleets.

:

01:46:01,826 --> 01:46:05,506

Feudalism, the Dark Ages and serfdom, we skipped that.

:

01:46:05,506 --> 01:46:13,926

So people forget that, like, Eastern Europe with serfdom, like most Eastern Europeans there

were slaves until the late:

:

01:46:13,926 --> 01:46:15,146

We skip all that stuff.

:

01:46:15,146 --> 01:46:16,398

We skipped it all in Europe.

:

01:46:16,398 --> 01:46:22,098

We would have had the printing press and electricity by:

:

01:46:23,758 --> 01:46:29,878

So we took the steam engine that we got at:

that led to the Industrial Revolution.

:

01:46:29,878 --> 01:46:32,178

We bring it in a thousand years earlier.

:

01:46:32,298 --> 01:46:36,998

So compared to where we're at right now, we'd be 700 years further in the future than we are right now.

:

01:46:36,998 --> 01:46:40,118

Could you imagine 700 more years of progress on top of what we're experiencing right now?

:

01:46:40,118 --> 01:46:46,018

So we probably would have Mars taken care of, all diseases eradicated, free labor.

:

01:46:46,018 --> 01:46:54,634

Whatnot. We might have... we might not have had World War One or World War Two, we might have had,

you know, all these bad things that happen. But most importantly, that means I never would

:

01:46:54,634 --> 01:47:07,972

have had Workday or Concur, and I'd never have had to work with these technologies. My life would be...

so that's my, that's the reason why I want to do it: to avoid Workday and Concur. But anyways,

:

01:47:07,972 --> 01:47:10,984

that's my answer. I hope you liked it.

:

01:47:11,660 --> 01:47:12,531

Man, that was great.

:

01:47:12,531 --> 01:47:13,101

That was great.

:

01:47:13,101 --> 01:47:17,395

That was officially the most worked out answer of the whole show.

:

01:47:17,395 --> 01:47:20,197

uh that's episode 146.

:

01:47:20,197 --> 01:47:23,790

know, like, yeah.

:

01:47:23,790 --> 01:47:29,144

145 people before you worked on that and you were the most worked out.

:

01:47:29,144 --> 01:47:30,766

yeah, well done.

:

01:47:30,766 --> 01:47:40,424

Thank you, dawg, because like back in, I learned about him back in like high school and my

mind would just keep on ticking about like, what if, like, what if he, like someone told

:

01:47:40,424 --> 01:47:41,068

him like,

:

01:47:41,068 --> 01:47:42,519

This is not just a toy.

:

01:47:42,519 --> 01:47:44,410

This is the future, dawg.

:

01:47:44,410 --> 01:47:54,127

Like, you know, and then thinking of, like, how do I anchor it in something that's uneconomical

that they're going to protect and see value in, and anchor it in religion back then?

:

01:47:54,127 --> 01:47:57,629

ah So anyways, that's how my mind works.

:

01:47:57,629 --> 01:48:01,772

I mean, so that makes me think of a book I read a few years ago.

:

01:48:01,772 --> 01:48:10,938

um That's called How to Invent Everything: A Survival Guide for the Stranded Time Traveler,

by Ryan North.

:

01:48:10,938 --> 01:48:15,802

Um, so I'll put a link in the show notes. It's really good.

:

01:48:15,883 --> 01:48:17,804

it's a fun one, you know, like you can read it.

:

01:48:17,804 --> 01:48:25,372

It's, it's like small snippets of like, yeah, like any big inventions, uh, how to make

that again.

:

01:48:25,372 --> 01:48:27,553

So yeah, like I think you would, you would enjoy it.

:

01:48:27,553 --> 01:48:32,097

seems like you enjoy, um, like, uh, how is it, how is it called?

:

01:48:32,118 --> 01:48:35,090

Um, alternative history or something.

:

01:48:35,090 --> 01:48:35,401

Yeah.

:

01:48:35,401 --> 01:48:37,082

I mean, just think about like.

:

01:48:37,494 --> 01:48:39,755

I grapple with systems thinking of history.

:

01:48:39,755 --> 01:48:43,506

I'm kind of a systems guy, like systems and environment lead to a lot of things.

:

01:48:43,506 --> 01:48:46,587

And then there's the great man theory, great man or woman theory of history.

:

01:48:46,587 --> 01:48:48,177

It's always just great people.

:

01:48:48,177 --> 01:48:53,518

And I'm more on the systems side, but there are times in history where it's like, if this one

person was paying attention,

:

01:48:53,518 --> 01:48:57,720

they would have seen the attack coming and it would have prevented that society from

collapsing or something.

:

01:48:57,720 --> 01:49:00,500

And so I love those what-if scenarios.

:

01:49:00,500 --> 01:49:02,661

sorry, last thing.

:

01:49:02,661 --> 01:49:05,402

There was a democratic election in China.

:

01:49:05,402 --> 01:49:06,796

uh

:

01:49:06,796 --> 01:49:12,120

like in the:

assassinated.

:

01:49:12,120 --> 01:49:14,272

He was well liked by the population.

:

01:49:14,272 --> 01:49:17,014

If that would not have happened,

:

01:49:17,094 --> 01:49:19,316

You never would have had the whole Taiwan thing.

:

01:49:19,316 --> 01:49:20,938

You maybe never would have had Mao.

:

01:49:20,938 --> 01:49:22,799

You might have had a democracy.

:

01:49:22,799 --> 01:49:28,434

It was more the European side that was like, these... a lot of peasants are living a terrible

life.

:

01:49:28,434 --> 01:49:31,534

Let's give them education and give them some sort of stake in society.

:

01:49:31,534 --> 01:49:40,354

You would have had, instead of the:

1980 economic revolution in China, you would have had it starting in 1912.

:

01:49:40,354 --> 01:49:42,454

And we'd have a completely different conversation right now.

:

01:49:42,454 --> 01:49:44,714

Maybe they would be closer allies to us.

:

01:49:44,714 --> 01:49:46,654

So anyways, thank you so much.

:

01:49:46,654 --> 01:49:48,454

I want to your own show, but I will come hypocrite.

:

01:49:48,454 --> 01:49:49,814

Thank you so much for being on here.

:

01:49:49,814 --> 01:49:50,974

Like having me on here.

:

01:49:50,974 --> 01:49:52,774

I really hope you enjoyed it.

:

01:49:52,774 --> 01:49:56,234

Let me know in the comments section guys, be nice to me.

:

01:49:57,094 --> 01:49:58,314

Now, yeah, that was super fun.

:

01:49:58,314 --> 01:50:01,454

Thank you so much, Jordan, for taking the time.

:

01:50:01,711 --> 01:50:13,191

So people, if you want to dig deeper, as always, check out the show notes. Um, that was really

great to have you on the show. It was a very original episode, um, but at the same time very

:

01:50:13,191 --> 01:50:24,871

practical. And I think, uh, I have always loved having, uh, people on the show who

are, like you, obviously, uh, very smart, very passionate. But you also seem like someone

:

01:50:24,871 --> 01:50:31,078

who is very, uh, um, you know, generous and in touch with, um, people's feelings, and

:

01:50:31,176 --> 01:50:41,794

with how to make the world a better place with tech and M&A and everything. And I think it's

awesome, because we need, we need more people like you, and to hear more about people like

:

01:50:41,794 --> 01:50:52,322

you, instead of hearing very bad stuff all the time. And that's also why I make this show. So,

yeah, Jordan, thanks again for taking the time and being on this show. Dude, Alex, thank you so

:

01:50:52,322 --> 01:50:58,166

much. I really appreciate your time too. Thank you for having me here, I had a great time, and

thank you to all your listeners for listening to this.

:

01:50:58,220 --> 01:51:00,160

Love to hear your thoughts in the comments section.

:

01:51:00,160 --> 01:51:01,825

Thank you so much, take care.

:

01:51:06,040 --> 01:51:09,753

This has been another episode of Learning Bayesian Statistics.

:

01:51:09,753 --> 01:51:20,242

Be sure to rate, review, and follow the show on your favorite podcatcher, and visit

learnbayesstats.com for more resources about today's topics, as well as access to more

:

01:51:20,242 --> 01:51:24,325

episodes to help you reach a true Bayesian state of mind.

:

01:51:24,325 --> 01:51:26,287

That's learnbayesstats.com.

:

01:51:26,287 --> 01:51:31,131

Our theme music is Good Bayesian by Baba Brinkman, feat. MC Lars and Mega Ran.

:

01:51:31,131 --> 01:51:34,293

Check out his awesome work at bababrinkman.com.

:

01:51:34,293 --> 01:51:35,478

I'm your host.

:

01:51:35,478 --> 01:51:36,439

Alex Andorra.

:

01:51:36,439 --> 01:51:40,678

You can follow me on Twitter at alex_andorra, like the country.

:

01:51:40,678 --> 01:51:47,947

You can support the show and unlock exclusive benefits by visiting Patreon.com slash

LearnBayesStats.

:

01:51:47,947 --> 01:51:50,328

Thank you so much for listening and for your support.

:

01:51:50,328 --> 01:51:52,630

You're truly a good Bayesian.

:

01:51:52,630 --> 01:51:56,132

Change your predictions after taking information in.

:

01:51:56,132 --> 01:51:59,434

And if you're thinking I'll be less than amazing.

:

01:51:59,434 --> 01:52:02,737

Let's adjust those expectations.

:

01:52:02,737 --> 01:52:05,307

Let me show you how to be a good Bayesian.

:

01:52:05,307 --> 01:52:15,949

You change calculations after taking fresh data in. Those predictions that your brain is

making, let's get them on a solid foundation.
