#146 Lasers, Planets, and Bayesian Inference, with Ethan Smith
Causal Inference, AI & Machine Learning • Episode 146 • 27th November 2025 • Learning Bayesian Statistics • Alexandre Andorra
Duration: 01:35:19


Shownotes

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:

  • Ethan's research involves using lasers to compress matter to extreme conditions to study astrophysical phenomena.
  • Bayesian inference is a key tool in analyzing complex data from high energy density experiments.
  • The future of high energy density physics lies in developing new diagnostic technologies and increasing experimental scale.
  • High energy density physics can provide insights into planetary science and astrophysics.
  • Emerging technologies in diagnostics are set to revolutionize the field.
  • Ethan's dream project involves exploring pycnonuclear fusion.

Chapters:

14:31 Understanding High Energy Density Physics and Plasma Spectroscopy

21:24 Challenges in Data Analysis and Experimentation

36:11 The Role of Bayesian Inference in High Energy Density Physics

47:17 Transitioning to Advanced Sampling Techniques

51:35 Best Practices in Model Development

55:30 Evaluating Model Performance

01:02:10 The Role of High Energy Density Physics

01:11:15 Innovations in Diagnostic Technologies

01:22:51 Future Directions in Experimental Physics

01:26:08 Advice for Aspiring Scientists

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Joshua Meehl, Javier Sabio, Kristian Higgins, Matt Rosinski, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire, Mike Loncaric, David McCormick, Ronald Legere, Sergio Dolia, Michael Cao, Yiğit Aşık, Suyog Chandramouli and Guillaume Berthon.

Links from the show:

Transcript

This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

Transcripts

Speaker:

Today we are heading straight into the heart of high-energy density physics.

2

:

The place where lasers crush matter, astrophysics meets the lab, and Bayesian inference

becomes indispensable.

3

:

My guest is Ethan Smith, a PhD candidate at the University of Rochester, working at the

intersection of plasma spectroscopy, diagnostics, technology, and Bayesian data.

4

:

analytics.

5

:

studies what happens when you use some of the world's most powerful lasers to squeeze

matter to extreme conditions, the same conditions you would find inside planets, stars, or

6

:

supernovae.

7

:

This is an episode for anyone who loves physics, who loves Bayesian methods, or who just

wants to hear how scientific discovery actually happens behind the scenes.

8

:

This is Learning Bayesian Statistics, episode 146, recorded

9

:

October 7, 2025.

10

:

Welcome to Learning Bayesian Statistics, a podcast about Bayesian inference, the methods,

the projects, and the people who make it possible.

11

:

I'm your host, Alex Andorra.

12

:

You can follow me on Twitter at alex-underscore-andorra.

13

:

like the country.

14

:

For any info about the show, learnbayesstats.com is Laplace to be.

15

:

Show notes, becoming a corporate sponsor, unlocking Bayesian merch, supporting the show on

Patreon, everything is in there.

16

:

That's learnbayesstats.com.

17

:

If you're interested in one-on-one mentorship, online courses, or statistical consulting,

feel free to reach out and book a call at topmate.io slash alex underscore andorra.

18

:

See you around, folks.

19

:

and best Bayesian wishes to you all.

20

:

And if today's discussion sparked ideas for your business, well, our team at PyMC Labs can

help bring them to life.

21

:

Check us out at pymc-labs.com.

22

:

Ethan Smith, welcome to Learning Bayesian Statistics.

23

:

Thank you for having me.

24

:

Thrilled to be here.

25

:

Yeah, it's great to have you here.

26

:

Thanks a lot to JJ Ruby for putting us in contact.

27

:

I hear you guys are doing some fun stuff at Rochester, doing some physics things.

28

:

We'll definitely talk about that.

29

:

First, both your first and last name are very challenging for me to say with an English

accent as a French

30

:

man, because I want to say Ethan Smith.

31

:

Uh, and so it's very hard for me to say Ethan Smith.

32

:

It's like the "th" twice in a row.

33

:

That's like, making my life hard.

34

:

I apologize for that.

35

:

Yeah.

36

:

That's, that's my bad.

37

:

I do like, I do love the name Ethan, that sounds really good in English.

38

:

In French I don't like it, it's "Étan", but Ethan sounds really, like, very classy.

39

:

Yes, I agree.

40

:

Unfortunately, it's a very common name.

41

:

uh There is another Ethan Smith in my field and we go to a lot of the same conferences and

it causes a lot of issues.

42

:

Yeah.

43

:

Yeah.

44

:

I am not surprised.

45

:

I will confess that while I was preparing for the episode, um, looking you up on the internet

was not easy because there were a lot of Ethan Smiths.

46

:

And if you input Ethan Smith Rochester, um

47

:

There is actually one Ethan Smith who unfortunately died in a plane crash.

48

:

Yeah, from Rochester, Minnesota.

49

:

Yeah, exactly.

50

:

I know him.

51

:

damn.

52

:

Yeah.

53

:

So I was reading up on that.

54

:

I was like, damn, this is a very sad story.

55

:

Not me though.

56

:

Different Ethan.

57

:

I'm still alive.

58

:

Yeah, thankfully.

59

:

So yeah, like actually, let's say talking about you.

60

:

Let's start talking about you.

61

:

um

62

:

So, as always when I start with my guests, and I think you know, because, uh,

you listened to the show, you told me.

63

:

And so I'm very, very grateful for that.

64

:

and so I know you do a lot of cool stuff, um, that listeners don't know about

yet.

65

:

So yeah, just give us the, the origin story.

66

:

What are you doing nowadays and how did you end up working on that?

67

:

Yeah.

68

:

So I'm getting, I'm finishing up my PhD.

69

:

at the University of Rochester.

70

:

um Specifically, I'm in the field of high energy density physics, uh which is sort of a uh

niche subfield uh in physics where we use extremely high powered lasers to compress matter

71

:

to some extremely distressing conditions.

72

:

And it turns out when you compress matter with lasers, you can...

73

:

you know, recreate some of the states that you would find, for example, at the center of

giant planets, or even in some cases at the center of stars.

74

:

And so that lets you recreate these systems in the laboratory and directly study how they

behave uh sort of at a much smaller scale than you can with observational astronomy.

75

:

And so that lets you learn a lot about the material properties of these astrophysical

objects.

76

:

And it also lets you create some just really interesting states of matter that you can't

find anywhere else on earth or even sometimes in the universe.

77

:

um And so, we create these very hot, dense states of matter for a billionth of a second.

78

:

And what the focus of my work is on is understanding how do you make a measurement, one,

how do you make a measurement of a system that that extreme and exists for a very short

79

:

amount of time?

80

:

And two, how do you interpret those measurements?

81

:

So measurements that we get out of these systems are often very integrated.

82

:

You know, there's this inverse problem: you have an image of

this plasma that you've created and you have to try and back out what was the temperature

83

:

and density of this, you know, miniature sun we've created in the laboratory.

84

:

And so that's sort of the thrust of my PhD: trying to use data science techniques,

including, of course, you know, Bayesian inference.

85

:

to take this set of uh information-rich but complicated measurements and understand what's

happening in these experiments.
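As a rough illustration of the inverse problem Ethan describes, here is a toy sketch: a made-up forward model mapping temperature and density to a single brightness measurement, and a grid posterior over both parameters. The forward model, noise level, and all numbers are invented for illustration, not the real radiation-hydrodynamics analysis.

```python
import numpy as np

# Hypothetical forward model: brightness grows with both density and
# temperature. This is an invented stand-in for the real physics code.
def forward_model(temperature, density):
    return density**2 * np.sqrt(temperature)

rng = np.random.default_rng(0)
true_T, true_rho = 4.0, 2.0
sigma = 0.5  # assumed Gaussian detector noise
observed = forward_model(true_T, true_rho) + rng.normal(0, sigma)

# Grid posterior with flat priors over plausible ranges.
T_grid = np.linspace(1, 10, 200)
rho_grid = np.linspace(0.5, 5, 200)
TT, RR = np.meshgrid(T_grid, rho_grid, indexing="ij")
log_like = -0.5 * ((observed - forward_model(TT, RR)) / sigma) ** 2
posterior = np.exp(log_like - log_like.max())
posterior /= posterior.sum()

# Posterior means. Note the degeneracy: many (T, rho) pairs give the
# same brightness, so one measurement constrains a ridge, not a point.
T_mean = (posterior * TT).sum()
rho_mean = (posterior * RR).sum()
print(T_mean, rho_mean)
```

The ridge-shaped posterior is exactly why combining several integrated measurements, as discussed in the episode, matters: each one cuts the degenerate ridge differently.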

86

:

um And so we've learned a lot.

87

:

uh J.J.

88

:

Ruby, who was on the show before, sort of got the ball rolling with thinking about

using Bayesian inference in this context.

89

:

And I've sort of inherited a lot of his work and carried it on to the next generation of

grad student.

90

:

And then I'm sure some grad student will come after me

91

:

and carry it even further, hopefully.

92

:

Oh, I didn't answer the second part of your question, which is how did I end up doing

this?

93

:

Which is a fair question, because it's very, when I tell people what I do, they're like,

how did you even know that you could do that?

94

:

ah And so I grew up in Rochester, New York, not Minnesota.

95

:

ah And at the University of Rochester, we have two of the largest lasers in the world.

96

:

They're actually the largest lasers at any academic facility anywhere.

97

:

And those are the Omega 60 and Omega EP lasers.

98

:

They're very impressive if you ever are in town and you want to come, you know, take a

tour.

99

:

Each of these lasers is about the size of a football field.

100

:

uh And, you know, we take these massive lasers and focus them down onto a, tiny point.

101

:

uh And it's very impressive.

102

:

And so, you know, I went to school in the area.

103

:

I went and toured.

104

:

these facilities as part of my undergraduate research program.

105

:

And so I was like, oh, that's really cool.

106

:

I didn't really think anything of it.

107

:

Like, of course this giant laser would be in Rochester.

108

:

Yeah, like that makes sense.

109

:

And then I was, you know, I decided to apply to grad school and I was, you know,

considering the University of Rochester, but I didn't want to really stay.

110

:

I wanted to go, you know, go somewhere else.

111

:

But then I met Rip Collins, who is my current advisor at a conference in Fort Lauderdale

of all places.

112

:

And he told me about the cool science that they were doing with those lasers, know, all of

the high energy density physics, the planetary science that you can do with these things.

113

:

And that really, you know, sort of convinced me that, yeah, like this is, you know, this

is even cooler than you might suspect by the fact that there's these football field size

114

:

lasers.

115

:

You can actually do really interesting science with these.

116

:

And that sort of inspired me to go down that road and then, you know, ended up working

with him for the past, ooh.

117

:

on five years now.

118

:

So it's been an experience, been a wild ride, doing a lot of laser experiments.

119

:

Yeah.

120

:

Okay.

121

:

So you were attracted by the big shiny laser, basically.

122

:

This was an important role.

123

:

I'll say the lasers themselves are impressive, but I didn't have any interest in...

124

:

using them, because I wasn't super motivated by the main uh mission of the Laboratory for

Laser Energetics, which is where these football-field-sized lasers are, which is the pursuit of

125

:

fusion energy, which is great and I think a worthwhile endeavor, but it ah didn't move me

because it's just about uh effectively making as many neutrons as you can from fusion.

126

:

And you just want that number to go up.

127

:

And that to me always felt like more of an engineering problem than a physics problem.

128

:

ah And so when you go and tour these facilities, a lot of emphasis is placed on this

fusion uh arm of the program.

129

:

And so what really inspired me was the physics of it all, right?

130

:

To make the number of neutrons go up, you need to really understand the physics of these

very extreme systems that exist at millions or billions of atmospheres of pressure.

131

:

And that's a very interesting and difficult problem from a theoretical standpoint and from

an experimental standpoint and statistical analysis.

132

:

And so that's what really got me interested.

133

:

And then, you know, I met JJ through that and he got me interested in Bayesian inference,

um which is an even more interesting subset of an interesting problem.

134

:

So I would say the Bayesian inference at the end of the day is what got me more than the

giant lasers.

135

:

Hmm.

136

:

Okay.

137

:

Interesting.

138

:

So thanks JJ first and second.

139

:

oh So basically the, like, the physics was there from the very beginning,

right?

140

:

So, um, yeah, can you also share your path into physics and, um, how did you first get

interested in that?

141

:

What was also your undergraduate training like and how do you think that shaped your

142

:

research direction today?

143

:

Yeah, sure.

144

:

So I took a pretty straightforward path.

145

:

you know, took science classes in high school.

146

:

I liked it.

147

:

I came from a very liberal arts family.

148

:

So my parents are both teachers.

149

:

My dad taught history and my mom's an English professor.

150

:

And so, you know, sort of teaching is the family business in my family.

151

:

And so I had always assumed I would go teach.

152

:

But I knew I didn't want to do English or history or any of these liberal arts things

because I hated writing.

153

:

uh And so I wanted to do, you know, I liked science and math a lot more.

154

:

um And so my family was always like, well, what are you going to do with that?

155

:

What are you going to, you know, a science degree?

156

:

What use is that?

157

:

You know, you know, and so I went into undergrad.

158

:

I went to uh SUNY Geneseo, which is about, you know, 40 minutes outside of Rochester.

159

:

So very far from home for me.

160

:

ah

161

:

I, you know, I had taken physics most recently, uh, as a, you know, junior and senior in high

school and I really liked it.

162

:

So I said, well, we'll keep, we'll keep, you know, stick with it.

163

:

And so I majored in it and I really liked it in, uh, college as an undergrad.

164

:

The plan was to go into, um, high school teaching.

165

:

So I was gonna, you know, enroll in the education school there and then do that.

166

:

But then I decided that.

167

:

That's not what I wanted to do.

168

:

And so I started um doing research, undergraduate research with a physics professor at

Geneseo, Kurt Fletcher, shout out.

169

:

And um that's, I mean, that's where I really learned, you know, that I could do science

and that doing science is really fun.

170

:

So that was a much more hands-on research project that I did in undergrad, turning

wrenches.

171

:

And we were building uh a spectrometer for, you know, ion backscattering studies or

whatever.

172

:

And, you know, that's, that's when I fell in love with science and doing science, both

from a hands-on perspective and just, you know, thinking about science is, is a really

173

:

interesting and rewarding pastime.

174

:

And so then I was like, well, maybe I should do more of this.

175

:

And so, you know, I applied to grad school, got in, and then I've, you know, been doing it

for uh five and a half years now and probably going to do more of it after this.

176

:

So I think that's, that's sort of the.

177

:

the timeline for me.

178

:

It's always just been like, ah, there's probably more science to know, more physics to do.

179

:

And so it's been a rewarding process so far.

180

:

Yeah, we'll see.

181

:

Yeah.

182

:

Um, that makes total sense.

183

:

I can, yeah, I can vouch for your, uh, your passion for the, the topic.

184

:

I'm also going to bet that you're going to do that for, for a long time.

185

:

and so actually what, what you already talked about is that your research lies at the

intersection of high energy density physics,

186

:

plasma spectroscopy, and Bayesian inference.

187

:

So could you give us a big picture of what that looks like, of what your current portfolio

of projects is?

188

:

Because I'm guessing that a lot of listeners are not familiar with high energy density

physics and plasma spectroscopy.

189

:

That's very fair.

190

:

High energy density physics is an interesting one because it exists at the intersection of

a lot of fields, right?

191

:

Because you have these very hot, dense

192

:

balls of plasma, you have plasma science going on, you have laser science because you're

using these giant lasers, you have to know how to use those and how to compress things

193

:

effectively.

194

:

And then of course, you know, how do you make a measurement?

195

:

Well, these things get really hot, they emit x-rays, they emit neutrons from fusion

reactions, and so you have to understand nuclear physics, you have to understand atomic

196

:

physics, and all these things.

197

:

And so sort of

198

:

At a high level, uh the kind of high energy density physics experiments I work on are

spherical implosions.

199

:

So you can think of uh the standard high energy density physics experiment as: you take a

planar foil, a piece of material, metal, and you just whack it on one side with the laser,

200

:

drive a shockwave through it, and compress it that way.

201

:

And that lets you get up to millions of atmospheres of pressure, which is uh unreasonable

conditions.

202

:

And it turns out that there's sort of a limit to how hard you can hit something with a

laser ah before you just start, you know, dumping all this energy into nonlinear

203

:

instabilities and whatnot.

204

:

And so a way to get to higher pressure is to do it in convergent geometry, which

essentially means you'd take your laser illumination from one side and you illuminate a

205

:

spherical target from all sides.

206

:

And so now...

207

:

Instead of getting compression in one dimension, you're getting compressions in all three

dimensions, right?

208

:

And so that lets you, you know, exponentially amplify the pressures you can reach, up to,

you know, billions of atmospheres, hundreds of billions of atmospheres even.

209

:

And so those, uh those convergent systems are even harder to measure.

210

:

At least in a planar geometry, you can, you know, measure something from the backside of

your target and maybe get a direct measurement of how fast it's going.

211

:

But in these experiments,

212

:

You know, it's complete.

213

:

All your measurements are completely integrated.

214

:

The X-rays that you measure from these have to travel through the target itself.

215

:

So you have to already know what temperature and density the target's at to understand

those X-rays.

216

:

And so the, the sort of, main focus of, of my research is understanding what's happening

in that 10 million degree ball of fire.

217

:

You know, it's a spherical plasma that's emitting X-rays and neutrons and all this stuff.

218

:

How do you understand how you image it?

219

:

ah How do you understand the spectra that come out of it?

220

:

Right?

221

:

If you have bound electrons in there, they're emitting, you know, the characteristic

emission like you see, uh I guess in science class, when you look at the neon lights, for

222

:

example, the fluorescent lights.

223

:

ah But how do those, you know, how do those lines change when you're compressed to 10

billion atmospheres of pressure and you're at 10 million degrees Kelvin?

224

:

Like the atomic physics of that are challenging.

225

:

But you can measure those transitions and you can learn a lot about the system from that.

226

:

And so then the question is, how do you measure these systems?

227

:

What do you measure?

228

:

And then what measurements give you the most information?

229

:

ah And so that's where the sort Bayesian inference comes in and working in an information

content sort of paradigm where you take all these measurements, you take X-ray pictures,

230

:

you take uh spectrographs of these systems.

231

:

What information can you get and what's the sort of max?

232

:

There's a certain number of measurements you can make, right?

233

:

You're limited by space on the target chamber.

234

:

ah And so understanding the information content of each measurement and what's the most

effective combination is something that is very non-trivial and something that I'm looking

235

:

into actively.
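The measurement-selection question Ethan raises can be sketched in an information-theoretic toy: compare candidate diagnostics by how much each shrinks the posterior entropy over a parameter of interest. The one-parameter model, the two candidate response functions, and every number below are invented for illustration only.

```python
import numpy as np

theta = np.linspace(0, 10, 500)            # parameter grid (say, temperature)
prior = np.ones_like(theta) / theta.size   # flat prior

def posterior_entropy(measure, sigma, true_theta=6.0):
    # Simulate one noisy observation of a candidate diagnostic, then
    # compute the Shannon entropy (in nats) of the resulting posterior.
    rng = np.random.default_rng(0)
    obs = measure(true_theta) + rng.normal(0, sigma)
    like = np.exp(-0.5 * ((obs - measure(theta)) / sigma) ** 2)
    post = like * prior
    post /= post.sum()
    p = post[post > 0]
    return -(p * np.log(p)).sum()

# Hypothetical candidates: one responds steeply to theta, one weakly.
H_steep = posterior_entropy(lambda t: 3.0 * t, sigma=1.0)
H_shallow = posterior_entropy(lambda t: 0.3 * t, sigma=1.0)
print(H_steep, H_shallow)  # the steeper response leaves less uncertainty
```

With limited ports on the target chamber, ranking candidate measurements by expected entropy reduction (averaged over simulated data, rather than the single draw used here) is one way to pick the most effective combination.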

236

:

And then the third piece of it is sort of what discoveries can we make with these tools,

right?

237

:

you know, we have these Bayesian statistics, we have all of these images and spectroscopy.

238

:

uh What physics discoveries can we make and, you know, what can we do?

239

:

How does that uh translate to our understanding of astrophysical systems and, you know,

the universe at large?

240

:

And I think that's really the most interesting part: we build all these really uh

complicated tools and all these fancy statistics and whatnot

241

:

uh to sort of peer inside these extreme systems and understand really what's going on, not

only in these systems, but in other extreme astrophysical systems.

242

:

And so there are a lot of interesting discoveries coming out of that, about atomic

physics, about nuclear physics, about what role does radiation play, right?

243

:

You're producing an enormous amount of photons.

244

:

uh

245

:

just because you have this hot, dense ball of gas.

246

:

And those, you can produce enough photons to affect the dynamics of the system.

247

:

The radiation flux is so large that the system behaves differently.

248

:

And understanding that is very non-trivial because a lot of these, you know, systems can't

be created anywhere else uh except, you know, here in the lab and understanding that is

249

:

very difficult.

250

:

So all of this requires some level of, you know, statistical inference.

251

:

And that's really what my, the big picture of my work is building the tools to do that

statistical inference.

252

:

You can make the analogy to a very similar uh type of experiment in particle physics at

particle colliders.

253

:

You have very distressing conditions there as well.

254

:

Although those are at orders of magnitude higher energy than we can produce with our laser

systems.

255

:

um

256

:

To understand what comes out of those violent particle collisions, where you're at, you know,

GeV, TeV energies, requires very complex statistical tools, a very high-level understanding

257

:

of your detector and very good models of those physical systems.

258

:

And, you know, the particle physics field is sort of a lot further along down this

road than we are.

259

:

And so we have to sort of, we're starting from a very much lower level.

260

:

And so we have to understand uh from a fundamental level, the physics of the system, the

detector, and how to do that inference rigorously.

261

:

Yeah, this is just fascinating.

262

:

So concretely, what does that look like to work with that kind of data?

263

:

First, what does your data look like?

264

:

Where does it come from?

265

:

um What's the size of them?

266

:

What does it look like to concretely work on these kinds of projects, before we dive a bit

more into the modeling complexities?

267

:

Yeah, sure.

268

:

oh

269

:

That's a good question.

270

:

The data oftentimes looks like a uh dot on a camera, like a little spherical blob, or

maybe if you're lucky, a series of spherical blobs that correspond to different times.

271

:

um Or if you're even luckier, you could get maybe a streaked continuous record in time

with one of these streak cameras that we've developed in the field.

272

:

uh And what it looks like is trying to build a sort of generative model of these blobs.

273

:

So using our best available knowledge of the physics, right?

274

:

We think we understand hydrodynamics.

275

:

We think maybe we understand how X-rays are produced in plasmas and how they propagate.

276

:

That may or may not be true, but we can take our best available models and try and build

277

:

a generative model of how the physics, we think the physics will evolve and how that will

manifest in the observed emission.

278

:

And then we need to also understand how our detector is constructing that blob.

279

:

And so that requires knowing what the resolution is and the sensitivity to different

energies of X-rays and as well as the statistics, right?

280

:

And that's the most complicated part.

281

:

And something that we're really working on is understanding

282

:

you know, if you have a given X-ray flux, what signal, what distribution of signals are

you going to measure?

283

:

And, you know, you have Poisson counting statistics for how many photons are absorbed, but

then the rest of it is a complete mess because it's so complicated.

284

:

And you're trying to make a measurement in a fraction of a billionth of a second.

285

:

And so all hell breaks loose basically, the statistics get very complicated.

286

:

There's correlations between neighboring data points.

287

:

And so understanding that is a really

288

:

sort of nuanced and complicated piece of it.
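The detector statistics Ethan describes can be sketched as a toy forward model: an idealized spherical "blob" of expected X-ray flux, Poisson photon counting, and a crude blur standing in for the instrument response that correlates neighboring pixels. The image size, radius, photon counts, and box-blur kernel are all illustrative assumptions, not the real detector model.

```python
import numpy as np

def blob_flux(size=64, radius=10.0, peak=50.0):
    # Expected photons per pixel for an idealized spherical emitter:
    # bright in the core, falling off with distance from center.
    y, x = np.mgrid[:size, :size]
    r = np.hypot(x - size / 2, y - size / 2)
    return peak * np.exp(-(r / radius) ** 2)

rng = np.random.default_rng(1)
expected = blob_flux()
counts = rng.poisson(expected)  # Poisson counting statistics

# Crude 3x3 box blur via shifted sums, a stand-in for the real
# point-spread function that correlates neighboring data points.
blurred = sum(
    np.roll(np.roll(counts, dy, axis=0), dx, axis=1)
    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
) / 9.0
print(blurred.shape, float(blurred.max()))
```

In a real analysis, this forward model would be evaluated inside the likelihood, so the inferred temperature and density account for both the counting noise and the instrument blur rather than being fit to the raw image.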

289

:

But all of it is just to produce this little, you know, dot on your camera, basically,

which has an intensity, it's got a radius, maybe it's got some shape, but it's very simple

290

:

data.

291

:

We're using technology from the 1800s.

292

:

It's like pinhole cameras and geometric optics.

293

:

It's not nearly as fancy as some of the...

294

:

diagnostics that are being fielded in other fields.

295

:

But that's largely because it can't be, right?

296

:

It's very hard to have X-ray optics in a compact space because X-rays don't really refract

the way that visible light does.

297

:

Damn, yeah.

298

:

This is fascinating.

299

:

So maybe first, can you walk us through the main technical or methodological obstacles you

face just even before starting to analyze and model this data?

300

:

Like, what's the first step in your workflow when you're working on one of the concrete

projects you've named at the beginning of the show?

301

:

Sure.

302

:

So the first step.

303

:

is designing these large scale experiments to be done on these football field size lasers.

304

:

And so that work usually starts about a little more than a year out.

305

:

You propose an experiment, it gets accepted or rejected or iterated on by the committee.

306

:

You get time on the laser, you get usually one day.

307

:

ah And so then your day is scheduled for the next fiscal year and you have to

308

:

design your experiment.

309

:

So you got to say, all right, what's my target going to look like?

310

:

What's the laser pulse going to look like?

311

:

How much energy do I want?

312

:

What am I looking to actually observe here?

313

:

And what measurement is going to give me the most information?

314

:

And so that requires a lot of simulations.

315

:

So we run these very advanced state of the art uh radiation hydrodynamics codes to sort of

predict how the system is going to behave.

316

:

It's never right because these systems are really complicated.

317

:

And so the codes can never quite capture what's going to happen, but we can get an idea.

318

:

uh And then a piece I've really been pushing is trying to do it predictively.

319

:

So sort of as a byproduct of this whole Bayesian inference generative modeling, you have

to have a full forward model of your detector.

320

:

So that lets you then input your hydrodynamic simulation, for example.

321

:

and predict what exact signal am I going to observe?

322

:

You know, am I going to saturate my camera?

323

:

Am I going to be able to resolve the signature, the spectral signature that I want?

324

:

And so it involves iterating, you know, at that level to try and simulate your data and

say, this is what I'm going to see.

325

:

This is what I think I'm going to see.

326

:

ah And then you do the experiment.

327

:

It's completely different.

328

:

And you're like, well, that's not what I thought was going to happen.

329

:

And then you have to start trying to interpret that data, figure out why, what's wrong,

what the simulation got wrong.

330

:

and then try and build a better model that's able to capture the data and sort of model

from, you know, physical principles, build a physical model of the system to model this

331

:

data and interpret it and try and extract the most information out.

332

:

So the technical challenge of designing the experiment is in its own right, you know, a

challenge because doing large-scale science is challenging.

333

:

It requires coordinating with a large team of people and

334

:

doing a lot of very tedious paperwork to make sure that you can field the experiment you

want to field.

335

:

And then after that, the data is incredibly complicated.

336

:

I think, um, you know, JJ famously uh only ever published data from one shot from his

entire graduate career because these experiments are very difficult to do and then even

337

:

more difficult to analyze.

338

:

And so it's considered a very successful uh PhD thesis if you can analyze at least one

data set.

339

:

And so being able to extend that and do it for multiple data sets is something that's even

more challenging, even once you've built up the tools, because each individual shot of the

340

:

laser is slightly different.

341

:

And the nuance of that is something that is...

342

:

Because anything can happen.

343

:

You can always have some fluke occurrence.

344

:

There's sort of an old apocryphal tale that someone was, someone had an experiment on the

laser and a squirrel chewed through a cable on the roof and the whole thing just shut off.

345

:

And for the, you know, that was it.

346

:

They didn't get any data because the squirrel chewed through a cable on the roof.

347

:

It's like, well, couldn't have predicted that.

348

:

The code did not, uh did not contain the necessary squirrel physics um to predict that

outcome.

349

:

And so any, any number of things can go wrong.

350

:

Um, but being able to, to build a robust model and statistical inference engine to

interpret these data sets and get meaningful information out, uh, is, that's, I would say

351

:

that's the largest, um, sort of technical challenge.

352

:

And two, I mean, there's a lot of data, but it's also not a lot

of data by modern standards, right?

353

:

We can shoot the laser maybe one, you know,

354

:

12 times in a single day, which is a lot for a laser that size.

355

:

But you maybe get eight shots of usable data.

356

:

And so it's very hard to scan through parameter space and answer the questions that you

might want to answer.

357

:

So it's a very complicated and difficult uh paradigm to operate in.

358

:

But it's very fun and rewarding as well.

359

:

Yeah.

360

:

Yeah, again, guys.

361

:

Uh, and how does, what's the timeline, you know, on these projects you're working on?

362

:

Like for how long does it tend to, like, what's a range of duration you guys are working

on these, on these projects?

363

:

Because it sounds so daunting that like, I'm guessing you need a lot of time.

364

:

Yeah.

365

:

Well, I mean, the timeline for an experiment as you know, as I said, a little over a year,

but then to actually analyze that, I mean, we're talking.

366

:

the length of a PhD thesis, right?

367

:

JJ analyzed one shot in his thesis.

368

:

I'm going to analyze slightly more than one shot, but not by much.

369

:

uh It takes, yeah, I would say it takes about five years to finish one of these projects.

370

:

uh And a lot of that is because, you know, I'm a graduate student, I'm learning.

371

:

There's a lot I didn't know I didn't know when I started, that I know I don't know
now.

372

:

ah And so that definitely makes it

373

:

hard to get going at the beginning.

374

:

uh But when you're starting from scratch, right, you don't have any of the uh tools

necessarily built out yet.

375

:

It's very difficult to get going.

376

:

so, you know, that's sort of, hopefully the, you know, I've built up enough tools that

the, you know, my successor, whoever they may be, will be able to build on that just as I

377

:

built on what JJ did and, he built on what came before him.

378

:

And so.

379

:

Hopefully it accelerates, it is very challenging to be able to build out these tools.

380

:

Plus, I'm guessing a lot of people are working on these projects. Physics projects are
usually very collaborative.

381

:

So what does that look like for you?

382

:

How many people are working with you on these projects?

383

:

Maybe it's even inter-country collaborations.

384

:

I think it's very interesting to also look a bit at that and...

385

:

how it looks backstage because not a lot of projects are as high, as large scale and

international as physics projects.

386

:

Yeah, that's a good point.

387

:

So one thing that's kind of unusual about the field of high energy density physics is it is
very collaborative.

388

:

We collaborate uh all over the world.

389

:

There's laser systems that we go use all around the world in Europe, Japan, all over.

390

:

Um, and so there is that collaboration, but at the same time, it's small enough scale that

you sort of own your project.

391

:

Right.

392

:

So I proposed my experiments.

393

:

I had a lot of help fielding that experiment, right.

394

:

From a, you know, diagnostic and laser side, right.

395

:

I don't know anything about laser systems.

396

:

Uh, so there's a whole team of laser professionals at the lab who run the laser.

397

:

Um, and so, you know, there's a lot of help from them, but at the end of the day, you

know, I designed it. As a graduate student, I

398

:

designed the target, what the laser is going to look like, what diagnostics we're going to

field, and then I'm in charge of analyzing it.

399

:

And so it's very collaborative in the sense that everyone works together to get good data,

but you do own your own project at the end of the day.

400

:

And there's a lot of freedom that comes with it to pursue what questions you think are

interesting, um, and sort of push the, push the boundaries.

401

:

But of course, you know, there's always, um, you can always go talk to experts in the

field.

402

:

um

403

:

because there is that collaboration, those connections.

404

:

But it's definitely not a collaboration in the sense that, like, CERN or ATLAS are, where you
have, you know, a hundred people working towards the same uh goal.

405

:

It's a lot more individualized.

406

:

Yeah.

407

:

Okay.

408

:

Okay.

409

:

I see.

410

:

uh And so is there a particular mentor?

411

:

that has actually been very important for you so far in your career that has helped you

jumpstart your learning or maybe unblock you on a project?

412

:

Yeah, well, I'll definitely shout out JJ Ruby here.

413

:

He taught me everything I know about Bayesian inference.

414

:

um But then, he went off to go work at Lawrence Livermore and now he's at the Houston

Astros.

415

:

but we still, we still talk.

416

:

So he's, he's been a huge help, um, you know, helping me, because he was in, he was in
my shoes not long ago.

417

:

So he has a lot of, uh, advice. But then, yeah, it's been, it's a lot of, um, community
among the graduate students at the lab.

418

:

And so there's a lot of peer-to-peer mentoring.

419

:

Um, I'll, I'll shout out David Bischel and Alex Chin.

420

:

They were two senior graduate students who sort of helped me out when I first came in and knew

absolutely nothing.

421

:

And, you know, they sort of taught me what they knew.

422

:

ah And now I'm the senior graduate student.

423

:

And so I'm trying to pay that forward and, you know, teach what I learned and what they

taught me.

424

:

so there's this sort of, you know, sense of community among graduate students because

we're all sort of in the same boat.

425

:

Grad school is hard.

426

:

And so it's very...

427

:

It's very helpful to have that.

428

:

em You have someone you can talk to who's been through this.

429

:

So it's not just one mentor, it's a whole lab of them.

430

:

Yeah, I can guess that.

431

:

And yeah, of course, I will put also JJ's appearance on the show in the show notes for

people who want to dig deeper because JJ definitely has a very interesting background and

432

:

path that I think is going to be very valuable for listeners to listen to, especially if

you are interested in...

433

:

Either physics or baseball analytics or both.

434

:

Definitely give a listen to this episode.

435

:

And now let's talk a bit more concretely.

436

:

How do you incorporate Bayesian inference in your work?

437

:

Especially so the kind of high energy density physics uh data you have.

438

:

What advantages or limitations arise from using Bayesian techniques?

439

:

Yeah, great question and one that people ask me all the time.

440

:

They're like, well, what's the point?

441

:

Who cares about Bayesian inference?

442

:

Why are you doing this?

443

:

To which I say Bayesian inference is really the tool that you need to answer a problem

like this.

444

:

at a very high level, right, you observe this data and you need to understand the system

that created it.

445

:

And so this is a, you know, it's an inverse problem.

446

:

And so it's very difficult to just invert the data.

447

:

You have to do it.

448

:

in a generative framework.

449

:

And so right away Bayesian inference is very well suited for that kind of thing.

450

:

You don't have to do Bayesian inference.

451

:

You can do forward modeling uh without Bayesian statistics.

452

:

But it's just such a natural framework for it.
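To make that forward-modeling framework concrete, here is a toy sketch — not Ethan's actual analysis code; the forward model, prior, and numbers are all invented for illustration. A single physical parameter is inferred from one noisy measurement by evaluating prior times likelihood on a grid:

```python
import numpy as np

# Hypothetical nonlinear forward model: maps a physical parameter
# (say, a temperature in arbitrary units) to a predicted signal.
def forward(temperature):
    return temperature ** 1.5 / 10.0

rng = np.random.default_rng(0)
true_temperature = 4.0
noise_sigma = 0.05
observed = forward(true_temperature) + rng.normal(0.0, noise_sigma)

# Grid-based Bayes: posterior is proportional to prior times likelihood.
grid = np.linspace(1.0, 8.0, 2001)
dx = grid[1] - grid[0]
prior = np.exp(-0.5 * ((grid - 4.5) / 2.0) ** 2)  # weakly informative prior
likelihood = np.exp(-0.5 * ((observed - forward(grid)) / noise_sigma) ** 2)
posterior = prior * likelihood
posterior /= posterior.sum() * dx  # normalize to a density on the grid

posterior_mean = np.sum(grid * posterior) * dx
print(posterior_mean)  # lands near the true value of 4.0
```

The real problems trade the grid for NUTS or SMC in many dimensions, but the structure — prior, forward model, likelihood of data given theta — is the same.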

453

:

And then on top of that, it allows you, ah, what I think is probably the most powerful thing
is to include prior information in a way that you otherwise can't.

454

:

So we have historical data, for example, going back.

455

:

several decades to the older version, you know, the, the Omega laser system has been

around since the nineties.

456

:

So we have a wealth of historical data that helps us understand these systems.

457

:

And so that you can use that as a form of prior knowledge to sort of inform, you know,

your expectations of what's going to come out of what you're going to learn from these

458

:

experiments.

459

:

And so that, you know, provides a self-consistent way to sort of include that prior

knowledge that will, you know,

460

:

improve the statistical inference that you're able to make from these.

461

:

Because you can take a lot of measurements, but they're not, they may not have as much

information as you need them to have.

462

:

And so being able to impose prior knowledge is also very valuable.

463

:

And then also, you know, one thing we struggle with in our field is propagation of

uncertainties, because you have very highly nonlinear systems.

464

:

And sometimes you can't uh build a model that's able to uh sort of characterize your level

of uncertainty and Bayesian inference folds that in completely self-consistently and

465

:

correlations and whatnot.

466

:

If you're doing uh Markov chain Monte Carlo, or even if you're doing something like
stochastic variational inference, you can still get a much better understanding of the

467

:

uncertainty uh in your system based on your data, which is

468

:

very non-trivial for these systems.

469

:

So I would say that's the big three uh reasons why we use Bayesian inference to analyze

the data.

470

:

It's because I think it just makes sense.

471

:

Yeah, I mean, preaching to the choir, but yeah, completely relate, of course.

472

:

And actually, can you describe one experiment or project that

473

:

you're particularly proud of where you used Bayesian inference, um what you measured, how,

why it's significant and what does the model look like?

474

:

Yeah, sure.

475

:

Well, it's still ongoing, so it's not done.

476

:

I don't know if I'm allowed to be proud of it if uh it hasn't been published yet, but

that's fine.

477

:

um So one thing that...

478

:

um

479

:

This sort of the what's become my main project at this point is uh making a sort of

measurement of the equation of state of a billion atmosphere pressure plasma.

480

:

So the equation of state is just the relationship between pressure, temperature, density,
basically.

481

:

You think of the ideal gas law, uh van der Waals equation of state maybe to name a few.

482

:

And so

483

:

It's a very important material property, right?

484

:

Because it says, if you're at a temperature and density, what's the pressure, what's the

energy density, and so forth.
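The simplest equation of state mentioned here, the ideal gas law, relates those quantities directly. A quick sketch in SI units (the density value is just an illustrative, roughly-air number):

```python
# Ideal gas equation of state: P = n * k_B * T (SI units).
K_B = 1.380649e-23  # Boltzmann constant, J/K

def ideal_gas_pressure(number_density_m3, temperature_k):
    """Pressure in pascals from number density (1/m^3) and temperature (K)."""
    return number_density_m3 * K_B * temperature_k

# Roughly air at room conditions: ~2.5e25 particles per m^3 at 300 K.
p = ideal_gas_pressure(2.5e25, 300.0)
print(p)  # ~1e5 Pa, i.e. about one atmosphere
```

The billion-atmosphere plasmas discussed here sit nine orders of magnitude above this benchmark, which is why the question of what replaces the ideal gas law there is interesting.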

485

:

And so we've sort of developed the tools to do this at lower pressures, uh fairly

routinely.

486

:

You can measure the temperature, density, and pressure of a planar experiment, for

example.

487

:

So you can get to very high pressure, but you can't, as I said, there's a limit to
how hard you can hit something with a laser.

488

:

So you can't measure the equation of state beyond

489

:

you know, 10 million atmospheres or so.

490

:

And so, you know, you have to go to a convergent geometry, but that's very difficult.

491

:

And so, um sort of what's become my mission is making that measurement using Bayesian uh

inference techniques and other statistical methods, you know, machine learning and whatnot, to

492

:

take this really, this big data set um that, you know, consists of

493

:

multiple different X-ray cameras, spectrometers, all sorts of, you know, the full kitchen

sink and analyzing that data set, this multi-messenger data set, uh comprehensively at the

494

:

same time and sort of in the data space as much as possible.

495

:

So sort of from first principles, if you will.

496

:

And so, you know, this was a really complicated system just to build, right?

497

:

You have to build models of all these detectors.

498

:

and understand the statistics uh in some sense, and then do the inference, which is going

to be a very high dimensional problem um requiring the very advanced sampling tools

499

:

provided by PyMC and other libraries.

500

:

And so what we were able to do is take that data set and

501

:

throw this very complicated model at it, which is it's complicated, but it's simple,

right?

502

:

Because it's basically, you know, you have your temperature and you have your density and

your pressure, and you have a very simple, but sort of complete physical picture of how

503

:

they evolve in time and space that's sort of parameterized in terms of these physical

quantities, but flexible enough to be able to fit the data while encoding, you know, our

504

:

prior belief on the physics of the system.

505

:

Like for example, you we think the temperature should go up and then down, right?

506

:

And then stuff like that.

507

:

And so doing that, building this model that, you know, treats the evolution of the system

and then, you know, propagates that forward.

508

:

What X-rays does that produce?

509

:

How do we observe those?

510

:

What would we measure if that were the case?

511

:

Bayesian inference, you know, the likelihood of uh data given theta.

512

:

And then we are able to sort of independently constrain the

513

:

pressure, temperature, and density from the available suite of measurements.

514

:

And so that gives you sort of separate measurements of what's the temperature and density

and what's the pressure.

515

:

And those are related by the equation of state.

516

:

And so this gives you a very fancy way to measure the equation of state using these

advanced statistical methods.

517

:

And so we were able to do that for the first time, and that gives you sort of insight into

what physics is happening, right?

518

:

You can sort of...

519

:

You can compare it to your models.

520

:

You can do further inference on those conditions that you infer and try and understand what's

happening here.

521

:

Are there sort of interactions that we aren't taking into account in the physics of the

system?

522

:

Is it just the ideal gas law?

523

:

Is there something else going on?

524

:

uh Are the photons that we're creating producing their own pressure?

525

:

It lets us ask a lot of interesting questions.

526

:

So, to my knowledge, it's the first measurement uh out there

527

:

uh

528

:

at these billion atmosphere pressures.

529

:

And so, you we're still working on uh putting that together for publication, but just, you

know, the sheer lift of putting it together and doing it in a self-consistent way is

530

:

something that I'm proud of and has really, you know, pushed uh what JJ did with his

single diagnostic, single shot analysis to a, you know, a more complete framework.

531

:

And then, you know, because I took multiple shots of data,

532

:

I can, I think I've analyzed seven, seven shots now.

533

:

Uh, so that's, you know, a vast improvement for, for the statistics, you know, square root of
N, so much more, uh, so much more certain.
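The square-root-of-N improvement mentioned here is just the standard error of a mean shrinking with the number of independent shots (the per-shot uncertainty below is illustrative):

```python
import math

# Averaging N independent shots shrinks the uncertainty like 1/sqrt(N):
# going from 1 analyzed shot to 7 cuts it by sqrt(7), about 2.6x.
shot_sigma = 1.0  # illustrative per-shot uncertainty (arbitrary units)
se_1 = shot_sigma / math.sqrt(1)
se_7 = shot_sigma / math.sqrt(7)
print(se_1 / se_7)  # sqrt(7), roughly 2.65
```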

534

:

Yeah.

535

:

Yeah.

536

:

Yeah.

537

:

Damn.

538

:

Well done.

539

:

Um, yeah, I know this is, this is such a, such a hard work, but very impressive.

540

:

And how, so first practical question, because I'm always curious.

541

:

What?

542

:

What did you use to code the actual model and to sample?

543

:

Did you use any already existing package?

544

:

um Did you have to come up with your custom packages, maybe even custom sample?

545

:

How does it work?

546

:

What does it look like?

547

:

So I am a big proponent of Python, doing everything in Python, which is uh an increasingly
popular opinion, it turns out.

548

:

Python...

549

:

uh

550

:

When I first started, no one in my lab was using Python, which was crazy, except for JJ.

551

:

Everyone was using, like, MATLAB, Excel, Mathematica.

552

:

Python was...

553

:

Damn, this is... A lot of people still use Excel.

554

:

I will say, you know, in their defense, what they're able to do with Excel is nothing short of
magic.

555

:

mean...

556

:

Yeah, yeah.

557

:

I mean, at that level, you have to be an Excel...

558

:

I mean, they're doing everything I can do in Python.

559

:

They are doing it in Excel.

560

:

It just looks ridiculous.

561

:

Yeah.

562

:

Yeah.

563

:

Damn.

564

:

Wow.

565

:

Okay.

566

:

Um, but good choice.

567

:

I, I, I completely support you, Ethan and JJ, in this, this endeavor.

568

:

uh, yeah, so, so we're big, big, uh, proponents of Python and the sort of big knock on

Python is that it's slow.

569

:

but that's actually becoming less and less of an issue when you have things like, you
know, just-in-time compilation, uh, with JAX,

570

:

and other things like, you know, automatic differentiation.

571

:

All of that is sort of closing the gap between Python and these sort of machine languages,
you know, Fortran and C++ and whatnot.

572

:

And so the ease of use is so much better that, you know, coding in Python is really, I
think, I mean, you obviously agree, the practical choice at this stage.

573

:

uh And so, you know,

574

:

A lot of it is pre-existing, right?

575

:

There's the JAX libraries.

576

:

A lot of code exists.

577

:

I didn't have to recode np.trapz or anything like that.

578

:

But there's a lot of very bespoke code that I had to rewrite in JAX.

579

:

That's an implementation that doesn't exist that does exactly what I want it to do.

580

:

so I had to do a lot of software development uh considering I...

581

:

came into college or uh graduate school rather with zero, basically zero Python

experience.

582

:

And so I had to teach myself from the ground up uh how to code in Python, and then how to
code in JAX, and then how to develop a large-scale software package.

583

:

ah It's not beautiful, but it does work.

584

:

that's another thing that I guess I should add that I'm proud of is the sheer amount of

code that I've written.

585

:

uh for this project.

586

:

It's a massive repository.

587

:

uh Just to, you know, analyze a couple pieces of data.

588

:

Yeah, shout out to Python, shout out to JAX.

589

:

um We used to use PyMC as our sampling library to do the inference.

590

:

We switched recently to NumPyro because, you know, this was when NumPyro was sort
of new and JAX was sort of cutting edge.

591

:

And now I think PyMC can use JAX as its backend.

592

:

Yeah, I mean, you can.

593

:

Yeah, we have the three backends.

594

:

So C is the default one.

595

:

You can use Numba and you can use JAX.

596

:

So depending on the project you're working on, it might be useful to use Numba, or it might
be faster using JAX.

597

:

It depends on your model.

598

:

yeah, that's great.

599

:

Anyway, you're using a very good open source package and you build on top of um it.

600

:

What are you using as a sampler?

601

:

NUTS? MCMC?

602

:

So yeah, it depends.

603

:

eh Usually we use, you know, NUTS, the No-U-Turn Sampler.

604

:

um I mean, you have to sort of coax it to do what you want on occasion, but it's...

605

:

sort of the Ferrari of samplers, as far as we're concerned.

606

:

Occasionally, you know, when you drive a Ferrari on a road full of potholes, you will
occasionally encounter a negative result.

607

:

And so we may have to use, you know, something like sequential Monte Carlo, SMC, or some
other more robust but slower method.

608

:

uh Nested sampling is another one that I've used a lot.

609

:

So it's unfortunate because you'd like there to be one sampler that always works, but that

is never the case.

610

:

And you always have to do something else.

611

:

Sometimes when you get really desperate, you'll be like, all right, we've got to use a

totally different sampling library to get what we want.

612

:

But that's part of the and part of the fun.

613

:

It's already working for most of the, like, for 98% of the cases, which is
amazing.

614

:

I mean, that's, and so, yeah.

615

:

And that's great to hear that you didn't have to code your custom sampler.

616

:

Right.

617

:

So you see folks, you don't need to, you don't need to write your custom samplers anymore.

618

:

Especially, I'm, I'm, I'm speaking to the economists, uh, the econometricians among us.

619

:

Like, stop writing your own custom Gibbs samplers.

620

:

Just, uh, just use the ones that are in PyMC or NumPyro or Stan, or whatever you're using
as a probabilistic programming language is going to work well.

621

:

And there are now.

622

:

More than the NUTS sampler, there are many more of them.

623

:

I 100% agree.

624

:

And I've been trying to proselytize a little bit around different research groups.

625

:

I've got some Bayesian inference outreach uh presentations that I've been giving, trying

to spread the word that, hey, Bayesian inference is not as hard as it used to be in the

626

:

olden days.

627

:

You don't have to write your own sampler.

628

:

You can just use one of these excellent uh

629

:

pre-made ones that are only getting better and more sophisticated.

630

:

exactly.

631

:

The barrier to entry is so much lower now.

632

:

Exactly.

633

:

It's way, way lower.

634

:

So now you can spin up a Bayesian model and sample from it very fast and don't even have to
worry about the sampling issues because it's all taken care of for you.

635

:

I would say it's

636

:

It's a great time to be alive.

637

:

m And a lot of people are scared of Bayesian inference because it sounds fancy, but it's,
you know, really, you know, quite easy.

638

:

You just build your model, turn the crank on Bayes' rule, and get out your posterior.

639

:

Yeah, exactly.

640

:

Not a lot of things to do now.

641

:

It's pretty much, pretty much all automated.

642

:

It's solved.

643

:

Yeah.

644

:

Bayesian inference, it's all done.

645

:

Yeah.

646

:

Actually, do you have um any good practices that you want to share with people when

they're working with big code bases as you are?

647

:

um Well, in your case in Python, but it can be, can be anything else.

648

:

Like what are, what are the good practices you've found to not lose yourself basically

when working on these, on these big code bases and also probably to help.

649

:

uh collaboration as we were talking about.

650

:

Well, the first thing is always comment your code.

651

:

And I say this with full knowledge that I am not very good at this, but always, for your

future self at the very least, you know, write down your thought process so you can come

652

:

back to it.

653

:

But then on a, you know, sort of more sophisticated level, uh there's a sort of white
paper by Andrew Gelman.

654

:

I think it's called a principled Bayesian workflow, that sort of lays out best practices for
constructing a model.

655

:

And so, you construct your model, you uh test the prior predictive distribution.

656

:

You say, okay, is my prior correct?

657

:

And then you go.
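A prior predictive check like the one described can be as simple as sampling parameters from the prior and pushing them through the forward model. This sketch uses a made-up exponential-decay model in plain NumPy rather than any particular probabilistic programming library:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 5.0, 50)  # hypothetical measurement grid

# Made-up forward model: an exponentially decaying signal.
def forward(amplitude, decay):
    return amplitude * np.exp(-t / decay)

# Sample from the priors and simulate data sets: the prior predictive.
n_draws = 1000
amplitude = rng.lognormal(mean=0.0, sigma=0.5, size=n_draws)
decay = rng.lognormal(mean=0.0, sigma=0.5, size=n_draws)
prior_predictive = np.array([forward(a, d) for a, d in zip(amplitude, decay)])

# The check: do the simulated data sets look physically sensible
# (right sign, right order of magnitude) before touching real data?
print(prior_predictive.shape)         # (1000, 50)
print(prior_predictive.min() >= 0.0)  # signals should be non-negative
```

If the prior predictive produces obviously unphysical data sets, the prior gets revised before any inference is run.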

658

:

And I think one thing that's really useful, especially in, you know, the sort of work that

I do is building a sort of synthetic experiment and, you know, making sure that you can

659

:

get out

660

:

the information you think you can get out, right?

661

:

Because if you build this whole synthetic set of synthetic diagnostics and your model, and

then you take the data and you try to analyze it and you actually don't get any

662

:

information out.

663

:

Well, great.

664

:

You wasted a bunch of time and money on that experiment and you didn't get what you
wanted out.

665

:

And so being able to sort of, before you've even done the experiment, predict what you're,

what you're going to measure and then what information you can get out of that is hugely

666

:

helpful.

667

:

And really, you know, should be a prerequisite for doing any of these experiments and

showing that you think it's going to work.

668

:

I mean, there's value in just shooting a laser and seeing what happens, but at this stage,

you know, the field is advanced enough that we should be doing this a little more

669

:

predictively.

670

:

And then two, sort of um benchmarking and testing your model along the way.

671

:

And so one thing that I really struggled with is I sort of put the model together all at

once.

672

:

And I just pressed go and it didn't work.

673

:

And I was like, well, what, why didn't it work?

674

:

What's breaking?

675

:

And so then I was like, well, okay.

676

:

I took it apart and piece by piece, you know, and I tested each module, made sure it did

what I thought it was doing.

677

:

You know, ran the inference, you know, ran the synthetic experiment, but let's say, okay,

let's say there's no noise.

678

:

It's a perfect measurement.

679

:

Can I, does this work?

680

:

so building confidence in your model is very important and convincing yourself that it

works so that you can convince others.

681

:

And that helps you find a lot of the issues.

682

:

Because you definitely can lose yourself ah in the model building process.

683

:

As fun as it is, ah once the model gets too big, there's nothing you can do.

684

:

gotta...

685

:

You gotta just, you know, one brick at a time.

686

:

And after you place each brick, make sure that that brick is a brick.

687

:

Hmm.

688

:

Yeah, yeah, okay.

689

:

Yeah, I see what you mean.

690

:

So basically...

691

:

Having...

692

:

A principled Bayesian workflow, building block by block and testing it along the way.

693

:

Is that correct?

694

:

So how do you benchmark and test your model actually?

695

:

How do you, what does a good model look like to you in this kind of project?

696

:

How do you call a model good?

697

:

How do you evaluate it?

698

:

How do you benchmark it?

699

:

I think it's a very important question, not only in your case, but in everybody's case,

like all my models, I need to benchmark them.

700

:

So yeah, I'm curious, what does that look like for you and what does that mean to have a

good model?

701

:

Right.

702

:

So for me, for the specific use case that I work on, a good, well, all models are wrong,

but some models are useful.

703

:

So a good model is one that is useful that you can use to get useful information about

your system out.

704

:

And so what that usually looks like is we'll run a, you know, state of the art, a full

physics simulation, you know, hydrodynamics code.

705

:

simulate our experiment, generate synthetic data from that using our state-of-the-art

detector models, and then do inference on that synthetic data and say, okay, so we know

706

:

what the ground truth value is from our simulation.

707

:

Can we recover that with our model?

708

:

And oftentimes the model is a vastly simplified physical model, right?

709

:

Because these state-of-the-art simulations take hours to sometimes days to run one

simulation.

710

:

So you can't

711

:

really iterate over parameter space on that time scale.

712

:

And so it's important to do this full physics and benchmark your reduced order model on

that.

713

:

And then if that works great, so maybe you benchmarked it against a 1D simulation, it

worked great.

714

:

You go up to a 2D simulation or a 3D simulation and see, does it still work?

715

:

Do we still get useful information?

716

:

And so you're verifying what you can get out of this set of integrated measurements.

717

:

Can you recover the ground truth value?

718

:

In the limiting case, when you know what it is.
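That benchmarking loop — simulate with known ground truth, then check that the inference recovers it — looks like this in miniature. A brute-force least-squares fit stands in for the full Bayesian inference, and the decay model is invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 5.0, 100)

# Stand-in "full physics" forward model with known ground-truth parameters.
def forward(amplitude, decay):
    return amplitude * np.exp(-t / decay)

truth = (2.0, 1.5)
data = forward(*truth) + rng.normal(0.0, 0.02, size=t.size)  # add detector noise

# Stand-in reduced-order inference: brute-force least squares over a grid.
amps = np.linspace(0.5, 4.0, 200)
decays = np.linspace(0.5, 4.0, 200)
best, best_sse = None, np.inf
for a in amps:
    for d in decays:
        sse = np.sum((data - forward(a, d)) ** 2)
        if sse < best_sse:
            best, best_sse = (a, d), sse

print(best)  # should land close to the ground truth (2.0, 1.5)
```

Escalating the stand-in generator from a 1D to a 2D to a 3D simulation, as described above, repeats exactly this recovery test with progressively more realistic synthetic data.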

719

:

And that builds confidence that the result you get out of the experimental data is

meaningful and useful.

720

:

And so that's a very important piece of model development for the specific use case I work

on.

721

:

I don't know what that really looks like in other fields where maybe in social sciences

and whatnot.

722

:

I don't know if you can simulate an entire economy.

723

:

uh

724

:

and the stock market, maybe you can.

725

:

Yeah, you probably can.

726

:

um I think Jesse Grabowski would be able to talk about that, but I know he does a lot of

simulation work.

727

:

mean, at some point also it looks a lot like physics simulations when you're simulating a

full economy.

728

:

uh There's a lot of interactions and dynamics.

729

:

so state space models actually look

730

:

like, they look a lot like um physics simulations.

731

:

You can definitely use that in physics, I'm sure.

732

:

a lot of physicists do go work on Wall Street after their PhDs.

733

:

It's either baseball or Wall Street.

734

:

Right.

735

:

Yeah.

736

:

I mean, you can really do anything with physics.

737

:

yeah, that's what you could have answered your parents.

738

:

Well, you can do anything with a physics degree.

739

:

Well, I didn't know that at the time, but now I know.

740

:

Yeah.

741

:

Now you can't, now you can't.

742

:

And actually, since you talk about simulations, I know Andrew Gelman, for instance, is
a big proponent of simulating the data and testing the models on simulated data, which I

743

:

do all the time also because, you know, I'm a good student.

744

:

I listen to Andrew.

745

:

He knows way better than I do.

746

:

What does that look like for you?

747

:

You know, how do you...

748

:

balance simulation, modeling, and experiment in your own work?

749

:

And a bit more generally, which side, if there is one, theory or experiment, do you find

more challenging or more rewarding?

750

:

Yeah, well, so I do a fair amount of simulations, right?

751

:

I use these large scale physics codes to simulate my experiments.

752

:

ah And, you know, as I alluded to earlier, they're sort of always wrong.

753

:

They're not predictive.

754

:

which is a big problem, ah especially when you start thinking about using implosion

experiments for fusion applications, right?

755

:

You want to make more neutrons, and so you want to be able to use your simulations to say,

what do I change to make more yield, more fusions?

756

:

ah And so having a code that's not predictive is not particularly useful in that capacity.

757

:

And so...

758

:

For me, the use of the simulations is that it provides a sort of physically

self-consistent system to test your models on.

759

:

It's not correct.

760

:

It's not a perfect picture of what's going to happen in your experiment, but it provides a

sort of synthetic ah universe, a self-consistent universe for you to sort of test your

761

:

models on.

762

:

And so to me, I mean...

763

:

I think any experimentalist would tell you that the experiment is king.

764

:

Reality, what actually happens is the most important thing.

765

:

And the theory should try to be predictive of what actually happens rather than,

oftentimes what happens, you'll do the experiment, it doesn't match your initial

766

:

simulation.

767

:

And so you'll tune your simulation to sort of match it and you'll add some fudge factors,

whatever.

768

:

you're like, look, we discovered this thing.

769

:

It's like, wow.

770

:

It would have been nice if you could have predicted it beforehand.

771

:

uh And so I sort of, you know, make fun of my theorist colleagues a little bit, because, you
know, they have these predictions and it's like, well, it doesn't match, it doesn't match the

772

:

experiment. And if not, then... I don't know. But then they'll always find that, well, maybe
your experiment is wrong.

773

:

It's like but you know it happens.

774

:

Okay, good. You know, we're all sharing the same experiential reality. We all agree
that we did the experiment and this is what happened.

775

:

But it's a, it's a, yeah, it's an interesting uh interplay,

776

:

because both sides are saying the other one is wrong and that it has to be, we have

quantum accurate density functional theory calculation.

777

:

This is the truth.

778

:

But every side has its, there are skeletons in everyone's closet.

779

:

It's things that we don't understand.

780

:

so, ah trying to sort of unify those two silos is an ongoing struggle and something that

we've been trying to work on: to

781

:

have theorists and experimentalists work together on experimental design, for example, uh

to do an experiment that better answers the outstanding questions.

782

:

um But, you know, insofar as I go, my uh philosophy is always that the theory should

try to be predictive of the experiment.

783

:

And if it's not, then it's not a useful simulation outside of this, you know, idea of a

synthetic experiment to benchmark your models.

784

:

Yeah, yeah, yeah.

785

:

Yeah, yeah, that makes sense.

786

:

and yeah, I completely, completely agree with you.

787

:

have a way of doing things for sure.

788

:

Uh, that's also in my own work.

789

:

How, how I do it in the end.

790

:

Experiments and out-of-sample predictions are king.

791

:

It's like if, if they don't go well, but the rest is looking good, then my, my belief

in the model being an accurate depiction of reality

792

:

is definitely decreased by a lot.

793

:

And something I am very curious about just to change gears a bit, a bit more general

question, but it seems to me like high energy density physics can be contributing to a lot

794

:

of different subfields of physics, uh astrophysics, planetary science, materials research

even under extreme conditions.

795

:

uh

796

:

How do you see high energy density physics?

797

:

It's hard for me.

798

:

uh We usually just uh abbreviate it as HED physics, if that's easier.

799

:

Okay, HED.

800

:

That's going to be easier.

801

:

So how do you see HED contributing to the different subfields and what are maybe some

exciting projects that are looking into that currently?

802

:

Yeah, absolutely.

803

:

So, I I sort of alluded to this earlier, but you're recreating states that exist at the

center of astrophysical objects.

804

:

And so it turns astronomy from an observational science to one that you can do in the

laboratory.

805

:

ah And so one of the, this is sort of the focus of the Center for Matter at Atomic Pressures,

which is an NSF Physics Frontiers Center at the University of Rochester.

806

:

um So we're, you know,

807

:

It's using NSF money to get time on these lasers to study sort of outstanding astrophysical

questions.

808

:

And sort of one of those uh questions is sort of dealing with the miscibility of hydrogen

and helium in these gas giant planets.

809

:

So, you know, we know now from the Juno mission and other space probes that we don't really

understand what's going on inside these gas giants inside Jupiter.

810

:

In particular, you know, is the core, is it a solid core?

811

:

Is it dilute?

812

:

Turns out it's somewhere in the middle, where it's the sort of fuzzy core.

813

:

And that sort of flies in the face of everything we thought we knew about planetary

formation.

814

:

And so, you know, these laser driven laboratory experiments are sort of, you know,

positioned to help us understand uh things about, you know, how the component

815

:

pieces of the center of Jupiter

816

:

How do they behave under those immense pressures?

817

:

You have ah millions and millions of atmospheres of pressure at the center of Jupiter.

818

:

And so how does hydrogen behave?

819

:

It turns into a liquid metal.

820

:

I mean, that's weird.

821

:

We should know more about that.

822

:

And so being able to create that state of matter in the laboratory and study how it

behaves has huge implications for our understanding of these gas giants and other even

823

:

terrestrial planets.

824

:

ah Even our own Earth, right?

825

:

There's a lot of really interesting geoscience being done at these laser facilities

through CMAP and otherwise.

826

:

And so that's one end.

827

:

And then, you know, at the other end, you can go to these convergent systems like I have that can create

stellar interior relevant conditions.

828

:

And so you can start to understand, you know, our own sun, which there are still a lot of

questions about, turns out, even though it's the closest star and we can make a lot of

829

:

measurements of it.

830

:

You can't really see, you can't send a probe into the center of the sun.

831

:

Turns out it'll get incinerated.

832

:

And so we don't even know what's, there's even some debate about what's in the sun.

833

:

We're not a hundred percent sure what the metallicity, what elements are in the sun.

834

:

And that stems largely from disagreement between spectroscopic measurements and uh

helioseismological measurements.

835

:

So that's measuring the oscillations of the sun.

836

:

and inferring from that the composition.

837

:

And so, you know, those are sort of two separate measurements, but they both require

knowledge of, you know, high energy density plasmas and, you know, their equation of

838

:

state.

839

:

How do sound waves propagate through the sun?

840

:

uh That requires knowledge of material properties of hydrogen plasmas at billions and

billions of atmospheres.

841

:

uh And then, you know, spectroscopy requires knowledge of atomic physics at these extreme

conditions, you know, millions of degrees.

842

:

And so,

843

:

Being able to create these systems in the laboratories is hugely impactful for

understanding the uh physics.

844

:

Or rather, it could be.

845

:

I don't think it's living up to its potential uh as of yet, right?

846

:

And a large part of that is this: uh there's a large amount of siloing uh in physics in

general, but especially in this field, because you have people who know how to do laser

847

:

experiments, and you have people who know how to do astronomy or astrophysics.

848

:

And they don't talk usually.

849

:

They're in different subfields.

850

:

They might meet at a conference, but they don't, you know, they're speaking different

languages effectively.

851

:

And so being able to, you know, bring them together and say, Hey, what are the

astrophysical questions we can answer?

852

:

You know, cause we're doing these experiments based on what we can do.

853

:

And that's not always directly relevant to these astrophysical questions.

854

:

So you, you know, bring in astrophysicists and say, you know, if you could snap your

fingers and we could tell you one thing

855

:

about the matter at the center of Jupiter, what would it be?

856

:

And then finding ways to answer that with laser experiments.

857

:

And that's one of the missions of CMAP is, I think they're calling it in-reach as opposed

to outreach, talking within the scientific community and trying to break down the walls of

858

:

that silo because it's very difficult to merge disciplines when each one is so myopically

focused on what they know and what they know how to do.

859

:

uh And then I'll just say the last thing that we're sort of doing is sort of pushing ah

the boundaries uh of these fundamental theories in physics, right?

860

:

When you assemble matter at these extreme conditions, you start uh pushing towards the

breakdown of these fundamental theories: uh quantum mechanics, special relativity, not general relativity yet.

861

:

We can't make black holes.

862

:

Everyone always asks me.

863

:

So can you make a black hole with this laser?

864

:

It's like, no, not even close, sorry.

865

:

But we can sure get things hot enough where special relativity begins to play a role.

866

:

And so what we're able to do by pushing the boundaries of these fundamental theories is to

see where they break down.

867

:

Quantum mechanics is sort of always considered in a vacuum, zero temperature, even an

isolated atom.

868

:

What happens if you put that atom in a dense plasma that's 10 times the density of lead?

869

:

It's very difficult to calculate, but we can do it in the laboratory and we can observe

what happens.

870

:

So I'll shout out work by my colleague, David Bischel, ah who is measuring, he's a

spectroscopist.

871

:

And so he's using implosions and doing spectroscopy on those and seeing how the presence

of this incredibly dense plasma affects the energy levels of your atom.

872

:

And it turns out that, you know, you think of the energy, the quantized energy levels of your

atom as sort of fixed, right?

873

:

You know, 13.6 eV for hydrogen.

874

:

But when you're in a dense plasma, they can actually shift.

875

:

It can move the energy levels of your atom.

876

:

And that's, you know, kind of very complicated to understand and model.

877

:

The free electrons are being sort of sucked inside by the nucleus and screening your

energy levels so that the energy levels shift.

878

:

And that's sort of changing how we think about quantum mechanics and energy levels and

bound states and free electrons and all that.

879

:

And so there's a lot of interesting physics to be done with these giant lasers.

880

:

And we're really only just scratching the surface, uh largely because we couldn't really

interpret these measurements too easily until now.

881

:

And we live in the information age now with all these diagnostics and Bayesian inference

and machine learning.

882

:

Lots of, lots of interesting stuff.

883

:

Yeah.

884

:

Yeah.

885

:

This is, this is absolutely incredible.

886

:

Um, thanks for, for sharing that on the show.

887

:

Definitely add any links that you think are going to be relevant, um, to the show notes

because, I'm sure a lot of listeners are going to want to, uh, to learn more about the

888

:

different, topics you just, I mean, research topics you just, you just talked about.

889

:

Um.

890

:

And so you've been generous with your time.

891

:

So I'm going to start playing this out.

892

:

I still have a lot of questions for you, but a main one is if, so you see, well, you just

said we're living in a great time now when it comes to diagnostics.

893

:

ah Do you see any new diagnostic technologies?

894

:

ah Whether that's

895

:

novel detectors or quantum sensors or anything else that you're excited about or keeping

an eye on, not only for your work, but for your field in general?

896

:

Yeah.

897

:

Well, so, I mean, this is a great time for diagnostic development.

898

:

We're, you know, at the, at the LLE, we're, we're developing new diagnostics all the time.

899

:

You know, many, many PhD theses are uh devoted to developing these new diagnostics.

900

:

um In terms of.

901

:

I think the most, you know, the one that I have my eye on uh is, they're at the LLE developing

the next-generation streak camera.

902

:

uh And so a streak camera, basically, it's, you know, just a detector that takes a signal,

an X-ray signal and sweeps it in time.

903

:

So you get a continuous record of your X-ray.

904

:

So you can use that to take a time resolved spectrum ah or for example, a time resolved

image.

905

:

ah And so they're very useful diagnostics, but they always break and

906

:

ah You know, there's a joke going around that there's a conservation of streak cameras.

907

:

So, you know, if one is working, that means somewhere in the world, there's another streak

camera that is not.

908

:

And that holds fairly true for me in my experience.

909

:

And so they're working on the next generation that's going to be very robust.

910

:

ah Sean McPoyle is the graduate student who's developing it.

911

:

um And so he's doing a very,

912

:

thorough bottom-up approach, you know, simulating the electron optics that are required to

build this thing and, you know, building the best possible one, optimizing it from a very,

913

:

you know, from first principles approach.

914

:

And so that's going to enable really precise ultrafast uh spectroscopy of these, you know, uh

billionth-of-a-second-lived plasmas.

915

:

And that's going to be able to, you know, totally push the boundaries of what we can

measure.

916

:

Cause I mean, even, you know,

917

:

David Bischel's work looking at these moving energy levels.

918

:

That measurement is, by modern spectroscopic standards, terrible.

919

:

The resolution is not super great.

920

:

The time resolution is not either. You can see that the lines move; you don't get much more than

that.

921

:

ah And so this next generation is going to be much more, it's going to be higher

resolution, it's going to be better contrast.

922

:

And so that's something that I'm really excited about.

923

:

It's called the BHX spectrometer.

924

:

um The other thing that I'm really excited about, um let's see, we have time resolved

X-ray diffraction.

925

:

ah So one thing that we do a lot of, you know, is X-ray diffraction, standard in materials

science.

926

:

um We also do it at this sort of nanosecond scale where we blast a sample with a laser and

then we hit it with a bunch of X-rays and you can measure the diffraction peaks that tell

927

:

you what structure it's in.

928

:

um And so that's told us, you know, we've discovered new uh states of matter.

929

:

using that technique.

930

:

uh And so it, you know, but it's a single measurement.

931

:

And so now we're developing the capabilities to do time resolved diffraction where you're

measuring the movement of those Bragg diffraction peaks with time as you compress it.

932

:

So you can see phase transitions happen, which is a really outstanding question in

materials science in general: how do phase transitions happen, and how do they happen at

933

:

extreme conditions?

934

:

And so we're going to be,

935

:

you know, watching that closely to see what insights that develops.

936

:

And then, you know, the last thing is, you know, the sort of development of high-rep-rate, you

know, diagnostic technologies.

937

:

Cause right now we're in a, you know, we're sitting on sort of a pond of data.

938

:

You know, we've been operating the Omega laser since the nineties.

939

:

There's a lot of data, but we're taking 10 shots a day, one shot an hour basically.

940

:

And that produces a lot of data, but it's, you know,

941

:

In terms of, you know, compared to particle physics, it's nothing.

942

:

Like I could store most of the data.

943

:

I can store all the data I've taken from my PhD on my laptop.

944

:

And it's not a, not an issue.

945

:

Um, but you know, the next generation of lasers coming out are going to be much faster.

946

:

They're going to be, you know, diode-pumped.

947

:

And then they're going to be able to fire, you know, on the Hertz scale.

948

:

So we're talking, like, one shot every five seconds or a minute even.

949

:

And so.

950

:

that is going to rapidly increase the amount of data that we have just in general.

951

:

so understanding what to do with that is a challenge that I think is sort of uniquely

suited for the sort of data science techniques that I've been developing and other

952

:

collaborators have been working on to understand that.

953

:

And then what can you do if you have a shot every five seconds?

954

:

What do you do with that level of repetition rate?

955

:

uh What science questions can you answer and then how do you process that data and deal

with it?

956

:

How do you even build a target that you can shoot that fast?

957

:

And so there's a lot of, I think that's really the frontier of the field.

958

:

It's where the field is headed is this high rep rate frontier uh to be able to start

behaving more like other fields where you do a bunch of experiments and you can get very

959

:

good statistical certainty.

960

:

Yeah.

961

:

Actually that reminds me earlier in the show, you were saying that

962

:

Um, in one of your projects, the, the data changes with time and space.

963

:

so I'm curious how, how are you modeling that if you already are?

964

:

um Well, so, I mean, at the end of the day, it goes back to, you know, the

hydrodynamic equations, Navier-Stokes, and then, you know, however you want to reduce it

965

:

from there.

966

:

um And so understanding how it evolves with time and space is, you know, a very

non-trivial physics problem, especially, you know, when you're dealing with laser plasma

967

:

coupling and then, you know, beyond just the basic physics questions of what's the

equation of state of this thing.

968

:

How does a laser, now you introduce a laser field and transport properties are even less

well understood.

969

:

And so, you know, I've sort of adopted a philosophy of, you know, trying to keep things as simple as

possible.

970

:

So, you know, keep things at like a Newton's second law level, right?

971

:

You balance your forces, right?

972

:

The laser is exerting some force on your target and then that necessarily produces an

acceleration.

973

:

And so one nice thing about Bayesian inference is you can have these parameters.

974

:

that you infer from your data.

975

:

So if you have a trajectory of your imploding capsule, you can say, OK, well, I know the

acceleration from these images that I've taken.

976

:

I can see this thing converging.

977

:

So I know it's going inward.

978

:

There must be some force driving that acceleration.

979

:

I can infer that from Bayesian inference.

980

:

So trying to keep the models as simple as possible, so you don't have to solve the

full Navier-Stokes equations with radiative transfer and all that.
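That Newton's-second-law-level inference can be sketched in a few lines. Everything below is hypothetical (made-up numbers, plain NumPy, a constant-acceleration model rather than the actual analysis pipeline), but it shows the idea: put a prior on the drive acceleration and let noisy radius measurements from the images update it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical capsule trajectory: radius (um) vs time (ns) under a
# constant inward acceleration, observed with measurement noise
t = np.linspace(0.0, 1.0, 20)
a_true, r0, noise = -50.0, 400.0, 5.0
r_obs = r0 + 0.5 * a_true * t**2 + rng.normal(0.0, noise, t.size)

# Linear-Gaussian model: r = r0 + (t^2/2) * a + eps, prior a ~ N(0, 100^2).
# Because the model is linear in a, the posterior over the acceleration
# (and hence the drive force F = m * a) is available in closed form.
X = 0.5 * t**2
prior_var, like_var = 100.0**2, noise**2
post_var = 1.0 / (1.0 / prior_var + X @ X / like_var)
post_mean = post_var * (X @ (r_obs - r0)) / like_var

print(f"a = {post_mean:.1f} +/- {np.sqrt(post_var):.1f} um/ns^2")
```

The closed-form update is only possible because this toy model is linear; a richer trajectory model would swap this for MCMC in something like PyMC or NumPyro.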

981

:

That's the level that we're at. In the future, you know, we'd like to be able to use these

state-of-the-art um codes, these radiation hydrodynamic codes, as the model, right? Because

982

:

that's a complete picture of all the physics involved. The problem is, you know,

they're slow. You know, they're taking hours to run one simulation, and they're also, you

983

:

know, there's a lot of parameters in these very complicated codes, as you can imagine. And

so, you know, the sort of future is to build a sort of Bayesian version

984

:

of one of these codes that's ideally written in Python, ah fully differentiable with

respect to the parameters and much faster.

985

:

And probably that looks something like building a neural network surrogate ah of one of

these codes or something.
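A minimal sketch of that surrogate idea, with a hypothetical one-parameter `slow_simulator` standing in for a radiation-hydrodynamics code and a polynomial least-squares fit standing in for the neural network (none of this is the actual code being discussed):

```python
import numpy as np

def slow_simulator(x):
    # Stand-in for an expensive rad-hydro run:
    # one input parameter in, one scalar observable out
    return np.sin(3.0 * x) + 0.5 * x

rng = np.random.default_rng(0)
x_train = rng.uniform(-1.0, 1.0, 40)   # a batch of "simulation runs"
y_train = slow_simulator(x_train)

# Cheap surrogate: polynomial features fit by least squares. A neural
# network would play the same role, just with more capacity.
deg = 7
coef, *_ = np.linalg.lstsq(np.vander(x_train, deg + 1), y_train, rcond=None)

def surrogate(x):
    # Fast, differentiable approximation of slow_simulator
    return np.vander(np.atleast_1d(x), deg + 1) @ coef
```

Once trained, the surrogate can be called thousands of times inside an inference loop at negligible cost, which is exactly what the slow simulator prevents.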

986

:

um We'll see what the future holds, but that's the dream is to move to more and more

complex physical models.

987

:

uh

988

:

sort of trying to follow the blueprint that the particle physics people do.

989

:

They have this very complicated, complete physical model and they use that to make

predictions.

990

:

And so we'd like to be able to do the same and then infer using the best available models.

991

:

for now, very simple, very simple pictures.

992

:

Yeah.

993

:

I mean, you have to start somewhere, especially with all the complexities you were talking

about before.

994

:

And the block building philosophy, you were...

995

:

uh

996

:

talking about, which I definitely agree with, because that way you know what is working and

what is not.

997

:

So that's extremely helpful.

998

:

Now I was asking because, me, I hear spatiotemporal and I think about Gaussian processes,

of course.

999

:

you know, like I was curious.

:

01:20:11,294 --> 01:20:12,334

of course.

:

01:20:12,675 --> 01:20:16,628

And that's part of the simple model, right?

:

01:20:16,628 --> 01:20:18,730

When you don't know how something evolves in time, right?

:

01:20:18,730 --> 01:20:19,981

The temperature is a great example.

:

01:20:19,981 --> 01:20:25,756

That, you know, depends on all of the thermal processes, conduction, radiative heating,

you know, expansion.

:

01:20:25,756 --> 01:20:28,567

All of those things go into that and it's very difficult to model that.

:

01:20:28,567 --> 01:20:32,808

ah But you know, the temperature goes up and then down.

:

01:20:32,808 --> 01:20:37,379

And so the perfect sort of vehicle for that is of course, modeling as a Gaussian process.

:

01:20:37,379 --> 01:20:42,601

And that's an infinitely flexible tool for that and something we've we've leaned on a lot.

:

01:20:42,601 --> 01:20:43,031

Yes.

:

01:20:43,031 --> 01:20:45,771

I forgot you're a big fan of Gaussian processes.

:

01:20:45,771 --> 01:20:47,652

I should have led with that.

:

01:20:48,112 --> 01:20:48,512

I am.

:

01:20:48,512 --> 01:20:48,932

am.

:

01:20:48,932 --> 01:20:55,684

So did you try actually some GPs on that or that actually doesn't work because

:

01:20:55,684 --> 01:21:05,921

When the data is too big, or, an even better justification would be, well, we know more

structure than we can impose on a GP.

:

01:21:06,341 --> 01:21:06,902

No, yeah.

:

01:21:06,902 --> 01:21:09,703

Well, so we use Gaussian processes.

:

01:21:09,914 --> 01:21:15,087

I mean, I use Gaussian processes at every available opportunity just because they're a

very powerful tool.

:

01:21:15,087 --> 01:21:19,871

So what we'll often do is model a piece of the model.

:

01:21:19,871 --> 01:21:23,873

For example, the temperature history is a function of time as a Gaussian process, right?

:

01:21:23,873 --> 01:21:24,988

m

:

01:21:24,988 --> 01:21:28,430

That's sort of, you know, there's stuff that we know more about, right?

:

01:21:28,430 --> 01:21:35,583

The Newton's second law piece I was talking about earlier, there's physics pieces that we

know, and when possible we use the physics.

:

01:21:35,943 --> 01:21:41,225

But you know, when you can't, you want to use something that is flexible and able to

capture the data.

:

01:21:41,225 --> 01:21:45,467

ah Well, you know, we know the temperature is a continuous function of time.

:

01:21:45,467 --> 01:21:49,469

And so Gaussian process is the perfect sort of prior to place on that.

:

01:21:49,949 --> 01:21:51,760

And sort of, you know.

:

01:21:51,760 --> 01:21:56,994

lets the data inform what that is while giving an estimate of the uncertainty.
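A hedged sketch of that pattern in plain NumPy, with made-up temperature points: an RBF-kernel GP prior encodes only "temperature is a smooth function of time," and conditioning on a few observations gives both a mean history and pointwise uncertainty. In practice this piece would sit inside a larger PyMC or NumPyro model rather than stand alone.

```python
import numpy as np

def rbf(x1, x2, ls=0.3, var=1.0):
    # Squared-exponential kernel: encodes "temperature is a smooth,
    # continuous function of time" without fixing its shape
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

# Hypothetical temperature history: rises, then falls (times in ns, T in keV)
t_obs = np.array([0.1, 0.4, 0.7, 1.0])
T_obs = np.array([0.5, 2.0, 1.5, 0.6])
t_new = np.linspace(0.0, 1.0, 51)

K = rbf(t_obs, t_obs) + 1e-4 * np.eye(t_obs.size)   # small noise/jitter term
Ks = rbf(t_new, t_obs)
mean = Ks @ np.linalg.solve(K, T_obs)               # posterior mean history
cov = rbf(t_new, t_new) - Ks @ np.linalg.solve(K, Ks.T)
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))     # pointwise uncertainty
```

The uncertainty band collapses near the observations and widens between them, which is exactly the "let the data inform it, with an error bar" behavior described above.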

:

01:21:59,417 --> 01:22:00,898

Yeah.

:

01:22:00,898 --> 01:22:01,668

Great.

:

01:22:01,668 --> 01:22:04,040

This is a very interesting thing to do.

:

01:22:04,230 --> 01:22:15,670

And both NumPyro and PyMC have very good GP modules that you can use flexibly um in

your models without forcing the model to be a pure GP.

:

01:22:15,670 --> 01:22:19,293

So this is something extremely, extremely helpful.

:

01:22:19,293 --> 01:22:20,854

um

:

01:22:21,794 --> 01:22:27,198

Maybe a few more questions before I ask you the last two questions.

:

01:22:27,198 --> 01:22:34,042

I promise I won't be too long, but I'm curious now, playing this out, thinking a bit more

about the future.

:

01:22:34,042 --> 01:22:37,624

What's your vision for scaling up experiments?

:

01:22:37,925 --> 01:22:42,127

Whether that's energy, repetition rate, fidelity.

:

01:22:42,308 --> 01:22:51,233

In the next few years, what do you hope the field will get when it comes to scaling up

experiments?

:

01:22:51,521 --> 01:22:54,344

Yeah, well, so it's going to be a sort of two pronged approach.

:

01:22:54,344 --> 01:23:01,609

I think what we're already seeing is a move towards high rep rate at existing facilities.

:

01:23:01,810 --> 01:23:08,295

So the sort of most prominent example is the DiPOLE laser at the European XFEL in

Hamburg, Germany.

:

01:23:08,295 --> 01:23:16,102

And so they've built a rep-rated, sort of joule-class laser that can do, you know, one

shot.

:

01:23:16,102 --> 01:23:19,945

It's supposed to be, I think one Hertz, but it's not quite there.

:

01:23:19,993 --> 01:23:27,618

ah It can shoot every couple minutes or so, which is much better than the previous

version, which had like an hour cooldown time or whatever.

:

01:23:27,618 --> 01:23:36,083

And so now you can just rattle through a bunch of different targets and you can do

someone's whole PhD thesis in an hour.

:

01:23:36,083 --> 01:23:45,058

ah And so I think moving in that direction, this high rep rate, um, that's where we're

heading.

:

01:23:45,058 --> 01:23:46,861

And I think that's a good direction to head.

:

01:23:46,861 --> 01:23:51,082

The other piece is going to be, we're going to need to build new facilities.

:

01:23:51,082 --> 01:23:56,024

And, you know, the Omega laser is, as I mentioned several times, very old.

:

01:23:56,024 --> 01:23:57,104

It's from the nineties.

:

01:23:57,104 --> 01:23:59,305

It's using nineties laser technology.

:

01:23:59,305 --> 01:24:00,345

It's showing its age.

:

01:24:00,345 --> 01:24:02,906

um And it's a really good facility.

:

01:24:02,906 --> 01:24:03,846

It does really good work.

:

01:24:03,846 --> 01:24:11,828

um But we're going to have to start thinking about the next generation of laser drivers to

be able to sort of keep pushing the boundaries.

:

01:24:11,828 --> 01:24:14,209

And so obviously the National Ignition Facility

:

01:24:14,209 --> 01:24:17,151

out at Livermore is uh megajoules.

:

01:24:17,151 --> 01:24:21,973

So it's 10 times bigger than the Omega 60 laser, which is only 30 kilojoules.

:

01:24:22,273 --> 01:24:30,869

And so that serves its own purpose, but you can only shoot that once a day because it's so

massive, the lamps take forever to cool down.

:

01:24:30,869 --> 01:24:38,292

And then you create so many neutrons from your fusion reactions that it uh makes the whole

chamber radioactive.

:

01:24:38,292 --> 01:24:43,929

So there, I think, is a middle ground here, sort of between the

:

01:24:43,929 --> 01:24:53,883

kilojoule class laser that Omega is now and the mega joule laser at NIF, you know,

probably like a hundred kilojoules is probably the sweet spot, uh where it's also, you

:

01:24:53,883 --> 01:25:02,907

know, you're able to rep rate that and do all of, all of the cool science that the

Omega 60 laser does now uh and more, right?

:

01:25:02,907 --> 01:25:11,141

Cause you have a little bit more energy. But being able to do that at scale, at rep

rate, I think would be a very worthwhile endeavor.

:

01:25:11,141 --> 01:25:21,420

And we'll see there, you know, they're in discussions to build the next generation of

Omega 60, which is sort of tentatively named the unimaginative "Omega Next," which I think is

:

01:25:21,420 --> 01:25:33,160

a terrible name and needs to be changed before we start proposing it to the government. But

um yeah, I think that's, that's the way forward: rep rate and, you know, more

:

01:25:33,160 --> 01:25:38,825

energy. But you know, I don't think we need to go, you know, people are talking about

building NIF 2.

:

01:25:39,126 --> 01:25:44,270

uh Right, because now that they got ignition over there, we need to build a 10-megajoule

laser.

:

01:25:44,270 --> 01:25:50,213

And that's great, but I think there's a lot more interesting scientific questions that can

be answered with a 100-kilojoule laser.

:

01:25:50,213 --> 01:25:54,877

uh Like Omega 60, but Omega next.

:

01:25:54,877 --> 01:25:55,897

Damn, yeah.

:

01:25:55,897 --> 01:26:00,981

Yeah, that sounds like a powerful beast to build.

:

01:26:00,981 --> 01:26:06,384

um Now, one last question before the last two questions.

:

01:26:06,384 --> 01:26:07,845

um I know...

:

01:26:08,327 --> 01:26:16,368

So you're, you're maybe starting to teach it or at least you're, you're doing some

Bayesian outreach as you were saying.

:

01:26:16,368 --> 01:26:24,121

So I'm curious, what advice do you give younger scientists entering the field?

:

01:26:24,441 --> 01:26:29,363

What skills, mindset, training do you think are most critical?

:

01:26:30,063 --> 01:26:30,413

Yeah.

:

01:26:30,413 --> 01:26:37,778

Well, I think it's one thing I wish, you know, people had told me, and I think a

misconception about

:

01:26:37,778 --> 01:26:43,872

you know, grad school in general, is I tell people I'm doing, you know, a PhD in physics,

and they're like, you must be so smart.

:

01:26:43,872 --> 01:26:45,453

You must be a genius.

:

01:26:45,573 --> 01:26:48,155

And it's like, I don't think that's necessarily true.

:

01:26:48,155 --> 01:26:52,398

I think more than anything, mental toughness is what you need, cause

:

01:26:52,398 --> 01:26:54,319

Grad school is a grind.

:

01:26:54,479 --> 01:26:57,321

And there are, there are days when you're like, I hate this.

:

01:26:57,321 --> 01:26:58,001

I want to quit.

:

01:26:58,001 --> 01:27:02,945

Um, and you know, there are, you know, just as many days where like, you know, this is

great.

:

01:27:02,945 --> 01:27:06,597

Like I, I feel like I'm unraveling truths of the universe.

:

01:27:06,799 --> 01:27:13,271

And being able to sort of manage those highs and lows takes a remarkable amount of mental

toughness.

:

01:27:13,271 --> 01:27:21,883

And so what I've seen over the course of my career is graduate students who are successful

are ones who are able to grind it out.

:

01:27:22,123 --> 01:27:24,444

And it helps to have a sense of community.

:

01:27:24,444 --> 01:27:30,585

And so that would be my other piece of advice is make friends with your lab mates and

fellow grad students.

:

01:27:30,926 --> 01:27:34,746

it'll get better, allegedly.

:

01:27:36,903 --> 01:27:44,430

it's, it's important to, to have mental toughness and not lose sort of the big picture,

uh, side of it all.

:

01:27:44,430 --> 01:27:44,620

Right.

:

01:27:44,620 --> 01:27:51,697

Cause, you know, well, you're doing cutting-edge science, and it's, it's very, uh, you

know, remarkable in that sense.

:

01:27:51,697 --> 01:27:57,512

And so being able to not lose sight of that while maintaining your, your toughness,

I think that's the key to success.

:

01:27:57,512 --> 01:27:59,163

That's, that's my advice.

:

01:28:01,073 --> 01:28:01,373

Yeah.

:

01:28:01,373 --> 01:28:02,243

Yeah.

:

01:28:02,243 --> 01:28:16,834

So resilience and the compounding effect of the small daily grind is definitely something you

recommend, and something I definitely push with my own students.

:

01:28:16,834 --> 01:28:20,617

Like, because I get the same, yeah, I guess the same reactions.

:

01:28:20,617 --> 01:28:27,372

Um, each time I talk to people about what I do and it's always like, oh yeah, I'm not good

at math.

:

01:28:27,372 --> 01:28:29,183

know, I've never, it's not for me.

:

01:28:29,183 --> 01:28:29,939

Um.

:

01:28:29,939 --> 01:28:38,379

And I think trying to develop more of that growth mindset where it's well, you know, I'm

not sure it's for me either.

:

01:28:38,379 --> 01:28:41,579

You know, it's just, I, I love it and I love learning new things.

:

01:28:41,579 --> 01:28:48,379

And I think working on difficult problems is the best way to learn new things because

otherwise I get bored.

:

01:28:48,379 --> 01:28:56,811

But this is extremely uncomfortable and hard because by definition, if you're always at

the frontier of the field you work in,

:

01:28:57,115 --> 01:28:59,016

You're working on things that aren't solved yet.

:

01:28:59,016 --> 01:29:10,224

And so you don't have that feeling of, uh, yeah, being comfortable because you never know

exactly how you're going to solve what you're working on.

:

01:29:10,224 --> 01:29:12,665

And so, even if there is a solution.

:

01:29:12,946 --> 01:29:13,516

Yeah.

:

01:29:13,516 --> 01:29:14,657

Well, so that's even worse.

:

01:29:14,657 --> 01:29:15,327

Yeah.

:

01:29:15,327 --> 01:29:18,290

Let's not open up that Pandora's box, but yeah, yeah, no, for sure.

:

01:29:18,290 --> 01:29:21,055

Um, yeah, you don't even know if there is a solution.

:

01:29:21,055 --> 01:29:26,715

Um, and, and so, yeah, I think having.

:

01:29:27,347 --> 01:29:33,727

loving what you're working on because it gives you something you really value.

:

01:29:34,607 --> 01:29:47,287

And just being able to persistently get point one percent better every day is going to

give you a very valuable compounding effect.

:
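(Editor's aside, not from the episode: the "0.1% better every day" claim can be checked with quick arithrmetic-free compounding; improving by 0.1% daily for a year multiplies out to roughly a 44% gain.)

```python
# Back-of-the-envelope check of the compounding claim:
# getting 0.1% better every day for a year.
daily_gain = 0.001  # 0.1% improvement per day
days = 365

total = (1 + daily_gain) ** days
print(f"After {days} days: {total:.2f}x baseline")  # roughly 1.44x
```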

01:29:47,907 --> 01:29:49,207

100 % agree.

:

01:29:50,767 --> 01:29:51,187

Awesome.

:

01:29:51,187 --> 01:29:57,013

Well, Ethan, I think it's time to let you go because, as you've seen in the Google doc...

:

01:29:57,013 --> 01:30:00,546

I have more questions, but let's call it a show.

:

01:30:00,546 --> 01:30:06,090

um But before letting you go, of course, you know, I'm going to ask you the last two

questions.

:

01:30:06,090 --> 01:30:07,932

I ask every guest at the end of the show.

:

01:30:07,932 --> 01:30:14,997

ah So first one, if you had unlimited time and resources, which problem would you try to

solve?

:

01:30:15,518 --> 01:30:18,100

So, all right, this is a bit of a niche one.

:

01:30:18,921 --> 01:30:25,886

But I'm very interested in this idea of pycnonuclear fusion.

:

01:30:26,438 --> 01:30:28,610

And so, you know, most people are familiar with thermonuclear fusion.

:

01:30:28,610 --> 01:30:29,921

That's what happens in the sun, right?

:

01:30:29,921 --> 01:30:33,443

You have, you know, hydrogen nuclei, very hot, they crash into each other and they fuse.

:

01:30:33,443 --> 01:30:45,111

ah There's another mechanism that's believed to occur in some super dense astrophysical

objects, like in white dwarfs, for example, where you have sufficient density that you

:

01:30:45,111 --> 01:30:52,456

have, you know, two atoms that are close enough together that they spontaneously fuse, not

due to, you know, high velocity collisions, but just because of proximity.

:

01:30:52,776 --> 01:30:55,818

And, you know, it sounds a lot like cold fusion.

:

01:30:55,948 --> 01:30:59,941

But this is a little bit more uh scientifically sound.

:

01:30:59,941 --> 01:31:09,929

And so there's a longstanding uh belief of my advisor that you can do this on a laser

facility like the National Ignition Facility.

:

01:31:10,290 --> 01:31:13,812

But you would need to make it incredibly high density.

:

01:31:14,453 --> 01:31:18,497

And you'd have to keep it cold enough that thermonuclear fusion doesn't happen.

:

01:31:18,497 --> 01:31:25,272

And so if the government gave me unlimited time and money, I would just devote the entire

time of the National Ignition Facility.

:

01:31:25,272 --> 01:31:28,343

to trying to make as dense of an object as I could.

:

01:31:28,343 --> 01:31:31,334

And maybe we'd get the black hole that everyone always asks me about.

:

01:31:31,334 --> 01:31:38,786

But ideally, you'd show this pycnonuclear fusion, which is, I think, one of the coolest

things.

:

01:31:38,786 --> 01:31:45,368

um And it's a largely unsolved problem because we have no idea what's happening in those

white dwarfs.

:

01:31:45,368 --> 01:31:49,209

And we can't yet make that plasma in the laboratory.

:

01:31:49,209 --> 01:31:50,629

So that's the problem.

:

01:31:50,629 --> 01:31:54,090

It's a personal pet project.

:

01:31:55,926 --> 01:31:56,346

Okay.

:

01:31:56,346 --> 01:31:58,127

Yeah.

:

01:31:58,127 --> 01:31:58,887

I love that.

:

01:31:58,887 --> 01:32:05,789

Um, niche answers are definitely encouraged for that question.

:

01:32:05,789 --> 01:32:08,230

Um, second question.

:

01:32:08,230 --> 01:32:15,371

If you could have dinner with any great scientific mind, dead, alive, or fictional,

who would it be?

:

01:32:15,432 --> 01:32:24,394

So this is going to be controversial given that he was kind of a villain in Oppenheimer,

but I'm going to go with Ed Teller.

:

01:32:24,702 --> 01:32:36,562

Uh, just because, you know, he's sort of a controversial figure, uh, in scientific

history, but I think he had a very interesting view on science and sort of,

:

01:32:36,562 --> 01:32:37,837

uh, interesting mind.

:

01:32:37,837 --> 01:32:39,728

And I would love to sort of pick his brain.

:

01:32:39,728 --> 01:32:50,593

He also, um, you know, a lot of the work that he did, uh, you know, post-Manhattan Project

laid the groundwork for, you know, high energy density physics as we know it

:

01:32:50,593 --> 01:32:51,353

today, right?

:

01:32:51,353 --> 01:32:54,190

He was using nuclear bombs instead of giant lasers

:

01:32:54,190 --> 01:32:55,451

uh as a driver.

:

01:32:55,451 --> 01:33:00,423

um But it would be interesting to sort of pick his brain about the foundations of the

field.

:

01:33:00,423 --> 01:33:07,898

And of course, he helped invent the uh Metropolis-Hastings uh algorithm that is the

foundation of all Bayesian inference.

:

01:33:07,898 --> 01:33:10,119

I'd be interested to talk to him about that.

:

01:33:10,119 --> 01:33:10,619

Damn.

:

01:33:10,619 --> 01:33:11,220

Yeah.

:

01:33:11,220 --> 01:33:12,700

Yeah, great answer.

:

01:33:12,700 --> 01:33:17,043

And I think you're the first one to actually answer that.

:

01:33:17,043 --> 01:33:18,564

I would not be surprised.

:

01:33:18,564 --> 01:33:20,304

I don't think that's a very popular answer.

:

01:33:20,304 --> 01:33:23,453

um

:

01:33:23,453 --> 01:33:27,214

Awesome, well, Ethan, thanks again for being on the show.

:

01:33:27,214 --> 01:33:28,595

That was fantastic.

:

01:33:28,595 --> 01:33:31,666

We were able to cover so much ground.

:

01:33:31,906 --> 01:33:34,687

I am very happy about that.

:

01:33:34,807 --> 01:33:36,478

You're doing fascinating work.

:

01:33:36,478 --> 01:33:40,489

So I am extremely happy to have you on the show.

:

01:33:40,489 --> 01:33:45,151

Let's make sure to add all these great resources to the show notes.

:

01:33:45,151 --> 01:33:47,332

I've already added a few of them, folks.

:

01:33:47,332 --> 01:33:52,654

So make sure to check them out if you want to dig deeper.

:

01:33:53,104 --> 01:33:57,622

Ethan, thanks again for taking the time and being on this show.

:

01:33:57,622 --> 01:33:58,809

Thank you so much for having me.

:

01:33:58,809 --> 01:34:02,690

And thanks again to JJ Ruby for putting us in touch.

:

01:34:07,028 --> 01:34:10,721

This has been another episode of Learning Bayesian Statistics.

:

01:34:10,721 --> 01:34:21,230

Be sure to rate, review and follow the show on your favorite podcatcher and visit

learnbayesstats.com for more resources about today's topics as well as access to more

:

01:34:21,230 --> 01:34:25,313

episodes to help you reach true Bayesian state of mind.

:

01:34:25,313 --> 01:34:27,255

That's learnbayesstats.com.

:

01:34:27,255 --> 01:34:32,119

Our theme music is Good Bayesian by Baba Brinkman, feat. MC Lars and Mega Ran.

:

01:34:32,119 --> 01:34:35,261

Check out his awesome work at bababrinkman.com.

:

01:34:35,261 --> 01:34:36,456

I'm your host.

:

01:34:36,456 --> 01:34:37,517

Alexandre Andorra.

:

01:34:37,517 --> 01:34:41,666

You can follow me on Twitter at alex underscore andorra, like the country.

:

01:34:41,666 --> 01:34:48,935

You can support the show and unlock exclusive benefits by visiting patreon.com slash

LearnBayesStats.

:

01:34:48,935 --> 01:34:51,316

Thank you so much for listening and for your support.

:

01:34:51,316 --> 01:34:53,618

You're truly a good Bayesian.

:

01:34:53,618 --> 01:35:00,422

Change your predictions after taking in information, and if you're thinking I'll be less than

amazing,

:

01:35:00,422 --> 01:35:03,725

Let's adjust those expectations.

:

01:35:03,725 --> 01:35:06,234

Let me show you how to be a good Bayesian.

:

01:35:06,234 --> 01:35:16,937

You change calculations after taking fresh data in Those predictions that your brain is

making Let's get them on a solid foundation
