Disputed Truth at the Pentagon: Mr. Non Grata on Problem Solving & Human Resistance
27th August 2025 • MisterNonGrata • Bill Alderson



📝 Show Notes / Description

Episode Summary

In this episode of Mr. Non Grata, Bill Alderson welcomes psychology expert Kim Mueller to explore one of the most difficult paradoxes in organizational life: why people often resist the very truth that could solve their biggest problems.

Drawing from Bill’s extraordinary experience — including his role in restoring Pentagon communications after 9/11 — this episode dives into a case where a 45-second intermittent outage plagued over 10,000 users at the Office of the Secretary of Defense. Despite unlimited resources, the problem went unresolved for more than a year. Why? Because of human dynamics: ego, fear, secrecy, and relationships hidden beneath the surface.

Together, Bill and Kim unpack:

  • ⚡ Disputed truth — why leaders and teams sometimes avoid answers they don’t want to hear.
  • 🧠 The psychology of denial — from self-preservation to ego defense mechanisms.
  • 🕵️ Problem-solver resistance — how outside experts become “Mr. Non Grata” even when they bring critical solutions.
  • 🔒 The Pentagon case study — how organizational compartmentalization, secrecy, and personal dynamics delayed a fix for over a year.
  • 💡 Leadership lessons — building a “no-fault” culture where truth is welcomed, not hidden.

This episode is a powerful reminder that every technology problem is also a human problem.

If you’ve ever felt unwelcome after fixing something big, or wondered why teams sabotage their own solutions, this conversation will resonate deeply.

Transcripts

Bill Alderson: Welcome to another episode of Mr. Non Grata. I, Bill Alderson, am none other than Mr. Non Grata. Since I started this series, I've talked with a number of people who've listened to it. They call me up and say, "Bill, I've been Mr. Non Grata in so many situations. This is really interesting. In my last job, I became Mr. Non Grata. I became not welcome, or unwelcome. After I did a whole bunch of really great things for an organization, I became Mr. Non Grata, just like you're talking about." So it's not an isolated situation, and I'm not the only Mr. Non Grata. You could be Mr. Non Grata.

Maybe you and I know one another from the past. Maybe we worked on a problem together: a critical problem, a high visibility, high stakes issue. Maybe you'd like to be on the podcast and talk about your perspective of how you, or someone else, was Mr. Non Grata. And of course, I refer to Mr. Non Grata as a person; it could be a woman or a man.

As a way of introduction, I have with me Kim Mueller, and I'll let her talk about her educational background and her experience in the area of psychology, because we're going to unravel true stories about how people can become Mr. Non Grata as they try to uncover high visibility, high stakes problems and find disputed truth. Ooh, disputed truth. That's what my whole life's work has been about: discovering disputed truth. You would think that if it were merely an IT or a data problem, we'd be in good shape. But it's never just an IT or a data problem. It becomes a human problem. That's why I asked Kim Mueller to sit down with us and talk about why people would not want to know the truth. Because there are times when people do not want to know the truth, and I can quote chapter and verse, name and date, where people did not want the truth.

Now, I'm pretty diplomatic. I don't go around accusing individuals in particular, and in some of these stories you may know who I'm talking about; I can never confirm nor deny. I have worked with NSA. I've worked with FBI. I've worked with CIA. I've worked extensively with DoD. I've worked with police departments; I've worked with county, local, you name it. I have worked with all of these types of situations, from 911 dispatch systems that were broken to intelligence systems that were broken in Iraq and Afghanistan and throughout the world for DoD biometrics. I'm no stranger to some of these problems.

Today, we're going to go over a particular problem, but as I said, I just wanted to introduce Kim Mueller, let her talk a little bit, and maybe respond to this topic of why people sometimes don't want to know the truth. Anyway, Kim, welcome to the show.

Kim Mueller: Thank you, Bill. Thanks for having me today. I think you're probably asking the million-dollar question: why do people not want to know the truth? So I'm interested in hearing what you have to say today and engaging in this conversation.

As you mentioned, my background is psychology; I'm not a technical person by training or by heart or by nature. I have a degree in clinical psychology, and I've worked in the behavioral health field for the majority of my career. Although, in the last decade or so, I have crossed the border into the IT world just a bit, given the need for electronic health records, even in behavioral health. So I have a little taste of the IT world, but my background and my way of thinking is really rooted in psychology: why people do what they do, why people think what they think.

Bill Alderson: In your experience, have you encountered various diagnoses of people who may not be in the IT field but in other types of fields, and discovered somewhat ironic behavior when it comes to whether they wanted to know, or not know, the truth? We all have this as people. We can't convict ourselves, even, of certain things. And we don't always want to know the whole truth and nothing but the truth, because it might crush us if we really knew the truth, even about ourselves.

Kim Mueller: Absolutely. By nature, we as humans have a tendency toward self-preservation. And that doesn't just mean preserving our physical being, but our emotional well-being, our pride, the things that make us who we are. So it's not really surprising, when you think about it, that if a truth is going to out that someone isn't really who they're supposed to be, or who they've led people to think they are, then they're going to be very reluctant to go where that truth is.

Bill Alderson: Interestingly, today's situation is a very high stakes, very high visibility situation, and we're going to see some of those human characteristics in the story that I will tell. And I'll just tell you right now: this is absolutely 100 percent the truth. Nothing added and nothing taken away.

In 2001, most people know, I was called to the Pentagon to help them recover communications immediately following the 9/11 disaster. I was standing in my driveway when I got a phone call. I answered my cell phone, and it was a Pentagon general asking me to bring myself and my team to the Pentagon to help them recover communications, because they had moved a few hundred servers that were getting water damaged. Obviously there were all sorts of things: some things were a complete disaster and destroyed; other things were just starting to get wet. So they moved all the servers physically out of those locations and into other areas. And they asked me to come and help them reroute the network, reroute firewalls, reroute all of these different systems, and troubleshoot all the related problems so that they could bring the Pentagon back up and everyone could communicate effectively.

So I got that call, and that's pretty much what I do today. I help people, technologists and other leaders, be ready to receive that call. That's my mantra. That's my whole focus. That's my opus: to help other people be ready to receive that high visibility, high stakes call. Or maybe it's just the call they get from their own local organization or their locale. Nevertheless, it's a critical problem. So I'm helping people get ready to get that call and be able to respond effectively.

And it's not only at a technical level. Most people know that I'm a packet person. My handle is packetman007. I have been analyzing computer network packets since my days at Lockheed Missiles and Space Company, looking at packets and trying to diagnose problems from the packet level, which means the very core of all functions today. Every security function in the world first has an interloper trying to get in, and they try to get in with a packet. And once they're in, whether they've elevated a privilege, or gotten into an API, or done SQL injection, it doesn't matter: there's packet evidence. Whether it's on the way in or on the way out, it's a packet. The fundamentals of security start with a packet getting in and end with a packet getting out with the wrong information.

So that's my worldview: packets. When there's a problem, that's the perspective I take, the fundamental approach of looking at the packets. Do you know that within a security breach, once the hackers get in, they can turn off all those logs that all those security experts are poring over? Yeah. All those logs that organizations spend millions of dollars combing through to find out what the hackers were doing? They can turn those logs off. And yet the packets of them getting in, the packets of them moving laterally, the packets of them exfiltrating the data: they can't hide those packets. Packets are the fundamental building block of everything in security, at the beginning and at the end. So whether it's an API that's been violated, a service that's been violated, a privilege that's been elevated, or a SQL injection, it doesn't matter. It starts with a packet and it ends with a packet.

But here's the warning. Hackers hide and delete the log files. They delete the logs, all the breadcrumbs, but they cannot delete the packets that actually did the damage. That's why packets are the fundamental of all cybersecurity diagnosis, at the beginning and at the end.

Kim Mueller: It sounds a little bit like when you're talking about a medical problem, per se, and you have definitive physical evidence. In your case, the definitive evidence is in the packets.

Bill Alderson: Yeah, it's in the packets. You hear all these people talking about privilege elevation and all these various things. Those are security things that we need to be looking at, but it starts with the packet getting in and it ends with a packet getting out. So we need security experts who are focused on the true forensics, at the root level, that hackers cannot hide. They can't hide the packets of them getting in, moving laterally and getting access to the data, and finally exfiltrating that data.
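Bill's point that packet captures outlive tampered logs can be made concrete. The sketch below is purely illustrative (it is not Bill's tooling): it writes and then re-reads a minimal pcap-format capture using only the Python standard library, showing that every captured record carries its own timestamp and raw bytes, independent of any host log an intruder might delete.

```python
import struct
import io

# pcap global header: magic, version 2.4, tz offset, sigfigs, snaplen, linktype (1 = Ethernet)
PCAP_GLOBAL = struct.pack("<IHHiIII", 0xA1B2C3D4, 2, 4, 0, 0, 65535, 1)

def write_pcap(records):
    """records: list of (timestamp_sec, timestamp_usec, raw_bytes)."""
    buf = io.BytesIO()
    buf.write(PCAP_GLOBAL)
    for ts_sec, ts_usec, raw in records:
        # per-record header: ts_sec, ts_usec, captured length, original length
        buf.write(struct.pack("<IIII", ts_sec, ts_usec, len(raw), len(raw)))
        buf.write(raw)
    return buf.getvalue()

def read_pcap(data):
    """Yield (ts_sec, ts_usec, raw_bytes) for each record in a little-endian pcap."""
    magic = struct.unpack_from("<I", data, 0)[0]
    assert magic == 0xA1B2C3D4, "unexpected pcap magic / byte order"
    offset = 24  # the global header is 24 bytes
    while offset < len(data):
        ts_sec, ts_usec, incl_len, _orig_len = struct.unpack_from("<IIII", data, offset)
        offset += 16
        yield ts_sec, ts_usec, data[offset:offset + incl_len]
        offset += incl_len

# Two fake "packets": even if an intruder wipes every host log, records
# like these, captured off the wire, still show who talked to whom and when.
capture = write_pcap([(1000, 0, b"\x01\x02\x03"), (1000, 500000, b"\x04\x05")])
parsed = list(read_pcap(capture))
```

Real capture files from tools such as tcpdump or Wireshark use this same on-disk layout, which is why a capture taken on a mirror port is evidence the attacker never had a chance to edit.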

180

:

Kim Mueller: Sounds like you can

follow them pretty much wherever

181

:

they go and regardless of why.

182

:

Bill Alderson: Yeah, it doesn't matter whether it's a good packet, a good transaction, or a bad transaction. The issue is that looking at all those packets is like boiling the ocean. There are billions and billions of packets. It's knowing when to look, where to look, why to look, and finding those breadcrumbs that criminals are leaving behind. It's a criminal case. That's why we call it network forensics.

So anyway, I want to get on to the problem that we're going to talk about today, which is pretty interesting: the Office of the Secretary of Defense. Their organization has well over 10,000 users in OSD, the Office of the Secretary of Defense. The CIO of OSD was one of the directors of the Pentagon when I went in 2001, and this problem happened a few years later, after we had recovered the communications and some things were improved. I'll talk about how the Pentagon was improved after 9/11, which actually led to some of the circumstances of the problem we're going to talk about today.

When the Pentagon was hit, it took down a lot of critical operations, a lot of critical communications. But it didn't hit at the worst possible location. As a matter of fact, we were pretty lucky, because it hit a part of the Pentagon that was the U.S. Army's area. They call them wedges, and that wedge had just been rebuilt: reinforced concrete, a lot of new infrastructure, a lot of reinforcement. So where the aircraft hit, it didn't go all the way through to the other side of the Pentagon. It pretty much stayed in that one wedge. But it went almost all the way into the center, where I used to go and have lunch. Very interesting, the whole situation.

Nevertheless, we recovered communications, and then a couple of years later I got a call from the guy who used to run the Pentagon network. Now he was the CIO of OSD, the Office of the Secretary of Defense. So I come in, and he says, "Bill, I've got another one for ya, and it's a doozy. It's been going on for the past 13 months, after we've done a lot of renovations." PenRen, Pentagon Renovation, was the name of the organization that renovated the Pentagon, and they had just renovated the Army's part of the wedge that I told you about, which had been reinforced. And thankfully a lot of people hadn't yet moved back in, because it was recently renovated, and that's why we didn't lose quite as many people. It was a little over a couple of hundred, not several thousand. At various times the Pentagon houses somewhere between 22,000 and 30,000 people, all in these office buildings, and had everything been occupied to top levels, roughly one fifth of the Pentagon would have been killed. So probably about 8,000 people would have died, had it not been for the renovation and the fact that they hadn't all moved in.

So anyway, they moved all these servers, right? And now we've got this problem. But before we get into it, I want to let you think about it a little bit and maybe react to what happens when there is a mission critical problem. High visibility, high stakes, over 10,000 users. The situation was that every day, haphazardly, nobody knew when, the whole network would go down for 45 seconds. Now, the Pentagon is more than just the Pentagon building. They have outlying buildings in other locations in the regional area, other buildings that are part of them, and very high speed fiber, dark fiber, between all those buildings to basically run everything, with the Pentagon as the central communications hub. But a couple of times a day, for the last 13 months: 45-second outages, haphazard, during the day.

Now, why would an organization with massive budget, unlimited money, the smartest and best people, and the smartest and best vendors not solve a problem they'd been suffering with for 13 months? Do you have any ideas? What goes on? What are people saying to themselves? What are leaders saying to themselves? What are the technologists saying to themselves?

Kim Mueller: I think that at multiple levels you have the ego of many people involved, right? Personalities. That's one level: you have the personalities of people at all different levels in the military. And when you look at military rankings and such, it's not unusual to have people at certain higher ranks with personality traits a little bit on the narcissistic side, and people in lower-level positions with more inferiority-leaning personalities who lack some self-confidence. So you've got people at all of these different levels. And it sounds like the problem they had is, as you said: how could you let a problem like this go on for this amount of time? No one wants to admit that the problem is theirs. Typically, when you have a problem, whether it's a technology problem or any other kind, everybody wants to point the finger at the other side. That's what normally happens.

Bill Alderson: Or at least they want to say: it's not my problem, I've looked everywhere, it's not my problem. It's got to be somewhere else.

Kim Mueller: Exactly. And the users, I think, suffer from what we know as learned helplessness. They stop reporting. They just stop reporting: "I'm not going to waste my time reporting a problem that's been happening for, as you said, not just months but over a year." So you've got users who are no longer even reporting that there's a problem, and you've got people who may or may not be part of the problem who are so busy insisting it's somebody else's that they're not really taking the time to look at the reality of whether they did something wrong. And then you have the people at the top, who don't even really want to admit there's a problem. Because if there's a problem in my house, and I'm the director of my house, if I'm a high-ranking officer, then ultimately it falls back on me.

Bill Alderson: One of the things, Kim, that I should add is that this problem took them down for 45 seconds, two or three times a day, but it didn't take them down permanently. And everyone reported it differently: one person would say it was 30 seconds, another would say it was a minute, another would say it was 33 seconds. There was no definitive symptom or diagnosis; it was all somewhat hearsay, but they all knew that something was happening intermittently. An intermittent problem is often the worst kind, and for me as a technologist it's one of the worst kinds too, because I have to instrument in order to catch it, and like I said, catching all those packets is like boiling the ocean. It's difficult. So you have to use a Swiss Army knife approach to the problem.
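One standard way to instrument for an intermittent fault like this, offered here as an illustrative sketch rather than anything actually deployed at the Pentagon, is a heartbeat probe: hit the service every second, record each success timestamp, then mine the record for gaps. That turns the "30 seconds" versus "a minute" versus "33 seconds" hearsay into a measured outage length.

```python
def find_outages(success_times, threshold=10.0):
    """Given sorted timestamps (in seconds) of successful probes, return
    (last_success, gap_seconds) for every gap longer than the threshold."""
    outages = []
    for prev, cur in zip(success_times, success_times[1:]):
        gap = cur - prev
        if gap > threshold:
            outages.append((prev, gap))
    return outages

# Simulated probe log: one success per second, with probes failing
# from t=100 through t=144 (a roughly 45-second hole).
times = list(range(0, 100)) + list(range(145, 200))
print(find_outages(times))  # → [(99, 46)]
```

The gap between the last success at t=99 and the next at t=145 bounds the outage, which is exactly the kind of hard number an intermittent problem refuses to give you without instrumentation.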

Kim Mueller: And then, as part of that problem and feeding into it, you have all different kinds of people. As you said, there are technology problems, but ultimately technology doesn't run itself. People are involved. Human beings. One human being is having a marital problem at home. Another human being maybe has some health issues and has missed a lot of work. There are so many things going on in the humans who touch these technologies. And to get to the root cause of something like an intermittent problem, when it wasn't taking everything down and there wasn't anything obvious pointing to where it was, I'm sure it was really easy to say, "Oh, it can't be this, it can't be that," and put it off somewhere else, because there wasn't any direct line to follow to "it's you," or "it's yours," or "it's this department."

Bill Alderson: The other thing is, the military is excellent at what we call compartmentalization. When I was there for 9/11, there were well over, I believe we counted, 125 enclaves. That means different networks. So you think Joint Chiefs, Army, Air Force, Navy, Marines, Coast Guard, right? You think of the five main services. But there are more networks within those services, and OSD is yet another one: the Office of the Secretary of Defense is above all of those others. So if you start counting all of the different computer networks within this system, there were over 125. Some of those networks were affected and some were not. Most affected were those 10,000-plus users of the Office of the Secretary of Defense, or OSD. And the CIO of OSD, whom I had met a few years back, called me and said, "Bill, do you think you can figure out where this is?"

Due to compartmentalization, and due to the fact that it's spread out (the Pentagon is one of the biggest office buildings in the world, so it's pretty big), plus the tentacles that go out to buildings all around the area and become part of the Pentagon's network, there's just this enormity and this complexity.

And then you have a group that's responsible for the fiber optics of the Pentagon; that's the physical layer. You have a group that's the construction part of the Pentagon, which puts in new cabling. You have all of these different segmented entities, and then you have PenRen, Pentagon Renovation, which was tearing apart another wedge of the Pentagon, remodeling it and rebuilding it. And these are more construction people. There used to be a freeway that went directly by; you could throw a rock at the Pentagon from the freeway. After 9/11, PenRen wrote the check to move that freeway over quite a bit, about a quarter of a mile away. So consequently there were a lot of changes, and PenRen was a very powerful organization that spent billions renovating, and it continues to renovate the Pentagon today. It's still an organization.

So when you look at the Pentagon, it's much more than just Army, Air Force, Navy, Marines. It's the procurement arm. It's the political arm of all of those different things. All of the real forces who actually defend us are not at the Pentagon; it's predominantly leadership, procurement, and policy that come out of the Pentagon. And that's 25,000 people. So it's a very complex web of technology, of people, of compartmentalization.

And so then I come in. Why do they want a guy like me? Because I had experience in the Pentagon, helping them recover communications after 9/11, so I was uniquely prepared, and I have wide shoulders and a constitution that allows me to get in there and figure things out. And that's what I do.

Kim Mueller: One of the things you mentioned earlier was standing in your driveway and getting that call. And of course the question in my mind was: what's going through your mind when you get that call? You were flying into a place where most people weren't even allowed to go, right? This was right after 9/11, and here you were. So you have your technical expertise, but was there also that human part of you that said, "I'm flying into a danger zone here"?

Bill Alderson: Certainly there is that, but I was a Navy man at a young age. I spent four years in the Vietnam era. I was not in Vietnam, but I was very nearby, over in Pearl Harbor, on a ship. So I was nearby, and I had stood and raised my hand to defend and support the Constitution of the United States of America. And that is something everyone under my voice who is a military member knows: that never expires. So we're predisposed to knowing that we may move into danger. But yeah, the whole world at that time, and going in there at that early moment after 9/11, is something we'll talk about at another time. I just wanted to set the background: the reason I came in to solve this problem a couple of years later was because I had helped design some of the renovations, through my reports and that sort of thing about how we fixed some things.

So I want to go into a little bit of the technology. One of the things you should know is that we learned a lot of lessons from 9/11. We learned that you probably want more incoming and outgoing lines; we want redundancy in multiple ways. And then also: what happens if all of our backups and everything are in the Pentagon and it gets hit again? Then we are susceptible to a single point of failure yet again. So after the Pentagon was renovated yet again to recover from the 9/11 disaster, we built secondary locations. Remember, I told you there are about 125 different institutions in there, 125 different enclaves. They all had a mission, and that mission could be stopped if something hit the center point, the single point of failure, which is the Pentagon. So they said, we'd probably better have backup sites for all of our people to go to. And they had to design those in, and over a hundred miles away, in diverse locations, they put secondary facilities.

But believe it or not, the problem I was troubleshooting had to do somewhat with that new data redundancy. Because when you would go File, Save, and your file, or your database, or what have you, would store on the local file server, that data would save at the Pentagon. Simultaneously, it would save at that new alternate site they had put out there. That was the difference between when I was there at 9/11 and when I was there a couple of years later to solve this problem.

So I knew all these different things were happening, and we used these giant EMC data storage devices (EMC is now Dell EMC). We had them in the Pentagon, and then we had them out at all those alternate sites, with very high speed links to all of them. And some of the descriptions of the problem were: "I saved a file," or "I did a database store, and it stored at the Pentagon, and then all of a sudden everything stopped." So there are various technologies that are what we call synchronous or asynchronous, and that's how some of this data storage technology works. You'd save it to the Pentagon storage so that you could retrieve it. But simultaneously, you would save it at the alternate site.
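The synchronous-versus-asynchronous distinction Bill is describing can be sketched in a few lines. In synchronous replication a save does not return until the remote site acknowledges, so a stalled replication link stalls every File, Save along with it; asynchronous replication returns after the local write and lets the remote copy catch up later. This toy model is an assumption-laden illustration, not the actual EMC mechanism:

```python
class ReplicatedStore:
    """Toy model of a local store plus a remote copy, in either mode."""
    def __init__(self, synchronous):
        self.synchronous = synchronous
        self.local = {}
        self.remote = {}
        self.pending = []  # writes not yet shipped to the remote site

    def save(self, key, value, remote_ok=True):
        self.local[key] = value
        if self.synchronous:
            # The caller blocks until the remote site acknowledges; if the
            # link stalls, the whole File, Save stalls with it.
            if not remote_ok:
                raise TimeoutError("remote site did not acknowledge")
            self.remote[key] = value
        else:
            # Asynchronous: return immediately, replicate in the background.
            self.pending.append((key, value))

    def flush(self):
        # Background replication catching up (asynchronous mode only).
        for key, value in self.pending:
            self.remote[key] = value
        self.pending.clear()

sync_store = ReplicatedStore(synchronous=True)
sync_store.save("plan.doc", b"v1")    # both copies written before return
lazy_store = ReplicatedStore(synchronous=False)
lazy_store.save("plan.doc", b"v1")    # returns at once; remote copy lags
lazy_store.flush()                    # remote copy catches up later
```

The trade-off is the one Bill hints at: asynchronous mode keeps saves fast but lets the remote copy fall out of step, which for a database can mean corruption, while synchronous mode keeps the copies in lockstep but ties every user-visible save to the health of the remote link.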

Kim Mueller: A backup, so to speak.

Bill Alderson: Backup in two ways.

439

:

It operated as a backup for data,

but it also operated if something

440

:

happened at the Pentagon, all the, all

those employees or another group who

441

:

are responsible for that part of the

mission would have to be relocated.

442

:

Because if the building is like,

it's a big office building, right?

443

:

So if part of the office building

got damaged, they not only had

444

:

to have the people, but also the

data that the people could access.

445

:

If you look at 125 different enclaves,

125 different buildings, different off

446

:

site locations, it was a very complex web.

447

:

And in OSD, if you stored a file,

It would store it to Pentagon, and

448

:

then it would momentarily thereafter

store over in the alt site.

449

:

And there was a problem somewhere in that.

450

:

So we need to go File, Save,

it would stop for 45 seconds.

451

:

It wouldn't store local,

it wouldn't store remote.

452

:

So there were, there was a problem.

453

:

So it had some things

to do with the symmetry.

454

:

of where the data was stored,

whether it was asynchronously

455

:

stored, or synchronously.

456

:

And different types of databases,

different types of systems, have

457

:

different types of requirements for

whether they are stored asynchronously,

458

:

or whether they're stored synchronously.

459

:

Databases, if you get out of step with

your database, you can corrupt it.

460

:

There's some pretty sensitive pieces,

and I don't want to go into all the

461

:

different technological purposes and

reasons for that, but there are, if

462

:

you store a file on your computer and

then store it to a USB drive, it really

463

:

doesn't matter in time frame, that doesn't

really matter, you're backing it up.

464

:

This is simultaneous backup, in real time.

465

:

Going to the Pentagon Network.

466

:

And the Pentagon servers and then

over to the alt site servers.

467

:

And then we have this other

situation with all of this data

468

:

because it's all classified.

469

:

And it's not just classified, it's

classified, for official use only.

470

:

It's classified at the confidential level.

471

:

It's classified at the secret level.

472

:

It's classified at the top secret level.

473

:

And then it's classified with special
access required within each one of those,

474

:

and derivations, and it's separated
depending upon the type of mission.

475

:

Whether it's a, no kidding, a military

mission where they're flying and doing

476

:

something, or whether it's just that the
Secretary has the email in order

477

:

to get the commands going between places.

478

:

There's the tactical part of the

systems, and then there is the office

479

:

automation part of the systems.

480

:

But I'm here to tell you that if we

don't have the office automation part

481

:

of the systems, a lot of the command

and control is somewhat impacted.

482

:

I mean

483

:

Kim Mueller: I'm curious, Bill, when

you are called into one of these

484

:

situations, and as you said, it

had gone on for a very long time.

485

:

Are the folks locally, where

you're working, are they a bit

486

:

reluctant to give you information?

487

:

Are they excited that you're there?

488

:

What happens when you go

in to solve a problem?

489

:

Bill Alderson: It's always a mix.

490

:

There are some people who are so happy

to see me because they've been trying

491

:

to solve the problem for months or

years and so they know that okay.

492

:

And I'll just admit that a lot of times

I solve problems, not that I solve them,

493

:

but I'm there as the facilitator and I

am the one who gets the, no kidding, the

494

:

time, the focus, the resources to focus on

the problem so that it can get resolved.

495

:

It's not always me that solves
the problem; I facilitate it, but I got

496

:

the opportunity and they didn't, right?

497

:

And so when I go in there, usually,
we put people on teams and we

498

:

start working, and the first thing we do is
try and find out, okay, let's,

499

:

no kidding, get a problem statement.

500

:

Let's make sure it's accurate.

501

:

Let's make sure. And I talk

about this in almost every episode,

502

:

the old adage about blind people

identifying the elephant and one

503

:

person touches the trunk and he

says, Oh, this is some sort of a hose.

504

:

And the other person touches one of

the legs and it's oh, this is a tree.

505

:

And another person touches the tail and

it's oh, this is a sweep or something.

506

:

And then they touch another part of it,

oh, this is a hairy grizzly bear, right?

507

:

Because it has hair on it in some

places and some places it doesn't.

508

:

And so it's like touching an

elephant and everyone has a

509

:

a different diagnosis or
a different feeling or a different

510

:

perspective, every single time.

511

:

So the first thing you have to do

is get everybody together and then

512

:

come up with all of the symptoms.

513

:

So one of the best things I think

I learned is to be a listener.

514

:

And you want to make sure you

hear every symptom from everybody.

515

:

And of course, the old woman

who's been there for 50 years and

516

:

is about to retire, who doesn't
really know anything about computers.

517

:

She has a perspective.

518

:

And it's usually pretty accurate.

519

:

So you want to listen to

those non technical people and

520

:

their version of the symptoms.

521

:

And then you want to talk to the

people who are running the networks

522

:

or running the servers and you want

to listen to everybody's chronology of

523

:

events and how this thing just happens

and nobody can ever figure it out.

524

:

And there's finger pointing between

the server team and the router team

525

:

and the switch team. And
then we have these other things

526

:

because of the classification that I
mentioned: encryption. And every one of

527

:

those encryption levels is different.

528

:

So you have different encryption

gear for every level of

529

:

classification.

530

:

They have different levels of

rigor that they have to have.

531

:

Consequently it's a

very complex situation.

532

:

And so it's no wonder that people

in these large environments have

533

:

these type of problems, but I

was fortunate I got to come in.

534

:

I was trusted and I got to come in.

535

:

And so yeah, some people loved
having me come in, and

536

:

some people were like, if they'd let me
have all this time and resources and

537

:

focus, I could have solved it too.

538

:

And it's true.

539

:

It's true.

540

:

So one of the reasons why we're doing Mr.

541

:

Non Grata is to help people understand

inside their organizations how to

542

:

perform critical problem resolution

and how to basically dissect the

543

:

anatomy of a critical problem.

544

:

And I go into this in my training

and I have a whole methodology

545

:

that I go through when I'm

trying to solve these problems.

546

:

But anyway, the bottom line is, for
45 seconds, every once in a

547

:

while, the whole doggone thing goes down.

548

:

So they have monitoring

systems that cost millions.

549

:

They have diagnostic

tools that cost millions.

550

:

They have people who are operators of all

those tools that are top level experts.

551

:

Here's the thing.

552

:

They could not figure out what the

problem was because it went across

553

:

enclaves, it went across responsibilities,

it went across classification levels.

554

:

And so all of these things, and few people

have a network documentation system.

555

:

I helped them document the Pentagon.

556

:

After 9/11, I showed them how to do it
and led them into it, and I've done that

557

:

for a lot of Fortune 500 companies.

558

:

I go in and say, look, you don't

have the type of documentation that

559

:

you need in order to troubleshoot

this problem or remodel this

560

:

system or modernize this system.

561

:

And just like an architect who

has to have the blueprints of

562

:

the original design.

563

:

I met on a flight in California,

a fellow who was an architect

564

:

for the state of California.

565

:

He happened to be an architect

who was retrofitting the

566

:

universities of California.

567

:

And one of the big problems was he

didn't have all the original blueprints.

568

:

So he didn't know where the iron

was inside those concrete walls.

569

:

He didn't know what the

structure was inside those walls.

570

:

And he was responsible for

doing earthquake retrofitting.

571

:

And without those, what did he have to do?

572

:

He had to spend a lot more money

and a lot more time and energy

573

:

and years on drilling core samples

inside these buildings, inside these

574

:

walls, inside these structures.

575

:

Why?

576

:

To determine what they were built of

so that they could build earthquake

577

:

retrofitting to modernize them.

578

:

And it's the same thing with networks.

579

:

If you don't have a good blueprint of

your network and your systems you're

580

:

really going to run into trouble when

you try to start modifying those.

581

:

And especially now as we

experience the hybrid cloud and

582

:

premises and that sort of thing.

583

:

And but that leads me to.

584

:

When I first went to the Pentagon,

they didn't have any network

585

:

documentation because that system

that held all that was destroyed.

586

:

So we had to reverse engineer the Pentagon

before we could diagnose the problems.

587

:

Okay, so this particular problem,

every day, a couple of

588

:

times a day, it would go down.

589

:

Everyone is basically

giving me the symptoms.

590

:

And I have to get the

diagnostic information.

591

:

I have to monitor.

592

:

So I monitored with a Swiss

Army Knife-like solution.

593

:

I went out to one of their big office

complexes, and I set up my monitoring

594

:

as if I were one of the users.

595

:

And you can't really monitor

everything.

596

:

You can't boil the ocean.

597

:

So you pick one user, or one little

subset of users, and you monitor them.

598

:

They're going to be indicative of the

45 second outage, just like everyone else.

599

:

You don't have to boil the ocean, you

don't have to look at all of them.

600

:

Find one user, or one set of users who

have this problem on a regular basis,

601

:

and monitor their access to the network.

602

:

Sniff the packets at one station.

603

:

You don't have to get

petabytes of packets.

604

:

You just need one representative

person who's having the problem.

605

:

Whether it be application

or what have you.

606

:

But then I also had to monitor.

607

:

Which means I had to use what, as a
technical term, we call a network ping.

608

:

Network ping.

609

:

Or a ping packet: an ICMP (Internet
Control Message Protocol) ping.

610

:

And it's what we use to determine

if there's connectivity or not,

611

:

and how fast the connectivity was.

612

:

And we use Traceroute to find out

what the path of the connectivity was.

613

:

And so we have to use what

I call a Swiss Army Knife.

614

:

Because I couldn't use that

$4 million HP OpenView system.

615

:

It didn't look
at that one person's

616

:

behavior.

617

:

It looked at this huge network

and they couldn't figure it out.

618

:

Too many alarms happening

at the same time.

619

:

Too many problems

happening at the same time.

620

:

So I went in and said, look, I want

to find this one office location.

621

:

I wanted everybody in that office,
because it's a big

622

:

cubicle area, to pipe up when it happened.

623

:

So I know when it happens.

624

:

So I set up monitoring.

625

:

Like what I
call a Swiss Army knife monitor.

626

:

I set up one of the machines

as a monitor to see when it

627

:

had access and when it didn't.

628

:

And I would monitor it

on a very high frequency.

629

:

And I did that from multiple
locations so I could see if it happened

630

:

over here, but not over here.

631

:

So I figured out how to

instrument the environment without

632

:

spending $4 million on something.

633

:

I did a Swiss Army knife, and I said, if
this one little set of users here has

634

:

the problem, I'll be able to diagnose it

by capturing their packets at the same

635

:

time that I was also monitoring and I

will have evidence of the monitoring.
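[Editor's note: the monitoring logic Bill describes — probe at a very high frequency, then look for spans where the probes fail — can be sketched as follows. This is a hypothetical helper, not the actual tooling: given a log of (timestamp, reachable) ping results from one vantage point, it returns the outage windows, which can then be compared across vantage points to localize the failing enclave.]

```python
# Sketch of turning high-frequency probe results into outage windows.
# `probes` is a list of (timestamp_in_seconds, reachable_bool) tuples,
# e.g. one ICMP ping result per interval; names are illustrative.
def outage_windows(probes, min_seconds=30):
    """Return (start, end) spans where consecutive probes failed
    for at least `min_seconds` (a 45-second outage qualifies)."""
    windows = []
    start = None   # timestamp of the first failed probe in a run
    last = None    # timestamp of the most recent failed probe
    for t, ok in sorted(probes):
        if not ok:
            if start is None:
                start = t
            last = t
        else:
            # Run of failures ended; keep it if it was long enough.
            if start is not None and last - start >= min_seconds:
                windows.append((start, last))
            start = None
    # Handle a log that ends mid-outage.
    if start is not None and last - start >= min_seconds:
        windows.append((start, last))
    return windows
```

Running the same detector on logs from two vantage points shows whether an outage happened "over here, but not over here."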

636

:

So anyway, I found that

there were these outages.

637

:

I captured a bunch of the packets.

638

:

Oh, we lost connectivity

to this, and this.

639

:

And I had
to monitor on two sides.

640

:

I monitored on the server side back

to the client and on the client side

641

:

over to the server simultaneously

so I could see both sides.

642

:

So I was monitoring this way and this way.
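[Editor's note: the two-sided capture reduces to a set comparison. A hedged sketch — real captures would key on packet fields such as the IP ID or TCP sequence numbers; plain integer ids stand in for those here: compare the ids captured at the client against those captured at the server, in each direction, to see where packets went in and didn't come back out.]

```python
# Illustrative comparison of two simultaneous packet captures.
# Ids are stand-ins for real packet identifiers (e.g. IP ID fields).
def vanished(client_capture, server_capture):
    """Ids seen on one side of the path but never on the other."""
    client = set(client_capture)
    server = set(server_capture)
    return {
        # Captured at the client but never at the server:
        # lost somewhere on the way in.
        "lost_client_to_server": sorted(client - server),
        # Captured at the server but never at the client:
        # lost somewhere on the way back.
        "lost_server_to_client": sorted(server - client),
    }
```

Whatever sits between the two capture points — in this story, the encryption enclave — is where the missing ids died.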

643

:

And I found that for some reason
this one enclave was responsible

644

:

for whatever the outage was.

645

:

And so when I inquired about that,

it's Oh, that's the encryption enclave.

646

:

That's where they do all the KG
gear, and the KG encryption is inside.

647

:

So I looked at it from one side.

648

:

Yep, it's a KG area. And
from the other side?

649

:

Yep.

650

:

It's a KG area. And I said, somewhere in
there we're losing connectivity. But they

651

:

didn't want to play ball for some reason.

652

:

Now that's in your area.

653

:

Why, if I give them evidence that

their area is responsible, why

654

:

would they not want to play ball?

655

:

Kim Mueller: It sounds to me like you

might be dealing with a little bit of an

656

:

issue of cognitive dissonance, where

if your truth doesn't match the truth

657

:

that they have in their own mind.

658

:

If I believe that the problem is

outside of me, and you're telling me

659

:

the problem is within my area, then

that doesn't compute in my head.

660

:

Bill Alderson: And like I said,

these are different enclaves

661

:

or different responsibilities.

662

:

So the packets would go from the

office area, go in and get stored

663

:

in a certain location, and then

they would have to be encrypted in

664

:

order to go outside the Pentagon.

665

:

And I identified that this encryption
area was one of the problems, and they

666

:

said, Oh, you're not cleared to that level.

667

:

So very convenient.

668

:

The military and the government often
classify things so that they can keep

669

:

anybody who's going to find the problem

670

:

out of the area, and they use
classification and access to do it.

671

:

Now, classification is a most needed capability.

672

:

I've worked on classified programs,

special access required, and

673

:

compartmentalized stuff all my life.

674

:

But now it's being used against me.

675

:

And I said here's the symptoms, and

I just showed them the data, and

676

:

I said, it's somewhere in there,

and they weren't playing ball.

677

:

Now, I went in to explain this, and they

had a high level meeting in one of these

678

:

big, fancy, mahogany-row boardrooms.

679

:

And there were, I think, four or five
high level, secure executives.

680

:

Somewhat knowledgeable of IT and

the CIO of the Pentagon was there.

681

:

He was my sponsor.

682

:

And so I'm explaining and showing them

charts and graphs about my monitoring

683

:

and how I instrumented so that I
could show them my work.

684

:

I could show them definitively: this

is where the packet went to, and

685

:

this is where it stopped and didn't

come back out, and I monitored the

686

:

other side, and it didn't go out the

other side, and it didn't come back.

687

:

This is the area, this is the

compartmentalization that's

688

:

appropriate, and this is

the area where your delay is.

689

:

This is where that 45 second outage is.

690

:

I can't diagnose it because they say

I'm not cleared for that level, and so

691

:

I gave them all the information, and I

said, then you guys are going to have to

692

:

take it on the inside and figure it out.

693

:

They kept contending that they

weren't responsible, and I said this

694

:

is a pretty simple binary issue.

695

:

Packet goes here, 45 seconds.

696

:

I'm sorry, that is it.

697

:

All of a sudden, there's

this guy in this meeting.

698

:

He was the director of PenRen, the Pentagon Renovation program.

699

:

Remember the guy who

wrote the check to move

700

:

Kim Mueller: the

701

:

Bill Alderson: freeway?

702

:

It was that guy.

703

:

Do you think he had a

few friends in the world?

704

:

Yeah.

705

:

Think he's pretty powerful?

706

:

Unequivocally.

707

:

So he yelled at me, in

military terms, Stand down!

708

:

Because that area was something

that his team was responsible for.

709

:

Kim Mueller: Talk about defense mechanism.

710

:

It's like a literal defense mechanism,

and we talk in the psychology world

711

:

about the defense mechanisms that we

use and in this case, they're using, in

712

:

essence, a military defense mechanism,

a stand down. They're saying that you

713

:

can't go any further because you don't

have the classification to do it.

714

:

Bill Alderson: It wasn't only that.

715

:

It was, I was accusing that area of being

the problem and they weren't cooperating.

716

:

That's why he yelled, stand down.

717

:

Kim Mueller: And then what?

718

:

Bill Alderson: Very interestingly, I had
a friend who went to Annapolis;

719

:

he was a naval officer. And
he took me out to Annapolis

720

:

while I was there that particular week,

and we chit chatted and it just so

721

:

happened that he had a friend who was

in the CSI of the Navy, I can't remember

722

:

what it was, what's it called?

723

:

Kim Mueller: NCIS?

724

:

Bill Alderson: Yeah, NCIS.

725

:

He had a friend in NCIS, and they
were, they were called

726

:

in, apparently unbeknownst to me.
But how did I end up knowing this?

727

:

But because this friend of mine who

took me over to Annapolis, and by

728

:

the way, he took me to the chapel.

729

:

And in the chapel, at the bottom

of the chapel, is where they

730

:

recovered, from Europe, from

an abandoned grave, the bones of

731

:

our first admiral, John Paul Jones.

732

:

Yeah, so if you guys are ever in

Annapolis, definitely go to the

733

:

chapel and go down to the bottom

and they have a crypt there and the

734

:

actual bones retrieved from a

cemetery in Europe and we recovered

735

:

those and put them in the chapel.

736

:

The bones of John Paul

Jones, our first Admiral.

737

:

Anyway, while he was showing me all of

this and we were talking about it, he

738

:

said he had this friend who was at NCIS.

739

:

And he was his roommate when he went

to Annapolis, so they were tight.

740

:

And so he was, that was an Inspector

General thing, or NCIS thing anyway.

741

:

And he couldn't tell me

anything, but he knew something.

742

:

And he says, there's a reason why they

don't want you to solve this problem.

743

:

Kim Mueller: Someone's

got something to hide.

744

:

And maybe that something doesn't

even have anything to do with

745

:

what you think you're solving.

746

:

Bill Alderson: Might not

be a technology problem.

747

:

Now, it ended up being a

technology manifestation.

748

:

Come to find out a little later on,

that it was pretty much confirmed that

749

:

it was that organization's problem.

750

:

And interestingly, and by the way,

if you are an investigative journalist

751

:

and you want to go back in time and

find this out, more power to you.

752

:

But the then-director of PenRen,
three weeks after I left,

753

:

Retired

754

:

unexpectedly.

755

:

Kim Mueller: Could be coincidence.

756

:

But I seem to remember someone once saying

there's no such thing as a coincidence.

757

:

Bill Alderson: And I think that would have

been the guy, Mark Harmon, at NCIS, right?

758

:

Kim Mueller: Exactly.

759

:

Bill Alderson: Okay.

760

:

No such thing as coincidence.

761

:

And you would be right.

762

:

And there it is.

763

:

A play on words and a play on things.

764

:

So anyway, come to find out, this

executive had assigned this leader

765

:

over that area of technology.

766

:

And,

767

:

as

768

:

I later learned from my friend who
was friends with the IG and that sort

769

:

of thing, the woman that he had
assigned responsibility over that area

770

:

that was responsible was his mistress.

771

:

Kim Mueller: And now the human factor.

772

:

Bill Alderson: So what kind of things

do you think were going on there?

773

:

Kim Mueller: Certainly, again,

774

:

we all have a belief of who we

are and who people believe we are.

775

:

There is, there's yourself

as you see yourself.

776

:

And then there's yourself as

you believe others see you.

777

:

And when what others see gets shattered,

then it breaks down the human psyche.

778

:

And I think that's in some of these

situations that you're describing, even

779

:

though it's a technical problem and

really isn't a person's, it doesn't

780

:

change who that person is today or

tomorrow, but because of how our work

781

:

world is organized and how people

value themselves based on their work,

782

:

if what happens in that situation is

that their character gets, basically

783

:

their character gets assassinated.

784

:

Interestingly,

785

:

Bill Alderson: I forced the move.

788

:

The group resisted and

refused to work with me.

789

:

But later I found out, before I left

the Pentagon, we solved the problem.

790

:

Because this woman's entire

team was working 24 hours, 7

791

:

days a week while I was on site.

792

:

Because as I kept narrowing down,

and it kept getting more proof,

793

:

definitive proof, that their area of

responsibility was indeed the cause

794

:

of the haphazard 45 second outages.

795

:

Kim Mueller: So they needed

to solve it before you did.

796

:

Bill Alderson: Bingo.

797

:

That's one of the dynamics.

798

:

It's

799

:

Kim Mueller: like finding the witness

before you, before the bad guys find them.

800

:

Bill Alderson: Exactly.

801

:

But fortunately, I came in and like I

said, I was able to use my Swiss Army

802

:

Knife, I was able to do these things

and prove this is where the problem was.

803

:

And when that happened, that

caused this leader to explode in

804

:

front of a lot of other people.

805

:

And like I said, about three

weeks after I left, now this was

806

:

Kim Mueller: Because what he

was hiding was much larger

807

:

than the 45 second outages.

808

:

Bill Alderson: Yes, and they just

didn't want to admit that her

809

:

responsibility was where the problem was.

810

:

Now, it could have been

a technical problem.

811

:

So what?

812

:

Hey, we found this, and that was
the problem, and Bill

813

:

identified it as being in that location.

814

:

It was not that they were

the source of the problem.

815

:

That was the problem.

816

:

It was that they were trying

to hide it from others.

817

:

And as a result, it ended up, he yelled

at me to stand down in this big meeting.

818

:

And that escalated the analysis

of, and the investigation that

819

:

was going on about him and

820

:

this woman that worked for him at a high
level that he oversaw, who was his mistress.

821

:

It was

822

:

Kim Mueller: So he had a lot
of guilt going on for what

823

:

was happening, in my opinion.

824

:

He had some guilt over the fact that this

was his mistress and, there's this problem

825

:

and he doesn't want any of this found out.

826

:

And, when those kinds of things happen

and anxiety bubbles up, you default.

827

:

And what was his default?

828

:

His default was stand down.

829

:

Bill Alderson: So anyway, suffice

it to say, after we were done, they

830

:

did, her group did solve the problem.

831

:

We got it all fixed.

832

:

I was successful once again, thank God.

833

:

And I went off and, but I started

thinking, man, this guy probably wrote

834

:

a lot of checks to a lot of different

people and he had a lot of power.

835

:

And I was Mr.

836

:

Non Grata.

837

:

In a big way, that resulted in his

838

:

retirement.

839

:

I watched my back.

840

:

And I still wonder if there weren't

some things because it was not a pretty

841

:

picture and there were huge consequences.

842

:

And by the way, this

didn't get in the papers.

843

:

Never got in the papers.

844

:

Kim Mueller: Early on when

you were talking about getting

845

:

called in after this problem,

846

:

One of the questions that came

to my mind was the why now?

847

:

Because in the mental health field,

people deal with mental health problems.

848

:

Depression, anxiety, even severe

problems, hallucinations, but when they

849

:

seek help, there's always a why now.

850

:

And so one of the first

questions you ask is, why now?

851

:

Why now?

852

:

And people will say something

very generic, right?

853

:

And then you say, okay, what's changed

in the last 24 hours that made you

854

:

pick up the phone or schedule this

appointment or whatever, right?

855

:

What's the why now?

856

:

And so I'm curious what the why now

of this is: why now did you get called in?

857

:

I was

858

:

Bill Alderson: never made

privy to those things.

859

:

However, one can surmise.

860

:

It's the subject of books being

authored on the topic, right?

861

:

All sorts of different things.

862

:

Clandestine, secret situations,
inspector general, emergency retirements.

863

:

That sort of thing.

864

:

And so my friend had told me

that he knew that there was

865

:

some investigating going on.

866

:

And I think that's how I ended up being

there, because they were desperate

867

:

to solve the problem, but couldn't.

868

:

And I was the one who could prove that.

869

:

beyond a reasonable doubt, that

was the area that was the problem.

870

:

And you had

871

:

Kim Mueller: no skin in the game.

872

:

Bill Alderson: I have no skin in the game.

873

:

Kim Mueller: Regarding

whose problem it was.

874

:

I

875

:

Bill Alderson: was from California.

876

:

I got another phone call.

877

:

I just, responded.

878

:

Kim Mueller: Yeah, and they needed

someone from the outside to be able

879

:

to come in, because internally it

sounds like everybody was really, you

880

:

talk about a complex network, right?

881

:

It's a complex network of human

beings as well, and relationships.

882

:

And it, I don't care what the

environment is, whether it's the

883

:

military or, a fortune 500 company

or whatever it is, it's made up of

884

:

people, human beings with feelings,

with relationships and with problems.

885

:

Bill Alderson: Well said.

886

:

Thank you, Kim, for walking through

this with us and taking a look at,

887

:

some of these complexities that

have been associated with some of

888

:

these high visibility, high stakes

situations that I've been called into.

889

:

I often tell people that

I feel like Forrest Gump.

890

:

I have no idea why I'm the guy who

gets called in on these things.

891

:

But they end up being pretty doggone big.

892

:

And I am just an instrument and

I just go in and do what I do.

893

:

And and it's always

been interesting to me.

894

:

And once in a while, like now during

this podcast, I get to go back and

895

:

discuss these different sorts of

things and bring them to light.

896

:

And of course this is long ago and

probably some people are deceased.

897

:

By this time, and certainly the

statute of limitations is over, but it

898

:

still makes for a doggone good story.

899

:

And a hundred percent of whatever,

of everything that I said today

900

:

is a hundred percent true.

901

:

Kim Mueller: Thank you for sharing

your story with me and giving me

902

:

an opportunity to take a look at

it through the psychology eye.

903

:

Bill Alderson: Thank you,

Kim, for joining me today.

904

:

Until next time, Mr.

905

:

Non Grata, signing off.
