Uncovering the Ghost Decisions in Your Architecture
Episode 3 • 30th September 2025 • Stories on Facilitating Software Architecture & Design • Virtual Domain-Driven Design
Duration: 00:19:02


Shownotes

In this episode, Andrew Harmel-Law, joined by Andrea Magnorsky and Kenny Baas-Schwegler, discusses "ghost decisions": fundamental architectural choices that are often undocumented, implicit, or even forgotten. These decisions can cast a long shadow, influencing everything from technology choices to team structures.

Key Takeaways

  • Implicit Decisions: Andrew shares his experience with projects where he found that the most fundamental architectural decisions had already been made, often implicitly or as a result of legacy choices. These are often not explicitly architectural decisions, but rather things like the product being built or the team structure, which have a significant architectural impact.
  • Reverse-Engineering ADRs: To address these "ghost decisions," one team Andrew worked with began reverse-engineering Architecture Decision Records (ADRs). They documented historical decisions, including the options that were considered and the known downsides that were accepted at the time. This provided clarity on the original constraints and allowed the team to revisit decisions when the context had changed, such as a startup growing into a large scale-up.
  • Documenting Disagreements: The episode also touches on the challenge of documenting the human factors behind decisions, such as conflicts and power imbalances. Andrew suggests using phrases like "adopted despite" or "rejected despite" in ADRs to acknowledge opposing viewpoints and ensure all perspectives are represented in the official documentation (a minimal sketch follows this list). This approach can help people feel acknowledged and provides valuable context for new team members.
  • The "Telephone Game": Kenny highlights the dangers of relying on undocumented decisions, which can lead to a "telephone game" where information is distorted or lost as people join and leave the team. Without a clear record, it becomes difficult to understand why certain choices were made, making it harder to evolve the codebase.
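
For readers who want to try this, here is a minimal, hypothetical sketch of what a reverse-engineered ADR using the "adopted despite" / "rejected despite" phrasing might look like. The ADR number, options, and reasons below are invented for illustration; they are not the episode's actual client decisions:

    # ADR-001: Run the platform on AWS (written retrospectively)

    Status: Accepted (reverse-engineered; the decision pre-dates this record)

    ## Context
    We were a 12-person startup optimising for speed to market.
    Most engineers had prior AWS experience; the data team preferred GCP.

    ## Options
    1. AWS: adopted despite the data team's preference for GCP's
       analytics tooling, because team familiarity let us ship fastest.
    2. GCP: rejected despite its stronger managed analytics, because
       few of us had used it and we could not afford the ramp-up time.
    3. Multi-cloud: rejected despite satisfying both groups on paper,
       because we lacked the people to operate two platforms well.

    ## Consequences
    Revisit this decision if headcount or data-platform needs change
    materially (for example, growing from startup to scale-up).

Keeping each such record small, one decision per ADR, makes the "despite" clauses specific enough to revisit later, which echoes the small-ADRs point made in the episode.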

Transcripts

Andrea Magnorsky: Hello and welcome to Virtual DDD. Today we are doing Stories on Facilitating Software Design and Architecture. In this episode, the third one of our series, we are with Kenny, Andrew, and myself. We're gonna start today with Andrew telling us a story about how they have been facilitating software design and architecture. Over to you, Andrew.

Andrew Harmel-Law: Thanks Andrea, and thanks for having me, Kenny as well, both of you. It's super cool. I did manage to join the first one, but it's super cool to join this one. The story I wanted to share is an amalgam of lots of different experiences I've had with facilitating software architecture and design. So if people recognize different bits of this (because I'm a consultant and I work for ThoughtWorks), that's because I see this kind of thing over and over again.

The thing I wanted to talk about, and maybe get your thoughts on, is this: almost every single time I've started a software project, come on board, even if I come on board very early at the beginning, I'm always trying to find the first, or the most fundamental, architectural decisions and figure out what they are. Because they've happened already, right? Maybe they were implicit, or maybe they were legacy decisions from the history of why the project got to the point it's got to. What's interesting to me is that those fundamental, underpinning, very early-stage decisions typically cast a very long shadow; they have a very long effect on things. And what's most interesting to me is that it's not even that everybody knows they're there; people don't even acknowledge the fact that they exist.

One example: I went to one place (I've been to multiple places, but for the benefit of this story, I went to a client) and they were working on a rewrite and a rebuild. So everyone was treating it as if it was greenfield, as if it was completely fresh, including things which touched and informed the fundamentals of what we were doing. Those kinds of things were almost non-architectural decisions, but decisions which have an architectural impact. So for example, when people were picking: what's the thing we're gonna build? What's the product we're gonna build? That frames the landscape within which you're working, and it dictates where the edges, the boundaries, and the scope of your landscape are. Typically when I join, someone's already figured out how many people you'll need in teams and what teams you'll need, all these things. And I don't need to convince anyone that that has an impact. But what's interesting is to figure out what those things have been.

Typically when I come in, I'm trying to figure things out architecturally, because we need things to be quite black and white, right? Very distinct. In and out. Yes and no. All of these kinds of things. So from an architectural decision perspective, once or twice I've done this, where I've come in, we'll end up having conversations, and I'll eventually say: this feels like there's a preceding decision.

The time this worked the best was when I started somewhere and there was a bunch of historical decisions which, you know, were maybe not the most optimal decisions, but they were the decisions that had been taken. And that was where everything was, and that was where everything was headed. So after a big, long conversation with one of the architects I was working with, they offered (and this was their idea, not my idea) to write down those big decisions: to reverse-engineer the ADRs, filling in the history. Because I definitely feel ADRs are kind of an immutable changelog of what you're doing in your software, but we were missing the initial one or two or three things. So they wrote down these ADRs: around when we moved these teams, and when we decided to pick this technology. And they even recorded: these are the three different options we considered, and these are the reasons we thought were bad at the time, but we still did it because these were the circumstances. It was interesting for them to go back and do that kind of archaeological digging back through stuff.

But it also gave a lot of clarity about the constraints that were there and why those constraints existed. And that allowed people to challenge them. If we were like, right, we can't do this because of this, they're like, well, if you look in the ADR, the reason we didn't do this at the start was because we didn't have enough people, or because product management didn't think this was as high a priority as this other thing. And then we could say: right, now we're looking at this decision again. Because you can go back and revisit (not refactor) some of these fundamental decisions with that clarity as to why. People kind of think that some of these decisions are untouchable, right? You can't go back to them. But if you go back and remind yourself of the context, then you can say: this context has totally changed. We were a 12-person startup, right? And now we're a 200-person scale-up. So now we do have people to do stuff, right?

Or there was another decision, I think this one's in my book, where the client wanted to move really fast and they were basing everything they did on AWS. But there was one thing that they wanted, a chipset that was only available on Azure. So they took this decision knowing that their strategic target platform didn't have this chipset they wanted available. Then they could go and say: right, for this small decision, we're gonna take this thing and go to GCP, and it'll be a bounded decision. So it kind of freed them up, and it reminded them what the actual constraints of their previous decisions were, as opposed to these not-very-clear implicit decisions setting artificial boundaries on things. And that was super important. The reason we did it at that time was because the client I was working for was a scale-up, so they needed all of this clarity to make it clear to all of the new people joining why we're doing this, why we're doing that.

Having done loads of things since then, I think it's an approach which could work really well for these fundamental decisions. If you're working in a multi-tenant system: what do we mean by a tenant? What are the boundaries of our systems, et cetera? What's our core domain? What are the things we build? What are the pieces we don't build, et cetera? Loads of those things are typically implicit, but you can pull them out and make them explicit, even if they're kind of underpinning. So that's my experience. And I was wondering, I guess maybe you've seen this kind of thing before as well?

Kenny Baas-Schwegler: Well, it really reminds me of a workshop I did in the past called Appreciative Inquiry, which had nothing to do with decisions, but everything to do with events, or decisions. It really reminded me of that. And now I'm thinking: oh, it would work really well to do an event storming. Like, event storming of what decisions have we made so far, and what decisions do we still have to make? You can 100% sort events out of that: what are the events where a decision took place, even though they're not, right? So that's my first thought. And then with Appreciative Inquiry, what we did was look at what are some good and bad decisions, and there's a difference, right? What are some well and badly made decisions, and what were some good decisions that had bad outcomes? Because there's a difference between the two. But I think you could definitely storm that out in a whole group. That was my first thought. And then you can ask: which decisions do we want to take with us? Which decisions do we need to change? And then you can do sort of a reset, maybe, like what you're saying. Then write some of this stuff down as: okay, we took this decision, this is what we know, and then just move on.

Andrew Harmel-Law: Yeah, like a baseline, or like a snapshot in an event-driven system, right? You're like, we could go all the way back, but the further we go back, we're into people remembering, and half of those people, or maybe all of those people, have left, right? I've seen things before where nobody can quite remember what the decision was or why it was taken. It's just: we use this software to do this. And you're like, why? No one can even remember why this happens anymore.

Andrea Magnorsky: That was kind of my question as you were talking about this: is there a particular trigger that made you want to do it? Is it more that you, Andrew, found it a good practice when you start at a place, to make this explicit so that making change is easier, basically? Or are there particular signals that you look for before implementing this strategy?

Andrew Harmel-Law: That's a good question. So, lots of my practice has kind of come about from... we were talking about this a bit before we started recording, Andrea, right? I don't think there's a lot of this in my book. I had, not a fight with O'Reilly, but when I wrote my book I was like: I don't want this to be a recipe, do these things and then you will get success. Because you kind of don't. It's more like: watch out for these things, and then you'll probably be able to figure out what to do.

So this came about because it was a specific client, and we were there specifically to help them scale up. They'd made loads of decisions super fast, and they'd figured out what their problem-solution fit was and what their product-market fit was, right? They'd figured that out, but they'd run really fast. So the people who were there remembered, although different people remembered different aspects. Like, the reason we picked Amazon over GCP is this; and if you spoke to someone else in the data team, they'd be like, we hate those guys, we lost the battle. Whereas the people who wanted AWS don't think anyone lost. They're just like, we picked the best thing. So it was for that purpose. They kept having to explain the same thing over and over again to every new person who joined. We just want to explain, and we want to say: we know it's not perfect, and we know everyone wasn't happy, and we know we may regret it, because at the time it was good, but now it isn't good.

Kenny Baas-Schwegler: That's something interesting: they need to re-explain everything, and there's a telephone game in there.

Andrew Harmel-Law: Totally.

Kenny Baas-Schwegler: The decision, and this is what I see many times as well, right? Some person made a decision and never wrote it down. And then when you come in, you talk to one person and they say A, and you talk to another person and they say B, and there's already a conflict, right? And then when you ask, okay, who made the decision? When was it made? Nobody will say, ah well, you know, it's over there; I made that decision, and this is the reason. It's never one coherent story when you don't write them down.

Andrew Harmel-Law: Totally.

Kenny Baas-Schwegler: That reminds me: there's a total telephone game going on there, and then ranking.

Andrew Harmel-Law: Yeah. And that goes to two points that I think we don't worry about enough, but they make a big difference. Because when we build software, I don't think I've ever seen the software being maintained by the same team that built it, right? We move around a lot, right? We do a bunch of stuff. We arrive somewhere. I mean, I know I'm a consultant, right? So I do this; this is my job. But even then, there's very infrequently someone who's been there forever. And then there's the danger of, you know, them becoming what Alberto Brandolini would call the dungeon master, and all this kind of stuff, right? But most people move around; very few people are living in the codebases that they created. They're living in the codebases that someone else created, and they're trying to work in them, and if they don't know the reasons why...

Like you said, you don't get decisions that are simply good and bad, and that's what Diana and Ian would say, right? In architecture you do the trade-offs and all that kind of stuff. There is no best decision, or sorry, there's no right decision. It's the best one given the circumstances and what you know, et cetera, et cetera. And then you hope that decision gets into code, which maybe doesn't happen. But this is the thing, the explaining, like you're saying, Kenny, right? People go: you told me this thing when I joined, or I read these pages on the wiki, and then I looked at the codebase and I'm having difficulty understanding. At least if you have some ADRs, at least for significant things, you can explain the historical forces that were affecting this kind of thing. Which helps people raise their empathy for the codebase and for each other and all of the other different things, and understanding: they can reverse-engineer the mental landscape of what was going on, as to why things are like they are. And then they know what they can maybe try and fix, or maybe try and change, or maybe try and evolve.

And I think, because I've talked about this elsewhere, I think at NewCrafts, different bits of codebases can have strong ego power, right? It can feel like you shouldn't touch it because someone who's probably now a senior executive wrote these parts of the code, and we all know we can't touch it. And you're like, but why? They don't live there anymore. They built this thing and then they've left and gone and done something else.

Andrea Magnorsky: Some places, they still... I know of them, I've been there, and I'm sure you've probably been there too: places where the people left and they're theoretically in some other position, but in reality, if you touch their baby, you know, these people will cry and will come down and be like, oh, you touched my baby, what is the story with this?

Andrew Harmel-Law: Yeah, interesting, and I think I've seen this too, right? It helps if you write that down, because people forget. They're like, oh, I did the best thing and I built the best thing. But when you speak to most people while they're actually doing things, they're like, no, there were a bunch of forces, and these are the things that we were taking account of at the time, and stuff.

Andrea Magnorsky: Actually, this is the thing. So I'm gonna ask this last: how do you write down the power imbalances in ADRs in a useful way?

Andrew Harmel-Law: See, I don't think I do. I think this is something I want to think about more, and I'm thinking about it more and more. It's definitely not in the book. I mean, I think it should be in the book, but it's not.

Kenny Baas-Schwegler: This is a good question, because I face this myself, and I'm very curious about other people's thoughts. Maybe we can do a whole episode about this. There are some personal reasons that are very... let's be honest, we are in a patriarchal, male-dominant industry where, if you're dealing with emotions, it's very hard to talk about them most of the time. I find it very hard. Well, I find it easier now, but I've been to a lot of therapy to talk about that. But I've had situations where, you know, there's a clash, and a lot has been said about an architecture decision, and writing that down felt very hard for the people involved. So, going into conflict management: should we write down the conflicts and the power play that was there? Because that's relevant context, but it's also very hard to write down. A, how do people interpret it? And B, it's very fragile.

Andrea Magnorsky: It's extremely context-aware. So my question really was about that. It's important because if there was a conflict, even around the decision, like the thing you were explaining earlier, Andrew: oh, we chose AWS, and the data people are like, oh, we actually preferred GCP or something else. And it's like, all those people that did... That should be in the ADR, but it probably isn't. This is why it's so important: because when you revisit this decision, people that have a memory will be like, oh, we're gonna get them back. Or maybe not. Or maybe they'll be like, you know what, actually, we live with AWS and it's all right; we always just wondered. And so my general take is to say there was a conflict, and, like, it might be worth talking to people in data, and so on. These were the two points. But it's not always possible to write that down safely. And that's where Kenny's comment comes about, the safety of...

Andrew Harmel-Law: Yeah.

Kenny Baas-Schwegler: We...

Andrea Magnorsky: ...of who's writing what.

Kenny Baas-Schwegler: Yeah, we should definitely dive deeper, but I think you just portrayed a heuristic, right? What if something is a personal conflict? We call it personal-based and relationship-based. So what if there's a relationship-based conflict? Do we write that down? I think it's a good heuristic to just write down: there was a personal conflict, talk with these people. I think that would already be a good heuristic to help people: if you deal with that, do this for now. And maybe in the next episode we can definitely dive deeper.

Andrew Harmel-Law: There's one thing, actually. I forgot about this; it's in my book. There's someone at my current client who's using it, and it's really good, because where I am at the moment there are a lot of teams with a lot of people trying to move really fast. Not everyone agrees with every single decision. But there's a thing I've suggested, and I've seen it occasionally in ADRs, for when you've got the pros and cons of each option. You can say "adopted despite", right? So: we take this even though we know that this is a downside. So we've taken this option even though this is a downside. Or "rejected despite". And I've seen that work well: people at least feel acknowledged, and they feel acknowledged not in the comment section; they feel acknowledged in the body of the thing. So you can say, right, I disagree with you, but I'm still gonna acknowledge that, kind of stuff. And like you say, Kenny, it doesn't need to be "these people fell out with each other", but "this was a disagreement" and stuff.

There's bits in my book about coalescent argumentation. It's about trying to figure out where the area of disagreement is. Because when there's a disagreement, frequently it's like: right, I don't like Andrea, so therefore any idea that Andrea comes up with is terrible. Which, let's be honest, is not gonna be true. But maybe there is a specific thing where we're like, right, this bit we disagree on, and then you zoom in on it. So writing this down in ADRs can be good. And again, my big thing: small ADRs, right? Because if it's a big ADR, then the chances of a clash are far bigger. If you break it up into smaller pieces... this comes from Reinertsen and The Principles of Product Development Flow: within every big decision, there's one or two bad ones. If you break a big decision into 20 small decisions, then maybe 18 of them will be good, and two of them might be bad or hard, right? But at least you can then focus, and you can have a discussion and a disagreement about something specific, as opposed to: I will block your whole thing because I don't like the fact that you're going to use Node for this tiny little thing, some library I don't like.

Kenny Baas-Schwegler: I think next session we could definitely invite Gien to talk about decision making, because in decision-making theory there's also a lot about what we want. So I think that's interesting: how would we write that down, what we want? So that's a personal interest.

Andrew Harmel-Law: Yeah.

Andrea Magnorsky: I think we're on time-ish,

Kenny Baas-Schwegler: Yep.

Andrea Magnorsky: So maybe we should do some closing comments. Maybe a little summary wouldn't be amiss. It feels like Andrew told us a story about surfacing implicit and very important decisions that had already happened. That's a good way to kick things off. Then we talked about conflict, and how we could maybe include conflict, or not, in ADRs. Anything else? I mean, there's more than that, obviously.

Kenny Baas-Schwegler: I think another heuristic here would be: write down past decisions that cause conflict, right? If there's a lot of confusion, just start writing them down and reset them, in a way.

Andrew Harmel-Law: Yeah. Because decisions are made at a point in time, right? It's what you knew and thought and felt at the time.

Andrea Magnorsky: So is that a goodbye for now,

Kenny Baas-Schwegler: Yeah.

Andrea Magnorsky: for this episode? Well, thank you very much, and see you next time, people listening to Stories on Facilitating Software Architecture and Design.
