The Future of Data Privacy in Digital Marketing - Stephane Hamel
Episode 32 • 14th May 2024 • RevOps FM • Justin Norris
Duration: 00:46:05


Shownotes

Every company pays lip service to privacy. But few seem to really value it in practice.

In marketing and sales, we rely on personal and behavioural data for so many commonplace tactics, and this reliance is only increasing.

It's a kind of arms race. Even if you wanted to, it feels like you can't afford to give up the perceived advantages of using personal data.

And even as we continually hear about the "death" of this or that type of tracking, the reality is that hundreds of billions of dollars in advertising revenue are on the line. The incentives to keep collecting and commodifying our data are huge.

I wanted to dig deeper into the current state of privacy in marketing today, and I could think of no better person to reach out to than Stephane Hamel.

He is a legendary practitioner and thought leader in the digital analytics space and one of the foremost advocates for privacy and ethics in marketing.

Thanks to Our Sponsor

Many thanks to the sponsor of this episode - Knak.

If you don't know them (you should), Knak is an amazing email and landing page builder that integrates directly with your marketing automation platform.

You set the brand guidelines and then give your users a building experience that’s slick, modern and beautiful. When they’re done, everything goes to your MAP at the push of a button.

What's more, it supports global teams, approval workflows, and it’s got your integrations. Click the link below to get a special offer just for my listeners.

Try Knak

About Today's Guest

Stephane Hamel is a digital analytics and marketing expert with more than 35 years of experience. He has spoken at over 150 industry events and was named one of the Most Influential Industry Contributors by the Digital Analytics Association. As an educator, he's taught thousands of students at institutions including UBC, SimpliLearn, ULaval, Harvard, YorkU, and others. In recent years, he's focused his attention on the ethical use of data in a marketing context.

https://www.linkedin.com/in/shamel/

Key Topics

  • [00:00] - Introduction
  • [01:49] - Current state of privacy in marketing
  • [03:30] - Vision of a truly privacy-oriented company
  • [08:20] - How a 100% consent-based model would work
  • [10:59] - Is privacy-respecting marketing better marketing?
  • [13:26] - Analogy of seatbelts
  • [17:32] - Getting people to care about privacy
  • [20:50] - Advertising in a privacy-first world
  • [24:21] - Conflict between privacy and commercial interests
  • [31:41] - Google and the “death” of third-party cookies
  • [36:47] - AI literacy
  • [41:44] - Career impact of being privacy-oriented

Learn More

Visit the RevOps FM Substack for our weekly newsletter:

Newsletter

Transcripts

Justin Norris: So a few weeks ago, there was a dust-up on LinkedIn about a new intent tool that enables marketers to de-anonymize specific individuals that visit your website. So not just companies, but specific individuals. Which tool it is and the people involved aren't really the point of the story. The incident stuck in my mind, though, because of how it reflected our attitudes towards privacy in the marketing world. So many commonplace marketing practices do very similar things, like when we drop a marketing automation cookie that lets us see all of a user's activity on our website. Or when we de-anonymize a company and get a location that lets us triangulate a user's identity. Or when we give money to massive corporations like Google and Facebook; there are entire business models essentially predicated on various forms of identity profiling.

So the relationship of marketing to privacy is complicated, to say the least. It never totally sat well with me, although it is something I've accepted kind of as a cost of being in this industry. However, the whole incident really made me want to dig deeper into the current state of privacy in marketing today. And so I could think of no better person to reach out to than Stephane Hamel. He is a legendary practitioner and thought leader in the digital analytics space and one of the foremost advocates for privacy and ethics in marketing.

I first met Stephane, incidentally, about 12 years ago when he was my instructor for a web analytics certificate that I was taking, and I have followed his work ever since. So Stephane, I really appreciate you being here today.

Stephane Hamel: Thank you for having me. It's really a pleasure, even after 12 years. That's amazing.

Justin Norris: Let's dive in, and maybe as a first step, just help us understand your point of view. You think about this a lot, and a lot of marketers go about their day without giving too much thought necessarily to the privacy piece. From your perspective, where are we in terms of our attitudes and our practices with respect to privacy? Have we gotten better? Have we gotten worse? Or is it kind of a mess all in all?

Stephane Hamel: That's a really good question. Is it better or worse? Better in the sense that there is more consumer awareness, sadly maybe because of all the data leak stories that we hear in the press and so on. Marketers are more aware and conscious of the legal frameworks, be it the GDPR in Europe, or the new Law 25 in Quebec, or even what's happening in the States and so on.

The negative side of things is that I see the ad network ecosystem, all of the ad tech industry, basically saying: oh, with the death of third-party cookies, organizations are enticed to collect more first-party data on one end, and the marketing and ad tech industry is trying to find other ways of continuing to collect as much information as before, if not more information than before. So the net result is that from a privacy perspective, I'm not sure the end result is going to be better than using, let's say, third-party cookies and stuff like that.

It's evolving very quickly. There are positive and negative sides to it. But generally speaking, I think we're aiming at a more privacy-friendly marketing ecosystem on the net. So let's cross our fingers.

Justin Norris: In the example that you just gave, you kind of alluded to it. A lot of marketers and a lot of companies approach privacy like a set of rules that they have to comply with, but that they can also get around in legitimate ways if they can. So, you know, a rule gets set up, and then we go around it. Instead of using cookies, we use some other form of tracking users. So it's kind of complying with the letter of things, but not necessarily the spirit. And you recently wrote about this distinction between legal compliance and having a true privacy orientation. What would it mean for a company to actually be privacy oriented?

Stephane Hamel: For me, there's a huge difference. Especially if you look at the GDPR in Europe, it was enacted several years ago. So they went through all of the steps and still are dealing with, you know, clarifications of the legal text, and having specific cases where they can say, okay, this is a good example, or this is a bad example. With Law 25 in Quebec, we see that the same thing is going on. So the law is there, companies are trying to comply, but mostly, from what I've seen, for now at least, it's more like legal compliance. Okay, I need to do the minimum amount of things in order not to get a slap on the wrist. I need to check the box that says, oh yes, I've done this and that and so on.

And this creates a scenario where you comply from a legal standpoint, but you don't necessarily comply with the principles of privacy, with the basics of marketing, which is listening to your customers. Being aware that if you have a banner on your website and you offer people the choice to opt in or not, and they say, no, I reject all cookies and I don't want to be tracked, then why do you try to find ways to circumvent that? And say, oh yeah, okay, I'm going to use this tool that doesn't use cookies, so it must be all right. Or, I can still use Google Analytics because Google tells us there's a consent mode, and we should trust Google when they say: oh yeah, that's privacy aware, trust us, you can use that, there's no problem.

So my hope is that we're going to go from legal compliance, where really it's a pain, it's an expense, you need resources, it's a distraction, it's a barrier to innovation and so on, and shift the mindset so that we're privacy aware from a principles perspective, the ethics of it. It means that now it's not an expense, it's an investment. It means that it becomes a brand value in the long term, and that financially it's worth something. It's an opportunity for innovation. It's an opportunity to say, okay, I've been using all of those marketing tools for so many years. Do I really need all of those? All of those tools that are collecting data, each one trying to pull more data into whatever their system is. So, shifting the mindset a little bit.

Justin Norris: Are there any companies that are actually doing this today, or does it remain aspirational from your point of view?

Stephane Hamel: Sadly, I guess I have to admit that it's more aspirational than really seeing companies that are really embracing the whole principles of privacy. But I think that's going to change slowly, gradually. We're going to see changes. One example that I like to mention, which I find absolutely absurd, is when you look at a privacy policy. Oftentimes the first statement is: your privacy is really important to us. That should be a value proposition from that brand, right? But then you realize that if you are in Europe, you get a banner. If you are in Quebec, you get a banner, because the law forces you to put a banner and say, okay, we want to collect such and such information and so on. But if you're anywhere in the rest of Canada, for example, there's no banner. If you are in the States, there is no banner, and it's open bar. So it's like: okay, from a brand perspective, your privacy is important to us, but it depends where you live. To me, it doesn't make sense. Either privacy is important to the brand as a whole, or it's not.

Justin Norris: I smiled when you said that line, "your privacy is important to us," because they all start with that. So it's a little bit of a fig leaf, or marketing speak. Is privacy actually important? So let's think about a hypothetical company that really did value privacy. And you have this hashtag you often use in your posts: no consent, no tracking. So, a hundred percent consent-based model. Maybe just walk us through what it would look like for a company to be using that model: what they would collect, not collect, how they would interact with customers, and so on.

Stephane Hamel: So I guess I came up with the no consent, no tracking hashtag just because it's an opportunity to step back and think about it as a marketer before you actually do something. So I didn't want it to be an absolute. I wanted it to be an opportunity for really thinking about your marketing tactics, and being proud of your marketing tactics too. One of the things I often mention is that we should be proud of what we do from a marketing standpoint, and not wake up in the morning being afraid that this is the day someone will tweet something or post something about what we're doing in our marketing tactics and say: oh yeah, this company says they do this, but in reality they do something else, or they are creepy and they are using tools that harvest too much information, stuff like that.

So really, basically, no consent, no tracking is just an invitation to think about the principles of privacy, of data minimization, of consent, of control, and things like that. A company could do that fairly easily by looking at the martech stack that they use right now and seeing, as a first step, if any of those tools are absolutely necessary, and what data they are collecting and for which purpose. So basically doing an audit of your martech stack and seeing if it's all relevant. Using a consent banner on your website where there are no tricks: you know, you have the same types of buttons for agree and reject, for example, no dark patterns anywhere. And respecting the user's choice: if they don't want to be tracked, then don't track them.

Actually, one of the mistakes is that most banners that we see are talking about cookies. But it doesn't really matter. And I'm not talking from a legal standpoint, but from a principle standpoint: it doesn't matter if it's cookies or not. What matters is: are you collecting personal information? I don't care how you do it, whether you use cookies or fingerprinting or any other method. If I say I don't want to be tracked, then respect my choice. Otherwise, what happens is that consumers are taking control and they are using ad blockers and stuff like that. So at the end of the day, the marketers are not in any better situation.

Justin Norris: What if we model through mentally what that consent-respecting mode of marketing would look like? I guess one thing that comes to mind is less gating, less trading contact information for content. And lo and behold, a lot of people now are coming around to the notion that that's actually just a better customer experience, and actually better marketing in general. Do you think that respecting the principles of privacy leads us to better marketing in all cases, in some cases, or is it sort of an "it depends" scenario?

Stephane Hamel: Well, I guess I have to say, you know, I've done so much consulting that I have to say: it depends.

Justin Norris: Of course.

Stephane Hamel: That's the universal answer to everything. But seriously, I think there are so many scenarios where marketing could be improved if we only went back to the roots of marketing. There were marketing campaigns before the web and before the internet existed. There was TV, radio, print, and so on. Now everything is digitized, but the concepts of marketing remain the same. It's building trust. It's listening to your customers. It's offering value.

And I think it's been too easy for people to simply say: okay, I'm going to spend X amount of dollars on Google or on Facebook. It's kind of reassuring to say: oh, I'm spending a couple of thousand dollars, I see how much traffic I get, and I see the net result of that. Well, as much as possible, given the data that I can collect. It's kind of reassuring to work in this way, and that's how it's been for the past maybe 20 years.

Can you imagine how the internet would be if, from the start, the consumer had retained control over the data they want to share? Instead of the evolution that we've seen, where, oh, there's a new trick, there's a new technology that allows you to, you know, use geolocation and collect so much information from different browser behavior, navigation, and so on. And then: oh yeah, okay, we can be data brokers and merge the data in one place, or several places, merge that with other sources of data. Oh, data is the new oil, that's a gold mine. Except that the person who benefits the least is probably the consumer themselves.

So I think we need to change that paradigm and think differently. Just like, you know, in the courses that I'm teaching, I talk about the evolution of seatbelts in cars, where initially, you know, consumers didn't want them. And car manufacturers said: oh, that's an additional cost, it's going to be more expensive, it's a barrier to innovation, and so on. And look at where we are today. We have super efficient cars, very safe, because we don't only have seatbelts; we have airbags, and we have the frame of the car. All that innovation came because safety was an important aspect, and consumers can choose between cars that are reputed to be safer than others. So privacy is going to be the same. Consumers will choose brands that they trust because they see that privacy is part of the value proposition of that brand. In the long run, I think that's how we're going to win, I would say, the privacy battle.

Justin Norris: Taking the seatbelt as an analogy, you think perhaps that our attitudes towards privacy will evolve from something like "let's pay lip service to it, but get around it" to more like: no, it's really going to be baked into the model, and people are going to respect it.

Stephane Hamel: Yeah, I think so. Because going back to that seatbelt analogy, it took regulations and laws to impose that people wear a seatbelt. And it also took about 20, 25 years of marketing campaigns and education so that people realize that when you get in a car, you know, you put on your seatbelt. So it has become security by default, in a way. I hope it won't take 20 or 25 years to educate people so that privacy by default becomes the norm. That it becomes automatic that when you purchase something new, the default password is not "password," for example. That's so obvious, but how many internet-connected and IoT devices were shipped with default passwords? That's not privacy and security by default.

When I do audits, I see the consent banner, I reject all, and I see there's still tracking going on. Not just one tag firing by mistake; it's almost like a deliberate facade: okay, yeah, we have a consent banner, we have the nice legal text, but from a technical standpoint, it simply doesn't work. That's where the privacy engineer comes in, which is basically a new role that we see emerging. You need someone from an engineering standpoint who understands how to enable that banner and make sure that the tracking doesn't fire anyway. So there's a whole set of roles and new job functions that are emerging, be it from a legal standpoint or from a privacy engineering standpoint, that are becoming more important and that didn't exist just a few years ago.
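The gating a privacy engineer has to verify can be sketched in a few lines. This is a hypothetical illustration, not any real consent platform's API: `createTracker` and the `consent` object are made-up names, and a real site would read consent state from its consent management platform and also confirm that third-party tags honor the same signal.

```javascript
// Hypothetical sketch of "no consent, no tracking" gating.
// `createTracker` and the `consent` object are illustrative names, not a real API.
function createTracker(consent) {
  const sent = [];
  return {
    track(eventName) {
      // Core principle: if the user rejected analytics, the call is a
      // no-op -- no cookie, no fingerprint, no network request.
      if (!consent.analytics) {
        return false;
      }
      sent.push(eventName); // stand-in for an actual network call
      return true;
    },
    sentEvents() {
      return [...sent];
    },
  };
}

// A user who clicked "reject all" generates no tracking events at all.
const rejected = createTracker({ analytics: false });
rejected.track("page_view");
console.log(rejected.sentEvents().length); // 0
```

The point of the audit Stephane describes is exactly this invariant: after "reject all," every tracking call on the page must behave like the no-op branch above.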

Justin Norris: What you just described with the cookie banner, I've seen that too. Do you think that's often just incompetence, or do you think it's malice, like deliberately flouting the user's...

Stephane Hamel: No, I hope that most of the time it's simply because it's getting very difficult to merge the profiles of someone who understands the legal perspective, who understands the marketing perspective, and who also understands how technology works, to do all of those things. Having a profile that merges the three areas is virtually impossible. So that's why it's teamwork. And maybe now more than ever, because I feel like we're in an enabling phase, a transformation phase, and we need the people who have the right skills to do those kinds of things. So I think: you don't know what you don't know. So you do your best, but it doesn't mean it's fully compliant.

Justin Norris: Right, clicking the reject button doesn't magically disable all the scripts on their website.

I want to pull on this thread of the seatbelt analogy a bit more, because I find it actually really interesting. One of the things you could say about seatbelts is that the impact of not wearing one is very dramatic if something goes wrong, and it's very visible and obviously very horrible. With privacy, that debate I mentioned around this particular tool at the beginning is kind of interesting from that perspective. There were the people that were like: this is wrong, you're a bad person if you do this. You know, that point of view. And then there were people that were like: I don't care, whatever, de-anonymize me, what's the big deal? And it's like, okay, yes, I can see that. It's not the same thing as shooting through your windshield at a hundred kilometers an hour, but there's this sort of cavalier attitude. And do you think that comes from the lack of visible impact? People are like: hey, I don't care if Google knows what I'm doing, or I don't care if my iPhone is listening to me. Or, you know, I think it was one of the founders of Google that was like: if you're not doing anything bad, why would you, you know, would you want to...

Stephane Hamel: Oh yeah, I hear...

Justin Norris: How do you fight that sort of cavalier attitude towards privacy?

Stephane Hamel: Yeah. You hear things like: I have nothing to hide. Or: they know everything anyway. And "they" is who, exactly? We don't know. But they know everything anyway. And also: if I'm going to be exposed to some advertising, I would prefer to get something that is customized and tailored to my interests.

And yeah, okay, I guess, you know, like with the seatbelts, people were saying: what if I crash into a lake and I don't have enough time to get out of the car? I'm going to drown. All kinds of "yeah, but what if" exceptions, things that in reality have a very small chance of actually happening. But when you realize that your data has been leaked a couple of times, and suddenly you see credit cards popping up on your credit report that you have no clue where they come from, when you see that you can get in trouble because your identity is being stolen and stuff like that, maybe that's like the crash. It creates such a situation of frustration, of lack of control. You feel like you always have that possibility hanging over your head: you're going to have issues the next time you want to purchase a car because your credit report is going to be so bad, or issues with the government, or issues crossing the border at customs because someone else has stolen your identity. So those are pretty serious things.

So I think people will care more. And maybe, personally, that's one of the reasons why I went from being an advocate for analytics, and I'm still an advocate for analytics, of course, to being more privacy aware: because my data was leaked several times over the years. And just recently there was an attempt by someone in Europe to purchase something using my credit card, something I never ordered. Going through the trouble of stopping the transaction, the stress that it creates... you don't want to live through that, just like you don't want to live through a car crash.

Justin Norris: Well, no, that is serious. I wonder, again, just thinking about consent and tracking. If given the choice, I usually say no, unless I'm in a hurry and they use a dark pattern, like you said, where it's hard to find the reject button, and I'm like: ah, whatever, I just need to get on. But I do say no. I use an ad blocker. I try to use more privacy-aware browsers and technology. Which, again, to some extent conflicts with attempting to use those technologies as an advertiser in my work as a go-to-market professional. But all that is to say: if the choice is really clear and really respected, and enough people say no, eventually those marketing tactics lose their efficacy, because they're no longer reaching anyone. So do you feel like those things maybe just die, and we have different ways of reaching people? And what would those ways look like?

Stephane Hamel: That's a good point. Two things on that. One is, just yesterday I had a conversation about the concept that the internet, or the web, exists because it's subsidized by advertising, right? And that's why we get so many things for free, because it's supported by advertising. And if there are too many people blocking those ads or saying no, then the whole system is going to fall apart.

But at the same time, when I look at media websites that are often complaining about, you know, the big bad Google and Facebook stealing their advertising revenue: I go there, and what I realize is that if I use an ad blocker, I don't see the ads. But there are no contextual ads either. And contextual ads don't require any collection of personal data. It's like: if I'm on a news website and I'm in the car section, well, guess what? Maybe it's the right place to put ads about cars and all things related to cars. And even beyond that, because you know that if people are interested in cars, they might be interested in something else. You don't need to track everyone to do that. It just makes sense. And at one point, in the early days of the web, that's how it was.

So there's that misconception that in order for ads to be effective, you absolutely need to collect personal data, because you need to know what the interests of your audience are. To some extent it's true, but collecting personal data is not the only way to know about your audience. And for that matter, if you go into your Google profile or your Facebook profile and go deep, because you need to find it, and look at what Google and Facebook think your interests are, you're going to be in for a shock. Because they might say that I'm a woman, that I'm 35 years old, and that I'm interested in Pokemon and stuff like that, when it's actually totally wrong. But I have very little control over that. So with all that interest-based advertising, and there's so much fraud also, that's the other issue, maybe contextual ads are going to be a good alternative. Much more privacy friendly, you don't need as much data, and it can be much simpler. So why not?
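Contextual targeting really can be that simple: the ad is chosen from the page's own content, with no user identifier or behavioral profile anywhere in the input. A minimal sketch, with made-up section names and ad inventory:

```javascript
// Hypothetical contextual ad picker: the only input is the page section,
// never a user ID, cookie, or interest profile. All names are illustrative.
const adsBySection = {
  cars: ["dealership promo", "tire brand"],
  travel: ["airline sale", "hotel chain"],
};

function pickContextualAds(pageSection) {
  // Unknown sections fall back to untargeted run-of-site ads.
  return adsBySection[pageSection] ?? ["run-of-site ad"];
}

console.log(pickContextualAds("cars")); // [ 'dealership promo', 'tire brand' ]
```

The design choice is the whole point: because the function takes no user data, there is nothing to consent to, nothing to leak, and nothing for an ad blocker's privacy rationale to object to.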

Justin Norris: That's very interesting. It reminds me of magazines, you know. I haven't read a real magazine in a while, but back in the day you'd get, like, Car and Driver and it had a certain type of ad in it, and you'd get another type of magazine, and you kind of know who the audience is and who's interested in that, and you advertise accordingly.

Thinking about, you know, the central issue here: the central conflict from my point of view is that, at least on the level of perception, companies are seeing a conflict between respecting privacy and their own commercial interests. And I'm sure you've seen this as a consultant going in and making recommendations. The de-anonymization tool is a perfect example: oh, we can see exactly who is on our website, we can reach out to them, it's a great play. But is it okay in terms of privacy? But it could make us money. That's tough, particularly being somebody in house and having to stand up for that and say: let's make less money, potentially, for this abstract principle of privacy, which feels like a victimless crime, because nobody really sees or is aware of the extent to which they're being tracked. Like, how do you make the case of: you need to sacrifice your tracking, do less profiling, have less effective, potentially, cold outreach, less personalization? I think the contextual advertising that you mentioned is one effective potential way of thinking about that differently. How else do you make that case?

Stephane Hamel: I'm not saying not to track anything. Because if you have, let's say, a consent banner on your website and the person says: yes, I want to get your advertising, I'm interested; and if it's done in a respectful way, you build additional trust. From a marketing standpoint, do you prefer to have, like, a thousand prospects that are not really qualified, or do you want to have maybe a hundred that really said: yes, I want this? I think I would much rather go with the fewer that are much more qualified, and not waste my dollars and my time on the 900 others that I maybe have a slight chance of converting into, you know, clients.

There was that obsession with data, as if it would solve everything. The discourse was: let's collect everything, dump it into a big data lake, and eventually we're going to figure out what we want to do with it. That doesn't work anymore. Even if data is cheap to collect and cheap to store, it's pretty expensive to actually leverage. So we need to change that, and maybe small data is pretty good too. You can make very good decisions on less data, but much better data.

Hopefully, that's one of the aspects where we don't have a choice. We're going to see it anyway, because if we don't embrace those concepts, we're going to be forced to anyway. One clear example of that: you mentioned you are using an ad blocker, and I do too. If you look at the general population, depending on the source, it might be 30 percent of the general population that uses ad blockers. Recently I saw 50 percent in the States; I think that would be pretty high, but let's say between 30 and 50 percent of the general population use an ad blocker. Maybe sometimes they don't even know; it's just part of their system, or they use a browser that blocks third-party cookies already, and stuff like that.

When I do conferences... I remember once I had about a thousand people in the room, and they were largely marketers and people in analytics and stuff like that. And I asked them if they were using an ad blocker. It's not super scientific, I didn't count them, but I can tell you 70 to 80 percent of the hands went up. So right there, to me, there's something broken. If, let's say, 30 percent of the general population use an ad blocker, but when you ask marketers, 70 percent of them use one, what's wrong? Does it make sense? And we are in this industry, and we both use an ad blocker. So to me, it's a really good example that something is broken. We need to fix that.

479

:

Justin Norris: So knowing what we know, we choose not to be tracked. There is a hypocrisy in continuing to try to do that to other people. So your point of view is that through permission-based models and alternative models, you ultimately could have, let's say, an equal if not greater level of efficacy in your marketing. Or do you think there is a hard point at which there's that devil-on-one-shoulder, angel-on-the-other-shoulder moment where it's, am I going to make more money, or am I going to do the right thing? Do you feel that marketers have to make that choice, or can they ultimately have their cake and eat it too?

Stephane Hamel: Yeah. One of the arguments I hear very often is: as a company, if I don't do it, my competitors will, so I don't have a choice; I need to continue to play the game. But as I mentioned, we're in a transition phase. In the long run, as a consumer, I have a choice between doing business with a company that is reputed to have had three data leaks over the past two years and an alternative company that has a really strong reputation for privacy and how they do marketing. Eventually the consumer is going to win, and they will go with the one they trust the most. There is no way around that. I won't mention the company, but I saw one that had two data leaks in the same year, and not small ones. If I'm just a general consumer, maybe I'm not super educated, but I understand what a data leak is. I understand that, okay, they didn't take care of the data about their customers. Am I going to go with them, or maybe the alternative? Just like there are consumers who really don't care that a company is polluting the environment and has poor practices, there are consumers who will be very conscious about selecting products that are more environmentally friendly. The same thing is going to happen. It's a value proposition.

Justin Norris: Almost like another branch of corporate social responsibility. And we do see it with environmentalism, or even in the fashion industry. I just noticed, for example, at H&M, someone had bought something and they were promoting that it involved no sweatshop labor, or something like that. Companies align with these issues not necessarily because they care deeply, maybe they don't, but at least they see it as profitable to align themselves with those issues.

Stephane Hamel: In the long run, we see the difference with fast fashion, where it's so cheap because they are exploiting people. It still sells, but in Europe they actually enacted a law against the fast fashion trend to limit the impact of those things. Similarly, in Canada right now, we see that PIPEDA is over 20 years old, and Bill C-27, which would basically be the new version of the privacy law, is gradually and slowly coming back. In the States, there is a lot of action going on in each of the states, and even at the federal level, to get better privacy regulation. We're at the stage where there's a lot of lobbying too, of course, so we'll see how it goes. But I think for sure it cannot go backward to the open bar, the free-for-all where you do anything you want, collect everything you want, and sell it, share it, and merge it with anything else. That won't come back, and I think that's a good thing.

Justin Norris: If we think about the major players in this industry: I used the term "the advertising industrial complex" in my post the other day, sort of half tongue-in-cheek. But look at Google's ad revenue last year, over 237 billion dollars, and Facebook's ad revenue, over 100 billion dollars. These are not small numbers. Do those sorts of corporate and institutional forces have a role in stopping progress, or in taking apparent progress and redirecting it in ways that ultimately still allow them to do what they do and benefit from it?

Stephane Hamel: Yeah. If we look at Google, for example, it's funny, because they postponed it again: the famous death of third-party cookies, or what some people call the cookie apocalypse. Third-party cookies are essentially what the ad network industry has been relying on to track people across different websites, and if you cannot use them, that's much harder to do. So Google says, oh, we care about your privacy, and so on, yet they've postponed the limiting of third-party cookies for so long. When you look at other browsers, be it Firefox, be it Safari, or Brave, which I'm using, they all block third-party cookies already, and Chrome, the browser from Google, is one of the last to do it. And Google still continues to say, oh, we care about your privacy, we're going to stop enabling third-party cookies. But they postponed it again because their alternative either didn't pass the privacy test or is already being challenged in court, or at risk of being challenged in court in Europe, because the alternative to third-party cookies is actually potentially more invasive. So again, it's a discourse of: we care about your privacy, but we're going to hold back until we find another way that will pass the legal test, and then we're going to block third-party cookies. Why don't they do it right away, if they really believe in privacy? Most of the other browsers are already limiting third-party cookies. They won't sacrifice hundreds of millions or billions of dollars in revenue by doing something that would just cut that stream of revenue. They want to find an alternative, and sadly the alternative might involve hashing your email or your phone number and combining it with fingerprinting, being able to identify the unique characteristics of your device. People got used to clearing their cookies, but changing your email address, your phone, or your device is much more difficult. That's why I'm saying the alternative to third-party cookies might actually be worse: you won't change your email all the time.
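Why a hashed email is so much stickier than a cookie can be shown in a few lines. This is a minimal sketch for illustration only; the normalization step is an assumption, not any specific ad platform's actual scheme.

```python
import hashlib

def hashed_identifier(email: str) -> str:
    """Illustrative hashed-email identifier: normalize, then SHA-256.

    Unlike a cookie, the same address produces the same value on every
    browser and device, and clearing cookies does nothing to reset it.
    """
    normalized = email.strip().lower()  # toy normalization; real schemes vary
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same person in different sessions still yields one stable identifier:
a = hashed_identifier("User@Example.com")
b = hashed_identifier("  user@example.com ")
print(a == b)  # prints True
```

The hash hides the raw address, but as an identifier it persists for as long as the person keeps the email, which is exactly the concern raised here.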

Justin Norris: The cure worse than the disease, in that case. So the whole debate, in essence, is kind of moot. Coming back to the idea of the letter of the law and the spirit of the law, it's like: all right, we won't do this one thing, but then we'll use local storage, or we'll use, like you said, some hash of the device. Is this whole cookie apocalypse then just kind of a tempest in a teapot? Do marketers even really need to think about it, if ultimately they're not going to be affected?

Stephane Hamel: Yeah. Let's look at it from each perspective. For the consumer, it doesn't change anything; the web is still going to work. For a marketer, it will change the ability to do retargeting, for example. But it's an opportunity to look at, as I mentioned, your marketing stack, see which tools are more effective, and ask whether you really need all of those tools you subscribed to at 19 bucks a month and forgot about, right? So that's for marketers. For agencies, it's pretty sad to say, but some agencies are playing the card of talking about the cookie apocalypse as if it were absolutely terrible: your business won't survive if you don't pay us to fix it. I've seen agencies use this kind of language almost to spread some FUD, fear, uncertainty, and doubt, to clients who really are not experts in this. And then the last layer is the ad tech and martech industry itself. Some of them are probably not going to survive; others are going to try to circumvent the loss of third-party cookies by using something else, as I mentioned, like fingerprinting; and others will really innovate and find better ways of offering their services and solutions. Just like seatbelts went from a single lap strap to adding a strap across the chest, that's innovation. So hopefully we're going to see some of the players embracing more privacy and improving their products. It's an opportunity to innovate or die.

Justin Norris: Let's talk a little bit about AI, obviously a huge sea change in the last year and a half, with privacy risks and perils that really haven't even been fully identified, or that most people aren't aware of: from companies dumping their whole corporate strategy into ChatGPT without really thinking about where that goes, to maybe the ability of AI to profile us in different ways. I don't know, there's all sorts of things. How are you thinking about this? What worries you? Are there maybe benefits I haven't considered? What's your point of view?

Stephane Hamel: Yeah. I think there's a distinction to make between the generally publicly available gen AI, like ChatGPT, where, as you mentioned, you shouldn't go and put all your strategy (though there's now an option to say, don't use what I'm telling you to continue training your model), and other AI models that you can enable within your own environment, in a secure way. It can be amazingly powerful. It can be useful. And again, that's another thing where there's no way back; the genie is out of the bottle, that's for sure.

And it's interesting, because when ChatGPT came out, I was teaching a marketing class to an executive MBA, basically, with about 30 managers in the class. That was in December 2022, and we started talking about it and said, well, let's give it a try, let's ask marketing questions and see if we could use it not for basic stuff like, okay, improve my text, but to do market research and generate new ideas. Those 30 managers had never touched ChatGPT before, and they were like kids in a candy store. They were so amazed and excited, seeing the possibilities.

So following that, in January 2023, I said in my class: okay, use it, go ahead. It won't be one of those classes or schools where the university says, don't use it, it's cheating, it's plagiarism. I said, no, go the other way. Use it; just let me know how you use it. The only thing I ask is that you be transparent about it. And what I see is that there are students who are afraid of it and don't know how to use it, and others who are embracing it, and the distinction is very clear. I'm afraid there's going to be a literacy gap between those who know how to leverage these tools and those who don't, and that's going to make a difference in the workforce, that's for sure, because the quality of the results is not the same.

On the other hand, my expectations of the students have increased. My exam questions are more difficult, because I know they can use ChatGPT to answer. And sometimes I will ask questions that I know cannot be answered by Google, ChatGPT, or any other tool. So my job is a little bit more difficult, but I use ChatGPT to make my job a little bit easier at the same time. That's true of my role as a teacher, but it's also true in consulting, and it's true for marketers. I'm also giving a short course on data science in marketing, and I use it to generate Python code. I do just so many things with it. It's amazing. I wish I had those tools when I was younger.

Justin Norris: I know what you mean, though, about it being a skill set, because I had another interview recently with someone using an automation tool that let you leverage AI within it for an outbound sales use case. And yes, you can use it to do a lot of things, but consider the amount of thinking that had to go into saying, now go to this website, look at this, and answer this specific question. You actually have to deconstruct a thought process and then encode it, and that is a skill that person had developed to a very high...

Stephane Hamel: But that's the point. If you don't know what you're talking about, you won't be able to leverage those tools. You will ask very basic questions, you will get crappy answers, and you will end up saying, oh, gen AI doesn't work. But that's not true. It works, but you first need to know which questions you want to ask, and in order to know that, you need to understand the discipline, the domain, the expertise of what you're talking about. Otherwise it doesn't work. And we're going to see people using it either to save time or to increase quality. Hopefully we're going to see more people using it to improve the quality of the end result, not just to save time, because you can save time but get crappy results.

Justin Norris: Do bad stuff faster. Bad work, faster. Maybe the last question, Stephane; I'm just curious on a personal basis. You've been transparent and have written on a few occasions on LinkedIn about the career impact you've seen from being privacy-first, which in some cases has maybe limited some of your professional prospects. Can you talk about that a little bit? And how would you advise other marketers who are maybe a bit earlier in their career and can even less afford those impacts?

Stephane Hamel: Yeah, it was a combination of multiple things at the same time. One was having been a freelancer for so long, on the consulting side. And I'm over 55 now, and I had that crazy idea that maybe, with the amount of expertise and experience I have, I could go and offer my expertise to, say, a startup in privacy tech or something like that. So I faced multiple issues: ageism; the fact that working remote is sometimes a barrier; the fact that going from the consulting world to the actual client side is a challenge when you've done it for so many years. And then, on top of that, there's the aspect of being more privacy-conscious. What I realized, while I was super busy coaching agencies and working with clients directly, is that an agency might be a little bit shy of working with me, because once I open the engine and look at what's going on behind it, I might reveal some stuff they would rather not see. The same is true for companies. So maybe I've cornered myself a little bit.

But at the same time, I see that starting to change now. I'm working with clients who want to embrace a more privacy-aware, privacy-conscious approach, not just because of the legal aspect, but because they feel it's the right thing to do. It's a strategy, right? So this is starting to change. And even if I have, I think, a fair amount of credibility in the field, I hope, I enrolled myself in a New York University certificate in information privacy, just to confirm my knowledge first and also to get the kind of stamp of credential I felt I needed. And in those courses, I see that people are coming from all kinds of backgrounds. So I expect there's going to be an increase in market demand for people in the field of information privacy, which is different from the security standpoint, security being really the hardware and network aspect, but with more knowledge of information privacy. So I think there are great opportunities if you combine marketing, legal, and at least a minimum understanding of the technical aspect. I think there is a great future.

Justin Norris: That's amazing. Well, I think we have to wrap now, but I just want to say I really appreciate how outspoken you are on this issue and your commitment to it, even when it's not necessarily easy or popular. I think it's an important discussion that has to be had, and it's been really interesting digging into this and exploring it with you. So thank you so much for being on the show.

Stephane Hamel: Thank you, Justin, for having me.
