Complacency in the Loop
Episode 67 • 9th February 2026 • Great Security Debate


Shownotes

AI is growing in use within information security, but are we ready to trust it to do all the things we hope it can, and do so automatically without doing harm? Context is king, and training to that level is only possible when you give all your experience to the AI. What are the tradeoffs to doing so? What happens when we depend on AI and forget (or worse, never learn) the underpinnings of what makes the AI system work (remember the calculator debates of the 1980s?). And does the end justify the means when it comes to AI use? And what is that “ends” anyway? Efficiency, automation, knowledge? Erik, Dan, and Brian discuss it all in this week’s Great Security Debate!

Show Notes

  1. Report on reasons Israel didn’t catch the October 7 attacks - https://www.npr.org/2025/03/05/nx-s1-5318591/israel-shin-bet-security-failure-october-7-attack
  2. Shin Bet Report - Source Doc (Hebrew) - https://www.documentcloud.org/documents/25551448-yqry-tkhqyr-shyrvt-hbytkhvn-hklly-710/#document/p1
  3. Waymo hits student on bicycle - https://www.theverge.com/2024/2/7/24065063/waymo-driverless-car-strikes-bicyclist-san-francisco-injuries
  4. Waymo violates school bus rules - https://www.cbsnews.com/news/waymo-recall-3000-vehicles-software-school-bus/
  5. Podcast Recommendation - Agentic Dan - https://distillingsecurity.com/episode-64-agentic-dan/
  6. TV Recommendation - Pluribus - https://tv.apple.com/us/show/pluribus/umc.cmc.37axgovs2yozlyh3c2cmwzlza
  7. Trust Issues in AI -
  8. “I bought this Tesla before Elon went crazy” magnet - https://geni.us/DXQYk
  9. OpenAI adds ads - https://apnews.com/article/chatgpt-ads-openai-advertising-83812a066375a805fa2e29b28fc77da1
  10. Satya Nadella AI Internal Memo - https://africa.businessinsider.com/news/nadellas-message-to-microsoft-execs-get-on-board-with-the-ai-grind-or-get-out/sq0fe52
  11. FDA AI Rules - https://www.fda.gov/regulatory-information/search-fda-guidance-documents/considerations-use-artificial-intelligence-support-regulatory-decision-making-drug-and-biological
  12. Utah AI Prescriptions - https://www.politico.com/news/2026/01/06/artificial-intelligence-prescribing-medications-utah-00709122
  13. Movie Recommendation: Terminator - https://geni.us/59025
  14. Book Recommendation: The Cuckoo's Egg - https://geni.us/hYE9
  15. Book Recommendation: AI 2041 - https://geni.us/5dtV54h
  16. Anthropic Super Bowl Ad - Scott Galloway on why Anthropic's Super Bowl ad got to Sam ... - Fortune
  17. China facial recognition payments - https://www.chowhound.com/2073279/grocery-store-facial-recognition-china-smile-to-pay/
  18. Movie Recommendation: Sneakers - https://geni.us/P7SB
  19. The Dawn of the Post Literate Society - https://jmarriott.substack.com/p/the-dawn-of-the-post-literate-society-aa1
  20. Leading Causes of Death in the US, 2023 - https://www.cdc.gov/nchs/fastats/leading-causes-of-death.htm
  21. Automated car sex in backseat - https://dailydot.com/driverless-car-sex-autonomous
  22. Podcast Recommendation: The Final Act - https://distillingsecurity.com/new-podcast-coming-soon-the-final-act/
  23. Calculators and Children - https://www.linkedin.com/pulse/crunching-numbers-debate-over-calculator-use-math-education-church-sg6te/
  24. Podcast Recommendation: Mentorcore - https://distillingsecurity.com/tag/mentorcore/

Some of the links in the show notes contain affiliate links that may earn a commission should you choose to make a purchase using these links. Using these links supports The Great Security Debate and Distilling Security, so we appreciate it when you use them. We do not make our recommendations based on the availability or benefits of these affiliate links.

Transcripts

Speaker A:

Welcome to the great Security Debate.

Speaker A:

This show has experts taking sides to help broaden understanding of a topic.

Speaker A:

Therefore, it's safe to say that the views expressed are not necessarily those of the people we work with or for.

Speaker A:

Heck, they may not even represent our own views as we take a position for the sake of the debate.

Speaker A:

Our website is greatsecuritydebate.net and you can contact us via email at feedback@greatsecuritydebate.net or on Twitter.

Speaker A:

Twitter, @SecurityDebate.

Speaker A:

Now let's join the debate already in progress.

Speaker B:

You sure you're not in product sales?

Speaker B:

It now has AI.

Speaker A:

It all has AI.

Speaker B:

Think about that.

Speaker B:

Go back, go back to.

Speaker B:

You were just walking us through like the change in acronyms and a WAF, because now it can do something else.

Speaker B:

Isn't that just indicative of our field and the haves and have nots?

Speaker B:

Like it's still a WAF.

Speaker B:

It's just you've got companies that refuse to evolve and become something more to address evolving technology.

Speaker B:

So, oh well, you're just a WAF.

Speaker B:

Now I need WAF plus plus.

Speaker C:

Yeah, yeah, yeah.

Speaker C:

Does your WAF do Clippy too?

Speaker B:

Because I'm getting so sick of all the acronyms.

Speaker B:

That's.

Speaker C:

Yeah, because a lot of it is like the idea of well, we're using AI, so it does this.

Speaker C:

So now it's automated this.

Speaker C:

So we can't just call it next-gen WAF, we have to call it X.

Speaker C:

But it's interesting because like you say, shift left security as code, but now you have AI writing code and now you have a solution that's looking at the code that's also AI, that's an agent and it's like the human in the loop part.

Speaker C:

So I don't want to get down this path, but I was reading like a 15-page article on what happened in the failure of Israel to detect the Hamas October 7 attack.

Speaker C:

And basically a good portion of it was the fact that the human in the loop part had been removed, that they had so many automated systems that when they detected, multiple times, these large gatherings of

Speaker C:

troops or people in the area who then left, it didn't consider it a threat or a probe.

Speaker C:

Right.

Speaker C:

And that they had disbanded this team that was responsible for more of that human in the loop, and that as they analyzed chatter and took all these things into account, video documentation, GPS tracking, the human assets, where they would also depend on people in the neighborhoods, right.

Speaker C:

That can give them real-time information of, hey, there's a lot of people gathering at this mosque at this time of the day, which is not a time of prayer.

Speaker B:

Right.

Speaker C:

So this is a direct indication of a meeting.

Speaker C:

Right.

Speaker C:

And the lead up like as they went through all the indicators that they had after the fact, they were like, had we had a human in the loop, yes, we would have seen this coming.

Speaker C:

But we had removed so many of those processes because we became so good at using technology for tracking people, for movements, for going through audio of conversations and all of this, that the human in the loop missed this because we removed so much of that.

Speaker B:

This is, but this is, this is the horrible conditioning that we've seen in the marketplace.

Speaker C:

Right.

Speaker B:

That if you think about anytime we have a technical advancement, what do we do right away?

Speaker B:

Well, how many people can it remove?

Speaker B:

Because we right away go to.

Speaker B:

Oh, it's got to automate the whole process.

Speaker B:

And every time we do that, what do we do?

Speaker B:

We lull ourselves into this sense of complacency and comfortability that it's handling everything.

Speaker B:

Context matters, especially in our world.

Speaker B:

Context really matters.

Speaker A:

But none of this is new.

Speaker A:

I come back, I'm looking at 25 years ago, when we started putting SIEMs in and looking at, you know, you start taking signals in and.

Speaker A:

But you knew that all the signals included.

Speaker A:

I was in banking at the time.

Speaker A:

You knew that the signals included all of the pre existing fraud or all the preexisting bad behavior.

Speaker A:

So those, those behaviors were baked into the norm.

Speaker A:

So now when we start to get into heuristics, this is also a move from signatures to heuristics.

Speaker A:

So you start looking at, at trends.

Speaker A:

The normal includes bad things.

Speaker A:

So now how do you discern the baked-in bad things from the new bad things, the real bad things, the actual.

Speaker A:

To be able to discern the noise from the signal.

Speaker A:

And so this is, this is not a new problem.

Speaker A:

It's just a different scale, or re-stamped with a different title.

Speaker C:

Yeah.

Speaker C:

And you could take the same thing for that.

Speaker C:

Oh, what's the car company? Autonomous cars out in California, paired with General Motors at one point.

Speaker A:

Waymo.

Speaker C:

Come on, Brian, why do I draw blanks?

Speaker A:

Waymo.

Speaker C:

Waymo.

Speaker C:

So you know, Waymo operated in a very.

Speaker C:

In San Francisco, which has always been a test bed, because that 0 to 25 miles an hour is very important versus highway driving, and the ability for all the different factors that come into play that are kind of outside the realm.

Speaker C:

But it's very much city driving.

Speaker C:

Right.

Speaker C:

And then last week you had the student hit on a bicycle, right.

Speaker C:

Which was in a residential.

Speaker C:

And this is new territory for Waymo, where they had started operating in more residential areas.

Speaker C:

And yes, you have, you know, like it didn't recognize the idea to stop for the school bus, right?

Speaker C:

Like it drove right around the school bus, and the school bus had stopped so the kids could walk in front of it, right.

Speaker C:

As they're dropping them off.

Speaker C:

And this is outside its realm.

Speaker C:

They don't have that right.

Speaker C:

Or those scenarios.

Speaker C:

Right.

Speaker C:

For it to learn from.

Speaker C:

But even more so was what they referred to again as the human in the loop.

Speaker C:

Like from learned experiences.

Speaker C:

What do people know when they're in a school zone?

Speaker C:

That there's probably kids around here, right?

Speaker C:

Like, you know, there's kids somewhere that could dart out from behind a tree, right.

Speaker C:

Or just ride their bike across the street.

Speaker C:

Or when you're coming around a corner, think about the fact there might be a kid because that's the school right there, right.

Speaker C:

The car wasn't conditioned for that.

Speaker C:

It was conditioned for city driving, right.

Speaker C:

Like make your way around the bike or.

Speaker C:

Right.

Speaker C:

Because there's.

Speaker C:

They're everywhere.

Speaker C:

And lo and behold, a small child got hit.

Speaker C:

And that's what some of the parents were saying.

Speaker C:

Like the.

Speaker C:

The vehicles weren't conditioned for this environment, they were conditioned for a city environment.

Speaker C:

And how could Waymo miss that, right?

Speaker C:

And where was the human in the loop part to say, hey.

Speaker C:

And again, it comes back to.

Speaker C:

You can automate it as much as you want, but data matters, right?

Speaker C:

Conditions matter, scenarios matter totally.

Speaker C:

And environments matter.

Speaker C:

And if you became very comfortable in one environment and you go from city to.

Speaker C:

To suburban life, it's a completely different environment.

Speaker A:

We know all of this time, you know, tying this back into security, this is.

Speaker A:

We're unfortunately retreading the.

Speaker A:

The episode on.

Speaker A:

On AI Agentic Dan.

Speaker A:

He has the title of which I am Dan F. But no, baby.

Speaker A:

The idea that, that, that context.

Speaker A:

The context.

Speaker A:

Unless you can train on the brain experiences, none of it's going to be able to unless you can get a true hive mind, which in theory makes good sense, is very interesting.

Speaker A:

But the, the downsides I think are quite monumental.

Speaker A:

By the way, watch the series show Pluribus if you haven't.

Speaker A:

I highly recommend it.

Speaker A:

But.

Speaker A:

But the, the implications of.

Speaker A:

The implications of having that.

Speaker A:

That I guess expecting that something can operate without all the inputs or with only a subset of inputs.

Speaker A:

So it's good enough.

Speaker A:

It's like a high school senior.

Speaker A:

I know enough to be dangerous, but I don't know enough to, like, really do the job.

Speaker B:

This is so fascinating to me though I go back to your comment, Dan, that we've had this conversation before, right.

Speaker B:

That this isn't anything new.

Speaker B:

So if you pull, pull granted.

Speaker A:

It's only been three episodes.

Speaker A:

It's only been three episodes though, Erik.

Speaker A:

So it's not that long ago.

Speaker B:

AI out of the mix.

Speaker B:

And how much does this conversation look like when we were all debating insource versus outsourced SOC, and those that wanted to insource everything made the argument that we can't outsource it to some company that doesn't have any context, doesn't understand our organization, because they're not going to know what's going on.

Speaker B:

We're having the exact same conversation.

Speaker B:

We're just inserting a technology instead of people that work for a different company.

Speaker A:

Right.

Speaker A:

I still have that argu.

Speaker A:

I still have that conversation.

Speaker A:

I still, I'm still a build it Inside kind of guy, but.

Speaker A:

Yeah.

Speaker B:

Yeah.

Speaker B:

Proponent of hybrid declaring how we.

Speaker B:

Our view on this.

Speaker A:

Thanks, fence sitter.

Speaker B:

You gotta give me the Mr. Miyagi now.

Speaker B:

Left side good.

Speaker B:

Right side good.

Speaker A:

Also good.

Speaker A:

Yeah.

Speaker A:

I mean, at the, at the end of the day, at the end of the day, you know, there is a.

Speaker A:

There's danger in rushing into everything.

Speaker A:

And I. Oh, I'm.

Speaker A:

I want to be clear because I've actually had this epiphany lately.

Speaker A:

My concerns with evolution of technology in this way are not about the technology.

Speaker A:

My.

Speaker A:

It's funny and I'm a cynic.

Speaker A:

You all know my.

Speaker A:

I'm a cynic beyond cynic.

Speaker A:

But my concerns here are not about what the technology itself will do or turn into.

Speaker A:

It's what the humans that build the technology, the other things they will use the information for, in a way

Speaker A:

That is not what they claim this is.

Speaker A:

My issues are trust based.

Speaker A:

They are not about the technology itself.

Speaker B:

Why do you drive a Doge car?

Speaker A:

You have to read the, you have to read the, the.

Speaker A:

The magnet on the back says I bought.

Speaker A:

I bought it before I knew.

Speaker A:

Before he was this big of a crazy nut.

Speaker B:

I digress.

Speaker A:

But the, I mean the, the.

Speaker A:

It's not about the technology.

Speaker A:

I, I like the technology.

Speaker A:

I just do fear what's going to be.

Speaker A:

You know, what the other ways that it can be used, you know, we get a hive mind together.

Speaker A:

We get all this collection of data.

Speaker A:

Now how does it get used against you?

Speaker A:

How does it get used for other purposes?

Speaker A:

How does it become more ads?

Speaker A:

You know, all of these kinds of things and so, so I, you know I've been really leaning into isolated versions of, of, of some of this stuff and really quite enjoy the capabilities.

Speaker A:

I think that the tech is really coming along great, but it's also moving, like, blazingly fast.

Speaker A:

I want you all to remember back to:

Speaker A:

It worked, it was great, great features, but it was not, it was beta.

Speaker A:

It was even like alpha beta.

Speaker A:

And I think we're seeing that a lot here.

Speaker A:

This stuff is great, but you gotta take it with a grain of salt.

Speaker A:

The fact that we're even debating humans not needing to be in the loop at this moment, at this, this maturity.

Speaker A:

I think that's the idiocy, not the tech itself because the tech is, is got some great forward looking capabilities.

Speaker A:

But to put this much trust, and you have to take the Satya Nadella and the what's-his-nuts and the other what's-his-nuts and all the people who are doing that, who are.

Speaker A:

Who their motivations are financial.

Speaker A:

Yeah.

Speaker C:

Lot of nuts.

Speaker A:

That, that one guy who said no, that's, that's 67.

Speaker A:

No, no.

Speaker A:

There's Zucker, there's the Zuckerberg of AI, and then there's the other one, and then there's Satya Nadella, who seems to think that AI is the only answer because he's so far into it that if he backs out he'll look bad.

Speaker A:

But I mean these, the.

Speaker A:

You take them out of the equation because they have other motivations.

Speaker A:

Why are we all moving so fast?

Speaker A:

Why? We should be looking at it.

Speaker A:

Why are we al.

Speaker A:

Why do we really believe that it's ready?

Speaker A:

I think it helps us.

Speaker A:

I think we use it to inform but I don't think we're ready especially in an industry like security.

Speaker A:

I'm glad that the health industry is moving forward with it cautiously but has put barriers on using it for full automation.

Speaker A:

Maybe we talked about this in the other episode, but the fact that Utah is letting AI do automated prescriptions I think is a bridge too far.

Speaker A:

I'm not familiar with the guardrails on it, but, like, at its surface maybe that's starting to border on, you know,

Speaker A:

On dangerous ground.

Speaker A:

But like I, I think use it.

Speaker A:

We should use it.

Speaker A:

We should use it in isolated ways to deal with the trust issue, and we should deal with it.

Speaker A:

We should use it in a controlled way and not let it just start taking over the world.

Speaker A:

Not in the.

Speaker A:

Not in the Terminator way, but I mean, you know, just doing its own thing and being, you know, trusting it to no end until we have a better sense or it's a little bit, quite a bit more mature.

Speaker C:

You know, it's funny you talk about, because we were talking about like API security, WAF security, etc.

Speaker C:

You go back and look at open source technologies, the word dependencies, right?

Speaker C:

Like, where are these dependencies built in?

Speaker C:

And we have used the statement before, like, your parents' luxuries become your necessities, right?

Speaker C:

Like that timeline is even shortened even more.

Speaker C:

It's like, well, that luxury I had five months ago has now become my necessity, right?

Speaker C:

Like, like the time span of like, yes, we had one little TV in the house now, and that was a black and white with six channels.

Speaker C:

And by the way, the channels were free back then, right?

Speaker C:

Where now it's like, I have five subscriptions.

Speaker C:

I paid $200 a month for this, plus I had to add this.

Speaker C:

And by the way, Disney Plus just increased another 100 bucks.

Speaker C:

But my kids love it, so I added that.

Speaker C:

Like, whoa.

Speaker C:

Like, where I'm going with this is why does even cyber security exist, right?

Speaker C:

Like, there was this really cool thing called the Internet that was created as a way to communicate and share information from university to university.

Speaker C:

And it was going to help research and change the world and fight disease.

Speaker C:

And before you know it, like, that's why the book the Cuckoo's Egg was so great, because it was written about something so long ago that was literally just using that telco connection to say, hey, how far can I reach and how far can I go and how deep can I get in?

Speaker C:

Then what can I learn?

Speaker C:

And it was kind of that creativity, right?

Speaker C:

But this connected sense.

Speaker C:

And then, yes, I'm.

Speaker C:

I'm going very wide and broad back there to where I'm going with this.

Speaker C:

But now how far is the technology going?

Speaker C:

And you talk about human in the loop.

Speaker C:

It was always humans in the loop and curiosity of using the technology to do things, sometimes not great things, which is why security existed.

Speaker C:

But then you talk about that data, right?

Speaker C:

There's this book, AI 2041, about what AI will look like in the year 2041.

Speaker C:

But it was written about two years ago, and they chose AI 2041 because 41 kind of looks like AI.

Speaker C:

So it's like, AI, AI, but these are practitioners.

Speaker C:

And I, I don't want to refer to everyone as Experts, but people that have spent a long time working on large language models, data analytics, et cetera, that then worked with somebody in kind of the creative space to say, here's four or five examples of where we see it going, and here's how that technology would work today and how it would work then.

Speaker C:

Right?

Speaker C:

But they're like the ability to do some of this video creation, to pretend to be.

Speaker C:

To create a video of yourself and someone else.

Speaker C:

Like, go back to:

Speaker C:

And it was a fake, but it was really good.

Speaker C:

Right.

Speaker C:

And at that time, all you need is someone to watch it one time, right?

Speaker C:

And if they know nothing about technology or how that could have been done, they believe it and they share it a thousand times over.

Speaker C:

And before you could even say in 24 hours that, oh, that wasn't him, the damage is already done.

Speaker A:

Right?

Speaker C:

Right.

Speaker C:

And you can't take that back once the image is put into somebody's head, Right?

Speaker C:

So it really gave people a snapshot.

Speaker C:

And the book talks specifically about that, but then takes it to a whole other level where people are now wearing around fake masks, right?

Speaker C:

Because they're trying to hide their identity.

Speaker C:

But there's all these camera systems that can detect a fake mask, and it's illegal to wear the fake mask, Right?

Speaker C:

So you get these companies that are creating really, really good ones, and then you got to have alternate IDs for everywhere you go, because how you go through a train station and all of this.

Speaker C:

So you look into the vehicle technology, because I started with that simple waymo example of human in the loop and not knowing their environment.

Speaker C:

Right.

Speaker C:

And the automotive companies very much want to get into that data game, because that's why the cell phone's so valuable, because of the metadata and the tracking.

Speaker C:

Right.

Speaker C:

Why is Palantir worth what they are?

Speaker C:

Right.

Speaker C:

What do they have their hands into all over, right.

Speaker C:

Is all of that metadata, and if you're that vehicle.

Speaker C:

So, like, we talk about wanting to just get away and get out into the woods, Dan.

Speaker C:

Like, I just want to go up north where there's no cameras or anything.

Speaker C:

But I drove my car there, right?

Speaker C:

And my car is this new Tesla.

Speaker C:

It's tracked everywhere I went, and it's got six cameras on the thing, videotaping everything around it.

Speaker C:

That data, that's.

Speaker C:

That's the.

Speaker C:

When people say data, you know, is the new gold, or the new oil, or the new silver based on silver pricing, insert comment.

Speaker C:

That value, like.

Speaker C:

Well, they say, I don't know what the value of it is.

Speaker C:

Yeah, you do.

Speaker C:

Because if you have a lot of it, and a lot of it on people, where they go, what they do, and then you're saying you're going to add these agentic AI agents that just make decisions and will take over this task for this human and this task for this human.

Speaker C:

All of that was us like, like we empowered it by teaching it everything we do to say could it do it better, but can it do it better?

Speaker C:

And people will argue this.

Speaker C:

Like we jokingly, Dan said that, you know, AI was just a really creative

Speaker A:

search, really big search engine.

Speaker A:

Yeah, yeah.

Speaker C:

Which it can be, right?

Speaker C:

But then you just had OpenAI decide to start dropping ads into it.

Speaker C:

Which then was it Anthropic or Claude that came out with that great ad?

Speaker C:

I mean, it is.

Speaker A:

I don't think, I don't think it actually comes out until Sunday.

Speaker A:

But.

Speaker A:

Yeah, but when, but when this episode airs.

Speaker A:

Yes.

Speaker A:

If you watched it during the Super Bowl, I believe there will be an Anthropic ad that caused a whiny.

Speaker A:

By the way, what's his nuts.

Speaker A:

One was Sam Altman, that's the name I was looking for.

Speaker A:

Which resulted in a very whiny post on Twitter about, about ads, etc.

Speaker C:

So, so where my ask here is like, you have compliance requirements, right?

Speaker C:

HIPAA, etc.

Speaker C:

Like, you have this medical data, but all this other data which we've called metadata for a long time.

Speaker C:

But it's my data, right?

Speaker C:

And like if in order to have my car get updates, right, for all the software you put in it now, I have to say I, I agree with you.

Speaker C:

Tracking everything I do just in order to get this update that I might need.

Speaker C:

Like, it kind of feels like you got my arm twisted behind my back, right?

Speaker C:

It's like, I know you ordered the pizza.

Speaker C:

I need all this data or else we don't put the pepperonis on it.

Speaker C:

It's just how it's going to go this time, right?

Speaker C:

That's an old pepperoni that you ordered.

Speaker C:

We don't have it anymore.

Speaker C:

You want the new one?

Speaker C:

I need to know everything first, child.

Speaker C:

Give me all the info, right?

Speaker C:

It's like I just wanted the pepper.

Speaker C:

I just wanted tires on the car to work.

Speaker C:

But you have an ECU in it in the braking system that needs an update.

Speaker C:

I just didn't want to die.

Speaker C:

So you really need to know everything about me.

Speaker C:

Where I go, how fast I go, right?

Speaker C:

How I took this corner at 35 and where I was at this time of the day when I drove by this person who might have committed a crime but had nothing to do with me.

Speaker A:

Believe me, I do contemplate the idea of re.

Speaker A:

Of getting my, you know,:

Speaker A:

The computer in it was barely enough to power the power the screen that showed the.

Speaker A:

The radio station.

Speaker C:

But.

Speaker C:

And this is why security in cyber.

Speaker C:

Because now the idea that, oh, well, now I can use my vehicle for making transactions, right?

Speaker C:

And I remember thinking that this was a really cool idea at CES, like 10 years ago when I worked in ADAS systems.

Speaker C:

Like, oh, that'd be cool.

Speaker C:

I just pull up and it already knows and I just fill up with,

Speaker A:

oh, like the Taco.

Speaker A:

The Taco Bell app has it.

Speaker A:

Yeah, I remember some mentions of that.

Speaker A:

That never really took off, did it?

Speaker C:

No.

Speaker C:

And a really bad idea, right?

Speaker C:

Like, why are you telling me?

Speaker C:

Like, like, but, but to the consumer, it's like, oh, man, that'd be so great, right?

Speaker C:

If I didn't have to get my wallet out and touch that and the handle of the gas thing, right?

Speaker C:

I just pull up and I'm filling it up, right?

Speaker C:

And it charges, right?

Speaker C:

And you know, maybe plays an ad or two and I'm like, yeah, cool, I'll buy that song.

Speaker C:

I like it, right?

Speaker A:

Like consumerism.

Speaker C:

Wherever money's transacted, right?

Speaker C:

Crime exists, right?

Speaker C:

So I always say, if you can eradicate crime 100%, then great, cool.

Speaker C:

Let's connect all these systems and pay for everything everywhere we go just by walking by, right?

Speaker A:

Well, to be fair, to be fair, that's exactly what happens in China.

Speaker A:

I mean, China's got a facial recognition payment system that is.

Speaker A:

It's amazing.

Speaker A:

It is an amazing.

Speaker A:

You literally walk in any.

Speaker A:

Any store, you put your face on the screen, you're up to the screen and you're paid.

Speaker A:

It goes against Alipay or WeChat Pay or one of the.

Speaker A:

One of the services.

Speaker A:

But like, this is.

Speaker A:

This is this.

Speaker C:

It's it.

Speaker A:

You.

Speaker C:

It.

Speaker A:

It embodies the idea of, you know, you are.

Speaker A:

Your identity is, you know, my what?

Speaker A:

My voice is my passport.

Speaker A:

Verify me. From Sneakers.

Speaker A:

You know, my face is my passport and I.

Speaker A:

It can get me to anything.

Speaker A:

It already is your passport in many countries.

Speaker B:

But to me, though, that's just.

Speaker B:

That's the hallmark of consumerism, right, though it's exactly consumerism.

Speaker A:

Well, it's consumerism and it's.

Speaker A:

It's capitalism.

Speaker A:

I think the two are distinct, or they're nuancedly distinct.

Speaker A:

But yes, it is the, the consumer,

Speaker B:

the consumer component trumps.

Speaker C:

Right.

Speaker B:

Because ideas that don't meet the constructs of what a consumer is looking for should in essence fail.

Speaker A:

Right, Right.

Speaker B:

But you, you create something out there where a consumer is willing to trade off privacy and control for efficiency that I can just, it can just scan my face and it'll just process the payment.

Speaker B:

Sure.

Speaker B:

I'll give you whatever you want.

Speaker B:

Right.

Speaker B:

And that's, I don't know, that's a, that's a tough one because now, now you also get into the argument that what's the, the place of legislation and government in this?

Speaker B:

Is it to protect the unknowing consumer on the trade-offs that they're actually making, or should the consumer be more versed, to understand the risks that they're taking and what they're giving away?

Speaker A:

I wish I could, I wish I could say that the latter has any chance of success.

Speaker C:

Like I'd have to plug it in because I'm not familiar with Chinese law.

Speaker C:

Right.

Speaker C:

But you know, just doing a quick, you know, glance over certain crimes.

Speaker C:

Right.

Speaker C:

I, I probably would be a little afraid of breaking the law in China for fear of what some of the severe swift penalties are versus in North America because you got greater access to legal counsel where you don't there.

Speaker C:

So I, where am I going with that statement?

Speaker C:

Like if, if I commit this financial crime in the U.S. it's kind of like prove it, I'm innocent till proven guilty.

Speaker B:

Right.

Speaker C:

Like I am not saying that's why it's targeted over here.

Speaker C:

But you look at, like, I shared that article about the gentleman that Andy Greenberg did the write-up on in Wired, which was a great, I mean, talk about an in-depth look at somebody that was being held against their will doing the pig butchering scam.

Speaker C:

Right.

Speaker C:

And going after, you know, people that spoke as they were.

Speaker C:

Actually he was in charge of going after Indians, he was of Indian descent, and there were whole groups on different floors, like this one's targeted for North America, etc.

Speaker C:

But you look at, you know, wherever money's transacted, right.

Speaker C:

And the idea of crime in the way we're connected over here and going after people, right.

Speaker C:

Or people committing crime.

Speaker C:

I'm not saying that having all those things connected in China works because they have, you know, a different type of legal system.

Speaker C:

Right.

Speaker C:

And people are afraid to break the law so they just abide by it and no one's going to do bad stuff.

Speaker C:

With it.

Speaker C:

But you know, you have that much data, you also have that much power.

Speaker C:

Right.

Speaker C:

Because you know that much more about everybody.

Speaker B:

Yeah.

Speaker A:

And with good intentions.

Speaker A:

Sorry, everyone starts out good.

Speaker B:

I guess I'd ask: good intentions as defined by whom?

Speaker A:

Well, the.

Speaker A:

Right.

Speaker A:

I'm saying if, if the good intentions persisted and stayed as good outcomes and didn't wander from.

Speaker A:

Well, you, you know, you make a very valid point as to who defines what good intentions are.

Speaker A:

Because you know, I said Zuckerberg has a very different definition of good intentions.

Speaker A:

So you know those.

Speaker C:

Yeah.

Speaker A:

But then you, but then you have to ask the question, is it worth even.

Speaker A:

Is it worth starting something that you know is going to end up in a train wreck or is going to end up in a place that is bad for society?

Speaker A:

I come back to a recent study on.

Speaker A:

On.

Speaker A:

It actually was a.

Speaker A:

It was a. Yeah, it was a recent study, but a good write up and I'll post it on.

Speaker A:

On the downfall of language, of communication, or the ability for people to communicate, directly tied to the timing of the release of the first full-screen smartphones.

Speaker A:

And if you knew at the time that this was going to be the outcome, would you have not developed it?

Speaker A:

Should you not have developed it?

Speaker B:

Yeah.

Speaker C:

In AI 2041, like, the first use case there?

Speaker C:

Really unique that it's called the Golden Elephant.

Speaker C:

And imagine in:

Speaker C:

Right.

Speaker C:

Like, and this is looking at:

Speaker C:

So like take the cost of insurance and say it's going to be this.

Speaker C:

As long as you give it access to all your applications and everything you do.

Speaker C:

And the more you do, the better the rates are.

Speaker C:

And then it will also give you ads.

Speaker C:

And if you buy through those.

Speaker C:

Right.

Speaker C:

You get better insurance rates.

Speaker C:

Right.

Speaker C:

And it shows insurance being a big deal.

Speaker A:

But it's already, that's already in place, Brian.

Speaker C:

Right.

Speaker C:

But you gotta agree, it goes even way further.

Speaker C:

But what was interesting was the daughter was like, mom, how could you give it access to mine?

Speaker C:

You.

Speaker C:

You can't get.

Speaker C:

I, I didn't say you could do that.

Speaker C:

And she's like, I'm the mother and I make those decisions.

Speaker C:

Right.

Speaker C:

But then she's like, you know, I noticed my brother quit eating sweets because, you know, the golden elephant was there as a reminder.

Speaker C:

And the parents started keeping less sweets around.

Speaker C:

My dad started driving way more careful and his insurance went down.

Speaker C:

And because the golden elephant would remind him with this little symbol that he's going too fast.

Speaker C:

She goes, so he went from this crazy driver to safer.

Speaker C:

And then this person quit smoking, right?

Speaker C:

She's like, so I saw all these good things happening.

Speaker C:

But what happened in the end?

Speaker C:

The person she loved the most or really liked, she kept noticing that the app would push her away from trying to communicate with that person.

Speaker C:

And what she ended up discovering when they finally got together to talk was his background.

Speaker C:

Even though they changed the names and the government abolished the idea that there are people of lower hierarchy, the data was always there, and it knew his family was of lower descent, so it didn't want her getting close to him.

Speaker A:

Interesting.

Speaker A:

That is interesting.

Speaker B:

We got to be so careful on this, though, because we're almost getting into the ends justify the means type of argument.

Speaker B:

Of course, we look at the end of the story and then go, well, we shouldn't have even started the story to begin with.

Speaker B:

But there's, you know, correlation versus causation.

Speaker B:

There's so many different inflection points along the way.

Speaker B:

Like your example, Dan, on the phone.

Speaker B:

Was it the phone that really created what we're doing with communication today, or was it the introduction and the ability and the evolution of social media that then could exist on the phone that really changed a lot of that, that there's so many different pieces.

Speaker B:

And you could make the same argument about the.

Speaker B:

The car, right?

Speaker B:

That if we agree that, you know, because what is it?

Speaker B:

Vehicles kill?

Speaker B:

Gosh, I forget where it sits on the.

Speaker B:

The hierarchy of.

Speaker B:

Of most deaths, but it's.

Speaker B:

It's pretty far up there, if I recall correctly.

Speaker B:

So you can make the argument, should the car have never been created?

Speaker A:

Of course, it's that.

Speaker A:

That's the perfect example of, you know.

Speaker C:

Yes.

Speaker A:

Now there are other things, you know, there are other things that should be that could be changed along the way to decrease those, but they're bad for certain audiences or certain populations or certain things with money, etc, and therefore it's, you know, they're not done.

Speaker A:

But no, you should.

Speaker A:

The initiative, especially because you didn't know at the time, you didn't know when you created the car, that in a hundred years people would be driving 70 miles an hour having sex in the back because there's an automated driver in the front.

Speaker A:

I mean, all of these kinds of things, Brian, not.

Speaker A:

Yes, I'm looking at you, Brian.

Speaker A:

We have the six for all the

Speaker C:

listeners that aren't watching.

Speaker C:

I was stunned.

Speaker A:

We have a six, and the car apparently had six cameras of video recording the whole thing.

Speaker A:

The we didn't.

Speaker A:

We did.

Speaker A:

It's where we take it as society that is the, you know, at each inflection point, should there be an ethical review?

Speaker A:

Should there be a review of the implications?

Speaker A:

hope that you can foresee in:

Speaker B:

Slippery slope argument, of course.

Speaker C:

Of course.

Speaker A:

So bringing it all back around.

Speaker A:

Oh, go ahead.

Speaker C:

Oh, no, no, no.

Speaker C:

I would agree, Erik.

Speaker C:

Like, the thing is, you don't know what you don't know, so you go down the path and you learn.

Speaker C:

But I think, like, yes, we don't know where this end game is, but what we've learned in the last 10 years, five years and last year and a half, we're going really, really fast now.

Speaker C:

Not.

Speaker C:

I'm not talking.

Speaker B:

Oh, yes.

Speaker C:

I'm talking to the idea of how we use AI.

Speaker C:

Agentic AI.

Speaker C:

I'm talking.

Speaker C:

We still haven't even gotten education there on security.

Speaker C:

Like people, Dan and I.

Speaker C:

And I'm still in the learning phases of setting up my network and all my security protocols in the house, but people come in and see this, and they're like, oh my God, what is this, you've got your own server rack and all this stuff? Like, oh, come on.

Speaker C:

Like, this is overboard.

Speaker C:

I'm like, well, no, here's what I can do, and here's security.

Speaker C:

And they're like, oh, man, you're just over the top.

Speaker C:

And it's like, if you're not thinking about this, didn't you just tell me, like, your dad and somebody in your family just had their bank accounts hit and that you just clicked on something, like, just in a small basis.

Speaker C:

We haven't even got past the education piece.

Speaker C:

Right.

Speaker C:

Of really allowing people to understand how they engage with tech and what to look out for.

Speaker C:

And now we're already to the fact of get ready.

Speaker C:

It's really hard to look out for it because now that person that's being held captive, sending you this message.

Speaker C:

That message is really good.

Speaker A:

And that video is great, but the loud man on TV says that if I buy the Xfinity thing, it will do.

Speaker A:

It will be security for me.

Speaker B:

I mean, taking, Taking your argument.

Speaker B:

I'm just kind of thinking about this, Dan, and the point you made about the reference to the phone being the de-evolution of communication.

Speaker B:

We changed that lens and we put it on security.

Speaker B:

If we think about some of the evolutions that we've made in recent history that start to obfuscate some of the elements that we grew up with.

Speaker B:

Right.

Speaker B:

That we had to understand protocols, we had to understand looking at the raw data to figure some of this out. For younger generations that are coming into the field, is there a huge disservice that's now going to be done because they didn't have to go through that learning to understand how things actually work?

Speaker B:

I just have a chatbot or a tool now that gives me the answer.

Speaker A:

Erik, your hair is not gray enough to be able to make that kind of old fart comment.

Speaker A:

But I told, I mean, I don't disagree.

Speaker A:

You know, this is, it's.

Speaker A:

If you translated it into the last round of this, which is the you-didn't-pay-your-dues argument in security.

Speaker A:

And this applies.

Speaker B:

So I don't want to say that.

Speaker A:

No, no, but, but, no, but think about it.

Speaker A:

It's not far.

Speaker A:

It's just another iteration of you didn't pay your dues, so you don't understand and you can't troubleshoot.

Speaker A:

I think as much as I love to go into old man shaking fist at cloud mode and say that the kids don't understand, society has not disappeared.

Speaker A:

And every generation makes this same kind of comment about the kids that come.

Speaker B:

Oh, totally.

Speaker B:

Every generation.

Speaker B:

So they'll figure the generation behind them is

Speaker A:

going to be the, the death of us.

Speaker A:

Exactly.

Speaker A:

I mean, I do think that, as a society, you know, the shelf life of a democracy is, you know, near.

Speaker A:

And we do see a lot of trends systemically toward, you know, the end of the Roman Empire, hint, let's try and not do that.

Speaker A:

But I think from a.

Speaker A:

It's different inputs, different.

Speaker A:

We have different catalysts, different things.

Speaker A:

And this is what I want to explore on the new show, on The Final Act: a lot of the how we got here and why we do some of the things we do, with a vision toward not necessarily foisting the things that we had to do on the ones that come next.

Speaker A:

Maybe they do find a way to do it better.

Speaker A:

Maybe they don't need, you know.

Speaker A:

And I grew up in an era, I'll take it really far back.

Speaker A:

I grew up in an era where calculators were absolutely the devil.

Speaker A:

You had to do the arithmetic.

Speaker A:

I got my first calculator.

Speaker C:

I can't get at it quickly, where's my calculator?

Speaker A:

But I got my first calculator when

Speaker B:

I graduated high school, stuck in the pocket protector.

Speaker A:

Yeah, it was a great calculator.

Speaker A:

It is one of my favorite calculators in the world.

Speaker A:

I still use it today, 35 years later.

Speaker A:

I do have a stapler because I do use paper.

Speaker A:

But I guess my point is that the calculator, it meant you didn't need to know how to do the arithmetic, how to do the underpinnings.

Speaker A:

And with each generation they go a little bit higher in terms of what they automate.

Speaker A:

Now we can do the arithmetic in the calculator, and to use standardized testing as your benchmark.

Speaker A:

And the, the people who know me know how laughable it is that I use standardized testing as any benchmark for anything because I think it's one of the most messed up industries in terms of actual academics.

Speaker A:

But the, but you know, they've now raised that you can do these things and then they raised it one more time.

Speaker A:

You could only bring in dumb calculators; now you can bring in smart calculators of a certain kind that can't run their own programs.

Speaker A:

Because we can manage this.

Speaker A:

And what it is, is you change the parameters, you, you build in for that level of automation and you grow beyond that.

Speaker A:

What.

Speaker C:

So here, here's where I agree and disagree.

Speaker C:

All right.

Speaker A:

The show is called the debate.

Speaker A:

So let's go.

Speaker C:

So in like if you were to take an exam right.

Speaker C:

In Calc 2, Calc 3, transport phenomena, with the question basis they gave you to solve, if you didn't have the calculator to do some of what I'll call the low-end computation quickly and had to write everything out by hand just to get into the differential equation piece of it.

Speaker C:

That would be like, that would be a four or five hour exam versus a two hour exam.

Speaker A:

Yep.

Speaker A:

But to get that's what they were.

Speaker C:

Right.

Speaker C:

But that basis though of understanding.

Speaker C:

And this goes back to before you can get here though, you have to understand problem solving at its root to understand quantitative problem solving.

Speaker C:

You do have to learn why 2 + 2 equals 4 to understand why, in diffy Q, 2 + 2 may not equal 4.

Speaker C:

Right.

Speaker C:

Given these parameters.

Speaker C:

But without that you're just like, oh, just putting it in the computer.

Speaker C:

But don't understand the logic of why you're solving.

Speaker C:

Right.

Speaker C:

The quantitative piece of it.

Speaker C:

So that learning and progression to getting there is I, I still view as very important.

Speaker C:

It's why I love doing that little, what I'll call simple arithmetic with the kids, and I straight up said in front of Kelly that, hey, I am totally fine with printing, you know, a few pages of just a hundred different, you know, equations, like two plus two, two plus three, two plus five, all the way across.

Speaker C:

And just having the kids memorize, right?

Speaker C:

Like, they get that this plus this equals four.

Speaker C:

Like, they can move the pieces.

Speaker C:

They're logical enough there.

Speaker C:

Let's, let's hurry this along, memorize that, and let's get into the multiplication where I was able to show them, like some of this is patterns.

Speaker C:

Like, let's take 9, for example, and go from 2 times 9 all the way to 9 times 9: 2 times 9, 3 times 9.

Speaker C:

And then show them the pattern that's created on this side right?

Speaker C:

Now let's do the same thing with this.

Speaker C:

Why does that pattern exist and then show that, right?

Speaker C:

Because that's what exists in nature.

Speaker A:

But that is curiosity.

Speaker A:

That assumes curiosity.

Speaker A:

And I think before.

Speaker A:

And by the way, it's, it's infrequent in any of our debates that I have to disclaim or I feel the need to disclaim when the position I'm taking is not my own.

Speaker A:

But when it comes to arithmetic and calculators, I need to.

Speaker A:

My position is that calculators are the devil and if you don't understand how it works, you will never do math correctly.

Speaker A:

So, I mean, so the position I put out before isn't my own personal one.

Speaker A:

My own personal one is do it by hand, damn it, do it by hand.

Speaker A:

Because otherwise you won't understand how it works.

Speaker A:

And all of that said, the.

Speaker A:

I think part of the reason we've had to move on is we've lost curiosity.

Speaker A:

We've stopped being curious about why two plus two doesn't necessarily equal four.

Speaker A:

We've lost curiosity about why we have to make area assessments in small increments and not being able to get a perfect answer for the area under the curve.

Speaker A:

Like all of these things.

Speaker A:

The thing you described, Brian, requires curiosity.

Speaker A:

And I think we've lost curiosity for fundamentals because people think it's passe and want to move on to the big things.

Speaker A:

I fear that over time this leads to incomplete solutions and not being able to troubleshoot.

Speaker A:

So bringing it all the way back around to where Erik started, you know, the fact that you don't understand the fundamentals on a network means that you're going to be dependent on the tool to tell you and won't necessarily know when it's full of.

Speaker A:

Yeah, and I will die on that hill.

Speaker A:

I have been lucky enough to have enough representative curious people coming up through the organizations I hire for that curiosity in order to overcome a more systemic focus on automation and laziness.

Speaker A:

We'll call it crutch-needing, in order to live in the world of security.

Speaker A:

Whoa, what does the system tell me?

Speaker A:

So I say be curious.

Speaker A:

Go find out how it all really works.

Speaker A:

Tear apart the computer to the extent you still can and you will be instantly hirable.

Speaker A:

And on that note, we are at the end of our time.

Speaker A:

Thank you Brian.

Speaker A:

Thank you, Erik.

Speaker A:

And thanks to you, the listener.

Speaker A:

We love having these debates and we love hearing from you.

Speaker A:

If you have topic ideas, if you have a dispute, if you think I am full of it and that calculators are the only answer and that who needs to know what a PCAP says in order to be able to troubleshoot or secure a network?

Speaker A:

Send it to us.

Speaker A:

securitydebate@distillingsecurity.com. You can find us on our website, www.distillingsecurity.com. All the episodes from the Great Security Debate, from our Mentorcore program, and all of our new shows will be available on distillingsecurity.com and on our YouTube page, youtube.com/@GreatSecurityDebate.

Speaker A:

Thanks and we'll see you again on the next Great Security Debate.
