World events of the past few years have increasingly included cybersecurity components. So Brian, Erik, and Dan started talking about the role of security in critical infrastructure protection, asking questions about the ethics of and thresholds for government and corporate roles in cyber retaliation, and whether we as security practitioners have a role (or an obligation, or even a liability) to close vulnerabilities that can be used in primary or retaliatory scenarios. How much of human nature makes cyber retaliation a foregone conclusion, or can we reduce the need for it, or the availability of ways in, via the technology? From Stuxnet to Iran to Caracas, cyber is a prevalent vector of retaliation, but does it always have to be that way? Or will it end with WOPR's recognition that the only way to win the game is not to play at all?
It’s hard to talk about modern cybersecurity and not bring in current events, and even harder to keep it from turning political. We tried very hard to avoid the latter as we talked about the former.
Thanks for being part of the debate!
Show Notes:
Some of the links in the show notes contain affiliate links that may earn a commission should you choose to make a purchase using these links. Using these links supports The Great Security Debate and Distilling Security, so we appreciate it when you use them. We do not make our recommendations based on the availability or benefits of these affiliate links.
Welcome to The Great Security Debate.
Speaker A:This show has experts taking sides to help broaden understanding of a topic.
Speaker A:Therefore, it's safe to say that the views expressed are not necessarily those of the people we work with or for.
Speaker A:Heck, they may not even represent our own views as we take a position for the sake of the debate.
Speaker A:Our website is greatsecuritydebate.net and you can contact us via email at feedback@greatsecuritydebate.net or on Twitter at @securitydebate.
Speaker A:Now let's join the debate already in progress.
Speaker B:So, not to have a political conversation today, but what would you guys think about talking about how, obviously, when the Ukraine war started, there was cyber warfare used?
Speaker B:There's been a lot of learning about drone usage, right?
Speaker B:And the amount of money we spend. You go back to World War II, right? The U.S. turning Willow Run into, how could we build a bomber a minute, right?
Speaker B:Like the ability to build at scale, but the cost of what it was to build these giant machines and what that cost is today to build a battleship, to build this.
Speaker B:And then you look at how that war has changed. I mean, you can see the fiber optic lines, like when they had to switch to fiber, and it's got spools, and there's these fiber lines all over fields, everywhere, right?
Speaker B:So there is a destruction of habitat that adds up over time.
Speaker B:But this asymmetric warfare, and using drones, and then all of a sudden the US goes after Maduro, and as part of that operation may have used cyber warfare to black out the city. It's hinted that it was used.
Speaker A:Yeah, I'm not well enough versed in this, in what's in those actions to speak meaningfully about it in terms of what has just happened.
Speaker A:Everything would just be conjecture.
Speaker B:Well, the idea of this isn't to talk specifically about Venezuela, right, and the attack, etc. But if cyber was used to do the blackout right before an invasion, similar to Ukraine and what they were trying to do with their wipers and so forth in the initial days leading up to their attack.
Speaker B:And what you saw come out in the two years after the Ukraine war started: in critical infrastructure, there's been a big emphasis placed on the Purdue model when it comes to energy facilities, whether it's gas, electric, or nuclear, and what it takes, right, to actually protect that.
Speaker B:You can drive around and see gas pipes coming out of the ground. I mean, there's one in a median, I think in between a Jet's Pizza and a Belle Tire, and most people wouldn't know what that was. But that's actually a gas pipe that comes up, and there's a meter on it.
Speaker B:Right.
Speaker B:And what if somebody swerved and ran over that, hit it? Yes, but it's also openly exposed. There's no fence around it, etc.
Speaker B:So there's all that talk about how you protect that critical infrastructure, both physically, but now more so on the cyber front, because you're not going to get somebody parachuting in to take over that one little gas line in between that Belle Tire and that Jet's Pizza.
Speaker B:That would be like Red Dawn. This is more the conversation. Wolverines! I loved that movie growing up, the old version, right?
Speaker C:The original was so good.
Speaker B:The original was so cool.
Speaker A:But the idea here, though, is, I guess the corollary is, well, two things.
Speaker A:One, all the things we rail on other countries for doing, we're now doing visibly.
Speaker A:Does that put us in the same camp as the countries we rail on?
Speaker C:Two, most of those countries would argue that...
Speaker A:We actually made the first move without question.
Speaker A:I mean, I guess, at least from the perception that's out there. I don't have internal details to know the order of operations, but it has been made to look like that's the case.
Speaker B:Are you thinking Stuxnet specifically, going back?
Speaker A:Well, yeah, I mean I think that's one example.
Speaker A:Probably one of the better, more visible examples.
Speaker A:Yeah, but then the other, and this is more a philosophical thing: the more we do to build stronger infrastructure, the more we do to build it in the right way, are we in fact removing the ability for our own government to use it against other countries? Build a better, stronger SCADA system. Is it our obligation as security professionals to remove it, not just from the North Koreans and the Iranians, but also from our own government, to use for those purposes? Because if we do it right, it moves it off the table altogether.
Speaker A:Is that anti-patriotic? Would somebody ever see it as anti-patriotic and hold it against you as a practitioner?
Speaker A:I don't know.
Speaker A:Interesting philosophical question.
Speaker B:So, from an education standpoint, and for listeners out there, because, Eric, to your point, it's like chicken or the egg, or who threw the first punch?
Speaker B:And you're talking to two kids out at recess. They're like, well, he did it. It's like, well, two weeks ago he did this. And, well, a month ago, he did that.
Speaker B:When you go back in time, which...
Speaker A:One of you was born first?
Speaker B:Yeah. Had their business breached about a year and a half ago. And within like four or five days after the incident response team, etc., they knew exactly.
Speaker B:Well, apparently it was this group.
Speaker B:We should go after them.
Speaker B:We should be sending the military.
Speaker B:Take these guys out.
Speaker B:I'm like, well, hold on.
Speaker B:I'm like, you do realize that we may have thrown the first grenade, right?
Speaker B:And he's like, what are you talking about?
Speaker B:And I'm trying to explain.
Speaker B:He's like, oh, that doesn't even matter.
Speaker B:But he came after my business.
Speaker B:You know, we need to go attack them, right?
Speaker B:And I was like, hold on.
Speaker B:Let's go back.
Speaker C:So.
Speaker B:And this goes back to history, right?
Speaker B:Like, it's why I love reading old books, right?
Speaker B:And understanding what, like, Iran.
Speaker B:And I go back and say, well, I shouldn't go too far into this because I don't want to make this super political.
Speaker B:But you go back in the day, when Mosaddegh was removed from office. He was a huge ally, right?
Speaker B:Like, Iran, at the end of the day, they were kind of like, they didn't think we did it.
Speaker B:They thought the UK did it.
Speaker B:Then when they found out it was us, they're like, but I thought you were my best friend.
Speaker B:I thought you were my neighbor, right?
Speaker B:It's like, you know, your neighbor finds out it was you, and now you're not friends anymore and you're moving, and now you've never talked since, and you're now complete enemies, right?
Speaker B:So history is really, really important.
Speaker B:So, Dan, I think the first question is answered historically, right? Like, it's not who did what or why.
Speaker B:Because even when you talk Stuxnet, in the book Sandworm, too, when those guys walked into Idaho National Lab and were like, we need to show you guys something.
Speaker B:You had all the top brass from the military, and they're like, look what happens when we press this button and launch this code.
Speaker B:And, like, the generator is shaking out of control.
Speaker B:It's about to blow up.
Speaker B:And he stops it.
Speaker B:And they're all like... and he goes, I got through to them.
Speaker B:And they're like, can we weaponize this?
Speaker B:He's like, no, that's not what I was coming to show.
Speaker B:Right?
Speaker A:It's a great example, if you think about it. Go back to Real Genius. Real Genius, great movie. They're developing an interesting laser system, and at the end it gets disappeared, to become a weapon. Does that mean that people shouldn't innovate?
Speaker C:Jerry.
Speaker A:Jerry.
Speaker C:To the cleaner.
Speaker A:To the cleaners.
Speaker A:Exactly.
Speaker A:And to the popcorn vendor.
Speaker A:So does that, in turn, does the threat of that risk quash innovation?
Speaker A:I don't think so.
Speaker A:I think people are doing innovation because they want to do innovation, but human nature will always find a way to use it for nefarious or weaponized purposes.
Speaker A:So I guess I don't think that's the case.
Speaker A:But the...
Speaker B:It's human. The problem's human nature here, Dan, right?
Speaker B:Like, Plato's Republic shows this beautiful utopia.
Speaker B:And, like, if you're the person in college that only read 75% of the book, you still believe that utopia is possible.
Speaker B:And then you get to that last quarter, and it's like, yeah, but because of humans and human nature, this is why it all fails and falls apart and we suck.
Speaker B:Right?
Speaker B:And you're like, oh, I was about to write my paper on how we could have a utopia and we could all be good.
Speaker B:Right?
Speaker B:It's the human nature part.
Speaker B:And to your point, like, if we could build, like, the better system, we know what needs to be done or how it could be done.
Speaker B:Right.
Speaker B:But even going back over the last five or ten years, there have been people arrested who worked for software companies or production equipment companies, who were traveling abroad selling their equipment and were arrested by a group or a country because they're like, oh, you know what you're doing. And come to find out, their software had those back doors built into it, and those back doors were agreed upon by the U.S. government.
Speaker C:Or like, we don't talk about that here.
Speaker C:A CFO that might have been an asset.
Speaker B:And now we talk about Huawei, but we were doing the same thing.
Speaker B:It's like, well, everyone's like, well, China, you can't buy anything from China because they're putting back doors in it.
Speaker B:Well, where did they learn that from?
Speaker B:Because all the software they originally bought was coming from here, and it had back doors built in. It's like, well, that was different, because back then everything had back doors.
Speaker A:But I don't think that's changed today.
Speaker A:There's a move toward... just think about this in the communication space: Chat Control, which is currently being debated in the EU, under the guise of child protection, being able to look into all chats, all conversations, break encryption. Look back to moves in the US to try to do that for iMessage and such, again under the guise of child protection. Which, and I've said this in episodes before, do not ever think for a minute that the children aren't worth protecting.
Speaker A:But I think that the moniker is used in the wrong ways in many of these cases.
Speaker A:It's used as a cover. Because, even if you take the politics out of it, any back door you put in that's used for a legitimate purpose, and let's assume these are legitimate, will also be found out and exploited for illegitimate purposes.
Speaker C:You're back to, and I know we talked about it at one point, back when Apple was getting a lot of pressure from the FBI to build in a back door.
Speaker A:Yeah, exactly.
Speaker A:Same thing.
Speaker A:And all of these things are pathways in.
Speaker B:And, I mean, it's evident now, but they were using that one incident out west in California, San Bernardino, that one attack, as a reason to poster-child Apple to the community, to create a ruckus, to be like: they're not letting us in, and there might be another attack. They need to let us in. You must let us in.
Speaker B:And who owned it, right? It's like, yes, once you got in, what else were you going to do? What else could you look for, right?
Speaker B:And we've already had these conversations on AI, right, and where we're at today.
Speaker B:But they needed these data models, they needed access to data, lots of it, right?
Speaker B:It's why nobody reads all these terms and conditions at the bottom of the app when they download it. It's like, yes, yes, more apps, more apps.
Speaker B:You get an app, you get an app.
Speaker B:Everyone gets an app.
Speaker B:But again, human nature, leadership, right?
Speaker B:Not every country... I mean, this goes back even to some of the banking laws, and why Panama, some of the reasons why Mossack Fonseca was able to exist, and why somebody then breached the Panama Papers out for everybody to read and was like, wow.
Speaker B:And there were a lot of, if you want to call them good people, whatever that means, from a lot of countries, in a lot of very powerful positions, that had shell companies set up all over the world, right?
Speaker B:Even Lionel Messi, his father had set up a massive shell company.
Speaker B:I think the only reason he was even aware something was weird is that he actually got a paycheck back after he made $100 million.
Speaker B:But the government's like, oh, you didn't make enough money this year, so you get money back in taxes.
Speaker B:Like, oh, no, that's weird.
Speaker B:Well, yeah, because your father set up a bunch of shell companies, right?
Speaker B:It's how they found FIFA, right?
Speaker B:The Panama Papers, right?
Speaker B:So wherever there are countries that are going to allow back doors or loopholes, and if the infrastructure is not going to be set up right... There's always, well, why is cybersecurity always going to exist, right?
Speaker B:And the first is wherever there's communication, right?
Speaker B:There's the oldest tradecraft in the world, which is espionage.
Speaker B:So people are going to try to get to that communication, right?
Speaker A:I didn't think that was the oldest trade in the world.
Speaker B:Some people call it one of the oldest tradecrafts in the world: the gathering of information.
Speaker B:Two is the idea of where money's transacted, right?
Speaker B:So crime, right?
Speaker B:Most crime is done for the purpose of money.
Speaker B:So where money's transacted.
Speaker B:So unless you can eradicate crime 100%, and money flows through technology today, then cybersecurity is always going to exist. And the last is command and control.
Speaker B:So unless we can get rid of war 100%, there's always going to be a need for command and control.
Speaker B:And going back to kind of the circular part of this: using cyber, improving your systems, your SCADA, your infrastructure, is really, really important to protect people's interests, protect communities, protect your technology.
Speaker B:But as long as there's command and control, cyber warfare is going to exist too, and the ability to say, hey, if we're going to attack, can we black out the entire communications?
Speaker B:Can we black out all the lights?
Speaker B:Can we do this?
Speaker B:Right?
Speaker B:And it's the idea of preparedness, right?
Speaker B:That's the same reason why, and we've had the conversations about DFIR and being prepared, and why you do tabletops and run through them constantly. You won't find an entire city today, outside of Ukraine, that prepares on an ongoing basis for blackouts and everything else, right?
Speaker B:They're just not ready for that.
Speaker B:And so from a warfare standpoint, I think you're going to see more cyber because of the connective tissue in the way communities are put together, the way communication's done, etc.
Speaker B:And it goes to the other point of communications, and that is false communication.
Speaker B:The ability to put out rhetoric in advance to get people to, you know, fight, right?
Speaker B:To get minds thinking: yeah, I don't like this.
Speaker B:So, thoughts, Eric?
Speaker C:A whole lot of thoughts. I mean, there's a million different directions that we could take this, right?
Speaker C:Because, I mean, it's interesting if you're talking about the social influence of cyber. If we would make the argument that it's social influence in a direction that we would generally agree is not realistic, or is against one's best interests.
Speaker B:Right.
Speaker C:That whole realm is fascinating.
Speaker C:I remember sitting through a talk at one point where they were going through this. So if you recall, Russia is heavily predicated on natural gas, right? That's one of their big exports.
Speaker C:But they have a pretty high floor for the cost that they need because of where they're getting it out of.
Speaker C:It's super expensive to get it out of there.
Speaker C:So if you actually go back, if you remember when there was all of a sudden this huge movement against fracking: somebody did a presentation on the connected shell companies, on the messaging going into Facebook that propagated the idea that fracking was so harmful to the environment, to the people around it, and everything. But it all tied back to Russia, right?
Speaker C:That Russia had to create the sentiment that fracking is bad, to get people to rise up against it, to keep their prices high enough to be competitive on the world scale.
Speaker C:And it becomes interesting.
Speaker C:Like, do we deem that... is that a cyber attack?
Speaker A:I think anything.
Speaker A:Well, go ahead.
Speaker B:Oh, I was gonna say, not to go into whether fracking is good or bad.
Speaker C:But I'm, I'm not making an argument on that.
Speaker B:In some of the areas where you do fracking, it isn't good for the environment. Where Russia's pulling a lot of that out, it doesn't really... Like, we opened up a lot of parks, US parks, and started fracking.
Speaker B:And when you frack, that liquid material, you create these giant ponds, and then you suck the water up into sprayers, into the air, to get it to evaporate. And there's silica and chemicals and everything else that end up airborne.
Speaker B:So in the areas we were doing it... And I think what Russia was playing into was: we know how bad it can be for the environment. They don't know. Let's make sure they know, because this will impact everything.
Speaker B:Right?
Speaker B:Like, yeah.
Speaker B:And so to your point, yes.
Speaker A:Just because it wasn't necessarily misinformation; it was just information accelerated much faster.
Speaker A:But I think all of these topics follow. People say that the CIA triad, the idea of confidentiality, integrity, and availability, is outdated. I think it is still absolutely apropos.
Speaker A:They can be extended.
Speaker A:There's other things, like resilience and immutability, that are more of a cloud-based lifestyle.
Speaker A:But I mean, that's the integrity component of CIA. Is the content you're seeing accurate? Does it remain accurate? All of these kinds of things play in.
Speaker A:But I come back to my original question, which is: is it potentially, I guess, a selling point? We talk a lot on this show about getting people to listen to the why of information security. Is one of those reasons that, if we do it right, it can stop wars?
Speaker C:I'm going to take the argument that the answer is no. Because I don't think that we can actually design anything that is going to stand the test of time as we evolve. Go back, what, six years: did anybody have ChatGPT on their bingo card?
Speaker C:And now look at what's happening.
Speaker A:I still don't have ChatGPT on my bingo card.
Speaker C:But look at the evolution of that.
Speaker C:And that's completely changed our discussions and what we're thinking about. If you compare power consumption over time, the build-out of data centers and all that, it's vastly changed, and we didn't see that coming.
Speaker C:Right.
Speaker C:So I think if we spend too much time trying to create this utopian architecture that is supposed to stop world wars, this fundamental way of going about it, I think it's a utopian idea.
Speaker A:Well, I think at a macro level I agree with you, but at a micro level, atomic changes: we make small changes that improve the security of this and prevent it from being exploited in ways that are known, and that therefore has a greater-good output.
Speaker A:If you take and extend what you just said, the $20 to MCWT shouldn't be there, because we can't solve the problem at a macro level of people being given appropriate opportunities in the workplace. MCWT, for those who don't know, take a look at them, they're great: the Michigan Council of Women in Technology. These are macro challenges that are largely insurmountable, or at least epochal in the time it will take to truly change.
Speaker A:But in the meantime does that mean we don't put any investment or any interest in solving it at a micro level?
Speaker C:It's not binary.
Speaker C:We can't sit here with a defeatist attitude and go, ah, well, we can't win, so we might as well...
Speaker A:Exactly.
Speaker A:And that's what I wanted to ward off, because on the surface your response gave a little bit of that.
Speaker A:I did not want people to think that that was the case.
Speaker A:I'm an optimist. Otherwise we're screwed; we just pack it up and go do something else.
Speaker C:I mean, could I make the argument that what you're really getting at is accountability? Are we having the conversation about accountability for organizations, and the quality of the code, the quality of what they're actually producing?
Speaker A:Producing. And an understanding of the role. That part, too.
Speaker B:But to your question, though, some of this is pushed by regulation, right?
Speaker B:So when the government came out to say, okay, critical infrastructure falls into this, and within critical infrastructure some of our largest manufacturing falls into this, they have to do these things.
Speaker B:In doing that, let's say we're talking the United States, or you live in the UK, or wherever you live: if your government puts those regulations in place, then it causes someone to make an investment, right?
Speaker B:Which causes a maturity, which means their infrastructure gets better, their protections get better.
Speaker B:But it makes it even more difficult on your own government to get access into those things.
Speaker B:Because of the back doors or everything else that they had discovered or built. You know, go back to the Shadow Brokers and all the things they were able to pull out of the NSA. Tools that the NSA and CIA said they would share after such time.
Speaker B:Well, this one was a really good one.
Speaker B:So we can still use that really cool stuff, right?
Speaker B:And if your own government's doing that and other governments are too, I mean, let's look at China as an example.
Speaker B:So now if you're out there doing bug bounties and you find something useful, now, first you have to report it to the government before you publicly disclose it to get your claim.
Speaker B:And the government decides whether or not you can.
Speaker B:Why would that be?
Speaker B:Right?
Speaker B:And there was a time period where the US was very similar, right? It was, hey, I found this, I'll take the highest bidder.
Speaker B:If the company only wants to give me a thousand bucks, here's this organization that's willing to give me a hundred thousand because this one's cool, right?
Speaker B:So, Eric, yes, the code and everything else needs to get better. There needs to be a point where...
Speaker C:Don't worry, AI is writing the code now.
Speaker C:It's going to be pristine.
Speaker B:Oh my gosh, you're right. The next episode is going to be on the fact that we don't have to worry about coding. AI's got that figured out for us.
Speaker A:That's right.
Speaker A:Well, now it can write prescriptions on its own, so clearly it's trustworthy. I'll put a link in the show notes if you haven't seen that, from Utah.
Speaker C:This is, this is.
Speaker B:But, Eric, does that mean that governments would be less likely to put in regulation around it, because they want access to their own people's information?
Speaker B:Right.
Speaker B:So, like, I'm not saying that's the US, but there's some countries out there like: yeah, but if we start putting those regulations in, we're going to have to spend more money of our own to get information on our own people, or to get access to this. And when it comes to bugging someone's house or breaking into their phone, all those things we do from an investigation standpoint are going to be hampered, and there's FTEs involved in that.
Speaker B:From a government standpoint, does money spent.
Speaker C:By a government actually matter?
Speaker C:You're talking about an entity that can print more anytime they want. An economy that's predicated on the existence of debt, and the debt has to be growing for the economy to actually sustain itself.
Speaker B:So now we're talking The Creature from Jekyll Island.
Speaker C:It might be, but I don't think the monetary aspect comes in. I think part of the problem is that we continually take the wrong approach with this, because our natural inclination is to go, well, there's got to be regulation. All right, well, in the US... Dan's shaking his head, so, all right, maybe we do agree on this.
Speaker A:No, I agree. I think there's a trend away from regulation, for right or wrong. I have a problem with modern levels of morality driving people to do the right thing, and therefore regulation is the balance to it. But we can't deny there is a trend away from regulation at the moment, whether we like it or not. It's a truism.
Speaker C:Right, well, then it's particularly tough in the U.S., because we are not a country where there is a right to privacy, right?
Speaker C:So if that's the basis you're starting from, creating regulation on top of that, you still miss the mark. Because what are we actually doing it for? What is our end goal?
Speaker C:Totally different in the EU. You get some of that there.
Speaker C:But part of this philosophy, even as it's evolving, is that we do not have a society that bands together.
Speaker C:The whole idea of the free market was that we were supposed to vote into existence and out of existence companies that didn't meet the social norms of what we wanted by how we spend our money.
Speaker C:That doesn't happen.
Speaker A:No, no.
Speaker B:Well, no.
Speaker C:Because we're individuals now that just exist on our own and don't band together.
Speaker A:Yeah, exactly.
Speaker A:And we only live on the edges, only live on the perimeter: it is all me, or what I agree with, or it is none, and I will never come off those poles. And therefore I can't find middle ground and band with somebody who's 85% like me. And we can't find that. It's also somewhat culture, right?
Speaker B:It's Plato's Republic and the idea of human nature.
Speaker B:But human nature is different, right.
Speaker B:Culturally, like, this goes back to omoiyari, right?
Speaker B:And the idea... like, I was having that same conversation with someone. You show up to work early: do you park in the first spot, or do you park in the back, so that someone who's arriving late can park in the front?
Speaker B:They're like, oh, first spot.
Speaker B:I'm like, but why? That coworker who's running late would need that spot so they can get into the office.
Speaker B:He's like, well, that's their problem.
Speaker A:I'll put the link to the article that came out shortly after our own discussion about this on the podcast.
Speaker A:We like to think that the article was written because they listened to the podcast and therefore felt it was necessary to write about.
Speaker A:But we'll put the link in the show notes.
Speaker B:But like, that's the human culture, Dan.
Speaker A:Like, I know.
Speaker A:Well, no, it is the American culture.
Speaker A:It is not all humans.
Speaker A:I think that local cultures do vary.
Speaker B:From that. Which, not to harp on Plato's Republic, but it's a very interesting read, in the sense that American culture wasn't even around when that book was written.
Speaker B:But it dove into the aspect of how important culture is.
Speaker B:And like, people talk about that all the time.
Speaker B:But using that one example of parking in the back versus the front, you could use that same example across security.
Speaker B:And now we talk about modern warfare and everything else from a culture standpoint, making the decision, right, should we use this or should we not?
Speaker B:I can use this, right?
Speaker B:Is that the right thing to do?
Speaker B:And you set a precedent as a country when you decide to do those things, right?
Speaker B:And regardless of recent events or going back in time, right.
Speaker B:At some point, a decision's made to use it.
Speaker C:And as part of this, in the absence of a generally accepted ethical compass, we allow ethics to come back to each individual to make those decisions.
Speaker B:Right.
Speaker C:That if you go to the Socrates comment that no man knowingly does evil.
Speaker C:Right.
Speaker C:Which is talking about the ability of the human mind to rationalize whatever we're doing.
Speaker C:I can steal because I'm hungry and can't pay my bills.
Speaker C:It's for the greater good.
Speaker C:I am going to attack these people because they're going to attack us at some point.
Speaker C:Right.
Speaker A:That oil price is just too high.
Speaker B:Yeah.
Speaker C:I mean, we can talk about it.
Speaker C:What's the fix? That I struggle with.
Speaker A:Well, when it comes to economics, there's no single answer, because everything in economics takes a decade to play out.
Speaker A:And there's all sorts of other factors that play into it.
Speaker C:We could slap tariffs on something today, apparently, and change.
Speaker C:Oh, sorry.
Speaker B:So, Angela's Christmas.
Speaker B:Yeah, Angela's Christmas. I love that one.
Speaker B:The little girl steals the baby Jesus.
Speaker B:Right?
Speaker B:It's a great little cartoon.
Speaker A:I think it was Angela, a cousin in your family or something?
Speaker B:No, no, it's a little nut.
Speaker C:One day was walking down a rash of Jesus.
Speaker B:The little girl thought the baby Jesus was cold because the baby was sitting in the little manger with no clothes on.
Speaker B:So she took the baby Jesus home because she wanted to put a sweater on it.
Speaker B:She's like, I'll make you warm baby Jesus.
Speaker B:And in doing so, it leads into a story that the mom tells the children.
Speaker B:Everyone sit down, I want to tell you about the day.
Speaker B:And she dives into the story of when she was born and how it should have been the happiest day of her life.
Speaker B:But it was the saddest day of her life, because her husband wasn't there.
Speaker B:The house was heated with coal, and it was frigid, and the baby was about to come, and they didn't have any coal. He went out to try to get some, took a few lumps of coal off the back of a truck without paying for them so he could come home and heat the house for the baby, and he was arrested.
Speaker B:Right.
Speaker B:Whether or not this is a true story...
Speaker A:Is this a parable about what happens when the new data center takes all the electricity in the area?
Speaker B:Yeah, we're gonna need...
Speaker A:We're gonna need coal.
Speaker B:So, yeah, Erik, to your point, right: I didn't have heat.
Speaker B:Baby was coming.
Speaker B:Is it okay to steal?
Speaker B:But there was a law, there was a rule that said, I will not do this.
Speaker B:Right?
Speaker B:And he broke the law.
Speaker B:He went to jail.
Speaker B:And the mom's at home by herself with the baby.
Speaker B:Right?
Speaker B:There's that moral compass part, right?
Speaker B:And then the movie ends with the family going back and the little girl returning the baby Jesus.
Speaker B:Right?
Speaker B:And I keep using that. You've got to hear the way they say it in there, right?
Speaker B:Because they're like, but Mom, the baby Jesus.
Speaker B:And, you know, they return the baby Jesus into the manger.
Speaker B:And the priest comes out and catches her.
Speaker B:He's like, there she is.
Speaker B:Arrest that girl.
Speaker B:Right?
Speaker B:And, you know, the officer sees this little girl and he's like, you're right, I need to arrest her.
Speaker B:Right?
Speaker B:And he doesn't arrest her.
Speaker B:He jokes with her and he's like, but I don't see this as a laughing matter.
Speaker B:And he's like, it is no laughing matter.
Speaker B:Nobody should take a little girl away from her parents on Christmas Eve.
Speaker B:That's...
Speaker A:So we'll be back.
Speaker A:We'll be back on Wednesday then to get the girl.
Speaker B:Yeah.
Speaker B:So it's that moral logic part, right?
Speaker C:It's like when you go to the movie theater and, you know, the previews before the movie.
Speaker C:You're like, wow, that was so long.
Speaker C:I now feel like I've seen the movie.
Speaker B:You're welcome.
Speaker B:But, right. So for everyone that's on...
Speaker A:The ones with too much information in the preview are generally not the ones you want to go pay to see, because they gave you all the good stuff already.
Speaker C:Right?
Speaker B:So, I mean, my point with that, and the whole movie and everything else, is trying to justify when it's right to break a law, and why the law was created.
Speaker B:Right.
Speaker B:And you're trying to keep the masses in line from doing things.
Speaker B:And same thing when it comes to, like, when's it the right time to use cyber.
Speaker B:Right.
Speaker B:Say, whether it's in warfare, whether it's to gather information.
Speaker B:Right.
Speaker B:On groups in your own country and so forth.
Speaker B:Right.
Speaker B:And we were predicated on the idea of freedom of speech, but also on the right to privacy.
Speaker B:Right.
Speaker B:And... or were we?
Speaker B:And what rights do you actually have?
Speaker B:And when can those rights be invaded?
Speaker B:Right.
Speaker B:And I just think it's such a unique topic because of the way we're connected.
Speaker B:And the way that cyber is connected and the way that our infrastructure was originally built and will change over time.
Speaker B:And you guys said that you don't agree on regulation versus no regulation, that more regulation is better.
Speaker B:Because in this case in point, in the movie, more laws were put into place, and there's gray area.
Speaker A:To be clear, I'd prefer that we don't need regulations, because everybody does the right thing all the time.
Speaker C:But the right thing according to who?
Speaker A:Me?
Speaker A:Because we're in an individual society and I'm the only one that matters.
Speaker A:No, of course.
Speaker A:I mean, having societal norms is important.
Speaker A:But now, Brian, to what you just said then:
Speaker A:Are we also entering a time of corporate retaliation using security attacks?
Speaker C:No. The answer is no.
Speaker A:Sorry, Brian. It was based on what Brian said.
Speaker C:I'm taking him.
Speaker A:Okay.
Speaker A:I want to make sure I didn't confuse you, because I've been known to do that.
Speaker C:I'm taking a hard stance on that one.
Speaker B:Erik stepped in and said, no.
Speaker A:Although if you steal the baby Jesus, you might have to get retaliated against.
Speaker B:I don't think corporations should enter into that realm.
Speaker B:And I think the person that makes that decision would have to be the most informed person on how technology works.
Speaker B:So if that's your CEO or your board and you guys say, you know what, let's retaliate, let's go do something.
Speaker A:But why is that line different than the national one?
Speaker A:For both of you, and myself included, it's a hard line: no company should ever do that.
Speaker A:We're all in agreement.
Speaker A:But what changes between that and a little bit further left, not politically left, but just further back in the chain?
Speaker A:Where do we draw that line?
Speaker A:And does it mean that we need to come back to a more parochial definition of yes and no?
Speaker A:Laws are tricky like that.
Speaker A:Laws are binary: you cannot do this, unless you have a list of exceptions, and then the exceptions are the only way out.
Speaker C:The way I look at this...
Speaker A:And by the way, I'm purposely poking here, but yeah, yeah.
Speaker C:Well, that's...
Speaker A:It is a debate show.
Speaker C:We purposely take positions that may not reflect our own.
Speaker A:Exactly.
Speaker C:I think we talked about this in a really early episode, right.
Speaker C:And the comment was made: let's play out, in terms of a physical invasion, what happened in a cyber invasion.
Speaker C:Right.
Speaker C:That if we look at, maybe, a group of terrorists that happened to invade Nakatomi Plaza.
Speaker C:Whose right is it to go up and invade?
Speaker B:Right?
Speaker C:Greatest Christmas movie ever.
Speaker B:I love that we're doing Christmas movies.
Speaker B:Angela's Christmas, and now we've got Die Hard.
Speaker C:Yeah, but if you look at it, that's where I see the law is very clear.
Speaker C:Right.
Speaker C:And who reserves the right to step in and put down those insurgents?
Speaker C:I look at it the same way from a cyber perspective.
Speaker C:That is not the company.
Speaker C:Bruce Willis.
Speaker C:Yes, absolutely.
Speaker C:Yippee ki yay.
Speaker A:But now, philosophically, you take that one step back.
Speaker A:Okay, fine, in a direct response.
Speaker A:But again, why not?
Speaker A:Why can I not do that as a company?
Speaker A:If somebody attacks my company, why can I not blow them out of the water?
Speaker A:Technology-wise.
Speaker B:Technically, you can, because you can go hire an offensive security team.
Speaker B:Our own government is hiring offensive security teams to go do retaliatory acts.
Speaker B:Right?
Speaker B:You could hire an offensive...
Speaker A:Three minutes ago all three of us said that's out of bounds.
Speaker C:I see that as a right reserved for the government.
Speaker B:Government.
Speaker C:But that is law enforcement.
Speaker B:You should.
Speaker B:But again, culturally, in leadership, when you screw with someone...
Speaker B:Like you can even look at that from our own president, right?
Speaker B:If a leader of a business gets pissed off enough, right.
Speaker B:He's going to say let's go do something.
Speaker B:Right?
Speaker B:Or what can we do?
Speaker B:And everyone's going to explain why you can't.
Speaker B:And he's still going to be...
Speaker B:It's like my buddy, the owner of the business that I was telling you about, he was so upset and emotional, right?
Speaker B:And it's those emotions.
Speaker B:If you could remove the emotional side.
Speaker B:He was so emotional, he was like, we should be going and attacking.
Speaker B:I'd rather pay this money not to this IR team.
Speaker B:I'd rather pay someone to go get them.
Speaker B:Right?
Speaker B:And like this.
Speaker B:That was a dead serious conversation.
Speaker B:This was like pre having a beer.
Speaker B:This wasn't like six beers deep, thinking irrationally.
Speaker B:This was emotional.
Speaker B:Should companies do it?
Speaker B:I think the fallout would be way more massive.
Speaker A:Global thermonuclear war.
Speaker A:Yeah, because one fires, then another fires, then more fire as a result, and now you've destroyed everything.
Speaker C:We're just playing a game.
Speaker C:I just want to play a game.
Speaker A:By the way, all of these will be linked in the show notes and if you click and link to buy a book, buy a movie, it helps support this show and the distilling security network.
Speaker A:So please do.
Speaker A:They are referral links.
Speaker A:I just figured I'd say that because apparently that's what people do on YouTube.
Speaker B:So we should have done this as a Christmas episode, man.
Speaker B:All the people downloading Die Hard.
Speaker A:The Black Friday episode, the shopping episode.
Speaker A:But, I mean, to come back to it.
Speaker A:You're right, I think.
Speaker A:If you come out with vigilante justice like that, at that scale, you're gonna end up in a mutually, globally bad place.
Speaker C:It is the digital reversion back to the wild, wild west that now I get to be the judge, the jury, the executioner.
Speaker C:And the vast majority of organizations are not equipped to make those type of decisions or even understand the broader ramifications of what they're actually getting into.
Speaker A:But was there a jury and a warrant for a cyberattack against a city by a government or state, or against a government?
Speaker A:I mean, from a government against a government.
Speaker A:At the risk of taking the analogy too far, there was no jury.
Speaker A:There was no...
Speaker C:You're talking about the originating attack.
Speaker A:There was no warrant.
Speaker A:No, I'm talking about even the response: was it justified?
Speaker A:How do you determine it was justified?
Speaker A:How do you determine it's a proportional response, which is a, you know, a term in this space?
Speaker A:All of those kinds of things are important.
Speaker B:There it is.
Speaker B:Yeah.
Speaker B:Well, you could use the Venezuela case.
Speaker B:There was a sealed indictment that the city of New York had issued, X, Y and Z, a year and a half ago, right.
Speaker C:On double secret probation.
Speaker B:But there was no warrant to say, but now we got to go in and get him.
Speaker A:But I'm not talking about the what.
Speaker A:Yeah, again, it's the how.
Speaker A:In every case, the vehicle of enforcement either comes through a law that says this is how it's allowed to be done, it comes through a constitutional amendment.
Speaker A:Again, I'm talking in the US, that says these things are in and out of bounds.
Speaker A:It comes through a warrant or a subpoena that says, this is the way you execute on this.
Speaker A:And those are the kind of things that I think in this space, in the cyberspace, in the cyber retaliation space are still missing.
Speaker B:I would agree.
Speaker B:And the infrastructure, it's old.
Speaker B:The idea of the stuff that we can do in warfare, in a command and control situation: cyber is going to be part of warfare moving forward.
Speaker B:And the people that make those decisions are going to be the leaders of war.
Speaker B:Right.
Speaker B:And the question of whether it's proportional is going to fall to the leaders that are not part of that, to decide: is this the right thing to do? And the understanding of what that fallout is, I think, is better today than it was 10 years ago.
Speaker B:Like if we do this, nobody will know.
Speaker B:Well, at some point they're going to know.
Speaker B:When they find out, then they're going to believe that they can use those same tactics.
Speaker B:Right.
Speaker B:So, you know, for lack of history repeating itself: when you looked at Stuxnet, the world figured out that these were government organizations. I won't say 100%, but with 99-point-something, with a pretty good sense of clarity, these were man-made.
Speaker A:And if your company wants 99% assurance, this SIEM, our sponsor today...
Speaker A:Now, we don't have a sponsor, but this would be a great lead-in to a SIEM ad read.
Speaker B:And for those watching, I do give cyber security advice for tacos.
Speaker B:Yes, I do have some favorites.
Speaker B:Feel free to reach out, but they're...
Speaker A:Around hundreds of thousands of tacos.
Speaker A:All of this comes together to say that it's an evolving space.
Speaker A:I rewatched GoldenEye recently.
Speaker A:So I kind of prefer the idea of just taking out all the tech and making sure the power goes out without having to touch a computer.
Speaker A:All of it just dies. I'd rather not have a nuclear pulse in the atmosphere, but it means you don't have to go in and hack.
Speaker A:You just take care of it from above.
Speaker A:Unfortunately, on that note, we're out of time.
Speaker A:Thanks, Brian.
Speaker A:Thanks, Eric.
Speaker A:Thanks, Eric.
Speaker A:Thanks, Brian.
Speaker A:Whichever one of you decides you want to be Erik, and whichever one decides to be Brian.
Speaker A:And thanks to the listener, we are so glad you're here.
Speaker A:We hope you enjoyed our not a political episode and we hope you'll join us again.
Speaker A:Don't hit that unsubscribe button.
Speaker A:We promise we'll be back in two weeks with another episode of the Great Security Debate.
Speaker A:We'll see you again on the next one.