Bruce Large discusses the importance of threat modelling in operational technology security
Episode 29 • 23rd May 2024 • Secured by Galah Cyber • Day One
Duration: 00:49:07


Shownotes

Summary

In this episode of Secured, host Cole Cornford interviews Bruce Large, a security architect and evangelist at Secolve, the OT security specialists in Australia. They discuss the importance of threat modelling in operational technology systems and the need for engineers to consider the potential for cyber attacks. Bruce also shares insights from the ISA/IEC 62443 series of standards, which provides guidelines for secure system development in OT. Additionally, they touch on the significance of unions in the tech industry and the benefits of joining organisations like Professionals Australia. Tune in for a fascinating conversation on application security and more.

Timestamps

1:25 - Bruce's professional background

2:40 - Defining "engineer" in different contexts

6:20 - Differences between computer engineers and civil engineers

8:20 - Threat modeling

12:40 - How we treat safety in software vs other industries

18:30 - Bruce: we should be encouraging lifelong learning

24:00 - ISA/IEC 62443 security standard

29:00 - The Year 2038 Problem

34:20 - Unions & industrial relations

43:40 - Rapid fire questions

Mentioned in this episode:

Call for Feedback



This podcast uses the following third-party services for analysis:

Spotify Ad Analytics - https://www.spotify.com/us/legal/ad-analytics-privacy-policy/

Transcripts

Cole Cornford (:

Hi, I'm Cole Cornford, and this is Secured, the podcast that dives deep into the world of application security. Today, I'm talking to Bruce Large, security architect and evangelist at Secolve, Australia's foremost OT security consultancy.

(:

I think Bruce brings some super great insight into the space. He's definitely brought me up to speed on all sorts of things like the PLC Top 20, and all of his engineering standards, which I didn't really know because I don't get involved in the space all that much.

(:

He brings great insight into the space, whether as a chartered engineer or a community leader, and he's just a genuine and good-natured dude. So I hope you get a lot out of this episode. And I'm joined by Bruce Large. Bruce, how are you going today, mate?

Bruce Large (:

I'm going good, Cole. Thank you. Thank you for having me.

Cole Cornford (:

Oh, absolute pleasure. How's the day been? We are recording the day after Anzac Day, and I'm surprised that you have not done what most people do and taken an extended super long weekend.

Bruce Large (:

Yeah, I think there was a bit of a calendar situational awareness fail on my part. The day is good, but I think I've had too much coffee and I'm starting to see sound, so we should be in for an exciting podcast today.

Cole Cornford (:

All right, well, if you start buzzing, I'll try to put you down a little bit.

Bruce Large (:

Thanks, mate.

Cole Cornford (:

So Bruce, maybe for our audience, maybe give us a bit of background about who you are and where'd you come from.

Bruce Large (:

Yeah, no, thanks man, and yeah, awesome to be on your podcast. I'm a big fan. I think we actually first met at BSides Brisbane from memory, the first BSides, I think. And it's been very exciting since.

(:

I've recently moved to a company called Secolve. We're an OT, operational technology cybersecurity startup, and I have two hats. I'm a principal OT cybersecurity architect, so that's my background.

(:

I'm actually a chartered engineer, so I do telecommunications engineering and focus on cybersecurity. And my other hat is the chief evangelist, which is very exciting because I'm very active in the community and I enjoy teaching people about operational technology cybersecurity.

(:

I'm involved with the SANS community, involved with different electrical engineering and telecommunications groups. And it's actually really cool to work for a company that acknowledges that we can help improve the community, and I really, really like that.

Cole Cornford (:

Yeah. Two things I wanted to talk through quickly on those: Engineers Australia and chartered engineer. I know that most people listening to my podcast have pretty much only dealt with software engineers, not so much civil or electrical or whatever. But would you be able to just clarify a bit more about what the difference is between them?

Bruce Large (:

Yeah, definitely. Yep. So I'm actually currently the chair of the Information Telecommunications and Electronics Engineering College in Queensland. So we're called the ITEE. ITEE covers things like software engineering, mechatronics, control systems, telecommunications, electronics.

(:

This is a term that often comes up. We make jokes around big E engineering, little E engineering, but it really comes down to what in Queensland we call a professional engineering service.

(:

So we have a thing called the Professional Engineering Act from I think 1924, from memory. Queensland was the first state in Australia to actually have the requirement that if you're providing a professional engineering service, you have to be registered.

(:

There are multiple ways to get registration. I personally am a chartered professional engineer of Engineers Australia. But there's other methods like being a registered professional engineer through Professionals Australia. And for people who registered prior to 2008, they actually registered directly with the Board of Professional Engineers of Queensland.

(:

The moral of the story, though, is what is a professional engineering service? And there is guidance from the Board of Professional Engineers Queensland, the BPEQ, but it really comes down to the application of engineering knowledge, the use of advanced mathematical calculations.

(:

And most importantly, if you're providing judgment that is not in what we call a prescriptive standard. Now, in my world, operational technology, we'll talk about a thing called the Purdue model.

(:

So the Purdue model, it's an abused model, but it basically is around the different functional levels in a control system. It kind of originally started out with the ISA, International Society of Automation, and how to connect IT systems into computer manufacturing systems.

(:

Probably say the late '80s, people were plugging computer networks together, everything got pretty crazy. So they decided we need to come together to figure out how do we actually interconnect IT and operational technology or industrial control system applications.

(:

So for me, a simpler way to pull all of that together is: has it got a real-life safety impact? And if it does, what is the engineering rigor or professional engineering service around that?

(:

So in my world, if you think about things like power systems, water systems, transport systems, things that can actually endanger society, there really needs to be some sort of engineering. And the public have an expectation of the safety and things like that.

(:

And importantly, as a chartered and registered engineer, and they are different terms, but as a registered engineer, you can be held accountable for bad practice, you can be fined, you can be deregistered.

(:

And I think in IT, we just have this blurring of terminology of what's engineering and what's not. I personally try not to get triggered by it. It is difficult, but I think at the end of the day, from an engineer's perspective, as in a professional engineer's perspective, some of the stuff that software engineers do, we would call...

(:

What we would call a technician. However, there is a difference between software development and software engineering. And depending on your context, one's not right or wrong, it's just which one's the right fit for the job?

Cole Cornford (:

Yeah, I think a lot of what we do in software engineering would be considered malpractice if it was looked at through a normal engineering lens.

Bruce Large (:

Yeah, yeah.

Cole Cornford (:

I think that we have to cut corners. We have to... If you're trying to release at scale and respond to changing market conditions, there's no way that you can do it if you have the rigors of early waterfall-based software development in place.

(:

And so I think that where it makes sense to have that level of rigor, then I definitely see people applying good AppSec practices, good software quality practices, building and thinking through preconditions, post-conditions and all this kind of stuff.

(:

And so that would be building, let's say, spacecraft or cars, pacemakers and medical devices, those kinds of things where the consequence of making a mistake is a threat to human life. It's safety, it's just tremendous damage. But if you're building just a standard SaaS web application, what's the consequence if it goes offline?

(:

A couple of people can't upload expense receipts and they'll get mildly annoyed. You just cut corners, man. And so I don't think there is really as much... I guess the software engineering discipline itself doesn't really put the engineering into it all that much anymore.

Bruce Large (:

Yeah, definitely. And I think it really comes down to that scoping exercise. And I'm very excited today to be able to talk to someone like yourself who has an awesome following of security and application security practitioners to talk about something that's actually probably just not really well-understood. Which is, how do we do product security for operational technology components? So I don't know, I'm happy to jump into my first thought on the topic.

Cole Cornford (:

Yeah. Let's get into it. Tell me more.

Bruce Large (:

So I've had the pleasure of going back to universities and speaking to engineering students and they'll often ask, "Hey Bruce, what's the one thing I could do or what should I focus on?" And for me, it's actually teaching people who create operational technology components to think about threat modeling.

(:

So too often in these systems, the focus is on functional outcome. I need my system to achieve X, Y, Z within these constraints, in this economical manner. Unfortunately, not everyone considers a hostile operating environment.

(:

So if you think about engineering and building robust engineering systems, a term actually from my finance degree is endogenous and exogenous, right? So threats that exist inside the system versus threats that are outside the system.

(:

So in cyber security, when we think about cyber physical systems, we're dealing with motivated threat actors. They have a goal, they're trying to achieve some impact to a system, they will find the weakest link.

(:

I've made a joke previously around, it's like the Velociraptors in Jurassic Park, they will attack you at your weakest. Now, some engineers don't really think around those abuse cases.

(:

So how could my system be attacked? And often, they don't understand or consider that their system might be a stepping stone to a bigger objective. If we think of some OT security incidents, we think about things like Stuxnet. Would a Siemens engineer think that their product could actually be the fifth domain of warfare? Unlikely.

(:

Maybe thinking about say, communication vendors that create serial gateways. They're not always considering that their device might actually be controlling super important critical national infrastructure and could be targeted by an incredibly hostile adversary.

(:

So for me, I think it's about taking that time in design, because 90% of your operational life expenses are locked in at the design phase. It's incredibly hard for these systems to be updated, upgraded, or changed once they're in production.

(:

So I think when I shout to engineers, just taking some time to red team the design and think around if I was a hostile actor in the system, what could I do? And we get into things like applying STRIDE to operational technology systems.

(:

Even just having that conversation around what are the trust boundaries, what are the interfaces, what are the undocumented engineering interfaces? And I just think if there's only one takeaway people take from this conversation, it's there's an incredible amount of benefit to doing a desktop threat modeling exercise in the design phase.
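
A minimal sketch of what that desktop exercise can look like, written in Python purely for illustration (the interface names and flags below are invented, not from any standard or from the episode): list the interfaces and trust boundaries in the design, then walk each one through the six STRIDE categories.

```python
# A minimal desktop threat-modelling sketch along the lines Bruce describes:
# enumerate interfaces and trust boundaries, then prompt a STRIDE discussion
# for each one. Interface names and flags are hypothetical examples.
from dataclasses import dataclass

STRIDE = [
    "Spoofing", "Tampering", "Repudiation",
    "Information disclosure", "Denial of service", "Elevation of privilege",
]

@dataclass
class Interface:
    name: str
    crosses_trust_boundary: bool
    documented: bool  # undocumented engineering interfaces are the ones that get missed

interfaces = [
    Interface("Historian to corporate IT network", crosses_trust_boundary=True, documented=True),
    Interface("Vendor serial gateway to PLC", crosses_trust_boundary=True, documented=False),
    Interface("Local HMI to PLC", crosses_trust_boundary=False, documented=True),
]

# Generate the workshop prompts: every STRIDE category, for every interface
# that crosses a trust boundary or isn't documented.
for iface in interfaces:
    if iface.crosses_trust_boundary or not iface.documented:
        for threat in STRIDE:
            print(f"{iface.name}: how could {threat.lower()} occur here, and what would the impact be?")
```

The output is nothing more than a structured list of questions for the workshop, which is the point of a desktop exercise at the design phase.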

Cole Cornford (:

I agree. And I guess I see it a lot of time just in a broader product security ecosystem as well, where people have just chosen to go ahead with a design without spending the time to do a threat model in advance.

(:

Obviously, vulnerabilities are something you need to be fixing, but they're really only a problem if there is a threat actor that has the intent to actually act on them and cause some kind of damage to your system.

(:

And I think a lot of people misunderstand that there are threat actors who do want to act on and cause damage to these systems. And we listen to all those spy movies and stuff and people say, it's not going to matter to me.

(:

All I'm doing is just assembling a couple of widgets to throw on a conveyor belt. And then that conveyor belt somehow ends up in an airport and then the airport somehow... And then four or five steps later, that's the way that the Ocean's Eleven, six different degrees of-

Bruce Large (:

Yeah, definitely.

Cole Cornford (:

You know what I mean? When they go through and do all sorts of crazy shenanigans that only exist in the movies, except they absolutely do happen in real life, and it's because people have just left doors open in 50 places at once. Yeah, we're always going to be taking the easiest routes to get into a system where we can. But yeah, I think it makes sense.

Bruce Large (:

Yeah. And I mean, threat actors will find that path of least resistance. They're goal orientated. I always come back to the P in APT is persistent. So they will always be testing to find the path of least resistance.

(:

And I guess that's where that holistic, whole-of-system engineering approach needs to be applied. But anyway, that's probably enough of me waxing lyrical about threat modeling, but I just think it's such a fundamental concept that we need to be teaching people.

(:

And I guess actually just a very quick aside. I remember listening years ago to a podcast about how we treat, sorry, teach tradespeople to be safe from day one. So say you're an electrician. Day one, we teach people how to work with electricity in a safe manner. We don't really do the same when we teach people software development.

(:

We just teach them, build it as quick as possible, iterate quickly, deliver value. We probably should be teaching them earlier in their education that application security matters. And it might have actually been on one of your podcasts, Cole, that I remember listening to this. Yeah.

Cole Cornford (:

Yeah. Well, this is one of the interesting things, because I come back to this a lot: where do we start? Do we start by educating people at high school, or at university, or as they enter the workforce, or when they're already in the workforce and are transitioning?

(:

At what stage is it effective to be focusing on building in these practices and helping engineers to solve those kinds of problems, like security issues? And what I tend to find is that there are just, what, 12 different non-functional requirements.

(:

It could be anything from accessibility, internationalization, performance, usability; security is just one of them. And engineers tend to care about two to three disciplines that they're really, really super focused on, and it might mean that they're picking up NICs or BUN or some kind of newfangled component for performance reasons.

(:

But because of that, I find it's really difficult for education to be an effective approach unless you're able to give people a regimented and process oriented way of building software. And almost every company that I interact with uses fundamentally different ways, loosely based on agile ways of working.

(:

So there'll be Scrum, Scrumban, Kanban, whatever. Okay. So all of these different ways of working, but ultimately, at the end of the day, there is no consistency in how people build software.

(:

And because there's no consistency, there's no way to say, here's the regimented way to be effectively given the checklist of, firstly, you need to check to see if you've got a clean area, you've demarcated where the spill is.

(:

You've got your PPE on and all of these kind of checks. And also people aren't incentivized to do this. If you think about tradies in Australia, to do any kind of job effectively, you need to be registered and licensed and have insurance and have the correct PPE in place to even begin doing the work.

(:

And you're only competing against other tradies domestically who have the exact same regulatory standards on top of them. Whereas with software engineering, you could be competing against people in China, in India, in Vietnam, in America, in Belarus, and these people do not have the same types of limitations.

(:

They may just be happy to produce awful software that meets your functional requirements, but has no engineering built around it whatsoever. But the cost does matter to businesses a lot.

(:

And so I think it's really hard for us to go out and say, hey, let's look at how we're approaching engineering and bring that into those kinds of mechanisms around safety, around credentialing, around professionalization, when we're competing against a global talent base. And secondly, there is no consistency to how people even do the work in the first place.

Bruce Large (:

I think it's a really good point you raised there, Cole. I've had this conversation with many people before. I think we actually fix this from the asset side or the buyer side. So how do we educate asset operators or people buying software to really value AppSec?

(:

[inaudible 00:16:31] an example for me, I used to work for a company called Powerlink, the transmission network service provider for Queensland. So when you're on the highway, the really big poles and wires, and I was the operational technology cyber security team leader. And we were building quite a few new OT systems, right?

(:

Now, I'm a huge fan of that secure system development lifecycle. And when I came in, there was some really great work. Good standards, good architecture, but we were focusing more on validation of security via pen testing.

(:

And on some of these projects where we were developing or working with partners to develop software, I actually engaged an AppSec consultant to come in at the beginning of the project to just give guidance around system design, system architecture, development best practices.

(:

And people were like, okay, I get we need to do this, but won't this cost more or will this delay the project? And I was like, well, no, it actually makes it better because we won't find these issues at the end of the engagement process. Let's push it into the beginning of the process.

(:

And it helps too that the Australian Energy Sector Cyber Security Framework, the AESCSF, does call out requirements, especially for MIL-3 and security profile three, that you are embedding secure software development practices into third-party engagements.

(:

But I just think we need to change the conversation from this is security making things difficult to this is security enabling us to build better software earlier in the process so we're not discovering things at the end of the project when we're under time pressure.

(:

And I'm actually quite interested in the future watching these projects. Can you baseline scope variation or schedule variation? And obviously, you and I, we've had many beers with pen testers. There's a lot of common problems and often, it's because developers aren't aware or they haven't received the training.

(:

And to your point, I actually think there's probably no best time to learn it. I think what we should be encouraging is lifelong learning in this field and how do we adapt? And I think it's really important in some ways, how do we actually unlearn things? How do we actually stay current with changing systems, methodologies and approaches?

Cole Cornford (:

I'm finding that with a lot of penetration testers right now who've spent a lifetime learning network security and have learned some Active Directory exploitation. And they're now coming into an environment where companies are increasingly using Okta.

(:

And WebAuthn and just-in-time privileged access, instead of just having domain controllers and traditional networks. Networks are defined with software, Terraform instantiates the cloud environment, and we already know what the boxes look like.

(:

Because of that, they're finding that the skills that they learned 15 years ago are becoming relevant in other areas, but they're getting quite pigeonholed. And it's stressing them out and they feel like there's a constant rat race to stay on top of the technology.

(:

And I don't feel like that's the case at all of application security, because fundamentally, we're trying to teach people how to fish. And it turns out that everybody's still got sticks.

(:

And we have nets and we have poles and we have even ocean liners and we know how to do sustainable farming with platform engineering. If we really want to stretch the metaphor.

Bruce Large (:

I'm in for this, let's go.

Cole Cornford (:

Look, let's go on our fishing journey together, Bruce. Where are we off to?

Bruce Large (:

I don't know, but let's see if we get a big one for the day.

Cole Cornford (:

Well, this is a big one. So it is a challenge. I know that businesses in Australia are also very much aligned a lot more to OT security than in other jurisdictions around the world. Because our economy is really heavily linked to either professional services that support resources or critical infrastructure.

(:

They're not so much aligned to technology and software engineering. But these companies over here that are in those industries are getting disrupted by technology firms, and they increasingly need to start looking at how IT fits into their operations and to start adopting IT ways of working into their ecosystems.

(:

And so I'm just having more conversations now with companies that are largely OT firms, who see the writing on the wall and want to be moving in that direction. And even the penetration testers who've had [inaudible 00:21:13] government workers as contractors just doing Essential Eight kind of stuff are seeing that the skill set is not as useful as it used to be.

Bruce Large (:

Yeah, definitely. And I can't remember the term, but there's this "software is eating the world" thing, right?

Cole Cornford (:

Bloody, what's his name? Andreessen Horowitz?

Bruce Large (:

Yeah.

Cole Cornford (:

a16z. I'm pretty certain that he's the guy. A while back, I remember doing a preso at Parliament House. Yeah, just casually.

Bruce Large (:

As you do. As you do, Cole.

Cole Cornford (:

Just what you do. You just go to Parliament House and you just talk to people. Don't worry. There were no pollies in the audience. It's just a-

Bruce Large (:

Oh, g'day, Albo.

Cole Cornford (:

G'day Albo, let's talk about secure software delivery. But I did basically go there and say, hello government people, we need to be changing how our industry works, and we need to focus on professionalization instead of elevating the profiles of what are, effectively, burglars and then patting them on the back for just helping each other learn how to burgle better.

(:

That's not what we should be doing at the same time. Yeah, I did mention, I'm pretty certain it was Andreessen Horowitz, and 10 years ago he said that in his letter as he started his venture firm, a16z. And now, it's insane how much money and how accurate he was, because I think nine out of the 10 richest companies in the world on the NASDAQ are software firms, and the tenth richest is Berkshire Hathaway. And the reason it's so rich is because it has a 1/3 stake in Apple. So...

Bruce Large (:

Yeah.

Cole Cornford (:

Yeah. There we go. Look at that.

Bruce Large (:

It is quite profound, and I think one of the issues is communities and the public don't really have the tools to assess good quality software. And that's why we're seeing regulation. And I think we were chatting before the show around the Biden software executive order stuff and things around SBOMs and all this stuff.

(:

I think people are now realizing, wow, we have such complex systems that are built upon complexity, and it's unclear what the foundations are. And we're having to build regulation to ensure these critical systems have had security by design and security baked in, as opposed to move fast, break things; push early, push often, which works perfectly in agile use cases like you were saying before.

(:

But if that's software that's controlling operational technology or critical processes, as a society, we don't really have that appetite. We're happy for it to be a bit slower.

(:

People often ask me, "Hey, well Bruce, how do I actually do good OT security, right?" Yeah, I'm a bit of a zealot, you would say, for the ISA/IEC 62443 standard series. It is a complex standard series, right? It's 14, 15 standards depending how you count. And they're actually a mix of prescriptive standards and informative technical reports.

(:

But there's four lenses, if you will, to 62443. There's the part one series that's all around terminology, concepts, metrics, things like that. Part two is around administrative policy, whole of organization kind of view.

(:

And there's an important thing there we call a CSMS, a cyber security management system, which is effectively the OT information security management system. We outline things like how do we do training, how do we do risk controls? Those kind of things.

(:

We then come to the part three series that's around systems. That's the one most people are familiar with. So part three, two is around risk assessments for operational technology systems. And that's where we have things like zones and conduits.

(:

We have things like security level targets, security level achieved, and security level capabilities. And then at the fourth layer, the part four series, we have components. And that I think is the most relevant for your audience.
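
Before the conversation moves on to the part four component requirements, here is a rough sketch of the part 3-2 vocabulary just mentioned, in Python purely as an illustration; the zone names, assets and security level numbers are invented, not taken from the standard.

```python
# A rough sketch of zones, conduits and security level targets: group assets
# into zones, join zones with conduits, and compare each zone's target (SL-T)
# against what it actually achieves (SL-A). All names and numbers are made up.
from dataclasses import dataclass, field

@dataclass
class Zone:
    name: str
    assets: list[str]
    sl_target: int        # SL-T: the level the risk assessment says is needed (1-4)
    sl_achieved: int = 0  # SL-A: what the implemented countermeasures deliver

@dataclass
class Conduit:
    name: str
    connects: tuple[str, str]                        # the two zones this path joins
    channels: list[str] = field(default_factory=list)

zones = [
    Zone("Control zone", ["PLC-01", "Local HMI"], sl_target=3, sl_achieved=2),
    Zone("Enterprise zone", ["ERP", "Historian mirror"], sl_target=2, sl_achieved=2),
]
conduits = [
    Conduit("DMZ replication path", ("Control zone", "Enterprise zone"), ["historian replication"]),
]

# Flag zones where the achieved level has not yet met the target.
gaps = [z.name for z in zones if z.sl_achieved < z.sl_target]
print("Zones with a security level gap:", gaps)
```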

(:

So how do we actually build robust... The standard series calls them IACS, industrial automation and control systems components. And there's four categories they have there. There's software applications, there are embedded systems.

(:

So these are the traditional IEDs, intelligent electronic devices, relays, remote terminal units, that kind of stuff; network devices, which I think we can all agree on; and then what they just call host devices, which are really the computers and things you use in the environment.

(:

But the part four series I think is really useful for your traditional OT vendors to have a framework they can apply for a whole-of-life product security focus. So you can get the standards a few different ways. I think the best one is actually if you just sign up to isa.org, you can view the standards online for free, but you can't download or print or anything like that.

(:

But part 4-1 talks to the foundational requirements for secure system development. And it's a pretty good model. It talks to things like security management guidelines and how to specify security requirements, which we're actually quite good at, because that's where things like threat modeling come out that we're familiar with.

(:

Things like secure by design, secure implementation. And very important things like, well, how do you actually push updates to products? So once you've built the system, especially if they're in difficult places to get to, dangerous places, remote places, it's quite difficult to just do an over-the-air update, especially if it's controlling something safety critical.

(:

And then something I really like is what they call security verification and validation. So in system engineering, we have this concept called the V model as in Victor. So we move down design activities and we move up testing and assurance activities.

(:

And importantly, rather than just relying on the activities that happened before, you link back to the corresponding design activity. So it's like test-driven development, that kind of thinking, but applied to OT.
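
A tiny sketch of that traceability idea, again in Python with hypothetical artefact and test names only: every design artefact should have at least one verification activity linked back to it, and anything without one is a gap to chase before the system ships.

```python
# A small sketch of V-model traceability: map verification activities back to
# the design artefacts they verify, then report any artefact left uncovered.
# The artefact and test names are hypothetical examples.
design_artefacts = [
    "Security requirements specification",
    "Zone and conduit design",
    "Detailed component design",
]

# Each verification activity records which design artefact it verifies.
verification_activities = {
    "Factory acceptance security test": "Zone and conduit design",
    "Component robustness / fuzz testing": "Detailed component design",
    # Note: nothing here traces back to the security requirements specification.
}

covered = set(verification_activities.values())
gaps = [artefact for artefact in design_artefacts if artefact not in covered]
print("Design artefacts with no linked verification activity:", gaps)
```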

(:

So I think the 62443 standards, to me, are actually a really great place to go once you've got past that initial threat modeling and understanding your system context and things. If you're a bit stuck and you're thinking, I don't think OWASP is actually that appropriate (spoiler alert, it is still very useful).

(:

But if you're thinking, I don't know what to do, I know I have these requirements and things expected of me, but I'm not building a web app, how do I think about this? I'd actually really encourage you and your listeners to check out 62443 part four one. Yeah.

Cole Cornford (:

A lot of people I think who haven't worked in those kind of ecosystems or spaces or even interacted with people in those spaces just will not recognize that the approaches that work in standard software engineering are not... Just can't do them, you cannot do them.

(:

They'll whinge about legacy software and then they'll realize that everything that's hard about legacy software in an enterprise is basically the default state for embedded systems. You cannot change it for 10 to 15 years. There is no mechanism for updating a PLC or an embedded device, and if there is a mechanism, you want to make sure that it's very, very safe.

Bruce Large (:

And I mean, I don't know if it's come up in your show before, but have you heard of the year 2038 problem?

Cole Cornford (:

Not 2038, but I'm going to assume it's close to a Y2K issue, right?

Bruce Large (:

Exactly. Y2K is the Disney version of the year 2038 problem, basically.

Cole Cornford (:

Oh, okay.

Bruce Large (:

So the 2038 problem is that Unix time ticks up from 1970. In the year 2038, we will roll over the 32-bit value, the double word, and we'll actually go back to 1970.

(:

Now, unlike Y2K, which was about big obvious things like computer systems, mainframes, whatever, this is every embedded device. And I think the feasible solution is just to upgrade everything to 128-bit processor architectures. But the 2038 problem's absolutely going to be crazy for operational technology, right?
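
To put rough numbers on the rollover, here is a small Python sketch using only the standard library. It is illustrative only: real devices are usually C firmware, and what actually happens depends on how the counter is stored; a signed 32-bit time_t is the classic 2038 case, while an unsigned counter holds out until 2106 and then wraps back to 1970, which is closer to the behaviour Bruce describes.

```python
# A small sketch of the year-2038 rollover for a 32-bit Unix time counter.
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def wrap32_signed(n: int) -> int:
    """Interpret n modulo 2**32 as a signed 32-bit value, as a C int32 would."""
    n &= 0xFFFFFFFF
    return n - 2**32 if n >= 2**31 else n

# The last second a signed 32-bit counter can represent.
print(EPOCH + timedelta(seconds=2**31 - 1))             # 2038-01-19 03:14:07+00:00

# One second later it overflows and the logic sees a date in 1901.
print(EPOCH + timedelta(seconds=wrap32_signed(2**31)))  # 1901-12-13 20:45:52+00:00

# An unsigned counter wraps at 2**32 seconds instead, landing back at 1970.
print(EPOCH + timedelta(seconds=(2**32) % 2**32))       # 1970-01-01 00:00:00+00:00
```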

Cole Cornford (:

Well, let's talk about that a little bit more then. So what is the implications of it overflowing back to 1970? How does that affect the types of devices?

Bruce Large (:

It'll be the logic.

Cole Cornford (:

Yeah.

Bruce Large (:

Yeah. It'll just be the logic, right? It depends on how time is used. So in some industrial processes, having time in synchronism is incredibly important. If you're thinking about monitoring a process, for example in the electricity industry, there are things called phasor measurement units that measure in real time the power quality, if you will, of the grid. That all relies on GPS, NTP, PTP, very precise time protocols, to understand across the whole network.

(:

And we're talking between different states, for example, that the electrical waveform is behaving as expected. In complex process control systems, if there's something that relies on time, it will cause a problem. And yeah, it'll be interesting to see what creative ways we find to work around this, right?

Cole Cornford (:

Yeah. Because it's effectively the really big problem where we know what the solution is, but the problem is unique for pretty much every single device. Because for some devices, it will have no impact whatsoever, because time is just not relevant to how they operate.

(:

And then there's other devices where it is a system of record, for example. And then if your records are just fundamentally wrong at that point, you can't use them. Or if it's making decisions about when to open or close gates or to-

Bruce Large (:

And maybe it'll be fine because it's all relative time. So yeah, it'll be an interesting one.

Cole Cornford (:

So I guarantee there'll be a bunch of programmers who've hard coded specific time values in, and then it's going to do some stupid math badly, and who knows what's going to happen, right?

Bruce Large (:

Oh, yeah.

Cole Cornford (:

If my experience in reviewing code is anything to go by, people just do not consider the future.

Bruce Large (:

As someone who's written dodgy hacks, I can totally appreciate this.

Cole Cornford (:

I see it so often where someone's written source code that basically refers to a hard coded DNS entry, or an IP address, or just something like that. And then at some point, they're like, wow, something's happened.

(:

We didn't renew that certificate or the DNS record, we lost access to it. Or their company got acquired and those boxes got deleted, and then suddenly, their mission-critical software is broken and they can't fix it. And it's like, yeah, if only I had the foresight.
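
As a hypothetical sketch of the anti-pattern Cole is describing, and one common way to soften it: the endpoint below and the LICENCE_SERVER_URL variable are made-up names for illustration, not anything from the episode.

```python
# A hypothetical example of a hard-coded endpoint and one common mitigation.
import os

# Anti-pattern: the address lives in the source. A lapsed DNS record, an
# expired certificate, or a decommissioned box now needs a code change,
# a rebuild, and a redeploy before anything works again.
DEFAULT_LICENCE_SERVER = "https://licensing.example.internal/activate"

def licence_server_url() -> str:
    # Mitigation: resolve the endpoint from configuration at runtime, so
    # operations can repoint it without touching the code, and keep the
    # hard-coded value only as a documented last-resort default.
    return os.environ.get("LICENCE_SERVER_URL", DEFAULT_LICENCE_SERVER)

if __name__ == "__main__":
    print("Talking to:", licence_server_url())
```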

Bruce Large (:

This brings back painful memories in a previous life when I worked in the railway industry of hardware certificates on locomotives and how do you support that? And for heaven's sake, do not lose that certificate authority or it's going to be an incredibly bad day.

(:

Yeah, no, it's exciting. And I think that's why I love OT. It interacts with the physical world, whereas IT, information systems, they're interesting, but it's that interface of information systems to the physical real world that I quite like. And I'm looking forward to getting down to Newcastle and hanging out with some OT people and your beaches. I've seen the photos. They're quite nice, Cole. So yeah.

Cole Cornford (:

You better come out. We'll do a walk down [inaudible 00:33:27]. I can't say that you'll enjoy it particularly much because, like me, you enjoy beer and T-bone steaks, and that isn't conducive to 2.5k hill walks down to the beach and back up again. Especially if you're carrying a baby like I did last time, on Tuesday.

Bruce Large (:

Yeah, no, it's going to be weird if I'm carrying a random baby. Just me, man. You can carry me down the hill. How's that sound? So, good one. So yeah, look, I think definitely I'd encourage people, if you are working around even internet of things, industrial internet of things, right? The 62443 standard series is actually a really great reference for you to use, and I think it's something that's familiar to engineers as well, right?

Cole Cornford (:

Yeah. I wanted to shift gears a little bit because I noticed something you're quite passionate about, which is effectively belonging to a union and industrial relations and how important it is for us to be thinking about that.

(:

And I've worked at companies in the past where it's literally never come up in conversation because the general vibe is, well, if I don't like my employer, I'm just going to go get a new job as a software engineer somewhere else.

(:

And effectively, there've been so many cases where large companies have colluded to suppress wages. We've seen that in the mid-2000s with all the big tech firms refusing to poach from each other. And then eventually, Facebook coming in and then just starting to raise the bar on salaries and getting rid of the artificial cap and absolutely disrupting the industry.

(:

Now, we're not going to get a benefactor company coming to Australia to start doing this kind of stuff, but the other way is actually to look at joining an organization like Professionals Australia. So would you be able to talk about why it's important to be thinking about... Also, as an aside, I'm a business owner and I'm still going out there to talk about unionization, so just...

Bruce Large (:

No, definitely would love to. And I'm actually the incoming president for professional engineers, so Professionals Australia and some of your viewers, sorry, listeners, viewers in Australia might remember.

Cole Cornford (:

Either way, whatever.

Bruce Large (:

Some of your listeners may remember APESMA right when I was at uni. So I graduated uni mid-2000s, APESMA. I think it's the Association of Professional Engineers, Scientists and Managers Australia. APESMA rebranded to Professionals Australia.

(:

We have multiple groups. One of those groups is professional engineers. Another group which would be most applicable to your audience is MPD. So the managers and professionals division, and that covers IT workers, technology workers.

(:

We also cover scientists, pharmacists, architects. And we also have alignments to things like local government and Commonwealth front. Now, one of the groups in Professionals Australia is called the TSI, which is technology, software and information technology.

(:

So TSI cuts across all those divisions and basically represents the STEM workers. So science, technology, engineering, mathematics. And it's an exciting time for our industry. We're growing. There's a lot of investment, there's a lot of people joining the industry. And people like us, we have a lot of experience, we have a lot of power.

(:

We have the ability to easily move jobs, but unfortunately our peers, our more junior peers, comrades, if you will, they don't have that power. And I think it's important that as an industry, we say, this is the minimum standard we accept, right?

(:

This is how we want to treat each other, and this is how we think as an industry we should behave. And unionism is a way to allow workers to group together and to use their power to make it a more equitable conversation with management.

(:

And what's actually cool about our union is we actually represent managers as well. So managing is hard. You and I, we both know this. We've had these conversations. People aren't like machines. People are subjective. And we all definitely have good days and bad days.

(:

So I think it's important that as a union, we find that fair outcome for all parties. So we want to find win-win situations. We don't want to force a company into a point of them being completely [inaudible 00:38:11], but at the same time, if that company can only exist by not paying its workers what they deserve, then they probably shouldn't exist.

(:

And on top of this, as we said earlier, some of these things, we should make decisions as a society that we do value quality, or we do value the safety of these systems rather than just the cost.

(:

So yeah, so I would definitely encourage anyone on the podcast if they have questions about the union, hit me up and definitely check out professionalsaustralia.org.au, right? It's the best way to see and connect with the union movement.

(:

And I think it's important we talk about it, like you said. I feel like for some reason, it's become a taboo topic when it shouldn't be, right? Just this very evening, I'm heading out to the Labor Day dinner.

(:

It's important to remember it was actually Stonemasons in Melbourne that did the eight-hour day, right? Prior to that, I think it was 10 hours weekdays and eight hours on a Saturday.

(:

But it was actually the Stonemasons in Melbourne working together in solidarity to actually achieve positive change for everyone. And the more you look into it, things like sick pay, long service leave, superannuation, these are all good labor, working class union outcomes.

(:

And we're knowledge workers. We have a different background, but I think it's important that we actually work together to get a better outcome for all rather than just a select few becoming incredibly wealthy off the labor of others.

Cole Cornford (:

Yeah. And I really wanted to bring this up as well, because recently, I've been seeing a lot of people in the cybersecurity space, working across penetration testing and governance, risk and compliance, but almost always in consulting firms, starting to lose their roles and then having difficulty re-entering the workforce.

(:

And a big part of that is that they effectively have no support network. And now they're cutting each other down trying to get whatever's available scrappily in the market. And it doesn't have to be this way. So we can support each other, we can work together.

(:

I know, again, it's weird because I am technically the management and I have the power and I am the bourgeoisie, and people will start fighting me about that kind of stuff. But-

Bruce Large (:

The revolution will spare you, comrade.

Cole Cornford (:

That's it. Maximilian Rob [inaudible 00:40:28] said to me, "Cole, today, your head's okay."

Bruce Large (:

No. And look, I think there's a furphy here too, that if you're a manager, you're a bad person. And I think I'm a very proud member of the Australian Labor Party. I like to say I'm economically literate, but not a... We could probably beep that out.

Cole Cornford (:

Straight to the primos. I'm economically literate.

Bruce Large (:

Yes, yes, I have a master's degree in finance, I'll have you know. But no, I'm part of a group called Business with Labor, and I love our tagline of commercial outcomes with a social conscience.

(:

So you can make money, but you can be a decent person while doing it. You can make really positive engaging workplace experiences. And I know from our conversations, you 100% do that, mate. And just because you own a business, doesn't mean you can't be a decent boss. And I think too often, companies get too big and they lose that magic number of 150.

(:

I mean, you would've seen this in startups just that once that company gets too large and it stops being about the people and the interactions and the culture, and it just becomes about profit and shareholder value and things like that.

(:

So yeah, so I think it's an exciting time for our union. There's a lot going on. There's quite a few high profile employers where we're trying to get enterprise bargaining agreements set up.

(:

Definitely, even if you just check out Professionals Australia on LinkedIn, there's some really great... I just posted the other day, actually, I can send the link through for you and the viewers, but just like, why do you need an enterprise bargaining agreement?

(:

It's a really important tool to make sure there's equity in the workplace. And unfortunately, there's just not a lot of literacy in industrial relations. It's just become this, dare I say, it's probably that Howard era of attacking unions.

(:

And we've just seen this constant eroding of union power and also union membership, right? Union membership is incredibly low at the moment. I think it's 15% in the workforce. And I think as the software and knowledge workers, we're an incredibly new industry.

(:

We're probably what? 20, maybe 30 years at best. So it's a good time. If you don't really know, you should go educate yourself, right? And yeah, there's definitely people here to help you with that.

Cole Cornford (:

Yeah, I think there's an unfortunate view that unions are for people like construction workers, nurses, miners, teachers, and they're not for us, because we're going to be the business owner who is rich and exploiting the labor of all the... Even though you are the exploited class in this case a lot of the time, and you just may not realize it yet.

(:

So anyway, enough about politics. Go check out Professionals Australia at some point, not sponsored. Mate, this has been a really long and awesome conversation with you. I have two last questions to power through before I have to rush to daycare because I know that's my daughter, she loves me, but not that much.

Bruce Large (:

You got to get the priorities, mate. Yeah. So hit me. What do we...

Cole Cornford (:

First question, first question for you is the best purchase for under 100 bucks?

Bruce Large (:

Ah, I'm going to be cheeky. You didn't specify what currency. I would actually highly encourage people to join ISA. So if you join ISA.org, I think it's $70 US, so a little bit over 100 Australian, but it's a great way to get access to the ISA standards. And also, you can actually join an ISA, I think it's a branch. And in that, you can actually connect with other security practitioners.

Cole Cornford (:

Yeah, there you go, go through some books, get some more knowledge. Question number two, speaking of books, is what's the best book to give a person who's in InfoSec?

Bruce Large (:

Yeah, for OT, I think there's two. So if they're a blue team architect or engineer, that kind of role, I'd actually highly recommend Knapp and Langill's Industrial Network Security. To me, it is the textbook of OT security. I had a Facebook reminder pop up, I think it was 13 years ago, I was reading it, edition one. It's quite a good book.

(:

I think for the pen testers, I quite like Hacking Exposed: Industrial Control Systems. And I know there's also another Packt industrial network security book as well. But I think it depends on your focus. If it's blue team, I'd say Knapp and Langill. If you're red team, either of those two books, and I can send through the links for you.

Cole Cornford (:

Yeah, that'd be good. And how about if it's not just to learn pure security stuff, what would you give to someone to just help them grow as a person?

Bruce Large (:

Oh, grow as a person. I don't know about growing as a person, [inaudible 00:45:21] there. How to Win Friends and Influence People. So just a general book or?

Cole Cornford (:

Yeah, just a general.

Bruce Large (:

Because I actually have two if I can.

Cole Cornford (:

Okay, go ahead.

Bruce Large (:

I think for the OT one, and it's not technical, but I'd really recommend the book called Sandworm by Andy Greenberg. He's a journalist for Wired. It's a super great book. It talks about the evolving fifth domain of warfare.

(:

In terms of just general book, I actually really like a book called Design Your Life, I think by Bill Burnett. It's a book from, I think it might be UCLA in the US, but it's a really good book about making informed decisions and actually taking control of what you want to do in your life.

(:

And helping people connect with what they like, what they want to do, and just really giving structure to some really big decisions you're going to make. So yeah, Design Your Life. I think it's a really good book. I've recommended it to a few people and hopefully people like it.

Cole Cornford (:

It's better than just going on autopilot. I know that I've spent a lot of time in my really early twenties just figuring this out because I had a couple of critical life events that just made me just say, wow, okay, life is super short. And I need to start figuring my shit out right now. Because I at this point, want to be married and have kids.

(:

At this point, I'd like to have financial security. At this point, I want to be doing philanthropic stuff and giving back to the community. Now, I've weirdly enough achieved those kinds of goals, which is not really how Design Your Life is supposed to work. You might get there and find your timeframes were a little bit ambitious, but I highly recommend people doing that introspection and thinking about where you want to be.

Bruce Large (:

Yeah. And I think for some of these things where it takes a lot of time and effort to get to where you want to go, it's very hard to do it ad hoc and at random. So nice one. Well, I'll tell you what, it's been a pleasure, Cole, really enjoyed it.

Cole Cornford (:

Yeah, thanks for coming on, mate. I hope you have a wonderful dinner tonight, and I'll see you at the next BSides.

Bruce Large (:

Definitely, mate. No, I'll be flying the flag tonight having a few beers. And mate, very much looking forward to seeing you and everyone at BSides Brisbane.

Cole Cornford (:

Thanks a lot for listening to this episode of Secured. If you've got any feedback at all, feel free to hit us up and let us know. If you'd like to learn more about how Galah Cyber can help keep your business secured, go to galahcyber.com.au.
