Open-Source Software: Balancing Innovation and Security with Ilkka Turunen, CTO of Sonatype
Episode 38 • 10th October 2024 • Secured by Galah Cyber • Day One

Duration: 00:46:56


Shownotes

Episode Summary

Ilkka Turunen is the CTO at Sonatype, a company that helps millions of software developers use open-source software while minimising security risk. In this conversation, Ilkka chats with Cole Cornford about the benefits and risks of using open-source software, how Maven helped standardise software development processes, the different approaches to AppSec regulation in Australia and Europe, and plenty more.

Timestamps

1:33 - Ilkka's career background

4:00 - Varying quality of open-source software

6:10 - How Maven helped standardise software development processes

13:00 - The balance between speed of delivery & quality

17:00 - Importance of environment parity in software dev

21:40 - Risk of using 3rd party code in software

25:10 - Regulation of AppSec in Australia vs Europe

32:10 - How new European software security regulations will be enforced

35:00 - Recommendations for compliance with European regulations

39:00 - Rapid fire questions

Mentioned in this episode:

Call for Feedback



This podcast uses the following third-party services for analysis:

Spotify Ad Analytics - https://www.spotify.com/us/legal/ad-analytics-privacy-policy/

Transcripts

Cole Cornford (:

Hi, I'm Cole Cornford, and this is Secured, the podcast that dives deep into the world of application security. Today, I've brought on Ilkka, who is the CTO at Sonatype. Ilkka is an incredibly smart person. He's been working at Sonatype for a bit over 10 years, specifically dealing with the software supply chain and open-source security risks. In this episode we do a deep dive into everything from effective patch management (do we go to N-1 and N-2? How did we get to that point?) to the different origins of how package management systems came to be. Sonatype are the owners of Maven, and Maven was one of the first ways to actually get people to have a standardized method for pulling in packages instead of just vendoring and copying code from the internet. And eventually we move into what the future of regulation looks like for the sector. I know a couple of our listeners are probably going to be looking to export into Europe, and so NIS2, which is going to be coming out in the future, is something that we do spend a little bit of time talking about too.

(:

But yeah, Ilkka is just lovely, a very good communicator, super kind person, and willing to answer any questions about software supply chain, backed with a wealth of experience. So I hope you enjoy this episode.

(:

I'm here with Ilkka. How are you going, mate? Good to see you.

Ilkka Turunen (:

Yeah, really well, really well. Thanks for having me.

Cole Cornford (:

So Ilkka, I think it'd be good for my audience to just get a bit of background about yourself and what's your experience in software security?

Ilkka Turunen (:

Yeah, sure thing. So my name is Ilkka Turunen. I'm the field chief technical officer at a business called Sonatype. Sonatype are a software supply chain security specialist, which is a very fancy way of saying we care a lot about open source and dependencies that go into software, rather than the actual code that you write yourself. So I'm kind of a weird entrant into the realm of security, because I'm not a security person. I'm actually a software engineer, and I stumbled on this because I was brought up in the era of cloud adoption, DevOps, move fast, break things. I worked in a lot of projects where we cobbled things together and built CI/CD systems. We built the software based on a huge amount of open source.

(:

And then one day I got this pitch from this small company going, "Hey, have you ever thought about all the security vulnerabilities you might be having and those things?" And I thought, "Surely, open source knows what they're doing, and even more so, I know which open source to pick." And it turns out I sure didn't, and it turns out there's a lot more of it than I thought that there would be. So I've been working with Sonatype for the past decade, and in that time we've been seeing the evolution of people leveraging open source from an ad hoc perspective to open source basically becoming the major source of software innovation in most businesses. And most businesses today are software-defined businesses. And then nowadays, of course, we're really seeing a global expansion of minimum software security standards and that sort of thing. So somewhere in the middle, I went from writing code myself and thinking, "Hey, this is super cool. [inaudible 00:03:19] release goes out," to realizing that actually there's a lot of things that hang together there and rely on one another.

Cole Cornford (:

It's crazy that we think that everyone else has kind of figured out what they're doing in life. And you go speak to the doctor, and the doctor's looking all professional in his suit and tie, and you're like, "Wow, that guy really knows what he's doing." And he's just sitting there thinking to himself, "I'm two weeks into being a GP. I guess I should keep wearing a tie because people will respect me a bit more." So I understand, we think about open source developers and we're like, "Man, if they're that game to be putting all this stuff on the internet, it must be secure and sophisticated and great, and they're awesome engineers." And a lot of the time they're just like, "I'm really annoyed that I get cats in my front yard. So it would be good if I just open sourced a product that can help other people get rid of cats by playing a high-pitched noise randomly for no reason when cats come by in front of my ML model." Not that I built something like that, but-

Ilkka Turunen (:

Yeah, yeah, no, I mean that's exactly it. And the biggest breaker of that illusion is just to go read a mailing list for a project or look through a couple of PRs and their comments. You very quickly realize they are very talented people. Actually, if you measure sort of security outcomes, open source projects are way better than any business on the planet at releasing new versions, at creating sort of secure software. So in that sort of sense, they do know what they're doing, but they're very focused on that one domain, because: do one thing, do one thing well. And so often that's not how that software is actually used when you bring it into a bigger context. Like, hey, somebody repurposes your cat-identifying ML model to identify squirrels. It turns out that it gives you false positives on red squirrels or something, and I raise a PR about that. You might fight me on it because you're like, "It's not about squirrels, it's about cats," et cetera. So there's, I think, sort of compounding effects sometimes when it comes to that.

Cole Cornford (:

Open source... I love the ecosystem, and I think a lot of people just pull the components from it that they're going to use. And then, like you said, you've got the skunkworks of all sorts of random things that are being used in ways that they are not intended to be used. And then sometimes you're like, "Wow, I've got a missile, but why is it running Jenkins?" I wouldn't be surprised at this stage with what I've seen in my career.

Ilkka Turunen (:

I'm pretty sure that that exists. And I'm pretty sure that Jenkins is at 1.6 and nobody's touched it for five years, because if I touch it, it'll break.

Cole Cornford (:

Who's going to be walking up to a missile and being like, "Yeah, better trigger [inaudible 00:05:53] and see what happens."

Ilkka Turunen (:

Yeah, I'm not sure about missile [inaudible 00:05:58]; pressing the big red deploy button is probably not what we mean in the software business.

Cole Cornford (:

So open source, I guess it's quite challenging though, because you get a lot of insight... because I believe Sonatype does a lot of work, or is a custodian, of Maven, is that correct?

Ilkka Turunen (:

Yeah, so we run Maven Central. So we were actually co-founded by the two original creators of Maven itself. I think the very first [inaudible 00:06:23] was somewhere around 2008, and originally it was actually a software consultancy, but then Maven, the technology, took off. In case you're not into esoteric software build tool history, Maven was sort of the very first programming language build tool that introduced this idea of a standardized build process, which is kind of an odd thing to even say nowadays. But back in the day, building software was... basically, you build your own code. If you needed dependencies, or you needed libraries, you needed to put them into the class path of the software, which is either an environment variable, or a parameter to the runtime, or something else. Imagine DLLs in a folder. You needed to make sure that they were all over there. And it was a very manual thing.

(:

I used to talk to people, sort of mid-2000s, who were build engineers who were very happy that they were able to compile a piece of software and put out a release with all of these dependencies in a week. Because a big part of that job was about, hey, remembering how to run the build, then how to [inaudible 00:07:23] dependencies, put that in. Basically what Maven did was really introduce this idea of a standardized build with standardized steps in the lifecycle of that build. And one of the major innovations of that was that they decided that dependencies should not be something that you source yourself. You shouldn't be going to Lycos or AltaVista or whatever and finding random jars on the internet. They just created a little CDN that the tool would automatically call to download dependencies. So if I needed a framework, I'd just say, "I need this framework," and the tool would do that for you in the background.

(:

If that sounds familiar, that's because every programming language today works exactly like that. So you have a package.json or a POM file or a Gradle file or whatever. You write out the dependencies that you need, and the tool knows where to fetch them from and automatically resolves them, and then it'll look at their dependencies. So we run Maven Central; it's the default location for all open source that's used in the Java-derived languages. So Java, Scala, Android, that sort of thing. This year we're probably going to serve somewhere around 1.4 trillion requests from it. So that's 12 zeros. That's what I call a very big number.
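To make that concrete, a dependency declaration in a Maven POM looks roughly like this (the artifact and version are chosen purely as an illustration):

```xml
<!-- Fragment of a pom.xml. Maven resolves this dependency (and its
     transitive dependencies) from Maven Central automatically at
     build time; the coordinates below are just an example. -->
<dependencies>
  <dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-lang3</artifactId>
    <version>3.14.0</version>
  </dependency>
</dependencies>
```

The same pattern shows up as a `dependencies` section in a package.json or a Gradle build file: the tool, not the developer, does the fetching.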

Cole Cornford (:

That's a lot. Thinking of... because I've read all sorts of source code, heaps and heaps of Java Struts and Spring apps from a long time ago, and even before then, like Perl scripts and so on. And most of those older applications, circa the 90s and 2000s, they do vendoring, where they go off... For people who don't know what vendoring is, they basically would grab a dependency from a third party, and then they would copy what's necessary out of that dependency and bring it into their source code, and then they would be in charge of it at that point. You're basically purchasing it and now it's yours.

(:

Whereas the default model that we operate with nowadays is to almost never do that. It's to say, "Cool, we don't want our developers to understand React, and we certainly don't want to have them have to maintain their own version of React. So we're just going to download the latest version of React and use that on a continuous basis." And so I think it's really cool that you've been part of that journey really early on to get people to... It's something that sounds stupid, and then eventually everyone's like, "Wait, actually, that's really smart." I can't think of a good example. And I don't know, what do you think? What's another example in history where someone's gone out and done something and then you look at it and you're like, "Wow, that just makes sense now."

Ilkka Turunen (:

So funnily enough, manufacturing itself. So if you think about the automotive industry, this sort of classic story, right? Henry Ford made the Model T and that was an affordable car for the masses that popularized the car. And now we all go drive down the road in our Holdens, or whatever you guys drive down in Australia, right?

Cole Cornford (:

You'll make some Australians very sad by saying that because Holden went out of business like ages ago. Well, it's a classic now. Classic car.

Ilkka Turunen (:

Yeah. I'm so out of touch with that though. But the innovation in that wasn't selling the car for cheap. The innovation was, hey, at that point cars were sort of a luxury good for the rich and well-heeled to buy. You bought it on a whim because you were a lord of a manor or a really rich inventor or whatever, and you bought it as a leisure thing. You drove it around in a field or maybe on a country road, and then you put it back in the parking lot. And what the Ford company did was realize that actually if you don't manufacture every single nut and bolt yourself, if you put it together in a standardized fashion and you put them out with some sort of production goals in mind, it turns out there's an economy of scale that you hit at one point, and you put it out faster and with higher quality, and it's easier to maintain down the line as well.

(:

And when you think about what it did to the popularization of the car, there was a lot of demand clearly for it. So they needed to meet it. So they came up with that solution. The same thing happened with this in software. If you think about the early 2000s, yeah, there was a lot of software, but it was a sort of very specific discipline. It was only constrained by your ability to hire the right freaks and geeks to write it for you. Versus once this concept of modularized development became a thing, it was much more about just deriving business value and filling in your own unique thing. So lo and behold, more software exploded into the market. Today almost everything is so software-defined; almost every business relies on a huge amount of pre-made innovation. So I think that sort of paradigm shift tends to happen in any sort of manufacturing thing, even though we don't like to think about ourselves as that. And I think the secondary effects of all of that are to follow for us as well.

Cole Cornford (:

Well, what I'm interested in is: despite the fact that we've basically tried to replicate advanced manufacturing from Toyota, because we talk about systems like Kaizen and Kanban and using repeatable build processes to try to get high-quality outcomes, we still produce terrible software constantly. And I'm wondering what approaches and regulation we should be looking at. But it seems that the industry at large is rewarded for delivering at speed. But I know that you've got something that you've been cooking up about whether it's better to have something that's high quality and really secure, or whether it's better to do something at a faster velocity. Can you tell me a bit more about that piece that you're thinking of?

Ilkka Turunen (:

Yeah, yeah, yeah. No, for sure. So a funny thing about lean, by the way, is that lean manufacturing was invented by this guy called Dr. W. Edwards Deming, who got shipped to Japan in the sixties and really started helping out Toyota to do that.

Cole Cornford (:

Didn't he do that in Detroit as well? He took all that back to Detroit, didn't he? Yeah. Yeah, he's good. He's smart.

Ilkka Turunen (:

Yeah. And actually there's a really good interview of him from the eighties that you can find on YouTube. It's like the best remedy for corporate fatigue you'll ever find.

Cole Cornford (:

Well, I'm going to go find that interview and I'm going to give it to all my listeners afterwards.

Ilkka Turunen (:

I definitely recommend it. But the sort of fundamentals... There's a lot of smart thinking that he introduced into it. And we then later, as a software industry, basically stole it and made lean and agile a thing for ourselves. But the one thing we haven't really done... One element of his manufacturing process was this idea of supply chain management. He actually had some rules about how auto manufacturers could optimize their car throughput, and one thing that he said to them was to minimize "the amount of third parties that you consume." So I just said that Ford invented the idea that, "Hey, maybe you don't need to manufacture everything in the factory, just assemble it." But what Dr. Deming basically said to Toyota is, "Yes, but don't rely on an infinite amount of things. Set some minimum standards about what kind of bolts or brake pads you're going to put into the car, and minimize the amount of suppliers that do that."

(:

Because the natural inclination is: a brake pad is a brake pad is a brake pad, right? As long as it grips the wheel and stops the car, what's the difference? Well, yeah, if I get a brake pad from a cut-price manufacturer, they might have, I don't know, put sawdust in a third of it. So it turns out the stopping power is nowhere near where a well-manufactured one's can be. Although it's not a direct analogy, and there's actually a lot of controversy in the software world about talking about manufacturers and producers, it's sort of the same thing as well. Deming's insight was really that as a business, you do need to set some standards on what you're using to build your things, because it has an impact on the quality, on the manufacturing, even on perception and safety later down the line.

(:

So one of the things that we do as a business is we publish this piece of research every year. We look at how people are consuming open source, how people are then using it, what good looks like in terms of products, like open source components. But we also look at companies: how are they leveraging it? How are they leveraging DevSecOps? Are they being productive? Are they secure? And one of the things that we looked at a couple of years ago was really this sort of intersection... Simple question to ask. There are companies that prioritize productivity: get the thing out the door as fast as possible, move fast, break things, that sort of thing. There are also companies that are highly regulated and really go at it from a security-first perspective: "Hey, make sure that you've complied with all the standards. Make sure that you do all the things." And when you measure both security and productivity outcomes, you get this sort of grid. And it turns out that the companies that had the highest joint security and productivity scores weren't the ones that prioritized one or the other. It turns out that those were the companies that had an equal balance of security and productivity maturity, and what we would probably call DevSecOps adoption. Turns out that they were by far the most productive and they were by far the best at dealing with security incidents. So a new incident came in, they cranked out a fix much, much faster than comparative peers. It is sort of a surprising outcome when you think about security actually helping productivity. But as we were kind of talking about in the pre-show, it makes a lot of logical sense when you think about it.

Cole Cornford (:

Yeah. When I do a lot of consulting... and people, feel free to steal my ideas, it's all good. Love you all. Buy Galah Cyber. So when I do consulting at a lot of businesses, I do look very heavily for good software engineering practices, like having environment parity. It's a common one. And I know that most businesses seem to have quite distinct environments between development, test, and production. And security functions will say, "Yeah, but you don't put your dev tools in your dev environment. And then in the test environment you want to have some parameters on, and the production environment needs to have all the protective controls in place." But then what you end up with is that something in production just doesn't work, and you can't replicate it on your machine in the dev environment or in a test environment, and it just doesn't make sense to you.

(:

And then if this is a security incident context, instead of just a normal SRE situation, your team is in a world of hurt, because now you have no real ability to actually diagnose the issue with good observability, and you can't test whether what you're doing in your dev and test environment is actually going to fix the problem in production. And so I always push really hard to say, "Well, why is there so much disparity between these environments?" And it's not just related to that. It could be build reproducibility. I've met heaps of places that say, "We don't have the source code for this application that's really important and powers our entire business." And I'm like, "Oh, okay."

Ilkka Turunen (:

Good luck with that then. Because you're absolutely right.

Cole Cornford (:

Where do you go from there, right? You just have to build a new application or you just have to accept the risk and hope your business doesn't collapse overnight, right?

Ilkka Turunen (:

You're essentially outsourcing all the... You are assuming all the risk and you're outsourcing the resolution of that risk to someone else. That's really what ends up happening in that sort of scenario. Unfortunately, it's quite a common phenomenon. Build reproducibility is sort of the number one issue, for example, in supply chain management as well. There is no guarantee that a piece of open source is going to be distributed tomorrow. Your build may have succeeded today. Every day you run a new build, it fetches Express from npm, you pull Struts into your Java application or whatever. There's no guarantee that that's going to be there tomorrow. Open source projects are unpublished all the time.

(:

And so one of the sort of funky things that we often run into when people start on this sort of supply chain management journey is: how do you assure the reproducibility of just a historic build? I need to now do a rollback, minus two versions down. Turns out that that particular library or framework was undistributed because it had a breaking bug, or the project decided that it was the wrong major or minor version release, or something else, all of which has happened. That causes huge amounts of build failures down the line, because it's very hard to debug. There's nothing wrong functionally with your code. It's actually a completely external dependency, and it's now having an effect. And so even more so, when you don't even control the source code of your thing or have the ability to reproduce it, you are placing a huge amount of business risk on yourself, because you absolutely cannot mitigate that risk in any other way other than paying a lot of money for it.

Cole Cornford (:

And that's why, I guess, commonly I like to say that you need to have a mirror, like a local mirror. I know that Sonatype provides Nexus as one example of being able to mirror repositories. And then at the very least you have control over those artifacts... You can get future versions of those artifacts and keep the historical ones. But if something happens to npm or to PyPI or to whatever, and you now have no ability to go and access them, because say the creator just takes the package offline, or it's a transitive package and it just breaks previous builds, that's going to affect you downstream. And I always find it... Another interesting thing I commonly come across is that people... They're told by security functions to always patch and be up-to-date no matter what.
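As an illustration of the mirror idea, routing Maven builds through a local repository manager is a one-file change (the URL below is a placeholder for your own Nexus or other repository manager, not a real host):

```xml
<!-- Fragment of ~/.m2/settings.xml. All dependency downloads go
     through the local repository manager, which caches artifacts
     so historical builds keep working even if an upstream package
     is later unpublished. The URL is a placeholder. -->
<settings>
  <mirrors>
    <mirror>
      <id>internal-mirror</id>
      <mirrorOf>*</mirrorOf>
      <url>https://nexus.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
```

npm and pip have equivalents (the `registry` setting in .npmrc, `index-url` in pip.conf) pointing at the same proxy.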

(:

And that's often quite conflicting with engineering functions, in my experience, because engineers want to know that they've got a pinned version, and they're going to keep it at that version because they know it works. And if they are constantly keeping to N, then it's going to break, and then they're going to be constantly firefighting to keep up, because of the pace of how much things change on the internet. But then every time that I hear security teams advocate for that, they also then start whinging about the fact that they can't trust the components they're downloading from the internet anyway. And it's just like, they'll say, "Oh, look at XZ. What happens if they downloaded that and kept an N version release?" And then this is why I have a job, being able to have pragmatic outcomes for these kinds of situations. But yeah, I don't know if you've found a way to balance those two conflicting goals very well.

Ilkka Turunen (:

So the reality is that the first thing that every security team should probably do is recognize that we are using a lot of external code, as an industry and as a business. In fact, according to research, about 90% of the binary footprint of any piece of software is actually external in nature: third-party open source dependencies, libraries, frameworks, APIs, et cetera. And this [inaudible 00:22:07] all the way down. You can look at containers. They have a huge supply chain. You can look at AI models. They also have a huge supply chain of inheritance and things like this. There's always risk. You cannot have a hundred percent secure piece of code guaranteed at all points in time. Things are also discovered over time. So the attitude cannot be, "How can I be assured that it's the best possible, most secure version?"

(:

The attitude needs to actually be, "What is the best version to go to, in the context of the version that I'm on right now?" And we actually have done some science about that. It turns out that a good rule of thumb is not to actually go to N. The best version tends to be about N minus 2.3, which kind of makes sense, right? If you're a couple of versions behind, you always have a bit of a pressure valve. If it has some breaking changes, you can go up a couple of levels. By not always being on the bleeding edge, you also mitigate things like, "Hey, it's going to be a lot of work to upgrade." The community will have some experience, so you can understand and plan better, but it keeps you close enough to usually benefit from most of the innovation that's already there.

(:

Especially when you're talking about business-critical engineering work, it turns out that that's actually a fairly solid place. So there's that, but at the same time, anything is liable to break at any time. New security vulnerabilities are discovered at all times. Every day, we discover about 400 to 600 pieces of malware that are disguised as open source. The average piece of Java software has about 150 dependencies in it when you count the entire dependency chain. And if you work in a public cloud or JavaScript environment, you can probably say it's 300 to 400 dependencies easily at the end of resolution. So with that sort of volume, it's just going to happen at one point. So the name of the game has to be, "How do I effectively monitor all of it? How do I give my developers as much automated reaction as possible?" It cannot just be, "Hey, remember to upgrade," because then they would do nothing but upgrade. And so the way that we solve that in our product is we look at both the upgrade distance and the binary distance. So I'm on version X and there's a new issue on it. What's the closest possible version that you could go to, in the least amount of steps, that either mitigates as many security issues as you can, or this particular one? And what's the binary distance, as in, how much work do you need to plan in order to get there? So how many breaking changes are there? The generic answer is very much: you need automation, you're going to need tooling to augment that. The volume is just too huge for any one individual to keep in their heads at all times, because it's an activity that needs to occur every single day in order for you to be as secure as the industry demands.
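The "upgrade distance" idea can be sketched in a few lines. This is a toy illustration of the trade-off being described, not Sonatype's actual algorithm; the function name and the candidate data are invented for the example:

```python
# Toy sketch of the upgrade-distance idea: among candidate versions,
# prefer the one that clears the known vulnerabilities with the least
# migration effort. Data and scoring are illustrative only.

def pick_upgrade(candidates):
    """candidates: list of (version, steps_away, open_vulns, breaking_changes)."""
    # Keep only versions that actually fix all known issues.
    clean = [c for c in candidates if c[2] == 0]
    if not clean:
        return None  # no safe target exists yet
    # Minimise planned work: fewest breaking changes first,
    # then the smallest upgrade distance.
    return min(clean, key=lambda c: (c[3], c[1]))[0]

candidates = [
    ("2.4.1", 1, 2, 0),   # close by, but still vulnerable
    ("2.5.0", 2, 0, 1),   # clean, one breaking change
    ("3.0.0", 3, 0, 5),   # clean, but a major rewrite
]
print(pick_upgrade(candidates))  # → 2.5.0
```

The point of the sketch is the shape of the decision, not the scoring: real tooling weighs vulnerability severity, community adoption, and binary compatibility, but the output is the same kind of answer, "go here, it's the cheapest safe move."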

Cole Cornford (:

Yes. So basically keep a couple of versions behind. It's okay. She'll be right.

Ilkka Turunen (:

Yeah, well, at least start with that, and then it'll get a little bit more sophisticated as you go down the line.

Cole Cornford (:

Speaking of sophisticated, we're going to move into some policy [inaudible 00:25:15] government-y kind of stuff. In Australia we have, at least as far as I know, very limited regulation of software security, honestly. We have the ISM, which is the Information Security Manual, and there's a couple of steps in there that say, "Do OWASP." And I don't think that that's particularly useful or helpful or prescriptive about what developers need to actually do. And if I look at what's happening over in the States and in Europe, I think that it's only going to be a matter of time until we start to see sophisticated language domestically as well. How are you finding software security is regulated over in Europe? Because I know that there's a lot of SBOM stuff going on in the States, but yeah, what's happening in your area?

Ilkka Turunen (:

Well, the funny thing is Europe's actually coming out with a whole gamut of regulation that's going to apply to the European Union. In fact, in October there's going to be a new set of directives called the Network and Information Security Directive, version two, which updates a previous version. We just call it NIS2. And basically what it does is it sets sort of a designation for a lot of businesses, a lot of types of companies, that are going to be classed as sort of societally critical. And historically, that meant the government and the military. But now it actually means: if you run a social network, if you have anything to do with digital security, anything to do with handling personal information, it's a huge gamut. If you handle finance, if you handle medically sensitive information. It's sort of quite a wide array of businesses that are going to get covered by it.

(:

It's the first set of regulations that actually sets a minimum baseline of what we expect to see in a security operation. So the Americans have done a huge amount of work, and actually the European regulation even cites some of that as sort of a baseline. But what they're really doing is they're going over and above and setting it for the different verticals. Initially, it's these critical industries. Then in January of '25, financial industries are covered. And then in a couple of years' time, any product with digital elements, including pure-play software, will need to comply with minimum security standards. So it's a big range of upcoming regulations. Pretty much all of it says you need to do a minimal level of application security activity. When you build software, you have to be able to assure the regulators and the market that you are shipping your software with no known exploitable vulnerabilities, which in itself is already interesting language.

(:

And then in a couple of years' time, you're actually going to proactively have to self-certify that the software that you have shipped conforms to a standard by achieving a CE mark, which is something that we've done in physical manufacturing, never in software before. And to get there, you basically have to say, "I've done static analysis, I've done dynamic analysis, I've done software composition analysis, I've done a ton of these security things, and I have the audit trail to prove it," which is going to be a huge burden. So when we look at the European regulations, they're very much actually aligned with what the Americans are doing. The Americans have already put some of that into production. For example, you cannot today release new medical devices without certifying that the software in them has no known security vulnerabilities, and they actually check it on their side as well. The FDA will do a certification on that claim.

(:

That in and of itself is sort of a huge step change in thinking. Historically, we've always regulated ourselves by best practice and self-regulation, and this changes us in that direction. Now the biggest change in all of these regulations, in Europe especially but also in the US, is that they're updating something called the Product Liability Directive in Europe, the PLD. What that says is that if your software is found to have not been manufactured to these minimum standards set by all these regulations, then even if your end user used the software wrong but it caused them harm (and harm is defined to include data loss), they can sue you with automatic liability assumed, without the ability to contest it at all. And that's going to be a huge game changer just overall, I think, globally, in how we think about how security should be done and what the potential implications of skipping it might be.

Cole Cornford (:

There's so much to unpack there. So going back to the very beginning, you mentioned the idea of critical infrastructure. So in Australia, we have a regulatory act called Security of Critical Infrastructure, the SOCI Act. And they basically went out and said, "We're going to cover telcos, we're going to cover utilities, we're going to cover military, we're going to cover banks, financial institutions." That's what they kind of deem as critical infrastructure. There's a few other pieces like education and spy agencies and stuff. But with SOCI, it was just focused on basic cyber. So it was always like, "Let's just get people to comply with the Essential Eight and actively report on their security postures to their board of directors or whatever their governance model is." And it's fine, but there's basically nothing in there about applications. And the way that Australia works is we're very much about professional services and digging rocks out of the ground.

(:

We don't have many businesses that are pure software businesses. All the banks are very sophisticated with software. I know that some bankers are going to message me afterwards and laugh hysterically about their Java Spring apps from 20... But usually they're quite sophisticated. And we have a couple of outstanding companies of international acclaim like SafetyCulture and Canva and Atlassian and Camplify. But largely, we are a mining economy and we spend a lot of our security time thinking about how we protect our resources industry. And so it's really interesting to me to see that you've kind of recognized that, "Okay, well, software security is endemic and an issue that we need to be addressing across all of Europe." And what you're defining as critical infrastructure then slowly moves into banking and financial services.

(:

I would've thought that'd be one of the first cabs off the rank, actually, and then two years later, around January 2027, they need to mandatorily do things. That's a terrifying timeline to me, because I know plenty of systems that are duct tape and glue that do the most important things that you can possibly think of, and then saying, "Cool, those systems that were written in the seventies in COBOL and are running on z/OS with some RACF permissions now need to be considered for security." I guess that's terrifying. How is the government supporting businesses to actually think about how they're going to modernize and do a digital transformation to try to adhere to these laws? And for liability and accountability for directors, is it civil charges or is it criminal if you mess up?

Ilkka Turunen (:

Yeah, it's going to be... I think the US is going down the civil charges and criminal charges path. So for example, today already the Federal Trade Commission has the ability to directly pursue prosecution against businesses. I think following the CrowdStrike incident, their executives are currently under that sort of duress. And you look at something like SolarWinds, they did a pretty famous case on that. I think in Europe, what they're also doing is they're arming the regulators with a lot more teeth, a la GDPR, to be able to sanction businesses. So for example, non-compliance with NIS2 could also mean that your directors are barred from running a business. But it can also mean GDPR-style fines of up to 2% of global annual turnover, which means that it's a pretty serious sum for most businesses, for each breach that's discovered. Now the sort of silver lining in most of this is they're not trying to create this regimen of "we're going to come and inspect every single business." That would sort of be insanity.

(:

But what they are requiring you to do is at least self-certify that you have done all of this thinking, and it is a terrifying timeframe. It absolutely is, because frankly, you look at the world after the CrowdStrike incident, a lot of it is run on bubblegum and sticky tape. And actually, that bubblegum and sticky tape was built by someone else. You don't actually even own it. And you're completely reliant on making sure that that's good. And I think that's really what's prompted the regulators into action. They've seen these incidents happen over the years, [inaudible 00:34:02], the SolarWinds, the Struts 2s of the 2010s, and increasingly they've seen the blast radius increasing because there's more software. And I think they've also collectively said, "A, we've got to make sure that businesses are held liable if they're not doing things, at least to some minimum standard. And B, we cannot have societal resilience being threatened by this, especially since the world's generally probably a more unsafe place now."

Cole Cornford (:

That's right. So I've got a lot of founders who listen to my podcast because they like to learn about different parts of cybersecurity. It's an interest to them, but ultimately, they're focused on growing their businesses, and I imagine a lot of these founders in the future will be looking to export into Europe. So if you're a software founder, what would you recommend for them to try to comply with this NIS2 Act?

Ilkka Turunen (:

Well, the good news is NIS2 is probably not going to affect you really. But the law to be thinking about in the future is going to be something called the Cyber Resilience Act. That's really going to be the requirement. I think it is passing this year, early next year, and it's going to be implemented in sort of a three-year timeframe, I think. What that will really mean is when you place software onto the European market, it sets a minimum time span of how long your releases need to be supported. It's generally about five years.

(:

So if you put out a release, they're expecting you to be able to provide security updates to it for five years. So when you're designing and building a product, don't go overboard with the security, but do think about the time span and maintainability of that software. The second element is that they're requiring you to ship the product with no known security vulnerabilities onto the market. And if a new one is discovered, they're expecting that within a reasonable amount of time, you have the ability to push out a release that fixes it. So in that state, and if you unpack it at the very bare minimum, it's in your interest to-

Ilkka Turunen (:

Very bare minimum, it's in your interest to at least make sure that when you're building your software, you're placing a little bit of thought on the reliability of what you're using to build it. So, for example, if I'm going to adopt, let's say, a base model, because a lot of the startups are probably AI startups right now, you're going to adopt some model or you're going to train some model, can you show the pedigree of that model? Do you know how trustworthy it is? If it's from some random Chinese account on Hugging Face, well, there's already typosquatting happening on Hugging Face. There's literally people publishing malicious AI models out there right now.

(:

The one key thing to understand is, "Hey, does it have a big enough community?" It doesn't have to be more than 15 minutes of your time just thinking about the pedigree of the model. You're probably going to get a more maintainable, more resilient supply chain as a result as well. So if you have the ability to think about the architecture, that is going to save you a whole world of pain down the line. And then when you start exporting out into Europe and you run into these compliance issues, it's not that you have to do an audit necessarily, but, if asked, you need to be able to provide some artifacts.
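That 15-minute pedigree check can be thought of as a simple scorecard. Here is a minimal sketch; the metadata fields and thresholds are illustrative assumptions, not any official scoring scheme or vendor product.

```python
# Rough sketch of a "15-minute pedigree check" for a third-party model or
# package, scored from metadata you could gather by hand. Fields and
# thresholds are illustrative assumptions only.

def pedigree_score(meta):
    """Score 0-4: higher means more signals of a healthy, trustworthy project."""
    score = 0
    if meta.get("downloads", 0) > 10_000:          # a real user community exists
        score += 1
    if meta.get("known_publisher", False):          # verified org, not a random account
        score += 1
    if meta.get("days_since_update", 9999) < 180:   # actively maintained
        score += 1
    if meta.get("license") in {"apache-2.0", "mit", "bsd-3-clause"}:
        score += 1                                  # clear, permissive license
    return score

meta = {"downloads": 250_000, "known_publisher": True,
        "days_since_update": 30, "license": "apache-2.0"}
print(pedigree_score(meta))  # 4
```

The point is not the exact weights but that the check is cheap and repeatable, so it can be applied to every dependency, not just the ones that feel risky.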

(:

So very basic standards like automated builds, builds that you can use to show as evidence that you've done a minimum level of security testing, help you say, "Yep, we're compliant. We're able to run releases. We're able to track, for example, what open source we put into each version, and therefore there are no known security vulnerabilities to us when we release it."
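Tracking "what open source we put into each version" is exactly what a software bill of materials (SBOM) records. Here is a minimal sketch that assembles a CycloneDX-style JSON document by hand; the field names only loosely follow the CycloneDX shape, the application and dependency names are made up, and real builds would generate this from the build tool rather than by hand.

```python
import json

# Minimal sketch of an SBOM: record which open-source components went into
# a specific release, in a CycloneDX-style JSON shape (illustrative only).

def make_sbom(app_name, app_version, dependencies):
    """Build a CycloneDX-style SBOM dict for one release of an application."""
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "metadata": {"component": {"name": app_name, "version": app_version}},
        "components": [
            {"type": "library", "name": name, "version": version,
             # Package URL (purl) gives each component a stable identifier.
             "purl": f"pkg:maven/{name}@{version}"}
            for name, version in dependencies
        ],
    }

# Hypothetical release with one dependency.
sbom = make_sbom("payments-api", "3.2.0",
                 [("com.fasterxml.jackson.core/jackson-databind", "2.15.2")])
print(json.dumps(sbom, indent=2))
```

Stored per release, a document like this is the artifact you would hand over when asked to show there were no vulnerabilities known to you at ship time.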

(:

So even though it feels dreadful, actually, probably, it's much easier to do when you're a founder and when you're building a new product in a greenfield environment versus if you are one of these big banks or mining companies and you have legacy infrastructure from a hundred years ago running your mainframe or whatever. That's going to be a much harder ask. So I think it's actually probably going to be a competitive advantage because you can really build to those requirements. And as we said earlier, that can actually mean that your speed of delivery probably goes up.

Cole Cornford (:

And the good thing is that if you are exporting into those markets as a founder, and let's say you build a bank, well, if it's a digital bank, hopefully you'll be able to actually comply with these regulations and everyone else is too busy focusing on getting up to speed while you're too busy building new features and getting ahead of them, right?

Ilkka Turunen (:

Yeah, that's exactly it.

Cole Cornford (:

Do the right thing from day one and you'll be good.

Ilkka Turunen (:

Yeah, exactly, exactly, exactly. I mean, but that's what this all represents. It's an evolution of how we build software and how we are expected to build software, but it turns out there is opportunity in it because we can leverage all this automation that's been built over time, all these best practices that we've all accumulated. So as a founder, I spend time really thinking about that.

Cole Cornford (:

So don't be scared. Regulation can actually be your friend, and it's as much an opportunity as it is a risk. So we're going to switch gears a little bit. Two final fast questions for you, because I'm respectful of your time. So first one, for $100, what thing has made your life awesome?

Ilkka Turunen (:

$100?

Cole Cornford (:

Let's go 50 euros. 50 euros.

Ilkka Turunen (:

50 euros, made my life awesome. You know what? I think it was my Raspberry Pi.

Cole Cornford (:

Yeah? Tell me more.

Ilkka Turunen (:

That thing. I like doing little projects, so recently I explored a little bit and I created this sort of AI picture frame that uses an e-ink display with a little Raspberry Pi. It just generates a picture every day with AI for me to look at, and it kind of looks like a print picture. But that Raspberry Pi lives on from one project to another. I think I bought it five, six years ago, and I keep squeezing more juice out of it. It's like the most fulfilling thing in my life, if I'm honest.

Cole Cornford (:

So one of my mates, he built this product over in the States. All it is is this really nice large cube, but it's got all these LED diodes on it, and basically it listens to the music that's playing on your Sonos speakers and then changes to display the album artwork of the song.

Ilkka Turunen (:

Nice.

Cole Cornford (:

So then you're like, "Oh, that's what... It's now Pink Floyd, Another Brick in the Wall." But it's called Tune Shine, so there you go. If he gets a hundred random orders around the world, he builds them all himself, good on him. But I love small little projects like that.

Ilkka Turunen (:

Yeah, me too. Me too. I'm all for it. And it's like as a software professional, I think it scratches that creative itch that we all have.

Cole Cornford (:

Yeah, I need to get back into doing fun projects, but I'm too busy taking care of my kids at the moment. But yeah, in a couple of years' time, maybe I'll be able to get Monica into programming.

Ilkka Turunen (:

Hey, all I'm hearing is extra pairs of helping hands.

Cole Cornford (:

That's it. I'll just teach them soldering. So we'll see how that goes.

Ilkka Turunen (:

That's a safe and fun activity.

Cole Cornford (:

So probably the last question today is a book. So if you were to give someone, not even necessarily in the AppSec space, but just one of your friends a book, what book would you give to them and why?

Ilkka Turunen (:

So I've actually really, really enjoyed recently No Rules Rules, the book about Netflix culture that was written by their CEO as well as a researcher, a social scientist. You'd think a corporate book like that is just going to be sort of an internal codex piece. But actually it really dissects this idea that if we empower people and make them more responsible by essentially removing checks and balances in a thoughtful way, but then also requiring what I'd call radical candor in terms of the feedback and accountability that comes with that, it turns out you create a really highly productive and highly energetic environment.

(:

In that book, they give sort of a framework of how to give effective feedback to your colleagues and to your work, and I really found it a sort of very useful framework just in general. It's not a very big book, but honestly, for the five, ten bucks you have to spend for it, it's well worth the read.

Cole Cornford (:

Yeah, because a similar book I've read a while ago was called Work Rules, which was... I think his name was Laszlo Bock, and he was someone who did the culture at Google in the early days. And I think nowadays he reflects on the book and says that it worked, but he doesn't think it would work again. And so I think it's important to remember that the culture within a lot of these businesses is often led by the leadership. And sometimes, unless you have complete conviction and managing and doing things in that specific way, it's going to be hard to replicate. I've seen people try to replicate the Team Topologies model from Spotify for building their engineering functions and then wondering, "Why is it not working?" And it's like, well, it's because you're not Spotify.

Ilkka Turunen (:

So funny enough, that's actually one of the reasons why I like that particular book over Team Topologies and others: the reason why they landed on that particular culture is because of all the missteps they took in the previous company. And Reid Hoffman actually... No, it wasn't Reid Hoffman.

Cole Cornford (:

Hastings?

Ilkka Turunen (:

Reed Hastings, I believe, yes. Need to fact check that later.

Cole Cornford (:

We'll just say it was Reid Reid. There we go.

Ilkka Turunen (:

Yeah, exactly, exactly.

Cole Cornford (:

Reed Rabbit.

Ilkka Turunen (:

Reedie McReidface, right?

Cole Cornford (:

There you go.

Ilkka Turunen (:

Any which way, he made all of those mistakes in the previous company, kind of let it get out of control. Rules, expense policies, all of this sort of stuff. And that's what I think really speaks to it: yeah, you do need strong convictions more than anything else. I mean, that's sort of the takeaway. Whatever the principles you founded your business on and whatever the principles you are running your business on, taking those is actually probably a better yardstick than doing what is considered right by the book. And in his case, he said, "You know what? I really don't want to spend my time managing all of these rules that [inaudible 00:43:55] govern the people around me, because that restricts them and restricts the innovation. That's what we want to do: build innovation."

(:

Transplanting that sort of thing onto an existing business is probably impossible because the principles that guide that business are very, very different. Yeah, you probably can't be a bank and all of a sudden turn into a no-rules environment. But that's exactly what I like about it, because they kind of break down: here's the anti-pattern that we observed in the past, here's how we tried to mitigate it, and it worked. The social scientist element of that book kind of talks about, "Well, he may say that it works really well for him as an executive founder, but actually, when I spoke to the average worker, I found that it doesn't actually happen like this in real life. These are the elements that actually worked." So yeah, I just really liked it because it's quite a balanced perspective like that.

Cole Cornford (:

Yeah, or the other option you can do is run your own company and then figure out stuff and then be like, "Wow, okay, maybe I should learn from these other people because going to the school of hard knocks is not so fun."

Ilkka Turunen (:

Is that the usual way that people's careers go nowadays? Start a company, then write a book about it, and whether or not it was a success, at least you get a motivational speaker track and an income stream for the rest of your life.

Cole Cornford (:

That's it. So all I need is to write the book and then I'll be a motivational speaker around Europe in 2027, being able to talk about software security, right?

Ilkka Turunen (:

Exactly, exactly. We'll have you over in London and we'll get you talking in all the places.

Cole Cornford (:

Well, with that, I'm looking forward to being invited overseas. Thank you so much for coming on.

Ilkka Turunen (:

It's been a real pleasure. Thanks for having me.

Cole Cornford (:

Thanks a lot for listening to this episode of Secured. If you've got any feedback at all, feel free to hit us up and let us know. If you'd like to learn more about how Galah Cyber can help keep your business secured, go to galahcyber.com.au.
