mnemonic security podcast

Feature velocity > software security?

October 26, 2020 | mnemonic

Why is it so difficult for security people to speak to developers? And the other way around…

For this episode, Robby has invited a veteran of the software security game, Nick Murison, Security Practice Lead at Miles. Nick started off as a penetration tester, and has been passionate about software security, and about training developers to think about security upfront, for close to two decades.

They speak about software security within the development lifecycle, and bridging the gap between developers and security people. Nick also explains how he believes more organizations can get security into their development, and dives into the question "is DevOps really increasing or decreasing your security risks?"


Unknown:

From our headquarters in Oslo, Norway, and on behalf of our host Robby Peralta: welcome to the mnemonic security podcast.

Robby Peralta:

Imagine a world without software. Let it sink in a bit, and you'll probably conclude that the world is definitely a better place with it. Now think about software again, but software without vulnerabilities. Let it sink in a bit, and you'll probably conclude that the world is definitely a better place without software vulnerabilities. That being said, you wouldn't be listening to this podcast, and you probably wouldn't have a job within cybersecurity. By now, I think we've all understood that software security is complex, and concepts such as DevOps are influencing its popularity within the infosec world. Therefore, I thought I'd bring on a veteran of the software security game, who has spent his entire professional career focusing on the matter. Nick Murison, welcome to the podcast.

Nick Murison:

Thanks for having me!

Robby Peralta:

So, the background for this chat: we had a kickoff for a customer of mine recently, which included a CISO trying to introduce security into their development lifecycle. And this is a smart CISO, so he was aware not to force anything onto his developers. So we basically invited everyone to a secure coding workshop, to introduce the idea of security in a fun way without being too intrusive. And long story short, that worked out great. It was the inspiration for this episode, and the colleague I was with for that workshop said that you are the man to talk to about software security. And voila, you're here today via the recommendation of my trusted advisor for software security. So congratulations!

Nick Murison:

Well, thank you. I'm honored!

Robby Peralta:

Well, I don't know if I should say "congratulations", but yeah, honored at least, because he's a pretty respected guy.

Nick Murison:

He is, definitely.

Robby Peralta:

So I'm here to pick your brain about security within the development lifecycle. Does that sound like something you have an opinion about?

Nick Murison:

Absolutely. I've got a couple of thoughts.

Robby Peralta:

Cool. But before we jump into that... who are you?

Nick Murison:

Well, yeah, as you said, my name is Nick Murison, and I've actually been doing software security for the better part of 15-16 years now. Like any security consultant in the mid 2000s, my main job was penetration testing, so breaking other people's stuff. And it was really good fun. I had a lot of fun, I broke some really interesting systems, I learned a lot. It's a bit like solving puzzles - it's a challenge, you've got to get your head into how the developers were thinking when they built something, and you've got to figure out, okay, what didn't they think of. But the one thing that kind of bugged me, after having done penetration testing for a few years, was that essentially every week was the same deal. I would go in, I would start testing a new system or application or whatever, and by the end of the week I'd written a report that essentially told the developers their baby is ugly. That got a bit tiring after a while, it got boring. So I started thinking about how we actually fix these problems. It's easy to find them, but it's harder to fix them, and the really hard challenge is to prevent them from happening in the first place. So I got involved in things like training developers to code securely, I started doing more secure code reviews, helping people do threat modeling, and more architectural security reviews to try and catch big security issues earlier. I built my career up to the point where I ended up working for a company called Cigital. If you haven't heard of Cigital before, they don't exist anymore - they got bought up by another company - but they were the originators of several modern software security concepts, including one called the Building Security In Maturity Model. It's a model that essentially tries to describe what companies are doing in real life as part of their software security programs and initiatives, and I ended up being one of the, shall we say, assessors for that. So I got to run around the world and interview developers and security people at companies all over the world, essentially ask them what they do as part of their security program, and then, based on what they were doing, give them recommendations on what they could be doing next. And that's how I've spent the past few years. Now I work for a company in Norway called Miles, which is a software consulting company that wanted to focus more on security. I think it's a kind of sweet spot, because one of the things I've learned is that it's really hard to teach security people how to speak to developers. It's a lot easier to take developers and teach them security concepts, and have them essentially be the security champions. So it's a bit like what you guys were doing with your customer recently, where you essentially turned a bunch of innocent developers into, you know, evil hackers who are looking at every app they use and going, how can I break this thing? And it's really good fun to see that change in mindset, and also to see them take the message out to the rest of the organization and help everyone become more aware of security.

Robby Peralta:

Those developers, at the end of that little workshop - which was like a tournament - they were going into their ticketing systems and giving themselves work, because they wanted to improve their own code. And that was without anybody telling them to do so. They want their software to be as good as possible. So that's one thing I noticed. In your experience over these past 15 years, what's actually important for developers, and why is that relevant for security people to understand?

Nick Murison:

So if I were to sum it up in two words, it's feature velocity, which is a fancy way of saying: as a developer, you want to get out new features, new functionality, that are going to be exciting to the customers and the users, that are going to give them more value out of the app or the system. You want to do that quickly, and preferably quicker than your competition. So your main focus is: how do we make changes quickly? How do we make updates quickly? How do we add new things quickly? And anything that hints or suggests that you need to slow down is going to be, not necessarily a nuisance, but it's going to cause friction, and eyebrows will go up. Especially if you're a developer with a tight timeline or deadline - you've got until Friday to get this new whiz-bang thing in your iPhone app to work, and someone comes along and says, can you fix these 20 security bugs? You can't do both. So you've got to prioritize.

Robby Peralta:

And they want to go fast... Is that a self-motivated thing? Or is it because some sales guys are begging them to do so? Just from your experience.

Nick Murison:

My experience is that there's definitely either sales or product management or someone else applying that pressure. A lot of the time it's a combination of needing to push out new features quickly, because sales told the customer that sure, we'll have it done by the end of the month, and the fact that maybe you only have four developers on the team, and that's actually three developers too few for what you're trying to achieve based on your backlog. But you live with it, and you do the best you can. The pressure is there, definitely.

Robby Peralta:

So what about security people? What's important for security people, and why is that relevant for developers to understand?

Nick Murison:

Essentially, their chief responsibility is to reduce and manage risk - security risk. And the easiest way to do that is for nothing to change. If the app doesn't change, if there are no new features, then security can sit there and essentially work to reduce the risk that's already there. But every time you add a new feature or change something, that's a new thing that needs to be assessed, a new potential risk that's introduced. So security and development are both working for the best of the company or the organization - they're both trying to do the right thing for the organization. But their individual sub-priorities can look like they're conflicting a lot of the time.

Robby Peralta:

But like you said, they're all going in the right direction. So I guess that's part of this podcast - we're trying to get them to understand that, right? But by the way, do you think secure code means better code?

Nick Murison:

Unless it's a trick question, yes. Some people might consider security to be a sub-part of quality. Any security bug or security flaw - and I can talk a bit about what the differences are there if you want - any kind of security weakness is essentially a quality issue as well, because it's the application or the system not responding correctly when provided with some sort of input or activity. It could be as simple as: if I enter a negative price in a shopping cart, that's both a quality bug and a security bug if it lets me do that. Quality-wise, you should be checking that it's the right kind of integer and so on, and from a security point of view, you don't want your customers essentially getting paid to take your products off your hands.
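
To make that concrete, here is a minimal, hypothetical TypeScript sketch (the cart shape and field names are illustrative assumptions, not from the episode). The same missing input validation is both a quality bug and a security bug: without it, a negative quantity turns the checkout total into a payout.

```typescript
interface CartLine {
  sku: string;
  unitPrice: number; // in minor units, e.g. cents
  quantity: number;
}

// Validate before totalling: rejecting nonsense input is a quality fix
// and a security fix at the same time.
function cartTotal(lines: CartLine[]): number {
  for (const line of lines) {
    if (!Number.isInteger(line.quantity) || line.quantity <= 0) {
      throw new Error(`invalid quantity for ${line.sku}`);
    }
    if (!Number.isFinite(line.unitPrice) || line.unitPrice < 0) {
      throw new Error(`invalid price for ${line.sku}`);
    }
  }
  return lines.reduce((sum, line) => sum + line.unitPrice * line.quantity, 0);
}

// Without the checks above, this "order" would total -5000,
// i.e. the shop pays the customer:
// cartTotal([{ sku: "ABC-1", unitPrice: 5000, quantity: -1 }]);
```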

Robby Peralta:

You must really love your customers in that case, at least. Hey, but you just mentioned the difference between a flaw and a bug, if I understood you correctly. Go there!

Nick Murison:

Okay, so it's perhaps a bit pedantic terminology, but one of the things I like to distinguish between is a bug and a flaw. A bug is essentially when the developer makes a mistake in the code. For example, the developer intended to compare two variables, but instead they used the wrong kind of syntax and ended up actually setting one variable to the value of the other one. That's a bug, and it can manifest itself in various different ways - lots of injection attacks take advantage of that. A flaw, on the other hand, is more of a design level issue. A flaw in a system might be the fact that nowhere in your architecture did you design in any kind of authorization logic, to say: okay, somewhere, someone actually has to check that the customer is allowed to do what the customer is meant to do, that the customer is logged in and is allowed to do this action. If there's a missing component, that's a design flaw, essentially. And it seems to be about 50/50 - when you ask big companies whether they have more bugs or flaws, they tend to be about 50/50 a lot of the time, maybe a little bit more bugs. But the point of distinguishing between them is to make the point that it's not just developers writing code too fast and missing something as they write. It's also at the architecture level. If you're not thinking about security upfront as part of your design, you could miss entire security components. Instead of trying to attack your mobile app on an iPhone and figure out how to steal everyone else's data, why doesn't an attacker just talk to the web-based API that your mobile app talks to? And if you've forgotten to put any kind of controls on that API - okay, yes, we think we're only ever going to talk to a mobile phone app, but what if someone tries to talk to us directly, we need to authenticate them - that's a problem we've seen a few times.

Robby Peralta:

So if I understood you correctly, that would usually be a flaw, not a bug?

Nick Murison:

It can be, yeah. A lack of security components is definitely a flaw.
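
Two minimal, hypothetical TypeScript sketches of that distinction (the names and types here are illustrative assumptions, not from the episode): the first shows the kind of code-level mistake Nick calls a bug, the second the kind of missing authorization check he calls a design flaw.

```typescript
// Bug: the developer meant to compare two values, but the wrong syntax
// (an assignment instead of a comparison) silently overwrites one of them.
function isAuthorizedRole(role: string, requiredRole: string): boolean {
  // Intended: role === requiredRole
  // Buggy variant: (role = requiredRole), which assigns rather than compares
  // and, in looser languages, makes the check pass for everyone.
  return role === requiredRole;
}

// Flaw: the design never included an authorization check at all, so the API
// trusts any caller that reaches it, e.g. someone talking to it directly
// instead of through the mobile app.
interface Session {
  customerId: string;
}

function loadOrdersFor(customerId: string): string[] {
  return [`order history for ${customerId}`]; // stand-in for real data access
}

function getCustomerOrders(requestedCustomerId: string, session: Session): string[] {
  // This is the component a flawed design simply leaves out:
  // check that the logged-in customer may read these orders.
  if (session.customerId !== requestedCustomerId) {
    throw new Error("not authorized to view another customer's orders");
  }
  return loadOrdersFor(requestedCustomerId);
}
```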

Robby Peralta:

Interesting. And now this is really random, but I can't help myself. Every time a patch comes out for some security issue, is it a bug or a flaw? Whenever somebody patches something, and people go out and reverse engineer it and figure out what changed so they can do something malicious with it - are they taking advantage of bugs or flaws? Or both?

Nick Murison:

Both, basically. I mean, if you're looking at things like the patches Microsoft puts out, it's usually a bug of some sort... well, no, that's not necessarily true. A lot of the time it's a bug, because someone somewhere has made a slight mistake, but other times it's actually a flaw, because the logic involved in some of these components is just so complex that an edge case slips through, and you could argue that's a flaw. Zoom, for example - just to pick on Zoom, which, by the way, has responded fantastically, I think - a lot of their issues were flaws to start with, because they were scenarios and situations they had never thought of.

Robby Peralta:

Yeah, and you can't blame them that much for that, right? I was just sitting there thinking: okay, you mentioned Microsoft - they probably have thousands of developers out there. So I would have thought they'd get their architecture right, that they would have thought about all these possible scenarios, so they're much more likely to have a bug than a flaw. But at the same time, it's such a complex environment, with other products and how they interoperate with each other, that it's actually understandable if they leave a piece out, because there are so many moving parts and it's so complex.

Nick Murison:

Exactly. Sometimes it's even - you know, if you're implementing a standard of some sort... not to get too technical here, but let's say you're implementing TLS, the thing that basically encrypts all your HTTP traffic, all your web traffic. If you're implementing TLS version 1.1, let's say, which is a slightly older version, that version has a couple of issues in it - design flaws. Now, you can implement it a hundred percent accurately according to the standard, the spec; your developers have done exactly what the spec says. But the spec is wrong.
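
A small sketch of how that tends to be handled in practice, assuming a Node.js/TypeScript server (the file paths are placeholders): since even a spec-perfect TLS 1.0/1.1 implementation inherits the protocol's design flaws, the usual mitigation is to refuse the legacy versions outright.

```typescript
import { createServer } from "node:tls";
import { readFileSync } from "node:fs";

const server = createServer({
  key: readFileSync("server-key.pem"),   // placeholder certificate material
  cert: readFileSync("server-cert.pem"),
  minVersion: "TLSv1.2",                 // reject TLS 1.0/1.1 handshakes entirely
});

server.listen(8443, () => {
  console.log("Accepting only TLS 1.2+ connections on port 8443");
});
```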

Robby Peralta:

Inheriting flaws, yeah.

Nick Murison:

Yeah, exactly. And you can't put the blame on, for example, Microsoft at that point - they've had 200 cryptographers look at this thing, and they couldn't find it. And then three years later, someone randomly goes: oh, hang on, if I do this, then there's a problem. So some of these flaws can exist for a long time before someone identifies them, because they're so complex and convoluted, and it takes a lot of figuring out, mental arithmetic, to work out what's going on.

Robby Peralta:

Yeah, and that doesn't combine very well with how fast all these things need to go, correct? To keep up with the business.

Nick Murison:

Exactly. I kind of want to drop in DevOps, because it's the elephant in the room and we're going to get to it. There are different definitions of DevOps - I've met a lot of customers who say, we're trying to go for DevOps, and I ask, okay, what does that mean for you? And everyone has their own definition. I like to think of it this way: if you think DevOps is automation, then you're probably talking about CI/CD - automating your build, test and deployment and everything, going from a developer writing a line of code to it being in production within an hour. That's more CI/CD, continuous deployment and so on. Agile is more about methodology - how you approach the whole concept of: we have a backlog, we have features, how we estimate this stuff, and so on. While DevOps is more about the people and the culture, more about making teams that are essentially autonomous to a certain extent, with a specific responsibility. Let's say you've got one DevOps team that's responsible for managing your payment API: they develop it, they test it, they deploy it, they maintain it, and if something goes wrong, they're also the ops team for it. That's how I think of DevOps. The nice thing about it is that if you segment your systems and architecture correctly, you can have individual teams working on individual parts, making really frequent, really fast changes without impacting anyone else, and you're very quickly and organically moving forward, creating new features and so on. Now, I attended a virtual conference a couple of months ago where I heard the following two statements in quick succession: DevOps increases your security risk, followed by DevOps decreases your security risk.

Robby Peralta:

And those are two conflicting thoughts.

Nick Murison:

They are, but they're both correct. Essentially, it depends on how you approach DevOps and what your maturity is around it. Making changes quickly is a good thing from a functionality point of view, for features and so on, because you can launch new things quickly. It can be bad from a security point of view, because if you don't have the right checks and balances in place, you're going to miss security vulnerabilities, they're going to go out into production, and someone's going to get hurt by it. But it can also be a good thing, because if you're able to make changes very quickly, and you're able to spot - when something launches, or ideally just before it launches - that there might be a security issue, you can make a change to fix that very quickly, ideally before it goes out the door. So that increased speed can be a risk, but it can also help you address risk, because you can turn things around really quickly.

Robby Peralta:

Hmm. So if I'm understanding you correctly, this is obviously a very complex environment to work in, both for security people and developers. And there's no single correct answer - it all depends on a lot of variables that no one person controls. So what are companies doing about this? How are they going about solving the issue?

Nick Murison:

Different approaches. It depends on who you are, how your company is structured, what your focus is. A lot of the companies that are ahead of the curve from a security point of view, I would boldly say, are the heavily regulated and compliance driven organizations out there.

Robby Peralta:

So finance, energy. Exactly.

Nick Murison:

Yes. So these guys have, you know - they've got a banking license. Let's take finance as an example: they have a license from their respective government to run a bank, and that license is contingent on having a lot of things in place, from a risk point of view, from a fraud point of view, from a security point of view. So you are very well versed in making sure that you reduce risk, you have a lot of rules in place to make sure that people can't commit fraud, and anything you put out into production has passed a hurdle of checks, for better or for worse. So you end up with some organizations becoming slightly paralyzed by this, because they're heavily compliance driven. And if you look at how we did compliance and security 20 years ago, it worked really well, because development was waterfall based. You want to launch a new website? Well, that's going to take three years. There's a giant Gantt chart, and the first six months are planning and requirements analysis and so on, then you've got another three months doing architecture and design, then maybe 12 months of development, six months of testing - I've lost count of how many months that is. But anyway.

Robby Peralta:

I'm sure compliance misses those days. Those were the good ol' days for them.

Nick Murison:

I think so, yeah. And it gave them defined time in the project plan to look at security and look at compliance.

Robby Peralta:

Now you're thinking about this.

Nick Murison:

Oh, it's October, we're going to think about security for the next two months. Fantastic. That doesn't work anymore, because it's Friday at three o'clock and the feature is going live Friday at four o'clock. You can't do all those things you used to do; you've got to come up with a different way of approaching it. And that's where the tech companies, the software companies, are coming up the inside lane, so to speak, because they might have less of a compliance push, but they've got more of a grassroots effort to focus on security. They've got the concept of security champions - in Norway you might call them "ildsjeler", which doesn't translate very well to English: fire souls. Sounds like a Will Ferrell movie. But the companies I've seen do a really good job of this essentially have some sort of team or department that is responsible for getting security into development. You might call them an appsec team, you might call them a software security group, or something like that. Their responsibility is to build security into how you do software development, or to help the developers do it, depending on how the organization is structured - so that the two sides actually meet in the middle. And that has been the classic problem. I'm kind of fortunate now that I work at Miles, because I tend to be talking to developers, but every once in a while I speak to security people as well, and, you know, they're frustrated, and so are the developers, because they still haven't quite met in the middle. Some places they have, but not everywhere. And every once in a while I'm talking to a security person and they just haven't quite got the mentality yet. They're still thinking that the developers are either not listening to them, or not smart enough, or maliciously ignoring them, or something.

Robby Peralta:

I was going to say, security people can think that about a variety of people... everybody.

Nick Murison:

I was definitely guilty of that for a while. And having lived through a few interesting struggles in various organizations - where I come in and I'm the security guy, you know, "here's Nick, he's here to break your stuff" - and dealing with the confrontations and conflicts that can cause, you learn quickly that you've got to get on the good side of the developers, because they're the ones actually building the new stuff, they're the ones pushing the business forward. Anyway, where was I? Yes - so a lot of companies will have some sort of appsec team or software security group, and then they'll have something you might call a secure development lifecycle, or a software security development lifecycle, or something along those lines. It's essentially an overlay of: here are the activities we're going to do while development runs, to look out for security issues - we're going to try to find issues early, we're going to try to fix them, and we're going to try to prevent them from happening in the first place. There are a lot of activities you could be doing, and it depends on where you are maturity-wise as an organization and what you're doing in development already. Are you automating a bunch of stuff, or are you still doing manual code reviews? Are you doing waterfall, are you doing Scrum, are you a bit more CI/CD? Where are you on that spectrum? Are you a flat organization, or are you hierarchical? Depending on all that, certain things are going to work better for you than others. And that's the challenge for all organizations starting off. When they ask, how do we get security into development - well, there are 120 things you could be doing. Pick three, and make those three things work. Don't try to make 120 things work; make three things work. And then, when they work really well, look at okay, what else can we do? So it has to be a journey, a maturity journey. You can't just go from zero to 200 in a year, for example.

Robby Peralta:

And that's what makes this podcast really hard, because I wish we could just give everybody the golden answer. But that's literally impossible for this topic.

Nick Murison:

It is. A while ago it was much more daunting, I think, but nowadays you're seeing a lot more technology that enables developers to take certain security responsibilities upon themselves - things that just didn't exist before. Look at what GitHub is doing with GitHub Actions and their advanced security platform, where it's literally: hey developer, do you want to turn on this feature that will tell you if you introduce a security issue? Why not? You want to add it to your, you know, shoe tracking app? Sure, that sounds fantastic, thanks. Those things that make it easier for developers to get a handle on security issues early are becoming more normalized in a lot of modern development technologies.

Robby Peralta:

Yeah, that's what I've been hearing a lot. One of the common denominators of all these conversations I've had is: put security into whatever they're doing today, into their existing process, in the least intrusive way - if you want it to work.

Nick Murison:

Yeah, absolutely. I mean, if you try to introduce an entirely separate project flow for the security stuff, one that doesn't work within the confines of how development works now, you're not going to get anywhere. But if they've got a continuous integration pipeline where they, you know, build the software, unit test the software, deploy it into a testing environment, do some other testing there, and then deploy to production - use that pipeline. Oh, they're building the software and running some unit tests? Fantastic. Why don't they run some, I don't know, static analysis for security issues at the same time? It's just adding some steps to the pipeline. Now, you do have to be careful: does that cause the build time to increase a lot? Does it introduce a whole bunch of new issues that nobody was prepared to handle? That kind of stuff. So you do have to be careful about it. But essentially, if you can do stuff that helps the developers write better code in a way that doesn't slow them down, then they're going to appreciate that.
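
A rough sketch of what "use the pipeline they already have" can look like, assuming a Node-based build driven from a TypeScript script. `npm audit` is a real npm command; `security-scanner` is a placeholder name for whatever static analysis tool the team actually uses. The security steps run in the same flow as the build and unit tests, and any failure breaks the build.

```typescript
import { execSync } from "node:child_process";

// Run one pipeline step; a non-zero exit throws and fails the whole build.
function run(step: string, cmd: string): void {
  console.log(`[pipeline] ${step}: ${cmd}`);
  execSync(cmd, { stdio: "inherit" });
}

run("build", "npm run build");
run("unit tests", "npm test");

// Added security steps, reusing the same pipeline:
run("dependency audit", "npm audit --audit-level=high");       // real npm subcommand
run("static analysis", "npx security-scanner --fail-on=high"); // placeholder tool
```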

Robby Peralta:

Huh, right. By the way, I'm trying to figure out who the best owner is - you know, it could just be that the CISO and the head of the development team need to drink a lot of beers together and become best friends. But who, in your experience, has the most focus on software security? Is it CISOs? Or is it more product directors and CTOs? We laughed earlier about sales people - who actually cares the most these days?

Nick Murison:

Well, actually, a lot of the most immediate caring is from sales, usually. If you've just lost a deal because your competition is able to show that they have a better grip on security in their application, or product, or SaaS platform, or whatever it is you're selling - if your competition can show a better security story than you can, that's really painful for sales. And they're probably going to go yell at product management and say: why don't we have a good story?

Robby Peralta:

As a sales guy, can't I blame that on the fact that they're trying to cover their a$$? I'm pretty sure they want to make sure everyone knows it wasn't their fault.

Nick Murison:

Exactly. Well, sometimes that's the case, but I have spoken to customers who have had their own customers say: we're not going to go with you, because you didn't tell as good a story as your competition. And that's usually around compliance - are you certified against ISO 27001? And the vendor that is, gets the deal. The thing you need to be aware of is that when we talk about security and compliance, very little of it specifically addresses software security. Look at ISO 27001: it's an international standard, it's been around for 20 years or so, all the big corporates have it, and a lot of corporates now - especially in Europe - expect their vendors to be certified against it too. But it doesn't say a lot about software. It does say, you know, put security into your development processes and so on, but it doesn't say much more than that. PCI - if you're dealing with Payment Card Industry stuff - says a little bit more, but not a whole lot more. So a lot of the standards are wider: they're looking at IT security, information security as a whole; they're not looking at software security. I've been in companies where I got flown in at short notice to help because they got hacked, and the head of IT security is sitting there in his office, not having a good day, and behind him on the wall is their ISO 27001 certificate, framed. That's not to say it isn't a good standard, but it doesn't mean you're infallible, it doesn't mean you're 100% secure. And if your focus has been on, let's say, infrastructure security for the past 20 years and not on software security - I think, moving forward, a lot of the issues that are going to be the big page turners, so to speak, are going to be more on the software side than on the infrastructure side. Especially as we've all become intimately familiar with so many cloud based solutions over the past six months. The concept of having a nice firewall and an internal office network and that kind of thing doesn't exist for a lot of companies and customers anymore. It's: I have a laptop, I can be anywhere, and I'm talking to a bunch of different cloud based services. It's not within a nice secure network perimeter anymore. So as more software gets exposed directly on the internet, some people are going to have rude awakenings: "but I thought we had a firewall" - well, the firewall doesn't solve all your security problems, and by the way, your app is sitting out there in the cloud now; it's not hidden away on an internal network anymore.

Robby Peralta:

Hmm, interesting. Is it true to assume that the larger the company, the more likely you are to have people dedicated to caring about your software security - but the smaller you are, the more it depends on whoever cares about security, whoever cares about sales, and whoever cares about development actually having a close relationship?

Nick Murison:

Yeah, I think that's probably true. If you're a large organization and you're regulated, then you probably have a dedicated team, simply because you have a certain amount of compliance work you've got to get done, so you've got to have some person power behind it. Smaller companies are probably not even going to have a CISO.

Unknown:

Yeah, that's a great point.

Nick Murison:

Yeah, they're probably going to end up having a couple of guys and gals sitting somewhere in the organization - who might be developers, might be architects, or a combination of a variety of different roles - asking the right questions, going: well, what are we doing for security?

Robby Peralta:

Somebody is going to do this. Okay, guys, let's do it.

Nick Murison:

It might just be a case of asking the question and doing a couple of things. But ideally, what management should be doing is saying: okay, of course we need to take this seriously. Again, you need some sort of driver, which is why sales is actually a good one - if you're losing deals because you don't have a good security story, that's a big incentive to do something about it. If you have regulatory compliance needs, then that trumps everything; that's a massive incentive to get things right. But if you don't have one of those, then you're in that space of: if someone cares about it, nurture that, because it's going to give you a lot of return on investment to raise those people up and make them champions.

Robby Peralta:

Hmm. Hey, that crystal ball of yours - you've got to look into it and tell me: what do you expect from this space moving forward? Because it's moved really quickly the past one to two years.

Nick Murison:

Yeah, I don't know - I don't want to predict anything. It's going to be more DevOps. There are going to be more phrases based on DevOps that will amaze us, and we'll have to decode them somehow; we'll need a dictionary at some point. I think the threat landscape is going to change a bit. Europe has already had a good introduction to stricter privacy regulations - and I bring that up as a threat, though I don't mean it as a threat; privacy is a good thing. But from a "what do we need to deal with as a company" point of view, GDPR is a pretty big stick to beat a company with: if you're not doing privacy correctly in Europe, then you're going to get fined - or potentially you might get fined. And I think the regulations in Asia and in the US are moving in the same direction. So your obligations as a company under privacy regulations are going to become a bigger headache moving forward, I think. So get it right now.

Robby Peralta:

I also heard somebody say that they thought GDPR was actually one of the biggest pushes for people to start caring about these sorts of things, like security in the development process - because GDPR didn't really hit the security team the same way it hit the development and sales teams, in the way you're handling people's data and so on. Do you agree with that?

Nick Murison:

Yeah, I would, actually. I participated in a workshop in Stockholm a few years ago, and it was actually the year of GDPR. So I think GDPR went live, like...

Robby Peralta:

2018, right?

Nick Murison:

Yeah, somewhere around there - and this was about a month before it went live. I was lucky enough to have quite a few people who work in finance, basically the security heads of a bunch of different finance organizations in Scandinavia, in a room together, and I was looking for a talking point; we needed to have a debate. So I thought, well, GDPR is fantastic, because it's relevant, it's coming up, and surely they would all really enjoy the opportunity to have a bit of a moan about how it's made life hard. So I went in with that expectation, and I started off and said: so, GDPR - wow, what a headache, right? And the response I got, I totally didn't expect. In hindsight, I should have realized I was expecting the wrong thing. The reaction I got from the security people was: no, we love it. It's the first time we've had budget in years. Because it's a problem, it's coming up quickly, the organization is very concerned about it, and they're thinking: okay, who do we throw money at to make this thing go away?

Robby Peralta:

This is our chance, yes.

Nick Murison:

So yeah, I agree. GDPR, I think, has actually been a really good thing - both as a consumer, where I think it's been fantastic, but also for organizations, to make them realize that, in particular from a software security point of view, you actually need to have the right controls in place in your software to prevent customer data from leaking out. And you need to be able to document that you've taken a risk based approach to this. That drives certain behaviors in development: okay, early on we need to, I don't know, maybe do some threat modeling, or some other kind of design level review, to make sure we're catching these risks.

Robby Peralta:

Hashtag secure by design.

Nick Murison:

Exactly. Yeah. Yeah. And hashtag ship.

Robby Peralta:

Yeah, yes. I've also heard that one - I was trying to figure out how to squeeze it in at the beginning. Hey, last question - and this is a really mean question, I know, and I know your answer, generally. But do you expect any regulations to come out that say: for software security, you have to do this, this, this and this to be compliant from now on?

Nick Murison:

Um, I don't know. I don't think the regulators are in the right place at the moment. You had a couple of people on from the Norwegian government a couple of months ago, and they're doing some tremendous work interfacing between the public sector and the private sector, and being a helping hand for people when it comes to: okay, we know security is an issue, we know it's a big thing, but we don't know what to do about it. The national security authority in Norway, NSM, is doing some really good work through their National Cyber Security Centre to help facilitate that. But when you look at some of the guidelines they put out, it's still very IT-infrastructure heavy; there's not much on the software side. While at the same time, some of the predictions about what's going to be a big technological trend moving forward are things like IoT and smart cities and 5G and so on - and there's actually quite a lot of software involved there. So will we at some point get more regulations around software security? I don't know if we'll get anything like that soon. But I'm hoping we get more regulations that are a bit like GDPR, where essentially it says: it's up to you to take a risk based approach to how you handle security here. We're not going to give you a laundry list of 10 things you must do, or 110 things you must do. We're going to say: you need to assess what you think is the biggest risk to your company, and address those risks. That kind of regulation, I think, is moving in the right direction. Because otherwise you get into silly situations where - why do we have a firewall in front of the office? Because the regulations say so. But there's nothing in the office; we don't even have any laptops in the office.

Robby Peralta:

Yeah, exactly. We just talked earlier about how complex these things are, and that there's no one solution. So it would be kind of hard for a government to come in and say: you have to do this - because that would be wrong for 999 out of the 1,000 people you're telling it to.

Nick Murison:

Exactly. I mean, PCI got a fair amount of criticism for their Data Security Standard, because it is very prescriptive - it's been around for almost 20 years now, I think, and it gives you a laundry list of things you must do. That actually is very useful when you're talking to companies who don't know what to do: well, here's a laundry list of things you must do. That's good. But then there are more mature companies who have come up with different ways of solving the same security challenge, and they're not doing the thing the standard says they must do. That's fine in PCI DSS: you can have a mitigating control, you can choose something else, as long as you can defend it. But more modern standards are more along the lines of: as a company, you need to figure out what's going to be the big challenge, and you need to address it.

Robby Peralta:

Hmm. Well, Mr. Murison, if I ever become a regulator of anything, or end up in a position like that, I know whom I'm going to call for advice. But hopefully I'm never in that position, because I don't think anybody wants me to be there. In the meantime, I'll thank you for your time. And I'm actually going to go make a follow-up podcast on this topic with some other guys you know - Espen from Visma.

Nick Murison:

Yeah, that's right. Yeah.

Robby Peralta:

Yeah, absolutely. So I'm going to take this podcast we're making, listen to it, post it, and then he's going to make one, and then maybe I'll invite you both on afterwards, and you guys can just have this software security podcast going forward.

Nick Murison:

That'd be fun.

Robby Peralta:

That'd be great. Nick, thank you for your time, and we will speak soon.

Nick Murison:

Thank you very much for having me.

Robby Peralta:

Well, that's all for today, folks. Thank you for tuning in to the mnemonic security podcast. If you have any concepts or ideas that you would like us to discuss on future episodes, please feel free to send us a mail to podcast@mnemonic.no. Thank you for listening, and we'll see you next time.