Tash Norris is Head of Cyber Security for Moonpig, a review board member for DevSecCon, and a team member of BSides Manchester.
This episode was originally streamed on Thu, 17-June-2021 to multiple platforms. You can watch the streams (along with the comments) on-demand on:
Rik Ferguson: [00:00:00] I never know if I should wait for the music to finish or not. It's always a tough one to call. I think I waited long enough this time. We're back. We're back with another episode, and not the final episode, but yes, another episode of Let's Talk Security. I have a fantastic guest for you today. People use the expression, you know, very easily, it slips off the tongue very easily, and you've probably heard it a lot. People talk about, you know, she's the Tash Norris of, uh, of show jumping. She's the Tash Norris of international cuisine. You've probably heard people use that expression a lot. Well, today we are actually talking to the Tash Norris of cybersecurity. She is the head of cybersecurity at Moonpig. She is involved on the boards of various different cons, like DevSecCon, like BSides Manchester. She has had a significant and productive career in cybersecurity so far, and there are many, many successful years ahead of her as well. Please welcome, ladies and gentlemen, the Tash Norris of cybersecurity. Hello. Nice to see you. Thank you for joining us.
Tash Norris: [00:01:16] Thank you for having me. I'm very excited, very excited.
Rik Ferguson: [00:01:20] You are not just very excited. You are amazing. You are making the effort to join us in the middle of your holiday, right?
Tash Norris: [00:01:26] I am. Yeah. So it's probably about time I had a break from visiting various vineyards.
Rik Ferguson: [00:01:33] It was really good OPSEC because we're not actually even seeing the inside of your home. I think you like, you got a holiday accommodation just for OPSEC purposes for this show. Very good by the way.
Tash Norris: [00:01:43] Yeah. Yeah, absolutely. Yeah. Taking a break from some long walks and vineyards. Chucked the rest of the gang and dropped them in the middle of the Sussex countryside and yeah. Happy to be speaking to you.
Rik Ferguson: [00:01:55] OK. Well, that's another thing that I just thought of when we were talking before the show started: this is the first all-English-accent episode ever of Let's Talk Security. So that makes me very happy. So first of all, let's talk a little bit directly about you, rather than specifically about security, and a little bit about where you work, because I think it helps give us some context for the discussions that we're going to have that are much more focused on security. So, first of all, Moonpig. Moonpig's been around for a while, and I suppose, you know, it's an international concern, right? It's not a UK-focused organisation. But I suppose there's a perception, well, for me anyway, that it's a greeting card company. That's not right, is it? What is it really?
Tash Norris: [00:02:42] So, what I would call your ultimate gifting companion. I've got the line down. Yeah. So, so much more than cards now. Flowers, gifts, goods. We've done everything from the letterbox packages, where you can get your pud and your wine through, all the way to the more comprehensive gift sets: hampers full of beer and snacks, which is wonderful.
Rik Ferguson: [00:03:08] But it's not a physical storefront operation. Never has been?
Tash Norris: [00:03:12] No, yeah. Purely digital, all digital. So we've got a US, Australia and then UK digital shopfront.
Rik Ferguson: [00:03:20] So you're one of those organisations, and I don't know if this is a Trend Micro expression or a wider industry expression, I've been inside the Trend bubble for so long it's difficult to know where something came from, but we would call an organisation like Moonpig born in the cloud. Is that fair?
Tash Norris: [00:03:36] Yeah, absolutely. We've never had a big physical data centre at all. So yeah, we are, uh, pretty, um, I say pretty good on that digital innovation side. We were early adopters of a lot of different types of technologies.
Rik Ferguson: [00:03:50] And how did you get to where you are now? I mean, head of cyber security isn't your first role at Moonpig? I think memory serves me correctly and it's also, obviously not your first role in information security. So how did you end up where you are?
Tash Norris: [00:04:05] Great question. So I started my career in solutions architecture, which is a bit of a, my friend calls it a catch-all phrase for pretty much anything, and in tech you can be responsible for all sorts of things. But I started on a graduate programme in financial services, in architecture. Did that for a few years. Loved it. I am what you would call a recovering diagram drawer, and I think the pandemic has really taken me out of my comfort zone. I'm very used to drawing diagrams, even still now, and there are so many wonderful virtual digital tools, but I still draw diagrams on pieces of paper and hold them up to the screen, for no reason other than it just makes me feel really good to draw diagrams again.
Rik Ferguson: [00:04:50] Same here, a whole history of episodes of Let's Talk Security that I just threw on the floor. Exactly the same. But then, I was a solutions architect as well. That was actually my job title when I joined Trend Micro, 14 years ago. So, yeah.
Tash Norris: [00:05:05] Yeah, I love it. And I became really interested in cloud security, cloud architecture in fact, to start off with, and I had a wonderful boss, called Yusov Flabins, who actually works for AWS now. And he spent a huge amount of time cultivating that interest. He's a great forward thinker who really embraced cloud and serverless, and where that was heading. And it was still very early on, especially for financial services. A bit scary, especially for those companies that had invested so much of their equity in physical data centres around the world. He spent a lot of time cultivating that passion and that interest in me, and ultimately, as great managers do, released me to go to the security team. I'm still grateful for that. Yeah, then I started my career on the security side: a bit of security architecture and a bit of, I guess, what you would call security consultancy to internal teams. So working directly with engineering teams, but this time, rather than as their architect, as their security person too. I guess what had historically existed in financial services was that person that says no, don't do those things. The architect in me, I think, approached it very much as: let's get to yes. Let's find out how. Let's find our path. And I think at that point in my career, having had the benefit of someone like Yusov, who really pushed me to invest in the cloud side, meant that even though I was early in my career, I knew much more about cloud security and cloud architecture than any of my peers. And ultimately that's what helped me to advance, I think, much faster. I had a number of different role types in financial services before eventually making that first hop, which is always so very scary, not just to another company, but to a different industry.
So I went from financial services to e-commerce, and from very, very large financial services to this tiny e-commerce company, Photobox. And gosh, that was a shock to the system.
Rik Ferguson: [00:07:09] So why was that? I mean, probably there were a million things, but what was the biggest shock? Was it culture shock? Was it technology shock? Was it opportunity shock? I dunno. How would you characterise that change?
Tash Norris: [00:07:22] I think it was a mix of everything. I had worked in a company that was well established, both from a kind of process and methodology perspective, all the way through to very definitive roles and responsibilities. And it took me a while to realise what I needed to do to progress the way I wanted to in my career. I'm very competitive, quite driven and quite passionate, and it made me realise, after a couple of conversations with some wonderful mentors, that I needed to make that jump to a different industry to really enhance my skillset and to be able to do more. And that's where the shock to the system came: your roles and responsibilities were no longer so narrow, you had the ability to go much wider, to, I guess, dip your toes into different types of activities. And again, for me, that's where I think I was able to shine a little bit more, by trialling different things and learning what works.
Rik Ferguson: [00:08:21] You must have brought a lot of transferable skills though from the financial sector, because it's e-commerce. Because e-commerce is the heart of what you do at Moonpig. You must have brought a lot of regulatory knowledge and transferable skills, from financial services, right?
Tash Norris: [00:08:34] Yeah, absolutely. Photobox and Moonpig used to be kind of one big company. They split in half and I went to the pink side, you might call it. But yeah, absolutely. I think the bit that I love about e-commerce groups is that there is the ability to develop and push to production fast, and there's space to innovate quickly. And so there's a real drive to make sure that the security controls you have are automated, that they're based on the ability to support and encourage innovation. And I think the bit from financial services that I brought, that perhaps Photobox especially, and Moonpig, were unfamiliar with, was some forms of what you might call security governance. And so that's something that we were able to bring across, which is just an interesting journey in itself.
Rik Ferguson: [00:09:31] So, I mean, you've already mentioned a whole load of interesting areas just in passing, so let's pick up on some of those, just because that's where the conversation is right now. You know, we spoke earlier on about Moonpig being a born-in-the-cloud operation, and that you've never actually had a physical data centre presence that you own. Obviously you're present in a physical data centre somewhere, but not one of your own. And you mentioned cloud security and you mentioned automation, and I know that you're a review board member for DevSecCon, so obviously DevOps and DevSecOps, although actually there are several people who really despise the term DevSecOps, and I wonder if you're one of them or not, are deservedly big topics of conversation in the security world right now. How have the theory and the practice around embedding security in a DevOps environment changed over the course of your career?
Tash Norris: [00:10:33] I think the biggest one for me is that security teams have started to twig that they should hire and be engineers. I think, personally, especially in an e-commerce world, that we need to utilise and adopt the same ways of development as our engineers. We need to be familiar with them; that helps us to be way more empathetic to the way that they move. For me, that's the biggest change. I'm no longer, I guess, researching best practice from OWASP or NCSC or NIST and then asking engineers to work out how they implement it. My team are actively working with engineers, are part of those engineering teams, whether permanently or temporarily, to be able to embed those controls. And I think that level of involvement and knowledge and empathy, to really, truly know the pipeline that your engineers are working with, really helps. I think, historically, security teams have gone: we should introduce something like SAST to the pipeline, and, I know we might get to AppSec later, SAST should break the build if it finds anything wrong. For me, that's the security team that maybe just doesn't truly understand the pipeline. One, because breaking the build is irritating, and most SAST tools will find something that's wrong with your code, whether that's actually a problem or not. But two, depending on how big that repository is, that scan could take a really long time. And if what they're trying to deploy is a bug fix, or a task, or even just a small change, having to wait maybe 40 or 50 minutes for your pipeline to complete is an absolute nightmare. And so having teams that understand that and are familiar, that can perhaps scan on PRs instead, becomes really interesting. And for me, that's where the real difference came in.
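As a rough sketch of the idea Tash describes, scanning only what a pull request changed, and only blocking the merge on serious findings, might look something like this. Everything here is hypothetical (the file extensions, the severity labels, the function names); it is an illustration of the approach, not Moonpig's actual pipeline.

```python
# Sketch: scope a SAST run to the files a PR touched, and only fail the
# check on serious findings, instead of re-scanning the whole repository
# and breaking the build on every noisy result.

SCANNABLE_EXTENSIONS = {".py", ".js", ".ts", ".tf"}  # illustrative list


def files_to_scan(changed_files):
    """Return only the changed files a SAST tool should look at."""
    return [
        path for path in changed_files
        if any(path.endswith(ext) for ext in SCANNABLE_EXTENSIONS)
    ]


def should_block_merge(findings, fail_severities=("CRITICAL", "HIGH")):
    """Block the PR only on serious findings; everything else can be
    surfaced as review comments rather than a broken build."""
    return any(f["severity"] in fail_severities for f in findings)
```

The point of the sketch is the scoping decision: a diff-sized scan keeps the feedback loop in minutes rather than the 40-to-50-minute full-repository runs Tash mentions.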
Rik Ferguson: [00:12:29] If you're talking about, for example, things that break the build, the last thing that developers want is someone getting in the way of their highly agile CI/CD, DevOps methodology and pipeline. So how does security work in that highly agile world, and even more so in a serverless environment? Because at that point, obviously, you've got to rely on having security embedded within the code, because there's very little else that you have any control over. So in your world, how does security work in those environments?
Tash Norris: [00:13:07] So we do have, I guess, a number of different touch points that I think anyone can use when they are working with that type of architecture and that type of estate. There's, and I feel like I might say this a lot, there's a wonderful diagram that, actually, John Haws from Facebook, who I used to work with quite a long time ago, started, and then Sonia Morissette, an amazing AppSec engineer, took to the finish, and then we've kind of evolved it over at Moonpig. It covers, I guess, that whole software development life cycle and the different touch points throughout. And what we've started to do is look at that evolution of security at Moonpig: where are the most important places for us to have a touch point within the life cycle, and then where do we automate? Where can we automate first, and where do we need greater maturity before we have something fully embedded? That then helps us to cover everything from threat modelling and SAST, DAST, and pen testing, all the way through. Now, that's very AppSec-heavy. The other piece that helps us to involve that security aspect is that we deploy infrastructure, as many companies do, via Terraform. And so there are some wonderful open source tools, tfsec is one of them, that allow us to have a similar level of security controls over that Terraform, your infrastructure-as-code piece, which is wonderful for me because it means that we can start to automate that build as well. Having come from a kind of traditional architecture role, where we would perhaps do the threat model off our diagrams, then we would get four or five different teams to build it, because you'd have a database team, your server admins, a VM team, and then someone else would deploy your software code on top of that, what we have now is one team doing your infrastructure and your application code.
And that allows us to scan that pipeline. And then what we've been able to do, looking at all of those different stages on my diagram, because I love it, is: now we've got scanning of our infrastructure code and scanning of our application code, but then we've got those human-element touch points as well. So we've got threat modelling at the beginning, we've got pen testing where appropriate, and then we start to look at some of the more interesting and more fun tools: security tests that we might write off the back of a threat model, DAST tools, as well as scanning of our environment when it's already in production.
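The tfsec step described above could be wired into a pipeline with a thin wrapper like the following. This is a sketch under stated assumptions: it assumes tfsec's JSON output is shaped as `{"results": [...]}` with `rule_id` and `severity` fields, which matches the tool's documented format at the time of writing but should be verified against the version in use; the function names are made up for illustration.

```python
# Sketch: run tfsec over a Terraform directory and pull out the findings
# worth acting on, so the result can gate a pipeline stage.
import json
import subprocess


def summarise_findings(report, min_severity="HIGH"):
    """From a tfsec-style JSON report ({"results": [...]}), return the
    rule IDs of findings at or above a severity threshold."""
    order = {"LOW": 0, "MEDIUM": 1, "HIGH": 2, "CRITICAL": 3}
    threshold = order[min_severity]
    return [
        result["rule_id"]
        for result in report.get("results") or []  # tfsec emits null when clean
        if order.get(result.get("severity", "LOW"), 0) >= threshold
    ]


def scan_terraform(path="."):
    """Invoke tfsec on a directory of Terraform and parse its JSON output.
    Assumes the tfsec binary is on PATH."""
    proc = subprocess.run(
        ["tfsec", path, "--format", "json"],
        capture_output=True, text=True,
    )
    return json.loads(proc.stdout)
```

In a pipeline, `summarise_findings(scan_terraform("infra/"))` returning a non-empty list would be the signal to fail the stage, mirroring the "scan the infrastructure code like the application code" idea.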
Rik Ferguson: [00:15:34] And, you know, I'm really interested, obviously, in that kind of agile environment you are in. How often are you pushing new features, new functions, new builds? How quickly do you iterate?
Tash Norris: [00:15:48] Gosh, um, anywhere between 10 and 20 times a day.
Rik Ferguson: [00:15:53] So how does any organisation, and I'm not asking how you do things, I'm asking for your opinion on how things should be done, which I suppose is possibly a slightly different question, how does an organisation secure everything from the initial development of code, to pulling in libraries from various different sources, both public and private, to making sure that you're not embedding things in something that's being pushed to production that shouldn't be embedded in there, to looking for misconfigurations, critically, I think that's probably the biggest area of concern from a security perspective, to finding vulnerabilities? How do you inject all of that automation into the process? Do you have an overarching vision of everything, where you're able to sit back and say, here is my God mode? Or do you piece together a lot of best-of-breed things, where you get really effective this, and then you have really effective that? How do you work? How should one work?
Tash Norris: [00:16:53] So I think, for me, if I take a step back, the first thing to understand is: what am I trying to protect? What's the value of the thing I'm trying to protect? It's so easy as a security person to dive straight into that people, processes, technology and start to list off all of the various things that we should do, from security champions, through threat modelling, to pen testing every deployment, and having someone manually review all PRs before they can be pushed to production. But actually it's about balancing that risk and the value of the releases that you have, as well as the resources you have available and the maturity of your environment. So you get an idea of what it is you're trying to protect, the resources you've got available to you, and, I guess, that risk/value balance that you're trying to achieve, because ultimately, whoever you are, you're employed by a business that needs to produce a product. I think once you've got a lens on that, it allows you to build a roadmap that can help you identify those key components you want to get first. And I'd always vote for and advocate for automation first where possible, because that releases you to go on to the next thing, whatever that thing might be. But I also appreciate that, depending on the resources available to you, that's not always possible. And so you identify those key components in your development life cycle. And for me, that's from ideation. So not design, but from ideation, when your product team or your chief product officer, or someone from marketing or HR or wherever, goes: I want a recruitment website, or, I want to be able to make recommendations to someone based on something they buy. All the way from the ideation stage through to kind of post-production, when something's been up on your website for 10, 15, 20 years.
That's where I think some companies go wrong: they think, oh, that's old, it's been there for ages, let's just not touch it and pretend it doesn't exist any more.
Rik Ferguson: [00:18:49] It'll be fine. Walk away. Look that way.
Tash Norris: [00:18:52] Exactly. Just leave it. It will be fine. And I think, once you have a handle on your important touch points, where things have gone wrong before, that allows you to prioritise. For me, the bit that I historically have not always been part of, and have realised now is so, so valuable, I think, for anyone trying to secure their business, is that ideation stage. Because what that allows me to do, and I'm sure many people love doing, is get a bit of a head start on that R&D side. So when someone says they have this crazy idea, for me a great security team is one that goes: OK, how do we make that happen? What cool stuff can we do? How would we automate? So that when you get through to the design and the development stage, you're a bit ahead of the curve. You know what the common threats are for, I don't know, containers or AI engines, whatever it might be. And so you're able to facilitate and support your teams in their innovation, rather than being like, whoa, hold back, we need to do some research before you can do it.
Rik Ferguson: [00:19:57] You haven't done anything on this yet? Just wait for us to catch up with you. Which I suppose is where security operated, right? Certainly in my experience, working at a big systems integrator that doesn't exist any more, before Trend, there were very much silos. There was the desktop silo, there was the server silo, there was the network infrastructure silo, and security and privacy actually was the last silo that was added to the stack, if you know what I mean. Things were built and designed from the cables to the GUI, and then handed off to security: there, right, now you people make that secure. So you're talking, basically, about breaking down those silos and injecting security into every stage of a build, whatever the build might be, right?
Tash Norris: [00:20:38] Yeah, I think, and this will forever be a problem and will forever keep us in jobs, there's always going to be something that adds friction, and that friction is going to cause someone to take another route. I think everyone's seen those pictures online of big gates in the middle of open fields with no fence, and so people just drive around the gate, right? And so many people actually did that to the security world, and it's so true. I've learnt now that it's much more effective, and it's easier for me, it's easier for CTOs everywhere, for marketing teams everywhere, if you are able to be part of those initial conversations, those ideation phases, so that you can get the head start that you need to do the right thing. As well as set expectations, because there may be times where you do need to say: actually, if you want to do that thing, sounds great, but I need this investment. And I think it's important to have data points to be able to support that conversation, because there will be times, there are always times, where security teams feel like they need to say no. And for me, I think what I've learnt, and what's been important for anyone building that kind of security programme, is that, first of all, you can have those foot stomps if you need to have them, that's what I've called them before. You know, what am I really going to be throwing my toys out of the pram for this week? What am I really going to be pushing for? And actually, is it worth it? Taking that step back. But equally, if I do come across something, certainly not weekly, hopefully maybe once or twice a year, that I do genuinely feel passionate about, then rather than a "no, you can't do that", it becomes a case of: actually, for us to facilitate that, this is the budget that I need, or these are the resources that I need, to help you get to yes.
And I think reframing that conversation has, certainly in my career so far, allowed me to work much more closely with engineering teams, and marketing teams, and product teams, rather than against them; rather than be that gate, rather than create that world of shadow tech or shadow marketing websites.
Rik Ferguson: [00:22:29] Engineering, I think, has always been a huge friction point, or a huge area of potential conflict and braking activity. And if you're in the world of development, particularly, you know, cloud-based development, modern development, any kind of deceleration is going to be a large source of friction, right? And that's what security traditionally has been. Another area that I know floats your boat, and that you've actually mentioned a couple of times already, is threat modelling. So take us a little bit through how threat modelling works for you in a born-in-the-cloud business. Cause I imagine if someone is in a more traditional area, they may have a very different perception of how threat modelling works and what the outcomes of it might be than you do. But I also imagine that the world you operate in is one that everybody will be operating in within probably the next five, definitely the next ten years. So how does threat modelling work in that highly agile environment? Do you have constants, or is the only constant change?
Tash Norris: [00:23:46] I think I'm going to bring you on a bit of a threat modelling journey before I get there, cause I think it's a fun way to explain why I get so excited about this and why I've got so many perspectives. So I first learnt about threat modelling way back, early on in my career in financial services, when someone gave me Adam Shostack's book on threat modelling, and I devoured that book. It was one of the first books I'd call a security book that I was given, although I'd argue one of my first few books was actually a book called Continuous Architecture, which doesn't mention security at all, but is a wonderful book if you're an architect. And I devoured it. I think it gave me a really interesting perspective as someone who was early on in their security career and felt like it was difficult to add value. You know, when you're still learning, there are so many people that know way more than you. Learning to ask the right questions of people in the room who are hugely knowledgeable about the things that they're building helps to pull out security threats that they would never have thought of by themselves, and didn't necessarily consider security threats, just things that impacted the availability of their platform, or the integrity of their platform, or the confidentiality of it. And I think that then gave me a level of confidence that I hadn't had yet in my career, in that security world. And so Adam Shostack's book became a little bit like a Bible for me. And then I actually met him at a conference. It was called the Open Security Summit and it was a very interactive conference: the idea was that you would listen to a talk and maybe have some round tables, you would discuss some ideas. And I mentioned a way that I use threat modelling with teams that are maybe a bit unsure about security, or when it's a technology that I'm not familiar with.
And Adam, very wonderful, very friendly, very down to earth considering the impact that he's had in our world, went, oh, that's a really, that's a really good point, Tash, and I went (squeal). For me, not only was he, and I hate this term and I know many people do, a bit of an InfoSec rockstar, and he certainly might not be the traditional rockstar type, but he was also someone that unlocked a lot of confidence in me. And so he was a mentor without even realising he was a mentor. I actually spoke to him after and, very nerdily, asked him to sign the copy of the threat modelling book I carry everywhere. And he's actually remained an incredibly useful person to touch base with, and I've been lucky enough to be on a couple of panels with him. But, I guess, where I'm getting at here is that it allowed me to ask the right questions that helped me develop in my career.
And so it gave me more and more confidence to realise the impact that we can have if you ask the right questions, even if you're not familiar with a technology stack. And especially in a world of cloud, of serverless, of container-based technology, it can be difficult for people to keep up to date with all of the various threats and issues. But threat modelling isn't just about those niche security problems that you might find on various Twitter threads or Telegram channels. There are generally very traditional bugs in the way people build with these things, whether it's around authentication, which is probably one of the most common, or encryption. And so, for me, that really evened the playing field as I came through my career. I don't think I do it any differently for these new types of technologies, other than, and a guy called Avi Douglen helped me to kind of do this, if teams are struggling, which many teams do the first time they do threat modelling, I ask them to think about what would remove the value of what you're trying to do. So rather than throw confidentiality, integrity, availability, or any other type of security architecture phrase at them, I've tried to take a step back from that. And I think Avi does this really well: just get teams to think about what the value is of the thing you're trying to do. And this is where product owners have become really powerful for me, and really important, and scrum masters too, or agile coaches, whoever you've got being a part of that team; they'll be really close to the value that they're trying to achieve. And so you ask them the threat modelling questions as: rather than "what can go wrong?", "what removes the value of this product? What would make it terrible for customers?" And they might say: if they couldn't place an order. OK, cool. What are the different things that could go wrong that would stop you from placing an order?
And for your engineering teams, that becomes a more useful language. That's when you can really get into the threats that you might have struggled to get to if you'd just said: OK, everyone, tell me what could go wrong with this architecture.
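The "what removes the value?" framing Tash describes can be captured as a tiny structure: start from the product owner's value statement, collect failure modes in their own words, and only then walk each one back to a security property. This is a minimal sketch; the category mapping and every name in it are illustrative, not a prescribed method.

```python
# Sketch: value-driven threat elicitation. Plain-language failure modes
# (as a product owner would phrase them) are mapped back to the security
# property they threaten. The mapping below is purely illustrative.

SECURITY_PROPERTIES = {
    "can't be reached": "availability",
    "shows wrong data": "integrity",
    "leaks customer data": "confidentiality",
    "lets anyone act as someone else": "authentication",
}


def threats_from_value(value_statement, failure_modes):
    """Pair each failure mode with the property it threatens, keeping
    the product owner's own wording alongside the security vocabulary."""
    return [
        {
            "value": value_statement,
            "failure": mode,
            "property": SECURITY_PROPERTIES.get(mode, "unclassified"),
        }
        for mode in failure_modes
    ]
```

The "unclassified" fallback is deliberate: in a workshop, a failure mode that doesn't map cleanly to a known property is often the most interesting thread to pull on.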
Rik Ferguson: [00:28:23] So for you, it's very much a people in the room, around the table, conversational exercise to begin with before you formalise the whole thing. But yeah, it's about getting people to verbally express, like a brainstorm.
Tash Norris: [00:28:35] Yeah, absolutely. And I think approaching it a bit more informally, almost as a chat first, really helps. Then you can get into that diagramming piece, where you step through the different stages of your architecture, whether it's a data flow diagram or a pure architecture diagram, which you can still do with your serverless stack; I just talk about services rather than specific Lambda functions, although you can go there as well, if you feel like you need to do an architecture- or infrastructure-based threat model, which is definitely my love and my comfort zone. Maybe talk about services; that's where it gets a little bit more interesting. That's when you step through your architecture, but for me, it's having those informal chats first that is really powerful. The other way I've done it, especially more in the virtual world, is I've taken just the product owner and just the lead engineer aside, explained the value of what I'm trying to do, how it would work, and got them to almost have a little go at a threat model, one-on-one, as coaching. And then I've gone into the group room with two, what I would call, champions or advocates in the room, who have then been able to help that process go along nicely.
Rik Ferguson: [00:29:42] So at Moonpig you, I dunno, I was going to say you walked into a really fortunate situation, but maybe you don't see it that way, so I'm quite happy to be totally contradicted. You had the opportunity to build up the team that you now have effectively from scratch. So I suppose, firstly, was that a really fortunate situation to be in? And secondly, how did you approach building an entire function that didn't exist before?
Tash Norris: [00:30:10] Yeah, I am incredibly lucky to say that I've been able to pick every team member that I have, and I'm not only lucky to have picked them, I'm lucky to have them. When Photobox and Moonpig split, there was one security team for the Photobox group, and there were a number of choices to be made where people could choose which side they went to. Photobox was the comfortable option; that's where we'd all been sitting, that's the architecture we knew. I had to keep challenging myself to make sure I didn't sit in that comfort, and it's certainly something I would ask the audience to take away: don't sit in that comfort space for too long, you don't grow a whole lot there. So I went over to the pink side, as I've called it, the Moonpig side, and I was the first security hire. I'd had quite the journey: lead engineer at Photobox, then over to Moonpig as head of product security. And a slight aside here that's really important for me: we've stopped calling it AppSec, because in our world of cloud, and the way we engineer now with full stack engineering, there isn't really an AppSec and a CloudSec. For me, there's just one product, one stream, and if you try to separate them out and have two different teams, that becomes way too many touch points with engineers, potentially conflicting processes, and some challenges. So I was head of product security, although, again, I was the only person in the whole security team.
Rik Ferguson: [00:31:40] Oh, did we lose you? Oh, no, you're fine.
Tash Norris: [00:31:44] Sorry, I'm so sorry. So that head of product security role, is that where we got to? Yes, perfect. It was really important for me to be able to encapsulate the things that we do: our products, our webinars. So I joined as head of product security, and we had to hire a CISO; that was the first thing we did. I worked with the CTO and said, please let me interview the CISO candidates, because that's where I want to go, that's the role I want to have. So I didn't want to interview to be the CISO; I asked to interview the CISOs that would be coming in, so I could learn how they pitched themselves, the types of things they were saying, and what was important to the people I was interviewing alongside. Originally I was going to ask just to watch the interviews, and then I thought, no, go for gold, ask to be part of the interviewing panel.
Rik Ferguson: [00:32:36] That takes some courage, right? I mean, I'm interrupting your story, but I'm impressed by it; that's why I'm interrupting. It takes courage even to ask that question, I guess, but maybe the environment is one where you were made to feel comfortable asking that kind of thing.
Tash Norris: [00:32:52] Yeah, absolutely, very much. And the CTO was super receptive. I think he was keen for someone else to ask some of those more techie security questions. Not that you necessarily expect the CISO to be super hands-on, but because we're so small, I think for us that was the right thing. So we interviewed a number of really good candidates, some of whom I've kept in touch with to this day, because I felt they were wonderful people on a similar journey to me. It came down to some final candidates, and our CTO and our head of people, a wonderful woman called Sasha, went and had a conversation, and they came back to me with: we've got some wonderful candidates in that final stage, but we also think that you're comparable. And actually, if we take a step back and think about your career journey and where you want to go... I wasn't there yet, and I recognised in myself that I wasn't there yet. However, they were willing to take that risk. What they had recognised, which I hadn't necessarily seen, and which I'm conscious of making sure I see in my own team now, is: how do you make sure there's space for that person to grow and to move? Their concern was that if they brought someone in while I was already doing well, would I just move on? So they took a punt. I think it was last February they said they would take the punt, let me have free rein for a bit and see how it goes. And yeah, it stuck. I'm still going, and I've built my team out, which I've incredibly enjoyed doing.
Rik Ferguson: [00:34:29] Like you said, you're very lucky to have been in the place where you could choose them all, and very lucky to have all the ones that you chose as well, which is super cool. There's a question that came in on LinkedIn from a former colleague of mine, Mark. Hi Mark, thank you for joining us. His question was obviously very specific to the point in the conversation when he asked it, so I'm going to guess at which particular model he's asking about, and I know Mark will jump in if I've got this wrong. He wants to know, did you get a lot of pushback on that, or did you shape the org around that model? I think the model he means is keeping AppSec from being separate. So, did you get pushback, or did you build the organisation around that model?
Tash Norris: [00:35:16] That's certainly what I took from the question. No, no pushback. Our engineers certainly build and operate in a way where we don't have back-end and then front-end, so for them, having cloud or infrastructure security and then application security was weird, because that's not how they worked. For me, moving the team to match that made sense. It's important to note that isn't the only function of my team: I've got security operations and incident response, I've got product security, and then I've got technology risk and compliance, so I've got those three pillars. The business was incredibly supportive, and I think it was important for me, when I came in and had that space to build a brand new function, to take a step back and make sure we built ourselves in a way that aligned with the way the business worked. That certainly helped us get the most done quickest.
Rik Ferguson: [00:36:05] Fantastic. I've just looked at how much time we've been talking already, and I'm kind of disappointed, because honestly, this time is flying by, and I think we've only touched on three of the things I had written down that I wanted to talk to you about. So I'm going to shift gears and move on. You just mentioned security operations and incident response, which gives me an opportunity for a now very obvious segue into current threats. I know we spoke about cloud, and how misconfiguration is one of the biggest current threats, and certainly the one that attackers are leveraging more than anything else, certainly more than vulnerabilities, for example. Leaving that aside, the one big thing, it's not even an elephant in the room, it's basically the room: the one big thing that everyone is talking about from a security operations and incident response perspective today is ransomware. Is that something that's impacted your operations to date, or is it a lingering concern where you're like, OK, at some point I'm going to have to deal with it?
Tash Norris: [00:37:07] So, not impacted to date, which is a nice thing to be able to say, but I'm sure it's a matter of time for all of us, so I would always advocate speaking kindly of those that have been impacted. It's too easy, isn't it, to get down on companies when they've been called out. I have some wonderful colleagues in the industry, and I do know someone that has been impacted; I think some people may have seen me tweet about this recently. The bit that was really interesting for me: everyone has playbooks, right? Hopefully you're on your way to building your playbooks out. So you'll have a ransomware playbook, and that will talk about what you will do in the event of ransomware. You have your insurance details, hopefully, if you've got insurance, your crisis management details for your business, and then everything through to who the right suppliers are to liaise with depending on what you need to do. The bit that not a lot of people have: I was chatting recently to someone who works in a PR company, who'd done PR work for some pretty big brands, and they talked about a ransomware policy. It took me a while to click what she was saying, but what she was talking about was: do you pay or do you not pay, and the importance of having that conversation in an unemotional and objective setting. So having that conversation to know where your heads are at, both at a security team level and at a board or C level, in a safe environment, well before, hopefully, any type of event happens. I thought that was interesting on a number of points. One is that you all might differ quite wildly. But it's also about making sure that your team members who might be a bit more impulsive are clear on things like sanctions and the impact of paying a sanctioned entity, the impact on your insurance, and actually the stance of your insurer.
Something I've learnt recently is that some insurers will actually always advocate for paying, which is the opposite of what I initially thought.
Rik Ferguson: [00:39:07] Yeah, and I understand that's sometimes down to the fact that it can be cheaper than not paying and trying to remediate, right? The cost of the remediation could end up being a bigger payout for the insurance company than the cost of the ransom. And I guess what the threat actor has to continually try to do is strike the balance for the insurance company, because we're moving more and more to a situation where they are typically the people paying. The threat actor has to find that sweet spot where, from a financial, and only a financial, perspective, maybe it does make more sense to pay.
Tash Norris: [00:39:41] Yeah, and what's interesting there for me, there are two parts. One is you've got the Colonial Pipeline situation, where you pay, but then the decryptor takes so long that you end up having to restore from your own backups anyway, all the way through to: if you pay, what are the PR and ethical implications? A lot of companies now have environmental and social governance style mandates, so payment of a ransom might well go up against a lot of the public commitments you've made. So in and amongst all of those pieces: one, would paying get us up and running faster; two, what are the PR and ethical implications; all the way through to, if I'm in that highly emotional, I imagine highly fraught, highly tense state, is it easier just to pay and be done? And that 'pay and be done', for some of these ransomware groups, isn't just getting your system up and running; it's the confirmation, if you will, that they're still not going to release that data publicly. Not that your regulator would necessarily agree. So for me, the idea of a ransomware policy posed by this PR contact, having those conversations to really understand where you stand personally, where your board stands, where your policies and your insurers stand, when you're not in that situation, was a really interesting point and one I hadn't previously thought about.
Rik Ferguson: [00:41:01] And you get to apply all of your threat modelling knowledge to that as well, if that's something you're going to do within your business, right? Again, it's about getting the right people around the table and saying, OK, this is our standpoint, this is our stance: we do or we don't pay. But then you get to say, OK, so what is the potential impact of the stance we've adopted? What does that mean for the business? And then the gap analysis that says, OK, so what do we have to change as a result of agreeing that that's what we do?
Tash Norris: [00:41:30] Yeah, which is really interesting, because you get into this world, which I love, of threat modelling an idea, or a policy, or a process, rather than just threat modelling a product. And you're absolutely right, you step through what could go wrong. Would we end up paying someone on the sanctions list? Would you even have been able to attribute the attack to a specific attacking group or region so early on? What does this do to my insurance premiums? All of those types of things.
Rik Ferguson: [00:41:57] Yeah. So what's your stance, and feel at liberty to tell me to shut up, because I'm used to it, what is your stance on banning or not banning the payment of ransoms for ransomware, for digital attacks? There are conversations at national and international level: should the practice of paying a ransom be made illegal? Should it be criminalised?
Tash Norris: [00:42:24] I'd love to say yes. However, the sceptic in me doesn't think it would make the attack type go away, and actually, I think it would put the focus on, or criminalise, the wrong people. So I'd love to say yes; I think it's important to drive the right behaviours, but I think we're probably not ready to criminalise at this stage, both as an industry from a maturity perspective, and because I honestly don't think our attackers would be put off by it. I do think it's important to think about what other things we can do to dissuade companies from paying. What support can we give? What advice can there be? I think The Cyber Helpline, for example, does some great work, for free, supporting people who are caught up in this. But yeah, I'd love to say yes, and I know plenty of people out there will say, make it a criminal act, we should never pay. Honestly, though, I feel like that question should be answered by someone who's been in that situation, who's been in the heat of it, who has really experienced it.
Rik Ferguson: [00:43:33] And what if it's the only option, right? If you're a medical institution, for example, and the choice is pay the ransom or potentially have people die, it's not much of a calculation, is it? And if you make it criminal anyway... that's a big moral conversation. On the topic of ransomware, we have another question coming in on LinkedIn, which is quite specific, but I'm interested in your view on it. You may be thinking about your own environment, because that's the one you're totally familiar with. How can we, or how would you hope to be able to, detect a ransomware attack before it gets executed? And by 'before it gets executed', I guess the question means before encryption happens.
Tash Norris: [00:44:21] So, I heard an interesting stat recently. Don't quote me on this, because I'm sure a hundred percent of all statistics are wrong, but it was something like: an attacker sits in your environment around 60 to 70 days before executing their payload, whatever that payload may be, in this case ransomware. There are a number of reasons for that. They might want to get a handle on the size and the value of the data you have, so that they can price the ransom and the decryption process accordingly. It may be that they're trying to establish other avenues in and out of your business; it's important to recognise that if you pay the ransom, that doesn't necessarily mean they're going to go away and not re-ransom you further down the line. But if you think about someone sitting in your network for a period of time before they execute, then for me EDR technology becomes really important, along with good network monitoring and intrusion detection and intrusion prevention tools. And before you even get to all of that, logging and monitoring is really important and not to be underestimated: really baseline your traffic, understand what your network traffic and your application traffic should look like, alongside that EDR space. Then we might get into the machine learning piece, where you start to recognise attacker TTPs, tactics, techniques and procedures, and the characteristics of some of your attackers, which might help you to identify and isolate or quarantine that traffic.
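The "baseline your traffic" idea Tash mentions can be illustrated with a minimal sketch: learn each host's typical outbound volume, then flag hosts that deviate far from their own baseline. The host names, numbers, and the simple z-score threshold are all illustrative assumptions, not a production detection rule.

```python
# Hedged sketch: per-host traffic baselining with a z-score anomaly flag.
# All data and thresholds are invented for illustration.

from statistics import mean, stdev

def build_baseline(history):
    """history: {host: [daily outbound bytes]} -> {host: (mean, stdev)}."""
    return {host: (mean(vals), stdev(vals)) for host, vals in history.items()}

def flag_anomalies(baseline, today, z_threshold=3.0):
    """Return hosts whose traffic today sits far above their own baseline."""
    flagged = []
    for host, volume in today.items():
        mu, sigma = baseline[host]
        if sigma > 0 and (volume - mu) / sigma > z_threshold:
            flagged.append(host)
    return flagged

history = {
    "build-server": [900, 1100, 1000, 950, 1050],
    "laptop-17":    [200, 220, 180, 210, 190],
}
baseline = build_baseline(history)

# laptop-17 suddenly sending far more data than its baseline suggests
print(flag_anomalies(baseline, {"build-server": 1020, "laptop-17": 5000}))
# → ['laptop-17']
```

A real deployment would baseline far richer features (destinations, ports, process lineage), which is where the EDR and machine learning pieces she mentions come in.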
Rik Ferguson: [00:45:46] Yeah, I agree fully with everything you just said, and that's really the reason, in my mind, for the rise of the XDR market segment, for want of a better term, right? Because it's taking all of those different things you just spoke about, logging and monitoring, network traffic, and the endpoint with EDR, and bringing them all under that umbrella of extended detection and response. Being able to say, OK, I need to see the details of what's happening in my cloud environment, I need the details of what's happening on my endpoint, I need to work out how something initially came into the business, where it travelled to and through, what processes were hooked, what payloads were dropped. And hopefully, as you said, being able to get all of it before your 60 days are up, or whatever the number ends up being. I hope that answered your question; thank you for asking. I'm really conscious, I can't believe how quickly the time is flying, I'm really conscious we're getting close to the end, and there was another subject you brought up with me that I find really interesting, and I wanted to give you a chance to expand on it. When we were talking about the post-COVID workplace: what does the near future look like? I'm not talking about far-future prognostication. That's a good word, isn't it, prognostication. You told me you've been doing some psychological safety and mental health briefings and sessions with teams and for leaders, and that there was a specific connection between that and the post-COVID workplace. Do you want to talk a bit about that? Because I think it's really important.
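The correlation idea behind XDR that Rik describes, pulling endpoint, network, and cloud telemetry into one picture so you can trace how something came in and where it went, could be sketched roughly like this. The event data is invented for illustration; real XDR platforms do far richer correlation than a simple merge-and-sort.

```python
# Hedged sketch: merging telemetry from separate sensors into one
# per-host timeline, the basic move behind "extended" detection and
# response. Events are (timestamp, host, description) and are invented.

from itertools import chain

endpoint_events = [
    (3, "host-a", "process spawned: powershell.exe"),
    (7, "host-a", "file written: payload.dll"),
]
network_events = [
    (1, "host-a", "inbound email with macro attachment"),
    (5, "host-a", "outbound connection to unknown IP"),
]
cloud_events = [
    (9, "host-a", "object uploaded to external bucket"),
]

def timeline(host, *sources):
    """Merge events from all sources for one host, ordered by timestamp."""
    merged = sorted(e for e in chain(*sources) if e[1] == host)
    return [desc for _ts, _host, desc in merged]

for step in timeline("host-a", endpoint_events, network_events, cloud_events):
    print(step)
```

Run against the sample data, the merged timeline reads from initial entry (the email) through execution to exfiltration, which is exactly the "how it came in, where it travelled, what it dropped" narrative an analyst wants.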
Tash Norris: [00:47:28] Yeah, absolutely. So the interest in psychological safety for me started a long time ago. Anyone who was at, I think, one of the first Cyber House Party panels might have heard me talk about my box being full. A couple of years ago, I was in a role where I was doing that security architecture consultancy role with teams, and a few people left, and I felt like I was being given more and more responsibility, which was both great and a bit overwhelming. I got to a point where the only way I could articulate how I felt was: my box is full. Literally, you can't put anything else in my box, please leave me alone. And I remember just not having the words to articulate how I felt, not having the awareness and education to know a little bit more about mental health and mental fitness, I suppose. I went to my boss and said, my box is full, I'm really struggling. I feel like I'm thinking about my box all the time; I'm at home and I'm thinking about how full my box is. I've got to a point where I don't think I can even put non-work things in my box any more, and I can't take anything out. I don't feel like I've got anything to give, and I'm just not being very efficient or very good. And I remember my boss saying at the time, well, we've just got a lot to do, so you'll just have to get a bigger box. I don't think they were coming from a place of 'just get over it, Tash'; I think they were genuinely like, oh, we're all stressed. But that really sat with me, and clearly it's been a number of years since then. One of the things I've tried to be really conscientious of as I've built my team is making sure there's a space of safety where people can tell me if their boxes are full, but hopefully we can work on identifying the indicators that help us make sure they don't get to that point. And I still didn't have the language for it.
And I'll admit I hadn't necessarily done the research or educated myself as much as I should have. I was really grateful to be part of a training module run for some of the leaders here at Moonpig on psychological safety for our teams. I took some of the things we learnt there, took it a bit further, and did my own research, which I would very much encourage everyone to do. It brought me to, I guess, four areas of focus for my team. One is making sure it's inclusive. We all want inclusive spaces: we open the circle, we make space for people. In a post-COVID workplace, where you might have a hybrid model of remote and in the office, that means that when you're in person, remembering to still give that extra-long pause to allow people to contribute, remembering to open the circle in the way you sit, and, if you're in a meeting room with a video call running, not sitting facing the video with your backs to the room. So make sure your space is inclusive. The second area of psychological safety for me was making it safe for people to learn and to ask questions. In a post-COVID workplace, that means making space for people to still learn from each other. On our product security function especially, we encouraged pairing and pair programming, and we made sure we communicate openly: taking important conversations online or digitally, even if we're in the office, so that they're still accessible to other people in the team, and making sure that if we're teaching each other or showcasing, we're still doing it in a way that is digital-first or online-first, even if we could have the majority of people in person. The other areas of safety for me were about contribution and challenge. So again, making sure it's safe and easy for people to contribute post-remote.
And then challenge, most importantly: making sure there's space for someone to say, I disagree, I don't think we should do it that way. For me, that has been the most valuable way of building our team and our function.
Rik Ferguson: [00:51:15] I was spurred to think about the post-COVID workplace, and the comfort levels, or otherwise, of people returning to work, when I was having conversations with my kids about going back to school. I was relatively surprised at the level of fear they had about returning to somewhere that should be super familiar to them, where all their friends are. You would think they'd be really impatient to get back and see their friends, but there was a fear, certainly from my kids, that because they'd been remote, and because they'd had a lack of contact not only with their friends but with their teachers, they hadn't done everything they were supposed to do, that they'd forgotten more than they'd learnt, and that they would be immediately left behind on resuming their presence in the school building. And I suppose there's a huge parallel to that in the workplace as well, right? You're going to suddenly find yourself back in an environment that you should be comfortable and familiar with, but be really worried that maybe everyone else took a massive jump while they were working from home, and that for some reason you didn't. That's going to be another level of fear around returning, I suppose.
Tash Norris: [00:52:32] Yeah, absolutely. One thing we've talked about is even pandemic fears unrelated to the virus itself: actually, I haven't been around crowds for a while, and that makes me anxious. One of the things that's really important for me is not forcing people back to the office too quickly, or potentially even at all, and also being cognisant of timings, because people may have built really strong relationships or routines, where, say, they want to be able to do the school run. It's really important to allow that to still happen and be a part of their life, even though they might want to come into the office as well, so maybe that looks like flexible hours. But there's certainly a space, for me, for making sure that psychological safety around learning, contribution, challenge and inclusivity applies not just to cyber topics, but also to feeling safe to say: actually, I don't think I want to come into London in December, because it's crazy and it makes me nervous and anxious; or, I don't want to come in until everyone has had both doses of their vaccine; or, I really want to understand your workplace policies and processes, and I still want to wear a mask in the office. If you've got a cold or flu, I one hundred percent hope that you've now learnt to stay home when you're ill. But even if it's just a sniffle, hey, we've learnt the power of wearing masks; let's just start wearing masks if you feel a bit sick on public transport.
Rik Ferguson: [00:53:55] Trend Micro, I mean, we're headquartered in Japan, and we're a heavily Japan-influenced organisation culturally as well, and that's been a standard part of Asian culture in general, and Japanese culture specifically: if you are ill, you wear a mask. The first time I ever went to Japan, I asked a colleague, why are so many people wearing face masks? Because, let's not forget, it used to be really unfamiliar to Western Europeans, to Europeans in general and beyond, to see people wearing masks in public. Why are people wearing them? Are they afraid of getting sick or something? And they said, no, they're sick; that's why they're wearing a mask. I was like, oh, light bulb. That's a really clever and sensible thing to do; why have we never done that? I agree, and I sincerely hope it's a lesson that sticks. Listen, Tash, it's been an hour already. I usually end my interviews by asking, what's the greatest lesson you feel you've learnt yourself through this period of lockdown and COVID? Do you feel like you've already answered that?
Tash Norris: [00:55:02] Yeah, I guess the biggest one for me is making that space for psychological safety in your teams. So I come back to making space for inclusivity, and space to learn, to contribute and to challenge, and making sure it doesn't just apply to your InfoSec world, but to your personal lives and the way you interact in that return to work, in person, in the flesh.
Rik Ferguson: [00:55:26] It's been an absolute pleasure, Tash, no less than I had expected and hoped for when I invited you on the show. Thank you so much. I know you're on your holiday and you had to get rid of your family in order to do this, so I am eternally grateful. It's been a fantastic conversation; I'm sure everybody watching has so much to take away from it, and I'm just really grateful that you joined us. Thanks so much for your time.
Tash Norris: [00:55:49] Thank you so much for having me. Great fun.
Rik Ferguson: [00:55:52] I'll speak to you soon. There you go: another hour of your lives and my life has flown by. It was an absolute pleasure. I had a list of questions over here on my secret Let's Talk Security boards, and I think I probably managed to get through about half of them. It was an absolutely incredible conversation. Next week we have what I believe, unless one of you steps up and says 'I want to be on an episode', is the final episode of Let's Talk Security. It's going to be equally fantastic; please make sure you join us. For now, I want to once again express my thanks to the Tash Norris of cybersecurity for joining us, and wish you all the best with the rest of your day. I have been Ron Burgundy, and you stay classy.