Jake Williams, Founder of Rendition Infosec
This episode was originally streamed on Thu, 11-Jun-2020 to multiple platforms. You can watch the streams (along with the comments) on-demand on:
[00:04:57] Rik: Good morning, good afternoon, good evening, welcome. Thank you again for joining us. It's, episode six of the first season of Let's Talk Security.
It is, in fact, the final episode of this season. I know I had teased you earlier on, in the season, that I may have a- a surprise seventh episode with an, as yet, unconfirmed guest.
[00:05:21] Unfortunately she can't make it for very, very understandable reasons. I hope very much that she'll be able to join us, in season two, because, my producers and the network have assured me that our season, our- our show will be, renewed, for a second season.
Because apparently it's gone over really well and you people are really enjoying it.
[00:05:41] Once again, I'm grateful for your presence with us. Please feel free to use all the opportunities afforded you by social media.
We are broadcasting live, so the comments are there for you on YouTube, on LinkedIn, on Twitter, for you to jump in and ask your questions. We have... For this season finale, we have a really, fantastic guest for you today.
[00:06:05] if you've seen any of the- the tweets or posts promoting the show, you will know that that guest today, is Jake. Jake has a- a long and storied history.
He's worked, in defence, he's worked in National Security, he's worked in forensics, incident response, penetration testing, and Jake's area of specialisation, I guess, over and above all of that, and he'll correct me if I get this totally wrong, but from an outside position looking in, it looks like Jake's area of expertise is really all about thinking like the hacker.
[00:06:40] That's why I decided to call this episode, Become the Hunter, if you want to stop being hunted, the best thing you can do is become the hunter.
And there is no one better to talk to us about that mental shift that's required, than Jake. Jake, how're you doing?
[00:06:58] Jake: I'm doing outstanding. How about yourself?
[00:07:00] Rik: I'm all right. I just kind of dropped you straight in the stream there.
[00:07:03] Jake: I know, it's great.
[00:07:04] Rik: [crosstalk 00:07:03] you're off to one side thinking, "When is Ferguson gonna shut up?"
[00:07:06] Jake: Well no, I've got this nice multi monitor set up. It wasn't that I'm looking off to one side or whatever, it was I was looking at you, now I'm looking at the camera, right, kind of thing. [crosstalk 00:07:16]-
[00:07:15] Rik: Probably much, much prettier to look at the camera.
[00:07:17] Jake: Yeah, yeah. I'm always, uh... I'm always with that whole, like, where does the screen get positioned, where does the camera sit? It's a multi- the curse of the multi-monitor, right?
[00:07:25] Rik: So, I'm going, I'm going to drop you right in it straightaway. What's that little plaque behind you?
[00:07:31] Jake: that's actually a, uh- something I got many, many years ago. I used to do a bunch of military work. I say used to do. I was in the military for a number of years.
I was a paratrooper, did actual tactical work. And, it- it's something that, somebody got me a number of years ago, uh- around that, so-
[00:07:47] Rik: It's, Jake might play down some of his, military and national security experience. Some of it for, reasons that he has to.
But you- you told me a couple of amazing injuries, when we had a conversation, a couple days ago. Injuries that really made me wince, particularly the story about breaking your ankle, or was it both ankles?
[00:08:08] Jake: No, I just... I just broke the one ankle. It was... It was pretty nasty, yeah. I popped all three bones that kind of come together, and, yeah.
It's uh... No- di- didn't- didn't- didn't have a chance to get it set properly I guess is- is the right way to say it.
[00:08:21] Rik: Yeah.
[00:08:21] Jake: And so, yeah. It's- it's- it's been a lifelong, yeah, that- that's been lifelong fun, right [crosstalk 00:08:27] the VA is here to help, right?
[00:08:28] Rik: You had to walk on it, you had to stick with it for quite a long time before you got it treated, right?
[00:08:34] Jake: Yeah, a few days. Yeah, so, lo- long enough out of the... Out of the window they- they normally like to set, you know, that kind of stuff, right?
So, which has caused a bunch of complications there as well [crosstalk 00:08:44]- Operated on it, didn't. You know. You know the deal, right, so yeah, yeah.
[00:08:47] Rik: Yeah. No, I don't know the deal at all. That's why I'm sitting here wincing.
[00:08:52] Jake: [laughs].
[00:08:52] Rik: I can't even imagine [laughs]. I can't even begin to imagine what that's like. Thank you for your service.
And thank you to anyone who has served, who may be watching, uh. Live, or, or on the rebroadcast of this. Sincerely thank you.
[00:09:05] Jake: Thank you.
[00:09:05] Rik: Jake, one of the things, if- if I read any bios of you, and in fact, if I look at the Rendition page as well, Rendition is the company that you set up, right? You're the President and Founder of- of Rendition.
[00:09:18] Jake: Yeah.
[00:09:18] Rik: When was that, by the way? How long ago?
[00:09:20] Jake: it was, 2013, right, so 2013 is when we started. And, it was the same year I left the, the agency. Or National Security Agency, right?
So, but yeah, so that, yeah, so gosh, seven, man, seven years now. It feels like not that long ago. But it's been... Yeah, yeah.
[00:09:37] Rik: So it's been a... Been a mad ride. I know you- you've been involved in a whole bunch of things, throughout that time. You know, notwithstanding all the stuff that- that you did before, much of which I'm sure you can't talk about.
[00:09:47] Jake: Right.
[00:09:47] Rik: one of the things that I... That I noticed kind of was common across different bios and different sites, because you know, you do the Rendition stuff.
You do the- the SANS stuff. You do, you've just come straight here from another webinar that you were doing, so you're a man in demand. And obviously there's a lot of descriptions of you all over the place.
And one of the things I noticed that was... They all kind of had in common was this phrase called adversary emulation. And that that's kind of one of your specialities. What is that?
[00:10:18] Jake: Well, you know, we, uh... And- and I don't want to start a holy war here about, like, what is a red team versus a pen test, versus a vuln scan, versus a...
Everybody has opinions about all that and- and- and it's the kind of thing that you know, I... It's not worth the... Not worth fighting on. But- but I'll say that, you know, one of the places that- that I think we drop the ball a lot, in security is that, uh...
Or at least in the offensive security side, is that we are not acting like an adversary, right?
[00:10:44] w- we're handing tools to a, you know, handing tools to, a, pen tester and saying, "Hey, go- go hack stuff." Right?
"Go get domain admin." Right? And it's, like, stop focusing on domain admin, right? In fact, I did a- a talk about this, last week with- with SANS even. For their, you know, for their hack fest.
[00:11:01] And it's- it's all about act like the adversary, right? Start focusing on the things that the adversary focuses on, right? [crosstalk 00:11:08]-
[00:11:10] Rik: If it's not, I mean, admin, wha- what is it then if wha- that the adversary is- is focusing on? Or maybe there's lots of different things.
[00:11:14] Jake: Sure, sure. I mean, it... Domain admin is- is just... It's just a means to an end, right? You know, you, uh...
The- the way I- I like to think about it is that when an attacker's targeting you, right? You know, you have two- two real motivations for- for hackers. And I'm not getting into the hacktivism side. That- that- that's tiny, minor comparatively.
[00:11:32] Rik: Yeah.
[00:11:32] Jake: you know, it's- it's really, you know, you have financial motivation. You have intelligence motivation, right? So we're talking about our real APT, style, you know, style groups, right.
You know, we're looking at a financial motivation. And those are generally our FIN groups, you know, like again a FIN7. And then you'll look at, like, again, the you know, that- that intelligence motivation.
[00:11:51] But- but it's also about getting data to satisfy a collection goal, right? And I try to put myself in that, and obviously sit- having sat on the other side of the keyboard and fulfilling those gives you that kind of insight in thinking about it from this perspective.
But- but I- I'm- I'm really thinking about, oh there's a lot of folks like, "Oh, man, you know, OK, now we need to think about how would the attacker go and try and, you know, try and gain in this particular permission?" And I'm like, "I'm not thinking about that. I'm trying to think about what's their mission?"
Right? Because if you can start to step back and ask, "What's their mission?" They may not need to elevate, right?
[00:12:20] you know, and in a recent, recent pen test, we were targeting a, I said pen test, ad- real adversarial emulation kind of piece. We were targeting...
We were told to go target a bunch of a bunch of retail, uh- a bunch of retail, organisations. And- and this particular retailer, happens to do printing as- as one of their, you know, commercial level printing as- as- as one of their things, that they do.
[00:12:44] We- we- we phished a bunch of folks, and, you know, were able to consistently across a lot of their... And this became a big finding, by phishing their managers, their individual store managers, we were able to go and see the document share.
Apparently everybody ha- inside the- the particular store, had access to the document share for all the stuff that was being printed. The amount of legal documents and all this-
[00:13:04] Rik: Right.
[00:13:04] Jake: You know, it was just... It was mind blowing, right? Now, you know, a- again, you kind of step back and you say, "Okay, well this is, you know, th- this is a case where we're not elevating.
We're not going to, you know, to domain, you know, to domain admin." th- adversarial emulation. What does the adversary want?
Well, you know, if they're financially motivated, they want to go get on, you know, your- your payment terminals, right?
[00:13:25] but if they're intelligence motivated, what are they really going after? They're going after intellectual property, or in some cases your customers' intellectual property, right?
[00:13:33] Rik: Yep.
[00:13:33] Jake: And so we kind of talked a lot about... And- and really that's kind of where we're- we're kind of spinning back and saying, "Okay, let's think like the adversary. Let's think about how they target, right? And [crosstalk 00:13:43]-
[00:13:42] Rik: How is that distinct from, because another- another phrase which, rightly, is used a lot in our industry is threat modelling. How is adversary emulation distinct from threat modelling? Is it distinct from threat modelling?
[00:13:53] Jake: Yeah, I- I think it is, right? I- I think it is. So threat modelling, and- and this is kind of the next piece of the adversary emulation side is that we're using specific tools and techniques that we know particular adversaries are using, right.
[00:14:05] So, as opposed to, like, a threat emulation where I... Everything is wide open, right? I can just go, go, go. In adversary emulation, let's say that I'm emulating APT28, right?
So, largely attributed to- to being a Russian... A Russian GRU, right? So I might then take, you know basically say, "I'm going to use the specific techniques that we know that these individuals are using.
[00:14:27] Rik: Okay.
[00:14:28] Jake: Right? And I'm not going to use other techniques that we don't know they're using, right? So, the idea there being, if I know my organisation's being targeted by APT28, I'm going to use the tools and techniques that they're using, and go after the types of data that they'd go after.
And- and- and so as opposed to being a threat model, which is really an inside out look, this is an outside in, how are we going to go target the organisation look.
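The constraint Jake describes, limiting an engagement to tradecraft attributed to the emulated group, can be sketched as a simple allowlist check over ATT&CK technique IDs. The IDs below are an illustrative subset only, not an authoritative APT28 profile.

```python
# Sketch: vet a red-team plan against the techniques attributed to the
# emulated group. The ATT&CK technique IDs are illustrative examples,
# not a complete or authoritative APT28 profile.

# Techniques we believe the emulated group uses (illustrative subset).
APT28_TECHNIQUES = {
    "T1566.001",  # Spearphishing Attachment
    "T1059.001",  # Command and Scripting Interpreter: PowerShell
    "T1003.001",  # OS Credential Dumping: LSASS Memory
}

def vet_plan(planned_techniques, allowed=APT28_TECHNIQUES):
    """Split a planned technique list into in-scope and out-of-scope."""
    in_scope = [t for t in planned_techniques if t in allowed]
    out_of_scope = [t for t in planned_techniques if t not in allowed]
    return in_scope, out_of_scope

plan = ["T1566.001", "T1021.002", "T1003.001"]  # T1021.002 = SMB lateral movement
ok, rejected = vet_plan(plan)
print("in scope:", ok)            # consistent with the emulated group
print("needs review:", rejected)  # drop, or swap for group-consistent tradecraft
```

Anything that lands in the out-of-scope list either gets cut or replaced with a technique the group is actually known to use, which is exactly the extra discipline Jake says most red teams skip.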
[00:14:48] Rik: So kind of da- The recent report that we saw from MITRE, the, against the MITRE ATT&CK framework, against, I think it was APT29.
[00:14:56] Jake: Mm-hmm [affirmative].
[00:14:56] Rik: That's- that's kind of... They were [00:15:00] doing adversary emulation, effectively, if I've understood you correctly.
If you're taking, for example, all the tools and tactics of a particular threat group, and then running that against your security or, in the case of MITRE, against a bunch of different security, products.
That's basically adversarial emulation. One given situation, one given set of, tools and techniques and seeing how it works out in terms of how does your defence stack up against that.
[00:15:22] Is that-
[00:15:23] Jake: Sure.
[00:15:24] Rik: Yep?
[00:15:24] Jake: Yeah, no. That- that- that's it. And you know, a lot of folks don't do this, we- we find, because it requires a lot of additional... It requires a lot of additional work, right?
You know. I- I have to limit myself in my red team activities, right? I have to limit myself to the tools and techniques that- that the actors would use. I don't obviously have all of the actor tools. I- I, you know, I can't, like, phone up APT28 and be like, "Hey, loan me some malware," [crosstalk 00:15:46]-
[00:15:46] Rik: Yeah [laughs].
[00:15:48] Jake: Can I borrow your implant, right, kind of thing? They- they- they're not cool with that. It turns out me and the Russians just, we-
[00:15:53] Rik: Just don't get on [crosstalk 00:15:54]-
[00:15:54] Jake: No, what were the odds, right. I mean, like, they, uh... They're like, "No, you're not cool." but, no.
So, you know, there's- there's some development that has to go into this, right? We have to modify our tools, to- to leave specific tool marks, to make them look like that.
[00:16:08] we- we actually had an adversary emulation exercise, or engagement, uh a couple years ago, where, you know, suffice it to say that, there were a few people at the organisation that were, in- in the know. And, you know, they didn't follow their escalation paths correctly.
And, you know, the people in the know, found out about it late in the game. But they had actually brought in their, their incident response retainer. They- they retained incident response, you know, investigators to look at, uh-
[00:16:36] Rik: For real.
[00:16:36] Jake: -use of the incident. Yeah, for real.
[00:16:37] Rik: Okay.
[00:16:38] Jake: and- and they, they determined that their incident response, and this is when they really started to gear up, you know, was- was when they then, you know, pulled in, the people in the know.
And they're like, "Man, we have a confirmed..." and- and we got the call, as the-
[00:16:51] Rik: Right.
[00:16:51] Jake: As- as the, "Hey, we have to stop. We have a confirmed incident." Right? Like, stop the pen test-
[00:16:56] Rik: Oh, wow. Okay.
[00:16:56] Jake: Stop the engagement, we have a [crosstalk 00:16:59]-
[00:16:58] Rik: These were the people that knew that you were on an engagement still hadn't put two and two together, because they called you to say stop.
[00:17:04] Jake: Yes. Yes. Because, and this- this is the key here, because they said, "Oh, yeah. They've looked at this. They're- they're certain it's Chinese APT. It's APT22." and [crosstalk 00:17:15] we're like, "Oh. That's exactly who we, you know, who we were emulating."
You said the Chinese, you know, we have a- a Chinese, uh- uh threat mo- or, you know, specifically believed that, you know, there was... There was a threat of a compromise from, you know, from Chinese APT groups. That's who we chose to emulate.
[00:17:29] Everybody was cool with that, and nobody put that together, right? Even the people in the know. When they were told, you know, nobody then said, "Oh, is it them?" Right?
We... Again, the tool marks looked good to- to the point that the incident response provider said, "Yep, that's- that's them." So-
[00:17:43] Rik: So sometimes you're on engagements that, rather than emulating a specific group, the field is wide open, right?
[00:17:51] Jake: Yeah.
[00:17:51] Rik: People just say, "Come on in. Do whatever you like."
[00:17:54] Jake: Let's go.
[00:17:55] Rik: Let's go [crosstalk 00:17:55] How far you get, what you get access to and so on. And that was the kind of engagement, I believe, that led to you creating DropSmack, which was back when you started the company, right? Around the same time.
[00:18:04] Jake: Yeah, absolutely. [crosstalk 00:18:06] that was around that. Yeah, it was around that time.
[00:18:07] Rik: I mean, that was a specific tool for a specific purpose at a- a given point in history. And it was 2013, for those who don't know.
And DropSmack was a tool for compromise, misusing the Dropbox service and architecture with a booby-trapped document, effectively, right?
[00:18:25] Jake: Mm-hmm [affirmative].
[00:18:26] Rik: is that still, an area of concern for businesses? Because I know that, there are now, of course, enterprise offerings that are relatively mature, that offer the same kinds of things as the Dropboxes and- and Google Drives of the world.
But all... But I also know that those kinds of services are still, widely used by individual employees within an organisation. So do you still find a use for tools like that to abuse those kinds of services? And, [laughs] I'm rubbish, I just keep talking.
[00:18:54] Jake: You're good.
[00:18:54] Rik: And, have you found that the further, because cloud adoption has increased significantly over those seven years, right?
[00:19:03] Jake: Yeah.
[00:19:03] Rik: Further adoption of cloud in all of its many and varied glorious different ways, has increased the attack surface in any way?
[00:19:10] Jake: Yeah, you know, it's funny. I was actually talking about this on- on Tuesday, believe it or not doing yet a different webinar. But, yeah.
On Tuesday, we were actually chatting about this, that the browser, right, being kind of the new operating system [crosstalk 00:19:22] you talked about, you know, cloud adoption, right?
You know, you know, the, uh... I- I think that beyond anything else, right, the attack surface definitely is increasing, right?
[00:19:30] you know, and- and one of the things we were talking about on, you know, talking about there on Tuesday, was the, you know, basically this- this whole spin of, if- if I know...
Let's say that I can't get data from, your whatever super secure, cloud service. Right, you've got all your data. This data used to be on prem, as- as you mentioned really is- is moving increasingly into the cloud, right?
[00:19:52] Salesforce, ri- You've got, you know, QuickBooks. You name it, this, you know, you've got tonnes of financial data out there, tonnes of- of- of lead data.
And- and these, in many cases are the crown jewels for organisations. And so, while- while those services themselves, I'm- I'm sure are very secure, you know, th- there are other services that you're likely using, SaaS platforms within an enterprise that- that are not, right?
[00:20:14] And- and we... I do think that attackers are going to, you know, start attacking some of those lesser-known cloud services, lesser-known, less secure SaaS services, software as a service, you know, platforms, and use those, once- once they've compromised one of those, use those as the entry vector, into the- the browser, right?
The attack, you know, either the browser or the user's, machine. And then- then pivot from there into, you know, some of your more secure SAS services, right? Again, that browser is becoming that OS, right?
[00:20:44] you know, on the file synchronisation side, yeah, that- that stuff's still wide open, man. You know, it's- it's, yes, it's- it's a tough, th- it's a tough attack surface, right?
And- and- and every time, you know, we open up a service, whether it be, you know, a thick client like a Dropbox style service or- or SaaS, use of any SaaS platform, you know, when we allow that, you know, we allow everything that goes with that, right?
[00:21:08] And- and- and that offers just awesome opportunities for attackers for, command and control. As- as- as well as, data exfiltration, and- and- and getting, data and tools into the environment.
You know, one final thing that- that I'll mention that a lot of folks don't discuss a- a lot is that, in this shift from on prem to- to in the cloud, right, I hear all the time, and by the way, I'm a big... I'm a big fan of realistic security, right?
I don't like to play the game of- of oh, my gosh, the sky is falling because of this weird unicorn attack or whatever.
[00:21:38] Rik: Yeah.
[00:21:39] Jake: I, you know, le- le- let's be realistic about security here. But, you know, one of the things that, I hear a lot of security folks, commenting, like, "The cloud is bad." Right?
I'm like, "The cloud isn't good or bad. It's just different."[crosstalk 00:21:52] Yeah. The thing is I talk to folks about a lot though in the IT side, right, you know, folks will come in and they'll say, it's particularly in bigger organisations we consult with.
And they'll say, "Well, IT has done this study, and they show this, you know, the total cost of ownership is going to drop when we move this thing to the cloud." Right?
I'm like, "Yes, TCO is awesome. But I want to put the T in TCO." Right? Because, you know, T, it- it- it's total, right.
[00:22:15] and- and very often we see these TCO calculations, what they're not accounting for is security, right?
We've already invested in how to secure our on prem assets, right? And as we then move that same data to the cloud, two things happen. One, we have to deploy, typically, additional security controls, right?
And with those additional security controls, we know complexity is the enemy of security as well. That- that's another [crosstalk 00:22:37]-
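Jake's "put the T in TCO" point reduces to adding the security line items the comparison usually omits. A minimal sketch, with entirely invented figures:

```python
# Sketch: a cloud-migration TCO comparison that includes the security line
# items that often get left out. All figures are invented for illustration.

def tco(base_costs, security_costs):
    """Total cost of ownership = base costs plus security costs."""
    return sum(base_costs.values()) + sum(security_costs.values())

on_prem = tco(
    base_costs={"hardware": 400_000, "ops_staff": 300_000},
    security_costs={"existing_controls": 50_000},  # already built and paid down
)
cloud = tco(
    base_costs={"subscription": 350_000, "ops_staff": 200_000},
    # New controls the migration forces: CASB, cloud logging, staff retraining.
    security_costs={"casb": 80_000, "new_logging": 40_000, "retraining": 30_000},
)
print(f"on-prem TCO: ${on_prem:,}")
print(f"cloud TCO:   ${cloud:,}")
# Base costs alone favour cloud by $150k; once the security line items are
# counted, the gap narrows to $50k - the "T" was incomplete.
```

The point isn't that cloud loses the comparison, only that a TCO study which silently reuses the sunk cost of existing on-prem controls isn't actually total.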
[00:22:37] Rik: Yeah, and to be honest, a lot of the time you're forcing people to move from the solutions and services, and products that they're used to using in their on prem environment to tools that they're a lot less familiar with-
[00:22:49] Jake: Mm-hmm [affirmative].
[00:22:49] Rik: in a cloud environment. I think that's one of the things that's driving, to be honest, a lot of vendor consolidation, because otherwise you're forcing multiple, user interfaces, multiple management consoles, multiple reporting streams, and you end up with, you know, alert fatigue and alert overload, and so on.
[00:23:06] Jake: Sure, sure. But you know, the second thing, and this is something I don't hear a lot of folks talk about, right, is that we can't effectively test these SaaS platforms and the security of these SaaS platforms the way that we test, and used to test, the same on prem interfaces, right? And so, uh-
[00:23:21] Rik: Particularly in multi [crosstalk 00:23:22] ones right?
[00:23:23] Jake: What's that?
[00:23:24] Rik: Particularly multi tenanted ones, because [crosstalk 00:23:26]-
[00:23:26] Jake: Oh, yeah, totally, totally. Like, I can't go pen test Salesforce, right? I have to trust that they, you know, if I'm a Salesforce customer, I have to trust they've got security handled, right?
On that side, and- and so... And I'm not saying anything bad about them. I 100% trust them, but mom and pop SaaS, you know, SaaS provider, eh, not so, you know. Yeah, not so much, right?
So- so there's a lot of... There- there's a loss of transparency, in many cases a loss of logging. We get some of that back with CASB, solutions and whatnot.
[00:23:52] But- but it is... It- it- it's a brave new world. But to your point, no, none of that's really changed, and- and- and additional SaaS, options and- and more stuff in the cloud, I think is- is just creating more command and control channels than we ever saw before.
[00:24:05] Rik: And where do you think... So, if- if you're talking about, attacks on, cloud service... Not cloud service providers, SaaS providers, let's say, do you think the risk there is more in attacks on APIs?
Or do you think it's in traditional kinds of infrastructure and operations, application level attacks, but again, it's those providers, or both?
[00:24:31] Jake: I mean, I- I think it's going to be both. I think a lot of it's going to depend on what is your, you know, what- what- what- what's your target, right?
You know, if I... If- if I'm targeting an organisation, right, so- so basically an organisation, I- I'm going to keep using Salesforce here, because it- it's- it's [crosstalk 00:25:01] it's a good example, they [crosstalk 00:25:01]-
[00:24:49] Rik: Poor Salesforce, it's not about you, we promise.
[00:24:50] Jake: Yeah, it's not about you. In fact, we are a Salesforce customer over at Rendition. We love Salesforce. Rock on.
[00:24:53] Rik: There you go.
[00:24:53] Jake: Right? But, but let's, you know, let's take them for instance, right? And- and say, you know, that, if- if I'm targeting an organisation that uses Salesforce, yeah, I- I'm really not going to be targeting, you know, the- the API.
I want that specific, that specific org's data. Right? And the quickest way to get there is to get in my browser or a browser of an authorised user there, and just suck the data out the same way that- that I would, right?
[00:25:18] And- and so, I might do that with an API, right? I might compromise the browser, and then issue or create my own API, secret, right? Who the heck's looking for that in their account? And, you know-
[00:25:28] Rik: Sure.
[00:25:28] Jake: -We should be reviewing that kind of stuff regularly. But then the attacker then at their leisure, uses the API to go to suck the data. I don't consider that an attack on the API.
That's really an attack on the end user, and then using that API.
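The periodic review Jake says we should be doing, checking for API secrets an attacker quietly minted from a compromised browser session, can be as simple as diffing live credentials against an approved inventory. The record layout here is hypothetical, not any particular SaaS provider's schema.

```python
from datetime import datetime, timedelta, timezone

# Sketch: flag API credentials that aren't in an approved inventory, or that
# were created recently enough to warrant a second look. The record layout
# is hypothetical, not any specific SaaS provider's schema.

APPROVED_KEY_IDS = {"key-billing-sync", "key-bi-export"}

def review_api_keys(live_keys, now=None, recent_days=30):
    now = now or datetime.now(timezone.utc)
    findings = []
    for key in live_keys:
        if key["id"] not in APPROVED_KEY_IDS:
            findings.append((key["id"], "not in approved inventory"))
        elif now - key["created"] < timedelta(days=recent_days):
            findings.append((key["id"], "recently created, verify owner"))
    return findings

live = [
    {"id": "key-billing-sync", "created": datetime(2020, 1, 5, tzinfo=timezone.utc)},
    {"id": "key-x7f2", "created": datetime(2020, 6, 9, tzinfo=timezone.utc)},  # attacker-minted?
]
for key_id, reason in review_api_keys(live):
    print(f"REVIEW {key_id}: {reason}")
```

Run on a schedule, this catches exactly the scenario Jake describes: the key itself is legitimate API usage, so only the inventory diff reveals it shouldn't exist.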
[00:25:39] Rik: Yeah.
[00:25:40] Jake: But, uh [crosstalk 00:25:41]-
[00:25:41] Rik: Living off that attack, right? That's- [crosstalk 00:25:43]
[00:25:43] Jake: I mean, it feels-
[00:25:44] Rik: Without the binaries.
[00:25:45] Jake: Yep. Do you know I actually almost used the word. It almost spins right back into LOLBins, right? I almost said it, and I was like, eh- eh.
[00:25:54] Rik: for the... For the benefit of- of the audience, we were laughing about the- the- the- the expression LOLBins just before we went live, it's currently one of my absolute favourite words.
LOLBins is just amazing. It sh- it should be a new, uh... Any artists out there watching who are in our field, that would be a fantastic, title for a cartoon strip, in the infosec area, you definitely should use that.
Have a... Have a LOL- LOLBins cartoon strip.
[00:26:19] Jake: I love it [crosstalk 00:26:22]-
[00:26:22] Rik: Yeah, Dilbert or whatever, right. One of the other areas that you work in a lot, and I guess it dovetails totally into, the adversary emulation and, and pen testing stuff that we spoke about, is incident response.
You know, the other end of, you know, something bad happens, what- what do you do, as an organisation? We see the best and the worst of it, I think, in the news.
And we see it increasingly, with high level attacks, the- that- that do garner news coverage.
[00:26:58] sometimes they're handled really well. Sometimes they're handled spectacularly badly, and I think we can probably all think of, examples of both straight off the top of our heads.
You don't have to name names. Feel free if you want to, but you don't have to name names, but what I'm curious about is, what do you think people are increasingly getting right, and what are we learning from all of the stuff that's happened before?
And what do people continually overlook, get wrong, never quite get their- their hands around in incident response?
[00:27:28] Jake: I'm going to start with what they get right, or what we're seeing people get right, I guess m- more often, right? I mean there's a, there- there's...
By the way, as- as, you know, as we say this, know that there is a, a- a huge difference between, you know, I say this, between medium to large size organisations and- and much smaller ones, right?
[00:27:45] Rik: Yeah.
[00:27:45] Jake: but- but- but I think the thing that we're getting, you know, that we're getting right, when it comes to incident response is that in medium to large size organisations, there has been some discussion already, that- that an incident is- is, if not likely, or sorry if not inevitable, is- is likely over the course of, you know, some period of time, right?
And so- so there's... I- I am... It is a rare scenario today where I am walking into a boardroom, or to discuss with an executive team and, you know, I'm getting the whole, "Now, hold on a second. I thought our stuff was un-hackable." I'm not hearing that, right?
[00:28:22] Rik: Okay.
[00:28:22] Jake: That- that's-
[00:28:22] Rik: And you did before?
[00:28:24] Jake: I did, oh, yeah. Yeah, there was... There were people that were, like, you know, "Oh, my gosh, this, you know... We were... We put it together. It was all secure.
I have no idea how this could have happened. Yeah." We- we're out of that, right? In medium to large size organisations, especially, like, there is a- a recognition. And- and it's not just individually that there's been a recognition.
At an organisation level, you know, at least there's been some discussion about the fact that, you know, because either one of their peers is hacked, or-
[00:28:50] Rik: Yep.
[00:28:51] Jake: You know, again, just seeing co- it's constantly in the news. You- you know, for us, we- we have... We have a bias, right? You know, we're, an availability bias, right?
Where we're seeing these stories that are huge in- in infosec, you know, side, that then really haven't broken out of that infosec, kind of space, right?
[00:29:07] Rik: Right.
[00:29:07] Jake: and so, you know, that- that's- that's kind of... Kind of a- a piece there, but we're seeing more and more executives that are really tracking these- these breaches, right? And- and seeing [crosstalk 00:29:17]-
[00:29:17] Rik: Now, where [crosstalk 00:29:18]-
[00:29:18] Jake: Yeah, yeah.
[00:29:19] Rik: Right?
[00:29:19] Jake: One of the things I tell people that are briefing boards. It- it's like, "Stop going in with FUD." Right? If you've got fear, uncertainty, and doubt, right? If you've got 10 minutes to brief the board, 15 minutes to brief the board, do not make the first...
I- I- I end up doing a lot of, like, you know, one on ones, or, dinners, et cetera, kind of after hours stuff with these folks. And I'll tell you that, that is the number one complaint that I hear is the, "Stop telling me the problem. It's like, I know the problem."
Or get granular with it, right? But don't go in there with, "There are increasing hacking, whatever, 2.4 billion hacking attempts per year." Right?
[00:29:52] I saw something, where somebody posted, very high profile account. I'm going to leave out who, but, just last week, in fact, one of their virtual, you know, or online conferences, you know, they- they, uh basically tweeted from one of their presenters where they were like this ridiculous amount of money lost per minute, right?
[00:30:08] They're like, "Per minute, there's X millions of dollars lost to cyber crime."
[00:30:11] Rik: Yeah.
[00:30:11] Jake: We did the maths on it. We're like, "That is roughly 18 times the GDP of the planet, right?"
[00:30:18] Rik: So where does all that money come from? And where does it go?
[00:30:21] Jake: Yeah, exactly. Somebody in my replies actually hit it the best when they said, "Man, we could literally offer to pay every potential hacker a million dollars a year to not hack, and that would solve cybersecurity."
Like, if that... If that loss number is true... And they're like, "And we'd still come out phenomenally ahead."
[00:30:39] And I'm like, "You're right. That's how ridiculous some of these numbers are." [crosstalk 00:30:43]-
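The sanity check Jake's team ran is plain arithmetic: scale the per-minute claim to a year and compare it with gross world product (roughly $85 trillion around 2020). The per-minute figure below is a placeholder chosen to reproduce a result in the "18 times GDP" range; the actual number from the tweet isn't quoted in the conversation.

```python
# Sanity-checking a "dollars lost to cybercrime per minute" claim by scaling
# it to a year and comparing with gross world product (~$85 trillion, 2020).
# The per-minute figure is a placeholder, not the actual claimed number.

MINUTES_PER_YEAR = 365 * 24 * 60   # 525,600
WORLD_GDP = 85e12                  # rough 2020 gross world product, USD

def annualise(loss_per_minute):
    """Scale a per-minute dollar figure to a full year."""
    return loss_per_minute * MINUTES_PER_YEAR

claimed_per_minute = 3e9           # placeholder: $3 billion per minute
annual = annualise(claimed_per_minute)
print(f"annualised loss: ${annual:,.0f}")
print(f"that is {annual / WORLD_GDP:.1f}x gross world product")
# A claim in this range annualises to roughly 18x world GDP, which is
# obviously impossible - the kind of FUD number boards tune out.
```

Five minutes with a calculator like this is a cheap filter before any statistic goes into a board deck.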
[00:30:42] Rik: Interesting, causing, like, a denial of service on the hacker registry, as people kind of sign up for- for their million, right? I'd be in the queue.
[00:30:49] Jake: That's exactly it. Me- me too, right? But you know, on- on that end, right? So- so- so it's the stop the FUD.
They- they don't need to hear that, and that's kind of the what we're getting right.
[00:30:57] The what we're getting wrong is- is translating that recognition of cybersecurity issues, and- and ultimately how we're going to do incident response, into language that they understand, right?
[00:31:13] Rik: Yeah.
[00:31:13] Jake: So I regularly hear from boards as well, I- I'm going to pivot this back, you know, like, maybe a little bit less techie and- and talk, you know, from board perspective.
One of the things you regularly hear from them is they'll say, "Well, you know, they- they came in last year, and they asked..."
And this is one of the things I actually do some work with, with the vCISO and- and even just advisory services, where they'll say, "Look, you know, they- they came to us last year and said we need a million dollars for... Or two million dollars for this- this SIEM deployment. Right? And I thought that was it, and we were done with security."
[00:31:43] "And now they're asking for money for security again to deploy this other product. And I don't know, like, why didn't product X take care of produce, you know, or take care of that issue that, now they need more money for this year, aet cetera.
And- and, you know, really it's- it's not just about maintenance, right? We're in this constant evolutionary phase, right? CASB wasn't a thing years ago, because we didn't need it, right?
Now with all of these SaaS deployments, I need CASB. It- it's not a... It- It's not a- a fringe tool. It is a core tool in a lot of security infrastructures.
[00:32:12] And so, what we're not doing a good job of, right, whereas you and I know that CASB word, right? And you and I know DLP, and you know what it does.
We know what the limitations of it are. We're not briefing that to boards effectively. And the way to do that, by the way, in my experience is, it- it's not to go in and be like, "You need CASB. Let me explain what CASB is. It's a cloud access security broker."
And I mean, you've already put them to sleep, right?
[00:32:34] Rik: Yeah [crosstalk 00:32:35]-
[00:32:35] Jake: Yeah, back up, like eight steps here, right? What you need to do instead is- is- is literally take- take it scenario-based, and- and talk about how each one of these things is a visibility tool, right?
Because, you know, at the end of the day, execs and boards, you know, board members, they- they understand the market, right? And- and- and, you know, markets are inherently...
Most have business degrees, right? They, most of them, inherently understand that, you know, uncertainty is- is a thing that does not bode well in business.
[00:33:02] Rik: Sure.
[00:33:03] Jake: And so I like to brief my security solutions as decision support tools, right?
[00:33:08] Rik: This is, it's almost adversary emulation again, right? Just in a totally different direction, actually.
[00:33:12] Jake: Well...
[00:33:12] Rik: You see the board as an adversary and start emulating their way of thinking. What do they want?
What do they want to hear about? How do I [crosstalk 00:33:19] it to that, right?
[00:33:20] Jake: Target my message. That's exactly... But- but on the incident response side, we're doing a bad job of that, and that leads to bad IR outcomes, because we haven't looked at where our visibility gaps are.
And if we have, we haven't explained those in a way that- that ultimately our decision makers, our budgetary decision makers understand, right? And if our budgetary decision makers don't understand the impact of not spending... Right, what they're hearing is, "Spend this money or we're going to get hacked."
And the reality is, they know that everybody else is spending money and they're getting hacked anyway, right?
[00:33:49] And so when you flip that on its head, and they get hacked, and they're like, "Whoa, whoa, whoa. You know, now you're going to be able to tell me why we got hacked."
And you're like, "We don't have the logs. We don't have the- the actual data, the telemetry in place to be able to see that." [crosstalk 00:34:02] time out. We've spent a bunch on sec- why don't we have... What I like to do instead is say, "Hey, look, you know, here- here's an incident, right? Here's one that's in the news, right, insert any of them here."
I just saw a big article on Capital One this morning. Let's use them. You know, the, you know, at the end of it, they're going to ask, like, "Hey, what happened?" Right?
[00:34:19] And I want to be able to answer the, you know, be able to give the best answer possible, right?
What I like to do is take one of those scenarios and say, "Okay, well here's kind of how this played out. If this happened to us, here are the answers we couldn't provide.
Here are the questions I'm not going to be able to answer for you."
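The scenario-driven walkthrough Jake describes can be sketched as a simple mapping from an incident's steps to the telemetry you would need at each step. The scenario, steps, and log sources below are illustrative, not taken from any real engagement:

```python
# Sketch of a scenario-driven IR gap analysis: walk a public incident's
# steps and check each against the telemetry you actually collect.
# All scenario steps and log sources here are hypothetical examples.

incident_steps = {
    "initial access via exposed web app":      "web server access logs",
    "credential theft from metadata service":  "cloud API audit logs",
    "bulk data download from storage":         "storage access logs",
}

# Hypothetical current state: only one of the three sources is collected.
telemetry_we_collect = {"web server access logs"}

for step, needed in incident_steps.items():
    status = "covered" if needed in telemetry_we_collect else "GAP"
    print(f"{status:7} | {step} (needs: {needed})")
```

Each "GAP" line is a question the board would ask after a breach that you could not answer, which is exactly the framing Jake recommends briefing with.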
[00:34:33] Rik: You're doing a gap analysis, effectively, right?
[00:34:35] Jake: That's it. That's a hundr- that's actually what we- we actually call it an incident response gap analysis, yeah.
[00:34:39] Rik: Yeah.
[00:34:40] Jake: 100%, yep.
[00:34:41] Rik: Do you think part of the problem is that there's still a perception of, silver bullet solutions?
[00:34:47] Jake: Oh, there- there's definitely a lot of that, right? Silver bullet solutions, yeah. There- there's, wow, there's a huge, yes, huge focus on that.
In fact, you've actually pivoted right into one of my favourite sayings there when it comes to that stuff. It- it's that you only need silver bullet solutions if werewolves [crosstalk 00:35:03] are in your threat model.
[00:35:04] Rik: Right.
[00:35:05] Jake: Right? So, yep, yep.
[00:35:07] Rik: Absolu- there's- there's a- a question coming from, from LinkedIn, which I suppose you- you'll have an opinion on from your adversary emulation, side of the business-
[00:35:22] Jake: Yes.
[00:35:23] Rik: Is, it... The question says, "What do you think about honeypots?" I guess that means, are they effective? Are they worthwhile?
[00:35:29] Jake: Absolutely.
[00:35:29] Rik: Are there different kinds of honeypot? Are there different use cases? Are there times when you never should use them?
[00:35:35] Jake: Yeah, so- so I... Man, I've got a lot of feels about these, right? We work incidents that happen because of honeypots, right? So, high interaction honeypots, right?
Where, folks set up intentionally vulnerable systems. We know we have a lot of churn in infosec. Bob sets up a honeypot and leaves. And then, you know, of course did a great pass down, right? A great pass down.
[00:35:56] and, you know, so hands that off. And of course, they've, you [00:36:00] know, maybe it's feeding alerts to his mailbox, or, you name it, whatever, right?
And, you know, and then of course, nobody's really monitoring this any more. Attackers just set up residence there because it is a [laughs]... It is an intentionally vulnerable system.
I'm not a big fan of those, right, to be straight up, like high interaction. And- and honestly, a lot of it is really, what- what are your goals, right?
[00:36:19] I will tell you I'm a big fan of, you know, some of... Some of, the, the deception technologies out there, right? We worked an incident recently where- where they were very, very effective in- in slowing the attacker down, right?
Because the attacker was- was looking around and interacting with, you know, different pieces of this, you know, basically this- this deception technology, right? So- so they were password spraying, these systems that didn't even exist, right?
So- so that was a- a phenomenal use of- of that deception tech. And- and you could call those, you know, call those systems that don't exist, right?
[00:36:55] They're- they're effectively network layer honeypots, right? And it was, you know, phenomenal, you know, phe- phenomenal use of that.
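The detection win Jake describes follows from the fact that a decoy host has no legitimate users: any authentication attempt against it is high-signal by definition. A minimal sketch, with a hypothetical event format and addresses:

```python
# Any authentication attempt against a decoy ("nonexistent") host is
# suspicious by definition. Event format and IPs below are hypothetical.

DECOY_HOSTS = {"10.9.8.10", "10.9.8.11", "10.9.8.12"}  # deception-layer addresses

auth_events = [
    {"src": "10.1.1.50", "dst": "10.0.0.5",  "user": "alice"},  # real server, normal
    {"src": "10.1.1.99", "dst": "10.9.8.10", "user": "admin"},  # decoy hit
    {"src": "10.1.1.99", "dst": "10.9.8.11", "user": "admin"},  # decoy hit
]

alerts = [e for e in auth_events if e["dst"] in DECOY_HOSTS]
for e in alerts:
    print(f"ALERT: auth attempt on decoy {e['dst']} from {e['src']} as {e['user']}")
```

Many decoy hits from one source in a short window looks like spraying, and those alarms can then be correlated with other activity that alone would sit below the noise floor, as in the incident Jake describes.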
I will mention that, you know, I- I- I don't think that honeypots are the first thing you should do in security. Whenever I talk to folks about deception technology, I'm going to step past honeypots, because- because really, I think honeypot is like- like V1, version one of- of what is a broader deception technology architecture, right?
[00:37:21] Rik: Right.
[00:37:21] Jake: I- I think deception technology is- is a useful tool. But- but I also am- am very cautious to, to not deploy it first, right? And the- the way that I always analogise it to people is- is think about going to the circus, right?
And, watching the trapeze artists, right, they typically have a safety net, thankfully, right? But if you put me up on the trapeze, right? And that safety net is not going to do a heck of a lot for me, right? I am not already skilled in- in trapeze artistry.
If I fall in the safety net, I may break my leg, right, by sticking it through the hole or whatever. I may bounce out of the safety net and fall on the concrete.
[00:37:57] I am not equipped to use that safety net in any way, shape or form. There's a foundational use that has to be there first, right? And so, but with these well trained actors, right?
When I already have a good detection portfolio in place, if the attacker happens to slip through that safety net, right?
Or sli- slip through that detection portfolio, all the training that trapeze artists have, then they fall out of the safety net, right? And so I like to think of it in that... In- in that way.
[00:38:22] what I am seeing people do today that I think is absolutely the wrong thing to do is, they're looking and saying, "Okay, I'm starting to really build or- or for the first time, have thoughtful, true thoughtful engineering around my whole security stack. I'm going to go deploy deception."
And- and without a real concept of- of how that's baked into everything else a- around their, you know, around their security... Their- their security model and their security stack.
[00:38:46] Rik: [crosstalk 00:38:46] quite often about honeypots is that they'll be deployed as the high interaction type models that you spoke about or maybe even something which, I guess, you would consider low interaction, maybe you just have a, um- a fake user account that [00:39:00] no one should ever log into, so as soon as it gets logged into you have a flag raised.
Something as simple as that, right?
[00:39:05] Jake: Mm-hmm [affirmative].
[00:39:05] Rik: But what is often neglected, around those deployments, is that people forget that they need to deploy their security around the honeypot just as effectively as they need to deploy it around the production systems.
Otherwise it's like you're- you're someone who sells gasoline for a living, and your honeypot is basically saying, "I'm going to leave a pile of open canisters of gasoline around the back and just see if anyone sets fire to them."
Because that's very quickly out of control, right as a... As a honeypot.
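The low-interaction decoy Rik mentions, a fake account nobody should ever log into, takes only a few lines to monitor. The account names and log format below are hypothetical:

```python
# Flag any successful login by a canary account -- a decoy user that no
# legitimate process should ever use. Names and log lines are hypothetical.

import re

CANARY_ACCOUNTS = {"svc_backup_old", "jsmith_temp"}  # decoy usernames

log_lines = [
    "2020-06-11T10:02:11 login ok user=alice src=10.1.1.50",
    "2020-06-11T10:05:43 login ok user=svc_backup_old src=10.1.1.99",
]

pattern = re.compile(r"login ok user=(\S+) src=(\S+)")
for line in log_lines:
    m = pattern.search(line)
    if m and m.group(1) in CANARY_ACCOUNTS:
        print(f"ALERT: canary account {m.group(1)} used from {m.group(2)}")
```

The point both speakers make still applies: the alerting path itself has to be monitored and maintained, or the canary is just another forgotten system.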
[00:39:35] Jake: Yeah, yeah. Well, then also, too, right? It's all about, you know, the high interaction versus, you know the- the, thing I was talking about before, right?
Is the, uh- where we had the attacker, it slowed down, the password spraying of systems that didn't exist. This created a ton of noise. It was great from a detection standpoint, right?
Created alarms that- that correlated with other activity that, by itself, may not have risen up above that noise floor. Right? But- but on the high interaction side, I see a lot of folks where they're like...
And you can argue about, like, is- is, you know, the point that we've negotiated an SSH session, is that now high interaction, or an SMB session?
[00:40:07] I think until you're actually on the box deploying tools, et cetera, that's really where I go into high interaction, right? Everything before that can be just, you know, network trickery.
But, you know, the... A lot of folks are like, "Yeah..." There- there's like this James Bond kind of feel to it. I understand why, like, the deception technology is so popular, because it's like, a lot of the vendors even are pushing where they're like, "Yeah, it's going to be awesome because you're going to be able to watch what the attacker does, and we're going to push them down this, like, this- this maze of fake systems. So you can observe all their tools and their techniques and the whole..."
[00:40:37] And- and I'm here to tell you [crosstalk 00:40:38]-
[00:40:38] Rik: Nothing will go wrong.
[00:40:39] Jake: Yeah. Well- well, even if it works perfect, right, I- I always tell folks that... To come back and say, "Hey, first off, is that effective? Can you operationalise that data?" Right?
Because if you can't, then you're spending a bunch of money to- to- to what? I mean, look, I'm going to bounce the attacker out of the network anyway once I'm convinced I- I've got them contained, right?
[00:40:58] So- so when I see that those SSH attempts, those SMB attempts inside the networks, right? I know I've got an intrusion or- or at least I'm- I'm working on an incident response.
As soon as I'm positive I can contain the attacker, doing a little bit of investigation, I'm going to bounce them out of my network. I don't need to wait for them to watch, or wait and watch, you know, days, hours, whatever of- of additional activity in this, you know, this virtual environment.
[00:41:21] Because to your point, one, yeah, that's- that's tough. And, you know, there- there is a lot that can go wrong.
But- but even when it... Even when everything goes right, there are few organisations out there that can truly operationalise that level of data. [crosstalk 00:41:35]-
[00:41:34] Rik: Speaking of, there's a lot of things that can go wrong, I know that the industry as a whole, when- when the whole, COVID-19 global pandemic hit and we were...
Every country was shifting into lockdown mode and work from home mode.
[00:41:50] Jake: Mm-hmm [affirmative].
[00:41:50] Rik: There was a lot of marketing, sent out from a lot of companies about how to secure your- your- your work from home environment and...
[00:41:57] Jake: Yeah.
[00:41:58] Rik: Everyone... And- [00:42:00] and that's, you know, I... That's not un- I smiled when I said it, but it's not unjustified that...
Marketing, that was actually a good education opportunity for what we, I suppose, still believe is a... Is a pretty good reason.
People are in a new, architectural situation from an IT and a security perspective. They're in a new mental situation in terms of working from their home rather than working from the office.
[00:42:24] There's a whole mental model that goes along with that in terms of access to information, sharing information, use of systems and software and so on.
So we were all, as an industry, and I know that- that at Rendition you were no different, expecting to see, an uptick in attacker activity.
[00:42:40] Jake: Mm-hmm [affirmative].
[00:42:40] Rik: Have you?
[00:42:42] Jake: Y- Yes, and no. Right? So, from a COVID, you know, from a scammer side of- of things, sure. Absolutely, right?
[00:42:50] Rik: Yeah.
[00:42:51] Jake: and- and a lot of the scams, honestly have gotten, you know, have gotten better. Over the last several years, right?
I was actually, you know, had- had a client who misunderstood, you know, what do you mean by an advanced attack, right? And they're like, "This one's super advanced."
And it was advanced because, you know, they- they had basically, like, added their- their domain, right? The- the- the victim's domain, to, like, the credential harvesting page.
I'm like, "Brother, that's not advanced." Right? Like [crosstalk 00:43:16] I can show you advanced, right? But, like, this is... That- that's- that's, like... That's, like, the you must be this tall to ride."
[00:43:22] And they're like, "Well, we train all of our users, to look for grammatical errors, and you know, like the...
They got all the spelling right. They even had our domain name on the website, and the whole..." I'm like, "Yeah, again, like, that- that's the 'you must be this tall to ride,' and- and if that's ca- If that level is causing problems, let me show you some stuff that's really going to blow your mind," kind of thing.
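The trick Jake's client fell for, embedding the victim's domain in the harvesting URL so it "looks right", is detectable: a hostname can contain your domain without actually being under it. A minimal sketch, with hypothetical domains:

```python
# A phishing hostname can embed the victim's domain without being under it,
# e.g. victimcorp.com.portal-auth.net. All domains below are hypothetical.

from urllib.parse import urlparse

COMPANY_DOMAIN = "victimcorp.com"

def is_suspicious(url: str) -> bool:
    host = urlparse(url).hostname or ""
    if host == COMPANY_DOMAIN or host.endswith("." + COMPANY_DOMAIN):
        return False                      # genuinely our domain or a subdomain
    return COMPANY_DOMAIN in host         # our name embedded in someone else's host

print(is_suspicious("https://login.victimcorp.com/sso"))            # False
print(is_suspicious("https://victimcorp.com.portal-auth.net/in"))   # True
```

This is exactly why "check the spelling and look for our domain name" is, as Jake puts it, only the "you must be this tall to ride" bar: the real check is whether the registered domain matches, not whether the name appears somewhere in the URL.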
[00:43:41] So- so on the... On the low end of things, I mean, we saw... We- we saw a huge attacker uptick, right? But on the advanced, like the- the APT style, style threats, you know, we- we saw a little bit of a drop off in that, for- for a while.
It's definitely picked back up. But, you know, we had whole countries that were in lockdown, right? And, you know, if you picture again, I- I talk about, from the, you know, APT style side being really an intelligence kind of...
Kind of operation, you know, this isn't like something where you send somebody home, an intelligence operative, and be like, "Go hack from home," right, kind of thing.
[00:44:14] You know, if they... If they're home, they're- they're probably not doing as much of this, right? I don't, for a minute, believe any country completely shut down around that, but I do believe that the- that there were several that had, reduced operations.
And, certainly some of your- your contractors that- that do that kind of work, I suspect weren't in the offices, and anyway, so- so we did see a drop off.
[00:44:32] It's really picked back up. We- we saw... It was kind of interesting, right? So at the same time that we saw kind of a lull, a little bit of a lull in some of our, some of our APT operations, we saw the scams just go through the roof, right?
[00:44:45] Rik: Sure.
[00:44:45] Jake: So it was... There was never a spot where it was down. What I am seeing a- a difference in, you know, and this is across my non-monitored clients, right?
So the fact that we run a SOC gives us a good kind of feel for- for what's out there, at least across those, you know, those industries that we monitor. But, you know, we're not seeing as many reported intrusions.
And I think that, that's a really interesting piece there, and- and I've got some theories around that, but, I mean, at the end of the day, it's- it's- it's tough to say we're- we're starting to see that start to pick back up, but- but for that initial work from home piece, we saw kind of a dip in- in reported... external reported [crosstalk 00:45:19]-
[00:45:19] Rik: And you never know, right, if that's... If that's good or bad, seeing a dip in reported intrusions, does that mean people, you know, they-
[00:45:26] Jake: Yeah.
[00:45:26] Rik: The intrusions aren't happening? Or does it mean that people are not as vigilant, as they were when they were working from the office?
Or does it mean that the attacks got better? We had a conversation a couple days ago, you and I, I mean-
[00:45:40] Jake: Yeah.
[00:45:40] Rik: Had a conversation, where you raised a really interesting point about what has been c- become known as dwell time.
[00:45:48] Jake: Yeah.
[00:45:48] Rik: And you had a really interesting, as a result of the COVID-19 situation, you had a really interesting prediction, around that. You want to... You want to share that?
[00:45:58] Jake: Yeah, definitely. And I have to credit Brandon McCrillis, who's the, another former, you know, intel guy, CEO at Rendition. He's actually the one who- who really pitched this first.
But, you know, we- we originally thought that as- as we moved to that work from home, we're like, look, you know, fo- folks are losing some of the perimeter defences, right, that they used to have, as they- they moved home.
[00:46:19] It's a changed environment. It's- it's new complexity. It's, you know, we- we thought that we'd see a big, a- a big spike, then. Followed by a somewhat larger spike as- as we transitioned back to the office, then we get all the monitoring back in place.
But dwell time. We- we know from various reports that it's 60, 90, 120 days, 180, whatever it is, the time between when an attacker compromises a network, or an asset, and the time that they're detected, right?
And we know that there is... There's a span of time. It's- it's not immediate.
[00:46:46] And, you know, we're really sitting in that, you know, that- that 90 days out of... Out of work, right is... Or out of the office is kind of where we're at, 60 to 90 days, you know, depending on- on where you're at.
Some folks are moving back into the office, et cetera, right? And so, you know, I- I think that we're going to see, a- a- a real spike coming, you know, as- as that kind of dwell time, you know, wears...
I don't want to say wears off, right, but as we hit that- that dwell time, right?
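Jake's prediction is just dwell-time arithmetic: compromises that happened during the shift to work from home surface one typical dwell period later. The dates below are rough illustrations; the 90-day figure is one of the reported ranges he cites:

```python
# Dwell-time arithmetic behind the prediction: compromise date plus a
# typical dwell period gives the expected detection window.
# The lockdown date is an approximation, not a figure from the episode.

from datetime import date, timedelta

wfh_shift = date(2020, 3, 16)   # rough start of widespread work-from-home
dwell_days = 90                 # reported dwell times run roughly 60-180 days

expected_detection = wfh_shift + timedelta(days=dwell_days)
print(f"Compromise around {wfh_shift} + {dwell_days}d dwell "
      f"-> detections around {expected_detection}")
```

With a 90-to-180-day dwell, April and May compromises land in late summer and autumn, which is the spike in reported incidents Jake predicts.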
[00:47:12] Because assuming that, uh... And I'll even throw another kind of theory up here, it's something we talk a lot about, you know, internally at Rendition, but the, you know, as- as we move to home, we already had a lot of distractions around- around home, and that includes our security staff, right?
So they lost... The security staff lost a lot of their monitoring, capabilities, right, as we expanded the perimeter, just, like overnight, and tried to adapt there.
[00:47:33] We had new insider threat risks as people have been laid off, right? Of- of a lot of, you know, just... It's just a reality in a lot of [crosstalk 00:47:41]-
[00:47:40] Rik: Disgruntled employees, yeah.
[00:47:42] Jake: That's it. That's it, right? Disgruntled. And even if you don't, you know, count the insider threat there, you know, terminating, slash, temporarily, you know, suspending these accounts, et cetera, I have no doubt that in some organisations, there were those that were missed, right?
And those then create additional opportunities for attackers, right?
[00:48:02] And so, I- I think that we're seeing, you know, I- I think that there's a lot in what we're not seeing.
And I think that, probably in- in the autumn, or- or- or rolling into August, September, I think we're going to see a real big peak of- of reported incidents that- that occurred in, you know, in April and- and May.
[00:48:19] Rik: So you think maybe in the next, for example, Verizon data breach report, the... Do you think... Would you put money on the dwell time having increased?
[00:48:28] Jake: I- I wouldn't necessarily put it on, put money there, for the DBIR, but only because of- of selection bias, right?
[00:48:36] Rik: Okay.
[00:48:36] Jake: you know, there- there's a selection bias there, in that, in the DBIR. And this isn't to knock the DBIR. It's a fantastic report. Read it. You absolutely should.
But- but then also understand where the data comes from, right? It comes from Verizon, and it comes from people who... And- and firms that feed, data to Verizon.
[00:48:52] then, so, you know, what you're looking at here are, you know, basically a cross section of, you know, of specific incidents.
And again, there's a selection bias there for- for- for that, right? I tend to think those are a little bit h- the higher end. They're the folks that would have been more likely to [crosstalk 00:49:07]-
[00:49:07] Rik: Yeah, I think [crosstalk 00:49:08]-
[00:49:08] Jake: More likely to have more coordinated security strategies, right?
But I think some of your smaller, medium enterprises, right, so let's sit in that- that 3,000 to 5,000 user type base, that's- that's- that's folks that generally aren't represented in the DBIR, I'd say.
And I think that's where you're going to see those... That big uptick, right? Where we really lost some of that monitoring.
[00:49:25] Rik: So let's switch gears a bit. When I tweeted earlier on, I think when there was an hour to go until we were due to start, encouraging people to come and listen, I said we're not only going to talk tech.
[00:49:36] Jake: Yeah.
[00:49:36] Rik: Because there are some other really interesting areas that- that we could talk about. They're related.
But they're not directly about, you know, tools, technologies, adversary emulation, incident response. You were the defence expert for Marcus Hutchins recently.
[00:49:53] Jake: Yep.
[00:49:53] Rik: And, for the benefit of the audience, I know I'm OK with talking to Jake about this. Don't worry. I- I'm not springing it on him.
[00:50:01] Jake: Yeah.
[00:50:01] Rik: you were the defence expert for- for Marcus Hutchins, a very high profile, case.
[00:50:08] Jake: Mm-hmm [affirmative].
[00:50:08] Rik: lots of media coverage, and everybody had an opinion.
[00:50:14] Jake: Yep.
[00:50:14] Rik: You were on the inside.
[00:50:18] Jake: Yep.
[00:50:18] Rik: You have a much more qualified opinion than most people out there. So what's your take on that whole situation? Like, a bird's eye view?
[00:50:28] Jake: Yep. So- so first off, let- let me say that I obviously can't talk about the stuff that's non-public in the case, right? So- so we'll just...
We'll kind of kick there. But that- that stuff does obviously inform, does obviously inform my opinions.
[00:50:41] You know, fir- first off, you know, he was... Marcus, as I think a lot of folks know, was- was charged, you know, under a number of different, a number of different statutes, for allegedly writing, and then ultimately selling, malware.
[00:50:55] and so I think there's a... There's a couple of, you know, a couple of really interesting, really interesting points there. First off, when- when they originally arrested Marcus, he thought it was related to WannaCry.
There's been a lot of, you know, he's the- the gentleman that, you know, thankfully, in his work, you know, created the WannaCry or registered the [crosstalk 00:51:12] WannaCry [crosstalk 00:51:13] domain.
[00:51:13] Rik: I should have prefaced with that. Yes, thank you.
[00:51:14] Jake: Yeah, no, it's- it's... But I think it's an important thing to note, because, you know, and- and this is public, Marcy Wheeler, et cetera, reported on this.
You know, they, whe- when he was originally being questioned, you know, believed that that's what they were talking to him about, right?
And it wasn't until he had made some, le- let's say, statements against interest, I think, is the way that the lawyers present those, that- that he learned it wasn't about that.
You know, I- I think the first, you know, probably the first thing to kind of chat about there is, and- and you know, a lawyer friend of mine, said it the best. You know, it's basically if- if you are called to speak, or- or- or law enforcement approaches you to talk to them about anything, it- it's STFU, shut the heck up, right?
[00:52:00] Rik: Mm-hmm [affirmative] [laughs].
[00:52:01] Jake: And, and- and call a lawyer, right? You know, my- my family, is- is big in law enforcement. You know, so it's mostly a military and law enforcement family, and, you know,
I... My- my stepdad has routinely said, "You never talk..." He said, "You never... If the... If the police are there to talk to you, or the FBI, or whoever." It was the FBI in this case.
He said, "You never talk your way out of trouble." He said, "People regularly talk their way into trouble."
[00:52:24] So when somebody comes to you and says, "Hey, we just want to clear up a couple things. Let's chat about this." Shut up, right.
Say, "Am I being detained?" Right? Get a lawyer. They will then help you with that stuff, right? So- so I think that- that's- that's one thing that's- that's huge there.
[00:52:37] But- but I want to step back for, you know, another piece there, and just kind of on the government side. You know, they- they charged Marcus under both the Computer Fraud and Abuse Act and the Wiretap Act.
And- and this was a really interesting, case. It's one of the reasons I got involved with it. You know, the work was- was pro bono.
And, so- so was the legal work, for that matter. I mean, he had a legal defence fund, but- but I mean, you know, I... The- the work was- was largely, you know, largely pro bono. They covered [crosstalk 00:53:02]-
[00:53:02] Rik: It was ongoing for a long time, right?
[00:53:04] Jake: Yeah, it was. It was, yeah. The- the- the... It basically covered, like, counsel's you know, flights to, you know, to Michigan to- to- to basically...
I think it was Michigan. No, Wisconsin, excuse me, Wisconsin to, to hear the, hear the case.
[00:53:15] So, you know, there's... There- there was a lot around that, but- but again, when- when the government charged on the wiretap act, that- that was an attempt to, really to go set precedent, right?
The Wiretap Act talks about creating hardware and software, basically for wiretapping activities. And what they were alluding to was the fact that, you know, by creating malware that was able to inject itself into the browser and sniff communications, that, itself, was a violation of the Wiretap Act.
[00:53:45] And- and what we have then is- is this expansion of, you know, government, you know, government tools, right, to be able to charge folks with, and- and to say that a single crime is a...
Or a single act, right, building and selling this malware is a violation of multiple different statutes, right? And so, you know, again, that- that's- that's problematic to me. That's troublesome to me. There's a lot of other troublesome stuff about the case, but anyway, yeah. Yeah.
[00:54:09] Rik: And if there were... If there were one... This maybe isn't a fair question, and if it's not a fair question just tell me it's not and we'll move on, but if there were one lesson other than shut the hell up [laughs], if there were one qu- one lesson that you would say we could draw from this whole case, how it was handled, how, uh...
Wha- what the ultimate verdict was or what the activities were that led to the charges being brought? From any aspect of it, if there were one really important lesson that we could draw from that, what do you think that would be?
[00:54:42] Jake: I- I think, you know, really, if- if there's... And- and I'm going to go societal here, because I- I don't want to, like, get on the, you know, uh...
But- but I think the- the societal lesson, is that, you know, underrepresented folks, don't- don't stand a chance. You know, the, especially with digital evidence, right?
[00:55:00] I guess they don't stand a chance, but that you are, you are severely hampered, with- with the lack of- of- of good help there, right?
[00:55:07] You know, the- the... And- and I credit Marcia Hofmann and- and Brian Klein, counsel, did outstanding work on- on this case. And- and the reason, largely, that, MalwareTech, Marcus Hutchins, is- is not in, you know... Basically got the sentence that he did, which was effectively time served, in the US, you know.
Which again, I think is a fully fair verdict, given all the, you know, circumstances of the case.
[00:55:32] the, you know, when I look at that, they- they had to convince a 70-year-old judge. They had to explain what was going on to this 70-year-old judge.
And by the way, part of the sentencing decision that allowed... Because there were sentencing brackets, once he pled guilty, right? There are sentencing brackets that- that largely had to do with damages, right?
And so the way that he was able to get time served was to move the damages the government claimed down to an actual real number, right?
[00:55:59] Rik: Okay.
[00:55:59] Jake: And- and so that had to happen then, because- because again, as... Even as we go into sentencing, there's high damages being claimed. Marcia and, uh, Brian did outsta- I mean, just- just phenomenal work, right, convincing the judge of that, right?
And, you know, again, I played a very small role there working through some of the claims and saying, "Okay, look. That- that's rubbish. Here's why it's rubbish, right." Here's...
And then, very importantly, here's how to analogise this, here's how to take this techie thing, right, that we're trying to explain to a 70-year-old judge. Marcia, again, you know, used to work with EFF, counsel at EFF.
She knows tech. She's one of the smartest tech lawyers I've ever met. Brian, again, super smart tech guy. He does a lot of, uh- you know, very, very technically focused work.
The judge, not so much, right? Again, we're talking about [crosstalk 00:56:43] 70-year-old judge.
[00:56:48] Rik: And you were working together to make that comprehensible to someone who has nothing to do with this area of life, technology, the world.
[00:56:52] Jake: Right. Right. Because while we'd like to think that our, you know, our- our jurists are- are not, you know, not biased, well, let's be real about this, right?
If you've got, you know, a random lawyer standing up on one side, saying, "This is wrong." And you've got the FBI saying, "No, this is right." And- and- and it's all gobbledygook to them, right? And they can't make a judgement, they are probably going to side with the FBI.
[00:57:15] Rik: Of course.
[00:57:15] Jake: And- and- and I understand that, right?
[00:57:17] Rik: Yeah.
[00:57:17] Jake: Heck. I'm- I'm not a judge, I'd probably do the same thing, right? So we have to break that tech down into something, and analogise that tech into something they can understand.
And- and I think that- that from a societal standpoint, I don't have a fix for this, by the way, but, you know, if- if- if he hadn't had good representation, right, you know, from- from the lawyer side, obviously lawyers that understand tech, and then experts that- that can help, you know, translate that and- and fight some of these, fight some of these claims.
[00:57:41] I think that that turns out very differently. And- and I- I- I really hate to think about the folks that are sitting in, you know, sitting in prison today, you know, for- for various crimes where digital evidence is- is playing a role, right?
We all carry, you know, carry one of these, right? And- and so, you know, digital evidence is playing a role in a lot of cases that are not- not... That we don't think of as [crosstalk 00:58:03]-
[00:58:03] Rik: Not tech related, yeah.
[00:58:03] Jake: Yeah, cyber crime, or- or whatever related, yeah. And- and- and again, you know, we see cases regularly where- where folks are just railroaded and- and it's- it's, you know, with digital evidence.
And we're like, "Man, that analysis is rubbish." Right? "There's all kinds of holes to poke in it." But they just don't have the representation, so-
[00:58:17] Rik: So I'm going to pivot back because I- I need to get to the people who are actually asking questions online, because that's why I'm here.
[00:58:23] Jake: Yes, sir. Yeah. Of course.
[00:58:25] Rik: And we're still talking about, crimes and tools and technology. The question basically, alludes to the fact that it- it's always, a race between attacker and defender.
Defending technologies are being improved all the time. We were speaking about honeypots. The question relates to that, but I think it relates to all defensive technologies. Are they being outpaced by attackers? Or the other way around?
Or is it kind of neck and neck? How- how... What's your feeling from a- a- an IR and, incident investigation perspective?
[00:58:58] Jake: Attackers all... Will always outpace us, right? They'll always outpace us, and it's not because they're smarter, it's because of business. Right?
You know, I- I am not going to spend, you know, I'm not going to spend a lot on controls to, you know, stop attacks that we don't have to, right?
We've already seen academic implementations of operating systems that, I don't want to call them exploit proof, I don't think anything is, right? But, you know, things that- that ultimately...
[00:59:24] And heck, there's, shadow stacks, a great- great example where Microsoft is- is looking into this, now. The original shadow stacks paper I think was written in 2000 or 2001, right?
So- so we're looking at this- this almost 20-year gap, right, and probably by the time it's implemented it will be at least 20-plus years. But these largely kill stack-based buffer overflows, like I- I mean, without additional information disclosure and heap overwrites and the whole...
Like stack-based buffer overflows are dead. Why haven't we been using these? And the answer is, man, they slow computing down a lot, right?
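The shadow-stack mechanism Jake describes can be sketched as a toy simulation (a hypothetical illustration only, not any real CPU implementation): every call records the return address in both the ordinary, attacker-writable stack and a protected shadow copy, and a mismatch on return, such as one caused by a stack-based buffer overflow clobbering the saved address, is caught before the corrupted address is used.

```python
# Toy simulation of a shadow stack. On every call, the return address is
# pushed to both the regular stack and a protected shadow stack. On return,
# the two copies are compared; a mismatch means the regular-stack copy was
# overwritten (e.g. by a stack-based buffer overflow), so execution aborts.

class ShadowStackViolation(Exception):
    pass

class Machine:
    def __init__(self):
        self.stack = []    # attacker-writable: return addresses and locals
        self.shadow = []   # protected copy of return addresses only

    def call(self, return_addr):
        self.stack.append(return_addr)
        self.shadow.append(return_addr)

    def ret(self):
        addr = self.stack.pop()
        expected = self.shadow.pop()
        if addr != expected:
            raise ShadowStackViolation(
                f"return to {addr:#x}, expected {expected:#x}")
        return addr

m = Machine()
m.call(0x401000)                 # normal call/return pair passes the check
assert m.ret() == 0x401000

m.call(0x401000)
m.stack[-1] = 0xdeadbeef         # simulated overflow clobbers the saved address
try:
    m.ret()
except ShadowStackViolation:
    print("overflow detected")   # → overflow detected
```

The check itself is cheap; the cost Jake alludes to comes from maintaining and protecting the duplicate stack on real hardware, which is why adoption waited on hardware support rather than purely software schemes.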
[00:59:54] And so, we then have to decide, right, you know, make these decisions around the... Is the cure worse than the disease, right? And so while, if you go into the academic literature, the defence is outpacing the offence ridiculously.
If you turn around to business, you see offence outpacing defence, and I think it's always going to be that way. I think as those innovations, right... We're not going to work to operationalise a lot of these academic-type solutions, right?
[01:00:19] Until we... Even though they've been shown to be effective, in the state that they're in, in some cases even just theory, right, the- the- the cure is worse than the disease.
The disease has to be at least as bad as the cure, and then we'll start to see that- that shift. But- but I think it'll always be... I think we'll always be playing catch up in the business world.
[01:00:36] Rik: Yeah, my take on it was always that, as defenders, and I agree that the attackers will outpace the defenders. For me, an important part of it is that, as defenders, we- we have to play effectively with an open hand.
Our tools, our- our technologies, what we do business for, how we do business, all of the, [inaudible 01:00:58] organisations and everything else, and not to mention the security tools that we use is out there and that's the way the world works. If you're on the offensive side, that's not the case. You don't have to follow the rules.
You don't have to publicise what you do.
[01:01:12] You don't have to, QA your- your tools. You have to make sure they work for a certain... To a certain extent, but you don't have to worry about, you know, "Oh, my word, am I going to get a lot of tech support calls if I go public with, you know, it's still got this bug."
You know, so you can... You can iterate really fast. You can innovate. You can break the rules. You don't have to conform to RFCs. There's a whole lot of stuff you don't have to do.
So that kind of open hand, closed hand thing, I think plays- plays a big role as well.
[01:01:37] We're coming up really close to the hour, but there was one more thing, that I wanted to talk to you about, because I, I tweeted something a couple days ago.
I think it was the day before yesterday. And I was surprised by the strength of feeling of the responses that I got. Mostly on a side that didn't agree with me. And I was really interested in your take on it. In my original tweet I just asked, "Hey, can I crowdsource a list of racist terms that we use in- in IT or specifically information security?"
And the terms that I had in mind were things like blacklist/whitelist, and master/slave, when we're talking about configurations of- of servers, for example.
[01:02:25] I passionately believe that there's the trope that black equals bad and white equals good, no matter the etymology of the words concerned, that's actually irrelevant, but the fact that we characterise bad things as being black like the blacklist and the black sheep and lots of other examples, and good things as being white, reflects the racist society that we're all a part of.
And whatever we can do to change that is only to the good.
[01:02:57] I got a lot of response from people who, for some reason, honestly I don't understand, were extremely resistant to, you know, why the heck should we make those changes?
That's ridiculous. They're not racist terms. They're not meant in a racist way. What's your take on that? Am I alone? Am I stupid?
[01:03:17] Jake: No, I- I- I don't think so, right? You know, the... I- I really analogise this to the circle game, and I know I'm going to get a couple haters after this, right?
But you probably are familiar with the circle game, right, where it's, like, the, it- it's the sign that's been co-opted by, you know, a lot of these white power groups, right?
And then people come back and they're like, "Yeah, but, you know, that all started on 4chan." And I'm like, "Neat, right? It started on 4chan."
[01:03:40] I don't really care, at the end of the day, wa- the reality is, it's a sign that's been co-opted by hate groups, right? And- and obviously then, that is...
It is exclusionary, and it's hurting... The use of it is hurting some people. And so, even if I derive joy from or have nostalgic thoughts about playing the circle game, I- I don't, but- but whatever.
If, you know, even if I did, I- I don't think it's worth offending somebody over a choice of, you know, would I choose to play it?
[01:04:07] And then pivoting back into what you're talking about, what I choose to say, right? And- and- and I agree, I actually agree with you. I- I think wholeheartedly here, right?
You know, the, yes, there are a number of terms that are, you know, I think, either, overtly, you know, that are in our vernacular, right, that are either overtly racist or certainly have some racist undertones.
And I think where, you know, where possible, yeah, we should absolutely be shifting those. There- there's no reason to...
[01:04:33] If it's offending people, right, then, you know, if a particular term is offending people and we can communicate the same methodology, right, with, you know, by- by not doing that, I think that's a good... It- it's a good way to play. You know I [crosstalk 01:04:46]-
[01:04:47] Rik: Right?
[01:04:47] Jake: Yeah, why would you not change it, right? I mean, it's one of those, like whe- when I talk to somebody, like, I've transitioned from blacklist to block list, right?
And, you know, if somebody comes to me, and I've had people question me on this, and they're like, "Well, that's not offending anybody."
I’m like, "First off, I've talked to people who are like, 'That term put me off.'" Right? I'm like, "Done." Then if I can avoid that... I- I unfortunately don't have a good term for white list, right?
I don't have a... One that rolls off the tongue as easily for that, but- but I definitely am- am stopping using the word, using the word blacklist. I've transitioned to block list.
[01:05:15] But it's, you know, master/slave, it's primary and secondary. I understand why we used those terms originally. It- it wasn't to- to throw racism into tech.
It was, we were creating analogies, and I'm back to my whole analogy thing, but analogies are things that are already understood, right? But- but if those terms themselves are- are hurtful, right, and- and we now have a better societal consciousness around that, we- we should absolutely change those.
And you know, we can do that without a huge loss of fidelity. I- I don't understand. Really, I- I struggle to understand when people say, "No, I'm sticking with that term." And it's like, [crosstalk 01:05:51]-
[01:05:51] Rik: Hurts, what- what are you doing?
[01:05:53] Jake: Yeah, like, what's your endgame here, right? Like, where do you think you win on this? Yeah, it's just... It's mind-blowing to me, yep.
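As a hypothetical illustration of Jake's point that the renamed terms lose no fidelity, a blocklist/allowlist check reads exactly like its older blacklist/whitelist counterpart; the same methodology is communicated with the newer names (the addresses below are made-up documentation-range examples):

```python
# Hypothetical sketch: the same filtering methodology expressed with the
# newer terms -- "blocklist" for entries to reject, "allowlist" for entries
# to always accept -- with no loss of fidelity versus the older names.

BLOCKLIST = {"198.51.100.7", "203.0.113.9"}   # addresses to reject
ALLOWLIST = {"192.0.2.10"}                    # addresses always accepted

def is_permitted(addr: str) -> bool:
    # Allowlist entries win outright; everything else passes unless blocked.
    if addr in ALLOWLIST:
        return True
    return addr not in BLOCKLIST

print(is_permitted("192.0.2.10"))    # → True
print(is_permitted("203.0.113.9"))   # → False
```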
[01:05:59] Rik: Thank you, Jake, for the reality check. Thank you for- for, being willing to answer that question. Thank you for being willing to answer all of my questions, and the questions that- that we got, live as we went along.
You've been an incredible guest. I couldn't have picked anyone better to end my first entire season of attempting to do live broadcasts online. Thank you everybody for watching.
Jake, it's been an absolute pleasure. I hope, we can do this again.
[01:06:27] Jake: Yeah, definitely. Hey thanks, thanks for having me on, Rik, really appreciate it.
[01:06:31] Rik: Cheers. That's it. I feel kind of, I feel kind of sad. That's- that's the end of the first season of- of Let's Talk Security.
I've got to tell you, before episode one I was... I was really nervous. And- and although I spend a lot of my time standing in front of large numbers of people speaking live in the room, I find that a lot less terrifying, because I can see the reaction of the people watching, than when all I can see is effectively the- the lens of my camera right there.
[01:07:02] The feedback about what we've done with this first season has been really positive. I just want to take the opportunity to thank you all again, for- for watching the show, for sticking with us. I want to thank all of the guests that have joined me this first season.
Hopefully I- I picked a group with very diverse skills and- and world views. I'm going to try and continue and expand on that theme of diversity in the second season.
If you're interested in coming on and having a conversation with me in season two, whenever that is, please just hit me up on- on Twitter.
[01:07:39] Drop me a line, say, "Hey, I'd love to come and talk." and we'll sort something out.
Doing this show, like I said, has been terrifying. It's been rewarding, and it's been an absolute pleasure. And thank you all very much for watching. See you in season two.
And- and I'll just leave you with the final, traditional, "I'm Ron Burgundy?"
[01:08:35] One thing that I've actively avoided all season is wearing these glasses while I've been broadcasting.