Anna Delaney: Hello, and welcome to this special zero trust edition of the ISMG Editors' Panel. I'm Anna Delaney, and this week we're joined by industry veteran and the creator of zero trust, none other than John Kindervag. And the merry party also includes, of course, my teammates, Tom Field, senior vice president of editorial, and Mathew Schwartz, executive editor of DataBreachToday and Europe. Welcome back. John, how are you doing?

John Kindervag: I'm doing great. How are you guys doing?

Anna Delaney: Very good. John, as you might recall from previous editions, we like to commence proceedings by asking you about your background, virtual or otherwise. You seem to be in your glowing studio. Is that right?

John Kindervag: Well, I'm in my home office that I transformed into a studio at the beginning of COVID. I never needed this studio before, partly because I was always on the road, and suddenly I'm in the same chair. It has completely conformed to me now.

Anna Delaney: Have you acquired any new pieces of equipment since we last spoke?

John Kindervag: I have. I have a new webcam, the new Anker 2K webcam that has digital zoom in it and stuff. So I'm trying out a new one.

Anna Delaney: Tom, did you hear that?

Tom Field: I'm putting together a list.

John Kindervag: There you go.

Anna Delaney: Tom, are you flying again this week?

Tom Field: This was flying into New York, a nice view of Manhattan that you don't often get. It was a clear day. And it's appropriate, because I look out the window and see this a lot these days.

Anna Delaney: That's very cool. Mathew, you're outside as well. We're all outside.

Mathew Schwartz: That's right. I'm engaging with my cityscape here, Anna, in Dundee. It was just a trio of windows. I liked the look of it, kind of a nice little street scene. I usually go for something a little more dynamic, but you know, I wasn't in the air and I don't have this amazing home studio. I just have to make do with what's around me.

Anna Delaney: It looks like a painting. Well, I'm sharing a recent view of a trip to St Leonards-on-Sea, which is a seaside town in the borough of Hastings. You might have heard of it.
It was the site of the Battle of Hastings in 1066, the Norman Conquest. So I'm sharing a glimpse of history here. Well, John, we have a series of questions for you. Tom, why don't you start the party?

Tom Field: John, it is! Well, speaking of party, it's been three years now since zero trust really had its coming-out party at RSA 2020. As we start to prepare for, believe it or not, RSA 2023 in April, what would you say are the predominant zero trust storylines going into the event?

John Kindervag: I think the wider adoption, the fact that we're talking about it at the federal government level, and that we're seeing all kinds of movement in various governments around the world. I mean, it has just exploded. It started to really explode in 2016 after the OPM data breach and the OPM report from the Oversight and Government Reform Committee of the U.S. Congress. It took a while for people to kind of bubble through that, when Congress came out and asked OMB to push some guidance around zero trust. That was really the tipping point. It just took a while for people to see it. But what's unique about it is that it's just a global movement. So this week, I've talked in Norway. I'm repping my Stavanger Vikings shirt, because I don't have a background, but I do have the 1953 flag, now 70 years old, for the Stavanger Vikings, the team three of my great-uncles played on. It was the first Norwegian soccer championship after the Second World War ended. So I went to Norway a while back, and one of my great-aunts was still alive then. I hope she still is now. And she gave this to me. It's a very old, 70-year-old banner. So I'm repping that today, because I didn't have a virtual background. But yeah, I've talked to Australia, now I'm talking to England and Scotland. And, you know, everywhere, man. It's just crazy how this has become such a weird, global thing from when I first wrote the first paper and everybody thought it was completely insane.

Tom Field: It was just three years ago you appeared in our studio for the first time, and you were the guest of one Mathew Schwartz. So Matt, let me pass the baton to you. I know you've got a question or two.

Mathew Schwartz: I know, I try to keep things real here. You know, keep the discussion flowing, John.
And I think one of the big buzzwords we're going to be hearing at RSA, because we're already hearing it now, and RSA is not that far away, is ChatGPT. Shocker, I know. But when it comes to ChatGPT, there's a lot of discussion about what it can do, what it can't do, what it might do and what we might not want it to do. So I want to get your opinion: do you think ChatGPT is going to enhance, or perhaps overthrow, cybersecurity as we know it? Where do you see the rubber, maybe, first hitting the road? Are we in danger, perhaps, of falling in love yet again with the latest shiny new toy, when I think we can all argue that everyone keeps overlooking the basics, and we'd get a lot of bang back for that buck if we were focusing more on them? But we've got ChatGPT for the moment. What's your take?

John Kindervag: Well, right now, I think it's a novelty. And remember, this is the chatbot version of GPT-3, and then you're going to have GPT-4 come out. And then you've got everybody else coming out with an artificial intelligence/machine learning engine. And there are a lot of things here. You know, I've been doing work in this area for a long time. The company I work for, ON2IT, has machine learning capabilities in our managed service. We're constantly looking at the traffic. So yeah, it's going to be fine. No one knows where it's going to go. I mean, it could be that Skynet becomes self-aware, builds a bunch of Terminators, and, you know, we go into a big kinetic war, and we probably lose against the robots in real life. But that hasn't happened yet. Every year in August, we wait for Skynet to become self-aware. But I don't see it as an existential threat, because the thing that constrains all of this, as it relates to how we access resources on a network, is TCP/IP, and I don't see that changing, right? We've got a couple of big things that kind of relate to each other, which are quantum computing, and living in a post-quantum world, and then all of the artificial intelligence stuff that's happening. It probably will allow attackers to craft more sophisticated attacks, and it will toast everybody who's not paying attention.
I don't think it'll affect zero trust, because we're just focused on protecting the protect surface. We've gotten to the point now where we don't care what the threats are, because the companies that we manage, and that I help architect, have such well-defined policies that it doesn't matter what the attack is. I really don't care. And everybody else is still in the 20th century in cybersecurity, and that's why zero trust is so profoundly resonant with leaders in cybersecurity: it has a path forward, it has a strategic vision and a mission that they can go after and accomplish to try to stop some of this stuff. But if you're old school, and if you struggle, say, with something as simple as patching (which is actually not simple, but as basic as patching), well, then you're in trouble, right? If you have a pretty open policy set, and are just trying to deny bad traffic, it's going to be bad for you, right? And if you're relying on a single data point to make a decision, "oh, they authenticated, so we're going to allow this traffic in," you're in trouble, right? So you're going to have to think differently. And I think a lot of that thinking is, you know, what you need to do. We've published it. I'm working with ISMG, I'm working with the Cloud Security Alliance, I'm working with your sister company, CyberEd. So, you know, we're trying our best to get the right messages out. And there are some people who don't want that message to get out.

Mathew Schwartz: Very good, thank you. I know Anna has got a question as well, when it comes to threats, I think.

Anna Delaney: Very good. Yes, indeed. Thank you. Well, I want to move on to API security. And speaking of RSA, our good friend Richard Bird predicts that API security will be the big buzz theme this year. So let's see. But in this past year, John, we've obviously seen several high-profile breaches which resulted from API exploits, such as Twitter and T-Mobile. And it seems that threat actors are increasingly targeting API vulnerabilities. Certainly, from the conversations I've had with security professionals, they all seem to indicate that this is a really challenging area. What's your advice to them?
And how can organizations ensure effective API security?

John Kindervag: Well, APIs are amazing. We're in the API economy, and one of the things I love about APIs, just to start off with, is that they eliminate the need for standards. They solve the interoperability problems that we're trying to solve with standards. And they incentivize innovation. So those are the good things about them. The bad things are that, generally, the cybersecurity teams don't know they exist, and they don't know how to control them, even though they're just another TCP/IP connection, so they're controllable. There's nothing magical there. It's the programmatic interfaces, I think, that are very interesting. But they're controlled by the developers, DevOps, whatever you want to call it. And those people are not incentivized to do good security; they're incentivized to move fast. I always call them the Ricky Bobbys of IT, right? They just want to go fast, they've got a cougar sitting next to them in the car, they've got to go fast. And that's because they're incentivized that way. So there needs to be a change of incentive structures for DevOps, to integrate more and not just count how many code pushes they can do a day. And then there are some tools out there that you can use to kind of lock that down and all of that stuff. If you're looking at the API, and what's going across the API, you'll be able to see whether there's an attack across the API. You just have to be looking for it, right? I'm often reminded of the old vaudeville joke about the drunk who's looking for his keys. The cop says, "Hey, what are you doing?" "I'm looking for my keys." "Oh, okay, let me help. I don't see any keys anywhere around here." "Well, I lost them way over there." "Then why are you looking here?" "Because the light is so much better." It's called the streetlight effect, right? We're only looking at the places we have illuminated, and we don't illuminate enough places. We saw this with GoDaddy. How many years were they in GoDaddy? Three, is that what I read?

Mathew Schwartz: At least a couple.

John Kindervag: Yeah, look at that. They're in there forever, and you don't know they're there. It's like wondering, "Hey, who's the person getting beer out of the fridge? I guess they can get beer out of the fridge. They belong here. I'll go back to bed." We don't do that. But you just don't have enough visibility, enough illumination, enough street lamps. Street lamps are like cowbells, Tom. You always need more of them.
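As a rough illustration of what "looking at what's going across the API" can mean in practice, the sketch below logs every inbound API call and refuses anything outside a known allow-list. The endpoint names, the allow-list and the log format are illustrative assumptions only, not the tooling John or ON2IT actually uses.

    # Minimal sketch, assuming a hypothetical internal API: inspect every call
    # before it reaches a handler, so API traffic is actually looked at instead
    # of passed through unexamined. All endpoint names here are made up.
    import json, logging

    logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
    log = logging.getLogger("api-gateway")

    # Hypothetical protect surface: the only API calls this service should see.
    ALLOWED_ENDPOINTS = {("GET", "/v1/orders"), ("POST", "/v1/orders")}

    def inspect(method: str, path: str, caller: str, body: bytes) -> bool:
        """Log every call, then deny anything outside the known allow-list."""
        log.info(json.dumps({"caller": caller, "method": method,
                             "path": path, "bytes": len(body)}))
        if (method, path) not in ALLOWED_ENDPOINTS:
            log.warning("denied: unknown endpoint %s %s from %s",
                        method, path, caller)
            return False
        return True

    # An undocumented endpoint an attacker probes gets logged and refused,
    # rather than silently riding along as "just another TCP/IP connection."
    print(inspect("GET", "/v1/admin/export", caller="10.0.0.5", body=b""))

The point of the sketch is simply that the traffic is illuminated: every call leaves a record, and anything outside the defined surface is visible and denied by default.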
Tom Field: Indeed! More cowbell. John, you and I have had a chance to participate in a lot of videos, a lot of webinars, and not nearly enough cowbell. And we've talked a lot about enterprises and how they've evolved to embrace zero trust: understanding the protect surface, defining what it is they need to protect before they go out and try to protect it. Now, what about the vendor community? How has it grown past the initial reaction of "Zero trust? Yep, we sell that"?

John Kindervag: Yeah, I don't think they are as enlightened as they should be, because they're still trying to spin zero trust to be defined based upon the technology they sell. And I understand that; I came from a vendor, I work for a vendor, although one with a completely different perspective on it. You know, their bottom line is the bottom line, right? And what we don't see anymore is a lot of folks who really want to change the world and make it a safer place. They want to meet the numbers from Wall Street, they want to create the new thing, they want to become billionaires with yachts. And it's different than when I first got into it, because all of the early people in cyber were very well-intentioned, almost idealistic, I would say, and that's kind of gone by the wayside. Some of those people have sort of been corrupted. One friend of mine said about another friend of mine, because I was kind of like, "Wow, I would never have thought this guy would be doing some of these things," he said, "Yeah, he's discovered that he likes to be rich more than he likes to be right." So, what is your purpose for doing this business? To me, this is a public service, right? There are three adversarial businesses in the world: law enforcement, military and cybersecurity.
And so you have to be willing to fight the good fight, and sometimes you're not going to be the billionaire. But if you can do something to make the world a little bit better, a little bit safer, you've contributed more than maybe all the billionaires in the world combined.

Tom Field: Well said, John. And let me turn you over to Mathew Schwartz, who has taken the journalist's vow of poverty and is doing good work indeed. So Matt, your question?

Mathew Schwartz: Right, not rich. Yes: MFA. That's been one of the big attack vectors we've been seeing. I don't mean to play zero trust dartboard and just take every last attack that's working incredibly well and throw them in your direction. But I think the two-factor and multi-factor authentication bypass attacks have been so interesting, because they've taken down some really big names; these push notifications where people say, "Sure, no problem, I'll connect to that." What shortcomings have you seen with these MFA bypass attacks that have been hitting so many organizations? And what's your recommendation, John, in the context of zero trust? What specific improvements do these attacks demonstrate we still need to make?

John Kindervag: Well, again, you're making an access decision on a single data point. Did somebody accept the MFA challenge and give you a response? If so, then yes equals allow? No, that's not enough data. That is one data point in the stack. So we talk about the Kipling method, which we've talked about before: how do you create policy for zero trust? Who should have access, via what application, and ultimately, how should we allow that access to happen? How should we look at the whole packet? I've always said MFA is really 2FA. Think about this in this industry, by the way; here's a little bit of my soapbox: we had the number two, we changed it to the letter M, and then we created a whole new category out of the same products. There's nothing different between 2FA and MFA, right? And there are fundamental flaws in the way we do it. Identity is always fungible. So if you're saying that's the only criterion I'm going to make a decision on, well, of course, you're going to get toasted, right?
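The Kipling method John refers to builds policy from the full set of questions (who, what, when, where, why and how), with anything that doesn't match a rule denied by default. As a rough sketch of that idea only, here is a minimal example; the attribute names, the sample rule and the decide function are hypothetical, not drawn from any specific product or from John's own architectures.

    # Illustrative sketch: a Kipling-method-style policy check that evaluates
    # several attributes of a request against a default-deny rule set, rather
    # than allowing traffic just because an MFA challenge succeeded.
    from dataclasses import dataclass

    @dataclass
    class Request:
        user: str           # who is asserting an identity
        application: str    # what resource (protect surface) they want
        time_utc_hour: int  # when the access is happening
        source_zone: str    # where the traffic is coming from
        purpose: str        # why, the business reason tied to the rule
        protocol: str       # how the traffic is carried

    POLICY = [
        {   # one narrowly scoped allow rule; everything else is denied
            "user": "claims-adjuster",
            "application": "claims-db",
            "hours": range(8, 18),
            "source_zone": "corp-vdi",
            "purpose": "process-claims",
            "protocol": "https",
        },
    ]

    def decide(req: Request) -> str:
        """Allow only when every attribute matches a rule; default deny."""
        for rule in POLICY:
            if (req.user == rule["user"]
                    and req.application == rule["application"]
                    and req.time_utc_hour in rule["hours"]
                    and req.source_zone == rule["source_zone"]
                    and req.purpose == rule["purpose"]
                    and req.protocol == rule["protocol"]):
                return "allow"
        return "deny"  # authenticated-but-unmatched traffic is still denied

    # A stolen credential that passed MFA but arrives from the wrong zone is denied:
    print(decide(Request("claims-adjuster", "claims-db", 10,
                         "internet", "process-claims", "https")))

The contrast with "yes equals allow" is the point: authentication is one input among several, and a request that satisfies only that one input still falls through to deny.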
I mean, I always joke that I can prove this with two words: Snowden, Manning. The real Rihanna and Beyoncé of cybersecurity: they're so famous, they're one-word people, right? They had powerful multi-factor authentication called CAC, the common access card. I called it a CAC card once and was told, "Well, that's redundant." Yeah, I guess it is. I mean, it's literally on a little pull thing, and it plugs into their computer, so they are literally attached to their computer by a thin piece of string. So there was no question about the identity of these attackers. But no one looked at the packets post-authentication, and therefore the exploit technique was the broken trust model. This is something I've been saying for, you know, 13 years now: you can't just rely on one single thing. And everybody wants a magic button, everybody talks about the silver bullet. If someone talks about a silver bullet, I'll say, have you seen a vampire killing kit in real life, from one of the voodoo shops in New Orleans? Right? Well, is today Mardi Gras? No, yesterday was Mardi Gras, right? So if you go into one of the voodoo shops on the side roads in New Orleans, you'll find vampire killing kits. And in there, there's a flintlock pistol that has the silver bullet, because some vampires get killed by silver bullets, but there's also going to be some holy water, there's going to be a stake to put through the heart, there's going to be a cross, a mirror, all the different things, because different vampires get killed in different ways. So there is no such thing as a silver bullet. What you have to do is figure out how to put together a system, right? The vampire killing kit is a system for killing vampires. Not that vampires actually exist, I don't believe that, but the metaphor works really, really well. So quit thinking about it as a product, quit thinking that a single technology will solve the problem and the project will be done. This is a process. It is the entire structure of your organization. Whatever company you have in the world does not exist unless your IT and computer systems actually work.
And the thing that keeps them up and running is the cybersecurity stuff. That is the structural engineering of this. So when we look at some of the tragedies recently, with some of the earthquakes, we should remember structural engineering, right? Hammurabi said, a few thousand years ago, that if someone builds a building and it falls down on a person and kills that person, you execute the builder. Right? And we need to understand that that's what cybersecurity does. Cybersecurity can't be shortchanged. We can't take shortcuts; we can't just not quite do it as well as we could because we could do it cheaper. That's what happened in New Orleans with the dikes, right? People went to prison because the spec said to put the pilings down so many feet, and they said, "We don't need to do that, we can save a lot of money by not using as much concrete and put that money in our own pocket." We see that same problem in cybersecurity: people are going cheap. And they're reducing budgets now because the economy's bad. Well, the attackers aren't reducing their budgets, folks. You probably need to increase your budget. And when I see some of the statistics about how much of your budget goes to cybersecurity versus everything else in IT, I think it's criminal, in my opinion, that you spend so little on cybersecurity and spend so much on all your new funky toys in IT.

Mathew Schwartz: Well put. I love the vampire killing kit. I'm going to have to get one of those together. Thank you!

John Kindervag: Fly into New Orleans on your way to RSA, you know.

Mathew Schwartz: I'll see if I can swing it. I'll converse with my editor, see if I can write that one off as a business expense.

Tom Field: You can fact-check John on whether vampires exist anywhere.

John Kindervag: Yeah, well, you know, the history of vampires is very interesting.

Anna Delaney: For sure. Well, it's been a rich conversation.
Vampires, Norwegian football, yachts. I've loved this. But we have one final quick question, just for fun: if you were to create a cybersecurity-themed comic or cartoon, who would be your heroine, your hero, or even your selection of baddies? Tom's right there, right in the action.

Tom Field: John, and Matt might remember this, this is my hero: Zero. In this case, zero trust. John, I do this for you.

John Kindervag: Thank you.

Tom Field: How wonderful you are, Zero, my hero.

Mathew Schwartz: Well, not to steal John's thunder or work with a concept he just recently introduced, but I would go for Silver Bullet Man. It would be more of a nemesis, who kept suggesting things that didn't actually work, leading to inadvertent death and destruction. I think hijinks could ensue, but it would just depend on how you pitched it.

Tom Field: Who was that masked man?

Anna Delaney: Well, I was thinking more traditionally, a bit of cat-and-mouse action. So definitely there's room for Tom and Jerry, a Pink Privacy Panther, a bit of Snoopy, and maybe a Tweety. So there's my comic for you. John, have you got any thoughts about who your hero or heroine would be?

John Kindervag: Well, if you look at my Twitter profile, you'll see my Cynja avatar. The Cynja, which actually exists, and we should resurrect it, is a graphic novel series for kids done by Chase Cunningham and Heather Dahl, my good friends. And I'm the only other person outside of them who has a Cynja avatar. So I'd like to see more Cynjas out there, maybe cyber ninjas. Next time you're on with Chase, get him to talk about the Cynja sometime, because they were too early with that comic book series, but I think that should defeat the Marvel superheroes big time. Because once we get AI on a network, I mean, Superman, right? What are you going to do? You don't know how to program.

Anna Delaney: True creatives. Well, John, this has been great fun. Thank you so much for joining the ISMG Editors' Panel.

John Kindervag: Hey, thanks for having me. It's always fun.

Anna Delaney: And thank you so much for watching. Until next time.