Anna Delaney: Hello, this is Proof of Concept, a talk show where we invite leading experts to discuss the cybersecurity and privacy challenges of today and tomorrow, and how we can potentially solve them. We are your hosts. I'm Anna Delaney, director of productions at ISMG.

Tom Field: I'm Tom Field, senior vice president of editorial operations. Anna, always a pleasure.

Anna Delaney: Always a pleasure. How are you doing today, Tom?

Tom Field: Very well, thank you. It's always a pleasure to be here and a delight to be able to discuss these issues and to have the quality of guests that we have today.

Anna Delaney: Yeah, absolutely. And before introducing those guests, Tom, what's the most interesting thing in cybersecurity at the moment?

Tom Field: Well, I don't know about the most interesting thing. I would say that what's on everybody's mind today, and has been for three months now, is what's happening in Ukraine: the Russian invasion and the role of cybersecurity. And I'll tell you my concern. My concern is that we are a world that likes short stories. We like for crises to come up, for them to explode, for them to be resolved, and for them to go away. That's not happening with Ukraine. This is something that has been with us now for, think about this, almost a quarter of a year. I don't think it's going away in the next quarter of the year. And so the concerns that we had from the outset, about potential leakage of nation-state weapons, let's say, in the role of cyber, and repercussions for Western countries that get involved with this, those don't go away. People might not have the patience for them. People might want this to go away and be sick of it. But it's not going to. So I think that's very much on my mind.

Anna Delaney: And the big question remains, how will this affect the equilibrium of these major cyber powers such as the U.S., China and Russia? I hope our experts today maybe have some ideas and thoughts on this. And how does the conversation need to change? Because there is the risk of becoming immune to all that's happening. As you say, we like news stories.

Tom Field: Absolutely!

Anna Delaney: So, what should organizations actually be preparing for? Because there's a lot of unknowns out there. How can they prepare?
And as I said, how do we need to move the conversation on from just talking about the basics and cyber hygiene? Or do we? So, looking forward to that.

Tom Field: And here the concern, Anna, is that, you know, you've got other nation-state powers out there, North Korea, Iran, watching what's happening and taking notes. And you know, who's to say that one of them won't take advantage of this activity over there to launch another activity over here? You know, for years, we worried about terrorism and the risk of a dirty nuclear device. Should we be concerned about a dirty cyber device?

Anna Delaney: That's a worrying question, but a good one. But also news this week that unlikely collaborators came together, that is, Apple, Google and Microsoft. And it seems they are coming together for passwordless tech. They've joined forces to support the newest developments in the FIDO protocol. So is this the moment when passwordless goes to the masses, comes to us all?

Tom Field: Well, it certainly is a good publicity moment. We've got exactly the person here that can help answer the question of the real significance of this, and what this does mean for people that have been clamoring for passwordless for a generation now.

Anna Delaney: Well, I think it's time to introduce them.

Tom Field: You think?

Anna Delaney: Yeah.

Tom Field: Okay. Let's bring in our first guest. You know him as the managing director of technology business strategy with Venable LLP; we know him as Jeremy Grant. When it comes to identity, he is, as they say in the movies, the dude! Dude, welcome, good to have you here today.

Jeremy Grant: Thanks. Great to be here. Thanks, Tom. Thanks, Anna.

Tom Field: Jeremy, as Anna teed up here, it was big news this past week that Apple, Google and Microsoft came together, and they are supporting the newest developments in the FIDO protocol. It was huge news, coming as it did on World Password Day. What is the significance, the real significance, of this move?

Jeremy Grant: Well, I think the main takeaway is it's going to be easier than ever for service providers across the globe to make passwordless the default.
And, you know, a little bit of background: the FIDO standards have been in existence for years now. They've gone through a couple of iterations. FIDO2 is what's most commonly deployed. But a key challenge with FIDO has been that if you're deploying it, you're either embracing what I would call a roaming authenticator model, say a security key like a YubiKey, which I love personally, but maybe isn't going to be the sort of thing that most consumers are going to pick up. Or you could have what would be called an embedded authenticator or platform authenticator, built into the laptop I'm using, built into the iPhone I have in my pocket, built into, you know, an Android tablet, which I think is going to be much easier for consumers, in that you can have a true passwordless model with FIDO. Essentially, there's an on-device biometric match, or a PIN match if you don't want to use a biometric, which then unlocks an asymmetric private cryptographic key behind the scenes. But there's always been a challenge with that latter model, which is that while the user experience is great for consumers, you only have one private key for each device. And so if you have multiple devices, you quickly get into a key management challenge, which becomes, well, a really big challenge for consumers. Think about if the default is, you know, 18 months from today, you are logging in on an iPad with a passwordless approach using FIDO for a hundred different service providers that you do business with online, and then you get a new phone. All the private keys are stuck on that one phone. How do you get them to the other one? Or the phone is lost or stolen. Anytime you're switching devices, you're not going to spend a week regenerating a hundred private keys. And so from a usability perspective, this has inhibited the deployment of FIDO. So the announcement this week is that all three of the big platforms are going to enable what's called multi-device credentials, essentially syncing those private keys across multiple devices. So that if I have, you know, an iPhone, an iPad and a Mac, all of those keys will be resident in all of those devices. And just like when I get to a new device everything ports over, those private keys will port over as well. That's a big advancement that you're seeing all three platforms embrace to enable full interoperability.
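Editor's note: As a companion to Jeremy's description of the platform-authenticator flow, here is a minimal, illustrative browser-side sketch of registering a FIDO2/WebAuthn credential with a device's built-in authenticator. It is a sketch only: the relying-party name and ID ("Example RP", "example.com"), the server endpoint, and the locally generated challenge are placeholders, not anything referenced in the episode.

```typescript
// Registering a passkey with the platform authenticator (Touch ID, Windows
// Hello, Android biometrics). In a real deployment the challenge and user ID
// come from the relying party's server; they are generated locally here only
// to keep the sketch self-contained.
async function registerPasskey(username: string): Promise<void> {
  const challenge = crypto.getRandomValues(new Uint8Array(32));
  const userId = crypto.getRandomValues(new Uint8Array(16));

  const credential = (await navigator.credentials.create({
    publicKey: {
      challenge,
      rp: { name: "Example RP", id: "example.com" }, // hypothetical relying party
      user: { id: userId, name: username, displayName: username },
      pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
      authenticatorSelection: {
        authenticatorAttachment: "platform", // embedded authenticator, not a roaming security key
        residentKey: "required",             // discoverable credential, i.e. a passkey
        userVerification: "required",        // on-device biometric or PIN match
      },
    },
  })) as PublicKeyCredential;

  // Only the public key and attestation leave the device; the private key
  // stays in the authenticator. The endpoint name is illustrative.
  const attestation = (credential.response as AuthenticatorAttestationResponse).attestationObject;
  await fetch("/webauthn/register", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      id: credential.id,
      attestationObject: Array.from(new Uint8Array(attestation)),
    }),
  });
}
```

The `authenticatorAttachment: "platform"` and `residentKey: "required"` options correspond to the embedded-authenticator model described above; the multi-device credential announcement concerns the platforms syncing that private key across a user's devices, so relying-party code along these lines would not need to change.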
Tom Field: And again, for context here: in the timeline of FIDO, and you've been there from the start, how big of a watershed date is this?

Jeremy Grant: I think it is the inflection point that's going to allow us to finally make passwordless login the default. You know, people have talked for years about killing the password. When I was running the NSTIC program at NIST, we talked about part of our mission being to shoot the password dead. Part of it has been getting to this point where you not only have standards, but also buy-in. And I think that latter part is really important. I mean, already today, given the embrace you've seen by the three big platforms and other big tech companies, banks, chip makers and others, it's literally impossible to go buy a device today, running an operating system from Microsoft, Apple or Google, that doesn't support FIDO out of the box in the device, at the operating system level, at the browser. But now there's buy-in at this next step: to enable these multi-device credentials and actually make it easy and practical for consumers to, you know, log in with asymmetric public key cryptography. I mean, that's the real news here. They're all saying, we're not going to compete against each other on this point, though they might compete in the details of the implementation. We're all going to agree to collaborate here, because all three of those companies, and I think a lot of others involved in the FIDO Alliance, realize that for the health of the security ecosystem, killing passwords really has to be the priority. And so this is, you know, from my perspective, a big step forward.

Tom Field: We look to see what happens next.

Jeremy Grant: I think a lot of it is that it changes some models in terms of how people have traditionally thought about FIDO. I've certainly been getting a lot of questions from different clients and also government agencies. You know, I think there's an education period.
And then, also, what you have seen is an announcement that the companies are going to roll it out, as those capabilities start to actually show up in the next versions of operating systems and devices. You know, I basically think you'll see, probably starting later this year, a big shift from online service providers to start to embrace these new passwordless approaches, to the point where, by the end of 2023, this is what consumers should be looking for as the default when they sign up for an account somewhere.

Tom Field: Very good. Now, speaking of watershed dates, it was almost exactly one year ago that President Biden released his landmark cybersecurity executive order. MFA was a big part of that. As we approach that anniversary, are we more secure today than we were then? Has progress been made?

Jeremy Grant: Yes, progress has been made. Have we actually gotten to where we want to be? No, not yet, unfortunately. And I think that's where we've still got some more work to do. You know, the problem with executive orders or other edicts that come from the White House, and by the way, this isn't unique to the Biden administration, this is with any administration, is that the mandates come out, but they need to be followed by action, which often means, you know, providing dedicated budget to agencies to actually make the changes that they need and acquire the tools that they need in order to comply. And I think we've seen some good progress there, with more MFA adoption. I can certainly say, just being in DC and talking to folks in the agencies, that they're focused on this in a way that they weren't a year ago. But that doesn't mean that everything is now locked down with two-factor authentication. We certainly haven't gotten to phishing-resistant two-factor along the lines of what's called for in the White House zero trust strategy and the OMB memo that accompanied it. And so, you know, I think that the challenge is translating policy decrees into results. That's going to take a little bit more time. I think we've got a couple more years to go.

Tom Field: Very good. Jeremy, as always, a pleasure to speak with you. Thank you so much.

Jeremy Grant: Thank you.

Tom Field: Anna, back to you.
Anna Delaney: Thank you very much, gentlemen. That's great. Welcome, Lisa. I'd like to welcome back Lisa Sotto, partner and chair of the global privacy and cybersecurity practice at Hunton Andrews Kurth LLP. Good to see you, Lisa.

Lisa Sotto: I'm delighted to be here, Anna. Thank you for having me.

Anna Delaney: So Lisa, there seems to be quite a bit of movement in the current cybersecurity legal landscape. What are the changes and challenges that are top of mind?

Lisa Sotto: Oh my, on the cybersecurity front, the U.S. legal landscape is changing at the speed of light. It's really extraordinary to see what's happening here. We have recognized, of course, that cybersecurity is a key risk for many years now. But the velocity of change is truly incredible. And it's welcome. You know, I don't want to suggest that it's not welcome. But it is also overwhelming. And companies are now having to beef up their staff not only on the information security technologist front, but also on the legal and compliance front, with folks who are well schooled in cybersecurity on the compliance side of the house.

Anna Delaney: So let's get into some specifics. What are some of the recent changes to reporting requirements?

Lisa Sotto: Well, this is where it's quite extraordinary. And this is where we're really trying to keep up with what's going on. So I'll start with the Omnibus Appropriations Bill, recently passed, which has a meaty and really substantive chunk dealing with cybersecurity. The key provisions for the private sector are now the requirement to report covered events within 72 hours of the event. And this is for critical infrastructure, which is yet to be determined. There's plenty here that requires fleshing out. But these are certainly very substantive new rules. In addition, you need to let CISA know, if you're a covered entity, within 24 hours of paying a ransom, if you've been hit with ransomware. For banks, we now have a 36-hour reporting obligation. We have new rules in some key industries: for pipelines, there's a 12-hour reporting obligation; for surface transportation, a 24-hour reporting obligation. We have an SEC-proposed rule that will likely be pushed through in some form or another; we're not sure if it's going to look exactly like this. In fact, comments are due today, so we'll get a sense of what the comments are by tomorrow. So there's a four-business-day notification requirement for public companies. There are also requirements for board and management oversight. This is consistent with the trend for high-level oversight by management and the board. But this overlays a whole new set of rules. And then, of course, and I'll just reiterate this because it's worth thinking about in the medley of requirements: we have 54 breach notification laws at the state level; we have breach notification laws at the federal level. Those are our long-standing requirements. And they have deadlines that range from 72 hours to 60 days or so. So to say we have a cacophony of rules is really an understatement. And by the way, we've only talked about the U.S. Things get really swirly when we start to talk about international bills.
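Editor's note: To make the overlapping windows Lisa lists easier to compare, here is a simplified, illustrative summary expressed as a small TypeScript lookup table. The entries mirror her summary of the rules as discussed, not the statutory text, and scoping questions (such as which entities are "covered") were still being worked out at the time.

```typescript
// A rough side-by-side of the reporting windows mentioned in the discussion.
// This is an editorial aid, not legal guidance.
interface ReportingRule {
  regime: string;
  trigger: string;
  deadline: string;
}

const reportingRules: ReportingRule[] = [
  { regime: "Critical infrastructure (omnibus cyber incident provisions)", trigger: "covered cyber incident", deadline: "72 hours" },
  { regime: "Critical infrastructure (same provisions)", trigger: "ransom payment", deadline: "24 hours" },
  { regime: "Banking regulators", trigger: "notification incident", deadline: "36 hours" },
  { regime: "Pipelines", trigger: "cybersecurity incident", deadline: "12 hours" },
  { regime: "Surface transportation", trigger: "cybersecurity incident", deadline: "24 hours" },
  { regime: "SEC proposed rule (public companies)", trigger: "material cybersecurity incident", deadline: "4 business days" },
  { regime: "State breach notification laws (54 jurisdictions)", trigger: "breach of personal information", deadline: "roughly 72 hours to 60 days, varies by state" },
];

// Example use: print each regime and how quickly notice is due.
for (const rule of reportingRules) {
  console.log(`${rule.regime}: report a ${rule.trigger} within ${rule.deadline}`);
}
```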
Anna Delaney: That's the next episode. But there have also been recent changes to U.S. privacy laws. So can you update us on some of the changes there?

Lisa Sotto: It has been a really busy year. So it's important to remember that until 2020, we had what was known as a sectoral regime, meaning that we regulated privacy by industry sector. And the best examples of this are the Gramm-Leach-Bliley Act in the financial sector and HIPAA in the healthcare sector. In 2018, everything changed. And I won't say we're getting more into line with the rest of the world. We're not there yet. We will be when we have a federal law, and I hope it's a "will" and not a "maybe," because of what's happening, and I'll go through that a little bit. So California started this trend in 2018. The law there became effective January 1, 2020. It was the first state out of the box to enact a comprehensive privacy law. Now, that really did lead to very dramatic changes on the privacy front in the U.S. But not to be outdone, other states followed suit.
So we then saw, in the last couple of years, Virginia, Colorado, Utah, and now Connecticut. I daresay we're going to see a number of other states coming to the fore as well; nobody's going to want to be left out of this party. So, you know, I would advocate strongly for a federal pre-emptive law. It's the only way that we're going to be able to manage this extremely complex web, because we can't just comply with the highest common denominator law; they're all different. So there isn't really such a thing as a highest common denominator.

Anna Delaney: So with all these changes, how do we keep pace? And how should organizations prepare?

Lisa Sotto: I think the only way to really think about this is to think in terms of principles. Think in terms of the basic underpinnings of all of the privacy laws, the data protection laws globally: notice, choice where appropriate, individual rights, security and, of course, enforcement, and service provider provisions. Those basic principles really need to undergird every privacy framework in companies now. And if you get that right, then you're going to be maybe 80% of the way there to complying with all of these laws. And, of course, we have to deal with variations on the theme. So there are vagaries in all of these laws that we're going to need to manage. But if we have a good framework in place, I think you can assume that you'll be pretty close to at least out of the starting gate. Let's put it that way. I don't know that you'll be close to the endpoint, but you'll be well on your way.

Anna Delaney: Always helpful and informative. Thank you very much, Lisa.

Lisa Sotto: Thank you, Anna.

Anna Delaney: So, coming all together now: I mentioned the ongoing and increasing geopolitical tensions at the start. What exactly should organizations be preparing for? And how can they do that? Jeremy, go for it.

Jeremy Grant: Sure. Well, I'd say for starters, look, whether it's preparing from a compliance perspective, you know, along some of the issues Lisa was talking about, or preparing from a risk management perspective, a lot of times it's one and the same. I mean, you need to start to have a plan; you need to be thinking about it.
And I think what we're certainly seeing, you know, on our side, with clients coming to us, is that between how complicated compliance is getting and the increased global tensions and heightened risks, a lot more companies are taking this seriously. And I think that's probably the best news of all. You know, if the time when you're starting to think about cybersecurity preparedness is after you've had an incident, that's going to be a little bit too late. Although, look, that still represents at least a decent number of the calls we get: hey, something bad happened, what do we do? But I think where I'm, you know, feeling more optimistic is we're seeing more and more companies actually treating cybersecurity risk as something that they need to be addressing proactively, so that when something happens, they have a plan. And they are, you know, architecting systems and processes in a way that they actually have resilience, so that they can overcome an attack. And so, look, there's a lot of good attention on the space right now. That still doesn't mean that every company is doing the things that they should be doing in terms of, you know, proper planning or putting basic controls in place. Things like multi-factor authentication, which, still, I would say, the lack of MFA provides the beachhead for almost every major incident we see. You know, starting there and with a couple of other controls, and then starting to build a broader plan around how you're going to actually prepare to respond if something does happen. Those are the sorts of things that, say, the more enlightened companies are doing these days.

Anna Delaney: I don't know how true this is, but I read recently that just 22% of organizations currently have MFA deployed. Jeremy, does that sound right to you?

Jeremy Grant: No, it sounds right. I mean, Alex Weinert, who leads Identity Security at Microsoft, gave a speech at the Identiverse Conference about a year ago. It was really a deep dive into what happened with SolarWinds.
And you know, what was stunning to me in the wake of that was, he pointed out that even for some of the companies they work with who were actually impacted by SolarWinds, it was the MFA that those organizations weren't using that provided that initial access point; a lot of them still had not turned it on. So, I mean, there is a certain amount of being in this business where you're just constantly beating your head against the wall. And, you know, getting back to our earlier discussion, this is one of the reasons I'm excited about the FIDO announcement on passkeys. We're finally getting past, why don't people deploy MFA? Well, I have to bolt it on to these systems, and it's going to slow down my users, and they're going to complain. If we can start to bake these controls in, so that the default is that we're not using passwords anymore and there's nothing to phish, that's, I think, how we start to actually drive some real improvements in cybersecurity. As opposed to, look, we've got, you know, sessions like this, where we're all talking about best practices and the right things, but that doesn't mean everybody's going to do it. And so we need to make it easier for people to use.
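Editor's note: The phishing resistance Jeremy alludes to ("there's nothing to phish") comes from the fact that a WebAuthn credential is scoped by the browser to the relying party's origin, and sign-in is a signature over a server challenge rather than a shared secret a fake site could capture. Here is a minimal, illustrative authentication sketch to pair with the registration example earlier; the relying-party ID and endpoint are hypothetical placeholders.

```typescript
// Signing in with a passkey. The browser only offers credentials registered
// for this origin, so a look-alike phishing domain gets nothing usable.
async function signInWithPasskey(): Promise<void> {
  // In practice the challenge is issued and verified by the server.
  const challenge = crypto.getRandomValues(new Uint8Array(32));

  const assertion = (await navigator.credentials.get({
    publicKey: {
      challenge,
      rpId: "example.com",          // hypothetical relying party
      userVerification: "required", // biometric or PIN match on the device
      // No allowCredentials list: with discoverable credentials (passkeys),
      // the authenticator can offer the accounts it already holds for this site.
    },
  })) as PublicKeyCredential;

  const response = assertion.response as AuthenticatorAssertionResponse;

  // The server verifies the signature with the public key stored at
  // registration; the private key never leaves the user's devices.
  await fetch("/webauthn/authenticate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      id: assertion.id,
      signature: Array.from(new Uint8Array(response.signature)),
      authenticatorData: Array.from(new Uint8Array(response.authenticatorData)),
      clientDataJSON: Array.from(new Uint8Array(response.clientDataJSON)),
    }),
  });
}
```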
Anna Delaney: Lisa, thoughts on, again, this sort of geopolitical crisis that we're facing, and how organizations should prepare, and what they should prepare for?

Lisa Sotto: So there are going to be times when the cybersecurity landscape is sort of hyperactive. But, you know, the basic hygiene that we're talking about should be in place: of course, MFA, complex passwords, until we get to a place where there are no passwords. But, you know, I would absolutely echo Jeremy's thoughts on the fact that it's good that things are a little bit hyperactive now, because it is getting boards and senior management in companies a little bit more exercised about preparedness. And we need, of course, right now to be hypervigilant. So everything is kind of on steroids right now. And that means making sure that you are doing your preparatory work: do tabletop exercises; make sure you have a state-of-the-art incident response plan that has protocols for certain exploits like ransomware, which is kind of a different beast and probably needs to be worked through within the organization in advance of something bad happening. We want to make sure that we have a strong vendor management program in place. Insider threat: we need a strong insider threat program as well, because that certainly continues to exist and will continue to exist. Training and awareness: there's no substitute for that. I read recently that something like 90% of these incidents occur because of human error. So, you know, there it is: training and awareness is everything. And of course, underpinning all of that is risk shifting through insurance. So we have to think about cyber insurance as well. The bottom line here is, the basic hygiene needs to be followed. And then continue to do those cyber preparedness exercises, because there will be a day when you're going to need to exercise them for real.

Anna Delaney: Unfortunately. But thank you. Tom, we're going to go to RSA soon.

Tom Field: RSA Conference, first time since 2020. Here we go. The body guard of cybersecurity is back.

Anna Delaney: Indeed. Lisa, Jeremy, are you going? You're going to be there?

Lisa Sotto: I will not be at RSA, unfortunately. So, the conferences are back in full force. There are a lot of them.

Anna Delaney: You're pretty busy yourself, I think, chairing a few and being on panels.

Lisa Sotto: It's a little bit crazy. It's good to be a little crazed after two years of quiet.

Jeremy Grant: Yeah, I can echo what Lisa said. The travel has kicked off, starting last week, to the point where I don't know where I'm going to be between now and the middle of July most weeks. Part of it's kind of nice that it's coming back, but I'm also looking at this and thinking we could go back and do it with a little bit less, you know, somewhere in the middle. But I will be in and out at RSA. I'm speaking on Thursday, actually leading a session looking at voice technology and the security implications around it.
Including, you know, how it's been used in different cases, as well as issues like deepfakes that are starting to emerge with voice, that might, you know, undermine it before we can even recognize the Prime Minister. So it should be a fun session.

Anna Delaney: Fascinating stuff. Well, I look forward to meeting you there.

Jeremy Grant: Most definitely.

Anna Delaney: Okay, well, thank you very much. This has been informative, fascinating and brilliant. Really enjoyed it. Thank you very much, Lisa Sotto and Jeremy Grant. And it's goodbye from us.

Tom Field: Thanks, Anna.

Lisa Sotto: Thank you.