WEBVTT 1 00:00:00.000 --> 00:00:03.000 Anna Delaney: Hello, welcome to the weekly edition of ISMG's 2 00:00:03.030 --> 00:00:06.240 Editors' Panel. I'm Anna Delaney. And we have a surprise 3 00:00:06.240 --> 00:00:10.710 for you! And that is because we are joined by our special guest, 4 00:00:10.740 --> 00:00:15.510 David Pollino, former CISO of PNC Bank and a leader in the 5 00:00:15.510 --> 00:00:18.690 fields of information security, fraud prevention and risk 6 00:00:18.690 --> 00:00:22.200 management. And we also have, of course, Tom Field, senior vice 7 00:00:22.200 --> 00:00:25.620 president of editorial. Hello both, and welcome, David. 8 00:00:26.400 --> 00:00:27.870 David Pollino: Hello. Thanks for having me. 9 00:00:28.950 --> 00:00:31.860 Anna Delaney: David, you caught us at a special time because Tom 10 00:00:31.860 --> 00:00:35.160 and I are in the same city. Can you believe it? Chicago! 11 00:00:35.000 --> 00:00:36.200 Tom Field: In the same hotel! 12 00:00:36.240 --> 00:00:39.900 Anna Delaney: In the same hotel, different floors. And do you 13 00:00:39.900 --> 00:00:41.670 recognize the signage behind me? 14 00:00:42.960 --> 00:00:45.870 David Pollino: I do not. Maybe help me with that. 15 00:00:46.400 --> 00:00:50.234 Anna Delaney: So I've been introduced to fine dining here 16 00:00:50.323 --> 00:00:56.120 in Chicago. The joys of pizza casserole. It's a specialty, I hear. 17 00:00:56.000 --> 00:00:59.810 Tom Field: It's called deep dish pizza, not pizza casserole. 18 00:01:00.860 --> 00:01:04.400 Anna Delaney: Well, some New Yorkers were referring to it 19 00:01:04.460 --> 00:01:08.540 like that. But, Tom, how would you describe this deep dish, 20 00:01:08.540 --> 00:01:08.900 then? 21 00:01:09.440 --> 00:01:11.711 Tom Field: How would I describe it? I think it's a terrific 22 00:01:11.760 --> 00:01:14.757 meal. I enjoyed Donald's, Lou Malnati's; they're also the one and 23 00:01:14.805 --> 00:01:17.850 two, depending on your point of view, in Chicago. When I come to 24 00:01:17.899 --> 00:01:20.896 the city, that's where I always like to go. And it's the first 25 00:01:20.944 --> 00:01:23.990 time in over two years. It's really nice to be back in Chicago. 26 00:01:23.000 --> 00:01:27.350 Anna Delaney: Yeah. So where are you, Tom? Billy Goat? 27 00:01:27.360 --> 00:01:30.900 Tom Field: Billy Goat Tavern, icon of Chicago culture. It's 28 00:01:30.900 --> 00:01:34.350 the root of the curse of the Chicago Cubs, before that was 29 00:01:34.350 --> 00:01:38.100 lifted some years back. And it's where my journalistic heroes 30 00:01:38.310 --> 00:01:40.800 used to hang out when they were working at the Chicago Tribune 31 00:01:40.800 --> 00:01:43.260 and Chicago Sun-Times. So it's a place I always like to come to 32 00:01:43.260 --> 00:01:46.950 visit and pay my respects when I'm in town. And for fans of the 33 00:01:46.950 --> 00:01:49.440 old Saturday Night Live, it is the origin of the old 34 00:01:49.440 --> 00:01:51.510 "cheeseburger, cheeseburger, cheeseburger" sketch. 35 00:01:52.200 --> 00:01:55.980 Anna Delaney: Yeah, very good. And David, at the bar too? 36 00:01:56.980 --> 00:02:00.130 David Pollino: I'm at some generic bar that you can find in 37 00:02:00.130 --> 00:02:02.890 stock photos on the internet. So I just wanted to blend in with 38 00:02:02.890 --> 00:02:03.940 your background. 39 00:02:05.310 --> 00:02:09.570 Anna Delaney: We are having a good time.
So David, in response 40 00:02:09.570 --> 00:02:13.620 to US intelligence that Russia is exploring options for 41 00:02:13.620 --> 00:02:17.550 potential cyberattacks, last month, President Biden released 42 00:02:17.550 --> 00:02:20.760 a statement saying that it's a critical moment to accelerate 43 00:02:21.030 --> 00:02:24.300 our work to improve domestic cybersecurity and bolster 44 00:02:24.510 --> 00:02:27.720 national resilience. And he calls on the private sector to 45 00:02:27.720 --> 00:02:31.320 harden their cyber defenses immediately. And he uses a 46 00:02:31.320 --> 00:02:33.780 couple of interesting words here: he calls for 47 00:02:33.780 --> 00:02:37.740 vigilance and urgency. So, I want to know your reaction to 48 00:02:37.740 --> 00:02:39.750 the statement and initial thoughts. 49 00:02:40.620 --> 00:02:43.440 David Pollino: Yeah, well, these types of statements or 50 00:02:43.440 --> 00:02:46.410 opportunities for security professionals come along 51 00:02:46.410 --> 00:02:50.520 occasionally. So it's important to be prepared for them. Biden 52 00:02:51.720 --> 00:02:55.140 concludes by saying we need everyone to do their part, 53 00:02:55.140 --> 00:02:59.940 almost calling on it as a patriotic duty. So, whatever the 54 00:02:59.970 --> 00:03:05.220 motivation is for a sudden increase in security spending, 55 00:03:05.520 --> 00:03:09.930 it could be an attack on your sector, a peer company 56 00:03:09.930 --> 00:03:13.020 experiencing an issue, maybe it's a security incident at your 57 00:03:13.020 --> 00:03:17.670 company. There are lots of situations that could bring an 58 00:03:18.480 --> 00:03:23.910 unanticipated escalation in resources and budget. And so 59 00:03:23.910 --> 00:03:27.990 it's important to be prepared for it. Sometimes, it's as 60 00:03:27.990 --> 00:03:31.380 simple as leftover money at the end of the year that needs to be 61 00:03:31.380 --> 00:03:35.190 spent before it goes back in the pot. But, these are 62 00:03:35.190 --> 00:03:38.490 opportunities that security professionals need to cherish. 63 00:03:38.490 --> 00:03:41.610 And, as I think a well-known person from Chicago 64 00:03:41.610 --> 00:03:46.050 used to say, "Never let a good crisis go to waste." And, you 65 00:03:46.050 --> 00:03:48.690 know, this is one of those situations. 66 00:03:50.070 --> 00:03:51.630 Anna Delaney: Tom, a follow-up question? 67 00:03:51.000 --> 00:03:53.824 Tom Field: Indeed. David, we've been fortunate that, you 68 00:03:53.884 --> 00:03:57.129 know, over the course of the past two weeks since this 69 00:03:57.190 --> 00:04:00.435 initial statement was made, we've been out in the CISO 70 00:04:00.495 --> 00:04:04.281 community, been a part of our summits in Seattle and Chicago, 71 00:04:04.341 --> 00:04:07.526 and been able to get input from people, their reactions 72 00:04:07.586 --> 00:04:11.433 immediately after. So, a couple of questions for you. One, there 73 00:04:11.493 --> 00:04:14.137 has been a great push to strengthen critical 74 00:04:14.197 --> 00:04:17.803 infrastructure. And we talk about that an awful lot. We talk 75 00:04:17.863 --> 00:04:21.529 about the escalation of tensions by nation-state adversaries. 76 00:04:21.589 --> 00:04:24.955 What does this mean to the average enterprise that's not 77 00:04:25.015 --> 00:04:27.900 part of the traditional critical infrastructure?
78 00:04:29.070 --> 00:04:31.560 David Pollino: Well, the nice thing about critical 79 00:04:31.631 --> 00:04:35.688 infrastructure is that they've been very transparent with best 80 00:04:35.759 --> 00:04:40.100 practices. So, depending on what your comparable industry is, 81 00:04:40.171 --> 00:04:44.370 whether you have IoT devices or whether you're similar to a 82 00:04:44.441 --> 00:04:48.853 bank, or some other government contractor, you have frameworks 83 00:04:48.925 --> 00:04:52.981 such as PCI, NIST, FFIEC CAT, and all of these frameworks 84 00:04:53.052 --> 00:04:56.824 include a number of best practices. So, in situations 85 00:04:56.895 --> 00:05:00.952 like the one we're talking about, when an unanticipated 86 00:05:01.023 --> 00:05:05.364 escalation in budget, funds or scrutiny on security comes along, 87 00:05:05.435 --> 00:05:09.705 if you're mapping yourself to a comparable framework, then 88 00:05:09.776 --> 00:05:14.046 you'll know areas where you could be better. And you'll know 89 00:05:14.117 --> 00:05:18.173 areas where, from a maturity perspective, maybe you're not 90 00:05:18.245 --> 00:05:21.661 quite where you want to be. These frameworks are 91 00:05:21.732 --> 00:05:25.788 prescriptive enough that if there's an area, such as, you 92 00:05:25.859 --> 00:05:30.129 know, data protection or incident response, that you wanted to 93 00:05:30.200 --> 00:05:34.684 invest in, you would basically have a list of controls or areas 94 00:05:34.755 --> 00:05:38.740 where you can increase your maturity. And to use another 95 00:05:38.811 --> 00:05:42.583 term from, you know, a government program from years 96 00:05:42.654 --> 00:05:47.138 ago, you could have shovel-ready projects ready to go. So, when 97 00:05:47.209 --> 00:05:51.194 called upon by your management, your government or other 98 00:05:51.265 --> 00:05:55.322 budgetary factors that could impact you, you'll know 99 00:05:55.393 --> 00:05:59.734 where you want to divert that money, and what you're going to 100 00:05:59.805 --> 00:06:04.217 do with it. And also, you would know what the results would be 101 00:06:04.289 --> 00:06:08.558 after you implemented it, which could be a maturity score, a 102 00:06:08.630 --> 00:06:12.900 higher level of compliance, or some sort of a goal achieved. 103 00:06:12.000 --> 00:06:15.990 Tom Field: David, when the news first came out two weeks ago, I 104 00:06:15.990 --> 00:06:18.930 happened to be on stage the very next day with two CISOs and with 105 00:06:18.930 --> 00:06:22.620 a Secret Service agent. The reaction to the heightened 106 00:06:22.620 --> 00:06:26.670 threat was sort of, okay, great. Now people just know what we've 107 00:06:26.670 --> 00:06:31.170 known all along. So devil's advocate, has anything really 108 00:06:31.170 --> 00:06:35.310 changed? Or are we talking about the same risks that we had three 110 00:06:35.310 --> 00:06:35.910 weeks ago? 109 00:06:35.000 --> 00:06:38.096 David Pollino: Well, in this particular situation, you 111 00:06:38.165 --> 00:06:42.225 have various different opinions as to whether or not a, you 113 00:06:42.293 --> 00:06:46.422 know, foreign nation-state like Russia would attack critical 114 00:06:46.491 --> 00:06:50.551 infrastructure because of the blowback that they could get.
115 00:06:50.619 --> 00:06:54.886 That being said, we have seen selected attacks by other nation 116 00:06:54.955 --> 00:06:58.808 states in the past that have been attributed to Iran and 117 00:06:58.877 --> 00:07:03.212 North Korea, if you believe the media around that, as well 118 00:07:03.281 --> 00:07:07.616 as other hacktivists and people who decide they want to promote 119 00:07:07.685 --> 00:07:11.194 a cause; it could be a nationalistic cause, like for the 120 00:07:11.263 --> 00:07:15.460 country that they're from. And so these attacks could definitely 121 00:07:15.529 --> 00:07:19.795 happen. So, it's important to understand that, whether 122 00:07:19.864 --> 00:07:24.130 it's this particular incident or something unknown that could 123 00:07:24.199 --> 00:07:28.603 happen next month or next year, we should be prepared on 124 00:07:28.672 --> 00:07:32.594 the spectrum of cybersecurity controls. Know where you're 125 00:07:32.663 --> 00:07:36.516 weak, know where you're strong, and know where you could 126 00:07:36.585 --> 00:07:40.370 potentially make investments if the opportunity arises. 112 00:06:41.270 --> 00:07:41.900 Tom Field: Very good. Anna, back to you. 127 00:07:41.000 --> 00:07:45.980 Anna Delaney: So, David, with the rising tensions with Russia, 128 00:07:46.040 --> 00:07:49.760 and Log4j and the spate of ransomware attacks, how does 129 00:07:49.760 --> 00:07:52.310 this impact or change the way we need to think about the 130 00:07:52.310 --> 00:07:56.330 adversary? What does "know your adversary" mean in today's threat 131 00:07:56.330 --> 00:07:56.960 landscape? 132 00:07:58.250 --> 00:08:01.760 David Pollino: Well, it's important to know exactly what 133 00:08:01.760 --> 00:08:05.570 you're protecting against. In years past, maybe certain 134 00:08:05.570 --> 00:08:08.990 industries were focusing on insiders, whether it was theft 135 00:08:08.990 --> 00:08:14.930 of intellectual property, or stealing goods or services or 136 00:08:14.930 --> 00:08:21.050 money from the institution. If this particular adversary is, 137 00:08:21.110 --> 00:08:24.140 you know, what you're protecting against, then you may focus some 138 00:08:24.140 --> 00:08:27.680 of those resources on your perimeter defenses, on your 139 00:08:27.680 --> 00:08:32.900 ability to detect and respond, getting that response time down, you 140 00:08:32.900 --> 00:08:37.460 know, to a lower level. Some companies, especially the ones 141 00:08:37.460 --> 00:08:39.500 that I've talked to that are small and medium-sized 142 00:08:39.500 --> 00:08:42.920 enterprises, are looking at outsourcing some critical 143 00:08:42.920 --> 00:08:48.230 functions. Maybe it's some of their EDR or XDR type of 144 00:08:48.260 --> 00:08:52.370 monitoring and incident response; it could be vulnerability 145 00:08:52.370 --> 00:08:56.540 management or other types of SOC resources. That gives the 146 00:08:56.540 --> 00:08:59.510 company the ability to focus on what they feel is the real 147 00:08:59.510 --> 00:09:02.120 threat that's happening right now. 148 00:09:03.410 --> 00:09:05.990 Anna Delaney: And, David, I really want to pick up on a 149 00:09:05.990 --> 00:09:08.810 theme that came up at the summit yesterday, and that 150 00:09:08.810 --> 00:09:15.020 was about the 80% of what CISOs can focus on and can control. 151 00:09:15.050 --> 00:09:19.190 Passwords and identity management came up, and people, 152 00:09:19.520 --> 00:09:22.730 and patching and configurations. And I just wanted your thoughts on 153 00:09:22.730 --> 00:09:25.730 that. What should be in that 80%?
154 00:09:26.180 --> 00:09:30.860 David Pollino: The 80% of what CISOs should be focusing on 155 00:09:31.070 --> 00:09:34.940 is the key areas of the program that require innovation 156 00:09:35.210 --> 00:09:41.690 and focus. So a CISO needs to surround him or herself with a 157 00:09:41.900 --> 00:09:46.760 team that has taken care of some of those standard information 158 00:09:46.760 --> 00:09:50.690 security practices, whether it's identity and access management, 159 00:09:50.780 --> 00:09:54.500 data protection, or vulnerability management. Those should be some 160 00:09:54.500 --> 00:10:00.800 of the table stakes for showing up as a security leader. And 161 00:10:00.800 --> 00:10:04.340 whether it's outsourcing those to technology organizations, or 162 00:10:04.340 --> 00:10:07.670 monitoring for compliance against standards, those should 163 00:10:07.670 --> 00:10:12.710 not take up the majority of a CISO's time. The CISO should be 164 00:10:12.800 --> 00:10:16.790 understanding where the company needs to go from a cybersecurity 165 00:10:16.790 --> 00:10:20.900 perspective and what the threats are, being able to focus on 166 00:10:20.960 --> 00:10:23.720 security incidents when they happen to guide the 167 00:10:23.720 --> 00:10:27.860 organization in the right way, and building out a strategy. So 168 00:10:27.860 --> 00:10:31.010 they're thinking about where things are going one or two 169 00:10:31.010 --> 00:10:34.760 years down the road. That could include, for some companies, a 170 00:10:34.760 --> 00:10:38.090 road into the cloud; it could be bring your own device; it could 171 00:10:38.090 --> 00:10:42.290 be what zero trust means to your organization, and how you're 172 00:10:42.290 --> 00:10:47.570 going to get there. So, in this day and age, those kinds of core 173 00:10:47.660 --> 00:10:51.920 practices and responsibilities need to be as close as possible 174 00:10:51.920 --> 00:10:58.670 to automatic, and the metrics and monitoring infrastructure 175 00:10:58.670 --> 00:11:01.940 need to tell the CISO if they're not working the way they're 176 00:11:01.940 --> 00:11:04.910 designed, and then they can focus on those areas as well. 177 00:11:05.480 --> 00:11:07.040 Anna Delaney: Great, comprehensive answer. Thank you, 178 00:11:07.070 --> 00:11:08.360 David. Tom? 179 00:11:09.020 --> 00:11:11.150 Tom Field: David, when you and I first met years ago, it was 180 00:11:11.150 --> 00:11:13.790 because we were talking about education and awareness 181 00:11:13.790 --> 00:11:16.250 training. And that's something you particularly excelled at. 182 00:11:16.580 --> 00:11:21.140 When we're talking about the threat of nation-state 183 00:11:21.140 --> 00:11:25.430 adversaries, and we're making our employees, even our customers, 184 00:11:25.430 --> 00:11:29.630 more aware of what's going on, how do we find what I think is a 185 00:11:29.630 --> 00:11:34.160 delicate balance between the need to prepare people 186 00:11:34.220 --> 00:11:39.320 appropriately, and yet not turn them into a paranoid workforce? 187 00:11:41.070 --> 00:11:43.470 David Pollino: Yeah, that's an interesting question. I have 188 00:11:43.470 --> 00:11:47.730 found myself reporting a number of legitimate messages as 189 00:11:47.730 --> 00:11:51.330 phishing messages, and sometimes not following up on some of the 190 00:11:51.330 --> 00:11:54.210 things I need to get done because, you know, of my 191 00:11:54.210 --> 00:11:59.160 heightened state of paranoia.
It is important that individuals 192 00:11:59.160 --> 00:12:03.450 understand what the right thing to do is, and it needs to be a 193 00:12:03.450 --> 00:12:06.870 fit-for-purpose approach. So, when you think about your 194 00:12:06.870 --> 00:12:11.100 awareness program, it shouldn't just be the annual security 195 00:12:11.100 --> 00:12:15.480 awareness training that everybody seems to do. You know, 196 00:12:15.510 --> 00:12:21.450 quarterly phishing tests seem like a good opportunity to 197 00:12:21.510 --> 00:12:24.960 keep things top of mind. Those quarterly phishing tests may 198 00:12:24.960 --> 00:12:29.850 also include additional classes for those that fail. So, you 199 00:12:29.850 --> 00:12:35.670 know, not clicking on the links and reporting them to your SOC 200 00:12:35.790 --> 00:12:39.150 is definitely a good thing. But, depending on what your role is 201 00:12:39.150 --> 00:12:41.940 at the company, maybe you work in finance, there might be some 202 00:12:41.940 --> 00:12:45.960 specialized training that you want to give the finance members 203 00:12:45.960 --> 00:12:48.840 around, you know, business email compromise, which I guess we 204 00:12:48.840 --> 00:12:51.900 don't really talk about too much anymore, but it's still out 205 00:12:51.900 --> 00:12:56.700 there. Or other types of focused awareness training; maybe it's 206 00:12:56.700 --> 00:13:01.710 around system admins, about how to properly do privileged access 207 00:13:01.710 --> 00:13:04.800 management so those things are not abused. It could be 208 00:13:04.800 --> 00:13:08.460 developers having a secure development life cycle: how to 209 00:13:08.700 --> 00:13:11.970 check your code and make sure that as you're deploying your 210 00:13:12.780 --> 00:13:16.560 artifacts, they're done in a secure way, and that any of 211 00:13:16.560 --> 00:13:20.760 the vulnerabilities that are identified through the 212 00:13:20.760 --> 00:13:24.900 development process get remediated prior to production. 213 00:13:24.900 --> 00:13:29.820 So not only taking advantage of opportunities, but also making 214 00:13:29.820 --> 00:13:32.340 it fit for purpose is important in an awareness program. 215 00:13:32.630 --> 00:13:34.970 Tom Field: Well said. Anna, back to you to bring us home. 216 00:13:35.860 --> 00:13:39.490 Anna Delaney: Fantastic. David, on that theme of striking the 217 00:13:39.490 --> 00:13:43.300 balance, how do we avoid swinging from under-communicating to 218 00:13:43.360 --> 00:13:46.420 over-communicating, and get back to the middle? There is that risk 219 00:13:46.420 --> 00:13:49.930 of over-communicating, isn't there, and pushing your workforce 220 00:13:49.930 --> 00:13:53.650 away. Maybe you have some techniques that work for you. 221 00:13:55.520 --> 00:13:59.090 David Pollino: Well, over-communicating security issues 222 00:13:59.090 --> 00:14:03.020 tends not to be the problem. Typically, we want to 223 00:14:03.020 --> 00:14:06.980 communicate more, we want to raise awareness more. As I've 224 00:14:06.980 --> 00:14:10.250 been thinking about, you know, what we need to do, I'm starting 225 00:14:10.250 --> 00:14:15.920 to shift my focus to the younger generation, and making being a 226 00:14:15.920 --> 00:14:20.330 secure technologist part of what you would teach them. 227 00:14:20.780 --> 00:14:24.950 Whether you give them a phone or you give them a computer, you 228 00:14:24.950 --> 00:14:28.160 kind of take that through the whole educational process.
229 00:14:28.460 --> 00:14:35.240 We should view security as a necessary idea or a concept in 230 00:14:35.240 --> 00:14:39.050 our mind as we're utilizing any piece of technology. Think about 231 00:14:39.080 --> 00:14:43.250 how it's used, how it could be misused, how it could be, you 232 00:14:43.250 --> 00:14:48.770 know, secured. Even just the concept of somebody closing 233 00:14:48.770 --> 00:14:52.340 their webcam or putting a sticker on it, or whatever: 234 00:14:52.430 --> 00:14:56.810 somebody's thinking that there could potentially be malicious 235 00:14:56.840 --> 00:14:59.780 actors, you know, operating software on your computer that 236 00:14:59.780 --> 00:15:02.660 could take advantage of resources that you have for 237 00:15:02.660 --> 00:15:06.440 nefarious purposes. So I think it's really important to get 238 00:15:06.440 --> 00:15:11.480 that ingrained in everybody's psyche. And over-communication of 239 00:15:11.480 --> 00:15:15.200 security issues hasn't been a huge problem in the industry, 240 00:15:15.200 --> 00:15:15.950 that's for sure. 241 00:15:16.950 --> 00:15:20.137 Anna Delaney: Well said. David, again, something that came up 242 00:15:20.204 --> 00:15:24.409 yesterday: the Great Resignation and the challenges for CISOs, 243 00:15:24.477 --> 00:15:28.681 the limits on their budgets and resources, and, of course, the 244 00:15:28.749 --> 00:15:32.953 staff shortages. So, what are your thoughts on this? How do we 245 00:15:33.021 --> 00:15:34.920 tackle this massive problem? 246 00:15:34.000 --> 00:15:36.712 David Pollino: Yes, so it's interesting. I've been 247 00:15:36.778 --> 00:15:40.813 reading about this, I've been following it, since it kind of started a 248 00:15:40.879 --> 00:15:44.914 couple of years ago. And now you're seeing some, I forget the 249 00:15:44.980 --> 00:15:49.081 exact term for it, but people going back to the companies that 250 00:15:49.147 --> 00:15:53.116 they left. So you know, the grass is not always greener 251 00:15:53.182 --> 00:15:57.151 someplace else. So, I think it's important for each CISO, at 252 00:15:57.218 --> 00:16:01.054 whatever company they're leading, to create an 253 00:16:01.120 --> 00:16:05.221 environment where people want to work: one that promotes, 254 00:16:05.288 --> 00:16:09.190 you know, their own goals, where they feel appreciated, where 255 00:16:09.256 --> 00:16:13.556 they're invested in, and they're put in an environment where they 256 00:16:13.622 --> 00:16:17.260 feel like they can be successful. And whether that 257 00:16:17.327 --> 00:16:21.560 means them never leaving in the first place, or maybe after they 258 00:16:21.626 --> 00:16:25.397 leave, coming back to the company, the more you create an 259 00:16:25.463 --> 00:16:29.630 environment where people want to be, the better off you'll 260 00:16:29.696 --> 00:16:33.004 be long term. And for individuals who are thinking 261 00:16:33.070 --> 00:16:37.303 about leaving, it's important to sit down and kind of take stock 262 00:16:37.369 --> 00:16:41.471 of what's important to you. What are you looking for? Is it 263 00:16:41.537 --> 00:16:45.175 more money? More money typically is not a good long-term motivator. 264 00:16:45.241 --> 00:16:48.747 Normally, you want an environment that supports maybe 265 00:16:48.813 --> 00:16:52.650 your lifestyle, your goals, your career ambitions.
So whether 266 00:16:52.716 --> 00:16:56.949 you're trying to attract people by creating a 267 00:16:57.015 --> 00:17:01.117 good work environment, or you're thinking about what your next 268 00:17:01.183 --> 00:17:05.152 thing is, it's important that you understand: what type of culture, what type 269 00:17:05.218 --> 00:17:09.385 of environment am I creating? And are the people who would be drawn to 270 00:17:09.451 --> 00:17:13.090 that the people that I want working in my environment? 271 00:17:13.960 --> 00:17:16.900 Anna Delaney: Great advice. Tom, did the Great Resignation come 272 00:17:16.900 --> 00:17:19.000 up in the panels you recorded? 273 00:17:20.920 --> 00:17:23.410 Tom Field: No, not so much. But now that I'm hearing what David 274 00:17:23.410 --> 00:17:26.560 said, if indeed we're seeing people go back to the jobs they 275 00:17:26.560 --> 00:17:30.460 had before, maybe resignation takes on a new meaning. Instead 276 00:17:30.460 --> 00:17:33.220 of resigning from careers, they're resigned to their fate; 277 00:17:33.250 --> 00:17:34.720 maybe that's what we see next. 278 00:17:35.800 --> 00:17:38.140 David Pollino: Or maybe it's the great sabbatical, where they, 279 00:17:38.350 --> 00:17:39.490 you know ... 280 00:17:39.540 --> 00:17:42.330 Tom Field: Yeah, that doesn't sound great. But the "great catch 281 00:17:42.330 --> 00:17:44.430 your breath" is something I could sign up for. 282 00:17:44.060 --> 00:17:47.358 Anna Delaney: Yeah. It's been a time to think through things, 283 00:17:47.430 --> 00:17:51.805 perhaps, and reflect. David, just a final quick question 284 00:17:51.877 --> 00:17:56.538 for us all, really. On the topic of live events and travel, we're 285 00:17:56.610 --> 00:18:00.626 not just dreaming anymore. What's been the top city that 286 00:18:00.698 --> 00:18:05.287 you've been to for a conference? The top 287 00:18:05.359 --> 00:18:08.300 cybersecurity conference city, I suppose. 288 00:18:10.130 --> 00:18:14.510 David Pollino: Well, you know, I am much like Tom. I really like 289 00:18:14.510 --> 00:18:19.340 to embrace every city I go to and figure it out. I'll go to the 290 00:18:19.340 --> 00:18:23.120 museums. I'll stay an extra day. I'll, you know, see the sights 291 00:18:23.120 --> 00:18:27.740 and try to hit the local area. So, to be perfectly honest, I 292 00:18:27.740 --> 00:18:31.640 love them all. It's fun to be able to connect with people 293 00:18:31.670 --> 00:18:35.540 individually; it's fun to be able to understand the local 294 00:18:35.540 --> 00:18:39.110 culture. Some cities you can stay longer in than others, you 295 00:18:39.110 --> 00:18:42.290 know; there's more culture, there are more things to see. But 296 00:18:42.320 --> 00:18:46.490 I think every city has its unique aspects that can be 297 00:18:46.490 --> 00:18:49.520 appreciated. And you know, coming from somebody who's lived 298 00:18:50.030 --> 00:18:54.620 on the East Coast, the West Coast and in the Midwest, they all have great 299 00:18:54.620 --> 00:18:57.800 aspects to them. So I'm just looking forward to getting back 300 00:18:57.800 --> 00:19:01.220 in town, connecting with people individually, and sampling the 301 00:19:01.220 --> 00:19:04.190 local cuisine and maybe going to the local dive bar. 302 00:19:04.610 --> 00:19:10.250 Anna Delaney: We're with you on that one. And Tom, London? 303 00:19:10.730 --> 00:19:13.640 Tom Field: The criterion is walking.
I like a good walking 304 00:19:13.640 --> 00:19:17.060 city. When I think of those, San Francisco qualifies, New York, 305 00:19:17.060 --> 00:19:21.230 Chicago, Philadelphia, London, certainly. It's about getting 306 00:19:21.230 --> 00:19:24.320 out when you can, when you have that opportunity, and just being 307 00:19:24.320 --> 00:19:27.170 able to walk around and take in the culture, the history, the 308 00:19:27.170 --> 00:19:30.920 community. And I think that the adventure we're all on right now 309 00:19:31.100 --> 00:19:34.460 is getting back to these cities and finding out how many of our 310 00:19:34.460 --> 00:19:36.110 favorite restaurants still exist. 311 00:19:36.360 --> 00:19:40.320 Anna Delaney: Yeah, for sure. And well, I'll just single one 312 00:19:40.320 --> 00:19:44.730 out. I remember Nashville being just a lot of fun with all the 313 00:19:44.730 --> 00:19:49.020 music and, yeah, the history there. So that was a good one. 314 00:19:49.020 --> 00:19:52.410 But I agree. Just embrace it all. And that's what we're doing 315 00:19:52.410 --> 00:19:57.480 here. So happy to be back. Well, David, this has been an immense 316 00:19:57.480 --> 00:20:00.810 pleasure. Thank you so much for joining us. Loved speaking with 317 00:20:00.810 --> 00:20:03.600 you and hearing your expertise and perspective. So really 318 00:20:03.600 --> 00:20:04.050 appreciate it! 319 00:20:04.450 --> 00:20:05.380 David Pollino: Thanks for having me. 320 00:20:06.320 --> 00:20:08.390 Anna Delaney: And thank you so much for watching. Until next 321 00:20:08.390 --> 00:20:08.720 time.