Anna Delaney: Hello, welcome to Proof of Concept, the ISMG talk show where we discuss the cybersecurity and privacy challenges of today and tomorrow with industry experts, and how we could potentially solve them. We are your hosts. I'm Anna Delaney, director of productions at ISMG.

Tom Field: I'm Tom Field. I'm senior vice president of editorial, ISMG. Anna, always a pleasure to see you.

Anna Delaney: Always. So Tom, how is New York this morning?

Tom Field: New York was the easiest trip I have ever had coming into the city. It is the Juneteenth holiday, and it was a breeze coming from the airport to Manhattan. So, very happy to be here today.

Anna Delaney: And do tell the world, why are we both in New York today?

Tom Field: Because it was really cold in northern New England over the weekend? Oh, we're here for our first live event in New York City, our first live summit since the fall of 2019. We have our cybersecurity event going on tomorrow in Midtown. And we've got speakers and an agenda that are just phenomenal. I mean, think about this: John Kindervag; Lisa Sotto, whom we're going to meet later in this discussion; we've got Ari Redbord, Claire Le Gal from Mastercard. And we've got truly an all-star lineup of speakers and topics for tomorrow. What are you looking forward to?

Anna Delaney: Well, I'm looking forward to moderating a couple of sessions. But firstly, I think this is the biggest ISMG event of the year. Usually, it's your New York event. So I'm excited. It's my first with you.

Tom Field: Anna, any event we have these days is the biggest, after the last couple of years.

Anna Delaney: Yeah, for sure. So there's always a buzz. There's a lot of adrenaline; there's excitement to meet people again. Well, I've got a couple of panels that I'm moderating. First, talking with CISOs about whether the Russia-Ukraine war has accelerated or even stalled their cyber plans. So we're going to be talking about what threats they're observing, how they're responding to heightened threat activity, if at all, how they're maintaining cyber resilience in wartime, and what potential disruption they're preparing for as the war continues. So really looking forward to that topical panel discussion. And then, of course, there is another panel, but fraud-related: the challenge of P2P, peer-to-peer, payment fraud. In particular, we're looking at the Zelle payment app and the challenge around impersonation. One of our speakers, David, will have thoughts on this later. But I don't know if you know this, Tom: Zelle is America's most popular payment app. And it's free, easy to use, but of course ...

Tom Field: Are you making a pitch?

Anna Delaney: I should, for 10% commission from Zelle. But it's obviously proven popular with the criminals. So we're going to be looking at Zelle in particular, and other social engineering scams and trends, and the challenges for banks and consumers and regulators. And how do we prevent this in the first place? So there's a lot going on, a lot to discuss. But ultimately, as I mentioned before, I'm looking forward to seeing everybody in person, speakers and attendees.

Tom Field: Exactly. I've got a session tomorrow afternoon with John Kindervag, the godfather of zero trust. It's going to be a town hall. We are going to sit there, have a conversation, take questions from the audience, and try to dispel some of the myths and unrealities about zero trust. So I'm excited about that. I'm also hosting a more private session with Chris Pierson, the founder and CEO of BlackCloak. We're going to be talking about the growing need for executive protection outside the traditional perimeter, for the 12 hours of the day when senior leaders and board members aren't within an office. It's a fascinating topic, and it's getting a lot of traction. So I'm looking forward to continuing that conversation as well.

Anna Delaney: Yeah, rich topics and discussions. Well, Tom, you mentioned Lisa earlier. Why don't you go ahead and formally introduce her?

Tom Field: This is called foreshadowing, in the literary sense. Yes, we're going to have a conversation here with one of our frequent guests and contributors, Lisa Sotto. She is partner and chair of the global privacy and cybersecurity practice at Hunton Andrews Kurth LLP. I believe she has been called the princess of privacy. Is that right, Lisa?

Lisa Sotto: Priestess of privacy.

Tom Field: Priestess of privacy. She is essentially the Princess Leia of privacy in our galaxy. We're privileged to have her here today.

Lisa Sotto: Queen of breach privilege or priestess of privacy. There you go.

Tom Field: Lisa, so much going on here. Talk about the current landscape, which I feel changes every other week.
How are enterprises managing when it comes to juggling five different privacy laws? And it could be six, if today weren't a holiday.

Lisa Sotto: You're right, Tom. It's been an incredibly busy time. We've really gone from zero to 60 in just a few short years. And just to recount a little bit, let's remember that until 2020, we did not have a comprehensive privacy law in this country at all. What we had was a sectoral regime, meaning we regulated by industry sector. So we had healthcare privacy rules; we had rules around financial data, rules around data collected online from kids under the age of 13, rules around credit reporting data, drivers' data, video rental records, and the like. So we had a sectoral regime very much out of step with the rest of the world. And then in 2018 (effective January 1, 2020), everything changed with the California law. California really changed the landscape in the United States for privacy in implementing and putting into place a comprehensive, omnibus privacy law for Californians. And not to be outdone, several states followed suit. So we now have Virginia, Colorado, Utah, and the latest kid on the block, Connecticut, with comprehensive privacy laws for residents of those states. This really has dramatically changed the landscape in the United States. And it has also brought extraordinary compliance challenges. We're on a bit of a collision course, trying to manage all five of these laws.

Tom Field: Well, talk about that collision course, Lisa, because my understanding is this isn't as easy as picking the most stringent law, adhering to that, and you're going to be okay.

Lisa Sotto: You're exactly right. Unfortunately, these laws are not harmonized with each other; they're not consistent. We really think of these laws in two buckets. California stands alone, and its rules are really quite different from the other four. Now, the other four do fall into one bucket, in that they're reasonably similar, but they're not the same. So it's very important to remember that. I think the key to managing all of this is to try to harmonize the key concepts, and focus on them: things like transparency, providing privacy notices, consent for the use of sensitive data where it's appropriate, service provider management, training, data security, enforcement. Those are the key principles that underpin any privacy law around the world, and we need to be thinking about those key principles with respect to the US laws as well. If you put a framework in place to manage those key principles, you'll get a good 70 or 80% of the way there.

Tom Field: Now, at the same time, we have a new draft of a federal privacy bill. What does it mean? And can we handicap the chances of it getting anywhere?

Lisa Sotto: I would not predict. This is really hard. Yes, on June 3, the House and the Senate released a new comprehensive privacy bill called the American Data Privacy and Protection Act. The bill provides for the usual privacy rights that we see: the right to access your data, the rights to deletion, correction and portability; it also imposes data minimization obligations, and we're seeing that, of course, around the world. There are requirements for express consent for the use of sensitive data. There's a requirement for privacy policies. And I'll just note, and this is important and new, that there is now in this bill a requirement to put reasonable security measures in place. That's new, because it would be a comprehensive security requirement at the federal level. As to state preemption, and this is the question that we get all the time, because we just talked about the five, and you're right, about six were it not Juneteenth: there's a good chunk of the bill that addresses state preemption, and it is limited. It is not as extensive as some would like it to be. So, for example, general consumer protection laws are not preempted. Facial recognition laws are not preempted. Illinois' BIPA, the Biometric Information Privacy Act, is not preempted. And then the other key question is the private right of action. Is there one? Yes. In this bill, there is a limited private right of action with a host of exceptions, as well as a limitation on damages. So the question in my mind is: what are the roadblocks to passage? And it is the usual suspects. It's the private right of action. It's state preemption. And then I'll also note that the bill would create a new FTC bureau, called the Bureau of Privacy, and query whether partisans on either side will consider that appropriate regulation or overregulation.

Tom Field: Fair points. Claire will be discussing this. Meanwhile, as legislation happens, ransomware continues to
What are the ransomware trends you're paying attention 169 00:10:27.170 --> 00:10:28.310 to as we approach mid year? 170 00:10:28.000 --> 00:10:32.560 Lisa Sotto: Yeah, good question. Well, it was a slow start to the 171 00:10:32.560 --> 00:10:36.730 year for the ransomware actors. So that was a little bit of a 172 00:10:36.730 --> 00:10:41.290 surprise, but they are back in full force. There is no stopping 173 00:10:41.290 --> 00:10:44.980 that train. There are now reportedly more than 60 174 00:10:44.980 --> 00:10:48.970 ransomware collectives and they are wreaking havoc as they 175 00:10:48.970 --> 00:10:53.410 always have done. We're seeing some bigger demands, the demands 176 00:10:53.410 --> 00:10:56.980 used to be, you know, one to five million. Now we're seeing 177 00:10:56.980 --> 00:11:01.570 some that are truly moonshot demands — 10 million, and up and 178 00:11:01.570 --> 00:11:04.630 sometimes much, much higher than that. And they're not 179 00:11:04.630 --> 00:11:09.370 negotiating down quite as much as they used to be able to 180 00:11:09.460 --> 00:11:13.390 negotiate very significant discounts. Not so much anymore; 181 00:11:13.390 --> 00:11:16.510 there seems to be a little bit less willingness to do that. 182 00:11:17.230 --> 00:11:20.950 I'll also just note another really disturbing trend. The 183 00:11:20.980 --> 00:11:24.670 threat actors are now contacting third parties. So they're not 184 00:11:24.670 --> 00:11:30.070 only contacting the company that has been hit, but they're also 185 00:11:30.130 --> 00:11:34.390 looking through the data and finding customers or business 186 00:11:34.390 --> 00:11:38.620 partners or service providers whose data is in the mix. And 187 00:11:38.620 --> 00:11:41.290 they're contacting them. And of course, that increases the 188 00:11:41.290 --> 00:11:46.750 leverage that they have and forces the hand of the ransom 189 00:11:46.780 --> 00:11:53.140 party to go ahead and pay. 
And I'll also note on the other side 190 00:11:53.140 --> 00:12:00.400 of the scale, we now have a very active federal government with 191 00:12:00.400 --> 00:12:04.900 respect to ransomware. So we saw the passage of the strengthening 192 00:12:04.900 --> 00:12:09.550 American Cybersecurity Act, which will require for critical 193 00:12:09.550 --> 00:12:12.850 infrastructure, not quite in place yet, but will require a 194 00:12:12.850 --> 00:12:19.630 72-hour notice obligation when an attack has occurred. And then 195 00:12:20.110 --> 00:12:23.770 when you pay ransom, you needed to notify within 24 hours of 196 00:12:23.770 --> 00:12:28.090 doing so. We now have 24-hour reporting obligations for 197 00:12:28.090 --> 00:12:32.260 pipelines, for surface transportation. And we have a 198 00:12:32.260 --> 00:12:37.150 proposed SEC rule that would require notice within four 199 00:12:37.150 --> 00:12:40.480 business days to the world. It's a disclosure obligation. 200 00:12:41.440 --> 00:12:43.630 Tom Field: We continue to live in interesting times and Lisa, 201 00:12:43.630 --> 00:12:44.920 you're going to continue to be busy. 202 00:12:45.610 --> 00:12:47.650 Lisa Sotto: Indeed. Thank you very much, Tom. 203 00:12:47.920 --> 00:12:50.170 Tom Field: Thank you. And with that, we have much more to talk 204 00:12:50.170 --> 00:12:52.630 about in terms of fraud and scam. So, Anna, let me turn this 205 00:12:52.630 --> 00:12:54.940 back to you in a conversation with our next guest. 206 00:12:55.150 --> 00:12:57.580 Anna Delaney: Thank you very much. That was excellent. Okay, 207 00:12:57.580 --> 00:13:01.000 well, I'm very, very pleased to welcome back the brilliant David 208 00:13:01.000 --> 00:13:05.290 Pollino, former CISO of PNC Bank. Very good to see you, 209 00:13:05.290 --> 00:13:06.460 David, thanks for joining us. 210 00:13:07.410 --> 00:13:09.450 David Pollino: Thanks for the invitation. Glad to be here. 
Anna Delaney: So David, following on from Tom's question to Lisa, what are the ransomware trends you're observing? And is there anything different that you're picking up on?

David Pollino: I think it's a twofold answer. One is more of the same, and there are some things that we can highlight from the recent Verizon Data Breach Investigations Report. But looking at some of the technical aspects of some of the variants that are out there right now, the ransomware actors are getting smarter. So there's at least one variant out there that's been reported to utilize the web browser as the primary mechanism for infection, bypassing some of the traditional email controls that could be in place. Many of the email controls also involve rewriting the URL to do some sort of behavioral detection. If it's a watering hole attack or some other type of browser-based attack that doesn't involve email, then you bypass part of your control suite, as well as delivering payloads from known good sites, by embedding those into areas that have been compromised or that give you the ability to publish content to the site. So the takeaway from that is that we need to continue to make sure that our controls are evolving and staying up to date with current threats, because the criminals out there are making sure that they're innovating as time goes on. And then, more of the same: as Lisa mentioned, the ransomware attacks seem to continue to have a twofold impact, not just getting your data back, but the data being disclosed to the public and the attackers talking to your customers. And as she brought out, the paydays have really gone up. So that, combined with the continuance of remote workers, shows that we have a pretty large attack surface that we need to worry about, and that ransomware, much like many of the cybersecurity trends, will continue to evolve over time but will never go away.

Anna Delaney: Right! Thank you, I appreciate that insight. Another trend we're seeing is job scams, or job seeker scams. They've been on the rise since remote work became necessary and popular. Now, what are the trends you're seeing and the potential impact?

David Pollino: Well, like we just talked about with ransomware, job sites might be one of the ultimate watering hole attacks. If you're looking to find somebody with top secret access, you could potentially post a fraudulent job position advertising the need for top secret access, or maybe it's cash management or fraud prevention, or even a ransomware control expert. And then you could utilize that to either infect the particular individual, or, in some cases, you could set up fraudulent job interviews and gather information from the candidates. I have heard from some security researchers that not every job interview is with the intent of finding the ideal candidate for a position. Sometimes job postings could be related to gathering information, learning what other firms are doing. And it may ultimately end up being a waste of time. So when it comes to job scams, you need to be very careful. And many of the search firms are also very small companies. So typically, when you give advice to somebody about whether they should click on a link, you tell them to do some research. Sometimes you can do research on companies that have been in business for an extended period of time. Other times, it's a little bit more difficult. So being able to do adequate due diligence on small search firms is a challenge, especially if they're utilizing some of their own technology. You may see the job posting on LinkedIn or Dice or one of the other online job boards, but then frequently it links you out to some sort of smaller information-gathering, candidate-management site. And what's the tech behind that? Has that site been compromised? Is it a malicious site? Just trying to figure out whether or not you should continue applying for that position can be a bit of a challenge.

I've been approached many times over the years by search firms looking for candidates for senior security positions. I will typically research the firm. In many cases, I will not call or email the contact information that's provided within that unsolicited connection; I'll call the main switchboard. And it's amazing, for head-of-security positions, how many times I've called a headhunter who's been recruiting for security jobs for many years, and they tell me I'm the only one who's ever gone through that extra effort of due diligence. You would think security professionals would be a little bit more paranoid than the rest and try to make sure that they weren't setting themselves up to be ripped off. I even had one headhunter who was contacting me for a CISO position, and he sent me a link, one of those tiny URLs, to a site with more information about it. And I told him that nobody qualified for this position would actually click on that link. That ended our conversation pretty quickly. But really, it just goes to show that you need good OpSec and personal protection if you're in the market. And most people, whether or not they're actively looking for a job, are actively being solicited, and in some cases might actually take a look and see if the grass is greener out there. So you should have good OpSec yourself, especially as a security professional. One thing, just to follow up on Lisa's comments, as well as what you'll likely hear from Ari every time you talk to him: money mules are still a thing. They are not as common as they used to be, because the ransoms and some of the payoffs are now being performed through crypto, and crypto has its own mechanisms for laundering funds by exchanging it with different types of currencies. But you still need to be wary of money mule scams; they are still out there, and they will crop up from time to time. You know, the adage remains true, especially with job positions: if it sounds too good to be true, then it probably is.

Anna Delaney: Fair point.
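David's habit of vetting a link before clicking can be partly automated. As an illustrative sketch only (the shortener list, function names, and URLs below are hypothetical examples, not anything discussed on the show), a few lines of Python can flag links that hide their destination behind a shortening service or that don't use HTTPS:

```python
from urllib.parse import urlparse

# Illustrative, not exhaustive: hosts of well-known link-shortening services.
KNOWN_SHORTENERS = {"bit.ly", "tinyurl.com", "t.co", "goo.gl", "ow.ly"}

def is_shortened(url: str) -> bool:
    """Return True if the URL's host is a known link-shortening service."""
    host = urlparse(url).netloc.lower()
    # Ignore an optional "www." prefix before comparing against the list.
    return host.removeprefix("www.") in KNOWN_SHORTENERS

def needs_due_diligence(url: str) -> bool:
    """True when the link hides its destination or isn't served over HTTPS."""
    parsed = urlparse(url)
    return is_shortened(url) or parsed.scheme != "https"

if __name__ == "__main__":
    for link in ("https://tinyurl.com/abc123",
                 "https://careers.example.com/ciso"):
        print(link, "->", "vet first" if needs_due_diligence(link) else "ok")
```

A check like this is only a first-pass filter; the human due diligence David describes, such as calling the firm's main switchboard rather than using the contact details in an unsolicited message, still applies.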
Sharing information on social 311 00:19:37.290 --> 00:19:40.830 media, related to this of course, as employees we are 312 00:19:40.830 --> 00:19:44.730 always encouraged to share on platforms like LinkedIn. When 313 00:19:44.730 --> 00:19:48.690 might that sharing be a corporate risk do you think? 314 00:19:48.000 --> 00:19:51.840 David Pollino: Yeah, it's a very interesting conversation as 315 00:19:51.900 --> 00:19:56.550 organizations vary on what they want shared and what their 316 00:19:56.580 --> 00:20:00.900 policies are. Many organizations would not want sensitive 317 00:20:00.900 --> 00:20:05.700 information shared over, you know, job sites; LinkedIn, 318 00:20:05.700 --> 00:20:09.030 Twitter, those types of things. That could include the 319 00:20:09.030 --> 00:20:12.390 technologies that are being utilized, security controls, 320 00:20:12.510 --> 00:20:17.460 confidential project names, and even the org chart information. 321 00:20:17.670 --> 00:20:23.010 So each company has to sit down and figure out what is 322 00:20:23.010 --> 00:20:27.450 appropriate from a sharing perspective for them. And make 323 00:20:27.450 --> 00:20:30.810 sure that they're taking the appropriate educational steps to 324 00:20:30.810 --> 00:20:33.120 let users know what is appropriate and what is not 325 00:20:33.120 --> 00:20:37.080 appropriate. It's common and TTPs for pentesters, and 326 00:20:37.080 --> 00:20:40.980 intelligence professionals. You know, job sites are a great 327 00:20:40.980 --> 00:20:44.190 source of OSINT. You know, I've heard some stories about 328 00:20:44.220 --> 00:20:47.910 professionals, either penetration testers or people 329 00:20:48.120 --> 00:20:51.840 working for government agencies, that are trying to compromise a 330 00:20:51.840 --> 00:20:56.190 particular organization. 
They'll look not just on the standard 331 00:20:56.190 --> 00:21:00.180 job sites like LinkedIn, but they'll also look at forums and 332 00:21:00.180 --> 00:21:03.420 see who's posting with particular usernames with 333 00:21:03.420 --> 00:21:06.510 questions about particular, maybe it's, you know, operating 334 00:21:06.510 --> 00:21:11.160 system, database versions, security tools. And just by 335 00:21:11.160 --> 00:21:14.370 looking at that information, you can derive a lot of information 336 00:21:14.370 --> 00:21:18.420 about the internal structure and the control environment that 337 00:21:18.420 --> 00:21:22.110 they have. I know from personal experience, there was one 338 00:21:22.110 --> 00:21:26.190 criminal that called pretending to be a helpdesk individual. And 339 00:21:26.190 --> 00:21:31.230 he used the lingo from LinkedIn, at least that's the best we 340 00:21:31.230 --> 00:21:35.100 could figure out because he had an internal jargon that was used 341 00:21:35.100 --> 00:21:38.760 to say, "Hey, I'm calling you because of this site. And this 342 00:21:38.790 --> 00:21:43.890 access and this mechanism need you to reset your password." 343 00:21:43.890 --> 00:21:47.130 Standard social engineering thing. But you know, the 344 00:21:47.130 --> 00:21:51.180 information we think was derived by looking at LinkedIn 345 00:21:51.630 --> 00:21:54.990 information that was being posted. And the criminal in this 346 00:21:54.990 --> 00:21:58.500 particular instance, even had his own LinkedIn page purporting 347 00:21:58.500 --> 00:22:02.100 to be part of the organization in a geography that was 348 00:22:02.100 --> 00:22:04.980 consistent with the organization, and also put some 349 00:22:04.980 --> 00:22:09.960 of that lingo on the fraudulent LinkedIn page. 
So it really is 350 00:22:09.990 --> 00:22:13.260 an area of risk that companies at least need to sit 351 00:22:13.260 --> 00:22:16.230 down and say, you know, are we doing what we need to do when it 352 00:22:16.230 --> 00:22:19.770 comes to managing the information that's being shared 353 00:22:19.770 --> 00:22:20.190 here. 354 00:22:21.090 --> 00:22:23.040 Anna Delaney: I appreciate that, an excellent overview of the 355 00:22:23.040 --> 00:22:25.470 scams and what you're seeing. In that particular case you 356 00:22:25.470 --> 00:22:28.140 highlighted, how far down the line did it actually get? 357 00:22:31.170 --> 00:22:34.920 David Pollino: You can read about that one in the media. The 358 00:22:34.920 --> 00:22:39.180 criminal was ultimately arrested. He was able to get 359 00:22:39.180 --> 00:22:41.790 some success at, you know, not a company that I worked at, but 360 00:22:41.790 --> 00:22:46.500 other companies, but it was a pretty successful attack. That's 361 00:22:46.500 --> 00:22:46.980 for sure. 362 00:22:48.250 --> 00:22:50.830 Anna Delaney: Well, these criminals keep on trying and 363 00:22:50.830 --> 00:22:53.590 keeping us on our toes. But thank you, David. Well, I'd like 364 00:22:53.590 --> 00:22:58.480 to welcome you all back to the studio. Here we are. So, final 365 00:22:58.480 --> 00:23:05.860 question. Six months of the year remain. Let's look to 366 00:23:05.860 --> 00:23:08.890 the second half of the year. We have Russia still in Ukraine, 367 00:23:09.130 --> 00:23:12.790 the economy is tumbling. What are we going to be focused on by 368 00:23:12.820 --> 00:23:15.850 year's end? Lisa, any thoughts? 369 00:23:16.170 --> 00:23:20.490 Lisa Sotto: Sure. Well, as David said, more of the same. So we 370 00:23:20.490 --> 00:23:27.450 are hyper-vigilant now because of the war in Ukraine. And we've 371 00:23:27.450 --> 00:23:30.300 been hyper-vigilant for the last few years.
But of course, all of 372 00:23:30.300 --> 00:23:33.060 our efforts are stepped up. And I think that the threat level 373 00:23:33.090 --> 00:23:38.640 will continue to remain high. And we will continue in the 374 00:23:38.640 --> 00:23:43.920 foreseeable future to keep all hands on deck in the security 375 00:23:44.760 --> 00:23:49.230 area. On the privacy side, there is little question that 376 00:23:49.230 --> 00:23:53.370 additional states will join the party as well. We're going to 377 00:23:53.370 --> 00:23:57.300 certainly see more states passing laws. So we're crossing 378 00:23:57.300 --> 00:24:00.390 our fingers for a preemptive federal law. 379 00:24:02.640 --> 00:24:05.100 Anna Delaney: Good, busy indeed. David? 380 00:24:05.960 --> 00:24:07.970 David Pollino: Yeah, as we've talked about earlier, when it 381 00:24:07.970 --> 00:24:11.090 comes to ransomware, and some of those other threats that we 382 00:24:11.090 --> 00:24:14.840 have, if you look at the organizations that are being 383 00:24:14.840 --> 00:24:19.220 hit, they're the organizations that are still utilizing some of 384 00:24:19.220 --> 00:24:24.770 the very antiquated security approaches, not utilizing MFA, 385 00:24:24.770 --> 00:24:27.290 having flat networks, not utilizing zero trust 386 00:24:27.290 --> 00:24:31.790 technologies. The attacks seem to be naturally progressing more 387 00:24:31.790 --> 00:24:35.960 towards those weaker institutions. So institutions 388 00:24:35.960 --> 00:24:38.480 really need to make sure they're making the proper investment 389 00:24:38.900 --> 00:24:43.070 proactively because as Lisa said, with paydays going up, 390 00:24:43.070 --> 00:24:46.940 being a target — a successful target — of a ransomware attack 391 00:24:47.030 --> 00:24:51.200 can be very expensive. 
And to circle back to the question 392 00:24:51.200 --> 00:24:55.820 around social media and what types of things, you know, could 393 00:24:55.820 --> 00:24:59.900 be exploited there for organizations. It's really 394 00:24:59.900 --> 00:25:04.100 important for organizations to clearly articulate what they 395 00:25:04.100 --> 00:25:08.000 want good behavior to be, have that in policy, educate their 396 00:25:08.000 --> 00:25:11.540 users, monitor — not create a police state where they're 397 00:25:11.540 --> 00:25:14.210 checking on everything that their users are doing — but make 398 00:25:14.210 --> 00:25:16.760 sure that if there are things that are sensitive to the 399 00:25:16.760 --> 00:25:21.530 organization, confidential information, project names, IP, 400 00:25:21.560 --> 00:25:25.850 that those things are kept out of the OSINT as it were, so 401 00:25:26.180 --> 00:25:29.270 companies can continue to protect their secrets. 402 00:25:30.440 --> 00:25:31.550 Anna Delaney: Appreciate these insights. 403 00:25:31.000 --> 00:25:34.175 Tom Field: Anna, I would add to that, as I think about this, I 404 00:25:34.237 --> 00:25:37.537 agree with everything, of course, that Lisa and David 405 00:25:37.599 --> 00:25:41.210 said. What's happening in Ukraine has cast a cybersecurity 406 00:25:41.273 --> 00:25:45.257 shadow over the world. And we're continuing to see ramifications 407 00:25:45.319 --> 00:25:48.868 of that. But think about it. We also have got significant 408 00:25:48.931 --> 00:25:52.479 elections upcoming in the US this year. What impact might 409 00:25:52.542 --> 00:25:55.966 outside interference have on those? We have last year's 410 00:25:56.028 --> 00:25:59.889 Executive Order continuing to progress and we're continuing to 411 00:25:59.951 --> 00:26:03.562 have conversations about things such as a software bill of 412 00:26:03.624 --> 00:26:06.986 materials, and also would caution that here we are mid 413 00:26:07.049 --> 00:26:10.971 year 2022. 
And we have not seen our Kaseya of the year, we have 414 00:26:11.033 --> 00:26:14.707 not seen our Colonial Pipeline, we have not seen our Log4j. 415 00:26:14.769 --> 00:26:18.442 Every day, I feel that that other shoe is going to drop and 416 00:26:18.504 --> 00:26:22.365 at some point, it will. At the same time, we're seeing economic 417 00:26:22.427 --> 00:26:25.976 conditions globally just go into areas we've not seen for 418 00:26:26.038 --> 00:26:29.836 generations in some cases. I think as you look to the economy 419 00:26:29.898 --> 00:26:33.696 this year, it may well be that the rich won't continue to get 420 00:26:33.758 --> 00:26:37.805 significantly richer. But I like to think that the poorly secured 421 00:26:37.868 --> 00:26:41.541 will continue to get better secured because of these storms 422 00:26:41.603 --> 00:26:44.530 that are brewing and the momentum that we have. 423 00:26:46.010 --> 00:26:48.410 Anna Delaney: Well said! You say we've not seen a Kaseya. Well, 424 00:26:48.410 --> 00:26:51.050 what holiday — what US holiday — do we have coming up? 425 00:26:51.590 --> 00:26:53.330 Tom Field: Fourth of July is coming up very quickly. 426 00:26:54.350 --> 00:26:57.140 Anna Delaney: Let's touch wood here. Well, that is 427 00:26:57.320 --> 00:26:59.960 unfortunately what we have time for. Thank you very much, Lisa 428 00:27:00.110 --> 00:27:04.280 and David and Tom, as always. Thank you very much for 429 00:27:04.280 --> 00:27:05.720 watching. Until next time!