Anna Delaney: Hi, welcome to the ISMG Editors' Panel. I'm Anna Delaney, and this is our weekly conversation around the most important themes in the industry. And joining me this week are Tom Field, senior vice president of editorial; Matthew Schwartz, executive editor of DataBreachToday and Europe; and Tony Morbin, executive news editor for the EU. Great to see you all! Long time!

Tom Field: It's odd to see each other virtually now.

Anna Delaney: Yeah. You remember how that feels?

Matthew Schwartz: It's a poor substitute for the real thing, I feel.

Tony Morbin: Two years of seeing Tom twice a day, and then, you know, before I actually see him in person, but yeah, great to actually see people in person.

Anna Delaney: Yes. So I know, for our audience, we all met in London this week for our London Summit — the ISMG London Summit — as you can tell by our backgrounds. So Tom, talk us through your background.

Tom Field: Well, I think I was with you when I took that picture, you and Matthew on the way to dinner the other night after our cybersecurity summit, of course. We were then near Southbank, and lovely picture as the sun was starting to set behind the London Eye.
And of course, we got colorful photos on the way back as well. That was a nice way to cap off several days in London with the opportunity to bring the team together and to host our first live summit event in two and a half years.

Anna Delaney: Yeah, it was fabulous. And, Tony?

Tony Morbin: Yeah, pretty standard House of Commons, Big Ben, but without the scaffolding. You know, as Tom was mentioning, we can actually see it now, it hasn't got the scaffolding. So it's a lovely view.

Anna Delaney: And, Matt, you've got a rather arty shot, but still London.

Matthew Schwartz: Still London. This is Kings Cross as I was pulling out of the station on my way back home up to Scotland. So a little bit of blurry train motion. You know, feelings of loss and regret having to leave London after a successful event and seeing everyone in person.

Anna Delaney: Yeah. And it was two years since you've been here as well.

Matthew Schwartz: Nearly, yes. So it was wonderful to be back. There were attempts to do it before, but obviously, there's been some ups and downs the last couple of years. So great to finally make it.

Anna Delaney: And I'm showing you another side of London — yesterday at the Chelsea Flower Show.
So London in full bloom. But there is a theme here, it is London. Tom, what were the highlights from the conference?

Tom Field: Where do we begin? First of all, the notion that we did put on our first live conference event in two and a half years, having the opportunity to see speakers that we hadn't seen in that time and to meet new people and to have the vibrancy of the topics that we talked about, whether it was ransomware, or resilience, or mental health. I think that we benefited from having the thought leadership of Don Gibson, the security leader with the UK's Department for International Trade, because he brought good topics and good speakers to us. And he was a steady presence throughout the event. I think he kept us very tapped into the zeitgeist, so to speak, so that we were able to ensure that we were talking about the topics that were most relevant. And we had speakers who were new to us, but certainly had great depth of expertise in their topic. So it was such a terrific opportunity to bring everybody together. And of course, we had two stages going at a time. We had a video studio going. So it was as much busyness as usual.
But if I were to share what for me was the single biggest highlight — and we're talking about, again, first live event in two and a half years in London, first opportunity to be up on the stage with this team, and, you know, the attendees that we had, the response that we had — the single highlight I will share with you, because it was this: it was the opportunity to get this team together for the very first time. Anna, you and I have worked together for two and a half years now. Tony, for about two years. Matt and Tony, you had never met before. Anna and Matt, you had never met before. Tony, you and I had never met before. The opportunity for us to finally get together and to be able to share this experience together on the same stage, in the same city, was to me absolutely priceless.

Anna Delaney: We all recognize ourselves in that pic. It was great. It definitely was.

Tony Morbin: Or maybe I'm due a haircut, didn't quite make it in time for the show.

Anna Delaney: Tony, what were the highlights for you? What topics stood out?

Tony Morbin: Well, there was so much. So I'm afraid I'm going to have to miss loads of it as I run through some of the things I particularly liked.
And a couple of quotes, you know: the threat landscape hasn't changed one jot from the Ukraine war. Russia was already a hostile threat. There has been a cyber war, but the Russians have been the victims, and let's not count them out too early. And then basically saying that, you know, really it's the shift to the public cloud, and errors like misconfiguration and phishing are still a bigger threat than state APTs. Although one change we have seen is the increased hacktivist threat, and particularly the threat to reputation from hacktivists going forward. The normal things apply: keep up-to-date, look after ID. There's always going to be zero days; it's how we contain them when they happen. Use your ransomware playbook to defend against virtually any attack, because with the escalation from Ukraine fallout, it's going to be appropriate to defend against anything. Patch or be punished. Work from anywhere means you can be attacked from anywhere. Access control is a focus. Cloud security architecture misconfigurations cause whole loads of problems. One financial institution noted how its fund administrators exchange information via email.
So what they're doing is looking to drive and shift the information exchange to portals, so that they can control it and control where the traffic's moving, shifting their business to architectures that are going to be more secure. Another interesting point was that pre-Colonial Pipeline, the only constraint on ransomware operators was the number of affiliates they were able to get. Our defenses were incidental, including Colonial's. And as a result, we've seen more vertical integration in the ransomware ecosystem, and the focus of the ransomware gangs is now on smaller organizations that wouldn't be regarded as critical infrastructure — just basically getting the heat off. It's not so much that they're targeting as that they're triaging where they've gained access. So they're going after the smaller victims. There was a nice line from one of the sessions you did: the global talent pool is there, but it's difficult to access due to legal issues of employment and cultural issues. For me, the big one was really resilience. Once, business continuity was about a one-time event. Now it's all the time, and how do you continue to operate as a business? It was pointed out that business resilience and business recovery are different. Having a backup is not enough.
You need good backups in place, obviously, but it should encompass the whole business, including physical resilience, and a long-term strategy of how to survive. Nothing's 100% secure, but you need to be comfortable with the level of risk you have. And also get a budget for unknown things that might happen. Who owns cyber risk was probably the last thing I'll cover: it's owned by everybody in the business, but it needs to have board accountability. It's certainly not just an IT issue. Businesses need to be able to articulate their risk — what's the vulnerability, what's the impact of the risk — so you've got to come to the board prepared with a simple story. They own the risk, they're responsible for it. But businesses will only take ownership of the risk if they understand the risk and the value of the data, or whatever it is at risk, to quantify it and get the business to accept the risk.

Anna Delaney: Yeah, great overview, Tony.

Tom Field: If you missed the summit, please talk to Tony.

Tony Morbin: I missed loads more, I'm afraid, you know. You'll still miss a lot if you didn't go to the summit.

Anna Delaney: I'll pass the baton to Matthew. I mean, I'm sure you'll have other stuff to add.
But Tony did mention Colonial, and Colonial came up a lot. It seems to be such a pivotal moment in the history of ransomware. So we'd love your thoughts as to what you'd like to add, but also whether you learned anything new.

Tom Field: If I can add, I think it's important to point out what Anna was talking about here. Yes, Colonial did come up increasingly. What's unique about that: it's an entirely American event, and yet it had such repercussions. So Matthew, go forward.

Matthew Schwartz: I had great discussions about Colonial Pipeline. I wonder if it's a bit of a lazy reference in the security industry now. I'm definitely not calling anybody out. I was at a conference in Dublin in November, on cybercrime, which was fantastic. And everyone was able to bring their own presentation. And literally everyone had a presentation on ransomware. So obviously, it's a hot topic. It brings a lot of different things together that we've been battling/combating for years. Colonial Pipeline is interesting, because it has so many different aspects to it. I think when you're trying to illustrate both the defensive side and the offensive side, and how governments have increasingly responded, it's a good example.
Definitely it is not the only example. But as it was a year, almost to the day, from when the Colonial Pipeline attack happened, I think it was a perfect example to be hearing at the London Summit. Unfortunately, we'll probably have lots of fresher examples as we go forward, especially with RSA coming up on us quickly. So I think it was great. But there was so much discussed, as Tom and Tony have highlighted. I love the line from Ian Thornton-Trump about cyberthreats. It was in the first session that we had, where he said, "Everyone expected Russia to launch cyber war. Russia, to its chagrin, has found that it's the target of numerous online operations instead." That was great. I had a really good interview with Ian, in our studio, where he further expanded on the threat landscape, and we talked about all sorts of things. But drones, for example, have completely changed what's happening. And as he said, "Who knew that satellite communications would so change the landscape?" There was an online attack we saw from Russia on the very first day of the war, attempting to knock out satellite communications used for such things as artillery and also drones. And then you had Elon Musk's company swooping in with the Starlink terminals and hooking everything up.
So much is happening, so many different aspects. And on the ransomware front, the Russia-Ukraine war has also had some interesting effects, as people discussed. You had the National Security Agency saying we've seen the volume of ransomware attacks go down recently, we think because of the war; the difficulty of routing payments to the criminals is perhaps part of that. So many different wonderful things to analyze. Tony obviously has called out a whole bunch of them. I want to continue the risk discussion, just very briefly, because we had such good speakers this year. So many of them were at the sharp end of a lot of this stuff. And we had, for example, Douglas Mujana, who is at Societe Generale. He's their vice president of Information Technology Risk. Who better to talk cyber risk? And him, together with Milos Pesic, who is also a cybersecurity executive — I had them on a panel, speaking about the art and science of translating cyber risk and loss exposure into quantifiable measures. Tony excerpted some of that, but one of the points I loved that they made was about communicating cyber risk not just to the board, but also to the business leaders, the heads of the different lines of business.
Communicating it in such a way that they understood, for example, if they're going to be suffering a downtime of three days; or working with them to say, you're hit by ransomware, what do you think your outage is going to be? And working through that exercise, so that something like a ransomware attack isn't just scary malware; it's the fact that you can't use your systems for a certain amount of time, based on the controls and the backups and the restoration that we have in place. Getting to a dollars-and-cents or pounds-and-pence figure for what that impact is going to be. And then this goes into another session that I was also doing, but basically coming up with a way to express that in business terms. What is the risk? And we're quantifying that risk in terms of the dollar impact, the outage impact, the reputational impact, going before the board and saying, 'you own this, do you wish to spend to mitigate it?' Or are you okay to just say, 'we think we can handle that risk, we're not going to bother'? Do we need cyber insurance to help us? Just elevating that discussion. And it was wonderful to see the uptake and the interaction that these ideas had with the audience, really great engagement. I'll just highlight one more thing.
One of the things I loved is behavioral change and user awareness. And there's been a really great shift in recent years to not blame the user, which is paramount. If security functioned really well, we wouldn't need user inputs to help ensure that it succeeded. And I had a wonderful panel, including a gentleman from Airbus, Adam Wedgbury, who talked about bringing in behavioral psychologists, and even marketing professionals, to help ensure that what they're doing isn't to design systems that they hope users will use in a certain way, but to understand human nature, and to try to design better security in a way that is easier for people to use, easier for them to do the right thing. As another panelist, Ash Hunt at Sanne Group, said, "If I am presented with a security control that relies on the user to ensure that it is effective, I veto it immediately." And I just think that's a wonderful, more mature, more helpful and hopefully more successful attitude that we're seeing when it comes to cybersecurity programs.
Tom Field: Matt, if I may follow up on the resilience: a theme I heard not just at London, but even at the virtual roundtable I hosted after the event, is that increasingly, security leaders are involving their senior leaders and even their boards in tabletop exercises, in preparation for events, to be able to sort of incubate that awareness that you're talking about of the business risk, and who has a stake in it. A line that came out of the discussion that we had yesterday, I thought was interesting: there are no game-time players when it comes to cybersecurity incidents. In other words, you don't just check yourself into the game and you're ready to go. You do need to practice, you do need to rehearse. And this is a theme I'm hearing consistently.

Matthew Schwartz: Yeah, great example about the tabletop exercises. Douglas Mujana from Societe Generale brought that up. He said some organizations are doing a tabletop exercise every quarter. And it's two hours out of their day. And it comes around and they go, 'Oh, no! Not this again.' He said, make it real, get in with the business and figure out what it is they're really worried about or having to deal with.
So that when you are doing these tabletop exercises, they are more interesting. They're relevant. They're challenges that they are actually having to deal with. And he said that he's trying to get them excited about that. Not fear and uncertainty and doubt, but really tailoring it to the sorts of things they're dealing with. He said that's also huge in terms of getting buy-in. And as you said, getting that practice and that mindset that you're going to need when things inevitably blow up on a Friday night.

Tony Morbin: As they were saying, it was the building of muscle memory, wasn't it? And of course, because a lot of people in this industry are coming from sort of military-type backgrounds, you don't actually fight wars every day, but you practice a hell of a lot.

Tom Field: That's a good point. Anna, I hate to do this, but I want to set you up for a conversation here. We've talked an awful lot about the London event. To me, absolutely, the highlight of the content was the panel that you moderated at day's end, about CISOs' health, about mental health. And I've attended and moderated a lot of summits and panels. I've seen people laugh, I've seen people get angry. I've seen people get engaged.
I'm not sure I've ever seen people cry until the session that you moderated. I think it was a highlight, and I'd love to hear your take on it.

Anna Delaney: Yeah, it was an amazing panel and an important discussion. And when we talk about security, underfunded and understaffed always come up. But we also talked about resilience — the resilience of our organizations. What about our people? We've been having great conversations this year about, wow, cybersecurity is now a national security threat. And also, security leaders and their teams are now frontline workers and emergency workers. If we don't look after our people, how are we going to protect our systems and our organizations? So there were some really sensitive themes that were raised. And just showing that, yes, our leaders are actually human beings, and they have challenges as well. I do hope these discussions go further than discussions, though. How can we make change in organizations? But the CISO of Penguin, Deborah Howorth, did say that, yes, there's only so much your organization can do. You have to look after yourself, you have to look after number one.
And that means putting boundaries in place, and whatever that takes to look after your well-being and the well-being of your teams as well. So I learnt a lot. I thought it was really engaging. And there was a very engaged audience as well, some great questions, I've got to say.

Tony Morbin: It was very revelatory, your panelists. And as you were saying about the ongoing impact, I think anybody in the audience in a position of power to do so was sitting there thinking, 'yeah, our organization needs to be more supportive of its people.' I think that message totally got across to everybody in that audience.

Tom Field: Anna, you did a terrific job moderating that session as well. I like to think that the work that we do here on a weekly basis gives you the practice so that you're a game-time player.

Matthew Schwartz: You mean, having to handle you, Tom, on a weekly basis?

Tom Field: Exactly!

Matthew Schwartz: Anything can happen, anything will happen. Be prepared for the best. Sorry, prepare for the worst, hope for the best.
Tony Morbin: And Tom sort of said early on, you know, we had our own practice in resilience, with speakers who had COVID, and adapting our plans and having to change things around, but that's life.

Anna Delaney: That's absolutely right! Tom, Matt, you were doing a bit of improvisation there. But the show must go on.

Matthew Schwartz: It's always fun. And I think showing up is a big part. And I say that for the attendees as well. The energy in the rooms was palpable. And I was speaking with individuals, attendees, who said this is the first event they had been to in two years, probably. And their eyes lit up a little bit, I think, with the opportunity to mix a bit. And the discussions that we had, ending, as Tom said, with a phenomenal panel that you did that hit home for a lot of people, gave them some marching orders that weren't just about EDR or XDR, but how can we make our programs better, more humane, more sustainable. We just had a wonderful range of themes and experts and energy.

Tom Field: I will say, I hugged deliberately and I tested negative. It was a good event.

Anna Delaney: Well, bring on RSA. That's what I'd say.
357
00:20:45.190 --> 00:20:50.950
Tom Field: This was the 10k race in advance of the marathon

358
00:20:50.950 --> 00:20:52.150
coming up in two weeks.

359
00:20:52.960 --> 00:20:57.850
Anna Delaney: Yeah, our dress rehearsal. So we often, at these

360
00:20:57.850 --> 00:21:01.240
conferences, talk about current themes, but also future trends.

361
00:21:01.900 --> 00:21:06.250
What was one word that sticks out that perhaps represents or

362
00:21:06.250 --> 00:21:09.370
even sets the tone for the next half a year?

363
00:21:11.410 --> 00:21:11.980
Tony Morbin: Automation.

364
00:21:11.980 --> 00:21:13.600
Tom Field: I am going to go back...

365
00:21:13.630 --> 00:21:14.980
Tony Morbin: Sorry, Tom, go ahead.

366
00:21:15.680 --> 00:21:17.720
Tom Field: I'm going to go back to response. Because I think

367
00:21:17.720 --> 00:21:20.330
this is something I'm hearing consistently in all the events

368
00:21:20.330 --> 00:21:23.630
that we host, whether they're actual or virtual. The notion of

369
00:21:23.630 --> 00:21:29.780
having this response plan and team ready and tested and ready

370
00:21:29.780 --> 00:21:33.740
and tested in today's environment, with the landscape

371
00:21:33.740 --> 00:21:37.130
that we have, and our presence in the cloud, and the lack of

372
00:21:37.130 --> 00:21:40.910
visibility, and then all the different devices and hundreds

373
00:21:40.910 --> 00:21:44.210
of personal offices that we have. Response for me is the big

374
00:21:44.210 --> 00:21:44.390
one.

375
00:21:45.220 --> 00:21:47.980
Anna Delaney: Nice! Tony said automation, I think.

376
00:21:48.110 --> 00:21:50.390
Tony Morbin: Yeah, I'll go back on that with a few seconds more

377
00:21:50.390 --> 00:21:53.210
to think about it. I'll jump back. It's a bit more clichéd,

378
00:21:53.210 --> 00:21:58.700
but resilience really is.
And I think, having just come out of

379
00:21:58.700 --> 00:22:02.150
the pandemic, and then we've got the war and we've got so many

380
00:22:02.150 --> 00:22:07.430
things happening, resilience is, you know, it's pretty key.

381
00:22:09.460 --> 00:22:12.700
Matthew Schwartz: I will say the unexpected. At the CYBERUK

382
00:22:12.700 --> 00:22:15.280
conference here a couple of weeks ago, Jen Easterly, the

383
00:22:15.280 --> 00:22:19.360
director of CISA in the United States, did a Monty Python riff,

384
00:22:19.390 --> 00:22:22.450
right? Because it was Britain. No one expected the Spanish

385
00:22:22.450 --> 00:22:27.520
Inquisition. And that was her metaphor, or her pop

386
00:22:27.520 --> 00:22:30.520
culture reference, for encapsulating what it's like to

387
00:22:30.520 --> 00:22:33.760
have been in cybersecurity and what it will be like. You don't

388
00:22:33.760 --> 00:22:36.370
know what's going to come storming in your front door. So

389
00:22:36.580 --> 00:22:39.970
be prepared, try to learn from the past, but be prepared to

390
00:22:39.970 --> 00:22:43.150
have your expectations challenged and sometimes

391
00:22:44.980 --> 00:22:46.630
impinged upon, with a British accent, I suppose.

392
00:22:48.760 --> 00:22:52.120
Anna Delaney: Can only get better. Well, thank you very

393
00:22:52.120 --> 00:22:55.390
much. This has been a great discussion, and loved meeting

394
00:22:55.390 --> 00:22:58.300
you all in person as well again, or for the first time, so that's

395
00:22:58.300 --> 00:23:03.820
been great. We have to leave it there, unfortunately.

396
00:23:04.090 --> 00:23:05.950
Matthew Schwartz: It's fun to do it virtually. But let's do it

397
00:23:05.980 --> 00:23:07.870
again, live, sometime soon.

398
00:23:08.140 --> 00:23:08.800
Tom Field: Maybe in a week.

399
00:23:10.690 --> 00:23:14.830
Anna Delaney: You're on. Thank you very much, Tom, Matt, Tony.

400
00:23:14.920 --> 00:23:17.320
And thank you so much for watching.
Until next time!