WEBVTT 1 00:00:03.270 --> 00:00:05.400 Anna Delaney: Hello, this is Proof of Concept, a talk show 2 00:00:05.400 --> 00:00:08.880 where we invite leading experts to discuss the cybersecurity and 3 00:00:08.880 --> 00:00:12.180 privacy challenges of today and tomorrow, and how we can 4 00:00:12.180 --> 00:00:15.330 potentially solve them. We are your hosts. I'm Anna Delaney, 5 00:00:15.330 --> 00:00:17.460 director of productions at ISMG. 6 00:00:18.150 --> 00:00:20.550 Tom Field: I'm Tom Field, senior vice president of editorial at 7 00:00:20.550 --> 00:00:23.460 ISMG. Anna, it is always a pleasure to see you and have the 8 00:00:23.460 --> 00:00:24.270 chance to speak with you. 9 00:00:24.690 --> 00:00:27.300 Anna Delaney: Absolutely. And Tom, in a moment, we'll be 10 00:00:27.300 --> 00:00:30.600 joined by two leading CISOs in the U.S. government sector, and 11 00:00:30.600 --> 00:00:33.570 there is lots we could explore. But on this episode, we'll 12 00:00:33.570 --> 00:00:37.230 discuss their experiences, challenges and strategies when 13 00:00:37.230 --> 00:00:40.260 it comes to securing digital government services, and 14 00:00:40.260 --> 00:00:43.620 ensuring the protection of citizen data and government 15 00:00:43.620 --> 00:00:47.520 systems. And as you know very well, the U.S. government faces 16 00:00:47.520 --> 00:00:51.510 many unique cybersecurity challenges due to its size, 17 00:00:51.570 --> 00:00:56.280 complexity, and the political nature of its operations. Tom, 18 00:00:56.400 --> 00:00:59.430 you speak to many, many CISOs in the sector. What are some of the 19 00:00:59.490 --> 00:01:01.890 issues you hear them discuss? 20 00:01:02.320 --> 00:01:05.530 Tom Field: Here's life for a government CISO. You work for an 21 00:01:05.530 --> 00:01:10.810 entity that's among the most targeted in the world. You have 22 00:01:10.810 --> 00:01:15.670 one of the broadest potential attack surfaces in the world: individuals 23 00:01:15.670 --> 00:01:20.110 and devices you don't directly control. You are 24 00:01:20.110 --> 00:01:26.470 laden with legacy equipment. It is a struggle to get the budget and 25 00:01:26.470 --> 00:01:31.960 human resources that you need. And yet, the data that you guard 26 00:01:32.740 --> 00:01:37.300 is among the most critical to protect. It's the ultimate 27 00:01:37.300 --> 00:01:40.240 quandary. What do you do? This is what we hear 28 00:01:40.240 --> 00:01:43.960 consistently from government security leaders. And add on top 29 00:01:43.960 --> 00:01:48.040 of that regulatory pressure, even things such as President 30 00:01:48.040 --> 00:01:55.090 Biden's 2021 Executive Order, and then you get the term that 31 00:01:55.090 --> 00:01:58.630 is familiar to government CISOs: unfunded mandates. You've 32 00:01:58.630 --> 00:02:02.200 got to do this, but where is the budget coming from? This is what 33 00:02:03.280 --> 00:02:06.490 the public cybersecurity servant deals with every day. And they've 34 00:02:06.520 --> 00:02:07.600 got my sympathy for that. 35 00:02:07.000 --> 00:02:11.320 Anna Delaney: It's a tough job, indeed. Cyber espionage, of 36 00:02:11.320 --> 00:02:16.090 course, and election security.
Addressing 37 00:02:16.090 --> 00:02:19.720 these challenges really does require a comprehensive and 38 00:02:19.750 --> 00:02:22.870 adaptive cybersecurity strategy that involves collaboration 39 00:02:22.870 --> 00:02:26.500 between agencies, effective risk management, continuous training, 40 00:02:26.860 --> 00:02:30.760 and really leveraging cutting-edge technologies. And I 41 00:02:30.760 --> 00:02:32.800 want to dive into these areas with our speakers. 42 00:02:33.410 --> 00:02:35.990 Tom Field: But to add to this: you get one of the biggest 43 00:02:36.020 --> 00:02:40.400 possible targets on your back and one of the smallest possible 44 00:02:40.700 --> 00:02:45.080 wallets in your back pocket. And because you are government, you 45 00:02:45.080 --> 00:02:48.830 aren't able to pay to retain talent as well as the private 46 00:02:48.830 --> 00:02:51.740 sector does. So you're constantly churning people, 47 00:02:51.740 --> 00:02:54.830 losing people to the private sector and having to replace 48 00:02:54.830 --> 00:02:57.410 those people. And how do you hold on to the human 49 00:02:57.410 --> 00:03:00.830 intellectual capital that you need to move forward? It's a 50 00:03:00.830 --> 00:03:03.020 struggle. I think our guests are probably going to tell us that. 51 00:03:03.600 --> 00:03:06.300 Anna Delaney: Sure, they've got lots to share about the challenges, 52 00:03:06.300 --> 00:03:08.670 but also their successes. So I'm really looking forward to this. 53 00:03:09.030 --> 00:03:11.970 And at this moment, I'd love to welcome our first guest. I'm 54 00:03:11.970 --> 00:03:15.840 very pleased to introduce Jeff Brown, CISO for the State of 55 00:03:15.840 --> 00:03:18.450 Connecticut. Thank you so much for joining us, Jeff. 56 00:03:19.020 --> 00:03:20.970 Jeff Brown: Oh, fantastic. Thanks for inviting me, Anna. 57 00:03:20.970 --> 00:03:23.670 And I'm really happy to be here today. 58 00:03:24.000 --> 00:03:25.110 Tom Field: Always a pleasure to see you, Jeff. 59 00:03:25.890 --> 00:03:28.140 Anna Delaney: Very good. Well, Jeff, I know you've been working 60 00:03:28.350 --> 00:03:31.740 very hard on delivering online services to citizens. Talk us 61 00:03:31.740 --> 00:03:34.560 through how you conduct risk assessments to identify 62 00:03:34.560 --> 00:03:37.410 potential vulnerabilities and threats in your digital 63 00:03:37.410 --> 00:03:38.430 government services. 64 00:03:38.819 --> 00:03:41.459 Jeff Brown: Yeah, thanks. And that's a great question. Because 65 00:03:41.459 --> 00:03:44.549 most states are now in the midst of moving towards digital 66 00:03:44.549 --> 00:03:48.089 government. And really, what digital government means for us 67 00:03:48.089 --> 00:03:50.429 is that people want to interact with the government in a 68 00:03:50.429 --> 00:03:53.969 different way now. You know, we actually have some 69 00:03:53.969 --> 00:03:56.969 really big success stories in the State of Connecticut. As an 70 00:03:56.969 --> 00:04:01.259 example, we now do more business online than in person at the 71 00:04:01.259 --> 00:04:04.769 Department of Motor Vehicles. And who isn't happy with that? 72 00:04:05.489 --> 00:04:08.909 That's really a big win. I mean, one of the ...
we actually made 73 00:04:08.909 --> 00:04:11.849 the Hartford Courant above the fold, which 74 00:04:11.849 --> 00:04:15.539 means that we were headline news for something good, you know, 75 00:04:15.899 --> 00:04:18.089 the fact that we can now interact with the government the 76 00:04:18.089 --> 00:04:22.949 way our citizens want to, which is 24/7, online, on mobile devices. 77 00:04:23.159 --> 00:04:25.859 And all of that stuff brings with it a whole new host of 78 00:04:25.859 --> 00:04:29.099 vulnerabilities, things that we just haven't thought about, 79 00:04:29.099 --> 00:04:33.299 things that we have to really kind of think through. When we 80 00:04:33.299 --> 00:04:36.749 have all of those services exposed, we have to worry about 81 00:04:36.749 --> 00:04:40.409 whether they are available all the time. You know, many of our services 82 00:04:40.409 --> 00:04:43.469 in the State of Connecticut, some of these may be Department 83 00:04:43.469 --> 00:04:47.609 of Social Services, are things that people depend on. We run the 911 84 00:04:47.609 --> 00:04:51.779 network. I mean, there are really important things that we do. And 85 00:04:51.779 --> 00:04:54.869 because of that, we have to take a risk-based approach in what 86 00:04:54.869 --> 00:04:58.799 we do and how we do it and where we concentrate our attention, 87 00:04:59.339 --> 00:05:03.119 making sure we are addressing the most serious risks with the 88 00:05:03.119 --> 00:05:06.779 highest impact first. And of course, we take all risks 89 00:05:06.779 --> 00:05:10.649 seriously. But, you know, again, I think there is a struggle with 90 00:05:10.679 --> 00:05:13.829 talent, not only in the state, right? I mean, it's 91 00:05:13.829 --> 00:05:16.949 everywhere. I came from private industry and spent about 92 00:05:16.949 --> 00:05:20.759 25 years in financial services; we struggled there, too. Just, 93 00:05:20.789 --> 00:05:23.759 you know, the people aren't there, and many of them are able to 94 00:05:23.819 --> 00:05:27.599 just move on to another job very quickly. And because of that, we 95 00:05:27.599 --> 00:05:30.239 have to really think about, you know, how we're attracting and 96 00:05:30.239 --> 00:05:33.479 retaining people, and how we're training the people we have. I 97 00:05:33.479 --> 00:05:35.699 think that's also really important. And then really 98 00:05:35.699 --> 00:05:39.029 focusing on the risk and the impact. You know, when I 99 00:05:39.029 --> 00:05:41.369 look at applications that we're moving to digital 100 00:05:41.369 --> 00:05:45.119 government, what does the app do? Can it move money? Does it 101 00:05:45.119 --> 00:05:48.779 expose private information if somebody gets into the system? 102 00:05:49.079 --> 00:05:51.449 What happens if it's not available? What if somebody just 103 00:05:51.449 --> 00:05:54.599 takes the entire thing down? You know, if that's the 911 network, 104 00:05:54.599 --> 00:05:58.799 that's huge. That's a really big problem. So that's how we really 105 00:05:58.799 --> 00:06:01.919 think about cybersecurity in a digital government landscape. 106 00:06:01.919 --> 00:06:07.229 And it's something where we have to be resourceful. In state 107 00:06:07.229 --> 00:06:11.579 cyber, I say that there's a perceived lack of resources, but 108 00:06:11.579 --> 00:06:14.729 it's really a lack of resourcefulness. Like, you know, 109 00:06:14.729 --> 00:06:18.269 we're getting money from grants from the federal government.
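As a rough sketch of the risk-based triage Jeff describes, the scoring below weighs the same impact questions he lists: can the app move money, does it expose private information, and is its availability critical. The service names, weights and ordering are hypothetical, not Connecticut's actual rubric.

```python
# Hypothetical risk-based triage for digital government services.
# Weights and service names are illustrative only.
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    moves_money: bool             # can the app move money?
    exposes_pii: bool             # does a break-in expose private information?
    availability_critical: bool   # e.g., the 911 network

def risk_score(svc: Service) -> int:
    """Higher score = address first (most serious, highest impact)."""
    score = 0
    if svc.moves_money:
        score += 3
    if svc.exposes_pii:
        score += 2
    if svc.availability_critical:
        score += 5
    return score

services = [
    Service("dmv-online", moves_money=True, exposes_pii=True, availability_critical=False),
    Service("911-network", moves_money=False, exposes_pii=False, availability_critical=True),
    Service("license-lookup", moves_money=False, exposes_pii=False, availability_critical=False),
]

# Concentrate attention on the highest-impact risks first.
for svc in sorted(services, key=risk_score, reverse=True):
    print(f"{svc.name}: {risk_score(svc)}")
```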
In 110 00:06:18.269 --> 00:06:20.369 the State of Connecticut, we actually made a case: we have a 111 00:06:20.369 --> 00:06:23.579 cybersecurity bond that we released to fund the program. 112 00:06:24.089 --> 00:06:26.699 You just have to be a little more creative, I think, in terms 113 00:06:26.699 --> 00:06:28.169 of what you do and how you do it. 114 00:06:29.070 --> 00:06:30.990 Anna Delaney: Very good. And I want to talk more about talent 115 00:06:30.990 --> 00:06:35.010 in a moment: how you recruit, how you upskill. But let's take 116 00:06:35.010 --> 00:06:39.060 a look at the threat landscape. What types of cyberthreats or 117 00:06:39.060 --> 00:06:42.270 attacks have you encountered while implementing digital 118 00:06:42.270 --> 00:06:43.200 government services? 119 00:06:43.840 --> 00:06:46.060 Jeff Brown: You know, it's interesting, one thing we found 120 00:06:46.060 --> 00:06:49.060 was that it doesn't really matter whether you're letting people 121 00:06:49.060 --> 00:06:51.430 know that a site has been launched or not. So we've had a 122 00:06:51.430 --> 00:06:54.310 few instances where we have a very soft launch, something's 123 00:06:54.310 --> 00:06:58.030 just turned on. People find it very quickly whether you've 124 00:06:58.030 --> 00:07:01.240 talked about it or not. In fact, we've seen sites go live and, in 125 00:07:01.240 --> 00:07:04.480 parallel, people are on the dark web saying, like, look, there's a 126 00:07:04.480 --> 00:07:06.940 new site that's just gone up in the State of Connecticut, right? 127 00:07:06.940 --> 00:07:10.420 So we need to be strong with that; we need to be in lockstep with 128 00:07:10.450 --> 00:07:13.990 the cybersecurity controls. As things launch, we can't just put 129 00:07:13.990 --> 00:07:17.830 anything out on the internet. That really means that we have 130 00:07:17.830 --> 00:07:20.740 to get a lot more involved with our agencies. And you know, the 131 00:07:20.740 --> 00:07:23.770 State of Connecticut is in the midst of centralizing a lot of 132 00:07:23.770 --> 00:07:29.320 IT and optimizing it, so we are now much more in tune with what's 133 00:07:29.320 --> 00:07:31.930 going on at the agencies. So that's, I think, really 134 00:07:31.930 --> 00:07:34.360 important. Now, one thing I would say is that there is a 135 00:07:34.360 --> 00:07:36.580 saying in the industry: if you've seen one state, you've 136 00:07:36.580 --> 00:07:39.610 seen one state. They are all organized slightly differently. 137 00:07:39.610 --> 00:07:43.090 They all have different takes on digital government. So this is 138 00:07:43.090 --> 00:07:45.580 really what we're doing in Connecticut. But really, we have 139 00:07:45.580 --> 00:07:49.870 to be very prepared; we have to have, you know, 24/7 monitoring, 140 00:07:49.870 --> 00:07:53.920 which I think is a new subject for some states, and trying to 141 00:07:53.920 --> 00:07:58.360 get that with partners and employees can be a big 142 00:07:58.360 --> 00:08:00.790 challenge. You know, nobody wants to work that midnight to 8 143 00:08:00.790 --> 00:08:01.450 a.m. shift. 144 00:08:02.110 --> 00:08:05.380 Anna Delaney: So can you dive deeper and share examples 145 00:08:05.380 --> 00:08:09.670 of how these experiences have influenced your approach to 146 00:08:09.670 --> 00:08:12.250 cybersecurity risk management? You mentioned collaboration with 147 00:08:12.250 --> 00:08:14.380 agencies and 24/7 monitoring.
148 00:08:15.200 --> 00:08:17.840 Jeff Brown: Yeah, and not only that, but also understanding 149 00:08:17.840 --> 00:08:21.650 what the sites do, right? A lot of what we have to do requires the 150 00:08:21.650 --> 00:08:25.340 business context, and we can't really do cybersecurity without 151 00:08:25.340 --> 00:08:29.300 having the business context. As an example, for some sites, 152 00:08:29.300 --> 00:08:31.940 data integrity may be far more important than what's in the 153 00:08:31.940 --> 00:08:35.660 system, right? It may be a licensing kind of application, 154 00:08:35.660 --> 00:08:37.760 right? We don't want the wrong people to have the wrong 155 00:08:37.760 --> 00:08:41.180 licenses. Other things may move money, like the Department of 156 00:08:41.180 --> 00:08:43.940 Labor. You know, certainly during the pandemic, that was a 157 00:08:43.940 --> 00:08:47.270 really big issue, not just in our state but across all states, 158 00:08:47.270 --> 00:08:50.600 where suddenly the Department of Labor is moving a lot of money 159 00:08:50.600 --> 00:08:54.080 around, and attackers know that. And a lot of the controls in 160 00:08:54.110 --> 00:08:57.950 those kinds of systems are ancient, so a lot of them 161 00:08:57.950 --> 00:09:00.800 just weren't ready for that kind of stuff. So now business fraud 162 00:09:01.100 --> 00:09:04.310 is a very big subject in the state. That's something ... and in 163 00:09:04.310 --> 00:09:08.090 all states. I mean, I would say we do spend a lot of time 164 00:09:08.090 --> 00:09:11.120 talking to other CISOs in other states; this is not unique to 165 00:09:11.120 --> 00:09:15.650 us. We have to worry now: if you're going to do more work 166 00:09:15.650 --> 00:09:18.740 with the government online, then you have to expect more fraud. 167 00:09:19.520 --> 00:09:22.700 So fraud and cyber, there's a very fine line between those 168 00:09:22.700 --> 00:09:26.930 two, but I think a lot of CISOs are going to find fraud much 169 00:09:26.930 --> 00:09:29.840 more on their doorstep. I mean, that's something that's ... and 170 00:09:29.840 --> 00:09:33.050 to handle fraud correctly, you really need to understand the 171 00:09:33.050 --> 00:09:36.410 business context. Is the PO box okay? Is that a sign of fraud? 172 00:09:37.100 --> 00:09:40.070 You know, those kinds of things are really business specific, so 173 00:09:40.070 --> 00:09:42.710 we have to work with the agencies closer than ever 174 00:09:42.710 --> 00:09:43.310 before. 175 00:09:44.650 --> 00:09:47.620 Anna Delaney: Jeff, the 2022 Connecticut State Cybersecurity 176 00:09:47.620 --> 00:09:51.340 Strategy called attention to the cyber skills shortage and called for 177 00:09:51.340 --> 00:09:54.760 partnering with educational institutions to support training 178 00:09:54.760 --> 00:09:58.420 and education programs. And as a result, you established the Connecticut 179 00:09:58.450 --> 00:10:02.230 Cyberhub. So tell us about it and some of the successes you've 180 00:10:02.230 --> 00:10:02.440 had. 181 00:10:03.200 --> 00:10:06.110 Jeff Brown: Yeah, thanks. That's a great subject, and one that is 182 00:10:06.110 --> 00:10:09.950 very near and dear to my heart. So we are looking at training 183 00:10:09.950 --> 00:10:12.680 the next generation. Everybody talks about the shortage of 184 00:10:12.680 --> 00:10:15.650 cyber skills, you know, but we have to do something about it.
185 00:10:15.650 --> 00:10:18.830 So the White House strategy was very interesting in that they 186 00:10:18.860 --> 00:10:23.120 basically said that until a cyber career is within 187 00:10:23.120 --> 00:10:25.970 the reach of every capable American who wants to pursue a 188 00:10:25.970 --> 00:10:29.810 cyber career, we haven't succeeded. Now, that said, it 189 00:10:29.810 --> 00:10:32.240 takes, you know, a good solid four years to get people through 190 00:10:32.240 --> 00:10:35.420 universities, and it takes a lot of money. That really puts it out 191 00:10:35.420 --> 00:10:38.900 of reach for some, and there are some really talented people who 192 00:10:38.930 --> 00:10:41.810 are taking a much more skills-based learning path. You know, 193 00:10:41.810 --> 00:10:44.120 we're reaching out to universities. And what we've 194 00:10:44.120 --> 00:10:48.680 seen is that some of the bigger universities, some of 195 00:10:48.680 --> 00:10:50.930 them have very good cyber programs, but they don't 196 00:10:50.960 --> 00:10:54.380 necessarily have the ties to businesses, and businesses 197 00:10:54.380 --> 00:10:56.240 have struggled to hire. So there's a little bit of a 198 00:10:56.240 --> 00:11:00.590 disconnect between universities, employers and students. So in 199 00:11:00.590 --> 00:11:02.480 Connecticut, we've actually launched this virtual 200 00:11:02.480 --> 00:11:06.080 apprenticeship model called the Cyberhub. And we just launched 201 00:11:06.080 --> 00:11:09.290 that with the mayor of Stamford, Connecticut, Mayor Caroline 202 00:11:09.290 --> 00:11:13.610 Simmons. And we have a couple of really important wins to report 203 00:11:13.610 --> 00:11:16.850 already, in that we have Cigna on board, we have Citizens Bank, 204 00:11:16.850 --> 00:11:20.960 and others, like Microsoft and Splunk, are partnering with us. So 205 00:11:20.960 --> 00:11:24.320 there's just a lot of interest in this. And the idea is really 206 00:11:24.320 --> 00:11:27.950 that not only is there an online learning component to this, but 207 00:11:27.950 --> 00:11:30.890 we also partner people with someone like me, as an example. 208 00:11:30.920 --> 00:11:33.800 We have a number of different mentors, but we are the reality 209 00:11:33.800 --> 00:11:36.710 check for people, so that it isn't just textbook learning: 210 00:11:37.580 --> 00:11:40.610 capstone projects, things like fictitious incidents, 211 00:11:40.610 --> 00:11:43.970 people working in a fictitious company together, to 212 00:11:44.030 --> 00:11:47.060 really take textbook and knowledge-based learning and 213 00:11:47.060 --> 00:11:50.630 bring it out into the real world, so that at the end of this, many 214 00:11:50.630 --> 00:11:53.510 students are landing straight into an internship or, even 215 00:11:53.510 --> 00:11:54.950 better yet, straight into a job. 216 00:11:55.830 --> 00:11:58.350 Anna Delaney: It's an excellent initiative. Thank you, Jeff. And 217 00:11:58.350 --> 00:12:00.540 thank you so much for the wealth of information you've just 218 00:12:00.540 --> 00:12:03.900 shared here, and for offering your perspective and experience. 219 00:12:04.050 --> 00:12:05.220 Well, it's back to you, Tom. 220 00:12:05.840 --> 00:12:07.400 Tom Field: Very good. Well, let's go across the country to 221 00:12:07.400 --> 00:12:10.670 Arizona. I want to introduce our next guest, Lester Godsey. He 222 00:12:10.670 --> 00:12:14.000 is the CISO of Maricopa County.
Lester, thanks so much for being 223 00:12:14.000 --> 00:12:14.810 here with us today. 224 00:12:15.290 --> 00:12:16.730 Lester Godsey: Thanks for having me. Appreciate it. 225 00:12:17.300 --> 00:12:19.490 Tom Field: Lester, to build on some of what Anna and Jeff 226 00:12:19.490 --> 00:12:24.260 talked about, how would you say the drive for easier government 227 00:12:24.260 --> 00:12:28.220 service consumption has actually improved user experiences and 228 00:12:28.220 --> 00:12:29.630 accessibility for citizens? 229 00:12:29.000 --> 00:12:34.130 Lester Godsey: You know, I think it's interesting in that, from 230 00:12:34.130 --> 00:12:39.140 my perspective, the drive was twofold. One, it's the 231 00:12:39.170 --> 00:12:43.910 current user experience that everybody has with the Amazons 232 00:12:43.910 --> 00:12:47.780 of the world or other modern services. And so I think the 233 00:12:47.780 --> 00:12:51.650 logical extension has been, hey, if these companies can do it, why 234 00:12:51.650 --> 00:12:54.890 is interfacing with government so difficult or not user 235 00:12:54.890 --> 00:12:58.160 friendly, right? But the other thing, at least in Maricopa 236 00:12:58.160 --> 00:13:01.580 County, that we experienced, that really accelerated the adoption 237 00:13:01.640 --> 00:13:07.790 of digital services for us, was the pandemic. And so 238 00:13:08.330 --> 00:13:12.860 in some instances, we had a bunch of traditional on-prem services 239 00:13:12.860 --> 00:13:15.290 where the only way you could consume that service was actually 240 00:13:15.290 --> 00:13:18.740 coming in person. Obviously, that choice was removed from 241 00:13:18.740 --> 00:13:24.380 them when the pandemic was in full tilt. And so we had to get 242 00:13:24.380 --> 00:13:28.430 creative along those lines. But then, by extension, really, 243 00:13:28.430 --> 00:13:32.810 what's occurred is, we're seeing a constant feedback loop with 244 00:13:32.840 --> 00:13:37.880 our residents, our constituents, who are consuming those services, 245 00:13:37.910 --> 00:13:40.520 providing us feedback about what they like or what they don't 246 00:13:40.520 --> 00:13:44.150 like. And it's really forcing us to reprioritize, as an 247 00:13:44.150 --> 00:13:46.880 organization, what that experience is. 248 00:13:48.440 --> 00:13:50.300 Tom Field: Lester, you share a challenge that your private 249 00:13:50.300 --> 00:13:54.680 sector colleagues share as well. What have you done to ensure 250 00:13:54.680 --> 00:13:58.130 that these user-friendly interfaces everybody needs and 251 00:13:58.130 --> 00:14:01.880 you want to provide don't also compromise your security 252 00:14:01.880 --> 00:14:04.760 standards, which everybody needs you to provide? 253 00:14:05.800 --> 00:14:08.470 Lester Godsey: Yeah, you know, so that's a great question.
And 254 00:14:08.470 --> 00:14:11.620 so going back to my initial statement, especially with some 255 00:14:11.620 --> 00:14:16.480 of the older services that were traditionally provided on-prem, 256 00:14:16.480 --> 00:14:20.710 right, we had to get in front of the organization, 257 00:14:20.710 --> 00:14:26.500 frankly, to really have a frank and honest conversation around 258 00:14:26.500 --> 00:14:29.950 risk. Because in a lot of instances, departments, all they 259 00:14:29.950 --> 00:14:33.190 wanted to do was expose those internal applications and make 260 00:14:33.190 --> 00:14:37.330 them accessible to the internet, which obviously is a red flag. 261 00:14:37.330 --> 00:14:40.240 So we had to get in front of them and work with them to come up 262 00:14:40.240 --> 00:14:43.120 with reasonable compromises that would meet their business needs 263 00:14:43.120 --> 00:14:46.120 from a timing perspective, you know, implementing things 264 00:14:46.120 --> 00:14:49.720 like web application firewalls to mitigate the risk of exposing 265 00:14:49.720 --> 00:14:53.650 those applications that, frankly, weren't developed initially for 266 00:14:53.650 --> 00:14:57.280 consumption via the internet. They were really designed to be 267 00:14:57.310 --> 00:15:01.660 provided internally within our network, on-prem, for walk-up 268 00:15:01.660 --> 00:15:05.410 customers, right? And so that was quite a challenge. But then, 269 00:15:05.410 --> 00:15:09.760 moving forward along those lines, obviously, we've seen the 270 00:15:09.760 --> 00:15:13.300 accelerated adoption of cloud, you know, cloud-based services, 271 00:15:13.300 --> 00:15:15.640 things of that sort, whether it's private cloud or 272 00:15:15.640 --> 00:15:20.470 third-party hosted. And so one of the things we wound up having to do 273 00:15:20.500 --> 00:15:24.010 was make concerted efforts to provide tools to the 274 00:15:24.010 --> 00:15:28.510 organization. I'm sure Jeff can speak to this at the state 275 00:15:28.510 --> 00:15:33.040 level, but in Maricopa County, we are highly decentralized from an 276 00:15:33.040 --> 00:15:37.600 IT organization perspective. So we have multiple IT shops within 277 00:15:37.630 --> 00:15:40.780 the county that have their own application development efforts, 278 00:15:40.780 --> 00:15:45.910 their own standards, etc. So we really provided tools by way 279 00:15:45.910 --> 00:15:49.690 of static and dynamic code analysis, so that, as those 280 00:15:49.690 --> 00:15:54.250 departments were developing those interfaces to improve the 281 00:15:54.250 --> 00:15:58.030 user experience, they could do so in a secure fashion. But 282 00:15:58.030 --> 00:16:00.910 it wasn't just the technology tools; it was also training. 283 00:16:01.300 --> 00:16:06.130 So we brought in third-party trainers to come in and train 284 00:16:06.130 --> 00:16:11.320 the entire organization on what practices you should adopt, you 285 00:16:11.320 --> 00:16:15.040 know, for a secure or a mature DevSecOps approach, if you will, 286 00:16:15.040 --> 00:16:17.890 to application development, interface development, things of 287 00:16:17.890 --> 00:16:21.130 that sort. So it was really training, technology and, 288 00:16:21.130 --> 00:16:24.880 frankly, processes: ensuring that we worked with our constituent 289 00:16:24.880 --> 00:16:27.910 departments within Maricopa County, and that they knew what 290 00:16:27.910 --> 00:16:30.760 the risks were with their particular solution.
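A toy illustration of the web application firewall idea Lester mentions, written as WSGI middleware that screens requests before they reach a legacy app never built for the internet. The blocked patterns are illustrative only; a real deployment would rely on a managed WAF ruleset, not hand-rolled regexes.

```python
# Minimal WAF-style request filter in front of a legacy app (sketch).
import re
from wsgiref.simple_server import make_server

BLOCKED = [
    re.compile(r"\.\./"),               # path traversal probe
    re.compile(r"(?i)<script"),         # naive XSS probe
    re.compile(r"(?i)union\s+select"),  # naive SQL injection probe
]

class WafMiddleware:
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        # Inspect path and query string before the legacy app sees them.
        probe = environ.get("PATH_INFO", "") + "?" + environ.get("QUERY_STRING", "")
        if any(p.search(probe) for p in BLOCKED):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Request blocked"]
        return self.app(environ, start_response)

def legacy_app(environ, start_response):
    # Stand-in for an internal application designed for walk-up customers.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from the legacy app"]

if __name__ == "__main__":
    make_server("127.0.0.1", 8080, WafMiddleware(legacy_app)).serve_forever()
```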
And what 291 00:16:30.760 --> 00:16:33.550 can we provide to mitigate that risk to an acceptable level? 292 00:16:34.470 --> 00:16:37.530 Tom Field: Parallel to security, easier service consumption does 293 00:16:37.530 --> 00:16:40.650 often involve streamlined authentication processes as 294 00:16:40.650 --> 00:16:43.620 well. Again, how do you strike the balance between user 295 00:16:43.620 --> 00:16:46.560 convenience and robust identity verification? 296 00:16:47.440 --> 00:16:51.130 Lester Godsey: So that's a great challenge. And again, with government, 297 00:16:51.130 --> 00:16:54.550 in a lot of instances, especially when 298 00:16:54.550 --> 00:16:58.240 you're at the county or a state level (obviously, the federal 299 00:16:58.240 --> 00:17:03.970 government has its own unique set of challenges), having, I 300 00:17:03.970 --> 00:17:07.390 guess, a common identity framework that all parties 301 00:17:07.420 --> 00:17:11.050 agree to is critical. And so the more you can accomplish that, the 302 00:17:11.050 --> 00:17:15.430 more successful you are, by creating a more ubiquitous 303 00:17:15.670 --> 00:17:18.700 experience from an end-user perspective while at the 304 00:17:18.700 --> 00:17:23.320 same time ensuring an acceptable level of security. And so in 305 00:17:23.350 --> 00:17:26.350 Maricopa County, what we're doing is we're in the second 306 00:17:26.350 --> 00:17:30.220 phase of our IAM strategy, our identity and access 307 00:17:30.220 --> 00:17:33.520 management. And so we had brought in a third party to kind 308 00:17:33.520 --> 00:17:37.780 of assess the lay of the land and maturity around identity. From 309 00:17:37.780 --> 00:17:40.510 an internal perspective, we identified a number of 310 00:17:40.510 --> 00:17:43.600 shortfalls and gaps within our organization, namely in the 311 00:17:43.600 --> 00:17:46.840 form of not having an authoritative source of 312 00:17:46.840 --> 00:17:49.690 information around certain internal identity types, 313 00:17:49.720 --> 00:17:53.140 specifically contractors, volunteers and interns. And so 314 00:17:53.140 --> 00:17:57.550 we were able to work with the organization to ensure that our 315 00:17:57.550 --> 00:18:02.470 migration to our new HRM system addressed that. And so in the 316 00:18:02.470 --> 00:18:06.130 second phase that we're in, we're actively partnering with 317 00:18:06.130 --> 00:18:09.220 those other departments. In our instance, it's the elected 318 00:18:09.220 --> 00:18:13.840 departments, so our sheriff, our recorder, our treasurer, 319 00:18:13.840 --> 00:18:18.160 etc. And we're working on establishing an identity framework, 320 00:18:18.160 --> 00:18:23.230 specifically around external or citizen identity, to ensure that 321 00:18:23.530 --> 00:18:27.100 our residents, to the best of our ability, only have one identity 322 00:18:27.100 --> 00:18:30.760 that they need to consume all county services. And so from 323 00:18:30.760 --> 00:18:34.690 their perspective, it makes for a better user experience. And 324 00:18:34.690 --> 00:18:37.810 then from a cybersecurity perspective, having that single 325 00:18:37.900 --> 00:18:42.460 centralized identity source makes things more secure.
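A minimal sketch of that single-centralized-identity idea: one signing authority issues a resident token, and every county service verifies the same token instead of keeping its own account store. The names and the toy HMAC scheme are hypothetical; a production system would use a standard protocol such as OIDC or SAML.

```python
# Toy centralized identity: one issuer, many consuming services.
import base64
import hashlib
import hmac
import json
from typing import Optional

SIGNING_KEY = b"demo-key-not-for-production"  # placeholder secret

def issue_token(resident_id: str) -> str:
    """Central authority signs a claim about the resident."""
    payload = base64.urlsafe_b64encode(json.dumps({"sub": resident_id}).encode())
    sig = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + sig

def verify_token(token: str) -> Optional[str]:
    """Any county service validates the same token the same way."""
    payload, _, sig = token.partition(".")
    expected = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or forged token
    return json.loads(base64.urlsafe_b64decode(payload))["sub"]

# The assessor, treasurer and recorder all trust one identity source,
# so the resident manages one identity instead of one per department.
token = issue_token("resident-12345")
assert verify_token(token) == "resident-12345"
assert verify_token(token + "tampered") is None
```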
It 326 00:18:43.030 --> 00:18:47.410 lessens the complexity of having to manage multiple identities, 327 00:18:47.680 --> 00:18:52.300 and then you can look at things like, of those identities, how many of 328 00:18:52.300 --> 00:18:55.270 them are tokenized for payment services or things of that 329 00:18:55.270 --> 00:19:00.880 sort. So by virtue of being able to centralize and consolidate, 330 00:19:00.910 --> 00:19:04.180 it actually kills two birds with one stone: it makes the user 331 00:19:04.180 --> 00:19:07.360 experience better, because they only have to manage a smaller 332 00:19:07.360 --> 00:19:11.590 handful of identities, ideally just one. And at the same time, 333 00:19:11.590 --> 00:19:13.000 it makes things more secure. 334 00:19:13.930 --> 00:19:15.700 Tom Field: Let's get to this point. Have there been instances 335 00:19:15.700 --> 00:19:18.580 where compromised identities were a concern? If so, how did 336 00:19:18.580 --> 00:19:19.150 you address them? 337 00:19:20.380 --> 00:19:25.240 Lester Godsey: So compromised identities are always a concern 338 00:19:25.240 --> 00:19:29.020 for any organization, especially from an internal perspective. 339 00:19:30.250 --> 00:19:36.550 We've had our fair share along those lines, and so, not to pull 340 00:19:36.550 --> 00:19:41.380 out kind of a tired industry term, but the concept of a zero 341 00:19:41.380 --> 00:19:45.400 trust network does come into play, right? All of us on 342 00:19:45.400 --> 00:19:48.340 this call know that zero trust has been around conceptually for 343 00:19:48.340 --> 00:19:53.740 decades. It's kind of a new coat of paint, if you will, right? 344 00:19:53.770 --> 00:19:57.700 But this is an example, and again, Maricopa County is well 345 00:19:57.700 --> 00:20:01.390 on its way on this journey toward a zero trust 346 00:20:01.390 --> 00:20:06.010 environment. So this is where, with compromised accounts, the 347 00:20:06.010 --> 00:20:09.130 impact of that from a cybersecurity perspective is 348 00:20:09.130 --> 00:20:13.240 minimized, in that if your default position as an 349 00:20:13.240 --> 00:20:18.700 organization is you're not given access to more than you need, 350 00:20:19.210 --> 00:20:23.410 then even if your identity is compromised, your ability to 351 00:20:23.440 --> 00:20:27.370 negatively impact the organization is inherently lessened, right? 352 00:20:28.180 --> 00:20:32.410 What we're finding is we have our normal challenges, just like everybody 353 00:20:32.410 --> 00:20:35.620 else, and I'm sure Connecticut has the same challenges, with, you 354 00:20:35.620 --> 00:20:37.990 know, credential-harvesting websites where they get your 355 00:20:37.990 --> 00:20:40.900 username and password. But if you have things like multifactor 356 00:20:40.900 --> 00:20:45.670 authentication in place, and/or if you are limited in terms of 357 00:20:45.670 --> 00:20:48.760 what that identity can actually access within your environment, 358 00:20:48.760 --> 00:20:52.720 then you mitigate that risk. We've seen that; we have our 359 00:20:52.720 --> 00:20:55.480 internal processes for how we address that, how we look for 360 00:20:55.480 --> 00:20:59.890 that. The other thing I would also mention, that might be a 361 00:20:59.890 --> 00:21:04.360 little bit more of a variation, is it's not just internal 362 00:21:04.360 --> 00:21:08.350 identities, but it's also our external-facing identity.
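A minimal sketch of the deny-by-default, least-privilege position Lester describes above: an identity gets only what was explicitly granted, and only after multifactor authentication, so a harvested credential can do limited damage. The identities, permissions and MFA flag are hypothetical; real environments enforce this in an IAM or policy engine rather than application code.

```python
# Deny-by-default authorization check (illustrative only).
GRANTS = {
    # identity -> permissions explicitly granted; everything else is denied
    "clerk-042": {"records:read"},
    "admin-007": {"records:read", "records:write"},
}

def is_allowed(identity: str, permission: str, mfa_passed: bool) -> bool:
    """Default position: no access. Grant only what was explicitly
    given, and only after multifactor authentication succeeds."""
    if not mfa_passed:
        return False
    return permission in GRANTS.get(identity, set())

# Even if "clerk-042" is compromised via a credential-harvesting site,
# the account cannot write records or touch anything un-granted.
assert is_allowed("clerk-042", "records:read", mfa_passed=True)
assert not is_allowed("clerk-042", "records:write", mfa_passed=True)
assert not is_allowed("clerk-042", "records:read", mfa_passed=False)
assert not is_allowed("unknown-user", "records:read", mfa_passed=True)
```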
So 363 00:21:08.350 --> 00:21:12.730 we've had situations where, for example, somebody in our 364 00:21:12.730 --> 00:21:16.600 county manager's office, somebody tried stealing their 365 00:21:16.600 --> 00:21:20.380 identity on social media and was impersonating them. And so 366 00:21:20.560 --> 00:21:24.430 we've had to come up with some unusual or unique internal 367 00:21:24.430 --> 00:21:28.840 processes to monitor VIPs, our elected individuals, our county 368 00:21:28.840 --> 00:21:33.190 management, to see if their identities on the various 369 00:21:33.190 --> 00:21:35.860 different social media platforms have been compromised as well, 370 00:21:36.070 --> 00:21:38.110 and also that of the organization, too. 371 00:21:39.370 --> 00:21:41.500 Tom Field: Very well said, Lester. Thank you so much. Stick 372 00:21:41.500 --> 00:21:43.330 around; we've got more questions for you. Meanwhile, I want to 373 00:21:43.330 --> 00:21:45.580 turn this back to my colleague, Anna, please. 374 00:21:45.910 --> 00:21:48.280 Anna Delaney: Thanks so much. We'll bring the team back for 375 00:21:48.310 --> 00:21:51.640 this final question. Jeff, maybe starting with you, I'd like to 376 00:21:51.640 --> 00:21:55.690 look at emerging technology. So how does your state approach the 377 00:21:55.690 --> 00:21:58.810 cybersecurity challenges posed by emerging technologies such as 378 00:21:58.810 --> 00:22:01.630 AI, IoT and cloud computing? 379 00:22:01.620 --> 00:22:03.745 Jeff Brown: Yeah, great question. And every state is 380 00:22:03.798 --> 00:22:07.145 taking a different approach on this. We do keep an eye on 381 00:22:07.198 --> 00:22:10.651 what other states are doing, but then we do our own thing. So one 382 00:22:10.704 --> 00:22:13.892 thing that we decided to focus on was AI technology especially. I 383 00:22:13.945 --> 00:22:17.080 think the promise of AI is very compelling for a lot of use 384 00:22:17.133 --> 00:22:20.267 cases. But on the other hand, with new technology comes new 385 00:22:20.320 --> 00:22:23.561 risk. The State of Connecticut has taken a slightly different 386 00:22:23.614 --> 00:22:26.908 approach than some states, which have chosen to either outright 387 00:22:26.961 --> 00:22:30.043 ban it or put a moratorium in place. We've actually passed 388 00:22:30.096 --> 00:22:33.283 legislation already. It's kind of first-pass legislation; we 389 00:22:33.336 --> 00:22:36.365 expect more to come in the future. But we really, largely 390 00:22:36.418 --> 00:22:39.180 embrace these kinds of technologies, and then try to 391 00:22:39.234 --> 00:22:42.474 figure out where we put the guardrails on them. Right? Like, 392 00:22:42.527 --> 00:22:45.821 there are risks. And we do need to acknowledge those risks and 393 00:22:45.874 --> 00:22:49.009 make sure that we're being transparent, make sure that, you 394 00:22:49.062 --> 00:22:52.143 know, we're making the right decisions, that humans are 395 00:22:52.197 --> 00:22:55.544 involved where they should be involved, and all of those kinds 396 00:22:55.597 --> 00:22:58.891 of things. But I mean, you know, we're kidding ourselves if we 397 00:22:58.944 --> 00:23:02.185 think that nobody's going to use this. We need to embrace the 398 00:23:02.238 --> 00:23:05.585 fact that they are going to use it, and that we're working with 399 00:23:05.638 --> 00:23:08.879 them to help them understand, like, here are some of the risks that 400 00:23:08.932 --> 00:23:11.694 can happen.
And that's everything from getting a bad 401 00:23:11.747 --> 00:23:14.988 answer that you assume is a good answer, which is, I think, a 402 00:23:15.041 --> 00:23:18.123 pretty common issue with generative AI, for sure, you 403 00:23:18.176 --> 00:23:21.310 know, but also just making ethical decisions, transparency, 404 00:23:21.363 --> 00:23:24.551 prompt injection, all of that kind of stuff. And we also have 405 00:23:24.604 --> 00:23:28.004 to acknowledge that, while they are spending so much time trying 406 00:23:28.057 --> 00:23:31.458 to make all of the public models ethical, there's a great number 407 00:23:31.511 --> 00:23:34.805 of people, you know, FraudGPT and others, that are 408 00:23:34.858 --> 00:23:38.258 training models specifically to attack. You know, that's why it 409 00:23:38.311 --> 00:23:41.339 becomes a very untenable position for us to take to say, 410 00:23:41.392 --> 00:23:44.474 like, well, we're just going to stay away from all things 411 00:23:44.527 --> 00:23:47.768 artificial intelligence. The benefit's too great; we can't do 412 00:23:47.821 --> 00:23:50.796 that, you know. But we do have to really get in front of 413 00:23:50.849 --> 00:23:53.665 things, keep the lines of communication open with our 414 00:23:53.718 --> 00:23:56.906 employees, make sure that they know to raise questions, make 415 00:23:56.959 --> 00:24:00.253 sure that they know not to put sensitive data into these kinds 416 00:24:00.306 --> 00:24:03.600 of systems. And then, really, I think where we're heading with 417 00:24:03.653 --> 00:24:07.000 this is bespoke models, where we can train on our own data in 418 00:24:07.053 --> 00:24:10.453 private. And just imagine a scenario where you go to a state 419 00:24:10.506 --> 00:24:13.853 portal, and you say, like, I'm not really sure where to go, but 420 00:24:13.906 --> 00:24:17.200 I'm looking for a license, I'm looking for whatever. And these 421 00:24:17.253 --> 00:24:20.388 chatbots can work 24/7. So I mean, this is an integral part 422 00:24:20.441 --> 00:24:23.682 of digital government, right? Like, you know, for citizens to 423 00:24:23.735 --> 00:24:27.082 be able to self-serve in the way that they want to, you can use 424 00:24:27.135 --> 00:24:30.482 this technology to help us. But we also have to acknowledge that 425 00:24:30.535 --> 00:24:32.820 our attackers are going to leverage it too. 426 00:24:34.510 --> 00:24:38.020 Anna Delaney: Lester, we'd love your thoughts. How is Maricopa County 427 00:24:38.020 --> 00:24:40.300 keeping up with this rapidly evolving tech? 428 00:24:41.380 --> 00:24:44.140 Lester Godsey: Yes, you know, so Maricopa County has taken a 429 00:24:44.140 --> 00:24:46.870 similar approach to Jeff's in the State of Connecticut with 430 00:24:46.870 --> 00:24:52.060 regard to not trying to outright ban the use of it. And 431 00:24:52.060 --> 00:24:54.880 I think part of it, too, is Pandora's box has already been 432 00:24:54.880 --> 00:24:57.100 opened; it's not like you're going to put that back in there. 433 00:24:57.100 --> 00:25:01.990 So I think, frankly, the sooner that government agencies accept 434 00:25:01.990 --> 00:25:04.030 that reality, the better off they'll be, and they'll be 435 00:25:04.030 --> 00:25:09.880 focused on the right things. You know, in our instance, we were 436 00:25:09.880 --> 00:25:14.110 absolutely expecting the worst. So we're going into the 2024 437 00:25:14.110 --> 00:25:18.430 election cycle.
As you know, Maricopa County has been at the 438 00:25:18.430 --> 00:25:21.070 epicenter of mis-, dis-, and malinformation along those 439 00:25:21.070 --> 00:25:26.290 lines. And so in previous election cycles, we've seen, you 440 00:25:26.290 --> 00:25:31.360 know, fake images used against us to spread mis-, dis-, and 441 00:25:31.360 --> 00:25:35.260 malinformation. We're fully expecting the use of deepfakes; we've already 442 00:25:35.260 --> 00:25:38.860 seen them used on a limited basis against the 443 00:25:38.860 --> 00:25:42.940 county, and they've all been really poor. We fully suspect that this 444 00:25:43.180 --> 00:25:46.510 upcoming cycle, we'll see significantly higher-quality 445 00:25:46.510 --> 00:25:51.040 deepfake videos used against us. We're preparing, and we're 446 00:25:51.040 --> 00:25:54.580 making the assumption that generative AI is going to be 447 00:25:54.580 --> 00:25:58.660 used. And we don't know for sure if it's being used against us, 448 00:25:58.660 --> 00:26:02.920 but the quality of the phishing emails that we're seeing is 449 00:26:02.920 --> 00:26:06.070 significantly higher than it has been in the past. So we're 450 00:26:06.070 --> 00:26:09.250 making the assumption that generative AI is being leveraged 451 00:26:09.250 --> 00:26:14.980 in that capacity to help deliver a more compelling phishing 452 00:26:14.980 --> 00:26:20.890 campaign against us along those lines. And honestly, 453 00:26:20.920 --> 00:26:26.410 from a cloud computing perspective, we're seeing the 454 00:26:26.410 --> 00:26:29.830 attack surface change as well. And so with government 455 00:26:29.830 --> 00:26:33.700 agencies, you know, really accelerating the adoption of 456 00:26:33.700 --> 00:26:38.200 cloud services, that might be, in some instances, at the cost of 457 00:26:38.200 --> 00:26:40.270 doing your due diligence to ensure you have the proper 458 00:26:40.270 --> 00:26:43.780 security controls in place. So a lot of that, at least in our 459 00:26:43.780 --> 00:26:47.860 instance, is doing the basics well. So things like asset 460 00:26:47.860 --> 00:26:53.260 management and vulnerability management: now there's a wider 461 00:26:53.260 --> 00:26:57.280 area to cover, not just on-prem or within your internal network, 462 00:26:57.490 --> 00:27:01.090 but that also applies to third-party cloud providers, 463 00:27:01.090 --> 00:27:03.280 right? So what assets do you have in the cloud that you're 464 00:27:03.280 --> 00:27:07.000 consuming? In some instances, your insight might be less 465 00:27:07.000 --> 00:27:09.250 because you don't own the infrastructure along those 466 00:27:09.250 --> 00:27:14.470 lines. So how do you ensure that? One of my biggest fears is 467 00:27:15.400 --> 00:27:20.260 poorly written or poorly secured APIs, right? And so that might 468 00:27:20.260 --> 00:27:23.410 be an instance where threat actors, instead of trying to 469 00:27:23.530 --> 00:27:27.550 hack, you know, Amazon or Microsoft, right, they're 470 00:27:27.550 --> 00:27:31.510 looking for things like, you know, poorly secured S3 471 00:27:31.510 --> 00:27:34.900 buckets, or they're going to look at the APIs that all 472 00:27:34.900 --> 00:27:39.100 organizations are leveraging to extract data and send data to 473 00:27:39.100 --> 00:27:42.730 and from, right, within the respective environments.
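One small example of "doing the basics well" for the cloud attack surface Lester describes: flag any S3 bucket that does not fully block public access. This sketch assumes boto3 and configured AWS credentials, and is an illustration of the hygiene check, not Maricopa County's actual tooling.

```python
# Flag S3 buckets whose public access is not fully blocked (sketch).
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        cfg = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        # All four settings (BlockPublicAcls, IgnorePublicAcls,
        # BlockPublicPolicy, RestrictPublicBuckets) must be True.
        fully_blocked = all(cfg.values())
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            fully_blocked = False  # no block configured at all
        else:
            raise
    if not fully_blocked:
        print(f"review: {name} does not fully block public access")
```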
And so 474 00:27:42.730 --> 00:27:46.300 those are the things that we're looking at and trying to take 475 00:27:46.750 --> 00:27:49.030 steps proactively to address. 476 00:27:50.350 --> 00:27:53.080 Anna Delaney: We've got more questions now than when we 477 00:27:53.080 --> 00:27:56.350 started. We certainly want to follow up on APIs and election 478 00:27:56.350 --> 00:27:58.630 security, but in the interest of time, we've got to leave it 479 00:27:58.630 --> 00:28:03.040 there. Jeff and Lester, we're so grateful to you for the 480 00:28:03.070 --> 00:28:06.250 education and insight you've provided us and our audience 481 00:28:06.250 --> 00:28:07.120 today. So thank you. 482 00:28:07.540 --> 00:28:08.410 Tom Field: Thank you so much. 483 00:28:11.010 --> 00:28:13.350 Anna Delaney: Thanks so much for watching. Until next time.