Anna Delaney: Welcome to the ISMG Editors' Panel. I'm Anna Delaney, and this week we're delving into a significant cybersecurity discovery, exploring the implications of a backdoor found in a critical Linux utility. Also, a CISO's comprehensive approach to cloud security, and key insights shared at the Cybersecurity for Critical Assets Summit in Houston last week. Joining me today are Tom Field, senior vice president of editorial; Suparna Goswami, associate editor at ISMG Asia; and Mathew Schwartz, executive editor of DataBreachToday and Europe. Really good to see you all.

Tom Field: Great to be seen.

Mathew Schwartz: Happy spring!

Anna Delaney: Happy spring, indeed.

Tom Field: With an asterisk.

Anna Delaney: With an asterisk, yeah, exactly. More on that soon. So Suparna, start us off with that beautiful sky behind you.

Suparna Goswami: Oh, yes. This was where I was last week: a small city in the north of India called Rishikesh, in the Himalayan foothills beside the river Ganges. The city is actually renowned for being a center for studying yoga and meditation, but I did neither of these. I just went there, sat beside the river, and gorged on some good street food.

Anna Delaney: Blissful!
Tom, you mentioned that asterisk, so?

Tom Field: Yeah, I'm afraid this is where I spent last week as well, but it wasn't quite so scenic as Suparna's spot. As you can see, we've had a bit of an ice storm up here in northern New England. And it continues, even though we're in the first week of April. I'm sitting here now looking at tomorrow: is it going to be three inches of snow, is it going to be 12 inches, is it going to be 24 inches? No determining right now. But there's going to be some "spring is coming." I understand that. But not quite yet.

Anna Delaney: It's quite artistic though, I've got to say, as a shot.

Tom Field: It's also one of my windows. When you are stuck inside, you look for lots of creative endeavors.

Anna Delaney: Yes, indeed. Well, Mat, you found a rainbow?

Mathew Schwartz: Like Tom, I was seeking creative endeavors in the everyday and close by. This is Perth Road in Dundee. I was walking the other day and it was raining, as it does in Scotland. And then it cleared, and there was this lovely little rainbow. So I thought, Anna would like that.

Anna Delaney: I like that. Well, I'd like it very much because I've got the colors behind me as well.
I know Easter has come and gone, but I'm just sharing a glimpse of what we got up to over the weekend as a family. It's a bit of a tradition at our house: getting creative over Easter and playing Easter egg games.

Tom Field: Very nice.

Anna Delaney: Well, Mat, let's start with your story, because it's a big one. A security researcher has uncovered a backdoor in a critical Linux utility, as I mentioned in the intro, believed to have been deployed by nation-state attackers aiming to gain full remote access. Now, we know that the vulnerability was identified and mitigated before widespread exploitation. But it is a huge story, as I said, in terms of what the implications might have been had it not been detected. So why don't you just talk us through what happened?

Mathew Schwartz: Yeah, we really dodged a big bullet with this one. This could have easily been worse than SolarWinds, security experts are saying, because of just how far-reaching Linux is these days. So we're talking about a piece of code called XZ Utils. This is a compression and decompression tool that is built into pretty much every distribution, every major distribution anyway, of Linux. Thankfully, this malicious code in XZ Utils got discovered by a developer at Microsoft.
He noticed that the utility was doing some funky stuff: it was taking too long, and there was just some weirdness. So, to his credit, he dug in. And thanks to him looking into this and posting to an open-source mailing list that there was a backdoor in it, we averted disaster. It was probably just a few weeks away from working its way into major Linux distributions, like Red Hat Enterprise Linux, and just being everywhere. Teardowns of this code are still underway, and they're complicated somewhat by the fact that this appears to have been a really sophisticated effort, which is what has a lot of people saying "nation-state attack." And there are some other reasons for that as well that we can touch on in just a moment. Researchers aren't even exactly clear how this would have worked. It looks like code was altered in XZ Utils so that, at a later date, via SSHD, a remote attacker could introduce code to the system and have it run. But they were looking to get this capability so embedded that it probably wouldn't even have been noticed for a while. And it would have allowed attackers to execute any code of their choosing. This would have been incredibly powerful. So where are we?
Where we're at is: the person who was developing this appears to be blameless. Like a lot of critical parts of today's open-source software ecosystem, there was one person maintaining this very useful and widely used tool, and this person has other stuff on their plate. Maintaining this was a hobby. What was interesting about this is that it appears to have been a very patient nation-state attacker. We're deducing here, but what appears to have happened was that a couple of years ago, someone started saying to this code maintainer: oh, I really need this feature, I really need this functionality. How come you're not spending more time updating this code? You really should look at this; this isn't fair. So, turning up the heat a little bit, psychologically speaking. After a little while, through a confluence of events, lo and behold, this miracle developer shows up, and he's like: hey, I've got some free time, and I see all this flak that you're getting for not maintaining this code. Let me introduce you to another friend of mine who's actually really a great developer; he can help you out. So there's this setup, where they socially engineer the person who has the rights to maintain the code.
And he ends up sharing the code maintenance capability with this miracle developer who has been parachuted in. And that is where it appears that the backdoor code was inserted. So again, there are a lot of things that we don't know about the attack. What we do know: it's been stopped. The backdoored code did get into some beta versions, or some rolling development versions. And people have flagged what those are and said: please downgrade to what we know to be safe. The guy who was maintaining the code, the legitimate guy, Lasse is his name, has said: look, I'm digging into everything that's happened. Use the safe version for now. I will put out a newer version that we know is safe. And I'm sure you can expect everyone to have a really close look at that, for obvious reasons. But that is forthcoming. There are a lot of teardowns, like I was saying, still happening of the backdoored code. There's a lot we don't know. But the long and the short of it is: thankfully, this was discovered. It raises some really big questions about whether any other code has been backdoored and we just don't know it. There is a huge attack surface here.
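[Editor's note: The "please downgrade to what we know to be safe" advice referred to specific releases not named in the discussion: the backdoor shipped in XZ Utils 5.6.0 and 5.6.1, tracked as CVE-2024-3094. A minimal first-pass check is simply comparing the installed version string against those two releases; the sketch below hardcodes an example string in place of live `xz --version` output so it is self-contained.]

```shell
# On a real system, capture this with: version="$(xz --version | head -n1)"
# Hardcoded example string here so the check runs anywhere.
version="xz (XZ Utils) 5.6.1"

# 5.6.0 and 5.6.1 are the releases known to carry the backdoor (CVE-2024-3094).
case "$version" in
  *5.6.0*|*5.6.1*) echo "known-backdoored release: follow your distribution's advisory" ;;
  *)               echo "not one of the known-backdoored releases" ;;
esac
```

[A version check like this is only a first signal; distributions published their own advisories, and it does not audit what a backdoored build may already have done.]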
This isn't the first time we've seen people attempt to mess with open-source components, and it won't be the last. It raises big, provocative questions about whether anything is going to be done to fix this sort of thing. It's going to take time, attention, funding, all those sorts of things that we don't seem to see in abundance, unfortunately, with a lot of open-source software.

Anna Delaney: I think the most interesting part of it is how the backdoor actually got there in the first place, which you explained very well there. How do you think this incident affects trust in open-source software? And what are you hearing in terms of solutions? You mentioned resources there, but what can we do in terms of securing this open-source world?

Mathew Schwartz: We've seen some efforts in the past, when massive vulnerabilities have been found in critical open-source components. The Linux Foundation has some funding that it has been putting into components. But I don't even personally understand the scope of the problem here. You have a compression and decompression utility in Linux that people were able to subvert in such a way that they could have made Linux do whatever they wanted at any point in the future.
With all these critical components, I don't even think we have a full understanding of the supply chain risk here. So hopefully this is going to prompt a lot of probing questions. But this isn't the first time this has happened, and it's not like the floodgates have been opened and all these open-source components are getting the time, attention and love that they need to help prevent this.

Tom Field: Think back to Log4j, Mat: even a year ago, 25% to 30% of new downloads of Log4j were the vulnerable version. So, a prediction: we're going to hear a lot of moaning, we're going to see some hand-wringing, but I don't know that we're going to see any significant changes. I don't know what significant changes could be imposed. And we're going to wait for the next incident.

Mathew Schwartz: Yep. And it's going to happen. Yep, I agree.

Anna Delaney: Any advice to organizations right now?

Mathew Schwartz: That's a good one. I mean, kudos to the Microsoft developer who dug in and found this. I think we're still waiting to see what takeaways, if any, we have here, except that we need to show more love to the open-source ecosystem.

Tom Field: It is open source; it's maintained by volunteers.
As Mat says, there are too few hands on it, too few eyes watching it, and too many potential vulnerabilities. This is another incident just waiting to happen.

Suparna Goswami: Tom said too few people are watching it; there are too many people using it as well. I mean, masses of people just swear by open source.

Anna Delaney: Now, Suparna, you've recently interviewed MatchMove's CISO about his comprehensive approach to enhancing cloud security. And I know that you covered a range of issues, including compliance challenges and bridging skill gaps through training and automation. Just tell us about it.

Suparna Goswami: Yes, the CISO was one of the award winners at the DCISO awards for cloud security. So before I begin, I must say that the DCISO Awards not only recognize the best from the industry; there is also so much to learn from these case studies.
For example, the interview I will be speaking about today is with Samrat Bhatt, who happens to be the CISO of MatchMove, which is a fintech company here in India. He won the award for cloud security, but he likes to call his project an integrated security transformation project, because he believes that unless you take care of all the other things, you can't just secure the cloud. So you have to take care of other things as well, like endpoint security and automation; everything needs to be done. Before he started, MatchMove was a startup, relatively a small company when he joined. So there were a lot of areas that he identified that needed immediate attention. One was, of course, because it's part of the financial industry, compliance with regulatory standards; data protection was a top challenge too. Then there were various issues in identity and access management, vendor risk management, incident response preparation and cloud governance, which needed continuous monitoring, and DevSecOps. So these were some of the priorities. And of course, one of the critical challenges that he faced, in fact he said that for him it was the most challenging part, was the lack of cloud security skills, both among the IT staff as well as the security staff.
So of course, now that he had to implement cloud security, he just couldn't focus on that alone; he had to have a concrete plan which took into account all these things. That's why he named his project integrated security transformation. And for him, an integrated approach meant combining training programs, automation tools and the implementation of advanced security solutions, specifically cloud-native solutions. Then, of course, he took care of the endpoints and targeted awareness initiatives; that was majorly something that he focused on. And he divided the project into four quarters. When I spoke with him, and you'll hear it in the interview, which should be published this week, he explained it very nicely; I'll just give a brief of what he said. The first quarter focused on uplifting IT skills, because he said that unless and until those skills were imparted, he couldn't go ahead with the project. An interesting thing that I would like to mention here was that he trained his IT staff to have knowledge of the cloud as well, because he believed that going forward, IT admins will need to have knowledge of the cloud. So he tackled the industry-wide skill shortage by focusing on training and development.
And he specifically had a lot of vendor-supported training; every week, there were vendor-supported training sessions. He emphasized both technical training as well as the concepts of information security. And needless to say, he had to automate a lot of tasks to manage the skill gap and enhance operational efficiency; I'll tell you later by what percentage operational efficiency actually improved. Plus, another thing that he mentioned, which I found interesting, was that he wanted the IT and security teams to work closely. So he organized sessions every week where the IT team would teach the security team about specific tools in IT, and vice versa, so they could both work together. That way, if one person was absent, because it's not a huge team that he had, the others could, for the time being, take care of things. So that was how the first quarter was spent. Then, in the second quarter, he focused on automation. He automated DLP monitoring, exception monitoring and vulnerability analysis. So all these things were automated. And in the third quarter, he implemented next-generation security solutions; he went for the endpoints first.
And here, what he mentioned was that he held regular meetings with the C-suite, the business and the developers, to let them know where the threats were coming from and what exactly the challenges in the endpoints were. He said that it was very, very important to let all of them know: this is a threat, and this is what it will result in, so that they take all these things very seriously. And finally, the fourth quarter was about the cloud-native security solutions; it was basically on AWS. So this is how he said the project was deployed. Of course, the interview has the details. But what particularly impressed me was that though I initially approached him only for an interview on cloud security, the way he explained how he approached the entire project caught my attention. And that shows that equal attention needs to be paid to everything; if you're just securing the cloud, it will not help. He had to secure everything to ensure that the final product was successful.

Anna Delaney: Did you discuss the benefits to the business? I'm just curious to know, what were the tangible and intangible benefits to the business, and how does Samrat measure these?
Suparna Goswami: Well, of course. So like I said, operational efficiency: he said it led to a 20% reduction in manual processes, and it improved ROI by at least 30%. And intangible benefits, of course, there were a lot. He said there was a culture shift toward security. He said employee confidence increased as far as security is concerned, along with agility and adaptability. Then, of course, it enhanced the reputation also. And the cybersecurity score rating, which was in the 60s and 70s when he joined (late 60s, early 70s), was above 97-98 out of 100 by the end of the project. So there was quite a bit of ... and that is probably what impressed the juries as well when he won the award: that in one year there was a massive shift in the overall culture of the organization. So it was a lovely conversation that I had with him. It was pretty interesting.

Anna Delaney: Great. Well, thanks for that, Suparna. We look forward to watching it soon. Tom, you've just returned from Houston, where you were busy interviewing speakers and attendees at the Cybersecurity for Critical Assets Summit. What were the key trends and takeaways that stood out for you?

Tom Field: But first, a commercial, because OT is the word of the day: operational technology.
And I want to make sure we make this announcement to our audience here: just last week, it was announced at the Cybersecurity for Critical Assets event that we have launched our newest media property, OT.today. So you can look that up: OT.today. It is the latest media property from ISMG, and it is focused entirely on operational technology. Now, some backdrop for our conversation today. Maybe two weeks ago, we as an editorial group were visited by Dawn Cappelli from Dragos, who heads up the OT-CERT. And she shared some information with us that I think would be shocking to almost anybody, talking about small public utilities where the person who is in charge of cybersecurity is also in charge of cutting the lawn. And I hear this consistently talking with CIOs and CISOs: oil and gas, chemical and manufacturing facilities that all have that one or two critical systems that are dependent upon Windows 95 and long, long-outdated technology. It's as frightening as the picture that Mat painted about open source and the software supply chain. So, a great focus on OT. We went to this event in Houston, which was hosted by our new acquisition QG, entirely on cybersecurity in the OT space.
I spent time there; we put up a pop-up video studio where I was able to interview about a dozen security leaders: people in IT, people in OT, people that are in both disciplines, talking about a lot of the issues that are at the heart of being able to secure these critical systems. I wanted to share some of these today, but the videos aren't ready; we'll have them up on our site probably within the next several days. But among the topics we discussed was lifecycle management for these legacy systems: how you approach that, how you get funding for that, and how you make this a business risk that the organization has got to pay attention to. We talked about building and maintaining cybersecurity programs for OT, where this hasn't been a long-standing discipline as it has been with IT. We talked about workforce management: getting the right skills, and being able to develop the people you need to manage and maintain these systems. So there's a lot of fundamental work going on to create and maintain these security programs for systems that really are at the heart of our critical infrastructure in so many ways.
And of course, the challenge with all of these is: how do you address cybersecurity issues with such resilience and such quick response that you don't take down production cycles or interrupt a utility? You know, these are systems that can't come down for any reason. How do you maintain them well without having to take them down? So, lots of topics. I'm excited about our new property, the opportunity to talk to more thought leaders and to create compelling content and educational opportunities. Because, you know, as much as I have been talking about the software supply chain for the last couple of years, as Mat has, and have concerns about that, the concerns are very much paralleled in operational technology. I'm glad that we have this new property where we can talk about this.

Anna Delaney: Lots in there. Did your interviewees propose any insights or solutions for overcoming these cybersecurity challenges in critical asset environments?

Tom Field: You know, I guess, one: acknowledging there's an issue is step one. I hate to make this sound like a 12-step program, but it almost is. We have to acknowledge there is an issue. And that acknowledgment has to come from all aspects of the organization.
Having done that, we've got to build cybersecurity programs with legs and with teeth, programs that can be sustained and can bridge the cultural gaps between IT and OT. You know, this is a space where the people that own this technology have been on their own for many years, and they are suspicious of people coming in and trying to offer new solutions that might impact their production cycle. So there's a lot to be done here. We spoke to one leader in particular, a CISO that has been in IT security for decades; now OT is part of his remit, and there's a big educational curve there for him to learn more about this. I think that's not particularly unusual; I don't think he's necessarily the exception. So there are lots of discussions to have. It's nice that they're having them, and this was a well-attended event. I'm glad that we're giving them the forum to move forward here. But there's a lot of work to be done, quickly.

Anna Delaney: Well, we can't wait to watch your interviews. They're coming up, I hope, in the next couple of weeks.

Tom Field: Look forward to it.
366 00:22:44.950 --> 00:22:47.140 Anna Delaney: And finally, just for fun: if there were a 367 00:22:47.140 --> 00:22:50.020 book club for cybersecurity enthusiasts, and I'm sure there 368 00:22:50.020 --> 00:22:53.950 is, what non-technical book should be on their reading list, 369 00:22:53.980 --> 00:22:59.230 and why? Get creative with this. Tom, I know you're going 370 00:22:59.230 --> 00:23:00.250 to share something with us. 372 00:23:00.240 --> 00:23:03.167 Tom Field: I am. This is actually the book that I have 373 00:23:03.235 --> 00:23:07.592 given away more copies of in my life than any other book: Strunk 374 00:23:07.660 --> 00:23:11.609 & White's The Elements of Style. To me, it is the best book on 375 00:23:11.677 --> 00:23:15.490 writing that has ever been prepared. And for people that 376 00:23:15.558 --> 00:23:19.302 have got to communicate via email, via text, via video, 377 00:23:19.371 --> 00:23:23.387 however they're communicating, there is nothing better than 378 00:23:23.455 --> 00:23:27.745 spending the 45 to 50 minutes it takes to go through this book 379 00:23:27.813 --> 00:23:32.102 and learn just the fundamentals of communication, starting 380 00:23:32.170 --> 00:23:35.983 and ending with: eliminate unnecessary words. As a writer 381 00:23:36.051 --> 00:23:40.136 and editor, I have given away scores of copies of this book, 382 00:23:40.204 --> 00:23:44.220 and I would recommend it for anybody that is working in the 383 00:23:44.289 --> 00:23:46.740 cybersecurity and technology fields. 371 00:23:00.000 --> 00:23:50.550 Anna Delaney: Now we know why you're so good at communicating, 384 00:23:50.550 --> 00:23:55.530 Tom. I'll have to read it. Great suggestion. Suparna, go for it. 385 00:23:56.610 --> 00:23:57.960 Suparna Goswami: I thought, you know, the first thing that 386 00:23:57.960 --> 00:24:01.590 struck me, when you said it can be fiction as well, was 387 00:24:01.590 --> 00:24:06.270 1984 by George Orwell.
So the novel's overall theme is 388 00:24:06.270 --> 00:24:09.450 surveillance, government control, lack of 389 00:24:09.450 --> 00:24:13.020 privacy. It's a thought-provoking read, so I thought it would be a 390 00:24:13.050 --> 00:24:15.240 good one for cybersecurity professionals. 391 00:24:16.140 --> 00:24:16.650 Anna Delaney: Perfect. 392 00:24:17.070 --> 00:24:17.460 Tom Field: Good choice. 393 00:24:18.450 --> 00:24:18.840 Anna Delaney: Mat? 394 00:24:19.800 --> 00:24:23.940 Mathew Schwartz: Wow, yeah, from authoritarian regimes to 395 00:24:24.150 --> 00:24:28.890 writing best practices. I'm going to go with a book by Edward Tufte, The 396 00:24:28.890 --> 00:24:35.730 Visual Display of Quantitative Information. Where Tom is 397 00:24:35.730 --> 00:24:38.310 talking about how to write well, this is talking 398 00:24:38.310 --> 00:24:42.960 about how to consume visual information well, and also how 399 00:24:42.960 --> 00:24:46.500 to design it as well. So one of the famous examples in Tufte's 400 00:24:46.530 --> 00:24:51.000 book, and I'm not sure which way you pronounce his name, is the 401 00:24:51.030 --> 00:24:56.160 cholera map prepared in 1854 by Dr. John Snow, where he looked 402 00:24:56.160 --> 00:25:00.330 at cholera deaths in London and visualized those on a map with 403 00:25:00.330 --> 00:25:05.610 dots. And then he plotted with crosses where the pumps were. 404 00:25:05.910 --> 00:25:12.540 And there was a very immediate, visual way of figuring out what 405 00:25:12.540 --> 00:25:15.540 had happened, what was causing these cholera deaths.
And 406 00:25:15.540 --> 00:25:21.090 there are numerous examples he brings up to highlight how 407 00:25:21.660 --> 00:25:26.910 displaying information visually can really help communicate in a 408 00:25:26.910 --> 00:25:32.460 way that nothing else can. You know, with an apology to Strunk 409 00:25:32.460 --> 00:25:37.290 and White, sometimes the visual display is the most useful 410 00:25:37.290 --> 00:25:39.180 display. So I would suggest that. 411 00:25:39.570 --> 00:25:42.060 Tom Field: Brilliant book. I've read it as well, Mat. Love 412 00:25:42.960 --> 00:25:43.020 it. 413 00:25:43.290 --> 00:25:45.870 Mathew Schwartz: Yeah. Hard to pick just one that you love. They're 414 00:25:45.870 --> 00:25:46.410 all great. 415 00:25:47.710 --> 00:25:49.990 Anna Delaney: Excellent choice. And next time you're all in 416 00:25:49.990 --> 00:25:54.190 London, we'll go to the John Snow pub in Soho, named in 417 00:25:54.190 --> 00:25:58.630 his honor. Well, I'm going to turn to master 418 00:25:58.630 --> 00:26:02.140 William Shakespeare, because I think his works are a compendium 419 00:26:02.170 --> 00:26:05.530 of human emotions and vulnerabilities. And in 420 00:26:05.530 --> 00:26:07.900 particular, I'm going to recommend his play Othello, 421 00:26:08.350 --> 00:26:12.820 which explores themes of manipulation and deception and 422 00:26:13.030 --> 00:26:17.230 the destructive potential of unchecked jealousy. 423 00:26:17.380 --> 00:26:21.040 And I think we can all agree that these are themes that 424 00:26:21.190 --> 00:26:23.950 resonate, perhaps, with cybersecurity professionals as 425 00:26:23.950 --> 00:26:26.710 they deal with social engineering and misinformation 426 00:26:26.710 --> 00:26:36.190 and the human factor in cyberthreats every day. Well, 427 00:26:36.190 --> 00:26:38.350 thank you so much, everybody. This has been excellent, 428 00:26:38.350 --> 00:26:39.850 informative, and fun.
429 00:26:39.880 --> 00:26:40.480 Suparna Goswami: Thank you, Anna. 430 00:26:41.500 --> 00:26:42.220 Mathew Schwartz: Thanks. 431 00:26:42.370 --> 00:26:44.260 Anna Delaney: Thanks so much for watching. Until next time.