Anna Delaney: Hello and welcome to the ISMG Editors' Panel. I'm Anna Delaney, and here we discuss and analyze the week's top cybersecurity stories. Joining me this week - they need no introduction, but I'm going to do it anyway - Tom Field, senior vice president of editorial; Suparna Goswami, associate editor at ISMG Asia; and Mathew Schwartz, executive editor of DataBreachToday & Europe. Lovely to see you all.

Tom Field: Lovely to be seen.

Suparna Goswami: Always a pleasure.

Mathew Schwartz: Great to be here.

Anna Delaney: Suparna, tell us where you are. That looks intriguing.

Suparna Goswami: Oh, the background is Africa. We are having our Africa Summit next week, the second one in as many years in the region. So, virtually, I'm on the continent this week and next week. We are having the summit on September 22, with some great speakers lined up, and hopefully we'll see you all virtually there.

Anna Delaney: Absolutely, we will be, and hopefully we'll be discussing this later. Tom, what a sight.

Tom Field: Different kind of jungle - not quite Africa, but Manhattan. I'm in town this week: last night I hosted a live event, and today I'm hosting a virtual event about SOC modernization. And it is good to be back in Manhattan again, although, as I told you earlier, there's a difference between visiting Manhattan today and when I did in my early 20s. I would come through Times Square then and be stopped by people trying to sell me marijuana. I go through Times Square now and I'm stopped by clouds of people smoking illegally.

Anna Delaney: I can only imagine the smell. And Mathew, that looks rather pleasant.

Mathew Schwartz: It was extremely pleasant. This is Stockholm, where I was last week to moderate a roundtable on securing the open-source software supply chain. It was a beautiful few days, and I was very lucky to get out for a bit of a walk and check out the wonderful seafront. So much of Stockholm is on the water, and it's just such a beautiful city.

Anna Delaney: Well, it's lovely. I thought I'd share something royal this week, being the week that it is.
This is the Royal Opera House in London, which the late Queen was patron of, and I presume the new King will take over the role. It has stunning interiors, if you ever get the chance to visit. Tom, I believe you recently had the pleasure of interviewing Ron Green, who is the chief security officer at Mastercard, and it's an excellent conversation. What did you take away from it?

Tom Field: Yeah, he visited our Government Cybersecurity Summit in Washington, D.C. not too long ago. And what's impressive is that Ron Green's a busy man: he's the chief security officer of Mastercard, formerly with the Secret Service, and he had to go to a congressional hearing that day. But he made time to come over to our event because he wanted to be on stage for a discussion of public and private partnerships. And he wanted to sit down with me and talk about that before he went to his congressional hearing. So this is something very much on his mind. And it's timely. The notion of public-private partnerships is not new - it's something we've talked about for years - but I think the urgency of it really has stepped up, particularly this year, because we're seeing much faster, broader attack cycles than we've ever seen before. The adversaries are weaponizing zero-days faster than we've ever dealt with. And the old notion of threat intelligence, in terms of "this is what we've been seeing," doesn't work anymore. The rearview mirror is not important; it's the windshield in front of you. And we need this intelligence of what's happening now, what's being seen now, so that organizations in the public and the private sectors are able to respond accordingly. So he came, and we spoke about that. And I asked him about the notion of "Are we getting better at this?" In the discussions I have, I hear people from the private sector saying that they're a lot more open to information sharing than they had been in the past. In the past, it had been, "I'm open to what you want to share with me, but don't ask for anything from me." I think that's changing. And to some degree, Ron validated that. So, if you don't mind, I'd like to share a clip of the discussion that we had together in Washington, D.C.

Ron Green: I've been at this a long time.
So I have to say, yes, across the years, I've seen lots of positive movement, both from the government side - for example, the work that they've done to declassify things so quickly in response to the Eastern European issues - but even beyond that. An agency like the Secret Service has provided us information that could compromise their investigation, but it gives us an opportunity to protect other companies. The example I'm thinking about is - I think you understand what a cash-out attack is, where bad guys get in and then they pull money out of the ATMs. The Secret Service had an ongoing investigation, and they came to us and identified the bank that was involved. And we were able to put in safeguards to prevent the loss. And so that's just another example of the government being willing to even risk its own - the things it's trying to achieve - in order to see the right thing done. On the private sector side, I think you have a lot of companies that know, "We can better protect the sector if we're more open in our sharing." I think you do have organizations that might still be somewhat resistant; they're still afraid of, "Hey, if I give this information, either to CISA or a law enforcement agency, it's going to end up in my regulators' hands, and they are going to use that against me and fine me or, you know, cause me some other issue." I think there's a lot of work that's taking place to try and ease those concerns with reporting, so we'll have to see how that comes along. But I think all of us should look at the benefit that we get by reporting and engaging with law enforcement partners or CISA earlier, rather than after something bad has happened.

Tom Field: There, now look at that - I'm impressed for a change. I'm not wearing the same jacket.

Anna Delaney: That's great. It's interesting to see how the conversation is evolving and even maturing. But you speak to practitioners every day. What do they vent about when it comes to forming and maintaining these partnerships?

Tom Field: Honestly, the conversation isn't about venting anymore.
There really used to be the notion that if we give something up, it's going to be used against us; somehow, it's going to get to our regulator, it's going to come up in an audit finding, and we're ultimately going to be penalized for whatever we share. That's gone. There's this acceptance now. It even came up in the conversation I had yesterday in the virtual roundtable, about how quickly zero-days are being weaponized now. It used to be that a zero-day could be announced, and it could be two months or more down the road before you started to see it actualized in the wild. Then it became maybe a couple of days - I'm thinking of the time of the Apache Struts breach of Equifax; that was a couple of days, I think, before you started to see it. Things have gone down to hours and minutes in some cases now. And so reality has seeped in. There's no opportunity for venting here: you've got to be able to share what you're seeing with people who can respond, and to receive that information, so you have any kind of a chance to respond to the speed and the scale of these attacks.

Mathew Schwartz: You're hearing a much greater awareness, I think, on the part of the U.S. government now as well. It used to be "We're the FBI, we're here to help you." And now there's a bit more consensus building, I think. They're trying to give something to get something. In Sweden, it was very interesting to see that - a lot of respect at the event that I was at, people listening to each other. And I think there's a much greater willingness with CISA, for example, to ask, "Are we doing this the right way? Are there things we could be doing better?" And there's a greater awareness that with things like ransomware, you've got to collaborate, you've got to work together. And so, I know it might sound a little loosey-goosey, but I think there's some goodwill helping that happen now in a way that maybe it hasn't been happening before.

Tom Field: You make such a good point. The tone at the top is so important. And I give Jen Easterly a lot of credit for that, and how she's out in the community. She's part of the community. She's from the community and spreading the word.
But Chris Krebs was just as active before her and did a terrific job in setting this tone that I think we all benefit from now.

Anna Delaney: Well said. This is great, and it's great to see how the conversation is moving forward. Suparna, speaking of conversations moving forward, you've been working hard with your colleagues putting together this fantastic event next week, the Africa Cybersecurity Summit. Could you tell us about it?

Suparna Goswami: Oh, so this time, we made sure that we have representation from some more countries other than South Africa. We have speakers from Uganda, Nigeria, South Africa, as well as Kenya - some great speakers lined up. We have a keynote from Professor Snail, who is a consultant with the Information Regulator in South Africa. He sets the context and speaks about the cybersecurity landscape in Africa: What are the kinds of crimes that are happening? Where is the market headed? And we have a session - of course, the themes are more or less the same across the globe - on Industry 4.0 and how best we can protect the data. Here, I thought, why not get somebody from Malaysia and have him chat with a CIO in Africa? So I got the CEO of CyberSecurity Malaysia, Dr. Wahab, and he has a chat with the CIO of NSIA Insurance, which is one of the big insurance companies in Nigeria, and they talk about IT-OT convergence. Dr. Wahab spoke about how the machines are not really designed with security in mind, but more from a functionality point of view, and how best we can address these issues. So, he said, convergence of IT and OT workers is important: they need to sit together more often, the management needs to clearly communicate the goals of IT and OT convergence, there need to be good objectives, and both groups need to accept those objectives, agree to them, and understand the integration. Another session which I'm really looking forward to is a panel where we have Julius Torach, who is commissioner for information technology at the Ministry of Information and Communication Technology in the Ugandan government, and Varsha Sewlal, who is from the railway safety regulator in South Africa - again, a public sector government organization.
And she's also the conference chair for the summit. In this panel, we discuss whether Africa should have a common cybersecurity policy, and whether it is practical to have one. The common challenge that came across - they have been talking about this for the past few years - is that every country is at a different level of cybersecurity maturity. You have South Africa, which is slightly ahead of, say, Uganda or Nigeria. And up until now, it has not worked out, since none of the governments have really made an effort. There have been small groups that have been formed, and, you know, there have been conventions, but somehow it has not really worked out. So we'll hear in this session how best it can be done and what some practical steps are to achieve a common cybersecurity policy across Africa. Of course, we have sessions on zero trust - we can't really ignore that - zero trust on cloud, with CISOs talking about that, as well as mobile applications. We all know that Africa is big on mobile adoption, so this session speaks on how best we can balance privacy as well as security, since mobile is a very private device: how best can the security team implement policies there that don't seem very intrusive? Another interesting topic, of course, is third-party risk - as I said, some of the topics are very global in nature, and we will, of course, find them in this region as well. Yeah, it's a lovely panel. They speak about ... and one topic is on how security needs to be an enabler for business. So it's more about how CIOs and CISOs can really work together and make security ... as well as the CIO and the IT team working together for the betterment of the business. So it's ... where are the gaps? How best can these be addressed? Here, the speaker, Nastassja Finnegan from FirstRand Bank, speaks about what she has done in her organization - the steps she's taking so that both the CIO and the CISO are collaborating. So yes, some great speakers lined up, some good sessions. Hopefully, it turns out to be a good summit.

Anna Delaney: Sounds incredible.
So, was there a particular theme that stood out for you as being very different from the other regions of the world? Because you conduct panels all over the world ...

Suparna Goswami: No one particular theme. As I said, more or less, these topics are very global in nature. But yes, I am really looking forward to that particular panel on having a common policy, because it came across in two or three of the sessions - whether Africa should have a common cybersecurity policy, much like the GDPR in Europe, and whether it is practical to have one that other countries can look up to. It's a topic that three or four sessions touched upon, but I thought, let's have a session on this, since we have been hearing about it. So that is one particular session I'm really looking forward to.

Anna Delaney: Well, we look forward to it. Great work, Suparna. Matt, I think it's time for some ransomware updates. What's been happening?

Mathew Schwartz: I hope I'm not getting too predictable with the constant ransomware updates, but there's so much innovation happening around these attacks that it's really interesting to track, and to see what is coming out of the minds of these crazy guys who are involved in these ransomware gangs. So, a couple of interesting things to highlight. One is that someone's been disrupting a lot of ransomware groups' operations. Specifically, we've seen the likes of Everest, Hive, Quantum, Ragnar Locker, Snatch, Vice Society and LockBit having their data leak sites get disrupted by DDoS attacks. So that begs the question: who is doing the disrupting? Nobody's taking credit for it. It might be nice to think this is some coordinated international government or military smackdown, finally coming to bring disruption en masse to the scourge on society that is ransomware. Unfortunately, I suspect it's disaffected rivals - teenagers and early 20-somethings involved in these ransomware groups - basically trying to smack each other down.
You see all this kind of, again, teenage, adolescent soap opera-type stuff when these different groups denigrate each other online. And there's this undercurrent, if you will, in a certain strata of Internet users that you have seen before with gaming sites, where people will DDoS the gaming sites just for the lulz, just to cause disruption and be annoying. So I suspect that's what's happening. But it is one of those grab-the-popcorn moments, interesting to watch, because the data leak sites, where the groups attempt to name and shame victims to try to force them to pay a ransom, have been disrupted. Unfortunately, what we're not seeing get disrupted are the sites that they're using to communicate directly with victims. I guess that's maybe an upside if you're a victim and you do make the business decision to pay. So that's not been disrupted. Also apparently not disrupted are the portals used by ransomware groups' business associates - their affiliates - who will download the crypto-locking malware and infect victims with it. Apparently, they can still get access to those portals. For whoever's DDoSing the sites, it's very easy to find out where the data leak site is, but a lot of times the sites for victims and the portals being used by affiliates are not well known. Or maybe they're known to intelligence agencies and law enforcement, and historically, they have not disrupted them. I think if they can get visibility into those, they will probably use it to try to build intelligence. So we've seen these DDoS disruptions. And we've seen, as always, ransomware groups attempting to shift the narrative: "Oh, we're not the victims here. We're the masterminds." In the case of LockBit, for example: "Well, if you're going to hit us with a DDoS attack, maybe we'll hit you with a DDoS attack." Like I said, an adolescent-grade response, where they're saying, "Well, maybe we'll add DDoS to the arsenal that we bring against our victims." Other groups have already done this before; it doesn't seem to have stuck so much. So that's the DDoS side. One other interesting thing to highlight is a rise in the use of intermittent or partial encryption.
But it's 302 00:17:44.620 --> 00:17:48.760 being used by groups to big themselves up a little bit. 303 00:17:49.060 --> 00:17:52.240 Ransomware operations compete with each other, as I was just 304 00:17:52.240 --> 00:17:55.000 indicating, and one of the ways they try to differentiate 305 00:17:55.000 --> 00:17:58.630 themselves is via their technical acumen. And so what 306 00:17:58.630 --> 00:18:02.290 some groups have been doing is saying that they can encrypt 307 00:18:02.350 --> 00:18:07.090 victims faster. So if you're an affiliate, this has upsides, the 308 00:18:07.090 --> 00:18:11.170 faster you can encrypt a victim, the less the likelihood that 309 00:18:11.170 --> 00:18:13.750 they will see it or even if they see it, that they will be able 310 00:18:13.750 --> 00:18:18.460 to meaningfully stop it. So we have this technique, it's 311 00:18:18.460 --> 00:18:20.620 called, like I said, intermittent or partial 312 00:18:20.620 --> 00:18:24.580 encryption, where, it turns out, if you want to encrypt files, 313 00:18:24.670 --> 00:18:29.080 you don't necessarily need to encrypt the entire file, you 314 00:18:29.080 --> 00:18:33.100 just need to encrypt parts of it. And by encrypting parts of 315 00:18:33.100 --> 00:18:36.730 it, you can make it unusable, thus still potentially driving 316 00:18:36.730 --> 00:18:40.990 the victim to have to pay for a decrypter. For example, if 317 00:18:40.990 --> 00:18:43.870 there's a 50 gigabyte file, apparently, if you use these 318 00:18:43.870 --> 00:18:47.440 tactics, you can encrypt it in two minutes less than it would 319 00:18:47.440 --> 00:18:51.010 otherwise take. If you think about a bunch of servers, a 320 00:18:51.010 --> 00:18:55.840 bunch of systems, this could lead to a big savings in the 321 00:18:55.840 --> 00:19:01.150 time that it takes to hit a victim. So we've seen Conti 322 00:19:01.150 --> 00:19:04.690 spin-offs advertising this capability. Black Pasta, for 323 00:19:04.720 --> 00:19:09.250 example, also BlackCat are offering this, as are some other 324 00:19:09.280 --> 00:19:12.610 new ransomware groups. I don't know if it's going to take off, 325 00:19:12.790 --> 00:19:16.480 if everyone's going to do it. There are some ways of combating 326 00:19:16.480 --> 00:19:19.660 this. I spoke with security experts and they say that these 327 00:19:19.660 --> 00:19:22.600 tactics attempt to make a file look like it hasn't been 328 00:19:22.600 --> 00:19:26.650 encrypted. But they are accessing certain fast-read 329 00:19:27.700 --> 00:19:31.720 techniques in the operating system. So, if anti-malware 330 00:19:31.750 --> 00:19:35.350 isn't already looking for this, I suspect that we'll see it 331 00:19:35.350 --> 00:19:39.490 being coded to watch out for this type of activity, because 332 00:19:39.490 --> 00:19:42.910 there are some definite tells, some definite red flags that 333 00:19:42.910 --> 00:19:46.150 come along with this. But I highlight it because it's 334 00:19:46.150 --> 00:19:49.510 interesting to see how ransomware groups continue to 335 00:19:49.510 --> 00:19:53.260 innovate. You have the marketing side of things, like we're going 336 00:19:53.260 --> 00:19:57.130 to run DDoS attacks against those who dare to DDoS us, and 337 00:19:57.130 --> 00:19:59.740 you have the technology side of things where they're trying to 338 00:19:59.740 --> 00:20:03.340 make the attacks faster and more effective in order to get more 339 00:20:03.340 --> 00:20:05.110 people who want to work with them. 
Anna Delaney: Fascinating, Matt. So, does this change anything for the defenders? What should they be doing differently when it comes to partial encryption?

Mathew Schwartz: Ask your anti-malware provider if they can spot signs of a partial encryption attack at work. That would be one of my main takeaways here. With a lot of this technical-level stuff, there are, indeed, defenses that can be brought to bear. And so, I think it's good to be aware of how you might get hit, good to know what this kind of attack might look like. For example, it might leave a file partially readable, and you might be thinking, why is this happening? Well, partial encryption by a ransomware gang - that's possibly the answer. So be aware, I think, is the big thing here. And it'll be interesting to see whether we see a rise in these types of attacks or not.

Tom Field: I did appreciate the conversation I moderated last night. It was about ransomware defense, and there were a couple of big topics. One was the desire of many of the participants to start tokenizing their data - the idea of making data worthless to anyone who might get a hold of it. The other conversation was about why we don't just prohibit the paying of any kind of ransom. And interestingly, on the panel was a member of the former Cyberspace Solarium Commission. He said he argued against banning ransomware payments on the grounds that you don't want to criminalize someone who's just been a victim.

Mathew Schwartz: Absolutely, there are so many unintended consequences. If you're a healthcare organization, for example, and you need to operate and you can't get the records - I mean, there are a lot of horrible outcomes. And sometimes you think, "Oh, Hollywood-style sort of stuff." But there are so many unintended consequences that can happen. I think it really does need to be made a business decision. Hopefully, more people will get their defenses in order. But we can't expect that no one's ever going to get hit, and that it won't cause some sort of public health or national security or other type of crisis.

Suparna Goswami: This is such a gray area. There's no black and white, right?
I mean, to each his business, because it's a 379 00:22:07.330 --> 00:22:10.360 business decision at the end of the day, like Matt said, it's a 380 00:22:10.360 --> 00:22:13.360 very gray area. You can't really say it's the wrong or the right 381 00:22:13.360 --> 00:22:13.660 thing. 382 00:22:15.040 --> 00:22:17.500 Mathew Schwartz: You need to be very careful about, as you said, 383 00:22:17.770 --> 00:22:21.040 black and white sorts of approaches to technical matters. 384 00:22:21.130 --> 00:22:24.310 There's often some nuance you hadn't considered which could be 385 00:22:24.670 --> 00:22:25.780 horrifying. 386 00:22:26.980 --> 00:22:28.660 Anna Delaney: The debate continues. Thank you, Matt. 387 00:22:29.050 --> 00:22:32.470 Final question for you all, who is on the top of your interview 388 00:22:32.470 --> 00:22:36.430 dream? So someone you haven't interviewed in the industry. And 389 00:22:36.430 --> 00:22:37.030 you'd like to. 390 00:22:37.810 --> 00:22:40.900 Tom Field: Reminds me early in my career, I had the opportunity 391 00:22:40.900 --> 00:22:43.420 to work for James Russell Wiggins, who was the former 392 00:22:43.810 --> 00:22:46.060 editor-in-chief of The Washington Post. He stepped down 393 00:22:46.060 --> 00:22:49.150 and Ben Bradlee came in. And someone asked him, someone from 394 00:22:49.150 --> 00:22:51.940 Time magazine interviewed and said, "If you could interview 395 00:22:51.940 --> 00:22:54.640 anyone in history, who would it be?" He looks straight at the 396 00:22:54.640 --> 00:23:01.450 interviewer, he said, "God! No, really Thomas Jefferson." I 397 00:23:01.450 --> 00:23:04.720 would say I would like to interview, if God is not on the 398 00:23:04.720 --> 00:23:08.290 list, I'd like to talk to Angus King, senator from my state of 399 00:23:08.290 --> 00:23:11.860 Maine. He was the head of the Cyberspace Solarium Commission. 400 00:23:11.860 --> 00:23:14.410 And things have changed considerably in a couple of 401 00:23:14.410 --> 00:23:17.380 years since the Commission issued its report. I'd like to 402 00:23:17.380 --> 00:23:20.020 catch up with him. The last time I spoke with him, he was the 403 00:23:20.020 --> 00:23:23.110 first congressional leader I have spoken to. They could talk 404 00:23:23.110 --> 00:23:26.260 about cybersecurity without having first had a briefing and 405 00:23:26.260 --> 00:23:28.840 refer to those briefings notes. He knew what he was talking 406 00:23:28.840 --> 00:23:30.280 about. Now, let's speak to him again. 407 00:23:31.270 --> 00:23:33.400 Anna Delaney: Brilliant. Suparna? 408 00:23:35.130 --> 00:23:37.560 Suparna Goswami: Yes, I'm not somebody from the industry. But 409 00:23:37.560 --> 00:23:42.000 I thought of interviewing our prime minister and asking him 410 00:23:42.000 --> 00:23:45.300 when will the data prediction will be out, considering that it 411 00:23:45.300 --> 00:23:49.050 was just rejected by so many government? I mean, the 412 00:23:49.050 --> 00:23:54.600 government decided to revisit it again, present a new bill. They 413 00:23:54.600 --> 00:23:57.720 have not presented a new bill, but work on it completely in a 414 00:23:57.720 --> 00:24:01.560 new fashion. So I just want to ask him, when is the country 415 00:24:01.560 --> 00:24:04.860 going to be out with the privacy bill? The law we have been 416 00:24:04.860 --> 00:24:09.630 working on it since 2018. So it's high time and everybody is 417 00:24:09.630 --> 00:24:13.050 looking here at India, asking when's the bill going to be out? 
So I think we need an answer for that, and I'd love to have an interview with him on that particular topic.

Anna Delaney: I'll definitely watch that, Suparna. Great choice. Mathew?

Mathew Schwartz: I'm going to go for a twofer. We've mentioned Jen Easterly, the director of CISA; I've not had the pleasure of interviewing her before. And I'd also like to interview her counterpart here in Britain - Lindy Cameron, the head of the National Cyber Security Centre. I'd love to get both of them together on a panel, actually, and just have the two of them talking about top challenges, top approaches, and just their perspective. It's so fascinating to speak with government cybersecurity leaders and to hear their perspective, because they see things that obviously would horrify us - we wouldn't be able to sleep, I'm sure - and just to get their perspective on how things are unfolding, what they're keeping track of, what they're advocating. It's just great to get that perspective. So I have a twofer there on my wish list.

Anna Delaney: Absolutely. I had Lindy Cameron as well, so there we go - we'll have to fight over her, Mathew. But my plan B is a bit like the God one, Tom: it would be Alan Turing. I'd love his opinion - of course, he's the father of computer science. What would he think about today's state of affairs in cybersecurity?

Tom Field: Or talk to a newly elevated princess and find out which of them is more cybersecurity savvy?

Anna Delaney: Yes, very true. Next time. So, thank you very much, everyone - Mathew, Tom, Suparna. This has been a pleasure, as always.

Tom Field: Thanks so much.

Suparna Goswami: Thank you.

Anna Delaney: Thanks so much for watching. Until next time.