Anna Delaney: Hello, I'm Anna Delaney, and welcome back to the weekly edition of the ISMG Editors' Panel, where I'm joined by fellow colleagues on the ISMG editorial team to evaluate and chew over the top cybersecurity news stories. Delighted this week to be joined by Tony Morbin, executive news editor for the EU; editorial director Cal Harrison; and consultant editor Akshaya Asokan. Cal, you're looking pretty in the flowers. Tell us where you are.

Cal Harrison: Just in South Carolina. This is actually a camellia bush that's right outside my deck, and it's in full flower. I think it's neat because it's sort of the last gasp of summer, before things get all dark and gray.

Anna Delaney: It's started happening here, I can tell you that. Tony, I recognize some British comedy.

Tony Morbin: Yeah, I want to talk about not blaming the customer, and I thought about what I could do to illustrate that. So I've come up with Basil Fawlty, which will make sense to anybody who knows it. And if you don't, it's well worth a look.

Anna Delaney: We can't wait. Akshaya, a beautiful scene, a beautiful sky. Tell us more.

Akshaya Asokan: So this is close to where I live - where I stay. It's like a landmark in the town. And it's just, you know, beautiful during sunset. It just turns this cold pink and colorful.

Anna Delaney: Yeah, beautiful. And I am joining you from the rooftops of Paris. Unfortunately, not in real time. But hey, next time, hopefully. Now, there was an interesting announcement from the DOJ this week related to a crime committed 10 years ago. Tell us about it.

Cal Harrison: Yeah, just a huge amount of money involved. Federal law enforcement finally caught up with a Georgia property developer, a fellow named Jimmy Zhong, 10 years after he stole 50,000 bitcoin from Silk Road, the infamous cybercrime darknet site. He pleaded guilty on Monday in federal court and forfeited the bitcoin, cash and 80% of his investments. At the time of the raid - this is really interesting - the stolen cryptocurrency was worth about $3.36 billion. Today, thanks to market volatility, it's down to about $1 billion.
The Justice Department says this was the second-largest federal seizure of cryptocurrency, the largest being the $4.5 billion seized from Bitfinex, which was announced just earlier this year. All of this money is going to be retained by the government because it was stolen from Silk Road administrator Ross Ulbricht, also known as the Dread Pirate Roberts - which, by the way, is a character in my favorite movie, the best movie of all time, The Princess Bride, if you know it. Ulbricht is serving life in prison for his involvement in that darknet market.

The interesting thing about this hack was that Zhong figured out how to game Silk Road's system by making deposits and withdrawals within milliseconds. For example, he could turn one $500 deposit into $2,500 in withdrawals within a second. He hid the funds in various wallets and stored the information in an underground safe and on a single-board computer that was hidden in a popcorn tin underneath a pile of clothes in his closet. And I would suppose, if there is any lesson to be learned for the cybercriminals from this one, it is to find a better hiding place for your stolen crypto.
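[The flaw Cal describes is, in essence, a classic check-then-debit race condition. The Python sketch below is entirely hypothetical - not Silk Road's actual code, with made-up names and amounts - and shows how a withdrawal handler that checks a balance and debits it in two non-atomic steps can pay out a single $500 deposit several times over when hit by near-simultaneous requests:]

```python
import threading
import time

class NaiveWallet:
    """Toy wallet with a deliberate check-then-debit race condition."""

    def __init__(self, balance: int):
        self.balance = balance  # no lock guards the check/debit pair below

    def withdraw(self, amount: int) -> bool:
        if self.balance >= amount:   # step 1: check the balance
            time.sleep(0.01)         # window in which rivals still see the old balance
            self.balance -= amount   # step 2: debit it
            return True
        return False

wallet = NaiveWallet(balance=500)    # one $500 deposit
paid_out = []

def rapid_withdrawal():
    if wallet.withdraw(500):
        paid_out.append(500)

# Five near-simultaneous withdrawal requests, a la "within milliseconds"
threads = [threading.Thread(target=rapid_withdrawal) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# All five balance checks pass before any debit lands, so $2,500 leaves
# the wallet and the balance goes negative.
print(f"deposited $500, paid out ${sum(paid_out)}, balance now {wallet.balance}")
```

[The standard fix is to make the check and the debit a single atomic operation - a lock, or a database transaction - which is presumably what Silk Road's accounting failed to do.]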
Anna Delaney: So Cal, where would you hide $3 billion worth of stolen crypto?

Cal Harrison: Yeah, I've actually given it some thought. I would maybe put it on my desk, because I can't ever seem to find anything on it.

Anna Delaney: There's always the fear of hiding it in a place which is so good that you forget where it is. But Cal, what's the significance of this seizure from a law enforcement perspective?

Cal Harrison: That's an interesting question. We talked to Ari Redbord, who is an ISMG contributor, a former prosecutor and head of legal and government affairs at TRM Labs, and he says the case is a real testament to the power of blockchain and how it helps investigators unravel this type of crime.

Ari Redbord: It's a really interesting case for a number of reasons. One, the conduct occurred in 2012, which is really significant. That was a long time ago - a lifetime ago when it comes to bitcoin. We were really just hearing about the early days of bitcoin; I think the bitcoin pizza was purchased in 2012. So really very early days of bitcoin.

But what it really shows, on so many levels, is the power of blockchains, and the unique characteristics that blockchains have to help investigators investigate fraud and financial crime. So really, the nature of blockchains, right? These forever public ledgers - traceable, immutable, nothing ever changes, and every transaction is logged forever. Those qualities allow investigators to use tools like TRM to go back and trace and track the flow of funds over years. The technology may not have been there in 2012; it may not even have been there in 2015, 2016, 2018. But today - or a year ago, and I'll get to that in a moment - investigators were able to go back and trace and track the flow of funds. So this is sort of an "only in crypto" type of law enforcement action, because it allowed investigators to go back and trace across years in order to build a case.

Another thing that was really extraordinary about this: remember, we talk about tools all the time - capabilities, tools like TRM that allow law enforcement to track and trace the flow of funds. These are very powerful tools, but they're only one tool in a larger toolbox. What we see here is that, ultimately, law enforcement - IRS-CI in particular, IRS Criminal Investigation - was able to execute a search warrant on a residence that belonged to the defendant, in this case James Zhong, and was ultimately able to find the evidence it needed to make its case. That search warrant was executed about a year ago. Now, I'm not involved in the case, and I wasn't involved in the case. But having been a prosecutor for about 11 years at DOJ, it is likely that over the last year we've seen intense cooperation - the defendant working with law enforcement to recover funds, to provide information about the laundering, and really helping them get a clearer picture of this case. There's so much going on here, but when I think about the key takeaways, it's that this case never happens in the traditional world. This case is only enabled by those unique qualities of blockchains - open, transparent, immutable, traceable and forever.
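[The "trace and track the flow of funds" Ari describes amounts to walking a transaction graph that, on a public blockchain, is preserved forever. The sketch below is a toy illustration - the addresses, amounts and four-entry ledger are invented, and this is in no way TRM's product - of how an investigator can follow funds forward from a known address years after the fact:]

```python
from collections import deque

# Hypothetical ledger entries: (tx_id, sender, receiver, amount). On a real
# chain these would be read from the full historical record of blocks.
LEDGER = [
    ("tx1", "silkroad_wallet", "addr_A", 50_000),
    ("tx2", "addr_A", "addr_B", 30_000),
    ("tx3", "addr_A", "addr_C", 20_000),
    ("tx4", "addr_B", "exchange_deposit", 30_000),
]

def trace_forward(ledger, start):
    """Breadth-first walk of every address the funds from `start` ever reached."""
    tainted = {start}
    queue = deque([start])
    hops = []
    while queue:
        addr = queue.popleft()
        for tx_id, sender, receiver, amount in ledger:
            if sender == addr and receiver not in tainted:
                tainted.add(receiver)
                queue.append(receiver)
                hops.append((tx_id, sender, receiver, amount))
    return hops

# Because the ledger is immutable and public, this walk works just as well
# a decade after the transactions were made.
for hop in trace_forward(LEDGER, "silkroad_wallet"):
    print(hop)
```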
Cal Harrison: Really interesting points he made. You can almost see the parallel with how the technology around DNA evidence developed - how the technology to fight cybercrime has developed over the years, and now we're seeing a case this old actually getting resolved. The other interesting point, I think, was about how investigators and prosecutors are using this to give criminals an opportunity to mitigate their sentences by cooperating and sharing their tactics. Investigators are certainly looking for ways to hone their skills and train new investigators. Just this past year, scammers stole about $14 billion in cryptocurrency, and obviously we've been seeing some troubling cases this year, so this will hopefully contribute to the overall techniques and tactics that investigators are using to defeat this type of cybercrime.

Anna Delaney: Absolutely, it's a fascinating time to be working on crypto-related investigations. Thank you, Cal. Tony, U.K. National Cyber Security Centre and GCHQ veteran Dr. Ian Levy is leaving what he describes as the best job in the world. He's been working for government for over 22 years, I believe. He bid farewell with a stimulating, thought-provoking 6,000-word blog post. I think you're going to summarize it for us.

Tony Morbin: No, I'm just going to dip in and borrow one of the themes in his parting blog. As you say, he's leaving his role as technical director of the U.K.'s NCSC, which is an offshoot of GCHQ. It's a role he has held at the U.K. government's public-facing cyber defense organization since its inception, shortly after the Snowden NSA revelations dragged cybersecurity out from the shadows and into the spotlight. He was great at making some of the really complex issues we face appear simple, so do forgive me if my paraphrasing doesn't do him justice. But here goes: Ian kicks off with a quantum state superposition joke, using a formula I won't even pretend I can decipher. Other than that, what I take from it is that cyber techies can be simultaneously incredibly smart and incredibly dumb.
The classic absent-minded professor who's maybe great in the lab, but less so when it comes to dealing with the non-tech world. He tells a story about the World War II Boeing B-17 Flying Fortress aircraft, in which two similar-looking controls sat side by side - one bringing down the landing gear and the other controlling the wing flaps. It was eventually realized that this was the reason so many were lost on landing: tired pilots at the end of an exhausting sortie hit the wrong switch, causing the plane to crash. Of course, those controls are now different shapes, and they're far apart from each other. The aircraft world learned from its mistakes rather than blaming users for being unable to use badly designed safety features - and yet there are many in cybersecurity who do still talk about users being the biggest risk. They remind me - and this is where my picture comes in - of Basil Fawlty, the fictional hotel owner who thought the hotel would function so much better if it weren't for having to put up with those awful guests. And as you can see, the worried guests are being given a full dose in the picture.

Now, as Ian points out, for all the tremendous ingenuity and creativity of the cybersecurity industry, we do continue to place ridiculous demands on users - and that's not just avoiding clicking links in emails or putting up with poor password policies. We implicitly expect arbitrarily complex implementations of technology to be perfect and vulnerability-free in the long term, and then we blame those who use the stuff that we build when they fail to properly defend themselves from everything that hostile states can throw at them. Now, if I'm a physician, a car manufacturer or a retail owner, my priorities are healing, making or selling - they're not cybersecurity. I want cybersecurity specialists, professionals, to take that worry away from me. Of course, that doesn't absolve me from some responsibility for security, any more than it absolves me from safety obligations. But I don't expect to have to learn a whole new profession to do my job. So what is reasonable? It's not reasonable to ask a 10-person software company to defend themselves from the Russians.
But it is reasonable to ask a critical infrastructure company not to have their management systems connected to the internet with a password that schoolkids can crack. For cybersecurity to be scalable long term, the various security burdens need to be appropriately allocated, with incentives for managing them correctly. The obvious person to manage the risk might not always be the best one - so make the change, move things around. As an aid to getting it right, or at least getting it better, Ian suggests a few recommendations: talk to people who aren't technical, and actually listen to them; stop blaming those without technical understanding when something goes wrong; build stuff that works for most people most of the time, rather than going for the easy or the shiny thing; and put ourselves in the shoes of users and ask if we're really being sensible in our expectations. Now, as Ian says, we haven't got that right yet. But to end on a more positive note, I will say that, personally, within cybersecurity circles I've heard far less blaming of users today than I did a decade ago. So at least we are headed in the right direction.

Anna Delaney: Excellent. I think there's a lot of value in that blog post. And it's funny - he's often been described as a disrupter, hasn't he, Tony? But whoever comes next has big shoes to fill, I feel.

Tony Morbin: They do. I mean, as I say, it's that lovely combination of being a really great, in-depth techie, whilst also being a communicator who can get the message across to a wider audience - and also, as demonstrated there, having real empathy with that audience.

Anna Delaney: For sure. And thank you for weaving in Basil Fawlty - a first on the Editors' Panel. So Akshaya, this week the EU committee set up to investigate the questionable use of spyware across Europe presented the initial findings of an investigation it started back in April this year. Could you just talk us through the findings?

Akshaya Asokan: Yeah. So the committee was initially formed in March to investigate the extent of spyware abuse within EU nations.
And it was formed especially after reports emerged that EU nations - Poland, Greece, Hungary, Spain - had used spyware to target politicians, journalists and activists. So the committee began its investigation in April this year, and as part of the probe, its members, who are mostly members of the European Parliament, visited these four nations for fact-finding purposes. But the interesting thing about the committee is that since it was launched, it has been highly politicized, largely because a lot of the time the nations being investigated didn't want to be accountable or transparent about the use of spyware by their own governments. So after much resistance and non-cooperation from the nations being investigated, the committee finally presented its findings. This is not the final report yet - it's more like a sum total of the investigative work they have done so far. It presented the findings yesterday, and one of the things it said is that EU complacency has played a large role: complacency on the part of EU nations and large institutions like the commission and the parliament has played a big role in allowing spyware to mushroom and expand within the EU. The committee report says that within the EU there are around 30 spyware companies that are active, and all of them are exploiting EU features like the common market, the Schengen system and the "EU-regulated" label, which acts as a sort of credibility tag. So that's one interesting thing the committee said yesterday. Again, this is not the final report yet. It will be presented before the committee members for more amendments, and once it is finalized - which should be toward the end of this month - it will be presented to the EU parliament.

Anna Delaney: That's a great overview. And I think there were other proposals, which included states defining what national security is - which was interesting - the creation of a dedicated European export control agency, and a joint initiative with the U.S. to create common standards and a blacklist of spyware vendors. And I know, Tony, you've actually reported on this, or at least discussed it on the Editors' Panel in the past.
I mean, isn't it impossible to regulate spyware use, because it's always going to exist? What do you think?

Tony Morbin: I think it's useful to regulate things. It's like saying, you know, is there any point having law enforcement when crimes are always going to exist? So you do have to regulate it; you have to put down the parameters of what's acceptable and what's not acceptable. There will be states that don't sign up to it and just totally ignore it. There will be states that sign up to it and surreptitiously ignore it - but at least they can be held to account. If there are no rules whatsoever, there is no holding anybody to account.

Anna Delaney: Yeah. So shall we wait and see in the next few months? Is that what's happening next?

Akshaya Asokan: Yes. So there's a lot that the commission and other EU agencies need to do, so we'll see what their response to the committee's findings will be like.

Anna Delaney: Excellent. Well, thank you for that. And finally, we are seeing a wave of people in the infosec world leave Twitter, or threaten to leave the platform, since Elon Musk took charge - whether it's because the site will now charge users who want to be blue-tick verified, or they don't trust it anymore, or they have a disdain for Musk and his values. What are your thoughts? Have you left the platform as cybersecurity journalists? Are you planning to leave? Or are you thinking it's an overblown reaction?

Tony Morbin: As a journalist, I'm going to have to go where the audience is. I don't think everyone is going to Mastodon. So although I probably will open a Mastodon account, because many techies are moving, it will be in addition to Twitter. Mastodon obviously has its flaws, as does Twitter. You know, with Twitter trying to get us to increasingly identify ourselves, pay for the privilege, and looking at other ways to monetize us, Mastodon has issues of its own, such as people being tagged automatically and brought into a conversation they weren't intended to be part of. And there are likely to be other teething problems, partly because so many people are moving there. But as a journalist, it's going to be another channel rather than an alternative channel.
Anna Delaney: Yeah, because Mastodon has existed for six years already, but user numbers have shot up.

Tony Morbin: I saw a graph showing how they shot up when Elon Musk first said he was looking at taking over the channel, then again since he has taken it over, and, of course, with the charging for the blue tick as well. So there is an uptick every time he messes with it.

Anna Delaney: For sure. Akshaya?

Akshaya Asokan: So again, like Tony said, for me it will depend on where the audience is, and a lot of the time the stories come from Twitter. I do see it's still active, and so long as I'm still getting the stories on Twitter, I should be there. Also, the other platform is not something that I'm familiar with.

Anna Delaney: Staying on Twitter, for now. Cal?

Cal Harrison: Well, I'll probably follow the path of big influencers like you. I'm basically a lurker on Twitter, just as Tony and Akshaya were saying. You know, we use it as a way to keep up with the news and what people are saying, so it's one of the tools in the tool belt that I think we will continue to use. It's also a communication channel - if we need to reach out to someone who's in the news, often Twitter is really one of the only ways we can do that. So anyway, I'll probably follow your lead.

Anna Delaney: Well, actually, it's funny - I'm naturally swayed to LinkedIn these days, or in recent years. I probably say more on that platform, but I still have my Twitter profile. It's good for reading others' tweets. Okay, well, let's see how that goes. Akshaya, Cal and Tony, it's been a pleasure. Thank you so much.

Tony Morbin: Likewise, thanks very much.

Akshaya Asokan: Thank you.

Cal Harrison: You're welcome. Enjoyed it.

Anna Delaney: And thank you so much for watching. Until next time.