Anna Delaney: Hello and welcome to Proof of Concept, the ISMG talk show where we discuss today's and tomorrow's cybersecurity and privacy challenges with experts in the field, and how we can potentially solve them. We are your hosts. I'm Anna Delaney, director of productions at ISMG.

Tom Field: I'm Tom Field, senior vice president of editorial, also at ISMG. Anna, always a pleasure to see you.

Anna Delaney: Always a pleasure. Tom, you're in Atlanta this week. Tell us more.

Tom Field: Well, I'm here for our Southeast Cybersecurity Summit. It was held live yesterday, and today is hybrid as well. We brought together security leaders to talk about leading topics such as third-party risk management, business email compromise, incident response, software security and so much more.

Anna Delaney: So, if there was one key takeaway for you, what would that be?

Tom Field: Key takeaway: One thing in Atlanta, I'm always impressed with the solidarity of this community. It's one of the tightest CISO communities I've seen anywhere in the world, honestly. People here have stayed together, networked together and been close for 20 years or more. So it's a tight community here.
And if there's alignment on anything, it is that we have to address issues such as software supply chain security, and we have to address incident response, to the point that we're mounting a more structured and more automated response to attacks that are increasingly broad and increasingly automated. So, similar conversations to what we've had elsewhere in the world this year. But again, I'm particularly impressed by this community, how tight it is and how aligned it is on topics such as these.

Anna Delaney: Fantastic. Did you learn anything new?

Tom Field: Did I learn anything new? I learned that I am not a good resident at a so-called millennial hotel. I'm not going to name names; I don't want to give any adverse publicity. But it's one of these hotels where, you know ... let me read from the description here, if I may: "During your stay, you might see a flash mob break out, or a giant pink teddy bear sitting at the bar. It's all about the fun for us here." Now, this fun includes: There are no closets in the rooms; there are hooks on the wall. There are no irons or ironing boards; there are ironing stations on one floor where you queue up to iron your clothes. There are no coffee machines in the room.
And it's all designed, as they say here, so that you have room to do your downward dog in your room and not be bothered by these other things. You know, I'm officially too old, and I have officially become the get-off-my-lawn guy. There you go.

Anna Delaney: I think that sounds quite inviting.

Tom Field: That's the difference between Anna and Tom!

Anna Delaney: Talking of the community, you did mention earlier that there was a lot of chatter around the conviction of Uber's ex-CSO. Can you reveal what was being discussed? What's the tone?

Tom Field: It comes up in lots of conversations, and there's a lot of guardedness about what people want to say. I understand that this is a sensitive topic. For people in our community, this is one of our own, who is paying a significant toll for events that have happened. And you can argue: What was corporate responsibility? What was his responsibility? What was revealed, and maybe what should have been revealed? But the repercussion that is hitting home in these conversations is: What does this say about my role, my organization and my responsibility? And as someone said yesterday, when I accept a new job, should my contract say you're going to protect me?
So there are lots of conversations, and they aren't done by any means. It's an emotional topic, and a new one for this community as well. I'm sure David will talk more about it when we bring him on.

Anna Delaney: I look forward to that. Why don't you introduce our first guest?

Tom Field: I would be happy to do that, because she's a frequent guest here. She's the chair of the global privacy and cybersecurity practice at Hunton Andrews Kurth LLP. She's Lisa Sotto. And I understand she's joining us live from London today. Is that correct?

Lisa Sotto: That is correct. And I'm laughing about your hotel description, Tom, because mine is the exact opposite. I actually don't have enough outlets. I'm in a very old hotel.

Tom Field: Well, there are plenty of outlets here, because millennials like to be plugged in. So you can be plugged in almost anywhere here, but to make a cup of coffee - that's a different story. Anyway, Lisa, it's great that you're where you are right now, because just last week, as you know, President Biden signed an executive order to implement a new framework to protect the privacy of personal data shared between the U.S. and Europe. As you know, some privacy campaigners aren't impressed.
The Austrian privacy activist Maximilian Schrems said he sees no ban on bulk surveillance and no actual limitations. And we expect a new data agreement to be ready maybe by March of next year, although privacy activists are expected to challenge the ruling in court. Big question for you: What are your thoughts on what's been proposed? And what are we likely to see develop over the next quarter or so?

Sotto: Yeah, Tom, thank you. This new agreement, now known as the Trans-Atlantic Data Privacy Framework, is a very significant development, in my view, in the world of EU and U.K. data transfers to the U.S. So, just to provide a bit of background: About six months ago, the presidents of both the U.S. and the European Commission made a joint statement in which they announced an agreement in principle to replace the now-invalidated Privacy Shield. As you will recall, Privacy Shield, which had been one of the very few valid data transfer mechanisms, was struck down by the Court of Justice of the European Union in what is known as the Schrems II decision. Of course, Max Schrems was operative here. And the decision focused on the lack of protections for EU residents in connection with U.S. surveillance programs; the court also criticized the insufficient redress mechanisms to challenge any unlawful government surveillance. So last Friday, President Biden issued an executive order that outlined safeguards the U.S. government will put in place to address the alleged shortcomings in intelligence gathering and the safeguards used, and also put in place a robust process for redress. It even stands up a new and independent court, called the Data Protection Review Court, which is very significant. So in response, in what was clearly a coordinated approach, the European Commission released a Q&A document and announced that they intend to now prepare a draft adequacy decision and launch an adoption procedure. The European Commission will seek an opinion from the European Data Protection Board and get approval from a committee of EU member state representatives, and the European Parliament also can review adequacy decisions. So, as you said, all of this can take about six months to play out. But the good news is that we are well on our way to having a reinstated transfer mechanism for transfers from the EU and the U.K. to the U.S. And this is very welcome news.

Tom Field: No, indeed.
What's your advice to U.S. organizations now, as this plays out?

Sotto: Well, U.S. companies - the ones that are certified to the shield - have really been in purgatory, and we've been eagerly awaiting a new, revamped version of the framework. Those companies have had to put in place an alternative mechanism, which generally means standard contractual clauses. They are really complex these days; it's kind of an albatross for companies. So those companies that are currently certified are the lucky ones: They'll be able to take advantage of the revamped shield to transfer data without having to navigate the complexities of the standard contractual clauses. This will bring much-needed relief to global organizations. And as you indicated, it is extremely likely that there will be yet another challenge. We'll call it the Schrems III case, and Max Schrems has explicitly said as much. But still, having this mechanism in place will buy companies at least a few years of relief.

Tom Field: Very good. Shifting gears, Lisa, there was a recent action by the California AG under the CCPA. Can you tell us a bit about that?

Sotto: Yeah, absolutely. And it's a very significant action.
The California AG had not brought any enforcement actions under the CCPA, and now we have our first one. This is the Sephora case. On August 24, they announced a settlement with Sephora and, at the same time, announced a broader enforcement sweep of over 100 online retailers. And the essence of it is that there is now a need to recognize the Global Privacy Control, or GPC. The attorney general issued a number of notices of alleged CCPA noncompliance regarding businesses' failure to process opt-out-of-sale requests that were made using user-enabled global privacy controls such as GPC. After being notified, many of these businesses updated their service provider contracts and implemented technology to recognize the signal from GPC. As to Sephora, the allegation was that the company failed to disclose to consumers that their data was being "sold." That is a term of art - a defined term under the CCPA - meaning an exchange of personal data for either monetary consideration or other valuable consideration. The allegation was that Sephora did not provide a "Do Not Sell" link and also failed to recognize the GPC signal.
And they also did not cure these alleged violations within 30 days - there is a 30-day cure period right now. The settlement was for $1.2 million in penalties, and they will need to make continuous reports to the attorney general on their efforts to both comply with the CCPA and honor the Global Privacy Control.

Tom Field: Very good. One more topic: ransomware. As we're in the last quarter of 2022 now, let's check in. What are the trends you're seeing in ransomware? And what are the trends you're seeing in response, in terms of what organizations continue to do wrong in prevention, preparation and response?

Sotto: All good questions. And, you know, it's kind of more of the same, but souped up. We're seeing amounts being demanded that are just higher than ever. Amounts that used to be deemed moonshot demands are now just standard, very high payment demands. We're also seeing an interesting shift in cryptocurrency, where Monero is now the most requested cryptocurrency, and there's actually a premium being charged to accept Bitcoin.
Just a couple of other things I'll add. Look, Russia and Ukraine - obviously, the war has had a very significant impact. We're seeing Russian threat actors and government actors hitting infrastructure and government systems, and we're also seeing Ukrainian cyber warriors hitting back, in some cases with surprising success. And, of course, ransomware still takes first prize as threat actors' most coveted exploit. The LAPSUS$ arrests were interesting, but clearly LAPSUS$ has nine lives, because they then came back to hit Uber and Rockstar Games - so never say die. The only other thing I'll mention is that we now have the Cyber Incident Reporting for Critical Infrastructure Act in the United States. That will mean that certain critical infrastructure entities are going to need to report certain events to the government within 72 hours, and report ransom payments within 24 hours of paying. And, Tom, I'll just quickly address your last point, which is: What are companies still doing wrong? We're still seeing shortcomings in basic security measures, and enough is enough - it's time to just shore things up.
Multifactor authentication everywhere, access control, segmentation - the usual guidance. And, of course, there still needs to be a real beefing up of proactive readiness: doing tabletop exercises and making sure the executive leadership team is well aware of the decisions they're going to need to make, should this hit. Is the incident response team ready? Is there a state-of-the-art incident response plan? Have all the protective measures been tested over and over again, through red team or blue team testing? And then make sure you know which experts you're going to use if you're hit with this sort of thing. Make sure you are sufficiently prepared with respect to cyber insurance and training and awareness - you can never get enough of it.

Tom Field: And I know we're going to be talking about this again next year. Lisa, as always, thanks for your time.

Sotto: Thank you very much.

Tom Field: Anna, back to you and our next guest.

Anna Delaney: Fantastic. Excellent insights, Lisa, as always. I would like to welcome back to the studio David Pollino, former CISO of PNC Bank. Great to see you, David.

David Pollino: Hello, Anna. Thanks for having me.

Anna Delaney: Very good.
Not in London, unfortunately, I reckon. Not yet - we'll have to get you here. So, David, let's talk about this hot topic of Zelle fraud. Last week, Senator Elizabeth Warren's office said that its investigation into Zelle shows that fraud and theft are not only rampant but getting worse. Can you share an overview of the scale of these Zelle scams and the trends you're most concerned by?

David Pollino: Yeah, it's funny - we planned on talking about this before the Senate hearings were even discussed or made the headlines. So it's definitely a hot topic. There are actually some very interesting things in the report. If you read through the numbers, I think you see a bit of an acknowledgement that there's a problem here, and then you also see some spin on some of the numbers. So, a few things to talk about. A lot of the banks refused to publish their numbers. I thought that was telling: If there wasn't a problem, they likely would have been a lot more transparent. There were four banks that published numbers. And for those four banks, fraud went from about $90 million in 2020 up to $255 million in 2022. And that's just four of the thousands of banks that are part of Zelle.
Obviously, some of the larger banks participate in the platform. What you see is a significant uptick in the overall fraud on the platform. They also mentioned $440 million lost in these scams - that is, $440 million of actual reported customer losses in 2021. So what you have is a significant industry here, from a cybersecurity perspective, that is focusing on Zelle as its primary revenue stream. It goes to show that, I think, the product needs innovation. When you do a Google News search for Zelle - I looked at the first 20 screens or so - every article was talking about fraud. So, who knows if they'll solve this problem soon. They may even need to change the name, because Zelle might become synonymous with fraud. So I think there's an acknowledgement here across the board. Typically, what we've seen in financial services is that either they self-regulate, or the government comes and regulates for them.
So, I think there's a short window of opportunity here for the banks to change the operating rules and get to a point where customers don't feel as scammed, taken over and victimized by the product, before hard-hitting regulation comes down and likely changes this product significantly for years to come.

Anna Delaney: Very good. And not only is there a lack of transparency from banks; I read somewhere that they're not repaying 90% of cases in which customers were tricked into making payments on Zelle. What are your thoughts on this? Where does responsibility lie? I suppose there's always a lot of finger-pointing, but I'm curious to know your thoughts.

David Pollino: Yeah, as a former banker myself, I definitely see both sides of the equation. You have some of these more traditional payment mechanisms - PayPal, Venmo, Cash App - that are built on other products: They're built on ACH, they're built on the card infrastructure and, to a certain extent, also on checking products. Those products have well-established operating rules that have factored fraud into the arrangements.
And because normally you're only seeing one side of the transaction - you're either seeing the sender, the maker, or the receiver, or you're on the other side - there are some ways to enforce good behavior: chargebacks, unauthorized-transfer claims and, on the check side, those types of things being returned to maker. So what we're seeing here is a new product that did not factor fraud into its overall revenue model; it was all about cost avoidance. And as a result, they tried to design the rules of the product in such a way that they wouldn't see fraud. It's a send-only mechanism, not a receive mechanism: You can request money, but it has to be pushed out by the account owner, or whoever's in control of the device that is the authenticator for that particular account. And there hasn't been a good mechanism for charging back. So the one rule that governs this approach is Regulation E, and Reg E has three aspects to it, but the primary one that they focus in on for Zelle transactions is whether the transaction was authorized. In many of these cases, the customer is fooled - whatever is happening with that particular scam, and the report has a long list of the ones that have been seen - and at some point the customer is saying, "Here's my money," or their MFA has been hijacked to make it look like it's their money. But that's the key differentiator: The banks are saying, "You said that it was authorized, so you can't come back later and say that it's not authorized." That's why very few customers are being reimbursed. For me, the big differentiator - and why it's misunderstood compared to something like a credit card transaction - is this: If you're a credit card merchant, there are certain things you need to go through to establish yourself as a merchant, and then, as you're operating, those card networks hold back some funds for chargebacks. If you have a high chargeback rate, you're either going to be subject to higher fees or you'll be kicked out of the network altogether. We don't have that same infrastructure mechanism around Zelle. It's made to be like cash. And some scams have hit the Pollino household, and also people in the neighborhood, and they now understand that Zelle is like cash. It's not like a credit card.
It's not like these other transactions where you can say, "The services weren't provided, the goods were never delivered - claw this thing back." No, the money is gone. So I think there needs to be a combination of innovation in the product to help protect consumers, and also education for the consumer, so they understand and know what Zelle is - and don't just think about it through the lens of some of the other common payment mechanisms we have in the industry.

Anna Delaney: Yeah. You mentioned regulation earlier. Warren concluded in her report that regulatory clarity is needed to further protect Zelle users. Do you agree? Because some banks disagree.

David Pollino: Well, I definitely see both sides of the overall equation. I think, generally speaking, the banks are better at regulating themselves. When you look at the ecosystem that has grown up around the card networks, and how they've been able to create something that's advantageous whether you're a merchant, an acquirer or a network, they've been able to work that out. Sometimes the mechanisms that come through government regulation aren't as nimble and aren't as effective as they were when originally penned and put to paper.
But I do see that there is a huge opportunity here to make sure that we're not banking scammers, and that if we're allowing accounts to be taken over by scammers as a destination, there should be a little bit more onus on that receiving institution to give the money back. There also should be a greater mechanism to share information, to understand where that money is going - so that if it is a suspicious destination, or if, all of a sudden, the transactional volume or the way the account is being used falls outside of what your KYC or CDD process indicated, some action would be taken by the receiving bank. That way, the individuals who are being scammed might have some recourse that's not there today.

Anna Delaney: And as ever, David, what can organizations, merchants and networks take away from this?

David Pollino: Well, you know, if you look at the scams being perpetrated over Zelle, there are a couple of new ones, but most of them are the tried-and-true ones we've seen over and over again.
You have the impersonation scams, you 369 00:23:54.090 --> 00:23:59.370 know, the grandparent and friend scams, you have overpayment scams, 370 00:23:59.370 --> 00:24:03.510 which is a slightly different twist, you have 371 00:24:03.540 --> 00:24:06.840 these utility scams, you have law enforcement, "Hey, you need 372 00:24:06.840 --> 00:24:09.840 to pay your taxes, or we're going to come arrest you," the 373 00:24:10.230 --> 00:24:14.370 lottery scams, all those types of things. The banks, I think, 374 00:24:14.370 --> 00:24:20.730 need to do a better job of educating the consumer at 375 00:24:20.730 --> 00:24:24.420 the point of transaction, to point out, "Does your 376 00:24:24.420 --> 00:24:28.350 transaction look like this?" Maybe ask them some 377 00:24:28.350 --> 00:24:30.540 questions, especially when they're first-, second- or third-time 378 00:24:30.540 --> 00:24:34.980 Zelle users, to understand how this money is being used. 379 00:24:35.070 --> 00:24:39.540 Let them know that it's cash, it cannot come back, and make sure 380 00:24:39.540 --> 00:24:42.720 that they understand the operating rules. My 381 00:24:42.720 --> 00:24:47.580 wife's been married to a banker for 27 years. And her 382 00:24:47.580 --> 00:24:52.020 mom was selling a couch, and the business upgrade scam 383 00:24:52.020 --> 00:24:54.660 came through my house. We didn't lose the money, but it 384 00:24:54.660 --> 00:24:59.220 was "Hey, you upgrade your Zelle account by sending 385 00:24:59.220 --> 00:25:02.310 additional money," and she even got an 386 00:25:02.400 --> 00:25:06.600 official-looking notification from Zelletechsupport@gmail.com 387 00:25:06.600 --> 00:25:10.230 that she needed to do it. It really 388 00:25:10.230 --> 00:25:13.140 preys on people's misunderstanding of the product 389 00:25:13.500 --> 00:25:17.310 and how it works.
And I was able to intervene before the actual 390 00:25:17.310 --> 00:25:22.920 loss could take place. But I think there needs to be a better 391 00:25:22.920 --> 00:25:25.560 educational program, because right now, if you do a Google 392 00:25:25.560 --> 00:25:29.730 News search for Zelle, all it talks about is fraud. It doesn't 393 00:25:29.730 --> 00:25:32.970 talk about how to use it correctly, what it actually is, 394 00:25:33.420 --> 00:25:34.950 or how it can be used safely. 395 00:25:35.280 --> 00:25:38.190 Anna Delaney: Yeah, very true. Well, not everybody has a David 396 00:25:38.190 --> 00:25:42.870 Pollino in their household. So, education is key, for sure. 397 00:25:42.870 --> 00:25:45.180 Thank you very much, David. Always informative. Thank you. 398 00:25:45.720 --> 00:25:49.920 So let's bring the gang together. A question outside the 399 00:25:49.920 --> 00:25:54.750 box, perhaps. What is something outside of, or unrelated to, 400 00:25:54.750 --> 00:25:59.820 cyber, privacy, or even anything legal that helps you in your 401 00:25:59.820 --> 00:26:01.140 current roles? 402 00:26:04.170 --> 00:26:09.960 Lisa Sotto: I can start, if you like. It used to be that privacy 403 00:26:09.960 --> 00:26:17.910 and cyber were esoteric issues that were certainly not part of 404 00:26:17.910 --> 00:26:27.240 the usual daily discussion, and now we're seeing enormous social 405 00:26:28.110 --> 00:26:32.250 and legal relevance to these issues. So, every day, there are 406 00:26:32.250 --> 00:26:36.180 new headlines. And it's always fascinating to me to see how far 407 00:26:36.180 --> 00:26:40.350 we have progressed in the last 20 years in these areas. 408 00:26:40.560 --> 00:26:43.050 Anna Delaney: Yeah, sure. David?
409 00:26:43.950 --> 00:26:46.560 David Pollino: Yes, one positive development I've seen with 410 00:26:46.560 --> 00:26:52.140 companies in the last few years is that in security and privacy, 411 00:26:52.140 --> 00:26:55.710 the tools and techniques that we use in this industry can also be 412 00:26:55.710 --> 00:27:00.930 used to do good. So, you see some companies that are 413 00:27:00.960 --> 00:27:04.830 investing in fighting things like human trafficking, 414 00:27:05.070 --> 00:27:08.700 devoting a certain portion of their revenue to 415 00:27:08.700 --> 00:27:12.450 fund charities that fight human trafficking, and 416 00:27:12.450 --> 00:27:16.410 some of the mechanisms that you use to detect fraud and scams 417 00:27:16.410 --> 00:27:20.520 can also be used to detect human trafficking. That's all part 418 00:27:20.520 --> 00:27:24.840 of an overall larger trend in the industry, where 419 00:27:24.840 --> 00:27:27.630 companies are saying, "We're going to feed people, 420 00:27:27.630 --> 00:27:30.870 we're going to clothe people, we're going to take care of a 421 00:27:30.870 --> 00:27:33.510 need from a charity perspective." And I think that's 422 00:27:33.510 --> 00:27:38.460 great. I also see companies offering their employees 423 00:27:38.460 --> 00:27:42.900 recurring volunteer hours. And as a cybersecurity professional, those 424 00:27:42.900 --> 00:27:45.360 volunteer hours might go toward 425 00:27:45.360 --> 00:27:49.260 helping out an educational institution, a 426 00:27:49.290 --> 00:27:53.610 not-for-profit or a religious organization. So you have a 427 00:27:53.610 --> 00:27:57.600 community that is focused on giving back, with skills that 428 00:27:57.600 --> 00:28:00.900 are in demand and in short supply, which I think is definitely a 429 00:28:00.900 --> 00:28:03.330 positive development in cybersecurity.
430 00:28:03.870 --> 00:28:07.770 Anna Delaney: Yes, for sure. Tom, anything unrelated to cyber 431 00:28:07.770 --> 00:28:08.850 that helps you in your role? 432 00:28:09.150 --> 00:28:12.330 Tom Field: Oh, 100%. And, as you know, my girlfriend and I sing 433 00:28:12.330 --> 00:28:14.970 and perform in elder care facilities on the weekends, and 434 00:28:14.970 --> 00:28:18.570 that's a huge boost, just to get out and express a little bit 435 00:28:18.570 --> 00:28:21.330 through music and be able to just bring in some different 436 00:28:21.330 --> 00:28:23.160 kinds of energy. That helps enormously. 437 00:28:24.030 --> 00:28:26.880 Anna Delaney: Okay, well, it's great to end on a positive note. 438 00:28:27.300 --> 00:28:29.550 Thank you very much, Lisa Sotto and David Pollino. 439 00:28:30.840 --> 00:28:31.440 David Pollino: Thanks for having us. 440 00:28:31.440 --> 00:28:31.620 Lisa Sotto: Thank you. 441 00:28:31.650 --> 00:28:32.850 Anna Delaney: It's goodbye from us.