WEBVTT
1
00:00:00.590 --> 00:00:02.780
Equifax, in 2017, suffered one of the
2
00:00:02.780 --> 00:00:05.310
worst data breaches in history.
3
00:00:05.310 --> 00:00:08.130
Adrian Sanabria is a security professional and researcher
4
00:00:08.130 --> 00:00:10.820
who's been taking a close look at what went wrong.
5
00:00:10.820 --> 00:00:13.720
Adrian, thank you for joining me today.
6
00:00:13.720 --> 00:00:15.360
Yeah, thanks for having me, Mathew.
7
00:00:15.360 --> 00:00:17.770
I really am looking forward to talking about Equifax.
8
00:00:17.770 --> 00:00:21.920
It seems like a great case study for others to learn from.
9
00:00:21.920 --> 00:00:24.120
If this is not too big a question,
10
00:00:24.120 --> 00:00:27.270
what would some of your top takeaways from the breach,
11
00:00:27.270 --> 00:00:29.150
or the failures of Equifax,
12
00:00:29.150 --> 00:00:32.680
be for people in the information security field?
13
00:00:32.680 --> 00:00:36.180
Well first of all, I think people need to spend more time
14
00:00:36.180 --> 00:00:39.560
really digging into case studies like this,
15
00:00:39.560 --> 00:00:42.200
where we actually have all the details,
16
00:00:42.200 --> 00:00:45.230
because it's incredibly rare that we get the level of detail
17
00:00:45.230 --> 00:00:47.180
that we've gotten from Equifax,
18
00:00:47.180 --> 00:00:48.690
and typically it only happens
19
00:00:48.690 --> 00:00:53.230
if there are class-action lawsuits or federal investigations
20
00:00:53.230 --> 00:00:56.700
that, by their nature, end up making those documents public,
21
00:00:56.700 --> 00:00:58.600
or getting 'em into the public domain,
22
00:00:58.600 --> 00:01:01.040
whether it's voluntarily or through
23
00:01:01.040 --> 00:01:03.280
the Freedom of Information Act,
24
00:01:03.280 --> 00:01:06.370
but defenders should be jumping on this opportunity,
25
00:01:06.370 --> 00:01:08.570
and just digging through these documents
26
00:01:08.570 --> 00:01:10.950
with a fine-tooth comb, and seeing where they can apply
27
00:01:10.950 --> 00:01:14.650
some of these lessons to their own environments.
28
00:01:14.650 --> 00:01:18.780
So, it's a huge opportunity, and nobody should just be
29
00:01:18.780 --> 00:01:21.610
skimming over the headlines of something like this,
30
00:01:21.610 --> 00:01:24.330
if they're a defender, if they're in a similar situation,
31
00:01:24.330 --> 00:01:27.600
and they're worried about, could this happen to us?
32
00:01:27.600 --> 00:01:29.830
It seems unusual to me that we've seen
33
00:01:29.830 --> 00:01:31.870
so much detail and depth.
34
00:01:31.870 --> 00:01:33.580
I mean there was the GAO report,
35
00:01:33.580 --> 00:01:36.500
there was a report from House and Senate Committees,
36
00:01:36.500 --> 00:01:39.310
there was the U.K. ICO privacy watchdog.
37
00:01:39.310 --> 00:01:40.750
They also put out a report
38
00:01:40.750 --> 00:01:43.990
tied to their fine against Equifax.
39
00:01:43.990 --> 00:01:46.360
There's an abundance of information here.
40
00:01:46.360 --> 00:01:48.690
Yeah, it's really, really useful.
41
00:01:48.690 --> 00:01:50.660
There's some good data in there,
42
00:01:50.660 --> 00:01:54.740
and I wish we got this level of detail more often.
43
00:01:54.740 --> 00:01:58.420
It really helps to solidify some of the best practices
44
00:01:58.420 --> 00:02:00.410
and recommendations that we give,
45
00:02:00.410 --> 00:02:03.410
and clearly we can see that the industry is too focused
46
00:02:03.410 --> 00:02:05.610
on fixing problems with tools,
47
00:02:05.610 --> 00:02:08.920
and not focused enough on leadership,
48
00:02:08.920 --> 00:02:13.030
and on the people and processes parts of things.
49
00:02:13.030 --> 00:02:16.810
So, one narrative that we've seen across many breaches
50
00:02:16.810 --> 00:02:18.970
where we get this level of detail
51
00:02:18.970 --> 00:02:22.510
is that even when companies owned the tools,
52
00:02:22.510 --> 00:02:26.710
they rarely had them all set up and working correctly,
53
00:02:26.710 --> 00:02:28.460
and that was a big problem here at Equifax.
54
00:02:28.460 --> 00:02:31.320
In fact, the way that they found out about this breach,
55
00:02:31.320 --> 00:02:35.033
is they finally fixed one of these tools that had been,
56
00:02:35.940 --> 00:02:37.910
so I originally said broken,
57
00:02:37.910 --> 00:02:40.550
and actually one of the people who developed this tool
58
00:02:40.550 --> 00:02:42.017
came back on Twitter and said,
59
00:02:42.017 --> 00:02:45.417
"Well actually (laughs) it wasn't broken,
60
00:02:45.417 --> 00:02:46.467
"it was working correctly.
61
00:02:46.467 --> 00:02:48.637
"They just didn't have it configured.
62
00:02:48.637 --> 00:02:49.930
"They had an out of service."
63
00:02:49.930 --> 00:02:51.467
I said, "Okay, I didn't mean to say
64
00:02:51.467 --> 00:02:52.927
"that your product was broken,
65
00:02:52.927 --> 00:02:54.647
"but, you know, when it's not working,
66
00:02:54.647 --> 00:02:57.200
"it's just the term that I used."
67
00:02:57.200 --> 00:03:00.840
So, they had this SSL Inspector that would decrypt traffic,
68
00:03:00.840 --> 00:03:03.470
so that intrusion detection systems
69
00:03:03.470 --> 00:03:05.700
could look at that traffic and inspect it,
70
00:03:05.700 --> 00:03:08.960
and alert you on suspicious or anomalous stuff,
71
00:03:08.960 --> 00:03:10.240
and the cert expired,
72
00:03:10.240 --> 00:03:13.540
and they let it sit expired for something like 18 months,
73
00:03:13.540 --> 00:03:15.850
and once they got that fixed up,
74
00:03:15.850 --> 00:03:17.950
I think it actually needed several certificates
75
00:03:17.950 --> 00:03:21.360
to decrypt all the traffic it was inspecting,
76
00:03:21.360 --> 00:03:23.710
but the moment they hooked that back up,
77
00:03:23.710 --> 00:03:27.240
the alert started going off, the siren started blaring,
78
00:03:27.240 --> 00:03:30.280
and that's how they actually discovered the breach.
79
00:03:30.280 --> 00:03:32.060
So they had this investment in a tool
80
00:03:32.060 --> 00:03:33.672
that would have saved their bacon,
81
00:03:33.672 --> 00:03:36.090
if they'd been using the tool correctly.
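A minimal sketch of the kind of housekeeping that failed here, assuming a simple standalone monitor rather than anything Equifax actually ran: check how many days remain on the TLS certificates your inspection stack depends on, so an expired cert can't sit unnoticed for 18 months. The hostname below is a placeholder.

```python
# Hypothetical certificate-expiry monitor (illustration only, not Equifax's tooling).
import socket
import ssl
import time

def days_until_expiry(host: str, port: int = 443) -> int:
    """Fetch the server certificate for host:port and return days until it expires."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires_at = ssl.cert_time_to_seconds(cert["notAfter"])
    return int((expires_at - time.time()) // 86400)

if __name__ == "__main__":
    # Replace with the certificates your SSL inspection / IDS pipeline relies on.
    for host in ("example.com",):
        remaining = days_until_expiry(host)
        status = "OK" if remaining > 30 else "RENEW NOW"
        print(f"{host}: {remaining} days remaining [{status}]")
```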
82
00:03:36.090 --> 00:03:39.470
Yeah, and it's such a frustrating thing, 'cause it's like,
83
00:03:39.470 --> 00:03:42.280
well, you know, you did all the right things,
84
00:03:42.280 --> 00:03:44.960
you spent your budget on the right things clearly,
85
00:03:44.960 --> 00:03:47.400
'cause it did tell you about the breach,
86
00:03:47.400 --> 00:03:51.268
but it only works if you keep it operational,
87
00:03:51.268 --> 00:03:54.410
if you take care of it, if you maintain this stuff,
88
00:03:54.410 --> 00:03:56.670
and clearly that's where we've got some pain.
89
00:03:56.670 --> 00:03:59.180
There are also some issues with people
90
00:03:59.180 --> 00:04:01.790
knowing how to use the tools properly.
91
00:04:01.790 --> 00:04:04.870
One of the things that fascinated me about Equifax,
92
00:04:04.870 --> 00:04:08.170
is that in the media, like social media,
93
00:04:08.170 --> 00:04:12.920
you saw a lot of people just thinking Equifax was lazy.
94
00:04:12.920 --> 00:04:17.390
They were slow to respond, or they didn't bother to patch,
95
00:04:17.390 --> 00:04:19.520
but in fact if you read the documents,
96
00:04:19.520 --> 00:04:21.150
they were really sweating this out.
97
00:04:21.150 --> 00:04:23.430
When that Struts vulnerability came out,
98
00:04:23.430 --> 00:04:24.890
they were aware of it.
99
00:04:24.890 --> 00:04:26.847
They had some big meetings where they said,
100
00:04:26.847 --> 00:04:28.597
"Hey, this is a big deal, let's figure out
101
00:04:28.597 --> 00:04:31.217
"if we've got Struts, and if the versions
102
00:04:31.217 --> 00:04:33.450
"we have running are vulnerable."
103
00:04:33.450 --> 00:04:35.017
And they looked for it, and they looked for it,
104
00:04:35.017 --> 00:04:37.630
and they searched many different ways
105
00:04:37.630 --> 00:04:40.800
using many different tools, and it was there,
106
00:04:40.800 --> 00:04:43.540
and it was vulnerable, but they failed to find it.
107
00:04:43.540 --> 00:04:46.210
Part of the reason for that is the security team
108
00:04:46.210 --> 00:04:48.030
didn't really understand Struts,
109
00:04:48.030 --> 00:04:50.520
they didn't understand the right ways to look for it,
110
00:04:50.520 --> 00:04:53.710
and they didn't have any documentation of their own systems
111
00:04:53.710 --> 00:04:57.040
that they could search through and find Struts that way,
112
00:04:57.040 --> 00:04:58.600
so there's no bill of materials,
113
00:04:58.600 --> 00:05:02.010
no software bill of materials for their own products,
114
00:05:02.010 --> 00:05:04.960
for their own applications.
115
00:05:04.960 --> 00:05:07.320
What actually had Struts in it and got hacked
116
00:05:07.320 --> 00:05:09.380
was an older legacy system.
117
00:05:09.380 --> 00:05:11.990
All the people that knew how it worked had left,
118
00:05:11.990 --> 00:05:13.760
it sounds like, from the report,
119
00:05:13.760 --> 00:05:16.970
and nobody knew where Struts was in there.
120
00:05:16.970 --> 00:05:19.330
So at one point, they actually run a tool
121
00:05:19.330 --> 00:05:22.850
to look for Struts, and they're one directory above
122
00:05:22.850 --> 00:05:24.400
where Struts is sitting,
123
00:05:24.400 --> 00:05:26.910
and they don't use the recursive flag on the tool,
124
00:05:26.910 --> 00:05:28.100
so it misses it.
125
00:05:28.100 --> 00:05:31.220
Yeah, they're only scanning the current directory,
126
00:05:31.220 --> 00:05:34.560
and not the deeper directories from that one.
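As a minimal sketch of the fix for that specific miss, assuming a plain filesystem search rather than whichever scanner Equifax used: walk the directory tree recursively so a Struts jar sitting one level deeper than your starting point still gets found. The path handling and version check are illustrative.

```python
# Hypothetical recursive search for Apache Struts core jars (illustration only).
import os
import re
import sys

STRUTS_JAR = re.compile(r"struts2?-core-(\d+(?:\.\d+)+)\.jar$")

def find_struts(root: str):
    """Yield (path, version) for every Struts core jar anywhere under root."""
    for dirpath, _dirnames, filenames in os.walk(root):  # os.walk recurses by default
        for name in filenames:
            match = STRUTS_JAR.match(name)
            if match:
                yield os.path.join(dirpath, name), match.group(1)

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    hits = list(find_struts(root))
    if not hits:
        print(f"No Struts core jars found under {root} -- check other deployment roots too.")
    for path, version in hits:
        print(f"{path}: Struts {version} (compare against the CVE-2017-5638 advisory)")
```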
127
00:05:34.560 --> 00:05:37.500
You've talked about focusing more on the basics,
128
00:05:37.500 --> 00:05:38.790
but then there's the basics,
129
00:05:38.790 --> 00:05:40.600
and then there's the basic basics.
130
00:05:40.600 --> 00:05:44.070
So, what are some of the basic basics to take away here?
131
00:05:44.070 --> 00:05:45.220
For example, you're talking about
132
00:05:45.220 --> 00:05:47.080
asset discovery and management.
133
00:05:47.080 --> 00:05:48.320
So the problem with the basics
134
00:05:48.320 --> 00:05:51.920
is that they're not all that basic, they're tough.
135
00:05:51.920 --> 00:05:53.980
If you look at the critical security controls,
136
00:05:53.980 --> 00:05:57.510
there's a top 20 list of critical security controls,
137
00:05:57.510 --> 00:06:00.930
that's fairly useful as general guidance
138
00:06:00.930 --> 00:06:02.690
for building out your security program
139
00:06:02.690 --> 00:06:05.380
and kinda judging how mature your company is,
140
00:06:05.380 --> 00:06:08.370
in terms of cybersecurity, and the first two
141
00:06:08.370 --> 00:06:11.840
on there: critical security control one is asset inventory,
142
00:06:11.840 --> 00:06:14.060
and critical security control number two
143
00:06:14.060 --> 00:06:17.770
is software inventory, because you can't do a great job
144
00:06:17.770 --> 00:06:19.000
at securing your stuff,
145
00:06:19.000 --> 00:06:21.920
if you don't know what your stuff is comprised of,
146
00:06:21.920 --> 00:06:26.660
if you don't have a list of those assets and software,
147
00:06:26.660 --> 00:06:29.390
what's running on 'em, what versions are running on 'em,
148
00:06:29.390 --> 00:06:31.600
and building these inventories is tough.
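A minimal sketch of why those first two controls matter, using a made-up in-memory inventory rather than any real product: once assets and their software versions are recorded, "are we running a vulnerable Struts?" becomes a query instead of a hunt. The hostnames, owners, and versions below are hypothetical.

```python
# Hypothetical asset/software inventory query (CIS Controls 1 and 2), illustration only.
from dataclasses import dataclass, field

@dataclass
class Asset:
    hostname: str
    owner: str                                     # team accountable for patching
    software: dict = field(default_factory=dict)   # package name -> version

inventory = [
    Asset("legacy-portal-01", "consumer-apps", {"apache-struts": "2.3.30"}),
    Asset("api-gateway-03", "platform", {"nginx": "1.14.0"}),
]

def affected(assets, package: str, fixed_version: str):
    """Return assets running `package` at a version below `fixed_version`."""
    def as_tuple(v):
        return tuple(int(part) for part in v.split("."))
    return [a for a in assets
            if package in a.software
            and as_tuple(a.software[package]) < as_tuple(fixed_version)]

# Struts 2.3.32 was the fix for CVE-2017-5638 on the 2.3.x line.
for asset in affected(inventory, "apache-struts", "2.3.32"):
    print(f"{asset.hostname} (owner: {asset.owner}) runs Struts "
          f"{asset.software['apache-struts']} -- patch or mitigate now")
```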
149
00:06:31.600 --> 00:06:34.480
I think back to when this broke, and they had gotten in,
150
00:06:34.480 --> 00:06:36.803
and they put all these web shells in different systems,
151
00:06:36.803 --> 00:06:39.600
part of the research I did is I wondered,
152
00:06:39.600 --> 00:06:43.720
well how big is Equifax in terms of internet footprint,
153
00:06:43.720 --> 00:06:45.060
and attack surface?
154
00:06:45.060 --> 00:06:48.170
I looked it up, and found they have over 17,000
155
00:06:48.170 --> 00:06:50.320
external IP addresses that they own.
156
00:06:50.320 --> 00:06:53.200
Now that's not to say that they're actually running
157
00:06:53.200 --> 00:06:57.350
public external services on all 17,000 of them,
158
00:06:57.350 --> 00:07:01.520
but that's still a lot to inventory,
159
00:07:01.520 --> 00:07:04.020
to monitor, to take care of.
160
00:07:04.020 --> 00:07:05.650
It's not easy to do the basics,
161
00:07:05.650 --> 00:07:08.880
once you scale up to these very, very large environments,
162
00:07:08.880 --> 00:07:09.900
but that's the job.
163
00:07:09.900 --> 00:07:11.500
You gotta find some way to do it.
164
00:07:11.500 --> 00:07:13.320
These lessons,
165
00:07:13.320 --> 00:07:17.050
in terms of ensuring you have asset discovery,
166
00:07:17.050 --> 00:07:18.850
and knowing where all your systems are,
167
00:07:18.850 --> 00:07:20.360
I feel like they have been learned before,
168
00:07:20.360 --> 00:07:22.390
so have they been forgotten?
169
00:07:22.390 --> 00:07:24.740
Yeah, so I think part of the problem here
170
00:07:24.740 --> 00:07:27.280
is that security is this layer
171
00:07:27.280 --> 00:07:29.980
that you can throw on top of anything.
172
00:07:29.980 --> 00:07:34.980
What companies expect their CSOs to do can be so broad.
173
00:07:35.056 --> 00:07:38.310
If security or privacy comes up at all,
174
00:07:38.310 --> 00:07:40.290
they look down the table at the CSO,
175
00:07:40.290 --> 00:07:42.300
and that's a lot to take care of,
176
00:07:42.300 --> 00:07:45.800
'cause we're talking about everything from things on the technical side,
177
00:07:45.800 --> 00:07:48.180
to interfacing with legal,
178
00:07:48.180 --> 00:07:50.890
to dealing with PR-type stuff.
179
00:07:50.890 --> 00:07:54.270
It's just such a broad set of things,
180
00:07:54.270 --> 00:07:57.190
that coming in as a new CSO to an organization,
181
00:07:57.190 --> 00:07:59.070
and cleaning up this kind of mess,
182
00:07:59.070 --> 00:08:01.930
and trying to get a good security program in place,
183
00:08:01.930 --> 00:08:05.020
you're doing everything from training internal employees
184
00:08:05.020 --> 00:08:07.580
to make sure that they don't fall for phishing
185
00:08:07.580 --> 00:08:12.580
and BEC scams, Business Email Compromise scams,
186
00:08:12.690 --> 00:08:14.250
and then dealing with technical issues,
187
00:08:14.250 --> 00:08:17.580
like printers that are vulnerable,
188
00:08:17.580 --> 00:08:20.000
and websites that are vulnerable,
189
00:08:20.000 --> 00:08:22.410
and trying to work with the developers
190
00:08:22.410 --> 00:08:24.320
to write more secure code.
191
00:08:24.320 --> 00:08:27.280
So you're wearing so many hats and so many roles,
192
00:08:27.280 --> 00:08:30.790
that it's really easy to fall down in certain areas
193
00:08:30.790 --> 00:08:33.030
just because you don't have the staff
194
00:08:33.030 --> 00:08:36.390
or it's difficult to visualize everything,
195
00:08:36.390 --> 00:08:39.710
and to prioritize it, and say okay,
196
00:08:39.710 --> 00:08:42.050
what's the most likely way
197
00:08:42.050 --> 00:08:43.380
that we're gonna get hacked here,
198
00:08:43.380 --> 00:08:45.590
and is getting hacked even the worst thing
199
00:08:45.590 --> 00:08:46.840
that I'm worried about?
200
00:08:46.840 --> 00:08:49.610
It's a difficult problem to come into.
201
00:08:49.610 --> 00:08:52.860
Security is very broad, which is why we have frameworks
202
00:08:52.860 --> 00:08:56.150
like the top 20 critical security controls.
203
00:08:56.150 --> 00:08:58.560
Where we see people fall down a lot,
204
00:08:58.560 --> 00:09:02.240
is that they try to outsource that by purchasing a tool,
205
00:09:02.240 --> 00:09:05.710
or they try to outsource that to a third-party organization
206
00:09:05.710 --> 00:09:08.190
that manages that for them,
207
00:09:08.190 --> 00:09:10.783
but maybe doesn't do a great job of it.
208
00:09:12.630 --> 00:09:15.760
What we end up doing, or a lot of organizations
209
00:09:15.760 --> 00:09:19.380
end up doing, is a very mediocre job at a lot of things,
210
00:09:19.380 --> 00:09:22.090
instead of a very good job at the few things
211
00:09:22.090 --> 00:09:24.930
that really matter, and nobody agrees
212
00:09:24.930 --> 00:09:28.020
on what those things are that really matter,
213
00:09:28.020 --> 00:09:32.700
so it's still very young, cybersecurity is,
214
00:09:32.700 --> 00:09:36.470
maybe two and a half decades old, really.
215
00:09:36.470 --> 00:09:38.720
So, yeah, we're still trying to figure out
216
00:09:38.720 --> 00:09:40.060
what we're doing there.
217
00:09:40.060 --> 00:09:43.600
It's not professionalized, we have some of these frameworks,
218
00:09:43.600 --> 00:09:48.240
but they're best guesses at this point.
219
00:09:48.240 --> 00:09:50.220
So, that's why we study these breaches.
220
00:09:50.220 --> 00:09:51.240
That's why it's so important
221
00:09:51.240 --> 00:09:53.210
to get the details of these breaches,
222
00:09:53.210 --> 00:09:55.110
so we can come back and say,
223
00:09:55.110 --> 00:09:57.740
well, yeah, maybe that's not a best practice,
224
00:09:57.740 --> 00:09:59.670
or maybe we're missing a best practice,
225
00:09:59.670 --> 00:10:04.670
or we need to realign our priorities here.
226
00:10:04.740 --> 00:10:06.220
You've talked before about how there's
227
00:10:06.220 --> 00:10:08.710
really no National Highway Traffic Safety Administration
228
00:10:08.710 --> 00:10:10.860
for data breaches, for example,
229
00:10:10.860 --> 00:10:12.580
but wouldn't that be a good thing,
230
00:10:12.580 --> 00:10:15.210
if a government agency or some other organization
231
00:10:15.210 --> 00:10:17.830
was able to see the bigger picture,
232
00:10:17.830 --> 00:10:20.400
and communicate some of those lessons learned
233
00:10:20.400 --> 00:10:22.620
to organizations sector by sector,
234
00:10:22.620 --> 00:10:24.290
which we don't have now, do we?
235
00:10:24.290 --> 00:10:27.810
No, no we don't, and it's a huge problem,
236
00:10:27.810 --> 00:10:31.580
because we've got literally thousands of breaches
237
00:10:31.580 --> 00:10:35.516
occurring every year, and we know almost nothing about them.
238
00:10:35.516 --> 00:10:39.290
The information that's released to the media
239
00:10:39.290 --> 00:10:42.510
and to the public is just the bare details
240
00:10:42.510 --> 00:10:44.970
that they're required to release.
241
00:10:44.970 --> 00:10:49.590
There's this many customer accounts affected.
242
00:10:49.590 --> 00:10:51.393
Here's the data that was lost.
243
00:10:52.830 --> 00:10:57.140
So, we get details about what was leaked,
244
00:10:57.140 --> 00:10:59.950
what was compromised, whatever that data might be,
245
00:10:59.950 --> 00:11:03.830
or whatever damages were done, if it were ransomware,
246
00:11:03.830 --> 00:11:06.603
or somebody just being destructive.
247
00:11:07.600 --> 00:11:09.160
But the details we don't get
248
00:11:09.160 --> 00:11:11.253
are the ones that we need to learn from.
249
00:11:12.530 --> 00:11:14.480
Learning from other people's mistakes
250
00:11:14.480 --> 00:11:16.750
is generally preferable to having
251
00:11:16.750 --> 00:11:20.373
to make those same mistakes, and learn from them yourself.
252
00:11:21.950 --> 00:11:25.630
Until we require some of those details to be released,
253
00:11:25.630 --> 00:11:30.380
like the organizations that study automobile crashes,
254
00:11:30.380 --> 00:11:33.373
and why they happen, or ship collisions and why they happen,
255
00:11:34.725 --> 00:11:39.725
we're not going to be like those other industries that are able to learn
256
00:11:40.360 --> 00:11:42.890
from each of these, and put new rules into place
257
00:11:42.890 --> 00:11:44.960
to prevent them from happening in the future.
258
00:11:44.960 --> 00:11:46.153
That's just not happening here.
259
00:11:46.153 --> 00:11:48.680
We're seeing the same mistakes,
260
00:11:48.680 --> 00:11:51.833
if you go back and study a breach from 20 years ago,
261
00:11:53.370 --> 00:11:54.690
there are some minor differences,
262
00:11:54.690 --> 00:11:59.690
but it can be like reading the same exact report
263
00:11:59.890 --> 00:12:03.840
over two decades of time, and technology has changed a lot,
264
00:12:03.840 --> 00:12:06.793
but the attacks don't really have to change that much.
265
00:12:07.790 --> 00:12:10.780
We were seeing extortion and those kinds of approaches
266
00:12:10.780 --> 00:12:13.580
in the '90s, and we're still seeing it now,
267
00:12:13.580 --> 00:12:16.660
where somebody takes over your company's resources,
268
00:12:16.660 --> 00:12:20.900
and wants some money to give 'em back to you,
269
00:12:20.900 --> 00:12:21.960
stuff like that.
270
00:12:21.960 --> 00:12:24.590
So obviously, digging into these reports,
271
00:12:24.590 --> 00:12:26.580
learning from them, this is the way forward
272
00:12:26.580 --> 00:12:29.460
if we don't want to be having this same discussion
273
00:12:29.460 --> 00:12:31.610
in another 20 years.
274
00:12:31.610 --> 00:12:35.470
Yeah, and the positive note is those Equifax details
275
00:12:35.470 --> 00:12:36.950
are out there, and this is something
276
00:12:36.950 --> 00:12:38.400
I want to spend more time on.
277
00:12:39.630 --> 00:12:41.310
I'm planning a blog series
278
00:12:41.310 --> 00:12:45.003
that's just post-mortems on breaches,
279
00:12:46.060 --> 00:12:48.250
where we do have this level of detail,
280
00:12:48.250 --> 00:12:50.600
and we need somebody pushing for
281
00:12:51.520 --> 00:12:53.320
getting this level of detail more often,
282
00:12:53.320 --> 00:12:58.320
without a federal investigation, or class action lawsuit,
283
00:12:58.730 --> 00:13:00.603
being necessary to bring it out,
284
00:13:00.603 --> 00:13:03.940
even if it's not shared with the general public.
285
00:13:03.940 --> 00:13:07.610
Certainly, as defenders, we need to know
286
00:13:07.610 --> 00:13:10.353
how these attacks happen, so that we can prevent them.
287
00:13:11.870 --> 00:13:14.210
The positive note is, some of those details are out there,
288
00:13:14.210 --> 00:13:16.540
and if you haven't dug into them,
289
00:13:16.540 --> 00:13:18.580
you definitely need somebody on your team,
290
00:13:18.580 --> 00:13:21.766
somebody in your business that understands
291
00:13:21.766 --> 00:13:24.380
at a pretty deep level, how Equifax happened,
292
00:13:24.380 --> 00:13:25.750
how Target happened,
293
00:13:25.750 --> 00:13:28.180
how some of these different breaches occurred.
294
00:13:28.180 --> 00:13:30.900
Fantastic, well Adrian, thank you so much
295
00:13:30.900 --> 00:13:32.690
for your time and insights today.
296
00:13:32.690 --> 00:13:33.930
Yeah, my pleasure.
297
00:13:33.930 --> 00:13:36.450
I've been speaking with Adrian Sanabria,
298
00:13:36.450 --> 00:13:38.340
a security professional, who's had a good hard look
299
00:13:38.340 --> 00:13:39.970
at what Equifax did wrong,
300
00:13:39.970 --> 00:13:41.570
and what others should learn from that.
301
00:13:41.570 --> 00:13:44.920
I'm Mathew Schwartz with Information Security Media Group.
302
00:13:44.920 --> 00:13:46.953
Thank you very much for joining us.