1
00:00:00,000 --> 00:00:02,070
Instructor: When we talk about risk, we have to think
2
00:00:02,070 --> 00:00:06,000
about what we can do about risk as an organization.
3
00:00:06,000 --> 00:00:08,130
After you conclude your penetration test,
4
00:00:08,130 --> 00:00:10,440
one of the biggest deliverables you're gonna have
5
00:00:10,440 --> 00:00:12,900
is a final report to your client's organization
6
00:00:12,900 --> 00:00:15,150
that lists all the vulnerabilities you found,
7
00:00:15,150 --> 00:00:16,440
how they were exploited,
8
00:00:16,440 --> 00:00:18,630
and what controls the organization can add
9
00:00:18,630 --> 00:00:21,930
as mitigations to minimize their risk exposure.
10
00:00:21,930 --> 00:00:23,910
In every risk management program
11
00:00:23,910 --> 00:00:27,000
there are essentially only four things you can do with risk.
12
00:00:27,000 --> 00:00:30,300
You can avoid it, you can transfer it, you can mitigate it,
13
00:00:30,300 --> 00:00:31,920
and you can accept it.
14
00:00:31,920 --> 00:00:34,650
Risk avoidance is a strategy that involves stopping
15
00:00:34,650 --> 00:00:38,460
a risky activity or choosing a less risky alternative.
16
00:00:38,460 --> 00:00:41,430
But how does this apply to our IT networks?
17
00:00:41,430 --> 00:00:43,590
Well, let's assume that we have a network
18
00:00:43,590 --> 00:00:45,870
currently comprised of a hundred computers,
19
00:00:45,870 --> 00:00:48,660
but 15 of those are running Windows 7.
20
00:00:48,660 --> 00:00:50,430
If you know anything about end of life
21
00:00:50,430 --> 00:00:52,620
and unsupported software, you already know
22
00:00:52,620 --> 00:00:55,590
that Windows 7 stopped receiving official support
23
00:00:55,590 --> 00:00:57,930
back in January of 2020.
24
00:00:57,930 --> 00:01:00,240
To avoid the risk of running unsupported software
25
00:01:00,240 --> 00:01:03,600
like Windows 7, we really only have two choices.
26
00:01:03,600 --> 00:01:05,610
We could take those computers offline,
27
00:01:05,610 --> 00:01:08,010
meaning we stop that risky activity,
28
00:01:08,010 --> 00:01:09,810
or we can upgrade those computers
29
00:01:09,810 --> 00:01:11,610
to Windows 10 which is a newer
30
00:01:11,610 --> 00:01:13,560
and still supported operating system
31
00:01:13,560 --> 00:01:16,560
and is thereby a less risky alternative.
32
00:01:16,560 --> 00:01:18,690
Now, when we talk about risk avoidance
33
00:01:18,690 --> 00:01:21,210
we're basically eliminating the hazards, activities
34
00:01:21,210 --> 00:01:24,240
and exposures that could negatively affect us.
35
00:01:24,240 --> 00:01:26,310
For example, in my own company,
36
00:01:26,310 --> 00:01:28,080
we ended up choosing risk avoidance
37
00:01:28,080 --> 00:01:31,050
for our CompTIA Exam Voucher Sales Program.
38
00:01:31,050 --> 00:01:33,360
We saw a sharp increase of fraud from people
39
00:01:33,360 --> 00:01:36,420
buying our exam vouchers using stolen credit cards.
40
00:01:36,420 --> 00:01:39,690
In just one month, one out of every two exam vouchers
41
00:01:39,690 --> 00:01:42,180
that we sold ended up being disputed as fraud
42
00:01:42,180 --> 00:01:44,250
by the victim's credit card companies.
43
00:01:44,250 --> 00:01:46,230
This means that my company was out,
44
00:01:46,230 --> 00:01:47,670
not just the purchase price,
45
00:01:47,670 --> 00:01:49,830
but also the exam voucher itself
46
00:01:49,830 --> 00:01:52,200
because we had already issued that to our buyer.
47
00:01:52,200 --> 00:01:54,600
Because of this, we made the business decision
48
00:01:54,600 --> 00:01:56,160
to stop selling exam vouchers
49
00:01:56,160 --> 00:01:57,780
in certain countries around the world
50
00:01:57,780 --> 00:02:00,060
because the rate of fraud was simply too high
51
00:02:00,060 --> 00:02:02,250
and we wanted to avoid that risk.
52
00:02:02,250 --> 00:02:05,310
The second thing we can do with risk is we can transfer it.
53
00:02:05,310 --> 00:02:08,070
Now, risk transfer is a strategy that passes the risk
54
00:02:08,070 --> 00:02:09,570
over to a third party,
55
00:02:09,570 --> 00:02:11,250
and most commonly, this is done
56
00:02:11,250 --> 00:02:13,350
by giving it to an insurance company.
57
00:02:13,350 --> 00:02:14,820
Now, a good example of this is
58
00:02:14,820 --> 00:02:16,740
if our organization was worried about the risk
59
00:02:16,740 --> 00:02:19,050
of our server room being destroyed by a flood,
60
00:02:19,050 --> 00:02:19,883
we could go out
61
00:02:19,883 --> 00:02:21,900
and purchase insurance to transfer the risk
62
00:02:21,900 --> 00:02:25,620
of losing all those assets over to a third-party insurer.
63
00:02:25,620 --> 00:02:27,630
Then if a flood did occur,
64
00:02:27,630 --> 00:02:29,400
they're gonna write us a really large check
65
00:02:29,400 --> 00:02:31,200
so we can replace all of our equipment
66
00:02:31,200 --> 00:02:33,000
and pay for a data recovery team
67
00:02:33,000 --> 00:02:35,580
to help restore our data and services.
68
00:02:35,580 --> 00:02:38,820
The third thing we can do with risk is we can mitigate it.
69
00:02:38,820 --> 00:02:40,620
Now, this is probably the most popular thing
70
00:02:40,620 --> 00:02:43,290
that you're gonna see done with risk in the real world.
71
00:02:43,290 --> 00:02:45,900
Risk mitigation is simply a strategy that seeks
72
00:02:45,900 --> 00:02:48,240
to minimize the risk to an acceptable level
73
00:02:48,240 --> 00:02:50,670
which an organization can then accept.
74
00:02:50,670 --> 00:02:52,230
Now, for example, let's say,
75
00:02:52,230 --> 00:02:54,390
we identified a running server
76
00:02:54,390 --> 00:02:57,120
that has five critical, two high,
77
00:02:57,120 --> 00:03:00,240
four medium, and 17 low vulnerabilities.
78
00:03:00,240 --> 00:03:03,270
Our risk management program might have a policy that states
79
00:03:03,270 --> 00:03:05,490
that any server with a critical vulnerability
80
00:03:05,490 --> 00:03:07,560
should be taken offline immediately.
81
00:03:07,560 --> 00:03:10,620
But, if we can patch those five critical vulnerabilities,
82
00:03:10,620 --> 00:03:12,900
they might be willing to accept the residual risk
83
00:03:12,900 --> 00:03:15,480
from the high, medium and low vulnerabilities
84
00:03:15,480 --> 00:03:18,210
because the overall risk was mitigated downwards
85
00:03:18,210 --> 00:03:20,580
by adding those extra controls and patches
86
00:03:20,580 --> 00:03:22,530
and bringing that overall risk level down
87
00:03:22,530 --> 00:03:24,360
to an acceptable level.
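The patch-then-accept policy just described can be sketched in code. This is a minimal illustration of the decision rule from the example, not any real scanner's or tool's API; the function name and severity counts are assumptions for demonstration.

```python
# Sketch of the risk-mitigation policy described above: a server with any
# critical vulnerabilities must be acted on immediately; once the criticals
# are patched, the residual risk from lower-severity findings may be
# accepted. All names here are illustrative, not from a real tool.

def handling_decision(vulns: dict) -> str:
    """Return a risk-handling action for a server's vulnerability counts."""
    if vulns.get("critical", 0) > 0:
        # Policy: any critical finding means the server can't stay as-is.
        return "take offline or patch criticals immediately"
    # With no criticals remaining, the residual risk may fall within tolerance.
    return "accept residual risk from high/medium/low findings"

# The server from the example: five critical, two high, four medium, 17 low.
server = {"critical": 5, "high": 2, "medium": 4, "low": 17}
print(handling_decision(server))   # criticals present -> act immediately

server["critical"] = 0             # after patching the five criticals
print(handling_decision(server))   # residual risk can now be accepted
```

The point of the sketch is that mitigation doesn't drive risk to zero; it drives it below a policy threshold, after which the organization accepts what remains.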
88
00:03:24,360 --> 00:03:26,580
Let's go back to my earlier example of the fraud
89
00:03:26,580 --> 00:03:28,440
that occurred with our exam vouchers.
90
00:03:28,440 --> 00:03:30,900
There were some countries where we experienced fraud
91
00:03:30,900 --> 00:03:32,790
but it was at a lower level.
92
00:03:32,790 --> 00:03:35,400
For example, our United States voucher program
93
00:03:35,400 --> 00:03:38,100
actually experiences fraud on a regular basis,
94
00:03:38,100 --> 00:03:40,560
but we've put in place certain mitigations
95
00:03:40,560 --> 00:03:42,630
to block some of that fraud from occurring,
96
00:03:42,630 --> 00:03:43,920
and we've been able to get it down
97
00:03:43,920 --> 00:03:45,570
to a low enough level of fraud
98
00:03:45,570 --> 00:03:48,000
that we're able to continue offering the exam vouchers
99
00:03:48,000 --> 00:03:49,500
without losing money.
100
00:03:49,500 --> 00:03:51,900
Therefore, we made a business risk decision
101
00:03:51,900 --> 00:03:53,700
to accept that residual risk level
102
00:03:53,700 --> 00:03:56,850
after applying our risk mitigations for now.
103
00:03:56,850 --> 00:03:59,940
The final thing we can do with risk is we can accept it.
104
00:03:59,940 --> 00:04:01,920
Risk acceptance is the strategy that seeks
105
00:04:01,920 --> 00:04:03,447
to accept the current level of risk
106
00:04:03,447 --> 00:04:05,190
and the cost associated with it
107
00:04:05,190 --> 00:04:07,830
if those risks were actually realized.
108
00:04:07,830 --> 00:04:09,900
Generally this is the proper strategy
109
00:04:09,900 --> 00:04:11,880
if the asset is very low-cost
110
00:04:11,880 --> 00:04:15,540
or the impact on the organization would be very low.
111
00:04:15,540 --> 00:04:18,060
For example, we may choose to transfer the risk
112
00:04:18,060 --> 00:04:20,610
of a server being damaged by purchasing insurance
113
00:04:20,610 --> 00:04:23,610
because the server costs $10,000 or more to replace,
114
00:04:23,610 --> 00:04:27,150
but for a laptop, we might just accept that risk
115
00:04:27,150 --> 00:04:30,780
because a laptop can be replaced for $300 to $500.
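The cost-based reasoning here (insure the $10,000 server, accept the loss of a $300-to-$500 laptop) can be sketched as a simple threshold rule. The threshold value below is an assumed figure for illustration only; real organizations set this based on their own risk appetite.

```python
# Illustrative sketch only: choose between transferring risk (insurance)
# and accepting it, based on replacement cost. The $1,000 threshold is
# an assumed value for demonstration, not a standard.

INSURE_THRESHOLD = 1_000  # assumption: insure assets costing more than this

def transfer_or_accept(replacement_cost: float) -> str:
    """Pick a risk-handling action based on what the asset costs to replace."""
    if replacement_cost >= INSURE_THRESHOLD:
        return "transfer (purchase insurance)"
    return "accept (replace out of pocket if lost)"

print(transfer_or_accept(10_000))  # the server -> transfer
print(transfer_or_accept(400))     # the laptop -> accept
```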
116
00:04:30,780 --> 00:04:33,030
So how does an organization determine
117
00:04:33,030 --> 00:04:35,730
which risk handling action they need to take,
118
00:04:35,730 --> 00:04:38,370
or, when they've applied enough mitigations
119
00:04:38,370 --> 00:04:40,590
to accept the residual risk?
120
00:04:40,590 --> 00:04:42,660
Well, as in most things in life,
121
00:04:42,660 --> 00:04:45,990
it's all about how much risk you're willing to accept.
122
00:04:45,990 --> 00:04:47,760
Every organization is unique
123
00:04:47,760 --> 00:04:48,960
and they have a different level
124
00:04:48,960 --> 00:04:50,700
that they're willing to accept.
125
00:04:50,700 --> 00:04:52,290
There's actually a term for this.
126
00:04:52,290 --> 00:04:55,800
It's known as the organization's risk appetite.
127
00:04:55,800 --> 00:04:58,020
Now, risk appetite is the amount of risk
128
00:04:58,020 --> 00:05:00,210
that an organization is willing to accept
129
00:05:00,210 --> 00:05:01,710
in pursuit of its objectives
130
00:05:01,710 --> 00:05:05,760
before action is deemed necessary to reduce the risk level.
131
00:05:05,760 --> 00:05:07,680
Often the term, risk appetite,
132
00:05:07,680 --> 00:05:10,800
is also called risk attitude or risk tolerance.
133
00:05:10,800 --> 00:05:13,230
For the exam, you may see these three terms
134
00:05:13,230 --> 00:05:14,790
being used interchangeably,
135
00:05:14,790 --> 00:05:17,970
and you'll also find this occurs in the real world as well.
136
00:05:17,970 --> 00:05:20,220
Some organizations though, make a distinction
137
00:05:20,220 --> 00:05:22,890
between risk appetite and risk tolerance.
138
00:05:22,890 --> 00:05:24,750
When they talk of risk appetite
139
00:05:24,750 --> 00:05:26,880
they refer to the overall generic level
140
00:05:26,880 --> 00:05:30,030
of risk to the organization that they're willing to accept.
141
00:05:30,030 --> 00:05:30,943
Conversely, when they talk
142
00:05:30,943 --> 00:05:33,690
about risk tolerance in these organizations,
143
00:05:33,690 --> 00:05:36,360
they're gonna be referring to a specific maximum risk
144
00:05:36,360 --> 00:05:37,920
that the organization is willing to take
145
00:05:37,920 --> 00:05:40,980
in regards to a specific identified risk.
146
00:05:40,980 --> 00:05:43,590
So if you're thinking about your organization,
147
00:05:43,590 --> 00:05:46,470
remember they're gonna have an overall risk appetite
148
00:05:46,470 --> 00:05:48,180
associated with the organization
149
00:05:48,180 --> 00:05:50,520
that is set by your higher level decision makers
150
00:05:50,520 --> 00:05:51,900
like the C-suite,
151
00:05:51,900 --> 00:05:52,733
and then you're gonna find
152
00:05:52,733 --> 00:05:54,840
that there are different levels of risk tolerance
153
00:05:54,840 --> 00:05:57,420
within certain product lines, certain departments
154
00:05:57,420 --> 00:06:00,780
or even certain servers within a department or division.
155
00:06:00,780 --> 00:06:01,980
This risk appetite
156
00:06:01,980 --> 00:06:05,130
and risk tolerance are going to affect the decisions you make
157
00:06:05,130 --> 00:06:07,890
in regards to risk avoidance, risk transfer,
158
00:06:07,890 --> 00:06:10,710
risk mitigation, and risk acceptance.
159
00:06:10,710 --> 00:06:13,080
Now, as you decide which of these four to choose from
160
00:06:13,080 --> 00:06:15,090
in your risk handling, remember,
161
00:06:15,090 --> 00:06:17,910
there are always gonna be trade-offs that have to be made.
162
00:06:17,910 --> 00:06:19,410
If you add more security,
163
00:06:19,410 --> 00:06:21,480
you're adding more cost to the project.
164
00:06:21,480 --> 00:06:23,490
Also, if you add more security
165
00:06:23,490 --> 00:06:26,610
you're often reducing the usability of that system.
166
00:06:26,610 --> 00:06:28,980
This is the never ending trade-off that occurs,
167
00:06:28,980 --> 00:06:32,160
the war between usability and security.
168
00:06:32,160 --> 00:06:34,710
For example, I worked for one organization
169
00:06:34,710 --> 00:06:37,200
that required users to access their work email
170
00:06:37,200 --> 00:06:39,090
only from a dedicated smartphone
171
00:06:39,090 --> 00:06:40,920
that the organization issued them,
172
00:06:40,920 --> 00:06:43,230
but that was gonna be a huge expense
173
00:06:43,230 --> 00:06:45,120
so not everybody could access their email
174
00:06:45,120 --> 00:06:47,370
because not everybody got a smartphone.
175
00:06:47,370 --> 00:06:50,280
Instead, they chose five to 10% of the people
176
00:06:50,280 --> 00:06:51,390
to be given a smartphone,
177
00:06:51,390 --> 00:06:53,550
and those people could access their email
178
00:06:53,550 --> 00:06:55,440
using that dedicated device.
179
00:06:55,440 --> 00:06:57,360
Now, this was a security decision,
180
00:06:57,360 --> 00:07:00,990
but its consequence was that the service became less usable
181
00:07:00,990 --> 00:07:03,750
because 90 to 95% of the employees
182
00:07:03,750 --> 00:07:06,420
couldn't access their email after working hours.
183
00:07:06,420 --> 00:07:08,760
So who do you think had the smartphones?
184
00:07:08,760 --> 00:07:11,910
Well, mostly it was the managers and the bosses.
185
00:07:11,910 --> 00:07:14,730
So the bosses would be frantically sending out emails
186
00:07:14,730 --> 00:07:16,980
at 8:00 PM when something went wrong,
187
00:07:16,980 --> 00:07:18,000
and then they were surprised
188
00:07:18,000 --> 00:07:19,530
when the workers didn't respond to them
189
00:07:19,530 --> 00:07:21,330
until 8:00 AM the next day.
190
00:07:21,330 --> 00:07:22,230
They would say things like,
191
00:07:22,230 --> 00:07:24,360
but I sent you this email and I marked it as urgent.
192
00:07:24,360 --> 00:07:25,920
Why didn't you respond?
193
00:07:25,920 --> 00:07:28,800
Well, because we made that system so secure,
194
00:07:28,800 --> 00:07:31,530
it made it so the only way people could access their email
195
00:07:31,530 --> 00:07:34,350
was by driving into work and checking it from their desktop,
196
00:07:34,350 --> 00:07:35,430
and they weren't gonna be doing that
197
00:07:35,430 --> 00:07:36,360
at eight o'clock at night
198
00:07:36,360 --> 00:07:38,370
if they didn't know there was a problem.
199
00:07:38,370 --> 00:07:41,280
This is the idea of usability versus security.
200
00:07:41,280 --> 00:07:42,810
You always have to keep this in mind
201
00:07:42,810 --> 00:07:44,910
when you're making your security recommendations
202
00:07:44,910 --> 00:07:46,590
at the end of a penetration test,
203
00:07:46,590 --> 00:07:48,780
because just because something is more secure,
204
00:07:48,780 --> 00:07:51,630
doesn't mean it's the right answer for that organization.