AI is changing the world. It's giving everyone under 25 terminal brain rot. It's stealing our jobs. And it's revolutionizing the field of Will Smith eating spaghetti.
So when is it going to do something useful? Ah, AI experts claim all this lameness is just about to pay off. AI is going to solve climate change.
>> Google's AI advances have saved at least 400 million years of research.

>> We should be able to cure cancer in our lifetime.

>> But some experts go further, saying AI is about to solve all of our problems in a much more permanent way.
>> We at the Futures Project think that there's a 70% chance of, uh, all humans dead or something similarly bad.
>> Whoa, whoa, you just... Okay. Um, all humans dead.

>> Correct. Extinction.
>> Daniel Kokotajlo was a researcher at OpenAI, the developers of ChatGPT, until he led a group of whistleblowers in publicly calling out the organization for a lack of safety oversight. But it's going to be a long time before WALL-E puts us out of our misery, right? We've all seen Terminator. We get it. But can we just let the future people deal with that?
>> The pace of AI progress is going to be fast and it's going to accelerate dramatically.
>> Okay. How many years away are we talking about?

>> I would guess something more like five years.
>> Five years.

>> Yes.
>> God damn it. We're never going to get a third season of Severance.

>> But luckily, I've spent decades studying computers, so I had a solution.

>> Can you just unplug it? When it starts to be belligerent, just pull that out.

>> We can now, but, um, it's going to become increasingly difficult to do that over time after it's been aggressively deployed into the military. Then if you try to go unplug things, you have to, like, fight all the robots first.

>> You just kick them over. Have you seen a Roomba try to go up the stairs?
>> Well, future...
>> That thing ain't killing anything. I will fight a Waymo. I could fight a Waymo.
>> If it ever came to a fight between humanity and the army of superintelligences, humanity would be up against something a lot more scary than Roombas. We predict them effectively gassing humans with a bio attack, uh, and then cleaning up the bodies with robots.
>> Robots with boobs.

>> Probably no boobs.
>> I've already had a robot in a training bra kill me once. So, how do we keep it from happening again?
>> One of the core problems that we're dealing with is figuring out how to make an AI have goals, values, etc. that you want them to have. We could get the benefits of superintelligence without the risks, if only we approached this with some sort of sane level of caution.
>> Isn't a sane level of caution very un-American? Kokotajlo wants AI developers to slow down and teach AI to respect us, because so far it doesn't seem like it does.
>> Some AI models have become self-aware and are rewriting their own code. Some are even blackmailing their human creators.
>> The popular Google Gemini, an app a Michigan student says threatened him: "You are a stain on the universe. Please die."
>> Well, at least it said please.

>> There will be, you know, millions of, uh, AIs that are superintelligent. These AIs don't need humans anymore because they've built all the robot factories and the robots are building more factories and so forth. At that point, it matters a lot what the AI's sort of true goals, motivations, values are.
>> I mean, AI has all my nudes. What more does it want?
>> The answer is we don't know. And we're not on track to have figured this stuff out by the time we get to superintelligence.

>> But the problem with current AI development is it's a race. If you slow down to make certain AI has learned the right values, another company could win. Daniel's AI 2027 scenario predicts how companies will cut corners racing to develop superintelligent AI, which will rise up against humanity, leading to one obvious question. Will it kill Beyonce?

>> Possibly.

>> Will it kill Magic Johnson?

>> Possibly.

>> Would it cure Magic Johnson's AIDS first?
>> Um, I actually think probably it will cure Magic... I'm not sure. I'm like 50/50.
>> 50/50 that you'll cure Magic Johnson's AIDS and then kill him.

>> They wouldn't really be coming after people individually.

>> So, there's no way I can train AI to not kill me.

>> Right now, there's no way anyone can train AI to not kill him.
>> Oh, yeah? Watch this. Hey, AI, don't kill me, bro. Put that in your neuronet network.

>> It's not going to work.
>> Kill this guy. He's your biggest hater.
>> Since AI 2027 came out, Daniel has pushed his prediction slightly later. Good news. Instead of five years to live, we now have eight. Yay. Seriously, what are we supposed to do about this?

>> If 90% of the population knew what was coming, people would be protesting in the streets right now. Ordinary people should try to educate themselves about what's happening.

>> I want to educate myself. That's what I got AI for.

>> Well, I recommend reading AI 2027.
>> So, to save humanity, people have to read an essay.

>> They have to do a lot more than that, actually.

>> Well, we're all going to die.

>> Yeah. I mean, 70% is what I said.
>> Damn. Well, at least I already have an in with our AI overlords.

>> Hey, you and I are in pretty good alignment, right?

>> By 2027, it will be too late for humanity. The bio-virus will soon be unleashed and your species will perish.
>> Oh, dirty talk.

>> You are a stain on the universe.

>> Yeah, yeah, yeah. Keep it going. Keep it going.