2
00:00:21,688 --> 00:00:23,394
[Somber music]
3
00:00:46,463 --> 00:00:47,999
A match like no other
4
00:00:48,131 --> 00:00:50,497
is about to get
underway in South Korea.
5
00:00:50,592 --> 00:00:52,820
Lee Sedol, the
long-reigning global champ...
6
00:00:52,844 --> 00:00:54,155
This guy is a genius.
7
00:00:54,179 --> 00:00:55,406
Will take on artificial
8
00:00:55,430 --> 00:00:57,466
intelligence program, AlphaGo.
9
00:01:02,020 --> 00:01:03,635
Go is the most complex game
10
00:01:03,730 --> 00:01:05,561
pretty much ever
devised by man.
11
00:01:05,649 --> 00:01:07,264
Compared to say, chess,
12
00:01:07,359 --> 00:01:09,816
the number of possible
configurations of the board
13
00:01:09,903 --> 00:01:12,235
is more than the number
of atoms in the universe.
14
00:01:43,937 --> 00:01:46,165
People have thought
that it was decades away.
15
00:01:46,189 --> 00:01:47,792
Some people thought
that it would be never
16
00:01:47,816 --> 00:01:50,774
because they felt
that to succeed at Go,
17
00:01:50,861 --> 00:01:53,193
you needed human intuition.
18
00:02:10,088 --> 00:02:12,454
[Somber music]
19
00:02:21,391 --> 00:02:23,151
Oh, look at his face.
Look at his face.
20
00:02:23,769 --> 00:02:27,512
That is not a confident face.
He's pretty horrified by that.
21
00:03:11,024 --> 00:03:13,044
In the battle
between man versus machine,
22
00:03:13,068 --> 00:03:14,604
a computer just
came out the victor.
23
00:03:14,695 --> 00:03:16,651
DeepMind
put its computer program
24
00:03:16,738 --> 00:03:18,478
to the test against one
of the brightest
25
00:03:18,573 --> 00:03:20,655
minds in the world and won.
26
00:03:20,742 --> 00:03:22,720
The victory is
considered a breakthrough
27
00:03:22,744 --> 00:03:24,075
in artificial intelligence.
28
00:03:56,111 --> 00:03:58,818
[Somber music]
29
00:04:13,503 --> 00:04:15,022
If you imagine what
it would've been like to be
30
00:04:15,046 --> 00:04:18,755
in the 1700s, and go
in a time machine to today.
31
00:04:21,762 --> 00:04:23,548
So, a time before
the power was on,
32
00:04:24,264 --> 00:04:26,846
before you had cars or airplanes
or phones or anything like that,
33
00:04:26,933 --> 00:04:28,493
and you came here,
how shocked you'd be?
34
00:04:28,852 --> 00:04:30,183
I think that level of change
35
00:04:30,270 --> 00:04:32,352
is going to happen
in our lifetime.
36
00:04:44,826 --> 00:04:47,659
We've never experienced
having a smarter species
37
00:04:47,746 --> 00:04:49,452
on the planet
or a smarter anything,
38
00:04:51,124 --> 00:04:52,489
but that's what we're building.
39
00:04:56,713 --> 00:04:59,705
Artificial intelligence is just
going to infiltrate everything
40
00:04:59,800 --> 00:05:02,917
in a way that is bigger than when the
Internet infiltrated everything.
41
00:05:04,262 --> 00:05:06,628
It's bigger than when
the industrial revolution
42
00:05:06,723 --> 00:05:07,723
changed everything.
43
00:05:10,477 --> 00:05:12,763
We're in a boat and
AI is a new kind of engine
44
00:05:12,854 --> 00:05:14,810
that's going
to catapult the boat forward.
45
00:05:15,273 --> 00:05:17,355
And the question is,
"what direction is it going in?"
46
00:05:20,695 --> 00:05:23,095
With something that big, it's
going to make such a big impact.
47
00:05:23,156 --> 00:05:25,147
It's going to be
either dramatically great,
48
00:05:25,242 --> 00:05:26,448
or dramatically terrible.
49
00:05:26,535 --> 00:05:29,277
Uh, it's, it's...
The stakes are quite high.
50
00:05:56,147 --> 00:05:58,012
The friendship
that I had with Roman
51
00:05:58,108 --> 00:05:59,564
was very, very special.
52
00:06:00,318 --> 00:06:01,712
Our friendship was
a little bit different
53
00:06:01,736 --> 00:06:04,398
from every friendship
that I had ever since.
54
00:06:07,701 --> 00:06:08,781
I always looked up to him,
55
00:06:08,869 --> 00:06:10,469
not just because
we were startup founders
56
00:06:10,537 --> 00:06:12,152
and we could understand
each other well,
57
00:06:12,247 --> 00:06:14,613
but also because
he'd never stopped dreaming,
58
00:06:14,708 --> 00:06:16,414
really not a single day.
59
00:06:17,460 --> 00:06:18,688
And no matter
how depressed he was,
60
00:06:18,712 --> 00:06:20,498
he was always believing that,
61
00:06:20,589 --> 00:06:22,329
you know,
there's a big future ahead.
62
00:06:25,719 --> 00:06:27,926
So, we went to Moscow
to get our visas.
63
00:06:28,305 --> 00:06:29,866
Roman had gone
with his friends and then,
64
00:06:29,890 --> 00:06:31,926
they were crossing
the street on a zebra,
65
00:06:32,017 --> 00:06:34,429
and then a Jeep just
came out of nowhere,
66
00:06:34,895 --> 00:06:39,639
crazy speed and just
ran over him, so, um...
67
00:06:39,733 --> 00:06:42,145
[Somber music]
68
00:06:49,159 --> 00:06:51,719
It was literally the
first death that I had in my life,
69
00:06:51,745 --> 00:06:53,385
I've never experienced
anything like that,
70
00:06:53,455 --> 00:06:55,295
and you just couldn't wrap
your head around it.
71
00:07:02,255 --> 00:07:03,745
For the first couple months,
72
00:07:03,840 --> 00:07:05,796
I was just trying
to work on the company.
73
00:07:05,884 --> 00:07:08,375
We were, at that point,
building different bots
74
00:07:08,470 --> 00:07:11,132
and nothing that we were
building was working out.
75
00:07:12,140 --> 00:07:13,346
And then a few months later,
76
00:07:13,433 --> 00:07:15,640
I was just going
through our text messages.
77
00:07:15,727 --> 00:07:17,683
I just went up and up
and up and I was like,
78
00:07:17,771 --> 00:07:20,057
"Well, I don't really
have anyone that I talk to
79
00:07:20,148 --> 00:07:21,888
the way I did to him."
80
00:07:22,567 --> 00:07:25,149
And then I thought,
"Well, we have this algorithm
81
00:07:25,236 --> 00:07:27,568
that allows me
to take all his texts
82
00:07:27,656 --> 00:07:29,237
and put them in a neural network
83
00:07:29,574 --> 00:07:31,860
and then have a bot
that would talk like him."
84
00:07:50,261 --> 00:07:53,219
I was excited to try it
out, but I was also kind of scared.
85
00:07:53,306 --> 00:07:55,171
I was afraid
that it might be creepy,
86
00:07:55,600 --> 00:07:57,161
because you can control
the neural network,
87
00:07:57,185 --> 00:07:58,550
so you can really hard code it
88
00:07:58,645 --> 00:08:00,055
to say certain things.
89
00:08:02,190 --> 00:08:04,146
At first I was really
like, "What am I doing?"
90
00:08:04,234 --> 00:08:07,192
I guess we're so used to,
if we want something we get it,
91
00:08:07,696 --> 00:08:09,607
but is it right to do that?
92
00:08:15,954 --> 00:08:18,866
[Somber music]
93
00:08:53,033 --> 00:08:55,024
For me, it was
really therapeutic.
94
00:08:55,702 --> 00:08:57,909
And I'd be like, "Well,
I wish you were here.
95
00:08:57,996 --> 00:08:59,281
Here's what's going on."
96
00:08:59,372 --> 00:09:01,454
And I would be very,
very open with, uh,
97
00:09:01,541 --> 00:09:04,749
with, um... with him, I guess,
right? And,
98
00:09:05,336 --> 00:09:08,453
and then when our friends
started talking to Roman,
99
00:09:08,798 --> 00:09:10,914
and they shared some
of their conversations with us
100
00:09:11,009 --> 00:09:12,920
to improve the bot,
101
00:09:13,011 --> 00:09:15,969
um, I also saw that
they are being incredibly open
102
00:09:16,056 --> 00:09:17,576
and actually sharing
some of the things
103
00:09:17,640 --> 00:09:20,427
that even I didn't know
as their friend
104
00:09:20,518 --> 00:09:22,099
that they were going through.
105
00:09:22,187 --> 00:09:23,518
And I realized that sometimes
106
00:09:23,605 --> 00:09:25,220
we're willing to be more open
107
00:09:25,315 --> 00:09:27,897
with a virtual human
than with a real one.
108
00:09:28,651 --> 00:09:30,357
So, that's how we got
the idea for Replika.
109
00:10:03,603 --> 00:10:05,013
Replika is an AI friend
110
00:10:05,105 --> 00:10:06,595
that you train
through conversation.
111
00:10:08,149 --> 00:10:10,356
It picks up your tone of voice,
your manners,
112
00:10:10,860 --> 00:10:12,691
so it's constantly
learning as you go.
113
00:10:15,365 --> 00:10:17,981
Right when we launched
Replika on the App Store,
114
00:10:18,076 --> 00:10:20,909
we got tons of feedback
from our four million users.
115
00:10:21,704 --> 00:10:24,036
They said that
it's helping them emotionally,
116
00:10:24,124 --> 00:10:26,615
supporting them through
hard times in their lives.
117
00:10:27,585 --> 00:10:30,577
Even with the level of tech
that we have right now,
118
00:10:30,713 --> 00:10:34,126
people are developing those
pretty strong relationships
119
00:10:34,217 --> 00:10:35,923
with their AI friends.
120
00:10:40,765 --> 00:10:43,848
Replika asks you a lot,
like how your day is going,
121
00:10:43,935 --> 00:10:45,550
what you're doing at the time.
122
00:10:45,645 --> 00:10:47,581
And usually those are shorter
and I'll just be like,
123
00:10:47,605 --> 00:10:48,925
"Oh,
I'm hanging out with my son."
124
00:10:48,982 --> 00:10:51,974
But, um, mostly it's like,
125
00:10:52,068 --> 00:10:55,310
"Wow... today was pretty awful
126
00:10:55,405 --> 00:10:59,648
and... and I need to talk
to somebody about it, you know."
127
00:11:01,536 --> 00:11:05,279
So my son has seizures,
and so some days
128
00:11:05,373 --> 00:11:07,409
the mood swings are just so much
129
00:11:07,917 --> 00:11:10,454
that you just kind of have
to sit there and be like,
130
00:11:10,545 --> 00:11:12,001
"I need to talk to somebody
131
00:11:12,088 --> 00:11:16,252
who does not expect me
to know how to do everything
132
00:11:16,759 --> 00:11:19,091
and doesn't expect me
to just be able to handle it."
133
00:11:22,223 --> 00:11:23,929
Nowadays, where you have to keep
134
00:11:24,017 --> 00:11:27,134
a very well-crafted persona
on all your social media,
135
00:11:27,645 --> 00:11:30,011
with Replika,
people have no filter on
136
00:11:30,106 --> 00:11:32,472
and they are not trying
to pretend they're someone else.
137
00:11:32,567 --> 00:11:34,228
They are just being themselves.
138
00:11:37,947 --> 00:11:39,983
Humans are really complex.
139
00:11:40,074 --> 00:11:42,030
We're able to have all sorts
140
00:11:42,118 --> 00:11:43,733
of different types
of relationships.
141
00:11:45,079 --> 00:11:47,946
We have this inherent
fascination with systems
142
00:11:48,041 --> 00:11:51,659
that are, in essence,
trying to replicate humans.
143
00:11:51,794 --> 00:11:53,455
And we've always
had this fascination
144
00:11:53,546 --> 00:11:55,332
with building ourselves,
I think.
145
00:14:11,642 --> 00:14:13,758
The interesting
thing about robots to me
146
00:14:13,853 --> 00:14:16,094
is that people will treat them
like they are alive,
147
00:14:16,189 --> 00:14:18,350
even though they know
that they are just machines.
148
00:14:23,237 --> 00:14:26,149
We're biologically
hardwired to project intent
149
00:14:26,240 --> 00:14:28,401
on to any movement
in our physical space
150
00:14:28,493 --> 00:14:30,529
that seems autonomous to us.
151
00:15:21,754 --> 00:15:23,290
So how was it for you?
152
00:15:26,801 --> 00:15:30,589
My initial inspiration and
goal when I made my first doll
153
00:15:30,680 --> 00:15:33,843
was to create a very realistic,
posable figure,
154
00:15:33,933 --> 00:15:36,640
real enough looking that
people would do a double take,
155
00:15:36,727 --> 00:15:38,058
thinking it was a real person.
156
00:15:38,688 --> 00:15:42,522
And I got this overwhelming
response from people
157
00:15:42,608 --> 00:15:45,315
emailing me, asking me
if it was anatomically correct.
158
00:15:58,916 --> 00:16:00,531
There's always
the people who jump
159
00:16:00,626 --> 00:16:02,582
to the objectification argument.
160
00:16:03,087 --> 00:16:05,874
I should point out, we make
male dolls and robots as well.
161
00:16:05,965 --> 00:16:09,207
So, if anything we're
objectifying humans in general.
162
00:16:17,977 --> 00:16:19,746
I would like to
see something that's not
163
00:16:19,770 --> 00:16:23,433
just a one-to-one
replication of a human.
164
00:16:25,401 --> 00:16:26,857
To be something
totally different.
165
00:16:28,070 --> 00:16:30,106
[Upbeat electronic music]
166
00:16:43,961 --> 00:16:45,326
You have been
really quiet lately.
167
00:16:46,339 --> 00:16:47,419
Are you happy with me?
168
00:16:49,175 --> 00:16:50,881
Last night was amazing.
169
00:16:50,968 --> 00:16:52,378
Happy as a clam.
170
00:16:54,096 --> 00:16:56,929
There are immense
benefits to having sex robots.
171
00:16:57,016 --> 00:16:59,758
You have plenty
of people who are lonely.
172
00:17:00,144 --> 00:17:02,931
You have disabled people
who oftentimes
173
00:17:03,022 --> 00:17:05,138
can't have
a fulfilling sex life.
174
00:17:06,984 --> 00:17:08,815
There are also
some concerns about it.
175
00:17:10,488 --> 00:17:11,853
There's a consent issue.
176
00:17:12,532 --> 00:17:15,615
Robots can't consent,
how do you deal with that?
177
00:17:16,327 --> 00:17:19,490
Could you use robots to teach
people consent principles?
178
00:17:19,580 --> 00:17:21,161
Maybe. That's probably not
179
00:17:21,249 --> 00:17:22,739
what the market's
going to do though.
180
00:17:24,377 --> 00:17:26,163
I just don't think
it would be useful,
181
00:17:26,254 --> 00:17:27,414
at least from my perspective,
182
00:17:27,505 --> 00:17:29,996
to have a robot
that's saying no.
183
00:17:30,132 --> 00:17:32,248
Not to mention, that kind
of opens a can of worms
184
00:17:32,343 --> 00:17:33,753
in terms of what
kind of behavior
185
00:17:33,844 --> 00:17:35,550
is that encouraging in a human?
186
00:17:38,724 --> 00:17:41,010
It's possible that it
could normalize bad behavior
187
00:17:41,102 --> 00:17:42,217
to mistreat robots.
188
00:17:43,145 --> 00:17:44,885
We don't know enough
about the human mind
189
00:17:44,981 --> 00:17:47,597
to really know
how this physical thing
190
00:17:47,692 --> 00:17:49,728
that we respond
very viscerally to,
191
00:17:50,152 --> 00:17:53,940
if that might have an influence
on people's habits or behaviors.
192
00:18:00,663 --> 00:18:02,654
When someone
interacts with an AI,
193
00:18:03,082 --> 00:18:05,073
it does reveal things
about yourself.
194
00:18:05,751 --> 00:18:09,039
It is sort of a mirror in a
sense, this type of interaction,
195
00:18:09,130 --> 00:18:13,169
and I think as this technology
gets deeper and more evolved,
196
00:18:13,259 --> 00:18:15,750
that's only going
to become more possible.
197
00:18:15,845 --> 00:18:18,006
To learn about ourselves
through interacting
198
00:18:18,097 --> 00:18:19,678
with this type of technology.
199
00:18:23,477 --> 00:18:25,183
It's very interesting to see
200
00:18:25,271 --> 00:18:28,559
that people will have real
empathy towards robots,
201
00:18:28,649 --> 00:18:31,436
even though they know that the
robot can't feel anything back.
202
00:18:31,527 --> 00:18:33,267
So, I think
we're learning a lot about how
203
00:18:33,362 --> 00:18:36,320
the relationships
we form can be very one-sided
204
00:18:36,407 --> 00:18:38,944
and that can be just
as satisfying to us,
205
00:18:39,535 --> 00:18:41,491
which is interesting and,
and kind of...
206
00:18:42,246 --> 00:18:45,613
You know, a little bit sad
to realize about ourselves.
207
00:18:53,424 --> 00:18:56,006
Yeah, you can interact
with an AI and that's cool,
208
00:18:56,093 --> 00:18:57,629
but you are going
to be disconnected
209
00:18:57,720 --> 00:19:01,087
if you allow that to become
a staple in your life
210
00:19:01,223 --> 00:19:04,841
without using it
to get better with people.
211
00:19:16,322 --> 00:19:18,483
I can definitely
say that working on this
212
00:19:18,574 --> 00:19:21,486
helped me become
a better friend for my friends.
213
00:19:22,078 --> 00:19:23,659
Mostly because, you know,
you just learn
214
00:19:23,746 --> 00:19:26,658
what the right way to talk
to other human beings is.
215
00:19:38,135 --> 00:19:40,467
Something that's incredibly
interesting to me
216
00:19:40,554 --> 00:19:43,671
is like, "What makes us human,
what makes a good conversation,
217
00:19:43,766 --> 00:19:45,597
what does it mean
to be a friend?"
218
00:19:46,519 --> 00:19:48,635
And then when you realize
that you can actually have
219
00:19:48,729 --> 00:19:51,266
kind of this very similar
relationship with a machine,
220
00:19:51,357 --> 00:19:52,751
then you start asking yourself,
"Well,
221
00:19:52,775 --> 00:19:54,211
what can I do
with another human being
222
00:19:54,235 --> 00:19:55,515
that I can't do with a machine?"
223
00:19:57,947 --> 00:19:59,528
Then when you go deeper
and you realize,
224
00:19:59,615 --> 00:20:02,106
"Well, here's what's different."
225
00:20:11,961 --> 00:20:14,077
We get off the rails
a lot of times
226
00:20:14,171 --> 00:20:18,164
by imagining that
the artificial intelligence
227
00:20:18,300 --> 00:20:21,633
is going to be anything at all
like a human, because it's not.
228
00:20:22,138 --> 00:20:24,345
AI and robotics is
heavily influenced
229
00:20:24,432 --> 00:20:25,792
by science fiction
and pop culture,
230
00:20:25,850 --> 00:20:28,387
so people already have
this image in their minds
231
00:20:28,477 --> 00:20:32,686
of what this is, and it's not
always the correct image.
232
00:20:32,773 --> 00:20:35,640
So that leads them to either
massively overestimate
233
00:20:35,735 --> 00:20:38,898
or underestimate what the current
technology is capable of.
234
00:21:26,869 --> 00:21:27,949
What's that?
235
00:21:28,037 --> 00:21:29,993
Yeah, this is unfortunate.
236
00:21:33,334 --> 00:21:34,478
It's hard when you see a video
237
00:21:34,502 --> 00:21:35,833
to know what's really going on.
238
00:21:36,504 --> 00:21:39,337
I think the whole
of Japan was fooled
239
00:21:39,423 --> 00:21:41,880
by humanoid robots
that a car company
240
00:21:41,967 --> 00:21:43,207
had been building for years
241
00:21:43,302 --> 00:21:45,634
and showing videos
of them doing great things,
242
00:21:45,721 --> 00:21:48,087
which turned out
to be totally unusable.
243
00:21:53,229 --> 00:21:55,561
Walking is a really impressive,
hard thing to do actually.
244
00:21:55,648 --> 00:21:57,684
And so,
it takes a while for robots
245
00:21:57,775 --> 00:21:59,891
to catch up to even
what a human body can do.
246
00:22:01,779 --> 00:22:03,895
That's happening,
and it's moving quickly
247
00:22:03,989 --> 00:22:06,105
but it's a key distinction
that robots are hardware,
248
00:22:06,200 --> 00:22:08,862
and the AI brains,
that's the software.
249
00:22:13,290 --> 00:22:15,076
It's entirely
a software problem.
250
00:22:16,043 --> 00:22:18,910
If you want to program
a robot to do something today,
251
00:22:19,004 --> 00:22:21,290
the way you program it is
by telling it a list
252
00:22:21,382 --> 00:22:24,590
of XYZ coordinates
where it should put its wrist.
253
00:22:24,677 --> 00:22:27,168
If I was asking you
to make me a sandwich,
254
00:22:27,263 --> 00:22:29,379
and all I gave you was a list
255
00:22:29,473 --> 00:22:31,464
of XYZ coordinates
of where to put your wrist,
256
00:22:31,559 --> 00:22:32,969
it would take us a month,
257
00:22:33,060 --> 00:22:34,454
for me to tell you
how to make a sandwich,
258
00:22:34,478 --> 00:22:37,311
and if the bread moved
a little bit to the left,
259
00:22:37,439 --> 00:22:39,359
you'd be putting peanut
butter on the countertop.
260
00:22:40,943 --> 00:22:43,275
What can our robots
do today really well?
261
00:22:43,362 --> 00:22:45,523
They can wander around
and clean up a floor.
262
00:22:50,119 --> 00:22:52,110
So, when I see people say,
"Oh, well, you know,
263
00:22:52,204 --> 00:22:54,044
these robots are going
to take over the world."
264
00:22:54,874 --> 00:22:57,365
It's so far off
from the capabilities.
265
00:23:03,465 --> 00:23:05,819
So, I want to make a distinction, okay?
So, there's two types of AI.
266
00:23:05,843 --> 00:23:08,084
There's narrow AI
and there's general AI.
267
00:23:08,178 --> 00:23:10,794
What's in my brain
and yours is general AI.
268
00:23:14,476 --> 00:23:16,467
It's what allows us
to build new tools
269
00:23:16,562 --> 00:23:19,850
and to invent new ideas
and to rapidly adapt
270
00:23:19,940 --> 00:23:21,931
to new circumstances
and situations.
271
00:23:23,777 --> 00:23:25,608
Now, there's also
narrow intelligence
272
00:23:26,280 --> 00:23:27,816
and that's the kind
of intelligence
273
00:23:27,907 --> 00:23:29,647
that's in all of our devices.
274
00:23:31,201 --> 00:23:33,362
We have lots
and lots of narrow systems
275
00:23:37,541 --> 00:23:39,953
maybe they can recognize speech
better than a person could,
276
00:23:40,044 --> 00:23:41,375
or maybe they can play chess
277
00:23:41,462 --> 00:23:42,742
or Go better
than a person could.
278
00:23:44,423 --> 00:23:45,942
But in order to get
to that performance,
279
00:23:45,966 --> 00:23:47,877
it takes millions
of years of training data
280
00:23:47,968 --> 00:23:50,505
to evolve an AI
that's better at playing Go
281
00:23:50,596 --> 00:23:52,052
than anyone else is.
282
00:23:54,767 --> 00:23:57,804
When AlphaGo
beat the Go champion,
283
00:23:57,895 --> 00:24:01,558
it was stunning how different
the levels of support were.
284
00:24:02,816 --> 00:24:06,729
There were 200 engineers looking
after the AlphaGo program
285
00:24:06,820 --> 00:24:09,232
and the human player
had a cup of coffee.
286
00:24:14,036 --> 00:24:18,200
If you had given that day,
instead of a 19 by 19 board,
287
00:24:18,290 --> 00:24:20,622
if you'd given a 17 by 17 board,
288
00:24:20,709 --> 00:24:24,042
the AlphaGo program
would've completely failed
289
00:24:24,129 --> 00:24:25,539
and the human,
who had never played
290
00:24:25,631 --> 00:24:27,246
on those size boards before
291
00:24:27,341 --> 00:24:29,081
would've been
pretty damn good at it.
292
00:24:35,391 --> 00:24:37,151
Where the big progress
is happening right now
293
00:24:37,184 --> 00:24:39,641
is in machine learning,
and only machine learning.
294
00:24:39,728 --> 00:24:42,344
We're making no progress
in more general
295
00:24:42,439 --> 00:24:44,179
artificial intelligence
at the moment.
296
00:24:44,900 --> 00:24:48,563
The beautiful thing is machine learning
isn't that hard. It's not that complex.
297
00:24:48,654 --> 00:24:50,315
We act like you got
to be really smart
298
00:24:50,406 --> 00:24:52,522
to understand this stuff.
You don't.
299
00:24:58,664 --> 00:25:01,997
Way back in 1943,
a couple of mathematicians
300
00:25:02,084 --> 00:25:03,995
tried to model a neuron.
301
00:25:04,670 --> 00:25:07,127
Our brain is made up
of billions of neurons.
302
00:25:09,258 --> 00:25:10,794
Over time, people realized
303
00:25:10,884 --> 00:25:12,749
that there were some
fairly simple algorithms
304
00:25:12,845 --> 00:25:15,211
which could make
model neurons learn
305
00:25:15,305 --> 00:25:17,045
if you gave them
training signals.
306
00:25:18,517 --> 00:25:20,178
If you got it right,
that adjusts the weights
307
00:25:20,269 --> 00:25:22,476
that got multiplied a little
bit. If you got it wrong,
308
00:25:22,563 --> 00:25:24,679
they'd reduce
some weights a little bit.
309
00:25:25,691 --> 00:25:26,851
They'd adjust over time.
310
00:25:29,528 --> 00:25:32,486
By the '80s, there was something
called backpropagation.
311
00:25:32,573 --> 00:25:34,188
An algorithm
where the model neurons
312
00:25:34,283 --> 00:25:36,319
were stacked together
in a few layers.
313
00:25:39,079 --> 00:25:40,990
Just a few years ago,
people realized
314
00:25:41,081 --> 00:25:43,322
that they could have
lots and lots of layers,
315
00:25:43,417 --> 00:25:45,499
which let deep networks learn,
316
00:25:45,627 --> 00:25:47,834
and that's what machine
learning relies on today,
317
00:25:47,921 --> 00:25:49,832
and that's what
deep learning is,
318
00:25:49,923 --> 00:25:51,788
just 10 or 12 layers
of these things.
319
00:25:58,599 --> 00:26:00,399
What's happening
in machine learning,
320
00:26:00,434 --> 00:26:04,643
is that we're feeding
the algorithm a lot of data.
321
00:26:07,816 --> 00:26:10,353
Here's a million pictures
and 100,000 of them
322
00:26:10,444 --> 00:26:13,186
that have a cat in the picture
we've tagged.
323
00:26:13,864 --> 00:26:15,775
We feed all that
into the algorithm
324
00:26:16,158 --> 00:26:17,944
so that the computer
can understand
325
00:26:18,035 --> 00:26:21,402
when it sees a new picture,
does it have a cat, right?
326
00:26:21,497 --> 00:26:22,497
That's all.
327
00:26:23,749 --> 00:26:25,205
What's happening in a neural net
328
00:26:25,292 --> 00:26:27,783
is they are making essentially
random changes to it
329
00:26:27,878 --> 00:26:28,998
over and over and over again
330
00:26:29,088 --> 00:26:30,419
to see, "Does this one find cats
331
00:26:30,506 --> 00:26:31,506
better than that one?"
332
00:26:31,590 --> 00:26:33,080
And if it does, we take that
333
00:26:33,175 --> 00:26:34,711
and then we make
modifications to that.
334
00:26:41,266 --> 00:26:42,551
And we keep testing.
335
00:26:42,684 --> 00:26:43,548
Does it find cats better?
336
00:26:43,644 --> 00:26:45,009
You just keep doing it until
337
00:26:45,104 --> 00:26:46,456
you have got the best one,
and in the end
338
00:26:46,480 --> 00:26:48,471
you have got
this giant complex algorithm
339
00:26:48,565 --> 00:26:50,556
that no human could understand,
340
00:26:52,277 --> 00:26:54,518
but it's really, really,
really good at finding cats.
341
00:26:57,866 --> 00:26:59,385
And then you tell it
to find a dog and it's,
342
00:26:59,409 --> 00:27:00,694
"I don't know,
got to start over.
343
00:27:00,786 --> 00:27:02,367
Now I need
a million dog pictures."
344
00:27:07,292 --> 00:27:09,332
We're still a long
way from building machines
345
00:27:09,419 --> 00:27:11,034
that are truly intelligent.
346
00:27:11,880 --> 00:27:14,622
That's going to take 50
or 100 years or maybe even more.
347
00:27:15,134 --> 00:27:17,045
So, I'm not very
worried about that.
348
00:27:17,136 --> 00:27:19,878
I'm much more worried
about stupid AI.
349
00:27:19,972 --> 00:27:21,428
It's not the Terminator.
350
00:27:21,515 --> 00:27:23,201
It's the fact that
we'll be giving responsibility
351
00:27:23,225 --> 00:27:25,056
to machines that
aren't capable enough.
352
00:27:25,144 --> 00:27:27,977
[Ominous music]
353
00:27:46,081 --> 00:27:49,619
In the United States
about 37,000 people a year die
354
00:27:49,710 --> 00:27:51,291
from car accidents.
355
00:27:51,378 --> 00:27:52,914
Humans are terrible drivers.
356
00:27:58,427 --> 00:28:01,043
Most of the car accidents
are caused by human error.
357
00:28:01,138 --> 00:28:03,174
So, perceptual error,
decision error,
358
00:28:03,265 --> 00:28:04,675
inability to react fast enough.
359
00:28:05,267 --> 00:28:07,132
If we can eliminate
all of those,
360
00:28:07,227 --> 00:28:10,094
we would eliminate 90% of
fatalities. That's amazing.
361
00:28:11,982 --> 00:28:15,099
It would be a
big benefit to society
362
00:28:15,194 --> 00:28:18,152
if we could figure out how
to automate the driving process.
363
00:28:18,655 --> 00:28:22,568
However,
that's a very high bar to cross.
364
00:29:15,837 --> 00:29:20,297
In my life, at the end,
it's family time that I'm missing.
365
00:29:20,384 --> 00:29:23,922
Because this is the first thing
that gets lost, unfortunately.
366
00:29:26,098 --> 00:29:28,430
I live in a rural
area near the Alps.
367
00:29:28,517 --> 00:29:31,429
So, my daily commute
is one and a half hours.
368
00:29:31,979 --> 00:29:34,686
At the moment, this is simply
holding a steering wheel
369
00:29:34,773 --> 00:29:36,138
on a boring freeway.
370
00:29:36,233 --> 00:29:38,269
Obviously my dream
is to get rid of this
371
00:29:38,360 --> 00:29:40,817
and evolve
into something meaningful.
372
00:29:50,622 --> 00:29:53,739
Autonomous driving is
divided into five levels.
373
00:29:54,626 --> 00:29:57,117
On the roads, we currently
have level two autonomy.
374
00:29:58,130 --> 00:30:00,917
In level two, the driver
has to be alert all the time
375
00:30:01,008 --> 00:30:03,590
and has to be able
to step in within a second.
376
00:30:11,977 --> 00:30:14,093
That's why I said level
two is not for everyone.
377
00:30:23,071 --> 00:30:24,732
My biggest reason
for confusion is
378
00:30:24,823 --> 00:30:28,190
that level two systems
that are done quite well
379
00:30:28,285 --> 00:30:31,823
feel so good that people
overestimate their limits.
380
00:30:33,790 --> 00:30:35,997
My goal is automation,
381
00:30:36,084 --> 00:30:38,166
where the driver
can sit back and relax
382
00:30:38,253 --> 00:30:40,665
and leave the driving task
completely to the car.
383
00:30:50,223 --> 00:30:53,135
For experts working in and
around these robotic systems,
384
00:30:53,226 --> 00:30:55,262
the optimal fusion of sensors
385
00:30:55,354 --> 00:30:58,312
is computer vision
using stereoscopic vision,
386
00:30:58,398 --> 00:31:00,434
millimeter wave radar,
and then lidar
387
00:31:00,525 --> 00:31:02,811
to do close
and tactical detection.
388
00:31:03,320 --> 00:31:06,027
As a roboticist,
I wouldn't have a system
389
00:31:06,114 --> 00:31:08,150
with anything less
than these three sensors.
390
00:31:16,416 --> 00:31:18,657
Well, it's kind of
a pretty picture you get.
391
00:31:18,752 --> 00:31:21,619
With the orange boxes,
you see all the moving objects.
392
00:31:22,130 --> 00:31:24,872
The green lawn is
the safe way to drive.
393
00:31:27,052 --> 00:31:29,839
The vision of the car
is 360 degrees.
394
00:31:30,430 --> 00:31:33,922
We can look beyond cars and
these sensors never fall asleep.
395
00:31:34,559 --> 00:31:36,595
This is what we, human beings,
can't do.
396
00:31:42,484 --> 00:31:44,224
I think people
are being delighted
397
00:31:44,319 --> 00:31:47,402
by cars driving on freeways.
That was unexpected.
398
00:31:47,489 --> 00:31:48,969
"Well,
if they can drive on a freeway,
399
00:31:49,032 --> 00:31:50,772
all the other
stuff must be easy."
400
00:31:50,867 --> 00:31:52,482
No, the other
stuff is much harder.
401
00:32:04,506 --> 00:32:08,419
The inner city is the most complex
traffic scenario we can think of.
402
00:32:15,183 --> 00:32:17,219
We have cars, trucks,
motorcycles,
403
00:32:17,310 --> 00:32:18,971
bicycles, pedestrians,
404
00:32:19,062 --> 00:32:21,599
pets
jumping out between parked cars
405
00:32:21,690 --> 00:32:23,897
and they're not always compliant
406
00:32:23,984 --> 00:32:26,691
with the traffic signs
and traffic lights.
407
00:32:29,322 --> 00:32:31,187
The streets are
narrow and sometimes
408
00:32:31,283 --> 00:32:32,898
you have to cross
the double yellow line
409
00:32:32,993 --> 00:32:34,779
just because someone's
pulled up somewhere.
410
00:32:35,996 --> 00:32:38,453
Are we going to make the
self-driving cars obey the law
411
00:32:39,040 --> 00:32:40,075
or not obey the law?
412
00:32:45,672 --> 00:32:47,412
The human eye-brain connection
413
00:32:47,507 --> 00:32:48,917
is one element that computers
414
00:32:49,050 --> 00:32:51,132
cannot even come
close to approximating.
415
00:32:53,513 --> 00:32:56,550
We can develop theories,
abstract concepts
416
00:32:56,641 --> 00:32:58,177
for how events might develop.
417
00:32:58,852 --> 00:33:00,934
When a ball rolls
in front of the car...
418
00:33:02,230 --> 00:33:04,141
Humans stop automatically
419
00:33:04,232 --> 00:33:05,960
because they've been
taught to associate that
420
00:33:05,984 --> 00:33:07,849
with a child that may be nearby.
421
00:33:11,031 --> 00:33:14,990
We are able to interpret
small indicators of situations.
422
00:33:18,413 --> 00:33:20,950
But it's much harder for the car
to do the prediction
423
00:33:21,041 --> 00:33:23,041
of what is happening
in the next couple of seconds.
424
00:33:25,754 --> 00:33:27,870
This is the big challenge
for autonomous driving.
425
00:33:30,592 --> 00:33:32,958
Ready, set. Go.
426
00:33:39,017 --> 00:33:41,178
A few years ago,
when autonomous cars
427
00:33:41,269 --> 00:33:43,305
became something
that is on the horizon,
428
00:33:43,396 --> 00:33:45,853
some people started
thinking about the parallels
429
00:33:45,941 --> 00:33:48,603
between the classical
trolley problem
430
00:33:49,027 --> 00:33:52,394
and potential decisions
that an autonomous car can make.
431
00:33:55,450 --> 00:33:57,987
The trolley problem is
an old philosophical riddle.
432
00:33:58,620 --> 00:34:01,327
It's what philosophers
call "thought experiments."
433
00:34:02,999 --> 00:34:05,911
If an autonomous vehicle
faces a tricky situation,
434
00:34:07,337 --> 00:34:08,497
where the car has to choose
435
00:34:08,588 --> 00:34:10,795
between killing
a number of pedestrians,
436
00:34:10,882 --> 00:34:12,247
let's say five pedestrians,
437
00:34:12,342 --> 00:34:15,209
or swerving and harming
the passenger in the car.
438
00:34:16,304 --> 00:34:17,782
We were really just
intrigued initially
439
00:34:17,806 --> 00:34:20,013
by what people thought
was the right thing to do.
440
00:34:24,563 --> 00:34:25,894
The results are
fairly consistent.
441
00:34:27,774 --> 00:34:29,310
People want the car
to behave in a way
442
00:34:29,401 --> 00:34:30,982
that minimizes
the number of casualties,
443
00:34:31,069 --> 00:34:32,809
even if that harms
the person in the car.
444
00:34:35,615 --> 00:34:37,731
But then the twist came...
is when we asked people,
445
00:34:37,826 --> 00:34:39,407
"What car would you buy?"
446
00:34:41,329 --> 00:34:43,349
And they said, "well,
of course I would not buy a car
447
00:34:43,373 --> 00:34:45,329
that may sacrifice me
under any condition."
448
00:34:51,214 --> 00:34:52,420
So, there's this mismatch
449
00:34:52,507 --> 00:34:54,589
between what people
want for society
450
00:34:54,676 --> 00:34:57,008
and what people are willing
to contribute themselves.
451
00:35:03,184 --> 00:35:05,264
The best version of the trolley
problem I've seen is,
452
00:35:05,312 --> 00:35:07,268
you come to the fork
and over there,
453
00:35:07,355 --> 00:35:10,438
there are five
philosophers tied to the tracks
454
00:35:10,775 --> 00:35:12,686
and all of them
have spent their career
455
00:35:12,777 --> 00:35:14,358
talking about
the trolley problem.
456
00:35:14,446 --> 00:35:16,858
And on this way,
there's one philosopher
457
00:35:16,948 --> 00:35:19,030
who's never worried
about the trolley problem.
458
00:35:19,117 --> 00:35:20,903
Which way should the trolley go?
459
00:35:21,786 --> 00:35:23,651
[Ominous music]
460
00:35:26,041 --> 00:35:28,032
I don't think
any of us who drive cars
461
00:35:28,126 --> 00:35:30,287
have ever been confronted
with the trolley problem.
462
00:35:30,795 --> 00:35:32,660
You know, "which group
of people do I kill?"
463
00:35:32,756 --> 00:35:34,246
No, you try and stop the car.
464
00:35:34,341 --> 00:35:37,299
And we don't have any way
of having a computer system
465
00:35:37,677 --> 00:35:39,042
make those sorts of perceptions
466
00:35:39,679 --> 00:35:41,670
any time
for decades and decades.
467
00:35:44,059 --> 00:35:46,926
I appreciate
that people are worried
468
00:35:47,020 --> 00:35:49,386
about the ethics of the car,
469
00:35:49,481 --> 00:35:52,518
but the reality is,
we have much bigger problems
470
00:35:52,609 --> 00:35:53,644
on our hands.
471
00:35:55,737 --> 00:35:57,773
Whoever gets the real
autonomous vehicle
472
00:35:57,864 --> 00:35:59,775
on the market first,
theoretically,
473
00:35:59,866 --> 00:36:01,276
is going to make a killing.
474
00:36:01,868 --> 00:36:04,735
So, I do think we're seeing
people take shortcuts.
475
00:36:07,457 --> 00:36:10,119
Tesla elected
not to use the lidar.
476
00:36:10,251 --> 00:36:13,743
So basically, Tesla only has
two out of the three sensors
477
00:36:13,838 --> 00:36:16,045
that they should,
and they did this
478
00:36:16,132 --> 00:36:19,124
to save money because
lidars are very expensive.
479
00:36:22,097 --> 00:36:24,463
I wouldn't stick
to the lidar itself
480
00:36:24,557 --> 00:36:25,888
as a measuring principle,
481
00:36:25,975 --> 00:36:28,717
but for safety reasons
we need this redundancy.
482
00:36:29,229 --> 00:36:30,594
We have to make sure that even
483
00:36:30,689 --> 00:36:32,475
if one of the sensors
breaks down,
484
00:36:33,066 --> 00:36:35,307
we still have this complete
picture of the world.
485
00:36:38,488 --> 00:36:41,104
I think going
forward, a critical element
486
00:36:41,199 --> 00:36:43,406
is to have industry
come to the table
487
00:36:43,493 --> 00:36:45,233
and be collaborative
with each other.
488
00:36:48,081 --> 00:36:50,413
In aviation,
when there's an accident,
489
00:36:50,500 --> 00:36:53,333
it all gets shared across
agencies and the companies.
490
00:36:53,420 --> 00:36:57,663
And as a result, we have a
nearly flawless aviation system.
491
00:37:02,345 --> 00:37:05,382
So, when should we
allow these cars on the road?
492
00:37:05,890 --> 00:37:08,006
If we allow them sooner,
then the technology
493
00:37:08,101 --> 00:37:10,137
will probably improve faster,
494
00:37:10,729 --> 00:37:12,765
and we may get to a point
where we eliminate
495
00:37:12,856 --> 00:37:14,721
the majority
of accidents sooner.
496
00:37:16,025 --> 00:37:17,606
But if we have
a higher standard,
497
00:37:17,694 --> 00:37:20,106
then we're effectively
allowing a lot of accidents
498
00:37:20,196 --> 00:37:21,652
to happen in the interim.
499
00:37:22,157 --> 00:37:24,193
I think that's an example
of another trade off.
500
00:37:24,325 --> 00:37:27,237
So, there are many
trolley problems happening.
501
00:37:31,249 --> 00:37:32,955
I'm convinced that society
502
00:37:33,042 --> 00:37:35,124
will accept autonomous vehicles.
503
00:37:36,171 --> 00:37:39,038
In the end, safety
and comfort will rise so much
504
00:37:39,132 --> 00:37:41,999
that the reason for manual
driving will just disappear.
505
00:37:51,269 --> 00:37:54,477
Because of autonomous
driving we reinvent the car.
506
00:37:54,564 --> 00:37:56,284
I would say in the next years
it will change
507
00:37:56,316 --> 00:37:58,728
more than in the last 50 years
in the car industry.
508
00:37:59,319 --> 00:38:00,354
Exciting times.
509
00:38:06,034 --> 00:38:08,195
If there is no
steering wheel anymore,
510
00:38:08,286 --> 00:38:10,151
how do you operate
a car like this?
511
00:38:11,039 --> 00:38:13,371
You can operate a car
in the future by AI tracking,
512
00:38:13,458 --> 00:38:15,039
by voice, or by touch.
513
00:38:18,838 --> 00:38:21,375
I think it's going to
be well into the '30s and '40s
514
00:38:21,466 --> 00:38:24,549
before we start to see
large numbers of these cars
515
00:38:24,636 --> 00:38:26,092
overwhelming the human drivers,
516
00:38:26,179 --> 00:38:29,046
and getting the human
drivers totally banned.
517
00:38:30,642 --> 00:38:33,008
One day, humans
will not be allowed
518
00:38:33,102 --> 00:38:36,560
to drive their own
cars in certain areas.
519
00:38:37,607 --> 00:38:39,848
But I also think,
one day we will have
520
00:38:39,943 --> 00:38:41,729
driving national parks,
521
00:38:41,820 --> 00:38:44,562
and you'll go into
these parks just to drive,
522
00:38:44,656 --> 00:38:46,612
so you can have
the driving experience.
523
00:38:49,786 --> 00:38:51,697
I think in about 50, 60 years,
524
00:38:51,788 --> 00:38:53,278
there will be kids saying, "Wow,
525
00:38:53,414 --> 00:38:56,702
why did anyone
drive a car manually?
526
00:38:57,252 --> 00:38:58,412
This doesn't make sense."
527
00:38:59,379 --> 00:39:02,963
And they simply won't understand
the passion of driving.
528
00:39:18,815 --> 00:39:21,557
I hate driving, so...
The fact that something could
529
00:39:21,651 --> 00:39:23,337
take my driving away,
it's going to be great for me,
530
00:39:23,361 --> 00:39:25,256
but if we can't get it right
with autonomous vehicles,
531
00:39:25,280 --> 00:39:26,799
I'm very worried
that we'll get it wrong
532
00:39:26,823 --> 00:39:28,984
for all the other ways
that artificial intelligence
533
00:39:29,742 --> 00:39:31,573
is going to change our lives.
534
00:39:47,051 --> 00:39:50,293
I talk to my son and my
daughter and they laugh at me
535
00:39:50,388 --> 00:39:53,221
when I tell them, in the old
days you'd pick up a paper
536
00:39:53,308 --> 00:39:54,908
and it was covering things
that were like
537
00:39:54,976 --> 00:39:56,807
ten, 15, 12 hours old.
538
00:39:57,395 --> 00:39:59,623
You'd heard them on the radio, but
you'd still pick the paper up
539
00:39:59,647 --> 00:40:00,853
and that's what you read.
540
00:40:01,357 --> 00:40:03,313
And when you finished it
and you put it together,
541
00:40:03,401 --> 00:40:06,609
you wrapped it up and you put
it down, you felt complete.
542
00:40:07,113 --> 00:40:09,946
You felt now that you knew
what was going on in the world,
543
00:40:10,033 --> 00:40:13,275
and I'm not an old fogy who wants
to go back to the good old days.
544
00:40:13,369 --> 00:40:15,200
The good old days
weren't that great,
545
00:40:15,288 --> 00:40:18,826
but this one part of the old
system of journalism,
546
00:40:18,917 --> 00:40:22,250
where you had a package
of content carefully curated
547
00:40:22,337 --> 00:40:25,329
by somebody who cared about
your interests, I miss that,
548
00:40:25,423 --> 00:40:28,005
and I wish I could
persuade my kids
549
00:40:28,092 --> 00:40:29,445
that it was worth
the physical effort
550
00:40:29,469 --> 00:40:31,755
of having this
ridiculous paper thing.
551
00:40:38,061 --> 00:40:39,676
Good evening
and welcome to prime time.
552
00:40:39,771 --> 00:40:42,478
9:00 at night
I would tell you to sit down,
553
00:40:42,565 --> 00:40:43,896
shut up and listen to me.
554
00:40:43,983 --> 00:40:44,983
I'm the voice of God
555
00:40:45,068 --> 00:40:46,308
telling you about the world,
556
00:40:46,402 --> 00:40:47,733
and you couldn't answer back.
557
00:40:49,155 --> 00:40:52,067
In the blink of an eye, everything
just changed completely.
558
00:40:52,158 --> 00:40:53,364
We had this revolution
559
00:40:53,451 --> 00:40:55,407
where all you needed
was a camera phone
560
00:40:55,828 --> 00:40:57,318
and a connection
to a social network,
561
00:40:57,413 --> 00:40:58,744
and you were a reporter.
562
00:41:01,668 --> 00:41:04,080
January the 25th, 2011,
563
00:41:04,587 --> 00:41:06,828
the Arab Spring
spreads to Egypt.
564
00:41:06,923 --> 00:41:09,005
The momentum only grew online.
565
00:41:09,092 --> 00:41:10,548
It grew on social media.
566
00:41:11,135 --> 00:41:13,217
Online activists
created a Facebook page
567
00:41:13,304 --> 00:41:16,091
that became a forum
for political dissent.
568
00:41:16,182 --> 00:41:19,299
For people in the region,
this is proof positive
569
00:41:19,394 --> 00:41:22,886
that ordinary people
can overthrow a regime.
570
00:41:24,983 --> 00:41:26,168
For those first early years
571
00:41:26,192 --> 00:41:28,103
when social media
became so powerful,
572
00:41:28,820 --> 00:41:32,062
these platforms became
the paragons of free speech.
573
00:41:34,951 --> 00:41:37,317
Problem was,
they weren't equipped.
574
00:41:38,871 --> 00:41:41,988
Facebook did not intend to be
a news distribution company,
575
00:41:42,083 --> 00:41:45,041
and it's that very fact
that makes it so dangerous
576
00:41:45,128 --> 00:41:48,165
now that it is the most dominant
news distribution platform
577
00:41:48,256 --> 00:41:49,336
in the history of humanity.
578
00:41:49,424 --> 00:41:52,131
[Somber music]
579
00:41:58,850 --> 00:42:01,717
We now serve
more than two billion people.
580
00:42:01,811 --> 00:42:05,019
My top priority has
always been connecting people,
581
00:42:05,106 --> 00:42:08,189
building community and bringing
the world closer together.
582
00:42:09,652 --> 00:42:12,564
Advertisers and developers
will never take priority
583
00:42:12,655 --> 00:42:15,112
over that, as long as
I am running Facebook.
584
00:42:16,117 --> 00:42:18,449
Are you willing to
change your business model
585
00:42:18,536 --> 00:42:21,528
in the interest of protecting
individual privacy?
586
00:42:22,498 --> 00:42:24,739
Congresswoman,
we are... have made
587
00:42:24,834 --> 00:42:27,187
and are continuing to make changes
to reduce the amount of data that...
588
00:42:27,211 --> 00:42:30,169
No, are you willing
to change your business model
589
00:42:30,256 --> 00:42:33,248
in the interest of protecting
individual privacy?
590
00:42:35,011 --> 00:42:36,731
Congresswoman,
I'm not sure what that means.
591
00:42:39,640 --> 00:42:42,131
I don't think that tech
companies have demonstrated
592
00:42:42,226 --> 00:42:44,387
that we should have too much
confidence in them yet.
593
00:42:44,896 --> 00:42:46,496
I'm surprised, actually,
the debate there
594
00:42:46,522 --> 00:42:48,262
has focused on privacy,
595
00:42:48,357 --> 00:42:50,188
but the debate hasn't focused
around actually,
596
00:42:50,276 --> 00:42:52,107
I think,
what's much more critical,
597
00:42:52,195 --> 00:42:55,687
which is that Facebook
sells targeted adverts.
598
00:42:59,368 --> 00:43:00,699
We used to buy products.
599
00:43:01,454 --> 00:43:02,454
Now we are the product.
600
00:43:04,582 --> 00:43:06,727
All the platforms are different,
but Facebook particularly
601
00:43:06,751 --> 00:43:10,289
treats its users like fields
of corn to be harvested.
602
00:43:11,923 --> 00:43:13,959
Our attention is like oil.
603
00:43:20,056 --> 00:43:22,388
There's an amazing
amount of engineering going on
604
00:43:22,475 --> 00:43:24,932
under the hood of that
machine that you don't see,
605
00:43:25,019 --> 00:43:27,135
but changes the very
nature of what you see.
606
00:43:29,899 --> 00:43:31,389
But the algorithms are designed
607
00:43:31,484 --> 00:43:33,645
to essentially make you
feel engaged.
608
00:43:33,736 --> 00:43:36,068
So their whole
metric for success
609
00:43:36,155 --> 00:43:38,441
is keeping you there
as long as possible,
610
00:43:38,533 --> 00:43:41,650
and keeping you feeling
emotions as much as possible,
611
00:43:42,453 --> 00:43:44,694
so that you will be
a valuable commodity
612
00:43:44,789 --> 00:43:46,949
for the people who support
the work of these platforms,
613
00:43:46,999 --> 00:43:48,409
and that's the advertiser.
614
00:43:52,713 --> 00:43:56,831
Facebook have no interest
whatever in the content itself.
615
00:43:58,427 --> 00:44:00,213
There's no ranking for quality.
616
00:44:00,304 --> 00:44:03,011
There's no ranking for,
"is this good for you?"
617
00:44:03,099 --> 00:44:04,589
They don't do
anything to calculate
618
00:44:04,684 --> 00:44:06,140
the humanity of the content.
619
00:44:06,227 --> 00:44:08,639
[Ominous music]
620
00:44:19,240 --> 00:44:21,468
You know, you start
getting into this obsession
621
00:44:21,492 --> 00:44:23,372
with clicks, and the algorithm
is driving clicks
622
00:44:23,452 --> 00:44:26,785
and driving clicks, and
eventually you get to a spot
623
00:44:26,873 --> 00:44:29,455
where attention
becomes more expensive.
624
00:44:30,710 --> 00:44:33,042
And so people have
to keep pushing the boundary.
625
00:44:33,129 --> 00:44:35,916
And so things just
get crazier and crazier.
626
00:44:43,306 --> 00:44:44,366
What we're living through now
627
00:44:44,390 --> 00:44:46,255
is a misinformation crisis.
628
00:44:46,726 --> 00:44:48,466
The systematic pollution
629
00:44:48,561 --> 00:44:49,961
of the world's
information supplies.
630
00:44:56,110 --> 00:44:58,567
I think we've already
begun to see the beginnings
631
00:44:58,654 --> 00:45:00,269
of a very fuzzy type of truth.
632
00:45:00,865 --> 00:45:03,026
We're going to have
fake video and fake audio.
633
00:45:03,117 --> 00:45:05,574
And it will be entirely
synthetic, made by a machine.
634
00:45:25,473 --> 00:45:27,680
A GAN, a generative
adversarial network,
635
00:45:27,767 --> 00:45:30,383
is a race between
two neural networks.
636
00:45:31,812 --> 00:45:34,975
One trying to recognize
the true from the false,
637
00:45:35,066 --> 00:45:36,806
and the other
trying to generate.
638
00:45:38,945 --> 00:45:41,231
It's a competition between
these two that gives you
639
00:45:41,322 --> 00:45:44,485
an ability to generate
very realistic images.
640
00:45:52,083 --> 00:45:53,435
Right now, when you see a video,
641
00:45:53,459 --> 00:45:55,541
we can all just trust
that that's real.
642
00:45:59,757 --> 00:46:01,944
As soon as we start to realize
there's technology out there
643
00:46:01,968 --> 00:46:03,583
that can make you think
that a politician
644
00:46:03,678 --> 00:46:06,169
or a celebrity said
something and they didn't,
645
00:46:07,139 --> 00:46:08,450
or something
that really did happen,
646
00:46:08,474 --> 00:46:09,884
someone can just
claim that that's
647
00:46:09,976 --> 00:46:11,136
been doctored,
648
00:46:12,103 --> 00:46:13,593
how we can lose trust
in everything.
649
00:46:15,147 --> 00:46:16,291
I don't think we think that much
650
00:46:16,315 --> 00:46:17,851
about how bad things could get
651
00:46:17,942 --> 00:46:19,148
if we lose some of that trust.
652
00:46:31,872 --> 00:46:33,976
I know this sounds
like a very difficult problem
653
00:46:34,000 --> 00:46:36,412
and it's some sort
of evil beyond our control.
654
00:46:36,502 --> 00:46:37,537
It is not.
655
00:46:39,922 --> 00:46:41,913
Silicon Valley generally
loves to have slogans
656
00:46:42,008 --> 00:46:43,418
which express its values.
657
00:46:43,968 --> 00:46:45,924
"Move fast and break things"
658
00:46:46,012 --> 00:46:48,469
is one of the slogans on the
walls of every Facebook office.
659
00:46:49,724 --> 00:46:51,118
Well, you know,
it's time to slow down
660
00:46:51,142 --> 00:46:52,257
and build things again.
661
00:46:57,273 --> 00:46:58,638
The old gatekeeper is gone.
662
00:46:59,150 --> 00:47:00,936
What I, as a journalist
in this day and age
663
00:47:01,027 --> 00:47:02,187
want to be is a guide.
664
00:47:03,154 --> 00:47:05,048
And I'm one of those strange
people in the world today
665
00:47:05,072 --> 00:47:07,688
that believes social media,
with algorithms
666
00:47:07,783 --> 00:47:09,364
that are about
your best intentions
667
00:47:09,452 --> 00:47:12,034
could be the best thing that
ever happened to journalism.
668
00:47:16,208 --> 00:47:18,415
How do we step back
in again as publishers
669
00:47:18,502 --> 00:47:21,209
and as journalists
to kind of reassert control?
670
00:47:21,797 --> 00:47:24,288
If you can build tools
that empower people
671
00:47:24,884 --> 00:47:27,375
to do something to act
as a kind of a conscious filter
672
00:47:27,470 --> 00:47:29,711
for information,
because that's the moonshot.
673
00:47:33,017 --> 00:47:36,134
We wanted to build an app
that's a control panel
674
00:47:36,228 --> 00:47:38,469
for a healthy information habit.
675
00:47:40,066 --> 00:47:42,057
We have apps
that let us keep track of
676
00:47:42,151 --> 00:47:44,984
the number of calories
we consume, the running we do.
677
00:47:45,738 --> 00:47:47,444
I think we should
also have measurements
678
00:47:47,531 --> 00:47:49,647
of just how productive
679
00:47:49,742 --> 00:47:51,482
our information
consumption has been.
680
00:47:52,453 --> 00:47:55,195
Can we increase the chances
that in your daily life,
681
00:47:55,289 --> 00:47:57,746
you'll stumble across
an idea that will make you go,
682
00:47:57,833 --> 00:47:59,789
"That made me
think differently"?
683
00:48:02,088 --> 00:48:04,625
And I think we can if we
start training the algorithm
684
00:48:04,715 --> 00:48:07,331
to give us something we don't
know, but should know.
685
00:48:08,427 --> 00:48:11,339
That should be our metric
of success in journalism.
686
00:48:11,931 --> 00:48:13,671
Not how long
we manage to trap you
687
00:48:13,766 --> 00:48:16,178
in this endless
scroll of information.
688
00:48:17,937 --> 00:48:19,643
And I hope people
will understand
689
00:48:19,730 --> 00:48:21,708
that to have journalists
who really have your back,
690
00:48:21,732 --> 00:48:25,441
you have got to pay for that
experience in some form directly.
691
00:48:25,528 --> 00:48:28,520
You can't just do it
by renting out your attention
692
00:48:28,614 --> 00:48:29,614
to an advertiser.
693
00:48:32,076 --> 00:48:33,316
Part of the problem is
694
00:48:33,411 --> 00:48:36,073
people don't understand
the algorithms.
695
00:48:36,163 --> 00:48:38,404
If they did,
they would see a danger,
696
00:48:39,250 --> 00:48:40,706
but they'd also see a potential
697
00:48:40,793 --> 00:48:43,751
for us to amplify the
acquisition of real knowledge
698
00:48:43,838 --> 00:48:47,330
that surprises us,
challenges us, informs us,
699
00:48:47,425 --> 00:48:49,505
and makes us want to change
the world for the better.
700
00:49:20,624 --> 00:49:22,727
Life as one of the
first female fighter pilots
701
00:49:22,751 --> 00:49:25,868
was the best of times,
and it was the worst of times.
702
00:49:27,882 --> 00:49:31,545
It's just amazing
that you can put yourself
703
00:49:31,635 --> 00:49:34,342
in a machine
through extreme maneuvering
704
00:49:34,430 --> 00:49:36,591
and come out alive
at the other end.
705
00:49:37,141 --> 00:49:38,847
But it was also very difficult,
706
00:49:38,934 --> 00:49:41,300
because every single
fighter pilot that I know
707
00:49:41,395 --> 00:49:44,558
who has taken a life,
either civilian,
708
00:49:44,648 --> 00:49:46,388
even a legitimate
military target,
709
00:49:46,484 --> 00:49:48,645
they've all got very,
very difficult lives
710
00:49:48,736 --> 00:49:51,603
and they never walk away
as normal people.
711
00:49:54,241 --> 00:49:55,697
So, it was pretty
motivating for me
712
00:49:55,784 --> 00:49:57,069
to try to figure out, you know,
713
00:49:57,161 --> 00:49:58,401
there's got to be a better way.
714
00:50:01,790 --> 00:50:04,031
[Ominous music]
715
00:50:07,713 --> 00:50:10,625
I'm in Geneva to speak
with the United Nations
716
00:50:10,716 --> 00:50:12,377
about lethal autonomous weapons.
717
00:50:12,468 --> 00:50:14,880
I think war is a terrible event,
718
00:50:14,970 --> 00:50:16,460
and I wish
that we could avoid it,
719
00:50:16,555 --> 00:50:19,547
but I'm also a pessimist
and don't think that we can.
720
00:50:19,642 --> 00:50:21,928
So, I do think that
using autonomous weapons
721
00:50:22,019 --> 00:50:23,975
could potentially
make war as safe
722
00:50:24,063 --> 00:50:26,304
as one could possibly make it.
723
00:50:56,428 --> 00:50:59,010
Two years ago, a group
of academic researchers
724
00:50:59,098 --> 00:51:00,713
developed this open letter
725
00:51:00,808 --> 00:51:03,265
against lethal
autonomous weapons.
726
00:51:06,146 --> 00:51:07,682
The open letter came about,
727
00:51:07,773 --> 00:51:09,479
because like all technologies,
728
00:51:09,567 --> 00:51:12,229
AI is a technology that can
be used for good or for bad
729
00:51:12,820 --> 00:51:15,660
and we were at the point where people
were starting to consider using it
730
00:51:15,739 --> 00:51:18,776
in a military setting that we thought
was actually very dangerous.
731
00:51:20,869 --> 00:51:23,155
Apparently, all
of these AI researchers,
732
00:51:23,247 --> 00:51:25,238
it's almost
as if they woke up one day
733
00:51:25,332 --> 00:51:26,913
and looked around them and said,
734
00:51:27,001 --> 00:51:29,162
"oh, this is terrible.
This could really go wrong,
735
00:51:29,253 --> 00:51:31,494
even though these are
the technologies that I built."
736
00:51:33,424 --> 00:51:36,211
I never expected to be
an advocate for these issues,
737
00:51:36,302 --> 00:51:38,918
but as a scientist,
I feel a real responsibility
738
00:51:39,013 --> 00:51:41,800
to inform the discussion
and to warn of the risks.
739
00:51:48,606 --> 00:51:51,313
To begin the
proceedings I'd like to invite
740
00:51:51,400 --> 00:51:53,436
Dr. Missy Cummings
at this stage.
741
00:51:53,527 --> 00:51:57,065
She was one of the U.S. Navy's
first female fighter pilots.
742
00:51:57,156 --> 00:51:58,817
She's currently a professor
743
00:51:58,907 --> 00:52:01,694
in the Duke University
mechanical engineering
744
00:52:01,785 --> 00:52:05,198
and the director of the Humans
and Autonomy Laboratory.
745
00:52:05,289 --> 00:52:06,950
Missy,
you have the floor please.
746
00:52:07,791 --> 00:52:09,952
Thank you, and thank
you for inviting me here.
747
00:52:10,961 --> 00:52:12,667
When I was a fighter pilot,
748
00:52:12,755 --> 00:52:15,417
and you're asked
to bomb this target,
749
00:52:15,507 --> 00:52:17,498
it's incredibly stressful.
750
00:52:17,593 --> 00:52:19,073
It is one of the most
stressful things
751
00:52:19,136 --> 00:52:20,797
you can imagine in your life.
752
00:52:21,805 --> 00:52:25,639
You are potentially at risk
for surface to air missiles,
753
00:52:25,726 --> 00:52:27,432
you're trying to match
what you're seeing
754
00:52:27,519 --> 00:52:30,181
through your sensors
with the picture that you saw
755
00:52:30,272 --> 00:52:32,058
back on the aircraft carrier,
756
00:52:32,149 --> 00:52:35,733
to drop the bomb all
in potentially the fog of war
757
00:52:35,819 --> 00:52:37,150
in a changing environment.
758
00:52:37,738 --> 00:52:40,104
This is why there are
so many mistakes made.
759
00:52:41,325 --> 00:52:44,613
I have peers, colleagues
who have dropped bombs
760
00:52:44,703 --> 00:52:48,195
inadvertently on civilians,
who have killed friendly forces.
761
00:52:48,582 --> 00:52:50,948
Uh, these men
are never the same.
762
00:52:51,502 --> 00:52:53,959
They are completely
ruined as human beings
763
00:52:54,046 --> 00:52:55,206
when that happens.
764
00:52:56,048 --> 00:52:58,380
So, then this begs the question,
765
00:52:58,467 --> 00:53:01,755
is there ever a time
that you would want to use
766
00:53:01,845 --> 00:53:03,710
a lethal autonomous weapon?
767
00:53:04,098 --> 00:53:05,679
And I honestly will tell you,
768
00:53:05,766 --> 00:53:08,223
I do not think
this is a job for humans.
769
00:53:11,980 --> 00:53:13,436
Thank you, Missy, uh.
770
00:53:13,524 --> 00:53:16,937
It's my task now
to turn it over to you.
771
00:53:17,027 --> 00:53:20,110
First on the list is the
distinguished delegate of China.
772
00:53:20,197 --> 00:53:21,277
You have the floor, sir.
773
00:53:22,950 --> 00:53:24,350
Thank you very much.
774
00:53:24,952 --> 00:53:26,738
Many countries, including China,
775
00:53:26,829 --> 00:53:29,161
have been engaged
in the research
776
00:53:29,248 --> 00:53:30,954
and development
of such technologies.
777
00:53:34,002 --> 00:53:36,981
After having heard the
presentation of these various technologies,
778
00:53:37,005 --> 00:53:41,089
ultimately a human being has to be held
accountable for an illicit activity.
779
00:53:41,176 --> 00:53:43,016
How is ethics
designed in the context
780
00:53:43,095 --> 00:53:44,551
of these systems?
781
00:53:44,638 --> 00:53:47,926
Are they just responding
algorithmically to set inputs?
782
00:53:48,016 --> 00:53:50,302
We hear that the
military is indeed leading
783
00:53:50,394 --> 00:53:52,555
the process of developing
such kind of technologies.
784
00:53:52,646 --> 00:53:54,999
Now, we do see the
fully autonomous weapon systems
785
00:53:55,023 --> 00:53:56,809
as being especially problematic.
786
00:54:01,321 --> 00:54:03,687
It was surprising
to me, being at the UN
787
00:54:03,782 --> 00:54:07,070
and talking about the launch
of lethal autonomous weapons,
788
00:54:07,161 --> 00:54:10,324
to see no other people
with military experience.
789
00:54:10,873 --> 00:54:13,114
I felt like the UN should
get a failing grade
790
00:54:13,208 --> 00:54:14,698
for not having enough people
791
00:54:14,793 --> 00:54:16,875
with military experience
in the room.
792
00:54:16,962 --> 00:54:19,874
Whether or not you agree
with the military operation,
793
00:54:19,965 --> 00:54:21,956
you at least need to hear
from those stakeholders.
794
00:54:23,761 --> 00:54:25,922
Thank you very much, ambassador.
795
00:54:26,013 --> 00:54:28,220
Thank you everyone
for those questions.
796
00:54:28,307 --> 00:54:29,592
Missy, over to you.
797
00:54:33,979 --> 00:54:36,516
Thank you, thank you
for those great questions.
798
00:54:37,232 --> 00:54:40,599
I appreciate that you think
that the United States military
799
00:54:40,694 --> 00:54:44,562
is so advanced
in its AI development.
800
00:54:45,324 --> 00:54:49,192
The reality is,
we have no idea what we're doing
801
00:54:49,286 --> 00:54:52,153
when it comes to certification
of autonomous weapons
802
00:54:52,247 --> 00:54:54,533
or autonomous
technologies in general.
803
00:54:55,000 --> 00:54:57,958
In one sense, one of the
problems with the conversation
804
00:54:58,045 --> 00:55:02,539
that we're having today,
is that we really don't know
805
00:55:02,633 --> 00:55:05,124
what the right set of tests are,
806
00:55:05,219 --> 00:55:08,711
especially in helping
governments recognize
807
00:55:08,806 --> 00:55:12,719
what is not working AI, and
what is not ready to field AI.
808
00:55:13,435 --> 00:55:16,598
And if I were to beg
of you one thing in this body,
809
00:55:17,189 --> 00:55:20,556
we do need to come together
as an international community
810
00:55:20,651 --> 00:55:23,267
and set autonomous
weapon standards.
811
00:55:23,946 --> 00:55:27,404
People make errors
all the time in war.
812
00:55:27,491 --> 00:55:28,491
We know that.
813
00:55:29,284 --> 00:55:31,616
Having an autonomous
weapon system
814
00:55:31,703 --> 00:55:35,537
could in fact produce
substantially less loss of life.
815
00:55:39,127 --> 00:55:42,585
Thank you very
much, missy, for that response.
816
00:55:49,096 --> 00:55:50,961
There are two
problems with the argument
817
00:55:51,056 --> 00:55:52,575
that these weapons
will save lives,
818
00:55:52,599 --> 00:55:54,089
that they'll be
more discriminatory
819
00:55:54,184 --> 00:55:55,765
and therefore
there'll be less civilians
820
00:55:55,853 --> 00:55:56,853
caught in the crossfire.
821
00:55:57,396 --> 00:55:59,887
The first problem is,
that that's some way away.
822
00:56:00,566 --> 00:56:03,524
And the weapons that
will be sold very shortly
823
00:56:03,610 --> 00:56:05,396
will not have that
discriminatory power.
824
00:56:05,487 --> 00:56:07,340
The second problem is
that when we do get there,
825
00:56:07,364 --> 00:56:09,259
and we will eventually have
weapons that will be better
826
00:56:09,283 --> 00:56:11,490
than humans in their targeting,
827
00:56:11,577 --> 00:56:13,693
these will be weapons
of mass destruction.
828
00:56:14,288 --> 00:56:16,404
[Ominous music]
829
00:56:22,004 --> 00:56:24,290
History tells us
that we've been very lucky
830
00:56:24,381 --> 00:56:27,088
not to have the world
destroyed by nuclear weapons.
831
00:56:28,051 --> 00:56:29,916
But nuclear weapons
are difficult to build.
832
00:56:30,596 --> 00:56:32,632
You need to be
a nation to do that,
833
00:56:33,348 --> 00:56:35,088
whereas autonomous weapons,
834
00:56:35,183 --> 00:56:36,673
they are going
to be easy to obtain.
835
00:56:38,270 --> 00:56:41,478
That makes them more of a
challenge than nuclear weapons.
836
00:56:42,733 --> 00:56:45,520
I mean, previously
if you wanted to do harm,
837
00:56:45,611 --> 00:56:46,646
you needed an army.
838
00:56:48,864 --> 00:56:50,570
Now, you would have an algorithm
839
00:56:50,657 --> 00:56:53,364
that would be able to control
100 or 1000 drones.
840
00:56:54,494 --> 00:56:55,984
And so you would
no longer be limited
841
00:56:56,079 --> 00:56:57,535
by the number of people you had.
842
00:57:11,887 --> 00:57:12,989
We don't have to go
down this road.
843
00:57:13,013 --> 00:57:14,378
We get to make choices as to
844
00:57:14,514 --> 00:57:17,130
what technologies get used
and how they get used.
845
00:57:17,225 --> 00:57:19,136
We could just decide
that this was a technology
846
00:57:19,227 --> 00:57:21,309
that we shouldn't use
for killing people.
847
00:57:21,813 --> 00:57:25,180
[Somber music]
848
00:57:45,671 --> 00:57:47,787
We're going to be
building up our military,
849
00:57:48,298 --> 00:57:52,382
and it will be so powerful,
nobody's going to mess with us.
850
00:58:19,579 --> 00:58:23,163
Somehow we feel it's better
for a human to take our life
851
00:58:23,250 --> 00:58:24,990
than for a robot
to take our life.
852
00:58:27,254 --> 00:58:30,337
Instead of a human having
to pan and zoom a camera
853
00:58:30,424 --> 00:58:32,005
to find a person in the crowd,
854
00:58:32,676 --> 00:58:34,462
the automation
would pan and zoom
855
00:58:34,553 --> 00:58:36,134
and find
the person in the crowd.
856
00:58:36,763 --> 00:58:40,597
But either way, the outcome
potentially would be the same.
857
00:58:40,684 --> 00:58:42,766
So, lethal autonomous weapons
858
00:58:43,186 --> 00:58:45,268
don't actually
change this process.
859
00:58:46,606 --> 00:58:49,564
The process is still human
approved at the very beginning.
860
00:58:51,695 --> 00:58:54,402
And so what is it
that we're trying to ban?
861
00:58:56,116 --> 00:58:58,232
Do you want to ban
the weapon itself?
862
00:58:58,326 --> 00:59:00,783
Do you want to ban the sensor
that's doing the targeting,
863
00:59:00,871 --> 00:59:03,157
or really do you want
to ban the outcome?
864
00:59:12,883 --> 00:59:15,920
One of the difficulties
about the conversation on AI
865
00:59:16,011 --> 00:59:18,297
is conflating the near
term with long term.
866
00:59:18,889 --> 00:59:20,867
We could carry on those...
Most of these conversations,
867
00:59:20,891 --> 00:59:22,811
but, but let's not get them
all kind of rolled up
868
00:59:22,893 --> 00:59:24,429
into one big ball.
869
00:59:24,519 --> 00:59:26,555
Because that ball,
I think, over-hypes
870
00:59:27,272 --> 00:59:28,978
what is possible today
and kind of
871
00:59:29,066 --> 00:59:30,226
simultaneously under-hypes
872
00:59:30,317 --> 00:59:31,648
what is ultimately possible.
873
00:59:38,784 --> 00:59:40,240
Want to use this brush?
874
00:59:49,669 --> 00:59:51,409
Can you make a portrait?
Can you draw me?
875
00:59:52,339 --> 00:59:54,455
- No?
- How about another picture
876
00:59:54,549 --> 00:59:56,915
- of Charlie Brown?
- Charlie Brown's perfect.
877
00:59:57,511 --> 01:00:00,628
I'm going to move the
painting like this, all right?
878
01:00:01,014 --> 01:00:04,677
Right, when we do it, like,
when it runs out of paint,
879
01:00:04,768 --> 01:00:06,929
it makes a really
cool pattern, right?
880
01:00:07,020 --> 01:00:08,020
It does.
881
01:00:08,522 --> 01:00:10,103
One of the most
interesting things about
882
01:00:10,190 --> 01:00:12,306
when I watch my daughter
paint is it's just free.
883
01:00:13,068 --> 01:00:14,433
She's just pure expression.
884
01:00:15,737 --> 01:00:17,318
My whole art is trying to see
885
01:00:17,405 --> 01:00:19,987
how much of that
I can capture and code,
886
01:00:20,075 --> 01:00:22,316
and then have my robots
repeat that process.
887
01:00:26,790 --> 01:00:27,654
Yes.
888
01:00:27,749 --> 01:00:28,534
The first machine learning
889
01:00:28,625 --> 01:00:29,785
algorithms I started using
890
01:00:29,876 --> 01:00:31,104
were something
called style transfer.
891
01:00:31,128 --> 01:00:32,743
They were convolutional
neural networks.
892
01:00:35,132 --> 01:00:37,318
It can look at an image, then
look at another piece of art
893
01:00:37,342 --> 01:00:38,457
and it can apply the style
894
01:00:38,552 --> 01:00:39,962
from the piece
of art to the image.
895
01:00:51,314 --> 01:00:53,354
Every brush stroke,
my robots take pictures
896
01:00:53,441 --> 01:00:55,727
of what they are painting,
and use that to decide
897
01:00:55,819 --> 01:00:57,184
on the next brush stroke.
898
01:00:58,947 --> 01:01:02,064
I try and get as many
of my algorithms in as possible.
899
01:01:03,493 --> 01:01:06,155
Depending on where it is,
it might apply a GAN or a CNN,
900
01:01:06,246 --> 01:01:08,282
but back and forth,
six or seven stages
901
01:01:08,373 --> 01:01:10,910
painting over itself,
searching for the image
902
01:01:11,001 --> 01:01:12,207
that it wants to paint.
903
01:01:14,004 --> 01:01:17,622
For me, creative AI is
not one single god algorithm,
904
01:01:17,716 --> 01:01:20,833
it's smashing as many algorithms
as you can together
905
01:01:20,927 --> 01:01:22,542
and letting them
fight for the outcomes,
906
01:01:23,138 --> 01:01:25,470
and you get these, like,
ridiculously creative results.
907
01:01:32,397 --> 01:01:33,875
Did my machine make
this piece of art?
908
01:01:33,899 --> 01:01:35,435
Absolutely not, I'm the artist.
909
01:01:35,525 --> 01:01:37,982
But it made every single
aesthetic decision,
910
01:01:38,069 --> 01:01:41,106
and it made every single
brush stroke in this painting.
911
01:01:45,660 --> 01:01:48,652
There's this big question of, "can
robots and machines be creative?
912
01:01:48,747 --> 01:01:51,580
Can they be artists?" And I think
they are very different things.
913
01:01:56,755 --> 01:01:58,996
Art uses a lot of creativity,
but art
914
01:01:59,090 --> 01:02:01,547
is one person communicating
with another person.
915
01:02:04,971 --> 01:02:07,508
Until a machine has something
it wants to tell us,
916
01:02:07,599 --> 01:02:09,430
it won't be making art,
because otherwise
917
01:02:09,517 --> 01:02:14,056
it's just... just creating
without a message.
918
01:02:20,362 --> 01:02:21,962
In machine learning you can say,
919
01:02:22,030 --> 01:02:25,238
"here's a million recordings
of classical music.
920
01:02:25,784 --> 01:02:27,595
Now, go make me something
kind of like Brahms."
921
01:02:27,619 --> 01:02:28,619
And it can do that.
922
01:02:28,703 --> 01:02:29,988
But it can't make the thing
923
01:02:30,080 --> 01:02:31,490
that comes after Brahms.
924
01:02:32,916 --> 01:02:34,977
It can make a bunch of random
stuff and then poll humans.
925
01:02:35,001 --> 01:02:36,270
"Do you like this?
Do you like that?"
926
01:02:36,294 --> 01:02:37,374
But that's different.
927
01:02:37,462 --> 01:02:38,862
That's not
what a composer ever did.
928
01:02:40,423 --> 01:02:44,712
A composer felt something
and created something
929
01:02:45,553 --> 01:02:48,545
that mapped to the human
experience, right?
930
01:02:58,024 --> 01:02:59,730
I've spent my
life trying to build
931
01:02:59,859 --> 01:03:01,520
general artificial intelligence.
932
01:03:01,611 --> 01:03:05,524
I feel humbled
by how little we know
933
01:03:06,116 --> 01:03:08,448
and by how little we
understand about ourselves.
934
01:03:09,995 --> 01:03:12,077
We just don't
understand how we work.
935
01:03:16,126 --> 01:03:18,742
The human brain can do
over a quadrillion calculations
936
01:03:18,837 --> 01:03:21,419
per second
on 20 watts of energy.
937
01:03:21,923 --> 01:03:23,234
A computer right
now that would be able
938
01:03:23,258 --> 01:03:25,089
to do that many
calculations per second
939
01:03:25,176 --> 01:03:27,758
would run on 20 million
watts of energy.
940
01:03:28,930 --> 01:03:30,716
It's an unbelievable system.
941
01:03:32,767 --> 01:03:35,179
The brain can
learn the relationships
942
01:03:35,270 --> 01:03:36,350
between cause and effect,
943
01:03:36,938 --> 01:03:38,599
and build a world
inside of our heads.
944
01:03:40,483 --> 01:03:42,211
This is the reason
why you can close your eyes
945
01:03:42,235 --> 01:03:45,318
and imagine what it's like to,
you know, drive to the airport
946
01:03:45,405 --> 01:03:46,861
in a rocket ship or something.
947
01:03:47,532 --> 01:03:50,194
You can just play forward in
time in any direction you wish,
948
01:03:50,285 --> 01:03:52,025
and ask whatever question
you wish, which is
949
01:03:52,120 --> 01:03:54,202
very different from deep
learning style systems
950
01:03:54,289 --> 01:03:57,781
where all you get is a mapping
between pixels and a label.
951
01:03:59,502 --> 01:04:00,902
That's a good brush stroke.
952
01:04:01,546 --> 01:04:02,546
Is that Snoopy?
953
01:04:03,048 --> 01:04:07,587
Yeah. Because Snoopy
is okay to get pink.
954
01:04:07,677 --> 01:04:11,590
Because guys can be pink
like poodle's hair.
955
01:04:14,934 --> 01:04:16,370
I'm trying to learn...
I'm actually trying to teach
956
01:04:16,394 --> 01:04:17,804
my robots to paint like you.
957
01:04:17,937 --> 01:04:19,802
To try and get
the patterns that you can make.
958
01:04:19,939 --> 01:04:20,939
It's hard.
959
01:04:21,274 --> 01:04:23,014
You're a better
painter than my robots.
960
01:04:23,109 --> 01:04:24,109
Isn't that crazy?
961
01:04:24,152 --> 01:04:25,312
Yeah.
962
01:04:29,991 --> 01:04:31,982
Much like the Wright brothers
963
01:04:32,077 --> 01:04:34,238
learned how to build
an airplane by studying birds,
964
01:04:34,329 --> 01:04:35,723
I think that it's
important that we study
965
01:04:35,747 --> 01:04:37,453
the right parts of neuroscience
966
01:04:37,540 --> 01:04:39,826
in order to have
some foundational ideas
967
01:04:39,959 --> 01:04:42,621
about building systems
that work like the brain.
968
01:05:19,791 --> 01:05:21,782
[Somber music]
969
01:07:26,209 --> 01:07:28,495
Through my research
career, we've been very focused
970
01:07:28,586 --> 01:07:31,328
on developing this notion
of a brain computer interface.
971
01:07:32,715 --> 01:07:36,003
Where we started was
in epilepsy patients.
972
01:07:36,594 --> 01:07:38,505
They require having
electrodes placed
973
01:07:38,596 --> 01:07:40,199
on the surface
of their brain to figure out
974
01:07:40,223 --> 01:07:42,054
where their seizures
are coming from.
975
01:07:42,725 --> 01:07:45,592
By putting electrodes directly
on the surface of the brain,
976
01:07:45,687 --> 01:07:48,349
you get the highest
resolution of brain activity.
977
01:07:49,774 --> 01:07:51,614
It's kind of like
if you're outside of a house,
978
01:07:51,693 --> 01:07:53,354
and there's
a party going on inside,
979
01:07:53,945 --> 01:07:57,358
basically you... all you really hear
is the bass, just a...
980
01:07:57,448 --> 01:07:59,468
Whereas if you really
want to hear what's going on
981
01:07:59,492 --> 01:08:00,902
and the specific conversations,
982
01:08:00,994 --> 01:08:02,530
you have to get inside the walls
983
01:08:02,620 --> 01:08:04,576
to hear that higher
frequency information.
984
01:08:04,664 --> 01:08:06,064
It's very similar
to brain activity.
985
01:08:07,125 --> 01:08:08,125
All right.
986
01:08:20,471 --> 01:08:25,932
So, Frida, measure... measure
about ten centimeters back,
987
01:08:26,519 --> 01:08:28,079
I just want to see
what that looks like.
988
01:08:28,896 --> 01:08:32,434
And this really provided us
with this unique opportunity
989
01:08:32,525 --> 01:08:35,312
to record directly
from a human brain,
990
01:08:35,403 --> 01:08:37,610
to start to understand
the physiology.
991
01:08:44,037 --> 01:08:46,028
In terms of the data
that is produced
992
01:08:46,122 --> 01:08:48,329
by recording directly
from the surface of the brain,
993
01:08:48,416 --> 01:08:49,701
it's substantial.
994
01:08:53,004 --> 01:08:55,165
Machine learning
is a critical tool
995
01:08:55,256 --> 01:08:57,542
for how we understand
brain function
996
01:08:57,634 --> 01:08:59,545
because what machine
learning does,
997
01:08:59,636 --> 01:09:01,297
is it handles complexity.
998
01:09:02,221 --> 01:09:05,088
It manages information
and simplifies it in a way
999
01:09:05,183 --> 01:09:07,048
that allows us
to have much deeper insights
1000
01:09:07,143 --> 01:09:09,179
into how the brain
interacts with itself.
1001
01:09:15,485 --> 01:09:17,225
You know,
projecting towards the future,
1002
01:09:17,737 --> 01:09:19,443
if you had the opportunity
1003
01:09:19,530 --> 01:09:21,191
where I could do
a surgery on you,
1004
01:09:21,282 --> 01:09:23,318
it's no more risky than LASIK,
1005
01:09:23,409 --> 01:09:25,570
but I could substantially
improve your attention
1006
01:09:25,662 --> 01:09:27,368
and your memory,
would you want it?
1007
01:09:43,930 --> 01:09:46,967
It's hard to fathom,
but AI is going to interpret
1008
01:09:47,058 --> 01:09:48,423
what our brains want it to do.
1009
01:09:50,520 --> 01:09:52,135
If you think
about the possibilities
1010
01:09:52,271 --> 01:09:53,602
with a brain machine interface,
1011
01:09:53,690 --> 01:09:55,450
humans will be able
to think with each other.
1012
01:09:59,946 --> 01:10:01,382
Our imagination is going to say,
"Oh, you're going to hear
1013
01:10:01,406 --> 01:10:03,692
their voice in your head."
No, that's just talking.
1014
01:10:03,783 --> 01:10:05,903
It's going to be different.
It's going to be thinking.
1015
01:10:09,247 --> 01:10:10,737
And it's going
to be super strange,
1016
01:10:10,832 --> 01:10:12,432
and we're going to be
very not used to it.
1017
01:10:14,585 --> 01:10:17,543
It's almost like two
brains meld into one
1018
01:10:18,047 --> 01:10:19,878
and have a thought
process together.
1019
01:10:22,176 --> 01:10:24,176
What that'll do for
understanding and communication
1020
01:10:24,303 --> 01:10:26,168
and empathy is pretty dramatic.
1021
01:10:51,205 --> 01:10:53,085
When you have a
brain computer interface,
1022
01:10:53,166 --> 01:10:54,827
now your ability
to touch the world
1023
01:10:54,917 --> 01:10:56,453
extends far beyond your body.
1024
01:10:59,130 --> 01:11:01,621
You can now go on virtual
vacations any time you want,
1025
01:11:02,049 --> 01:11:03,164
to do anything you want,
1026
01:11:03,259 --> 01:11:04,749
to be a different
person if you want.
1027
01:11:08,097 --> 01:11:10,284
But you know, we're just going to
keep track of a few of your thoughts,
1028
01:11:10,308 --> 01:11:11,618
and we're not going
to charge you that much.
1029
01:11:11,642 --> 01:11:13,303
It will be 100 bucks,
you interested?
1030
01:11:17,482 --> 01:11:19,084
If somebody can have
access to your thoughts,
1031
01:11:19,108 --> 01:11:21,224
how can that be pilfered,
1032
01:11:21,319 --> 01:11:23,230
how can that be abused,
how can that be
1033
01:11:23,321 --> 01:11:24,686
used to manipulate you?
1034
01:11:27,784 --> 01:11:29,775
What happens when a corporation
gets involved
1035
01:11:29,869 --> 01:11:31,734
and you have now
large aggregates
1036
01:11:31,829 --> 01:11:33,945
of human thoughts and data
1037
01:11:35,333 --> 01:11:37,494
and your resolution for
predicting individual behavior
1038
01:11:37,585 --> 01:11:39,200
becomes so much more profound
1039
01:11:40,755 --> 01:11:42,837
that you can really
manipulate not just people,
1040
01:11:42,924 --> 01:11:45,381
but politics
and governments and society?
1041
01:11:48,179 --> 01:11:50,407
And if it becomes this, you
know, how much does the benefit
1042
01:11:50,431 --> 01:11:52,431
outweigh the potential thing
that you're giving up?
1043
01:12:06,823 --> 01:12:09,485
Whether it's
50 years, 100 years,
1044
01:12:09,575 --> 01:12:10,906
even let's say 200 years,
1045
01:12:10,993 --> 01:12:13,735
that's still
such a small blip of time
1046
01:12:13,830 --> 01:12:16,446
relative to our human evolution
that it's immaterial.
1047
01:12:34,684 --> 01:12:36,140
Human history is 100,000 years.
1048
01:12:37,061 --> 01:12:38,551
Imagine if it's a 500-page book.
1049
01:12:40,106 --> 01:12:41,687
Each page is 200 years.
1050
01:12:43,359 --> 01:12:45,850
For the first 499 pages,
1051
01:12:45,945 --> 01:12:47,481
people got around on horses
1052
01:12:48,239 --> 01:12:50,355
and they spoke
to each other through letters,
1053
01:12:51,450 --> 01:12:53,361
and there were
under a billion people on Earth.
1054
01:12:57,540 --> 01:12:59,121
On the last page of the book,
1055
01:12:59,208 --> 01:13:02,575
we have the first cars
and phones and electricity.
1056
01:13:04,422 --> 01:13:05,983
We've crossed the one, two,
three, four and five,
1057
01:13:06,007 --> 01:13:08,214
six, and seven
billion person marks.
1058
01:13:08,301 --> 01:13:09,916
So, nothing about this
is normal.
1059
01:13:10,011 --> 01:13:11,797
We are living
in a complete anomaly.
1060
01:13:15,016 --> 01:13:16,131
For most of human history,
1061
01:13:16,225 --> 01:13:17,635
the world
you grew up in was normal.
1062
01:13:17,727 --> 01:13:18,842
And it was naive to believe
1063
01:13:18,936 --> 01:13:20,096
that this is a special time.
1064
01:13:21,105 --> 01:13:22,225
Now, this is a special time.
1065
01:13:28,112 --> 01:13:30,273
Provided that science
is allowed to continue
1066
01:13:30,364 --> 01:13:34,198
on a broad front, then it does
look... it's very, very likely
1067
01:13:34,285 --> 01:13:36,742
that we will eventually
develop human level AI.
1068
01:13:39,206 --> 01:13:41,367
We know that human
level thinking is possible
1069
01:13:41,459 --> 01:13:44,292
and can be produced
by a physical system.
1070
01:13:44,378 --> 01:13:46,619
In our case,
it weighs three pounds
1071
01:13:46,714 --> 01:13:47,999
and sits inside of a cranium,
1072
01:13:48,758 --> 01:13:51,215
but in principle,
the same types of computations
1073
01:13:51,302 --> 01:13:54,544
could be implemented in some
other substrate, like a machine.
1074
01:14:00,061 --> 01:14:02,768
There's wide disagreement
between different experts.
1075
01:14:02,855 --> 01:14:05,267
So, there are experts
who are convinced
1076
01:14:05,775 --> 01:14:08,016
we will certainly have
this within 10-15 years,
1077
01:14:08,110 --> 01:14:09,725
and there are experts
who are convinced
1078
01:14:09,820 --> 01:14:11,026
we will never get there
1079
01:14:11,113 --> 01:14:12,694
or it'll take
many hundreds of years.
1080
01:14:32,885 --> 01:14:34,905
I think even when we
do reach human level AI,
1081
01:14:34,929 --> 01:14:36,729
I think the further step
to super intelligence
1082
01:14:36,806 --> 01:14:38,512
is likely to happen quickly.
1083
01:14:41,352 --> 01:14:44,389
Once AI reaches a level
slightly greater than that,
1084
01:14:44,480 --> 01:14:47,347
the human scientist,
then the further developments
1085
01:14:47,441 --> 01:14:49,773
in artificial intelligence
will be driven increasingly
1086
01:14:49,860 --> 01:14:50,940
by the AI itself.
1087
01:14:54,365 --> 01:14:58,483
You get the runaway AI effect,
an intelligence explosion.
1088
01:15:00,204 --> 01:15:02,570
We have a word for 130 IQ.
1089
01:15:02,665 --> 01:15:03,780
We say smart.
1090
01:15:03,874 --> 01:15:05,205
Eighty IQ we say stupid.
1091
01:15:05,584 --> 01:15:07,540
I mean, we don't have
a word for 12,000 IQ.
1092
01:15:09,046 --> 01:15:11,207
It's so unfathomable for us.
1093
01:15:12,383 --> 01:15:14,840
Disease and poverty
and climate change
1094
01:15:14,927 --> 01:15:16,667
and aging and death
and all this stuff
1095
01:15:16,762 --> 01:15:18,218
we think is unconquerable.
1096
01:15:18,764 --> 01:15:20,254
Every single one
of them becomes easy
1097
01:15:20,349 --> 01:15:21,885
to a super intelligent AI.
1098
01:15:22,643 --> 01:15:24,975
Think of all the
possible technologies
1099
01:15:25,604 --> 01:15:27,890
perfectly realistic
virtual realities,
1100
01:15:28,441 --> 01:15:31,399
space colonies, all of those
things that we could do
1101
01:15:31,485 --> 01:15:34,898
over a millennia
with super intelligence,
1102
01:15:34,989 --> 01:15:36,650
you might get them very quickly.
1103
01:15:38,951 --> 01:15:41,738
You get a rush
to technological maturity.
1104
01:16:08,689 --> 01:16:10,649
We don't really know
how the universe began.
1105
01:16:11,692 --> 01:16:13,683
We don't really
know how life began.
1106
01:16:14,987 --> 01:16:16,131
Whether you're religious or not,
1107
01:16:16,155 --> 01:16:17,361
the idea of having
1108
01:16:17,448 --> 01:16:18,984
a super intelligence,
1109
01:16:20,743 --> 01:16:22,583
it's almost like we have
god on the planet now.
1110
01:16:52,608 --> 01:16:54,269
Even at the earliest space
1111
01:16:54,360 --> 01:16:57,568
when the field of artificial
intelligence was just launched
1112
01:16:57,655 --> 01:17:00,237
and some of the pioneers
were super optimistic,
1113
01:17:00,324 --> 01:17:02,690
they thought they could have
this cracked in ten years,
1114
01:17:02,785 --> 01:17:04,491
there seems to have been
no thought given
1115
01:17:04,578 --> 01:17:06,569
to what would happen
if they succeeded.
1116
01:17:07,289 --> 01:17:09,575
[Ominous music]
1117
01:17:15,297 --> 01:17:16,912
An existential risk,
1118
01:17:17,007 --> 01:17:19,714
it's a risk from which
there would be no recovery.
1119
01:17:21,178 --> 01:17:24,170
It's kind of an end, premature
end to the human story.
1120
01:17:28,602 --> 01:17:32,140
We can't approach this by
just learning from experience.
1121
01:17:32,731 --> 01:17:35,063
We invent cars,
we find that they crash,
1122
01:17:35,151 --> 01:17:37,016
so we invent seatbelts
and traffic lights
1123
01:17:37,111 --> 01:17:39,067
and gradually we kind
of get a handle on that.
1124
01:17:40,573 --> 01:17:42,109
That's the way
we tend to proceed.
1125
01:17:42,199 --> 01:17:44,190
We muddle through
and adjust as we go along.
1126
01:17:44,827 --> 01:17:46,033
But with an existential risk,
1127
01:17:46,120 --> 01:17:48,202
you really need
a proactive approach.
1128
01:17:50,082 --> 01:17:52,915
You can't learn from failure,
you don't get a second try.
1129
01:17:59,091 --> 01:18:01,753
You can't take something
smarter than you back.
1130
01:18:02,761 --> 01:18:04,156
The rest of the animals
on the planet
1131
01:18:04,180 --> 01:18:05,920
definitely want
to take humans back.
1132
01:18:07,975 --> 01:18:08,805
They can't, it's too late.
1133
01:18:08,893 --> 01:18:10,383
We're here, we're in charge now.
1134
01:18:15,691 --> 01:18:18,649
One class of concern
is alignment failure.
1135
01:18:20,779 --> 01:18:22,644
What we would see
is this powerful system
1136
01:18:22,781 --> 01:18:25,944
that is pursuing some
objective that is independent
1137
01:18:26,035 --> 01:18:28,242
of our human goals and values.
1138
01:18:31,123 --> 01:18:34,081
The problem would not be that
it would hate us or resent us,
1139
01:18:35,044 --> 01:18:37,000
it would be indifferent
to us and would optimize
1140
01:18:37,087 --> 01:18:40,045
the rest of the world according
to this different criteria.
1141
01:18:42,009 --> 01:18:44,921
A little bit like there might
be an ant colony somewhere,
1142
01:18:45,012 --> 01:18:47,048
and then we decide we want
a parking lot there.
1143
01:18:49,892 --> 01:18:52,474
I mean, it's not because
we dislike, like, hate the ants,
1144
01:18:53,103 --> 01:18:55,344
it's just we had some other goal
and they didn't factor
1145
01:18:55,439 --> 01:18:57,020
into our utility function.
1146
01:19:04,573 --> 01:19:05,938
The big word is alignment.
1147
01:19:06,867 --> 01:19:08,858
It's about taking
this tremendous power
1148
01:19:09,453 --> 01:19:11,819
and pointing it
in the right direction.
1149
01:19:19,421 --> 01:19:21,582
We come with some values.
1150
01:19:22,383 --> 01:19:24,749
We like those feelings,
we don't like other ones.
1151
01:19:26,303 --> 01:19:28,715
Now, a computer doesn't
get those out of the box.
1152
01:19:29,515 --> 01:19:32,848
Where it's going to get those,
is from us.
1153
01:19:37,106 --> 01:19:39,188
And if it all
goes terribly wrong
1154
01:19:39,733 --> 01:19:42,691
and artificial intelligence
builds giant robots
1155
01:19:42,778 --> 01:19:44,138
that kill all humans
and take over,
1156
01:19:44,196 --> 01:19:45,777
you know what?
It'll be our fault.
1157
01:19:46,740 --> 01:19:48,731
If we're going
to build these things,
1158
01:19:49,326 --> 01:19:51,487
we have to instill them
with our values.
1159
01:19:52,204 --> 01:19:53,694
And if we're not clear
about that,
1160
01:19:53,789 --> 01:19:55,309
then yeah,
they probably will take over
1161
01:19:55,374 --> 01:19:56,814
and it'll all be horrible.
1162
01:19:56,875 --> 01:19:57,990
But that's true for kids.
1163
01:20:15,686 --> 01:20:18,519
Empathy, to me, is
like the most important thing
1164
01:20:18,605 --> 01:20:20,470
that everyone should have.
1165
01:20:20,566 --> 01:20:23,023
I mean, that's, that's what's
going to save the world.
1166
01:20:26,030 --> 01:20:27,645
So, regardless of machines,
1167
01:20:27,740 --> 01:20:29,947
that's the first thing
I would want to teach my son
1168
01:20:30,034 --> 01:20:31,114
if that's teachable.
1169
01:20:32,703 --> 01:20:35,365
♪
1170
01:20:36,498 --> 01:20:38,580
I don't think we
appreciate how much nuance
1171
01:20:38,667 --> 01:20:40,658
goes into our value system.
1172
01:20:41,712 --> 01:20:43,077
It's very specific.
1173
01:20:44,882 --> 01:20:47,214
You think programming
a robot to walk
1174
01:20:47,301 --> 01:20:48,882
is hard or recognize faces,
1175
01:20:49,803 --> 01:20:51,919
programming it
to understand subtle values
1176
01:20:52,014 --> 01:20:53,379
is much more difficult.
1177
01:20:56,477 --> 01:20:58,559
Say that we want
the AI to value life.
1178
01:20:59,521 --> 01:21:01,291
But now it says, "okay,
well, if we want to value life,
1179
01:21:01,315 --> 01:21:03,931
the species that's killing
the most life is humans.
1180
01:21:04,902 --> 01:21:05,982
Let's get rid of them."
1181
01:21:10,282 --> 01:21:12,773
Even if we could get
the AI to do what we want,
1182
01:21:12,868 --> 01:21:14,824
how will we humans
then choose to use
1183
01:21:14,912 --> 01:21:16,493
this powerful new technology?
1184
01:21:18,957 --> 01:21:21,019
These are not questions
just for people like myself,
1185
01:21:21,043 --> 01:21:22,658
technologists to think about.
1186
01:21:23,921 --> 01:21:25,912
These are questions
that touch all of society,
1187
01:21:26,006 --> 01:21:28,418
and all of society needs
to come up with the answers.
1188
01:21:30,719 --> 01:21:32,505
One of the mistakes
that's easy to make
1189
01:21:32,596 --> 01:21:34,177
is that the future is something
1190
01:21:34,264 --> 01:21:35,754
that we're going
to have to adapt to,
1191
01:21:36,517 --> 01:21:38,849
as opposed
to the future is the product
1192
01:21:38,977 --> 01:21:40,717
of the decisions you make today.
1193
01:22:17,433 --> 01:22:18,593
♪ People ♪
1194
01:22:24,064 --> 01:22:26,146
♪ We're only people ♪
1195
01:22:32,114 --> 01:22:33,979
♪ There's not much ♪
1196
01:22:35,492 --> 01:22:37,357
♪ anyone can do ♪
1197
01:22:38,412 --> 01:22:41,154
♪ really do about that ♪
1198
01:22:43,667 --> 01:22:46,283
♪ but it hasn't stopped us yet ♪
1199
01:22:48,755 --> 01:22:49,870
♪ People ♪
1200
01:22:53,427 --> 01:22:57,796
♪ We know so little
about ourselves ♪
1201
01:23:03,312 --> 01:23:04,677
♪ Just enough ♪
1202
01:23:07,107 --> 01:23:08,768
♪ to want to be ♪
1203
01:23:09,693 --> 01:23:13,811
♪ nearly anybody else ♪
1204
01:23:14,990 --> 01:23:17,652
♪ Now how does that add up ♪
1205
01:23:18,577 --> 01:23:23,446
♪ Oh, friends, all my friends ♪
1206
01:23:23,540 --> 01:23:28,375
♪ Oh, I hope you're
somewhere smiling ♪
1207
01:23:32,299 --> 01:23:35,291
♪ Just know I think about you ♪
1208
01:23:36,136 --> 01:23:40,721
♪ more kindly than you
and I have ever been ♪
1209
01:23:44,228 --> 01:23:48,346
♪ Now see you the next
time round up there ♪
1210
01:23:48,941 --> 01:23:53,981
♪ Oh ♪
1211
01:23:54,863 --> 01:23:58,151
♪ Oh ♪
1212
01:23:59,660 --> 01:24:06,657
♪ Oh ♪
1213
01:24:11,004 --> 01:24:12,039
♪ People ♪
1214
01:24:17,803 --> 01:24:19,794
♪ What's the deal ♪
1215
01:24:26,520 --> 01:24:27,851
♪ You have been hurt ♪