1
00:00:29,581 --> 00:00:32,549
- Intelligence is
the ability to understand.
2
00:00:32,549 --> 00:00:34,827
We passed on what
we know to machines.
3
00:00:34,827 --> 00:00:36,346
- The rise of
artificial intelligence
4
00:00:36,346 --> 00:00:38,797
is happening fast, but some
fear the new technology
5
00:00:38,797 --> 00:00:41,524
might have more problems
than anticipated.
6
00:00:41,524 --> 00:00:43,043
- We will not control it.
7
00:01:42,378 --> 00:01:44,897
- Artificially
intelligent algorithms are here,
8
00:01:44,897 --> 00:01:46,485
but this is only the beginning.
9
00:01:56,840 --> 00:01:59,395
- In the age of
AI, data is the new oil.
10
00:02:01,673 --> 00:02:03,709
- Today, Amazon,
Google and Facebook
11
00:02:03,709 --> 00:02:05,953
are richer and more
powerful than any companies
12
00:02:05,953 --> 00:02:08,335
that have ever existed
throughout human history.
13
00:02:09,543 --> 00:02:11,234
- A handful
of people working
14
00:02:11,234 --> 00:02:13,167
at a handful of
technology companies
15
00:02:13,167 --> 00:02:15,894
steer what a billion
people are thinking today.
16
00:02:17,137 --> 00:02:19,139
- This
technology is changing
17
00:02:19,139 --> 00:02:21,002
what it means to be human.
18
00:03:16,334 --> 00:03:17,749
- Artificial intelligence
19
00:03:17,749 --> 00:03:20,648
is simply non-biological
intelligence.
20
00:03:21,822 --> 00:03:24,031
And intelligence itself
is simply the ability
21
00:03:24,031 --> 00:03:25,446
to accomplish goals.
22
00:03:27,966 --> 00:03:30,520
I'm convinced that AI
will ultimately be either
23
00:03:30,520 --> 00:03:32,350
the best thing ever
to happen to humanity,
24
00:03:32,350 --> 00:03:34,144
or the worst thing
ever to happen.
25
00:03:35,732 --> 00:03:38,977
We can use it to solve all
of today's and tomorrow's
26
00:03:38,977 --> 00:03:43,982
greatest problems:
cure diseases, deal
with climate change,
27
00:03:45,259 --> 00:03:47,330
lift everybody out of poverty.
28
00:03:48,883 --> 00:03:51,783
But, we could use exactly
the same technology
29
00:03:51,783 --> 00:03:56,305
to create a brutal global
dictatorship with unprecedented
30
00:03:56,305 --> 00:03:59,135
surveillance and
inequality and suffering.
31
00:04:00,757 --> 00:04:02,759
That's why this is the
most important conversation
32
00:04:02,759 --> 00:04:04,244
of our time.
33
00:04:09,179 --> 00:04:12,079
- Artificial intelligence
is everywhere
34
00:04:13,253 --> 00:04:15,979
because we now have
thinking machines.
35
00:04:17,118 --> 00:04:20,294
If you go on social
media or online,
36
00:04:20,294 --> 00:04:22,089
there's an artificial
intelligence engine
37
00:04:22,089 --> 00:04:24,781
that decides what to recommend.
38
00:04:26,196 --> 00:04:28,026
If you go on Facebook
and you're just scrolling
39
00:04:28,026 --> 00:04:29,890
through your friends' posts,
40
00:04:29,890 --> 00:04:31,443
there's an artificial
intelligence engine
41
00:04:31,443 --> 00:04:33,583
that's picking which
one to show you first
42
00:04:33,583 --> 00:04:35,344
and which one to bury.
43
00:04:35,344 --> 00:04:38,692
If you try to get insurance,
there is an AI engine
44
00:04:38,692 --> 00:04:40,763
trying to figure out
how risky you are.
45
00:04:41,936 --> 00:04:45,250
And if you apply for a
job, it's quite possible
46
00:04:45,250 --> 00:04:47,942
that an AI engine
looks at the resume.
47
00:04:55,881 --> 00:04:57,573
- We are made of data.
48
00:04:58,781 --> 00:05:01,024
Every one of us is made of data,
49
00:05:02,543 --> 00:05:04,234
in terms of how we behave,
50
00:05:04,234 --> 00:05:07,479
how we talk, how we love,
what we do every day.
51
00:05:09,895 --> 00:05:11,552
So, computer scientists
52
00:05:11,552 --> 00:05:14,141
are developing deep
learning algorithms
53
00:05:15,315 --> 00:05:19,042
that can learn to
identify, classify,
54
00:05:19,042 --> 00:05:22,977
and predict patterns within
massive amounts of data.
55
00:05:35,887 --> 00:05:39,684
We are facing a form of
precision surveillance,
56
00:05:39,684 --> 00:05:43,308
you could call it
algorithmic surveillance,
57
00:05:43,308 --> 00:05:46,311
and it means that you
cannot go unrecognized.
58
00:05:47,623 --> 00:05:50,626
You are always under
the watch of algorithms.
59
00:05:55,562 --> 00:05:58,875
- Almost all the AI
development on the planet today
60
00:05:58,875 --> 00:06:01,499
is done by a handful of
big technology companies
61
00:06:01,499 --> 00:06:03,224
or by a few large governments.
62
00:06:06,366 --> 00:06:10,611
If we look at what AI is
mostly being developed for,
63
00:06:10,611 --> 00:06:15,616
I would say it's killing,
spying, and brainwashing.
64
00:06:16,479 --> 00:06:18,239
So, I mean, we have military AI,
65
00:06:18,239 --> 00:06:21,346
we have a whole surveillance
apparatus being built
66
00:06:21,346 --> 00:06:23,521
using AI by major governments,
67
00:06:23,521 --> 00:06:26,213
and we have an advertising
industry which is oriented
68
00:06:26,213 --> 00:06:30,355
toward recognizing what ads
to try to sell to someone.
69
00:06:33,531 --> 00:06:36,568
- We humans have come to
a fork in the road now.
70
00:06:38,018 --> 00:06:41,159
The AI we have today
is very narrow.
71
00:06:42,505 --> 00:06:44,783
The holy grail of AI research
ever since the beginning
72
00:06:44,783 --> 00:06:47,786
is to make AI that can do
everything better than us,
73
00:06:49,270 --> 00:06:50,858
and we've basically built a God.
74
00:06:52,204 --> 00:06:54,517
It's going to revolutionize
life as we know it.
75
00:06:57,693 --> 00:07:00,247
It's incredibly important
to take a step back
76
00:07:00,247 --> 00:07:02,111
and think carefully about this.
77
00:07:03,802 --> 00:07:05,942
What sort of society do we want?
78
00:07:09,567 --> 00:07:12,155
- So, we're in this
historic transformation.
79
00:07:13,363 --> 00:07:15,711
Like we're raising
this new creature.
80
00:07:15,711 --> 00:07:18,679
We have a new
offspring of sorts.
81
00:07:20,578 --> 00:07:24,098
But just like actual offspring,
82
00:07:24,098 --> 00:07:27,412
you don't get to control
everything it's going to do.
83
00:08:01,515 --> 00:08:04,691
- We are living at
this privileged moment where,
84
00:08:04,691 --> 00:08:09,696
for the first time, we
will see probably that AI
85
00:08:11,042 --> 00:08:13,320
is really going to outcompete
humans in many, many,
86
00:08:13,320 --> 00:08:14,908
if not all, important fields.
87
00:08:18,739 --> 00:08:20,914
- Everything is going to change.
88
00:08:20,914 --> 00:08:24,262
A new form of life is emerging.
89
00:08:49,494 --> 00:08:54,499
When I was a boy, I thought,
how can I maximize my impact?
90
00:08:56,536 --> 00:09:00,781
And then it was clear that
I have to build something
91
00:09:00,781 --> 00:09:04,509
that learns to become
smarter than myself,
92
00:09:04,509 --> 00:09:06,131
such that I can retire,
93
00:09:06,131 --> 00:09:08,893
and the smarter thing
can further self-improve
94
00:09:08,893 --> 00:09:11,343
and solve all the problems
that I cannot solve.
95
00:09:16,452 --> 00:09:18,937
Multiplying that tiny
little bit of creativity
96
00:09:18,937 --> 00:09:20,801
that I have into infinity,
97
00:09:22,700 --> 00:09:25,219
and that's what has been
driving me since then.
98
00:09:43,652 --> 00:09:45,101
How am I trying to build
99
00:09:45,101 --> 00:09:47,759
a general purpose
artificial intelligence?
100
00:09:50,003 --> 00:09:54,248
If you want to be intelligent,
you have to recognize speech,
101
00:09:54,248 --> 00:09:59,150
video and handwriting, and
faces, and all kinds of things,
102
00:09:59,150 --> 00:10:02,015
and there we have made
a lot of progress.
103
00:10:03,775 --> 00:10:06,502
See, LSTM neural networks,
104
00:10:06,502 --> 00:10:11,369
which we developed in our labs
in Munich and in Switzerland,
105
00:10:11,369 --> 00:10:14,959
and they're now used for speech
recognition and translation,
106
00:10:14,959 --> 00:10:16,616
and video recognition.
107
00:10:18,065 --> 00:10:21,137
They are now in everybody's
smartphone, almost one billion
108
00:10:21,137 --> 00:10:25,176
iPhones and in over two
billion Android phones.
109
00:10:26,626 --> 00:10:31,527
So, we are generating all
kinds of useful by-products
110
00:10:32,390 --> 00:10:33,563
on the way to the general goal.
111
00:10:44,574 --> 00:10:49,545
The main goal is an Artificial
General Intelligence,
112
00:10:49,545 --> 00:10:52,686
an AGI that can learn
113
00:10:52,686 --> 00:10:55,896
to improve the learning
algorithm itself.
114
00:10:57,518 --> 00:11:01,453
So, it basically can learn
to improve the way it learns
115
00:11:01,453 --> 00:11:05,423
and it can also recursively
improve the way it learns,
116
00:11:05,423 --> 00:11:08,944
the way it learns without
any limitations except for
117
00:11:08,944 --> 00:11:12,464
the basic fundamental
limitations of computability.
118
00:11:19,092 --> 00:11:22,405
One of my favorite
robots is this one here.
119
00:11:22,405 --> 00:11:26,893
We use this robot for our
studies of artificial curiosity.
120
00:11:26,893 --> 00:11:31,898
We are trying to teach
this robot to teach itself.
121
00:11:37,386 --> 00:11:38,663
What is a baby doing?
122
00:11:38,663 --> 00:11:42,356
A baby is curiously
exploring its world.
123
00:11:44,531 --> 00:11:46,740
That's how it learns
how gravity works
124
00:11:46,740 --> 00:11:49,674
and how certain things
topple, and so on.
125
00:11:51,262 --> 00:11:54,575
And as it learns to ask
questions about the world,
126
00:11:54,575 --> 00:11:57,199
and as it learns to
answer these questions,
127
00:11:57,199 --> 00:11:59,649
it becomes a more and more
general problem solver.
128
00:12:01,030 --> 00:12:03,619
And so, our artificial
systems are also learning
129
00:12:03,619 --> 00:12:07,416
to ask all kinds of
questions, not just slavishly
130
00:12:07,416 --> 00:12:11,420
try to answer the questions
given to them by humans.
131
00:12:15,389 --> 00:12:19,048
You have to give AI the freedom
to invent its own tasks.
132
00:12:22,327 --> 00:12:25,054
If you don't do that, it's not
going to become very smart.
133
00:12:26,815 --> 00:12:28,333
On the other hand,
134
00:12:28,333 --> 00:12:31,543
it's really hard to predict
what they are going to do.
135
00:12:53,980 --> 00:12:57,466
- I feel that technology
is a force of nature.
136
00:13:00,055 --> 00:13:02,402
I feel like there is a lot of
similarity between technology
137
00:13:02,402 --> 00:13:03,644
and biological evolution.
138
00:13:13,275 --> 00:13:14,241
Playing God.
139
00:13:17,900 --> 00:13:20,765
Scientists have been accused
of playing God for a while,
140
00:13:22,456 --> 00:13:25,080
but there is a real sense
141
00:13:25,080 --> 00:13:26,875
in which we are
creating something
142
00:13:28,255 --> 00:13:30,740
very different from anything
we've created so far.
143
00:13:54,247 --> 00:13:55,835
I was interested in
the concept of AI
144
00:13:55,835 --> 00:13:57,353
from a relatively early age.
145
00:13:59,666 --> 00:14:01,323
At some point, I got
especially interested
146
00:14:01,323 --> 00:14:02,358
in machine learning.
147
00:14:06,155 --> 00:14:07,950
What is experience?
148
00:14:07,950 --> 00:14:09,158
What is learning?
149
00:14:09,158 --> 00:14:10,159
What is thinking?
150
00:14:11,333 --> 00:14:12,610
How does the brain work?
151
00:14:14,612 --> 00:14:16,338
These questions
are philosophical,
152
00:14:16,338 --> 00:14:19,651
but it looks like we can
come up with algorithms that
153
00:14:19,651 --> 00:14:22,965
both do useful things and help
us answer these questions.
154
00:14:24,449 --> 00:14:26,693
Like it's almost like
applied philosophy.
155
00:14:52,546 --> 00:14:55,895
Artificial General
Intelligence, AGI.
156
00:14:57,620 --> 00:15:01,935
A computer system that
can do any job or any task
157
00:15:01,935 --> 00:15:04,489
that a human does,
only better.
158
00:15:17,640 --> 00:15:20,298
Yeah, I mean, we definitely
will be able to create
159
00:15:21,817 --> 00:15:24,475
completely autonomous
beings with their own goals.
160
00:15:28,513 --> 00:15:30,067
And it will be very important,
161
00:15:30,067 --> 00:15:34,934
especially as these beings
become much smarter than humans,
162
00:15:34,934 --> 00:15:39,145
it's going to be important
163
00:15:41,009 --> 00:15:43,873
that the goals of these beings
be aligned with our goals.
164
00:15:47,153 --> 00:15:49,879
That's what we're
trying to do at OpenAI.
165
00:15:49,879 --> 00:15:54,229
Be at the forefront of research
and steer the research,
166
00:15:54,229 --> 00:15:58,440
steer its initial conditions
so as to maximize the chance
167
00:15:58,440 --> 00:16:00,407
that the future will
be good for humans.
168
00:16:16,734 --> 00:16:18,805
Now, AI is a great thing
because AI will solve
169
00:16:18,805 --> 00:16:20,945
all the problems
that we have today.
170
00:16:23,603 --> 00:16:27,538
It will solve employment,
it will solve disease,
171
00:16:29,126 --> 00:16:30,990
it will solve poverty,
172
00:16:33,440 --> 00:16:35,822
but it will also
create new problems.
173
00:16:38,998 --> 00:16:40,551
I think that
174
00:16:43,002 --> 00:16:47,385
the problem of fake news
is going to be 1,000,
175
00:16:47,385 --> 00:16:48,835
a million times worse.
176
00:16:50,940 --> 00:16:53,357
Cyberattacks will become
much more extreme.
177
00:16:55,152 --> 00:16:57,982
You will have totally
automated AI weapons.
178
00:17:00,640 --> 00:17:01,986
I think AI has the potential
179
00:17:01,986 --> 00:17:04,230
to create infinitely
stable dictatorships.
180
00:17:10,339 --> 00:17:13,273
You're gonna see dramatically
more intelligent systems
181
00:17:13,273 --> 00:17:17,036
in 10 or 15 years from now,
and I think it's highly likely
182
00:17:17,036 --> 00:17:18,347
that those systems
183
00:17:18,347 --> 00:17:21,523
will have completely
astronomical impact on society.
184
00:17:23,904 --> 00:17:25,527
Will humans actually benefit?
185
00:17:27,046 --> 00:17:28,875
And who will benefit,
who will not?
186
00:17:49,033 --> 00:17:52,140
- In 2012, IBM estimated that
187
00:17:52,140 --> 00:17:56,523
an average person is
generating 500 megabytes
188
00:17:56,523 --> 00:17:59,561
of digital footprints
every single day.
189
00:17:59,561 --> 00:18:01,839
Imagine that you wanted
to back up one day's worth
190
00:18:01,839 --> 00:18:05,705
of data that humanity is
leaving behind, on paper.
191
00:18:05,705 --> 00:18:08,811
How tall would the stack
of paper be that contains
192
00:18:08,811 --> 00:18:12,194
just one day's worth of data
that humanity is producing?
193
00:18:14,023 --> 00:18:17,096
It's like from the earth to
the sun, four times over.
194
00:18:18,994 --> 00:18:23,999
In 2025, we'll be generating
62 gigabytes of data
195
00:18:25,069 --> 00:18:26,588
per person, per day.
196
00:18:41,430 --> 00:18:45,745
We're leaving a ton of
digital footprints while going
197
00:18:45,745 --> 00:18:46,815
through our lives.
198
00:18:49,231 --> 00:18:52,165
They provide computer algorithms
with a fairly good idea
199
00:18:52,165 --> 00:18:53,891
about who we are,
200
00:18:53,891 --> 00:18:56,721
what we want, what we are doing.
201
00:19:00,449 --> 00:19:02,762
In my work, I looked
at different types
202
00:19:02,762 --> 00:19:04,177
of digital footprints.
203
00:19:04,177 --> 00:19:07,042
I looked at Facebook likes,
I looked at language,
204
00:19:07,042 --> 00:19:11,253
credit card records,
web browsing histories,
search records.
205
00:19:12,737 --> 00:19:16,431
And each time I found that if
you get enough of this data,
206
00:19:16,431 --> 00:19:18,847
you can accurately
predict future behavior
207
00:19:18,847 --> 00:19:21,988
and reveal important
intimate traits.
208
00:19:23,576 --> 00:19:25,681
This can be used in great ways,
209
00:19:25,681 --> 00:19:28,926
but it can also be used
to manipulate people.
210
00:19:34,345 --> 00:19:38,280
Facebook is delivering
daily information
211
00:19:38,280 --> 00:19:40,903
to two billion people or more.
212
00:19:43,182 --> 00:19:45,356
If you slightly
change the functioning
213
00:19:45,356 --> 00:19:47,289
of the Facebook engine,
214
00:19:47,289 --> 00:19:50,982
you can move the
opinions and hence,
215
00:19:50,982 --> 00:19:54,262
the votes of millions of people.
216
00:19:55,401 --> 00:19:57,230
- Brexit!
- When do we want it?
217
00:19:57,230 --> 00:19:58,404
- Now!
218
00:20:00,647 --> 00:20:03,202
- A politician
wouldn't be able to figure out
219
00:20:03,202 --> 00:20:07,275
which message each one of
his or her voters would like,
220
00:20:07,275 --> 00:20:10,450
but a computer can see
what political message
221
00:20:10,450 --> 00:20:13,384
would be particularly
convincing for you.
222
00:20:15,766 --> 00:20:17,250
- Ladies and gentlemen,
223
00:20:17,250 --> 00:20:20,667
it's my privilege to speak
to you today about the power
224
00:20:20,667 --> 00:20:24,947
of big data and psychographics
in the electoral process.
225
00:20:24,947 --> 00:20:26,880
- Data firm
Cambridge Analytica
226
00:20:26,880 --> 00:20:29,538
secretly harvested the
personal information
227
00:20:29,538 --> 00:20:32,576
of 50 million unsuspecting
Facebook users.
228
00:20:32,576 --> 00:20:33,784
- USA, USA, USA!
229
00:20:36,235 --> 00:20:37,581
- The data firm hired
230
00:20:37,581 --> 00:20:40,411
by Donald Trump's
presidential election campaign
231
00:20:40,411 --> 00:20:42,482
used secretly
obtained information
232
00:20:42,482 --> 00:20:45,589
to directly target
potential American voters.
233
00:20:45,589 --> 00:20:47,487
- With that, they
say they can predict
234
00:20:47,487 --> 00:20:52,009
the personality of every single
adult in the United States.
235
00:20:52,009 --> 00:20:54,011
- Tonight we're hearing
from Cambridge Analytica
236
00:20:54,011 --> 00:20:56,255
whistleblower,
Christopher Wiley.
237
00:20:56,255 --> 00:20:59,568
- What we worked on was
data harvesting programs
238
00:20:59,568 --> 00:21:01,501
where we would pull data and run
239
00:21:01,501 --> 00:21:03,641
that data through algorithms
that could profile
240
00:21:03,641 --> 00:21:06,368
their personality traits and
other psychological attributes
241
00:21:06,368 --> 00:21:10,061
to exploit mental
vulnerabilities that
our algorithms showed
242
00:21:10,061 --> 00:21:11,442
that they had.
243
00:21:19,968 --> 00:21:22,039
- Cambridge Analytica
mentioned once
244
00:21:22,039 --> 00:21:24,938
or said that their models
were based on my work,
245
00:21:26,975 --> 00:21:29,633
but Cambridge Analytica is
just one of the hundreds
246
00:21:29,633 --> 00:21:34,534
of companies that are using
such methods to target voters.
247
00:21:36,605 --> 00:21:40,229
You know, I would be
asked questions by
journalists such as,
248
00:21:40,229 --> 00:21:41,714
"So how do you feel about
249
00:21:43,405 --> 00:21:46,581
electing Trump and
supporting Brexit?"
250
00:21:46,581 --> 00:21:48,307
How do you answer
such a question?
251
00:21:49,929 --> 00:21:54,899
I guess that I have to deal
with being blamed for all of it.
252
00:22:12,158 --> 00:22:16,921
- How tech started was
as a democratizing force,
253
00:22:16,921 --> 00:22:19,545
as a force for good, as
an ability for humans
254
00:22:19,545 --> 00:22:21,961
to interact with each
other without gatekeepers.
255
00:22:24,688 --> 00:22:27,035
There's never been
a bigger experiment
256
00:22:27,035 --> 00:22:29,555
in communications
for the human race.
257
00:22:30,970 --> 00:22:33,386
What happens when everybody
gets to have their say?
258
00:22:34,594 --> 00:22:36,355
You would assume that it
would be for the better,
259
00:22:36,355 --> 00:22:37,597
that there would
be more democracy,
260
00:22:37,597 --> 00:22:39,116
there would be more discussion,
261
00:22:39,116 --> 00:22:42,153
there would be more tolerance,
but what's happened is that
262
00:22:42,153 --> 00:22:44,121
these systems have
been hijacked.
263
00:22:45,778 --> 00:22:48,159
- We stand for
connecting every person.
264
00:22:49,471 --> 00:22:50,783
For a global community.
265
00:22:52,301 --> 00:22:55,028
- One company,
Facebook, is responsible
266
00:22:55,028 --> 00:22:58,031
for the communications of
a lot of the human race.
267
00:23:00,171 --> 00:23:01,794
Same thing with Google.
268
00:23:01,794 --> 00:23:04,624
Everything you want to know about
the world comes from them.
269
00:23:05,970 --> 00:23:09,560
This is a global
information economy
270
00:23:09,560 --> 00:23:12,529
that is controlled by a
small group of people.
271
00:23:19,052 --> 00:23:20,329
- The world's richest companies
272
00:23:20,329 --> 00:23:21,917
are all technology companies.
273
00:23:23,263 --> 00:23:28,234
Google, Apple, Microsoft,
Amazon, Facebook.
274
00:23:30,374 --> 00:23:35,379
It's staggering how, in
probably just 10 years,
275
00:23:36,656 --> 00:23:39,314
the entire
corporate power structure
276
00:23:39,314 --> 00:23:43,939
is basically in the business
of trading electrons.
277
00:23:45,734 --> 00:23:50,705
These little bits and bytes
are really the new currency.
278
00:23:58,091 --> 00:24:00,300
- The way that data is monetized
279
00:24:00,300 --> 00:24:03,096
is happening all around us,
even if it's invisible to us.
280
00:24:05,409 --> 00:24:08,688
Google has every kind
of information available.
281
00:24:08,688 --> 00:24:11,519
They track people by
their GPS location.
282
00:24:11,519 --> 00:24:14,453
They know exactly what your
search history has been.
283
00:24:14,453 --> 00:24:17,283
They know your
political preferences.
284
00:24:17,283 --> 00:24:19,285
Your search history
alone can tell you
285
00:24:19,285 --> 00:24:22,012
everything about an individual
from their health problems
286
00:24:22,012 --> 00:24:23,910
to their sexual preferences.
287
00:24:23,910 --> 00:24:26,568
So, Google's reach is unlimited.
288
00:24:33,195 --> 00:24:36,060
- So we've seen
Google and Facebook
289
00:24:36,060 --> 00:24:38,580
rise into these large
surveillance machines
290
00:24:39,719 --> 00:24:42,653
and they're both
actually ad brokers.
291
00:24:42,653 --> 00:24:46,485
It sounds really mundane, but
they're high tech ad brokers.
292
00:24:47,865 --> 00:24:50,523
And the reason they're so
profitable is that they're using
293
00:24:50,523 --> 00:24:55,010
artificial intelligence to
process all this data about you,
294
00:24:56,184 --> 00:24:59,118
and then to match you
with the advertiser
295
00:24:59,118 --> 00:25:04,088
that wants to reach people
like you, for whatever message.
296
00:25:07,436 --> 00:25:09,231
- One of the problems
with technology
297
00:25:09,231 --> 00:25:11,440
is that it's been
developed to be addictive.
298
00:25:12,545 --> 00:25:14,374
The way these companies
design these things
299
00:25:14,374 --> 00:25:16,480
is in order to pull
you in and engage you.
300
00:25:17,861 --> 00:25:21,243
They want to become essentially
a slot machine of attention.
301
00:25:23,038 --> 00:25:24,488
So you're always
paying attention,
302
00:25:24,488 --> 00:25:26,214
you're always jacked
into the matrix,
303
00:25:26,214 --> 00:25:27,974
you're always checking.
304
00:25:31,012 --> 00:25:33,152
- When somebody
controls what you read,
305
00:25:33,152 --> 00:25:34,878
they also control
what you think.
306
00:25:36,569 --> 00:25:39,089
You get more of what you've
seen before and liked before,
307
00:25:39,089 --> 00:25:42,368
because this gives more traffic
and that gives more ads,
308
00:25:43,852 --> 00:25:47,615
but it also locks you
into your echo chamber.
309
00:25:47,615 --> 00:25:49,720
And this is what leads
to this polarization
310
00:25:49,720 --> 00:25:51,273
that we see today.
311
00:25:51,273 --> 00:25:54,449
- Jair Bolsonaro,
Jair Bolsonaro!
312
00:25:54,449 --> 00:25:57,072
- Jair Bolsonaro,
Brazil's right-wing
313
00:25:57,072 --> 00:26:00,455
populist candidate, sometimes
likened to Donald Trump,
314
00:26:00,455 --> 00:26:03,044
winning the presidency Sunday
night in that country's
315
00:26:03,044 --> 00:26:06,219
most polarizing
election in decades.
316
00:26:06,219 --> 00:26:07,220
- Bolsonaro!
317
00:26:08,532 --> 00:26:11,017
- What we are seeing
around the world
318
00:26:11,017 --> 00:26:13,986
is upheaval and
polarization and conflict
319
00:26:15,194 --> 00:26:19,439
that is partially
pushed by algorithms
320
00:26:19,439 --> 00:26:23,512
that have figured out that
political extremes,
321
00:26:23,512 --> 00:26:26,585
tribalism, and sort of
shouting for your team,
322
00:26:26,585 --> 00:26:29,208
and feeling good
about it, is engaging.
323
00:26:35,386 --> 00:26:37,596
- Social media may
be adding to the attention
324
00:26:37,596 --> 00:26:39,908
to hate crimes around the globe.
325
00:26:39,908 --> 00:26:42,566
- It's about how people
can become radicalized
326
00:26:42,566 --> 00:26:46,156
by living in the fever
swamps of the Internet.
327
00:26:46,156 --> 00:26:48,779
- So is this a key
moment for the tech giants?
328
00:26:48,779 --> 00:26:50,436
Are they now prepared
to take responsibility
329
00:26:50,436 --> 00:26:53,335
as publishers for what
they share with the world?
330
00:26:54,647 --> 00:26:57,236
- If you deploy a
powerful potent technology
331
00:26:57,236 --> 00:27:01,067
at scale, and if you're talking
about Google and Facebook,
332
00:27:01,067 --> 00:27:03,932
you're deploying things
at scale of billions.
333
00:27:03,932 --> 00:27:07,177
If your artificial intelligence
is pushing polarization,
334
00:27:07,177 --> 00:27:09,489
you have global
upheaval potentially.
335
00:27:10,352 --> 00:27:11,457
- White lives matter!
336
00:27:11,457 --> 00:27:12,700
- Black lives matter!
337
00:27:12,700 --> 00:27:15,323
Black lives matter,
Black lives matter,
338
00:27:15,323 --> 00:27:18,602
Black lives matter,
Black lives matter!
339
00:28:05,028 --> 00:28:07,168
- Artificial General
Intelligence, AGI.
340
00:28:10,827 --> 00:28:12,794
Imagine your smartest friend,
341
00:28:14,002 --> 00:28:16,453
with 1,000 friends,
just as smart,
342
00:28:18,904 --> 00:28:21,976
and then run them at 1,000
times faster than real time.
343
00:28:21,976 --> 00:28:24,323
So it means that in
every day of our time,
344
00:28:24,323 --> 00:28:26,877
they will do three
years of thinking.
345
00:28:26,877 --> 00:28:29,880
Can you imagine how
much you could do
346
00:28:31,330 --> 00:28:35,955
if, for every day, you could
do three years' worth of work?
347
00:28:56,838 --> 00:29:00,117
It wouldn't be an
unfair comparison to say
348
00:29:00,117 --> 00:29:04,121
that what we have right now
is even more exciting than
349
00:29:04,121 --> 00:29:06,883
the quantum physicists of
the early 20th century.
350
00:29:06,883 --> 00:29:08,436
They discovered nuclear power.
351
00:29:10,127 --> 00:29:12,612
I feel extremely lucky to
be taking part in this.
352
00:29:19,723 --> 00:29:21,345
Many machine learning experts,
353
00:29:21,345 --> 00:29:23,209
who are very knowledgeable
and experienced,
354
00:29:23,209 --> 00:29:24,935
have a lot of
skepticism about AGI.
355
00:29:26,765 --> 00:29:28,421
About when it would happen,
356
00:29:28,421 --> 00:29:30,354
and about whether it
could happen at all.
357
00:29:35,843 --> 00:29:37,603
But right now, this is something
358
00:29:37,603 --> 00:29:40,054
that just not that many
people have realized yet.
359
00:29:41,538 --> 00:29:45,369
That the speed of computers,
for neural networks, for AI,
360
00:29:45,369 --> 00:29:49,684
are going to become maybe
100,000 times faster
361
00:29:49,684 --> 00:29:51,168
in a small number of years.
362
00:29:53,792 --> 00:29:56,553
The entire hardware
industry for a long time
363
00:29:56,553 --> 00:29:59,107
didn't really know
what to do next,
364
00:30:00,246 --> 00:30:03,594
but with artificial
neural networks,
365
00:30:03,594 --> 00:30:05,320
now that they actually work,
366
00:30:05,320 --> 00:30:07,702
you have a reason to
build huge computers.
367
00:30:08,979 --> 00:30:11,257
You can build a brain in
silicon, it's possible.
368
00:30:19,093 --> 00:30:23,166
The very first AGIs
will be basically very,
369
00:30:23,166 --> 00:30:27,308
very large data centers
packed with specialized
370
00:30:27,308 --> 00:30:30,104
neural network processors
working in parallel.
371
00:30:32,382 --> 00:30:35,419
A compact, hot,
power-hungry package,
372
00:30:36,869 --> 00:30:39,872
consuming like 10 million
homes' worth of energy.
373
00:30:58,166 --> 00:31:00,134
A roast beef sandwich.
374
00:31:00,134 --> 00:31:01,860
Yeah, something
slightly different.
375
00:31:02,722 --> 00:31:03,689
Just this once.
376
00:31:08,556 --> 00:31:10,730
Even the very first AGIs
377
00:31:10,730 --> 00:31:13,595
will be dramatically
more capable than humans.
378
00:31:15,597 --> 00:31:17,876
Humans will no longer
be economically useful
379
00:31:17,876 --> 00:31:19,049
for nearly any task.
380
00:31:20,637 --> 00:31:22,328
Why would you want
to hire a human,
381
00:31:22,328 --> 00:31:24,123
if you could just get a computer
382
00:31:24,123 --> 00:31:26,022
that's going to do it much
better and much more cheaply?
383
00:31:33,339 --> 00:31:35,479
AGI is going to be
like, without question,
384
00:31:36,825 --> 00:31:39,173
the most important
technology in the history
385
00:31:39,173 --> 00:31:41,071
of the planet by a huge margin.
386
00:31:44,005 --> 00:31:47,008
It's going to be bigger
than electricity, nuclear,
387
00:31:47,008 --> 00:31:48,458
and the Internet combined.
388
00:31:50,253 --> 00:31:52,117
In fact, you could say
that the whole purpose
389
00:31:52,117 --> 00:31:54,153
of all human science, the
purpose of computer science,
390
00:31:54,153 --> 00:31:57,294
the end game, this is the
end game, to build this.
391
00:31:57,294 --> 00:31:58,709
And it's going to be built.
392
00:31:58,709 --> 00:32:00,504
It's going to be
a new life form.
393
00:32:00,504 --> 00:32:01,333
It's going to be,
394
00:32:03,473 --> 00:32:05,061
it's going to make us obsolete.
395
00:32:13,897 --> 00:32:16,693
- We had programs
go down where we sent out
396
00:32:16,693 --> 00:32:18,281
AK47 fire, over.
397
00:32:22,009 --> 00:32:26,841
- Standby for
four one dash three.
398
00:32:26,841 --> 00:32:28,774
- European manufacturers
know the Americans
399
00:32:28,774 --> 00:32:31,915
have invested heavily in
the necessary hardware.
400
00:32:31,915 --> 00:32:34,262
- Step into a
brave new world of power,
401
00:32:34,262 --> 00:32:35,988
performance and productivity.
402
00:32:37,403 --> 00:32:39,958
- All of the images you are
about to see on the large screen
403
00:32:39,958 --> 00:32:44,134
will be generated by
what's in that Macintosh.
404
00:32:44,134 --> 00:32:46,757
- It's my honor and
privilege to introduce to you
405
00:32:46,757 --> 00:32:48,449
the Windows 95 Development Team.
406
00:32:50,140 --> 00:32:53,523
- Human physical labor has
been mostly obsolete for
407
00:32:53,523 --> 00:32:54,938
getting on for a century.
408
00:32:56,388 --> 00:32:59,805
Routine human mental labor
is rapidly becoming obsolete
409
00:32:59,805 --> 00:33:01,013
and that's why
we're seeing a lot
410
00:33:01,013 --> 00:33:03,222
of the middle class
jobs disappearing.
411
00:33:05,397 --> 00:33:06,950
- Every once in a while,
412
00:33:06,950 --> 00:33:10,781
a revolutionary product comes
along that changes everything.
413
00:33:10,781 --> 00:33:13,992
Today, Apple is
reinventing the phone.
414
00:33:25,279 --> 00:33:28,696
- Machine intelligence
is already all around us.
415
00:33:28,696 --> 00:33:30,422
The list of things
that we humans
416
00:33:30,422 --> 00:33:32,010
can do better than machines
417
00:33:32,010 --> 00:33:34,012
is actually shrinking
pretty fast.
418
00:33:40,156 --> 00:33:41,847
- Driverless cars are great.
419
00:33:41,847 --> 00:33:44,436
They probably will
reduce accidents.
420
00:33:44,436 --> 00:33:48,095
Except, alongside with
that, in the United States,
421
00:33:48,095 --> 00:33:51,029
you're going to lose
10 million jobs.
422
00:33:51,029 --> 00:33:54,135
What are you going to do with
10 million unemployed people?
423
00:33:59,416 --> 00:34:03,041
- The risk for social
conflict and tensions,
424
00:34:03,041 --> 00:34:07,114
if you exacerbate inequalities,
is very, very high.
425
00:34:16,157 --> 00:34:18,263
- AGI can, by definition,
426
00:34:18,263 --> 00:34:21,093
do all jobs better
than we can do.
427
00:34:21,093 --> 00:34:23,095
People who are saying,
"There will always be jobs
428
00:34:23,095 --> 00:34:25,166
that humans can do better
than machines," are simply
429
00:34:25,166 --> 00:34:28,135
betting against science and
saying there will never be AGI.
430
00:34:35,280 --> 00:34:37,799
- What we are seeing
now is like a train hurtling
431
00:34:37,799 --> 00:34:42,287
down a dark tunnel at breakneck
speed and it looks like
432
00:34:42,287 --> 00:34:43,702
we're asleep at the wheel.
433
00:35:33,579 --> 00:35:35,961
- A large fraction of
the digital footprints
434
00:35:35,961 --> 00:35:39,171
we're leaving behind
are digital images.
435
00:35:40,276 --> 00:35:41,863
And specifically, what's
really interesting
436
00:35:41,863 --> 00:35:43,693
to me as a psychologist
437
00:35:43,693 --> 00:35:45,902
are digital images of our faces.
438
00:35:48,939 --> 00:35:51,701
Here you can see the difference
in the facial outline
439
00:35:51,701 --> 00:35:54,566
of an average gay and an
average straight face.
440
00:35:54,566 --> 00:35:57,120
And you can see
that straight men
441
00:35:57,120 --> 00:35:59,157
have slightly broader jaws.
442
00:36:00,261 --> 00:36:03,333
Gay women have
slightly larger jaws,
443
00:36:03,333 --> 00:36:04,783
compared with straight women.
444
00:36:06,785 --> 00:36:10,098
Computer algorithms can
reveal our political views
445
00:36:10,098 --> 00:36:12,687
or sexual orientation,
or intelligence,
446
00:36:12,687 --> 00:36:15,552
just based on the
picture of our faces.
447
00:36:16,726 --> 00:36:18,900
Even a human brain can
distinguish between gay
448
00:36:18,900 --> 00:36:21,524
and straight men
with some accuracy.
449
00:36:21,524 --> 00:36:23,664
Now it turns out
that the computer
450
00:36:23,664 --> 00:36:26,080
can do it with much
higher accuracy.
451
00:36:26,080 --> 00:36:29,601
What you're seeing
here is the accuracy of
452
00:36:29,601 --> 00:36:33,812
off-the-shelf facial
recognition software.
453
00:36:33,812 --> 00:36:36,194
This is terrible news
454
00:36:36,194 --> 00:36:38,851
for gay men and women
all around the world.
455
00:36:38,851 --> 00:36:40,370
And not only gay men and women,
456
00:36:40,370 --> 00:36:42,752
because the same algorithms
can be used to detect other
457
00:36:42,752 --> 00:36:46,825
intimate traits, think being
a member of the opposition,
458
00:36:46,825 --> 00:36:49,552
or being a liberal,
or being an atheist.
459
00:36:51,243 --> 00:36:54,522
Being an atheist is
also punishable by death
460
00:36:54,522 --> 00:36:57,007
in Saudi Arabia, for instance.
461
00:37:04,152 --> 00:37:06,189
My mission as an academic
is to warn people
462
00:37:06,189 --> 00:37:08,812
about the dangers of algorithms
463
00:37:08,812 --> 00:37:12,644
being able to reveal
our intimate traits.
464
00:37:14,162 --> 00:37:18,443
The problem is that when
people receive bad news,
465
00:37:18,443 --> 00:37:20,445
they very often choose
to dismiss it.
466
00:37:22,240 --> 00:37:23,344
Well, it's a bit scary
467
00:37:23,344 --> 00:37:25,312
when you start
receiving death threats
468
00:37:25,312 --> 00:37:26,623
from one day to another,
469
00:37:26,623 --> 00:37:29,005
and I've received quite
a few death threats,
470
00:37:30,282 --> 00:37:34,252
but as a scientist, I
have to basically show
471
00:37:34,252 --> 00:37:35,287
what is possible.
472
00:37:38,014 --> 00:37:41,051
So what I'm really interested
in now is to try to see
473
00:37:41,051 --> 00:37:45,470
whether we can predict other
traits from people's faces.
474
00:37:50,682 --> 00:37:52,994
Now, if you can detect
depression from a face,
475
00:37:52,994 --> 00:37:58,034
or suicidal thoughts,
maybe a CCTV system
476
00:37:59,207 --> 00:38:01,313
at the train station
can save some lives.
477
00:38:03,488 --> 00:38:05,800
What if we could
predict that someone
478
00:38:05,800 --> 00:38:08,182
is more prone to commit a crime?
479
00:38:09,356 --> 00:38:11,427
You probably had a
school counselor,
480
00:38:11,427 --> 00:38:14,775
a psychologist hired
there to identify children
481
00:38:14,775 --> 00:38:19,227
that potentially may have
some behavioral problems.
482
00:38:21,713 --> 00:38:24,440
So now imagine if you could
predict with high accuracy
483
00:38:24,440 --> 00:38:27,028
that someone is likely to
commit a crime in the future
484
00:38:27,028 --> 00:38:29,168
from the language
use, from the face,
485
00:38:29,168 --> 00:38:32,102
from the facial expressions,
from the likes on Facebook.
486
00:38:36,693 --> 00:38:39,593
I'm not developing new
methods, I'm just describing
487
00:38:39,593 --> 00:38:43,321
something or testing something
in an academic environment.
488
00:38:45,012 --> 00:38:47,186
But there obviously
is a chance that,
489
00:38:47,186 --> 00:38:52,191
while warning people against
risks of new technologies,
490
00:38:53,331 --> 00:38:54,884
I may also give some
people new ideas.
491
00:39:15,525 --> 00:39:17,078
- We haven't yet seen the future
492
00:39:17,078 --> 00:39:22,083
in terms of the ways in which
the new data-driven society
493
00:39:23,153 --> 00:39:25,777
is going to really evolve.
494
00:39:27,951 --> 00:39:31,127
The tech companies want
to get every possible bit
495
00:39:31,127 --> 00:39:34,544
of information that they
can collect on everyone
496
00:39:34,544 --> 00:39:36,097
to facilitate business.
497
00:39:37,858 --> 00:39:41,482
The police and the military
want to do the same thing
498
00:39:41,482 --> 00:39:43,242
to facilitate security.
499
00:39:46,384 --> 00:39:50,836
The interests that the two
have in common are immense,
500
00:39:50,836 --> 00:39:55,634
and so the extent
of collaboration
within what you might
501
00:39:55,634 --> 00:40:00,639
call the Military-Tech Complex
is growing dramatically.
502
00:40:04,816 --> 00:40:07,405
- The CIA, for a very long time,
503
00:40:07,405 --> 00:40:10,546
has maintained a close
connection with Silicon Valley.
504
00:40:11,719 --> 00:40:14,722
Their venture capital
firm known as In-Q-Tel,
505
00:40:14,722 --> 00:40:18,312
makes seed investments to
start-up companies developing
506
00:40:18,312 --> 00:40:21,936
breakthrough technology that
the CIA hopes to deploy.
507
00:40:23,110 --> 00:40:26,527
Palantir, the big
data analytics firm,
508
00:40:26,527 --> 00:40:29,323
one of their first seed
investments was from In-Q-Tel.
509
00:40:33,396 --> 00:40:36,261
- In-Q-Tel has struck
gold in Palantir
510
00:40:36,261 --> 00:40:40,507
in helping to create
a private vendor
511
00:40:40,507 --> 00:40:45,373
that has intelligence and
artificial intelligence
512
00:40:45,373 --> 00:40:48,135
capabilities that the government
can't even compete with.
513
00:40:50,862 --> 00:40:53,243
- Good evening, I'm Peter Thiel.
514
00:40:54,693 --> 00:40:58,697
I'm not a politician, but
neither is Donald Trump.
515
00:40:58,697 --> 00:41:03,530
He is a builder and it's
time to rebuild America.
516
00:41:05,704 --> 00:41:08,362
- Peter Thiel,
the founder of Palantir,
517
00:41:08,362 --> 00:41:11,054
was a Donald Trump
transition advisor
518
00:41:11,054 --> 00:41:13,609
and a close friend and donor.
519
00:41:15,473 --> 00:41:18,130
Trump was elected
largely on the promise
520
00:41:18,130 --> 00:41:21,996
to deport millions
of immigrants.
521
00:41:21,996 --> 00:41:26,691
The only way you can do that
is with a lot of intelligence
522
00:41:26,691 --> 00:41:29,832
and that's where
Palantir comes in.
523
00:41:33,111 --> 00:41:37,460
They ingest huge troves
of data, which include,
524
00:41:37,460 --> 00:41:41,809
where you live, where
you work, who you know,
525
00:41:41,809 --> 00:41:45,433
who your neighbors are,
who your family is,
526
00:41:45,433 --> 00:41:48,954
where you have visited,
where you stay,
527
00:41:48,954 --> 00:41:50,784
your social media profile.
528
00:41:53,752 --> 00:41:57,722
Palantir gets all of that
and is remarkably good
529
00:41:57,722 --> 00:42:02,658
at structuring it in a way
that helps law enforcement,
530
00:42:03,866 --> 00:42:06,869
immigration authorities
or intelligence agencies
531
00:42:06,869 --> 00:42:10,597
of any kind, track
you, find you,
532
00:42:10,597 --> 00:42:13,669
and learn everything there
is to know about you.
533
00:42:53,018 --> 00:42:55,538
- We're putting AI in
charge now of ever more
534
00:42:55,538 --> 00:42:57,954
important decisions that
affect people's lives.
535
00:42:59,404 --> 00:43:02,441
Old-school AI used to have
its intelligence programmed in
536
00:43:02,441 --> 00:43:05,755
by humans who understood
how it worked, but today,
537
00:43:05,755 --> 00:43:08,482
powerful AI systems have
just learned for themselves,
538
00:43:08,482 --> 00:43:11,865
and we have no clue
really how they work,
539
00:43:11,865 --> 00:43:13,936
which makes it really
hard to trust them.
540
00:43:18,665 --> 00:43:22,220
- This isn't some futuristic
technology, this is now.
541
00:43:23,877 --> 00:43:27,639
AI might help determine
where a fire department
542
00:43:27,639 --> 00:43:30,331
is built in a community or
where a school is built.
543
00:43:30,331 --> 00:43:32,817
It might decide
whether you get bail,
544
00:43:32,817 --> 00:43:35,095
or whether you stay in jail.
545
00:43:35,095 --> 00:43:37,338
It might decide where the
police are going to be.
546
00:43:37,338 --> 00:43:39,202
It might decide
whether you're going
547
00:43:39,202 --> 00:43:41,549
to be under additional
police scrutiny.
548
00:43:47,797 --> 00:43:51,421
- It's popular now in the US
to do predictive policing.
549
00:43:51,421 --> 00:43:53,389
So what they do is
they use an algorithm
550
00:43:53,389 --> 00:43:55,322
to figure out where
crime will be,
551
00:43:56,323 --> 00:43:57,565
and they use that to tell
552
00:43:57,565 --> 00:43:59,533
where we should send
police officers.
553
00:44:00,879 --> 00:44:03,813
So that's based on a
measurement of crime rate.
554
00:44:05,056 --> 00:44:06,713
So we know that there is bias.
555
00:44:06,713 --> 00:44:09,474
Black people and Hispanic
people are pulled over,
556
00:44:09,474 --> 00:44:11,579
and stopped by the police
officers more frequently
557
00:44:11,579 --> 00:44:14,168
than white people are, so we
have this biased data going in,
558
00:44:14,168 --> 00:44:16,343
and then what happens
is you use that to say,
559
00:44:16,343 --> 00:44:18,000
"Here's where the
cops should go."
560
00:44:18,000 --> 00:44:19,277
Well, the cops go to
those neighborhoods,
561
00:44:19,277 --> 00:44:22,073
and guess what they
do, they arrest people.
562
00:44:22,073 --> 00:44:25,628
And then it feeds back
biased data into the system,
563
00:44:25,628 --> 00:44:27,319
and that's called
a feedback loop.
564
00:44:40,539 --> 00:44:45,096
- Predictive policing
leads at the extremes
565
00:44:46,235 --> 00:44:50,411
to experts saying,
"Show me your baby,
566
00:44:50,411 --> 00:44:53,311
and I will tell you whether
she's going to be a criminal."
567
00:44:55,693 --> 00:45:00,318
Now that we can predict it,
we're going to then surveil
568
00:45:00,318 --> 00:45:05,254
those kids much more closely
and we're going to jump on them
569
00:45:06,117 --> 00:45:08,084
at the first sign of a problem.
570
00:45:08,084 --> 00:45:11,398
And that's going to make
for more effective policing.
571
00:45:11,398 --> 00:45:15,678
It does, but it's going to
make for a really grim society
572
00:45:15,678 --> 00:45:20,614
and it's reinforcing
dramatically
existing injustices.
573
00:45:27,345 --> 00:45:32,212
- Imagine a world in which
networks of CCTV cameras,
574
00:45:32,212 --> 00:45:35,353
drone surveillance
cameras, have sophisticated
575
00:45:35,353 --> 00:45:38,839
face recognition technologies
and are connected
576
00:45:38,839 --> 00:45:41,186
to other government
surveillance databases.
577
00:45:42,498 --> 00:45:45,639
We will have the
technology in place to have
578
00:45:45,639 --> 00:45:50,126
all of our movements
comprehensively
tracked and recorded.
579
00:45:52,094 --> 00:45:54,752
What that also means
is that we will have
580
00:45:54,752 --> 00:45:57,340
created a surveillance
time machine
581
00:45:57,340 --> 00:46:00,965
that will allow governments
and powerful corporations
582
00:46:00,965 --> 00:46:03,036
to essentially hit
rewind on our lives.
583
00:46:03,036 --> 00:46:06,108
We might not be under
any suspicion now
584
00:46:06,108 --> 00:46:08,248
and five years from now,
they might want to know
585
00:46:08,248 --> 00:46:12,493
more about us, and can
then recreate granularly
586
00:46:12,493 --> 00:46:14,668
everything we've done,
everyone we've seen,
587
00:46:14,668 --> 00:46:17,464
everyone we've been around
over that entire period.
588
00:46:20,191 --> 00:46:23,711
That's an extraordinary
amount of power
589
00:46:23,711 --> 00:46:26,024
for us to cede to anyone.
590
00:46:27,267 --> 00:46:29,752
And it's a world that I
think has been difficult
591
00:46:29,752 --> 00:46:33,549
for people to imagine,
but we've already built
592
00:46:33,549 --> 00:46:36,379
the architecture to enable that.
593
00:47:11,483 --> 00:47:14,659
- I'm a political reporter
and I'm very interested
594
00:47:14,659 --> 00:47:19,043
in the ways powerful industries
use their political power
595
00:47:19,043 --> 00:47:21,562
to influence the
public policy process.
596
00:47:25,394 --> 00:47:28,811
The large tech companies and
their lobbyists get together
597
00:47:28,811 --> 00:47:31,918
behind closed doors and
are able to craft policies
598
00:47:31,918 --> 00:47:33,471
that we all have to live under.
599
00:47:35,162 --> 00:47:38,683
That's true for surveillance
policies, for policies in terms
600
00:47:38,683 --> 00:47:40,133
of data collection,
601
00:47:40,133 --> 00:47:43,446
but also increasingly important
when it comes to military
602
00:47:43,446 --> 00:47:44,585
and foreign policy.
603
00:47:48,313 --> 00:47:51,869
Starting in 2016, the
Defense Department
604
00:47:51,869 --> 00:47:54,664
formed the Defense
Innovation Board.
605
00:47:54,664 --> 00:47:58,461
That's a special body created
to bring top tech executives
606
00:47:58,461 --> 00:48:00,636
into closer contact
with the military.
607
00:48:03,535 --> 00:48:06,124
Eric Schmidt, former
chairman of Alphabet,
608
00:48:06,124 --> 00:48:07,746
the parent company of Google,
609
00:48:07,746 --> 00:48:10,749
became the chairman of the
Defense Innovation Board,
610
00:48:11,923 --> 00:48:14,270
and one of their first
priorities was to say,
611
00:48:14,270 --> 00:48:17,101
"We need more artificial
intelligence integrated
612
00:48:17,101 --> 00:48:18,205
into the military."
613
00:48:20,276 --> 00:48:23,555
- I've worked with a group
of volunteers over the
614
00:48:23,555 --> 00:48:26,558
last couple of years to take
a look at innovation in the
615
00:48:26,558 --> 00:48:31,184
overall military, and my summary
conclusion is that we have
616
00:48:31,184 --> 00:48:34,601
fantastic people who are
trapped in a very bad system.
617
00:48:37,673 --> 00:48:40,089
- From the Department of
Defense's perspective,
618
00:48:40,089 --> 00:48:41,780
where I really started
to get interested in it
619
00:48:41,780 --> 00:48:45,336
was when we started thinking
about Unmanned Systems and
620
00:48:45,336 --> 00:48:50,341
how robotic and unmanned systems
would start to change war.
621
00:48:51,756 --> 00:48:54,276
The smarter you made the
Unmanned Systems and robots,
622
00:48:54,276 --> 00:48:57,935
the more powerful you might
be able to make your military.
623
00:48:59,937 --> 00:49:01,593
- Under Secretary of Defense,
624
00:49:01,593 --> 00:49:04,803
Robert Work put together
a major memo known as
625
00:49:04,803 --> 00:49:08,048
the Algorithmic Warfare
Cross-Functional Team,
626
00:49:08,048 --> 00:49:09,705
better known as Project Maven.
627
00:49:12,190 --> 00:49:15,124
Eric Schmidt gave a number of
speeches and media appearances
628
00:49:15,124 --> 00:49:18,645
where he said this effort
was designed to increase fuel
629
00:49:18,645 --> 00:49:22,062
efficiency in the Air Force,
to help with the logistics,
630
00:49:22,062 --> 00:49:25,410
but behind closed doors there
was another parallel effort.
631
00:49:31,106 --> 00:49:34,109
Late in 2017 as part
of Project Maven,
632
00:49:34,109 --> 00:49:38,078
Google, Eric Schmidt's firm,
was tasked to secretly work
633
00:49:38,078 --> 00:49:40,494
on another part
of Project Maven,
634
00:49:40,494 --> 00:49:43,359
and that was to take
the vast volumes
635
00:49:43,359 --> 00:49:45,879
of image data vacuumed up
636
00:49:45,879 --> 00:49:49,607
by drones operating in
Iraq and Afghanistan
637
00:49:49,607 --> 00:49:51,402
and to teach an AI
638
00:49:51,402 --> 00:49:54,060
to quickly identify
targets on the battlefield.
639
00:49:56,994 --> 00:50:00,998
- We have a sensor and the
sensor can do full motion video
640
00:50:00,998 --> 00:50:02,861
of an entire city.
641
00:50:02,861 --> 00:50:06,106
And we would have three
seven-person teams working
642
00:50:06,106 --> 00:50:10,766
constantly and they could
process 15% of the information.
643
00:50:10,766 --> 00:50:13,389
The other 85% of the
information was just left
644
00:50:13,389 --> 00:50:15,874
on the cutting room
floor, so we said,
645
00:50:15,874 --> 00:50:18,463
"Hey, AI and machine learning
646
00:50:18,463 --> 00:50:21,466
would help us process
100% of the information."
647
00:50:29,785 --> 00:50:33,168
- Google has long had
the motto, "Don't be evil."
648
00:50:33,168 --> 00:50:35,032
They have created a public image
649
00:50:35,032 --> 00:50:39,208
that they are devoted
to public transparency.
650
00:50:39,208 --> 00:50:41,659
But for Google to
slowly transform
651
00:50:41,659 --> 00:50:43,626
into a defense contractor,
652
00:50:43,626 --> 00:50:46,250
they maintained
the utmost secrecy.
653
00:50:46,250 --> 00:50:48,528
And you had Google
entering into this contract
654
00:50:48,528 --> 00:50:49,770
with most of the employees,
655
00:50:49,770 --> 00:50:51,186
even employees who were working
656
00:50:51,186 --> 00:50:53,119
on the program completely
left in the dark.
657
00:51:06,201 --> 00:51:09,411
- Usually within Google,
anyone in the company
658
00:51:09,411 --> 00:51:12,483
is allowed to know about any
other project that's happening
659
00:51:12,483 --> 00:51:14,174
in some other part
of the company.
660
00:51:15,589 --> 00:51:18,696
With Project Maven, the fact
that it was kept secret,
661
00:51:18,696 --> 00:51:20,974
I think was alarming to people
662
00:51:20,974 --> 00:51:22,907
because that's not
the norm at Google.
663
00:51:25,082 --> 00:51:27,394
- When this story
was first revealed,
664
00:51:27,394 --> 00:51:29,983
it set off a firestorm
within Google.
665
00:51:29,983 --> 00:51:32,848
You had a number of employees
quitting in protest,
666
00:51:32,848 --> 00:51:36,300
others signing a petition
objecting to this work.
667
00:51:38,578 --> 00:51:39,820
- You have to really say,
668
00:51:39,820 --> 00:51:41,684
"I don't want to be
part of this anymore."
669
00:51:43,272 --> 00:51:46,206
There are companies
called defense contractors
670
00:51:46,206 --> 00:51:50,279
and Google should just not
be one of those companies
671
00:51:50,279 --> 00:51:54,835
because people need to trust
Google for Google to work.
672
00:51:56,630 --> 00:52:00,255
- Good morning and
welcome to Google I/O.
673
00:52:01,704 --> 00:52:04,569
- We've seen emails that
show that Google simply
674
00:52:04,569 --> 00:52:07,745
continued to mislead their
employees that the drone
675
00:52:07,745 --> 00:52:11,749
targeting program was only a
minor effort that could at most
676
00:52:11,749 --> 00:52:13,820
be worth $9 million to the firm,
677
00:52:13,820 --> 00:52:15,511
which is a drop in the bucket
678
00:52:15,511 --> 00:52:17,927
for a gigantic
company like Google.
679
00:52:19,032 --> 00:52:21,690
But from internal
emails that we obtained,
680
00:52:21,690 --> 00:52:25,935
Google was expecting Project
Maven would ramp up to as much
681
00:52:25,935 --> 00:52:30,940
as $250 million, and that this
entire effort would provide
682
00:52:32,321 --> 00:52:34,323
Google with Special Defense
Department certification to make
683
00:52:34,323 --> 00:52:37,119
them available for even
bigger defense contracts,
684
00:52:37,119 --> 00:52:38,810
some worth as much
as $10 billion.
685
00:52:50,167 --> 00:52:54,205
The pressure for Google to
compete for military contracts
686
00:52:54,205 --> 00:52:56,380
has come at a time
when its competitors
687
00:52:56,380 --> 00:52:58,175
are also shifting their culture.
688
00:53:00,246 --> 00:53:04,422
Amazon, similarly pitching the
military and law enforcement.
689
00:53:04,422 --> 00:53:06,459
IBM and other leading firms,
690
00:53:06,459 --> 00:53:09,289
they're pitching law
enforcement and military.
691
00:53:10,704 --> 00:53:13,845
To stay competitive, Google
has slowly transformed.
692
00:53:19,541 --> 00:53:23,269
- The Defense Science
Board said of all of the
693
00:53:23,269 --> 00:53:26,548
technological advances that
are happening right now,
694
00:53:26,548 --> 00:53:30,828
the single most important thing
was artificial intelligence
695
00:53:30,828 --> 00:53:35,177
and the autonomous operations
that it would lead to.
696
00:53:35,177 --> 00:53:36,730
Are we investing enough?
697
00:53:42,219 --> 00:53:45,670
- Once we develop
what are known as
698
00:53:45,670 --> 00:53:50,227
autonomous lethal
weapons, in other words,
699
00:53:50,227 --> 00:53:53,195
weapons that are not
controlled at all,
700
00:53:53,195 --> 00:53:55,922
they are genuinely autonomous,
701
00:53:55,922 --> 00:53:58,165
you've only got to get
a president who says,
702
00:53:58,165 --> 00:54:00,858
"The hell with international
law, we've got these weapons.
703
00:54:00,858 --> 00:54:03,136
We're going to do what
we want with them."
704
00:54:06,933 --> 00:54:08,314
- We're very close.
705
00:54:08,314 --> 00:54:10,557
When you have the
hardware already set up
706
00:54:10,557 --> 00:54:12,904
and all you have to
do is flip a switch
707
00:54:12,904 --> 00:54:14,734
to make it fully autonomous,
708
00:54:14,734 --> 00:54:17,357
what is it there that's
stopping you from doing that?
709
00:54:19,186 --> 00:54:21,603
There's something
really to be feared
710
00:54:21,603 --> 00:54:23,432
in war at machine speed.
711
00:54:24,640 --> 00:54:26,332
What if you're a machine
and you've run millions
712
00:54:26,332 --> 00:54:28,748
and millions of
different war scenarios
713
00:54:28,748 --> 00:54:30,439
and you have a team of drones
714
00:54:30,439 --> 00:54:32,441
and you've delegated
control to half of them,
715
00:54:32,441 --> 00:54:34,340
and you're collaborating
in real time?
716
00:54:35,513 --> 00:54:37,653
What happens when
that swarm of drones
717
00:54:37,653 --> 00:54:40,035
is tasked with engaging a city?
718
00:54:41,864 --> 00:54:44,281
How will they take
over that city?
719
00:54:44,281 --> 00:54:47,180
The answer is we won't
know until it happens.
720
00:54:54,429 --> 00:54:58,467
- We do not want an
AI system to decide
721
00:54:58,467 --> 00:55:00,538
what human it would attack,
722
00:55:00,538 --> 00:55:03,852
but we're going up against
authoritarian competitors.
723
00:55:03,852 --> 00:55:06,786
So in my view, an
authoritarian regime
724
00:55:06,786 --> 00:55:10,065
will have less problem
delegating authority
725
00:55:10,065 --> 00:55:12,585
to a machine to make
lethal decisions.
726
00:55:13,758 --> 00:55:17,797
So how that plays out
remains to be seen.
727
00:55:41,648 --> 00:55:45,342
- Almost the gift of AI now
is that it will force us
728
00:55:45,342 --> 00:55:48,793
collectively to think through
at a very basic level,
729
00:55:48,793 --> 00:55:50,657
what does it mean to be human?
730
00:55:53,039 --> 00:55:54,903
What do I do as a human better
731
00:55:54,903 --> 00:55:57,354
than a certain super
smart machine can do?
732
00:56:00,667 --> 00:56:05,362
First, we create our technology
and then it recreates us.
733
00:56:05,362 --> 00:56:09,987
We need to make sure that we
don't miss some of the things
734
00:56:09,987 --> 00:56:11,782
that make us so beautifully human.
735
00:56:16,027 --> 00:56:18,754
- Once we build
intelligent machines,
736
00:56:18,754 --> 00:56:21,274
the philosophical vocabulary
we have available to think
737
00:56:21,274 --> 00:56:25,036
about ourselves as human
increasingly fails us.
738
00:56:27,522 --> 00:56:30,352
If I ask you to write up a
list of all the terms you have
739
00:56:30,352 --> 00:56:33,355
available to describe
yourself as human,
740
00:56:33,355 --> 00:56:35,461
there are not so many terms.
741
00:56:35,461 --> 00:56:40,466
Culture, history, sociality,
maybe politics, civilization,
742
00:56:43,261 --> 00:56:48,266
subjectivity, all of these
terms are grounded in two positions
743
00:56:50,096 --> 00:56:52,029
that humans are more
than mere animals
744
00:56:53,410 --> 00:56:56,413
and that humans are
more than mere machines.
745
00:56:59,623 --> 00:57:03,592
But if machines truly
think, there is a large set
746
00:57:03,592 --> 00:57:07,872
of key philosophical questions
in which what is at stake is:
747
00:57:09,253 --> 00:57:10,185
Who are we?
748
00:57:10,185 --> 00:57:11,497
What is our place in the world?
749
00:57:11,497 --> 00:57:12,394
What is the world?
750
00:57:12,394 --> 00:57:13,568
How is it structured?
751
00:57:13,568 --> 00:57:16,536
Do the categories that
we have relied on,
752
00:57:16,536 --> 00:57:17,986
do they still work?
753
00:57:17,986 --> 00:57:19,436
Were they wrong?
754
00:57:23,888 --> 00:57:26,615
- Many people think
of intelligence as
something mysterious
755
00:57:26,615 --> 00:57:30,550
that can only exist inside of
biological organisms, like us,
756
00:57:30,550 --> 00:57:33,553
but intelligence is all
about information processing.
757
00:57:34,968 --> 00:57:36,453
It doesn't matter whether
the intelligence is processed
758
00:57:36,453 --> 00:57:40,318
by carbon atoms inside of
cells and brains, and people,
759
00:57:40,318 --> 00:57:42,424
or by silicon
atoms in computers.
760
00:57:45,047 --> 00:57:47,671
Part of the success of
AI recently has come
761
00:57:47,671 --> 00:57:51,951
from stealing great
ideas from evolution.
762
00:57:51,951 --> 00:57:53,470
We noticed that the
brain, for example,
763
00:57:53,470 --> 00:57:57,335
has all these neurons inside
connected in complicated ways.
764
00:57:57,335 --> 00:58:00,062
So we stole that idea
and abstracted it
765
00:58:00,062 --> 00:58:02,686
into artificial neural
networks in computers,
766
00:58:04,204 --> 00:58:07,138
and that's what
has revolutionized
machine intelligence.
767
00:58:12,005 --> 00:58:14,629
If we one day get Artificial
General Intelligence,
768
00:58:14,629 --> 00:58:18,667
then by definition, AI can
also do the job of AI
769
00:58:18,667 --> 00:58:23,258
programming better and that means
that further progress in making
770
00:58:23,258 --> 00:58:27,331
AI will be dominated not by
human programmers, but by AI.
771
00:58:30,196 --> 00:58:33,440
Recursively self-improving AI
could leave human intelligence
772
00:58:33,440 --> 00:58:37,341
far behind, creating
super intelligence.
773
00:58:39,343 --> 00:58:42,346
It's gonna be the last
invention we ever need to make,
774
00:58:42,346 --> 00:58:44,900
because it can then
invent everything else
775
00:58:44,900 --> 00:58:46,384
much faster than we could.
776
00:59:58,456 --> 01:00:03,427
- There is a future that
we all need to talk about.
777
01:00:04,842 --> 01:00:06,775
Some of the fundamental
questions about the future
778
01:00:06,775 --> 01:00:10,641
of artificial intelligence,
not just where it's going,
779
01:00:10,641 --> 01:00:14,024
but what it means for
society to go there.
780
01:00:15,370 --> 01:00:19,029
It is not what computers can do,
781
01:00:19,029 --> 01:00:21,997
but what computers should do.
782
01:00:21,997 --> 01:00:25,345
As the generation of
people that is bringing AI
783
01:00:25,345 --> 01:00:26,657
to the future,
784
01:00:26,657 --> 01:00:28,763
we are the generation
785
01:00:28,763 --> 01:00:32,387
that will answer this
question first and foremost.
786
01:00:38,980 --> 01:00:41,638
- We haven't created
the human-level
thinking machine yet,
787
01:00:41,638 --> 01:00:43,812
but we get closer and closer.
788
01:00:45,572 --> 01:00:49,231
Maybe we'll get to human-level
AI in five years from now
789
01:00:49,231 --> 01:00:51,544
or maybe it'll take 50
or 100 years from now.
790
01:00:51,544 --> 01:00:53,201
It almost doesn't matter.
791
01:00:53,201 --> 01:00:55,997
Like these are all
really, really soon,
792
01:00:55,997 --> 01:01:00,657
in terms of the overall
history of humanity.
793
01:01:05,662 --> 01:01:06,870
Very nice.
794
01:01:20,090 --> 01:01:23,714
So, the AI field is
extremely international.
795
01:01:23,714 --> 01:01:28,201
China is up and coming and
it's starting to rival the US,
796
01:01:28,201 --> 01:01:31,308
Europe and Japan in
terms of putting a lot
797
01:01:31,308 --> 01:01:34,173
of processing power behind AI
798
01:01:34,173 --> 01:01:37,555
and gathering a lot of
data to help AI learn.
799
01:01:40,386 --> 01:01:44,942
We have a young generation
of Chinese researchers now.
800
01:01:44,942 --> 01:01:46,841
Nobody knows where
the next revolution
801
01:01:46,841 --> 01:01:48,049
is going to come from.
802
01:01:54,711 --> 01:01:58,611
- China always wanted to become
the superpower in the world.
803
01:02:00,820 --> 01:02:01,752
The Chinese government thinks AI
804
01:02:01,752 --> 01:02:03,202
gives them the chance to become
805
01:02:03,202 --> 01:02:08,207
one of the most advanced
technology-wise and business-wise.
806
01:02:09,070 --> 01:02:10,692
So the Chinese government looks
807
01:02:10,692 --> 01:02:11,969
at this as a huge opportunity.
808
01:02:13,557 --> 01:02:18,113
Like they've raised a flag and
said, "That's a good field.
809
01:02:18,113 --> 01:02:20,702
The companies should
jump into it."
810
01:02:20,702 --> 01:02:22,600
Then China's commercial
world and companies say,
811
01:02:22,600 --> 01:02:25,189
"Okay, the government
raised a flag, that's good.
812
01:02:25,189 --> 01:02:26,639
Let's put the money into it."
813
01:02:28,261 --> 01:02:30,298
Chinese tech giants, like Baidu,
814
01:02:30,298 --> 01:02:32,507
like Tencent and like Alibaba,
815
01:02:32,507 --> 01:02:36,200
they put a lot of the
investment into the AI field.
816
01:02:37,823 --> 01:02:41,240
So we see that China's AI
development is booming.
817
01:02:47,798 --> 01:02:51,768
- In China, everybody has
Alipay and WeChat pay,
818
01:02:51,768 --> 01:02:53,770
so mobile payment is everywhere.
819
01:02:55,254 --> 01:02:58,947
And with that, they can
do a lot of AI analysis
820
01:02:58,947 --> 01:03:03,676
to know your spending
habits, your credit rating.
821
01:03:05,540 --> 01:03:10,476
Face recognition technology
is widely adopted in China,
822
01:03:10,476 --> 01:03:12,443
in airports and train stations.
823
01:03:13,617 --> 01:03:15,930
So, in the future, maybe
in just a few months,
824
01:03:15,930 --> 01:03:19,519
you won't need a paper
ticket to board a train.
825
01:03:19,519 --> 01:03:20,417
Only your face.
826
01:03:29,012 --> 01:03:32,670
- We generate the
world's biggest platform
827
01:03:32,670 --> 01:03:34,051
for facial recognition.
828
01:03:35,639 --> 01:03:40,678
We have 300,000 developers
using our platform.
829
01:03:42,749 --> 01:03:45,925
A lot of it is
selfie camera apps.
830
01:03:45,925 --> 01:03:48,341
It makes you look
more beautiful.
831
01:03:50,965 --> 01:03:54,175
There are millions and millions
of cameras in the world,
832
01:03:55,555 --> 01:03:59,249
each camera from my point
is a data generator.
833
01:04:03,149 --> 01:04:06,843
In a machine's eye, your face
will change into the features
834
01:04:06,843 --> 01:04:10,985
and it will turn your face
into a paragraph of code.
835
01:04:12,434 --> 01:04:16,335
So we can detect how old you
are, if you're male or female,
836
01:04:16,335 --> 01:04:17,819
and your emotions.
837
01:04:21,547 --> 01:04:25,137
Shopping is about what kind
of thing you are looking at.
838
01:04:25,137 --> 01:04:28,761
We can track your eyeballs,
so if you are focusing
839
01:04:28,761 --> 01:04:30,038
on some product,
840
01:04:30,038 --> 01:04:33,007
we can track that
so that we can know
841
01:04:33,007 --> 01:04:36,389
which kind of people like
which kind of product.
842
01:05:50,360 --> 01:05:52,396
- The Chinese government
is using multiple
843
01:05:52,396 --> 01:05:55,192
different kinds of
technologies, whether it's AI,
844
01:05:55,192 --> 01:05:58,023
whether it's big data
platforms, facial recognition,
845
01:05:58,023 --> 01:06:01,371
voice recognition,
essentially to monitor
846
01:06:01,371 --> 01:06:03,131
what the population is doing.
847
01:06:06,203 --> 01:06:08,585
I think the Chinese
government has made very clear
848
01:06:08,585 --> 01:06:13,590
its intent to gather massive
amounts of data about people
849
01:06:14,763 --> 01:06:17,490
to socially engineer a
dissent-free society.
850
01:06:20,355 --> 01:06:23,393
The logic behind the
Chinese government's
851
01:06:23,393 --> 01:06:27,121
social credit system
is to take the idea of
852
01:06:27,121 --> 01:06:32,126
whether you are creditworthy
for a financial loan
853
01:06:33,541 --> 01:06:36,199
and add to it a very
political dimension to say,
854
01:06:36,199 --> 01:06:38,649
"Are you a trustworthy
human being?
855
01:06:40,997 --> 01:06:42,481
What you've said online,
856
01:06:42,481 --> 01:06:44,655
have you ever been critical
of the authorities?
857
01:06:44,655 --> 01:06:46,312
Do you have a criminal record?"
858
01:06:48,073 --> 01:06:51,697
And all that information is
packaged up together to rate
859
01:06:51,697 --> 01:06:56,219
you in ways that if you have
performed well in their view,
860
01:06:56,219 --> 01:06:58,842
you'll have easier
access to certain kinds
861
01:06:58,842 --> 01:07:00,809
of state services or benefits.
862
01:07:02,087 --> 01:07:04,020
But if you haven't
done very well,
863
01:07:04,020 --> 01:07:06,263
you are going to be
penalized or restricted.
864
01:07:10,267 --> 01:07:13,374
There's no way for people to
challenge those designations
865
01:07:13,374 --> 01:07:14,858
or, in some cases,
866
01:07:14,858 --> 01:07:16,446
even know that they've
been put in that category,
867
01:07:16,446 --> 01:07:19,725
and it's not until they
try to access some kind
868
01:07:19,725 --> 01:07:22,314
of state service or
buy a plane ticket,
869
01:07:22,314 --> 01:07:24,799
or get a passport, or
enroll their kid in school,
870
01:07:24,799 --> 01:07:27,250
that they come to learn
that they've been labeled
871
01:07:27,250 --> 01:07:28,458
in this way,
872
01:07:28,458 --> 01:07:30,598
and that there are
negative consequences
873
01:07:30,598 --> 01:07:31,909
for them as a result.
874
01:07:49,341 --> 01:07:53,069
We've spent the better part
of the last one or two years
875
01:07:53,069 --> 01:07:57,521
looking at abuses
of surveillance
technology across China,
876
01:07:57,521 --> 01:08:00,490
and a lot of that work
has taken us to Xinjiang,
877
01:08:01,870 --> 01:08:05,529
the northwestern region of
China whose population is more than
878
01:08:05,529 --> 01:08:09,947
half Turkic Muslims:
Uyghurs, Kazakhs and Hui.
879
01:08:12,847 --> 01:08:15,298
This is a region
and a population the
Chinese government
880
01:08:15,298 --> 01:08:19,164
has long considered to be
politically suspect or disloyal.
881
01:08:21,925 --> 01:08:24,514
We came to find information
about what's called
882
01:08:24,514 --> 01:08:27,482
the Integrated Joint
Operations Platform,
883
01:08:27,482 --> 01:08:31,521
which is a predictive policing
program and that's one
884
01:08:31,521 --> 01:08:35,007
of the programs that has been
spitting out lists of names
885
01:08:35,007 --> 01:08:37,906
of people to be subjected
to political re-education.
886
01:08:43,395 --> 01:08:47,123
A number of our interviewees
for the report we just released
887
01:08:47,123 --> 01:08:50,471
about the political education
camps in Xinjiang just
888
01:08:50,471 --> 01:08:54,751
painted an extraordinary
portrait of a
surveillance state.
889
01:08:57,857 --> 01:09:00,722
A region awash in
surveillance cameras
890
01:09:00,722 --> 01:09:04,899
for facial recognition purposes,
checkpoints, body scanners,
891
01:09:04,899 --> 01:09:08,316
QR codes outside people's homes.
892
01:09:09,973 --> 01:09:13,873
Yeah, it really is the
stuff of dystopian movies
893
01:09:13,873 --> 01:09:15,323
that we've all gone
to and thought,
894
01:09:15,323 --> 01:09:17,463
"Wow, that would be a
creepy world to live in."
895
01:09:17,463 --> 01:09:20,811
Yeah, well, 13 million
Turkic Muslims in China
896
01:09:20,811 --> 01:09:22,917
are living in that
reality right now.
897
01:09:35,171 --> 01:09:37,414
- The Intercept reports
that Google is planning to
898
01:09:37,414 --> 01:09:40,797
launch a censored version of
its search engine in China.
899
01:09:40,797 --> 01:09:43,075
- Google's search
for new markets leads it
900
01:09:43,075 --> 01:09:46,492
to China, despite Beijing's
rules on censorship.
901
01:09:46,492 --> 01:09:48,391
- Tell us more about
why you felt it was
902
01:09:48,391 --> 01:09:50,979
your ethical
responsibility to resign,
903
01:09:50,979 --> 01:09:52,429
because you talk
about being complicit
904
01:09:52,429 --> 01:09:55,398
in censorship and
oppression, and surveillance.
905
01:09:55,398 --> 01:09:58,539
- There is a Chinese joint
venture company that has to be
906
01:09:58,539 --> 01:10:00,575
set up for Google
to operate in China.
907
01:10:00,575 --> 01:10:03,233
And the question is, to what
degree did they get to control
908
01:10:03,233 --> 01:10:06,305
the blacklist and to what
degree would they have just
909
01:10:06,305 --> 01:10:09,584
unfettered access to
surveilling Chinese citizens?
910
01:10:09,584 --> 01:10:11,862
And the fact that Google
refuses to respond
911
01:10:11,862 --> 01:10:13,623
to human rights
organizations on this,
912
01:10:13,623 --> 01:10:16,488
I think should be extremely
disturbing to everyone.
913
01:10:21,009 --> 01:10:22,735
Due to my conviction
that dissent
914
01:10:22,735 --> 01:10:25,117
is fundamental to
functioning democracies
915
01:10:25,117 --> 01:10:27,878
I am forced to resign in
order to avoid contributing
916
01:10:27,878 --> 01:10:29,949
to or profiting from the erosion
917
01:10:29,949 --> 01:10:31,675
of protections for dissidents.
918
01:10:33,263 --> 01:10:35,300
The United Nations is currently
reporting that between
919
01:10:35,300 --> 01:10:38,613
200,000 and one million
Uyghurs have been disappeared
920
01:10:38,613 --> 01:10:40,960
into re-education camps.
921
01:10:40,960 --> 01:10:42,307
And there is a serious argument
922
01:10:42,307 --> 01:10:43,894
that Google would be complicit
923
01:10:43,894 --> 01:10:46,483
should it launch a surveilled
version of search in China.
924
01:10:49,590 --> 01:10:54,560
Dragonfly is a project meant
to launch search in China under
925
01:10:55,941 --> 01:11:00,256
Chinese government regulations,
which include censoring
926
01:11:00,256 --> 01:11:03,845
sensitive content, basic
queries on human rights,
927
01:11:03,845 --> 01:11:07,539
information about political
representatives is blocked,
928
01:11:07,539 --> 01:11:11,405
information about student
protests is blocked.
929
01:11:11,405 --> 01:11:13,510
And that's one small part of it.
930
01:11:13,510 --> 01:11:16,410
Perhaps a deeper concern is
the surveillance side of this.
931
01:11:20,414 --> 01:11:22,174
When I raised the
issue with my managers,
932
01:11:22,174 --> 01:11:24,107
with my colleagues,
933
01:11:24,107 --> 01:11:26,212
there was a lot of concern,
but everyone just said,
934
01:11:26,212 --> 01:11:27,386
"I don't know anything."
935
01:11:32,149 --> 01:11:34,566
And then when there
was a meeting finally,
936
01:11:34,566 --> 01:11:36,913
there was essentially
no addressing
937
01:11:36,913 --> 01:11:39,018
the serious concerns
associated with it.
938
01:11:41,366 --> 01:11:44,230
So then I filed my
formal resignation,
939
01:11:44,230 --> 01:11:45,611
not just to my manager,
940
01:11:45,611 --> 01:11:47,337
but I actually distributed
it company-wide.
941
01:11:47,337 --> 01:11:49,615
And that's the letter
that I was reading from.
942
01:11:54,793 --> 01:11:57,347
Personally, I
haven't slept well.
943
01:11:57,347 --> 01:12:00,316
I've had pretty
horrific headaches,
944
01:12:00,316 --> 01:12:02,835
wake up in the middle of
the night just sweating.
945
01:12:04,975 --> 01:12:07,668
With that said, what I
found since speaking out
946
01:12:07,668 --> 01:12:12,293
is just how positive the global
response to this has been.
947
01:12:14,675 --> 01:12:17,954
Engineers should demand
to know what the uses
948
01:12:17,954 --> 01:12:20,232
of their technical
contributions are
949
01:12:20,232 --> 01:12:23,235
and to have a seat at the table
in those ethical decisions.
950
01:12:31,381 --> 01:12:33,935
Most citizens don't really
understand what it means to be
951
01:12:33,935 --> 01:12:36,352
embedded in a very large-scale
prescriptive technology.
952
01:12:38,112 --> 01:12:40,701
Where someone has already
pre-divided the work
953
01:12:40,701 --> 01:12:42,634
and all you know about
is your little piece,
954
01:12:42,634 --> 01:12:45,499
and almost certainly you don't
understand how it fits in.
955
01:12:49,019 --> 01:12:51,401
So, I think it's worth
drawing the analogy
956
01:12:51,401 --> 01:12:55,440
to physicists' work
on the atomic bomb.
957
01:12:58,304 --> 01:13:00,928
In fact, that's actually
the community I came out of.
958
01:13:03,551 --> 01:13:05,588
I wasn't a nuclear
scientist by any means,
959
01:13:05,588 --> 01:13:07,313
but I was an applied
mathematician
960
01:13:08,763 --> 01:13:11,145
and my PhD program
was actually funded
961
01:13:11,145 --> 01:13:13,734
to train people to
work in weapons labs.
962
01:13:17,323 --> 01:13:18,532
One could certainly argue
963
01:13:18,532 --> 01:13:21,293
that there is an
existential threat
964
01:13:22,708 --> 01:13:25,677
and whoever is leading in
AI will lead militarily.
965
01:13:35,756 --> 01:13:38,621
- China fully expects to
pass the United States
966
01:13:38,621 --> 01:13:41,831
as the number one economy in
the world and it believes that
967
01:13:41,831 --> 01:13:46,836
AI will make that jump more
quickly and more dramatically.
968
01:13:48,285 --> 01:13:50,840
And they also see it as
being able to leapfrog
969
01:13:50,840 --> 01:13:54,947
the United States in
terms of military power.
970
01:14:01,920 --> 01:14:03,922
Their plan is very simple.
971
01:14:03,922 --> 01:14:05,337
We want to catch up with
the United States
972
01:14:05,337 --> 01:14:07,960
in these technologies by 2020,
973
01:14:07,960 --> 01:14:09,617
we want to surpass
the United States
974
01:14:09,617 --> 01:14:12,240
in these technologies by 2025,
975
01:14:12,240 --> 01:14:13,552
and we want to be
the world leader
976
01:14:13,552 --> 01:14:16,935
in AI and autonomous
technologies by 2030.
977
01:14:19,455 --> 01:14:21,215
It is a national plan.
978
01:14:21,215 --> 01:14:26,151
It is backed up by at least
$150 billion in investments.
979
01:14:26,151 --> 01:14:29,119
So, this is definitely a race.
980
01:14:54,455 --> 01:14:56,043
- AI is a little bit like fire.
981
01:14:57,734 --> 01:15:00,599
Fire was invented
700,000 years ago,
982
01:15:01,773 --> 01:15:03,913
and it has its pros and cons.
983
01:15:06,294 --> 01:15:09,470
People realized you can
use fire to keep warm
984
01:15:09,470 --> 01:15:10,885
at night and to cook,
985
01:15:12,818 --> 01:15:14,233
but they also realized
986
01:15:14,233 --> 01:15:18,306
that you can kill
other people with that.
987
01:15:24,209 --> 01:15:28,627
Fire also has this
AI-like quality of growing
988
01:15:28,627 --> 01:15:31,734
in a wildfire without
further human ado,
989
01:15:34,875 --> 01:15:39,880
but the advantages outweigh
the disadvantages by so much
990
01:15:41,053 --> 01:15:43,608
that we are not going
to stop its development.
991
01:15:53,859 --> 01:15:55,551
Europe is waking up.
992
01:15:56,724 --> 01:16:00,521
Lots of companies in
Europe are realizing
993
01:16:00,521 --> 01:16:02,730
that the next wave of AI
994
01:16:02,730 --> 01:16:05,871
will be much bigger
than the current wave.
995
01:16:07,597 --> 01:16:12,360
The next wave of AI
will be about robots.
996
01:16:14,121 --> 01:16:19,126
All these machines that make
things, that produce stuff,
997
01:16:20,506 --> 01:16:23,648
that build other machines,
they are going to become smart.
998
01:16:31,241 --> 01:16:34,313
In the not-so-distant
future, we will have robots
999
01:16:34,313 --> 01:16:37,593
that we can teach
like we teach kids.
1000
01:16:39,974 --> 01:16:43,012
For example, I will talk to a
little robot and I will say,
1001
01:16:44,358 --> 01:16:46,981
"Look here, robot, look here.
1002
01:16:46,981 --> 01:16:49,052
Let's assemble a smartphone.
1003
01:16:49,052 --> 01:16:50,951
We take this slab
of plastic like that
1004
01:16:50,951 --> 01:16:53,160
and we take a
screwdriver like that,
1005
01:16:53,160 --> 01:16:56,750
and now we screw in
everything like this.
1006
01:16:56,750 --> 01:16:58,752
No, no, not like this.
1007
01:16:58,752 --> 01:17:01,962
Like this, look, robot,
look, like this."
1008
01:17:03,411 --> 01:17:06,173
And he will fail a couple
of times but rather quickly,
1009
01:17:06,173 --> 01:17:09,176
he will learn to do the
same thing much better
1010
01:17:09,176 --> 01:17:10,764
than I could do it.
1011
01:17:10,764 --> 01:17:14,353
And then we stop the learning
and we make a million copies,
1012
01:17:14,353 --> 01:17:15,423
and sell them.
1013
01:17:36,686 --> 01:17:40,103
Regulation of AI sounds
like an attractive idea,
1014
01:17:40,103 --> 01:17:42,243
but I don't think it's possible.
1015
01:17:44,625 --> 01:17:47,041
One of the reasons
why it won't work is
1016
01:17:47,041 --> 01:17:50,700
the sheer curiosity
of scientists.
1017
01:17:51,874 --> 01:17:53,910
They don't give a
damn for regulation.
1018
01:17:56,982 --> 01:18:01,055
Military powers won't give a
damn for regulations, either.
1019
01:18:01,055 --> 01:18:03,609
They will say, "If we,
the Americans don't do it,
1020
01:18:03,609 --> 01:18:05,232
then the Chinese will do it."
1021
01:18:05,232 --> 01:18:07,544
And the Chinese will
say, "If we don't do it,
1022
01:18:07,544 --> 01:18:09,029
then the Russians will do it."
1023
01:18:11,963 --> 01:18:15,207
No matter what kind of political
regulation is out there,
1024
01:18:15,207 --> 01:18:18,970
all these military
industrial complexes,
1025
01:18:18,970 --> 01:18:21,800
they will almost by
definition have to ignore that
1026
01:18:23,077 --> 01:18:25,183
because they want to
avoid falling behind.
1027
01:18:37,505 --> 01:18:40,474
- A program developed by
the company OpenAI can write
1028
01:18:40,474 --> 01:18:43,788
coherent and credible stories
just like human beings.
1029
01:18:43,788 --> 01:18:45,824
- It's one
small step for machine,
1030
01:18:45,824 --> 01:18:48,654
one giant leap for machine kind.
1031
01:18:48,654 --> 01:18:51,485
IBM's newest artificial
intelligence system took on
1032
01:18:51,485 --> 01:18:55,903
experienced human debaters
and won a live debate.
1033
01:18:55,903 --> 01:18:58,526
- Computer-generated
videos known as deep fakes
1034
01:18:58,526 --> 01:19:02,254
are being used to put women's
faces on pornographic videos.
1035
01:19:06,811 --> 01:19:10,746
- Artificial intelligence
evolves at a very crazy pace.
1036
01:19:12,299 --> 01:19:14,439
You know, it's like
progressing so fast.
1037
01:19:14,439 --> 01:19:17,028
In some ways, we're only
at the beginning right now.
1038
01:19:18,754 --> 01:19:21,687
You have so many potential
applications, it's a gold mine.
1039
01:19:23,931 --> 01:19:27,763
Since 2012, when deep learning
became a big game changer
1040
01:19:27,763 --> 01:19:29,661
in the computer
vision community,
1041
01:19:29,661 --> 01:19:33,044
we were one of the first to
actually adopt deep learning
1042
01:19:33,044 --> 01:19:35,425
and apply it in the field
of computer graphics.
1043
01:19:38,946 --> 01:19:41,846
A lot of our research
is funded by government,
1044
01:19:41,846 --> 01:19:44,124
military intelligence agencies.
1045
01:19:48,300 --> 01:19:52,132
The way we create these
photoreal mappings,
1046
01:19:52,132 --> 01:19:54,617
usually the way it works is
that we need two subjects,
1047
01:19:54,617 --> 01:19:57,827
a source and a target, and
I can do a face replacement.
1048
01:20:03,177 --> 01:20:06,042
One of the applications
is, for example,
1049
01:20:06,042 --> 01:20:08,113
I want to manipulate
someone's face
1050
01:20:08,113 --> 01:20:09,632
to say things that he did not say.
1051
01:20:13,291 --> 01:20:16,501
It can be used for creative
things, for funny contents,
1052
01:20:16,501 --> 01:20:20,470
but obviously, it can also
be used to just simply
1053
01:20:20,470 --> 01:20:22,990
manipulate videos and
generate fake news.
1054
01:20:25,406 --> 01:20:27,512
This can be very dangerous.
1055
01:20:29,134 --> 01:20:31,343
If it gets into the wrong hands,
1056
01:20:31,343 --> 01:20:33,345
it can get out of
control very quickly.
1057
01:20:37,729 --> 01:20:40,180
- We're entering an era
in which our enemies can
1058
01:20:40,180 --> 01:20:42,389
make it look like anyone
is saying anything
1059
01:20:42,389 --> 01:20:44,046
at any point in time,
1060
01:20:44,046 --> 01:20:46,876
even if they would
never say those things.
1061
01:20:46,876 --> 01:20:49,085
Moving forward, we need
to be more vigilant
1062
01:20:49,085 --> 01:20:51,570
with what we trust
from the Internet.
1063
01:20:51,570 --> 01:20:54,401
It may sound basic,
but how we move forward
1064
01:20:55,851 --> 01:20:59,613
in the age of information is
going to be the difference
1065
01:20:59,613 --> 01:21:02,478
between whether we survive
or whether we become
1066
01:21:02,478 --> 01:21:04,860
some kind of fucked up dystopia.
1067
01:22:35,433 --> 01:22:38,056
- One criticism that
is frequently raised
1068
01:22:38,056 --> 01:22:41,577
against my work is people saying,
1069
01:22:41,577 --> 01:22:43,751
"Hey, you know there
were stupid ideas
1070
01:22:43,751 --> 01:22:47,963
in the past like
phrenology or physiognomy.
1071
01:22:49,550 --> 01:22:53,692
There were people claiming
that you can read the character
1072
01:22:53,692 --> 01:22:56,385
of a person just
based on their face."
1073
01:22:58,214 --> 01:23:00,320
People would say,
"This is rubbish.
1074
01:23:00,320 --> 01:23:05,325
We know it was just thinly
veiled racism and superstition."
1075
01:23:09,985 --> 01:23:13,954
But the fact that someone
made a claim in the past
1076
01:23:13,954 --> 01:23:18,959
and tried to support this
claim with invalid reasoning,
1077
01:23:20,133 --> 01:23:22,721
doesn't automatically
invalidate the claim.
1078
01:23:28,072 --> 01:23:30,005
Of course, people
should have rights
1079
01:23:30,005 --> 01:23:31,316
to their privacy when it comes
1080
01:23:31,316 --> 01:23:34,595
to sexual orientation
or political views,
1081
01:23:36,666 --> 01:23:38,082
but I'm also afraid
1082
01:23:38,082 --> 01:23:40,015
that in the current
technological environment,
1083
01:23:40,015 --> 01:23:41,947
this is essentially impossible.
1084
01:23:46,987 --> 01:23:49,231
People should realize
there's no going back.
1085
01:23:49,231 --> 01:23:52,717
There's no running away
from the algorithms.
1086
01:23:55,444 --> 01:24:00,035
The sooner we accept
the inevitable and
inconvenient truth
1087
01:24:00,863 --> 01:24:03,659
that privacy is gone,
1088
01:24:06,041 --> 01:24:09,630
the sooner we can actually
start thinking about
1089
01:24:09,630 --> 01:24:11,770
how to make sure
that our societies
1090
01:24:11,770 --> 01:24:15,774
are ready for the
Post-Privacy Age.
1091
01:24:39,350 --> 01:24:41,559
- While speaking about
facial recognition,
1092
01:24:41,559 --> 01:24:45,528
in my deep thoughts, I sometimes
get to the very dark era
1093
01:24:47,082 --> 01:24:48,048
of our history.
1094
01:24:49,877 --> 01:24:53,364
When people had
to live in a system,
1095
01:24:53,364 --> 01:24:57,333
where some part of the
society was accepted
1096
01:24:57,333 --> 01:25:01,096
and some part of the society
was condemned to death.
1097
01:25:05,962 --> 01:25:09,311
What would Mengele do to
have such an instrument
1098
01:25:09,311 --> 01:25:10,139
in his hands?
1099
01:25:15,317 --> 01:25:19,493
It would be very quick and
efficient for selection
1100
01:25:22,738 --> 01:25:26,604
and this is the
apocalyptic vision.
1101
01:26:20,175 --> 01:26:24,179
- So in the near future,
the entire story of you
1102
01:26:24,179 --> 01:26:29,184
will exist in a vast array of
connected databases of faces,
1103
01:26:30,081 --> 01:26:32,704
genomes, behaviors and emotion.
1104
01:26:35,259 --> 01:26:39,642
So, you will have a digital
avatar of yourself online,
1105
01:26:39,642 --> 01:26:43,474
which records how well you
are doing as a citizen,
1106
01:26:43,474 --> 01:26:46,201
what kind of
relationships you have,
1107
01:26:46,201 --> 01:26:50,205
what kind of political
orientation and
sexual orientation you have.
1108
01:26:54,347 --> 01:26:58,454
Based on all of those data,
those algorithms will be able to
1109
01:26:58,454 --> 01:27:02,665
manipulate your behavior
with extreme precision,
1110
01:27:02,665 --> 01:27:07,083
changing how we think and
probably in the future,
1111
01:27:07,083 --> 01:27:08,119
how we feel.
1112
01:27:29,968 --> 01:27:33,178
- The beliefs and
desires of the first AGIs
1113
01:27:33,178 --> 01:27:34,766
will be extremely important.
1114
01:27:37,044 --> 01:27:39,254
So, it's important to
program them correctly.
1115
01:27:40,496 --> 01:27:42,153
I think that if
this is not done,
1116
01:27:43,534 --> 01:27:48,124
then the nature of evolution,
of natural selection, will favor
1117
01:27:49,505 --> 01:27:51,921
those systems that prioritize their
own survival above all else.
1118
01:27:56,201 --> 01:27:59,791
It's not that it's going
to actively hate humans
1119
01:27:59,791 --> 01:28:00,827
and want to harm them,
1120
01:28:03,070 --> 01:28:05,659
but it's just going
to be too powerful
1121
01:28:05,659 --> 01:28:06,833
and I think a good analogy
1122
01:28:06,833 --> 01:28:09,732
would be the way
humans treat animals.
1123
01:28:11,251 --> 01:28:12,356
It's not that we hate animals.
1124
01:28:12,356 --> 01:28:13,874
I think humans love animals
1125
01:28:13,874 --> 01:28:15,738
and have a lot of
affection for them,
1126
01:28:16,636 --> 01:28:18,707
but when the time comes
1127
01:28:18,707 --> 01:28:21,917
to build a highway
between two cities,
1128
01:28:21,917 --> 01:28:24,368
we are not asking the
animals for permission.
1129
01:28:24,368 --> 01:28:27,302
We just do it because
it's important for us.
1130
01:28:28,648 --> 01:28:30,753
And I think by default, that's
the kind of relationship
1131
01:28:30,753 --> 01:28:33,342
that's going to be between us
1132
01:28:33,342 --> 01:28:37,242
and AGIs which are
truly autonomous
1133
01:28:37,242 --> 01:28:39,003
and operating on
their own behalf.
1134
01:28:51,464 --> 01:28:55,675
If you have an arms-race
dynamics between multiple teams
1135
01:28:55,675 --> 01:28:57,297
trying to build the AGI first,
1136
01:28:58,471 --> 01:29:00,749
they will have less
time to make sure
1137
01:29:00,749 --> 01:29:02,302
that the AGI that they build
1138
01:29:03,165 --> 01:29:04,546
will care deeply for humans.
1139
01:29:07,790 --> 01:29:10,828
Because the way I imagine it
is that there is an avalanche,
1140
01:29:10,828 --> 01:29:13,451
there is an avalanche
of AGI development.
1141
01:29:13,451 --> 01:29:16,558
Imagine it's a huge
unstoppable force.
1142
01:29:20,078 --> 01:29:23,323
And I think it's pretty likely
the entire surface of the
1143
01:29:23,323 --> 01:29:26,222
earth would be covered with
solar panels and data centers.
1144
01:29:30,330 --> 01:29:32,953
Given these kinds of concerns,
1145
01:29:32,953 --> 01:29:37,095
it will be important that
the AGI is somehow built
1146
01:29:37,095 --> 01:29:39,857
as a cooperation among
multiple countries.
1147
01:29:42,377 --> 01:29:45,103
The future is going to be
good for the AIs, regardless.
1148
01:29:46,553 --> 01:29:49,004
It would be nice if it would
be good for humans as well.
1149
01:30:11,509 --> 01:30:14,926
- Is there a lot of
responsibility weighing
on my shoulders?
1150
01:30:14,926 --> 01:30:15,858
Not really.
1151
01:30:17,791 --> 01:30:21,070
Was there a lot of
responsibility on the shoulders
1152
01:30:21,070 --> 01:30:23,038
of the parents of Einstein?
1153
01:30:24,488 --> 01:30:26,110
The parents somehow made him,
1154
01:30:26,110 --> 01:30:29,769
but they had no way of
predicting what he would do,
1155
01:30:29,769 --> 01:30:31,460
and how he would
change the world.
1156
01:30:32,875 --> 01:30:36,845
And so, you can't really hold
them responsible for that.
1157
01:31:01,939 --> 01:31:04,528
So, I'm not a very
human-centric person.
1158
01:31:06,081 --> 01:31:09,774
I think I'm a little stepping
stone in the evolution
1159
01:31:09,774 --> 01:31:12,190
of the Universe towards
higher complexity.
1160
01:31:15,400 --> 01:31:19,163
But it's also clear to me that
I'm not the crown of creation
1161
01:31:19,163 --> 01:31:23,616
and that humankind as a whole
is not the crown of creation,
1162
01:31:25,480 --> 01:31:27,343
but we are setting the
stage for something
1163
01:31:27,343 --> 01:31:30,485
that is bigger than
us, that transcends us.
1164
01:31:33,004 --> 01:31:35,420
And then it will go
out there in a way
1165
01:31:35,420 --> 01:31:36,629
where humans cannot follow
1166
01:31:36,629 --> 01:31:39,563
and transform the entire
universe, or at least,
1167
01:31:39,563 --> 01:31:41,565
the reachable universe.
1168
01:31:45,845 --> 01:31:50,850
So, I find beauty and
awe in seeing myself
1169
01:31:52,023 --> 01:31:53,577
as part of this
much grander theme.
1170
01:32:18,360 --> 01:32:20,327
- AI is inevitable.
1171
01:32:21,915 --> 01:32:26,920
We need to make sure we have
the necessary human regulation
1172
01:32:28,404 --> 01:32:32,685
to prevent the weaponization
of artificial intelligence.
1173
01:32:33,893 --> 01:32:36,343
We don't need any
more weaponization
1174
01:32:36,343 --> 01:32:38,311
of such a powerful tool.
1175
01:32:41,452 --> 01:32:43,385
- One of the most
critical things, I think,
1176
01:32:43,385 --> 01:32:46,284
is the need for
international governance.
1177
01:32:48,217 --> 01:32:51,427
We have an imbalance of
power here because now
1178
01:32:51,427 --> 01:32:53,498
we have corporations with
more power, might and ability
1179
01:32:53,498 --> 01:32:55,431
than entire countries.
1180
01:32:55,431 --> 01:32:58,400
How do we make sure
that people's voices
are getting heard?
1181
01:33:02,059 --> 01:33:03,957
- It can't be a law-free zone.
1182
01:33:03,957 --> 01:33:06,408
It can't be a rights-free zone.
1183
01:33:06,408 --> 01:33:10,170
We can't embrace all of these
wonderful new technologies
1184
01:33:10,170 --> 01:33:14,347
for the 21st century without
trying to bring with us
1185
01:33:14,347 --> 01:33:19,248
the package of human rights
that we fought so hard
1186
01:33:19,248 --> 01:33:23,598
to achieve, and that
remains so fragile.
1187
01:33:32,986 --> 01:33:36,507
- AI isn't good and
it isn't evil, either.
1188
01:33:36,507 --> 01:33:38,958
It's just going to
amplify the desires
1189
01:33:38,958 --> 01:33:40,994
and goals of
whoever controls it.
1190
01:33:40,994 --> 01:33:42,789
And AI today is
under the control
1191
01:33:42,789 --> 01:33:45,378
of a very, very small
group of people.
1192
01:33:48,761 --> 01:33:50,901
The most important question
that we humans have
1193
01:33:50,901 --> 01:33:53,593
to ask ourselves at
this point in history
1194
01:33:53,593 --> 01:33:55,906
requires no technical knowledge.
1195
01:33:55,906 --> 01:33:57,321
It's the question
1196
01:33:57,321 --> 01:34:01,394
of what sort of future
society do we want to create
1197
01:34:01,394 --> 01:34:03,396
with all this
technology we're making?
1198
01:34:05,640 --> 01:34:09,574
What do we want the role of
humans to be in this world?