1
00:00:30,000 --> 00:00:33,500
Correction and synchronisation:
Mazrim Taim
2
00:00:35,078 --> 00:00:41,302
What we're on the brink of is
a world of increasingly intense,
3
00:00:41,345 --> 00:00:45,219
sophisticated artificial intelligence.
4
00:00:45,262 --> 00:00:48,396
Technology is evolving
so much faster than our society
5
00:00:48,439 --> 00:00:51,181
has the ability
to protect us as citizens.
6
00:00:51,486 --> 00:00:55,707
The robots are coming,
and they will destroy our livelihoods.
7
00:01:01,887 --> 00:01:04,238
You have a networked intelligence
that watches us,
8
00:01:04,281 --> 00:01:08,590
knows everything about us,
and begins to try to change us.
9
00:01:08,633 --> 00:01:12,768
Twitter has become
the world's number-one news site.
10
00:01:12,811 --> 00:01:15,205
Technology is never good or bad.
11
00:01:15,249 --> 00:01:18,948
It's what we do
with the technology.
12
00:01:18,991 --> 00:01:22,734
Eventually, millions of people
are gonna be thrown out of jobs
13
00:01:22,778 --> 00:01:25,737
because their skills
are going to be obsolete.
14
00:01:25,781 --> 00:01:27,435
Mass unemployment...
15
00:01:27,478 --> 00:01:31,727
greater inequalities,
even social unrest.
16
00:01:32,570 --> 00:01:35,530
Regardless of whether
we should be afraid or not,
17
00:01:35,573 --> 00:01:38,185
the change is coming,
and nobody can stop it.
18
00:01:44,582 --> 00:01:48,146
We've invested huge amounts of money,
and so it stands to reason
19
00:01:48,148 --> 00:01:50,893
that the military,
with their own desires,
20
00:01:50,936 --> 00:01:53,330
are gonna start to use
these technologies.
21
00:01:53,374 --> 00:01:57,552
Autonomous weapons systems
could lead to a global arms race
22
00:01:57,595 --> 00:02:00,032
to rival the Nuclear Era.
23
00:02:02,339 --> 00:02:05,429
We know what the answer is.
They'll eventually be killing us.
24
00:02:10,826 --> 00:02:15,874
These technology leaps
are gonna yield incredible miracles
25
00:02:15,918 --> 00:02:18,181
and incredible horrors.
26
00:02:24,274 --> 00:02:29,323
We created it,
so I think, as we move forward,
27
00:02:29,366 --> 00:02:33,762
this intelligence
will contain parts of us.
28
00:02:33,805 --> 00:02:35,981
And I think the question is:
29
00:02:36,025 --> 00:02:39,463
Will it contain
the good parts...
30
00:02:39,507 --> 00:02:41,378
or the bad parts?
31
00:03:04,836 --> 00:03:08,840
The survivors
called the war "Judgment Day."
32
00:03:08,884 --> 00:03:12,583
They lived only to face
a new nightmare:
33
00:03:12,627 --> 00:03:14,319
The war against the machines.
34
00:03:15,456 --> 00:03:18,023
I think
we've completely fucked this up.
35
00:03:18,067 --> 00:03:21,549
I think Hollywood has managed
to inoculate the general public
36
00:03:21,592 --> 00:03:24,247
against this question.
37
00:03:24,291 --> 00:03:28,251
The idea of machines
that will take over the world.
38
00:03:28,295 --> 00:03:30,645
Open the pod bay doors, HAL.
39
00:03:30,688 --> 00:03:33,561
I'm sorry, Dave.
40
00:03:33,604 --> 00:03:35,911
I'm afraid I can't do that.
41
00:03:37,434 --> 00:03:38,696
HAL?
42
00:03:38,740 --> 00:03:40,437
We've cried wolf enough times...
43
00:03:40,481 --> 00:03:42,483
...that the public
has stopped paying attention,
44
00:03:42,484 --> 00:03:44,120
because it feels like
science fiction.
45
00:03:44,121 --> 00:03:46,001
Even sitting here talking
about it right now,
46
00:03:46,002 --> 00:03:48,301
it feels a little bit silly,
a little bit like,
47
00:03:48,302 --> 00:03:51,697
"Oh, this is an artifact
of some cheeseball movie."
48
00:03:51,709 --> 00:03:56,584
The WOPR spends all its time
thinking about World War III.
49
00:03:56,627 --> 00:03:59,064
But it's not.
50
00:03:59,108 --> 00:04:02,111
The general public is about
to get blindsided by this.
51
00:04:11,555 --> 00:04:13,514
As a society and as individuals,
52
00:04:13,557 --> 00:04:18,954
we're increasingly surrounded
by machine intelligence.
53
00:04:18,997 --> 00:04:22,653
We carry this pocket device
in the palm of our hand
54
00:04:22,697 --> 00:04:24,829
that we use to make
a striking array
55
00:04:24,873 --> 00:04:26,831
of life decisions right now,
56
00:04:26,875 --> 00:04:29,007
aided by a set
of distant algorithms
57
00:04:29,051 --> 00:04:30,748
of which we have no understanding.
58
00:04:34,186 --> 00:04:36,537
We're already pretty jaded
about the idea
59
00:04:36,580 --> 00:04:37,929
that we can talk to our phone,
60
00:04:37,973 --> 00:04:40,062
and it mostly understands us.
61
00:04:40,105 --> 00:04:42,456
I found quite a number
of action films.
62
00:04:42,499 --> 00:04:44,327
Five years ago -- no way.
63
00:04:44,371 --> 00:04:47,678
Robotics.
Machines that see and speak...
64
00:04:47,722 --> 00:04:48,897
...and listen.
65
00:04:48,940 --> 00:04:50,202
All that's real now.
66
00:04:50,246 --> 00:04:51,639
And these technologies
67
00:04:51,682 --> 00:04:54,886
are gonna fundamentally
change our society.
68
00:04:55,730 --> 00:05:00,212
Now we have this great
movement of self-driving cars.
69
00:05:00,256 --> 00:05:01,953
Driving a car autonomously
70
00:05:01,997 --> 00:05:05,688
can move people's lives
into a better place.
71
00:05:06,131 --> 00:05:09,570
I've lost a number of family members,
including my mother,
72
00:05:09,613 --> 00:05:11,876
my brother and sister-in-law
and their kids,
73
00:05:11,920 --> 00:05:14,009
to automobile accidents.
74
00:05:14,052 --> 00:05:18,405
It's pretty clear we could
almost eliminate car accidents
75
00:05:18,448 --> 00:05:20,102
with automation.
76
00:05:20,145 --> 00:05:21,843
30,000 lives in the U.S. alone.
77
00:05:21,886 --> 00:05:24,955
About a million around the world
per year.
78
00:05:25,499 --> 00:05:27,501
In healthcare, early indicators
79
00:05:27,544 --> 00:05:29,503
are the name of the game
in that space,
80
00:05:29,546 --> 00:05:33,158
so that's another place where
it can save somebody's life.
81
00:05:33,202 --> 00:05:35,726
Here in
the breast-cancer center,
82
00:05:35,770 --> 00:05:38,381
all the things that
the radiologist's brain
83
00:05:38,425 --> 00:05:43,386
does in two minutes,
the computer does instantaneously.
84
00:05:43,430 --> 00:05:47,303
The computer has looked
at 1 billion mammograms,
85
00:05:47,347 --> 00:05:49,261
and it takes that data
and applies it
86
00:05:49,305 --> 00:05:51,438
to this image instantaneously,
87
00:05:51,481 --> 00:05:54,441
so the medical application
is profound.
88
00:05:56,399 --> 00:05:59,402
Another really exciting area that
we're seeing a lot of development in
89
00:05:59,446 --> 00:06:03,275
is actually understanding
our genetic code
90
00:06:03,319 --> 00:06:06,104
and using that
to both diagnose disease
91
00:06:06,148 --> 00:06:07,758
and create
personalized treatments.
92
00:06:11,675 --> 00:06:14,112
The primary application
of all these machines
93
00:06:14,156 --> 00:06:17,246
will be to extend
our own intelligence.
94
00:06:17,289 --> 00:06:19,422
We'll be able to make
ourselves smarter,
95
00:06:19,466 --> 00:06:22,543
and we'll be better
at solving problems.
96
00:06:22,586 --> 00:06:25,075
We don't have to age.
We'll actually understand aging.
97
00:06:25,076 --> 00:06:27,126
We'll be able to stop it.
98
00:06:27,169 --> 00:06:29,519
There's really no limit
to what intelligent machines
99
00:06:29,563 --> 00:06:30,868
can do for the human race.
100
00:06:36,308 --> 00:06:39,399
How could a smarter machine
not be a better machine?
101
00:06:42,053 --> 00:06:44,708
It's hard to say exactly
when I began to think
102
00:06:44,752 --> 00:06:46,971
that that was a bit naive.
103
00:06:56,503 --> 00:07:00,898
Stuart Russell, he's basically a god
in the field of artificial intelligence.
104
00:07:00,942 --> 00:07:04,380
He wrote the book that almost
every university uses.
105
00:07:04,424 --> 00:07:06,948
I used to say it's the
best-selling AI textbook.
106
00:07:06,991 --> 00:07:10,255
Now I just say "It's the PDF
that's stolen most often."
107
00:07:13,694 --> 00:07:17,306
Artificial intelligence is
about making computers smart,
108
00:07:17,349 --> 00:07:19,830
and from the point
of view of the public,
109
00:07:19,874 --> 00:07:21,484
what counts as AI
is just something
110
00:07:21,528 --> 00:07:23,268
that's surprisingly intelligent
111
00:07:23,312 --> 00:07:25,488
compared to what
we thought computers
112
00:07:25,532 --> 00:07:28,004
would typically be able to do.
113
00:07:28,448 --> 00:07:33,801
AI is a field of research
to try to basically simulate
114
00:07:33,844 --> 00:07:36,717
all kinds of human capabilities.
115
00:07:36,760 --> 00:07:38,719
We're in the AI era.
116
00:07:38,762 --> 00:07:40,503
Silicon Valley
has the ability to focus
117
00:07:40,547 --> 00:07:42,462
on one bright, shiny thing.
118
00:07:42,505 --> 00:07:45,508
It was social networking and
social media over the last decade,
119
00:07:45,552 --> 00:07:48,119
and it's pretty clear
that the bit has flipped.
120
00:07:48,163 --> 00:07:50,557
And it starts
with machine learning.
121
00:07:50,600 --> 00:07:54,343
When we look back at this moment,
what was the first AI?
122
00:07:54,386 --> 00:07:57,389
It's not sexy, and it isn't the thing
we could see at the movies,
123
00:07:57,433 --> 00:08:00,741
but you'd make a great case
that Google created,
124
00:08:00,784 --> 00:08:03,395
not a search engine,
but a godhead.
125
00:08:03,439 --> 00:08:06,486
A way for people to ask
any question they wanted
126
00:08:06,529 --> 00:08:08,270
and get the answer they needed.
127
00:08:08,313 --> 00:08:11,273
Most people are not
aware that what Google is doing
128
00:08:11,316 --> 00:08:13,710
is actually a form of
artificial intelligence.
129
00:08:13,754 --> 00:08:16,234
They just go there,
they type in a thing.
130
00:08:16,278 --> 00:08:18,323
Google gives them the answer.
131
00:08:18,367 --> 00:08:21,444
With each search,
we train it to be better.
132
00:08:21,445 --> 00:08:24,108
Sometimes we're typing a search,
and it tells us the answer
133
00:08:24,109 --> 00:08:27,434
before we've finished
asking the question.
134
00:08:27,463 --> 00:08:29,944
You know, who is the president
of Kazakhstan?
135
00:08:29,987 --> 00:08:31,685
And it'll just tell you.
136
00:08:31,728 --> 00:08:34,818
You don't have to go to the Kazakhstan
national website to find out.
137
00:08:34,862 --> 00:08:37,081
You didn't use to be able to do that.
138
00:08:37,125 --> 00:08:39,475
That is artificial intelligence.
139
00:08:39,519 --> 00:08:42,783
Years from now when we try
to understand, we will say,
140
00:08:42,826 --> 00:08:44,567
"How did we miss it?"
141
00:08:44,611 --> 00:08:48,484
It's one of the striking contradictions
that we're facing.
142
00:08:48,528 --> 00:08:52,053
Google and Facebook, et al.,
have built businesses on giving us,
143
00:08:52,096 --> 00:08:54,185
as a society, free stuff.
144
00:08:54,229 --> 00:08:56,013
But it's a Faustian bargain.
145
00:08:56,057 --> 00:09:00,017
They're extracting something
from us in exchange,
146
00:09:00,061 --> 00:09:01,628
but we don't know
147
00:09:01,671 --> 00:09:03,760
what code is running
on the other side and why.
148
00:09:03,804 --> 00:09:05,846
We have no idea.
149
00:09:06,589 --> 00:09:08,591
It does strike
right at the issue
150
00:09:08,635 --> 00:09:11,028
of how much we should
trust these machines.
151
00:09:14,162 --> 00:09:18,166
I use computers
literally for everything.
152
00:09:18,209 --> 00:09:21,386
There are so many
computer advancements now,
153
00:09:21,430 --> 00:09:23,824
and it's become such
a big part of our lives.
154
00:09:23,867 --> 00:09:26,174
It's just incredible
what a computer can do.
155
00:09:26,217 --> 00:09:29,090
You can actually carry
a computer in your purse.
156
00:09:29,133 --> 00:09:31,571
I mean, how awesome is that?
157
00:09:31,614 --> 00:09:35,052
I think most technology is meant
to make things easier
158
00:09:35,096 --> 00:09:37,315
and simpler for all of us,
159
00:09:37,359 --> 00:09:40,362
so hopefully that just
remains the focus.
160
00:09:40,405 --> 00:09:43,147
I think everybody loves
their computers.
161
00:09:51,721 --> 00:09:53,810
People don't realize
they are constantly
162
00:09:53,854 --> 00:09:59,076
being negotiated with
by machines;
163
00:09:59,120 --> 00:10:02,993
whether that's the price
of products in your Amazon cart,
164
00:10:03,037 --> 00:10:05,517
whether you can get
on a particular flight,
165
00:10:05,561 --> 00:10:08,912
whether you can reserve
a room at a particular hotel.
166
00:10:08,956 --> 00:10:11,959
What you're experiencing
are machine-learning algorithms
167
00:10:12,002 --> 00:10:14,265
that have determined
that a person like you
168
00:10:14,309 --> 00:10:17,791
is willing to pay 2 cents more
and are changing the price.
169
00:10:21,795 --> 00:10:24,014
Now, a computer looks
at millions of people
170
00:10:24,058 --> 00:10:28,105
simultaneously for
very subtle patterns.
171
00:10:28,149 --> 00:10:31,369
You can take seemingly
innocent digital footprints,
172
00:10:31,413 --> 00:10:34,677
such as someone's playlist
on Spotify,
173
00:10:34,721 --> 00:10:37,201
or stuff that they
bought on Amazon,
174
00:10:37,245 --> 00:10:40,291
and then use algorithms
to translate this
175
00:10:40,335 --> 00:10:44,513
into a very detailed and a
very accurate, intimate profile.
176
00:10:47,603 --> 00:10:50,911
There is a dossier on
each of us that is so extensive
177
00:10:50,954 --> 00:10:52,695
it would probably be
accurate to say
178
00:10:52,739 --> 00:10:55,698
that they know more about you
than your mother does.
179
00:11:04,098 --> 00:11:06,883
The major cause
of the recent AI breakthrough
180
00:11:06,927 --> 00:11:08,580
isn't just that some dude
181
00:11:08,624 --> 00:11:11,583
had a brilliant insight
all of a sudden,
182
00:11:11,627 --> 00:11:14,325
but simply that we have
much bigger data
183
00:11:14,369 --> 00:11:18,242
to train them on
and vastly better computers.
184
00:11:18,286 --> 00:11:19,940
The magic is in the data.
185
00:11:19,983 --> 00:11:21,463
It's a ton of data.
186
00:11:21,506 --> 00:11:23,726
I mean, it's data
that's never existed before.
187
00:11:23,770 --> 00:11:26,686
We've never had
this data before.
188
00:11:26,729 --> 00:11:30,733
We've created technologies
that allow us to capture
189
00:11:30,777 --> 00:11:33,040
vast amounts of information.
190
00:11:33,083 --> 00:11:35,738
If you think of a billion
cellphones on the planet
191
00:11:35,782 --> 00:11:38,393
with gyroscopes
and accelerometers
192
00:11:38,436 --> 00:11:39,786
and fingerprint readers...
193
00:11:39,829 --> 00:11:42,005
couple that with the GPS
and the photos they take
194
00:11:42,049 --> 00:11:43,964
and the tweets that you send,
195
00:11:44,007 --> 00:11:47,750
we're all giving off huge
amounts of data individually.
196
00:11:47,794 --> 00:11:50,274
Cars that drive as the cameras
on them suck up information
197
00:11:50,318 --> 00:11:52,059
about the world around them.
198
00:11:52,102 --> 00:11:54,844
The toaster-sized satellites
that are now in orbit.
199
00:11:54,888 --> 00:11:57,629
The infrared imagery of
the vegetation on the planet.
200
00:11:57,673 --> 00:12:01,024
The buoys that are out in the oceans
to feed into the climate models.
201
00:12:05,072 --> 00:12:08,902
And the NSA, the CIA,
as they collect information
202
00:12:08,945 --> 00:12:12,644
about the
geopolitical situations.
203
00:12:12,688 --> 00:12:15,604
The world today is literally
swimming in this data.
204
00:12:20,609 --> 00:12:22,480
Back in 2012,
205
00:12:22,524 --> 00:12:25,875
IBM estimated
that an average human being
206
00:12:25,919 --> 00:12:31,098
leaves 500 megabytes
of digital footprints every day.
207
00:12:31,141 --> 00:12:34,841
If you wanted to back up
one day's worth of the data
208
00:12:34,884 --> 00:12:36,494
that humanity produces
209
00:12:36,538 --> 00:12:39,062
and print it out
on letter-sized paper,
210
00:12:39,106 --> 00:12:43,806
double-sided, font size 12,
and you stack it up,
211
00:12:43,850 --> 00:12:46,113
it would reach from
the surface of the Earth
212
00:12:46,156 --> 00:12:49,116
to the sun four times over.
213
00:12:49,159 --> 00:12:51,292
That's every day.
214
00:12:51,335 --> 00:12:53,816
The data itself
is not good or evil.
215
00:12:53,860 --> 00:12:55,470
It's how it's used.
216
00:12:55,513 --> 00:12:58,342
We're relying, really,
on the goodwill of these people
217
00:12:58,386 --> 00:13:01,171
and on the policies
of these companies.
218
00:13:01,215 --> 00:13:03,870
There is no legal requirement
for how they can
219
00:13:03,913 --> 00:13:06,307
and should use
that kind of data.
220
00:13:06,350 --> 00:13:09,266
That, to me, is at the heart
of the trust issue.
221
00:13:11,007 --> 00:13:13,793
Right now there's a
giant race for creating machines
222
00:13:13,836 --> 00:13:15,751
that are as smart as humans.
223
00:13:15,795 --> 00:13:18,071
Google -- They're working on
what's really kind of the
224
00:13:18,072 --> 00:13:20,074
Manhattan Project
of artificial intelligence.
225
00:13:20,075 --> 00:13:22,686
They've got the most money.
They've got the most talent.
226
00:13:22,714 --> 00:13:27,067
They're buying up AI companies
and robotics companies.
227
00:13:27,110 --> 00:13:29,069
People still think
of Google as a search engine
228
00:13:29,112 --> 00:13:30,722
and their e-mail provider
229
00:13:30,766 --> 00:13:33,943
and a lot of other things
that we use on a daily basis,
230
00:13:33,987 --> 00:13:39,383
but behind that search box
are 10 million servers.
231
00:13:39,427 --> 00:13:43,910
That makes Google the most powerful
computing platform in the world.
232
00:13:43,953 --> 00:13:47,217
Google is now working
on an AI computing platform
233
00:13:47,261 --> 00:13:50,133
that will have
100 million servers.
234
00:13:52,179 --> 00:13:53,963
So when you're interacting
with Google,
235
00:13:54,007 --> 00:13:56,052
you're just seeing
the toenail of something
236
00:13:56,096 --> 00:13:58,881
that is a giant beast
in the making.
237
00:13:58,925 --> 00:14:00,622
And the truth is,
I'm not even sure
238
00:14:00,665 --> 00:14:02,798
that Google knows
what it's becoming.
239
00:14:11,546 --> 00:14:15,811
If you look inside of what algorithms
are being used at Google,
240
00:14:15,855 --> 00:14:20,076
it's technology
largely from the '80s.
241
00:14:20,120 --> 00:14:23,863
So these are models that you
train by showing them a 1, a 2,
242
00:14:23,906 --> 00:14:27,344
and a 3, and it learns not
what a 1 is or what a 2 is --
243
00:14:27,388 --> 00:14:30,434
It learns what the difference
between a 1 and a 2 is.
244
00:14:30,478 --> 00:14:32,436
It's just a computation.
245
00:14:32,480 --> 00:14:35,396
In the last half decade,
when we've made this rapid progress,
246
00:14:35,439 --> 00:14:38,268
it has all been
in pattern recognition.
247
00:14:38,312 --> 00:14:41,184
Most of the good, old-fashioned AI
248
00:14:41,228 --> 00:14:44,057
was when we would tell our computers
249
00:14:44,100 --> 00:14:46,798
how to play a game like chess...
250
00:14:46,842 --> 00:14:49,584
from the old paradigm where
you just tell the computer
251
00:14:49,627 --> 00:14:51,895
exactly what to do.
252
00:14:54,502 --> 00:14:57,505
This is "Jeopardy!"
253
00:14:59,420 --> 00:15:02,510
"The IBM Challenge"!
254
00:15:02,553 --> 00:15:05,730
No one at the time
had thought that a machine
255
00:15:05,774 --> 00:15:08,298
could have the precision
and the confidence
256
00:15:08,342 --> 00:15:11,475
and the speed to play "Jeopardy!"
well enough against the best humans.
257
00:15:11,519 --> 00:15:14,609
Let's play "Jeopardy!"
258
00:15:18,569 --> 00:15:20,354
What is "shoe"?
259
00:15:20,397 --> 00:15:21,877
You are right.
You get to pick.
260
00:15:21,921 --> 00:15:24,836
Literary Character APB
for $800.
261
00:15:24,880 --> 00:15:28,014
Answer --
the Daily Double.
262
00:15:28,057 --> 00:15:31,539
Watson actually got its
knowledge by reading Wikipedia
263
00:15:31,582 --> 00:15:34,672
and 200 million pages
of natural-language documents.
264
00:15:34,716 --> 00:15:36,674
You can't program every line
265
00:15:36,718 --> 00:15:38,502
of how the world works.
266
00:15:38,546 --> 00:15:40,722
The machine has to learn
by reading.
267
00:15:40,765 --> 00:15:42,202
Now we come to Watson.
268
00:15:42,245 --> 00:15:43,986
"Who is Bram Stoker?"
269
00:15:44,030 --> 00:15:45,988
And the wager?
270
00:15:46,032 --> 00:15:49,165
Hello! $17,973.
271
00:15:49,209 --> 00:15:50,993
$41,413.
272
00:15:51,037 --> 00:15:53,343
And a two-day total
of $77--
273
00:15:53,387 --> 00:15:56,694
Watson's trained
on huge amounts of text,
274
00:15:56,738 --> 00:15:59,628
but it's not like it
understands what it's saying.
275
00:15:59,671 --> 00:16:02,309
It doesn't know that water makes
things wet from touching water
276
00:16:02,352 --> 00:16:04,441
and from seeing the way
things behave in the world
277
00:16:04,485 --> 00:16:06,182
the way you and I do.
278
00:16:06,226 --> 00:16:10,143
A lot of language AI today
is not building logical models
279
00:16:10,186 --> 00:16:11,622
of how the world works.
280
00:16:11,666 --> 00:16:15,365
Rather, it's looking at
how the words appear
281
00:16:15,409 --> 00:16:18,238
in the context of other words.
282
00:16:18,281 --> 00:16:20,196
David Ferrucci
developed IBM's Watson,
283
00:16:20,240 --> 00:16:23,547
and somebody asked him,
"Does Watson think?"
284
00:16:23,591 --> 00:16:26,660
And he said,
"Does a submarine swim?"
285
00:16:26,903 --> 00:16:29,331
And what he meant was,
when they developed submarines,
286
00:16:29,332 --> 00:16:32,949
they borrowed basic principles
of swimming from fish.
287
00:16:33,035 --> 00:16:36,525
But a submarine swims farther and faster
than fish and can carry a huge payload.
288
00:16:36,569 --> 00:16:39,411
It out-swims fish.
289
00:16:39,955 --> 00:16:43,741
Watson winning the game of "Jeopardy!"
will go down in the history of AI
290
00:16:43,785 --> 00:16:46,370
as a significant milestone.
291
00:16:46,614 --> 00:16:49,269
We tend to be amazed
when the machine does so well.
292
00:16:49,312 --> 00:16:52,663
I'm even more amazed when the
computer beats humans at things
293
00:16:52,707 --> 00:16:55,188
that humans are
naturally good at.
294
00:16:55,231 --> 00:16:58,060
This is how we make progress.
295
00:16:58,104 --> 00:17:00,671
In the early days of
the Google Brain project,
296
00:17:00,715 --> 00:17:02,804
I gave the team a very
simple instruction,
297
00:17:02,847 --> 00:17:05,807
which was, "Build the biggest
neural network possible,
298
00:17:05,850 --> 00:17:08,157
like 1,000 computers."
299
00:17:08,201 --> 00:17:12,161
A neural net is something very close
to a simulation of how the brain works.
300
00:17:12,205 --> 00:17:16,818
It's very probabilistic,
but with contextual relevance.
301
00:17:16,819 --> 00:17:18,456
In your brain,
you have long neurons
302
00:17:18,457 --> 00:17:20,372
that connect to thousands
of other neurons,
303
00:17:20,373 --> 00:17:22,592
and you have these pathways
that are formed and forged
304
00:17:22,593 --> 00:17:24,769
based on what
the brain needs to do.
305
00:17:24,782 --> 00:17:28,960
When a baby tries something and
it succeeds, there's a reward,
306
00:17:29,004 --> 00:17:32,312
and that pathway that created
the success is strengthened.
307
00:17:32,355 --> 00:17:34,662
If it fails at something,
the pathway is weakened,
308
00:17:34,705 --> 00:17:36,794
and so, over time,
the brain becomes honed
309
00:17:36,838 --> 00:17:40,120
to be good at
the environment around it.
310
00:17:40,363 --> 00:17:43,279
Really, it's just getting
machines to learn by themselves.
311
00:17:43,323 --> 00:17:45,538
This is called "deep learning,"
and "deep learning"
312
00:17:45,539 --> 00:17:48,834
and "neural networks"
mean roughly the same thing.
313
00:17:48,835 --> 00:17:52,391
Deep learning
is a totally different approach
314
00:17:52,419 --> 00:17:55,161
where the computer learns
more like a toddler,
315
00:17:55,204 --> 00:17:56,466
by just getting a lot of data
316
00:17:56,510 --> 00:18:00,340
and eventually
figuring stuff out.
317
00:18:00,383 --> 00:18:03,125
The computer just gets
smarter and smarter
318
00:18:03,169 --> 00:18:05,997
as it has more experiences.
319
00:18:06,041 --> 00:18:09,697
Imagine, if you will, a neural network,
you know, like 1,000 computers.
320
00:18:09,740 --> 00:18:11,438
And it wakes up
not knowing anything.
321
00:18:11,481 --> 00:18:14,093
And we made it watch YouTube
for a week.
322
00:18:25,408 --> 00:18:28,194
Charlie!
That really hurt!
323
00:18:36,245 --> 00:18:38,508
And so, after watching
YouTube for a week,
324
00:18:38,552 --> 00:18:39,988
what would it learn?
325
00:18:40,031 --> 00:18:42,103
We had a hypothesis that
it would learn to detect
326
00:18:42,146 --> 00:18:44,384
commonly occurring objects
in videos.
327
00:18:44,427 --> 00:18:47,517
And so, we know that human faces
appear a lot in videos,
328
00:18:47,561 --> 00:18:49,302
so we looked,
and, lo and behold,
329
00:18:49,345 --> 00:18:52,008
there was a neuron that had
learned to detect human faces.
330
00:18:52,052 --> 00:18:55,865
Leave Britney alone!
331
00:18:56,309 --> 00:18:58,354
Well, what else
appears in videos a lot?
332
00:19:00,095 --> 00:19:01,792
So, we looked,
and to our surprise,
333
00:19:01,836 --> 00:19:05,082
there was actually a neuron
that had learned to detect cats.
334
00:19:14,892 --> 00:19:17,068
I still remember that moment of recognition.
335
00:19:17,112 --> 00:19:20,071
"Wow, that's a cat. Okay, cool.
Great."
336
00:19:23,162 --> 00:19:26,295
It's all pretty innocuous when
you're thinking about the future.
337
00:19:26,339 --> 00:19:29,733
It all seems kind of
harmless and benign.
338
00:19:29,777 --> 00:19:33,520
But we're making cognitive architectures
that will fly farther and faster than us
339
00:19:33,563 --> 00:19:37,437
and carry a bigger payload,
and they won't be warm and fuzzy.
340
00:19:37,480 --> 00:19:41,702
I think that, in three to five years,
you will see a computer system
341
00:19:41,745 --> 00:19:45,401
that will be able
to autonomously learn
342
00:19:45,445 --> 00:19:49,013
how to understand,
how to build understanding,
343
00:19:49,057 --> 00:19:51,364
not unlike the way
the human mind works.
344
00:19:53,931 --> 00:19:56,891
Whatever that lunch was,
it was certainly delicious.
345
00:19:56,934 --> 00:19:59,807
Simply some of
Robby's synthetics.
346
00:19:59,850 --> 00:20:01,635
He's your cook, too?
347
00:20:01,678 --> 00:20:04,551
Even manufactures
the raw materials.
348
00:20:04,594 --> 00:20:06,944
Come around here, Robby.
349
00:20:06,988 --> 00:20:09,773
I'll show you
how this works.
350
00:20:11,122 --> 00:20:13,342
One introduces
a sample of human food
351
00:20:13,386 --> 00:20:15,344
through this aperture.
352
00:20:15,388 --> 00:20:17,738
Down here there's a small
built-in chemical laboratory,
353
00:20:17,781 --> 00:20:19,218
where he analyzes it.
354
00:20:19,261 --> 00:20:21,263
Later, he can reproduce
identical molecules
355
00:20:21,307 --> 00:20:22,482
in any shape or quantity.
356
00:20:22,525 --> 00:20:24,614
Why, it's a housewife's dream.
357
00:20:24,958 --> 00:20:26,834
Meet Baxter,
358
00:20:26,877 --> 00:20:30,490
a revolutionary new category of robots
with common sense.
359
00:20:30,533 --> 00:20:31,839
Baxter...
360
00:20:31,882 --> 00:20:33,449
Baxter is
a really good example
361
00:20:33,493 --> 00:20:36,887
of the kind of competition
we face from machines.
362
00:20:36,931 --> 00:20:42,676
Baxter can do almost anything
we can do with our hands.
363
00:20:42,719 --> 00:20:45,722
Baxter costs about
what a minimum-wage worker
364
00:20:45,766 --> 00:20:47,507
makes in a year.
365
00:20:47,550 --> 00:20:50,318
But Baxter won't be taking the place
of one minimum-wage worker --
366
00:20:50,319 --> 00:20:51,930
He'll be taking
the place of three,
367
00:20:51,931 --> 00:20:55,531
because he never gets tired,
he never takes breaks.
368
00:20:55,558 --> 00:20:57,865
That's probably the
first thing we're gonna see --
369
00:20:57,908 --> 00:20:59,475
displacement of jobs.
370
00:20:59,519 --> 00:21:04,088
They're gonna be done quicker,
faster, cheaper by machines.
371
00:21:04,132 --> 00:21:07,657
Our ability to even stay current
is so insanely limited
372
00:21:07,701 --> 00:21:10,138
compared to
the machines we build.
373
00:21:10,181 --> 00:21:13,446
For example, now we have this
great movement of Uber and Lyft
374
00:21:13,489 --> 00:21:16,505
kind of making transportation cheaper
and democratizing transportation,
375
00:21:16,506 --> 00:21:17,768
which is great.
376
00:21:17,769 --> 00:21:21,189
The next step is gonna be that they're
all gonna be replaced by driverless cars
377
00:21:21,192 --> 00:21:25,936
and then all the Uber and Lyft drivers
have to find something new to do.
378
00:21:25,980 --> 00:21:29,723
There are 4 million professional drivers
in the United States.
379
00:21:29,766 --> 00:21:31,638
They'll be unemployed soon.
380
00:21:31,681 --> 00:21:34,075
7 million people that do data entry.
381
00:21:34,118 --> 00:21:37,339
Those people are gonna be jobless.
382
00:21:37,383 --> 00:21:40,342
A job isn't just about money, right?
383
00:21:40,386 --> 00:21:42,605
On a biological level,
it serves a purpose.
384
00:21:42,649 --> 00:21:45,391
It becomes a defining thing.
385
00:21:45,434 --> 00:21:48,350
When the jobs go away
in any given civilization,
386
00:21:48,394 --> 00:21:50,987
it doesn't take long
until that turns into violence.
387
00:21:59,622 --> 00:22:02,016
We face a giant divide
between rich and poor,
388
00:22:02,059 --> 00:22:05,019
because that's what automation
and AI will provoke --
389
00:22:05,062 --> 00:22:08,588
a greater divide between
the haves and the have-nots.
390
00:22:08,631 --> 00:22:10,807
Right now, it's working its way
into the middle class,
391
00:22:10,851 --> 00:22:12,896
into white-collar jobs.
392
00:22:12,940 --> 00:22:15,334
IBM's Watson does
business analytics
393
00:22:15,377 --> 00:22:20,600
that we used to pay a business analyst
$300 an hour to do.
394
00:22:20,643 --> 00:22:23,037
Today, you're going
to college to be a doctor,
395
00:22:23,080 --> 00:22:25,082
to be an accountant,
to be a journalist.
396
00:22:25,126 --> 00:22:28,608
It's unclear that there's
gonna be jobs there for you.
397
00:22:28,651 --> 00:22:32,612
If someone's planning for
a 40-year career in radiology,
398
00:22:32,655 --> 00:22:34,222
just reading images,
399
00:22:34,265 --> 00:22:37,120
I think that could be a challenge
to the new graduates of today.
400
00:22:58,507 --> 00:23:02,729
The da Vinci robot is currently utilized
401
00:23:02,772 --> 00:23:07,516
by a variety of surgeons
for its accuracy and its ability
402
00:23:07,560 --> 00:23:12,303
to avoid the inevitable
fluctuations of the human hand.
403
00:23:23,402 --> 00:23:28,494
Anybody who watches this
feels the amazingness of it.
404
00:23:30,931 --> 00:23:34,674
You look through the scope,
and you're seeing the claw hand
405
00:23:34,717 --> 00:23:36,893
holding that woman's ovary.
406
00:23:36,937 --> 00:23:42,638
Humanity was resting right here
in the hands of this robot.
407
00:23:42,682 --> 00:23:46,947
People say it's the future,
but it's not the future --
408
00:23:46,990 --> 00:23:50,516
It's the present.
409
00:23:50,559 --> 00:23:52,474
If you think about
a surgical robot,
410
00:23:52,475 --> 00:23:54,894
there's often not a lot
of intelligence in these things,
411
00:23:54,895 --> 00:23:58,567
but over time, as we put more and more
intelligence into these systems,
412
00:23:58,611 --> 00:24:02,281
the surgical robots can actually
learn from each robot surgery.
413
00:24:02,284 --> 00:24:04,581
They're tracking the movements,
they're understanding
414
00:24:04,582 --> 00:24:06,423
what worked
and what didn't work.
415
00:24:06,424 --> 00:24:09,023
And eventually, the robot
for routine surgeries
416
00:24:09,024 --> 00:24:12,362
is going to be able to perform
them entirely by itself...
417
00:24:12,363 --> 00:24:14,056
or with human supervision.
418
00:24:35,038 --> 00:24:37,214
It seems that we're
feeding it and creating it,
419
00:24:37,258 --> 00:24:42,785
but, in a way, we are slaves
to the technology,
420
00:24:42,829 --> 00:24:45,701
because we can't go back.
421
00:24:50,053 --> 00:24:52,882
The machines are taking
bigger and bigger bites
422
00:24:52,926 --> 00:24:57,147
out of our skill set
at an ever-increasing speed.
423
00:24:57,191 --> 00:24:59,236
And so we've got to run
faster and faster
424
00:24:59,280 --> 00:25:00,890
to keep ahead of the machines.
425
00:25:02,675 --> 00:25:04,677
How do I look?
426
00:25:04,720 --> 00:25:06,374
Good.
427
00:25:10,030 --> 00:25:11,553
Are you attracted to me?
428
00:25:11,597 --> 00:25:14,251
What?
- Are you attracted to me?
429
00:25:14,295 --> 00:25:17,777
You give me indications
that you are.
430
00:25:17,820 --> 00:25:20,562
I do?
- Yes.
431
00:25:20,606 --> 00:25:22,608
This is the future we're headed into.
432
00:25:22,651 --> 00:25:26,046
We want to design
our companions.
433
00:25:26,089 --> 00:25:29,266
We're gonna want to see
a human face on AI.
434
00:25:29,310 --> 00:25:33,967
Therefore, gaming our emotions
will be depressingly easy.
435
00:25:34,010 --> 00:25:35,272
We're not that complicated.
436
00:25:35,316 --> 00:25:38,101
We're simple.
Stimulus-response.
437
00:25:38,145 --> 00:25:42,763
I can make you like me basically
by smiling at you a lot.
438
00:25:43,106 --> 00:25:45,974
AIs are gonna be fantastic
at manipulating us.
439
00:25:54,683 --> 00:25:56,946
So, you've developed a technology
440
00:25:56,990 --> 00:26:00,036
that can sense
what people are feeling.
441
00:26:00,080 --> 00:26:03,387
Right. We've developed technology
that can read your facial expressions
442
00:26:03,431 --> 00:26:06,521
and map that to a number
of emotional states.
443
00:26:06,565 --> 00:26:08,697
15 years ago,
I had just finished
444
00:26:08,741 --> 00:26:11,482
my undergraduate studies
in computer science,
445
00:26:11,526 --> 00:26:15,008
and it struck me that I was
spending a lot of time
446
00:26:15,051 --> 00:26:17,793
interacting with my laptops
and my devices,
447
00:26:17,837 --> 00:26:23,582
yet these devices had absolutely
no clue how I was feeling.
448
00:26:23,625 --> 00:26:26,802
I started thinking,
"What if this device could sense
449
00:26:26,846 --> 00:26:29,326
that I was stressed
or I was having a bad day?
450
00:26:29,370 --> 00:26:31,067
What would that open up?"
451
00:26:32,721 --> 00:26:34,418
Hi, first-graders!
452
00:26:34,462 --> 00:26:35,855
How are you?
453
00:26:35,898 --> 00:26:37,813
Can I get a hug?
454
00:26:37,857 --> 00:26:40,773
We had kids interact
with the technology.
455
00:26:40,816 --> 00:26:44,472
A lot of it is still in development,
but it was just amazing.
456
00:26:44,515 --> 00:26:46,648
Who likes robots?
- Me!
457
00:26:46,692 --> 00:26:48,911
Who wants to have a robot
in their house?
458
00:26:48,955 --> 00:26:51,479
What would you use
a robot for, Jack?
459
00:26:51,522 --> 00:26:56,353
I would use it to ask my mom
very hard math questions.
460
00:26:56,397 --> 00:26:58,181
Okay.
What about you, Theo?
461
00:26:58,225 --> 00:27:02,272
I would use it
for scaring people.
462
00:27:02,316 --> 00:27:04,666
All right.
So, start by smiling.
463
00:27:04,710 --> 00:27:06,625
Nice.
464
00:27:06,668 --> 00:27:09,018
Brow furrow.
465
00:27:09,062 --> 00:27:10,890
Nice one.
Eyebrow raise.
466
00:27:10,933 --> 00:27:15,068
This generation, technology is just
surrounding them all the time.
467
00:27:15,111 --> 00:27:17,853
It's almost like they expect
to have robots in their homes,
468
00:27:17,897 --> 00:27:22,336
and they expect these robots
to be socially intelligent.
469
00:27:22,379 --> 00:27:25,252
What makes robots smart?
470
00:27:25,295 --> 00:27:29,648
Put them in, like, a math
or biology class.
471
00:27:29,691 --> 00:27:32,259
I think you would
have to train it.
472
00:27:32,302 --> 00:27:35,218
All right.
Let's walk over here.
473
00:27:35,262 --> 00:27:37,394
So, if you smile and you
raise your eyebrows,
474
00:27:37,438 --> 00:27:39,005
it's gonna run over to you.
475
00:27:39,048 --> 00:27:40,833
It's coming over!
It's coming over! Look.
476
00:27:43,183 --> 00:27:45,272
But if you look angry,
it's gonna run away.
477
00:27:46,534 --> 00:27:48,797
-Awesome!
-Oh, that was good.
478
00:27:48,841 --> 00:27:52,366
We're training computers to read
and recognize emotions.
479
00:27:52,409 --> 00:27:53,846
Ready? Set? Go!
480
00:27:53,889 --> 00:27:57,414
And the response so far
has been really amazing.
481
00:27:57,458 --> 00:27:59,590
People are integrating this
into health apps,
482
00:27:59,634 --> 00:28:03,865
meditation apps, robots, cars.
483
00:28:04,508 --> 00:28:06,728
We're gonna see
how this unfolds.
484
00:28:09,470 --> 00:28:11,602
Robots can contain AI,
485
00:28:11,646 --> 00:28:14,388
but the robot is just
a physical instantiation,
486
00:28:14,431 --> 00:28:16,782
and the artificial intelligence
is the brain.
487
00:28:16,825 --> 00:28:19,872
And so brains can exist purely
in software-based systems.
488
00:28:19,915 --> 00:28:22,483
They don't need to have
a physical form.
489
00:28:22,526 --> 00:28:25,094
Robots can exist without
any artificial intelligence.
490
00:28:25,138 --> 00:28:28,097
We have a lot of
dumb robots out there.
491
00:28:28,141 --> 00:28:31,753
But a dumb robot can be
a smart robot overnight,
492
00:28:31,797 --> 00:28:34,103
given the right software,
given the right sensors.
493
00:28:34,147 --> 00:28:38,629
We can't help but impute
motive to inanimate objects.
494
00:28:38,673 --> 00:28:41,502
We do it with machines.
We'll treat them like children.
495
00:28:41,545 --> 00:28:43,330
We'll treat them like surrogates.
496
00:28:43,373 --> 00:28:45,027
Goodbye!
497
00:28:45,071 --> 00:28:48,204
And we'll pay the price.
498
00:29:08,616 --> 00:29:10,792
Okay, welcome to the ATR.
499
00:29:18,000 --> 00:29:20,800
My purpose is to have
a more human-like robot
500
00:29:20,801 --> 00:29:24,001
which has human-like
intentions and desires.
501
00:29:36,000 --> 00:29:38,400
The name of the robot is Erica.
502
00:29:39,501 --> 00:29:43,901
Erica is the most advanced
human-like robot in the world, I think.
503
00:29:44,202 --> 00:29:47,202
Erica can gaze at your face.
504
00:29:51,528 --> 00:29:52,791
Hello.
505
00:29:53,592 --> 00:29:57,092
Robots can be pretty good
as conversation partners,
506
00:29:57,093 --> 00:30:00,493
especially for the elderly,
younger children,
507
00:30:00,494 --> 00:30:02,494
and handicapped people.
508
00:30:03,094 --> 00:30:06,294
When we talk to the robot,
we don't fear the social barriers,
509
00:30:06,295 --> 00:30:08,095
social pressures.
510
00:30:08,396 --> 00:30:15,796
Eventually, everybody accepts the android
as just our friend or partner.
511
00:30:15,997 --> 00:30:18,997
We have implemented simple desires.
512
00:30:18,998 --> 00:30:22,798
She wants to be well recognized,
and she wants to go rest.
513
00:30:29,299 --> 00:30:32,299
If a robot can have
intentions and desires,
514
00:30:32,300 --> 00:30:36,300
the robot can understand other people's
intentions and desires.
515
00:30:44,300 --> 00:30:47,100
That is tied to
relationships with people
516
00:30:47,101 --> 00:30:49,201
and that means
they like each other.
517
00:30:50,302 --> 00:30:53,302
That means, well, I'm not sure,
to rub each other.
518
00:30:56,985 --> 00:30:58,682
We build artificial intelligence,
519
00:30:58,726 --> 00:31:01,948
and the very first thing
we want to do is replicate ourselves.
520
00:31:02,991 --> 00:31:05,341
I think the key point will come
521
00:31:05,385 --> 00:31:08,858
when all the major senses
are replicated.
522
00:31:09,302 --> 00:31:11,130
Sight...
523
00:31:11,173 --> 00:31:12,871
touch...
524
00:31:12,914 --> 00:31:14,611
smell.
525
00:31:14,655 --> 00:31:17,919
When we replicate our senses,
is that when it becomes alive?
526
00:31:27,624 --> 00:31:31,019
So many of our machines
are being built to understand us.
527
00:31:32,847 --> 00:31:35,005
But what happens when
an anthropomorphic creature
528
00:31:35,006 --> 00:31:37,474
discovers that they can
adjust their loyalty,
529
00:31:37,475 --> 00:31:40,043
adjust their courage,
adjust their avarice,
530
00:31:40,072 --> 00:31:42,291
adjust their cunning?
531
00:31:44,859 --> 00:31:48,645
The average person, they don't see
killer robots going down the streets.
532
00:31:48,689 --> 00:31:50,996
They're like,
"What are you talking about?"
533
00:31:51,039 --> 00:31:56,245
Man, we want to make sure that we don't
have killer robots going down the street.
534
00:31:57,089 --> 00:31:59,439
Once they're going down
the street, it is too late.
535
00:32:05,053 --> 00:32:08,578
The thing that worries me right now,
that keeps me awake,
536
00:32:08,622 --> 00:32:11,842
is the development
of autonomous weapons.
537
00:32:27,815 --> 00:32:32,733
Up to now, people have expressed
unease about drones,
538
00:32:32,776 --> 00:32:35,127
which are remotely
piloted aircraft.
539
00:32:39,827 --> 00:32:43,309
If you take a drone's camera
and feed it into the AI system,
540
00:32:43,352 --> 00:32:47,443
it's a very easy step from here
to fully autonomous weapons
541
00:32:47,487 --> 00:32:50,881
that choose their own targets
and release their own missiles.
542
00:33:12,729 --> 00:33:15,080
The expected life-span
of a human being
543
00:33:15,123 --> 00:33:19,420
in that kind of battle environment
would be measured in seconds.
544
00:33:20,563 --> 00:33:23,740
At one point,
drones were science fiction,
545
00:33:23,784 --> 00:33:28,832
and now they've become
the normal thing in war.
546
00:33:28,876 --> 00:33:33,402
There are over 10,000 in
the U.S. military inventory alone.
547
00:33:33,446 --> 00:33:35,274
But they're not
just a U.S. phenomenon.
548
00:33:35,317 --> 00:33:39,060
There are more than 80 countries
that operate them.
549
00:33:39,104 --> 00:33:41,932
It stands to reason
that people making some
550
00:33:41,976 --> 00:33:44,587
of the most important and
difficult decisions in the world
551
00:33:44,631 --> 00:33:46,328
are gonna start to use
and implement
552
00:33:46,372 --> 00:33:48,591
artificial intelligence.
553
00:33:50,767 --> 00:33:53,596
The Air Force just designed
a $400-billion jet program
554
00:33:53,640 --> 00:33:55,555
to put pilots in the sky,
555
00:33:55,598 --> 00:34:01,300
and a $500 AI, designed by
a couple of graduate students,
556
00:34:01,343 --> 00:34:03,432
is beating the best human pilots
557
00:34:03,476 --> 00:34:05,782
with a relatively
simple algorithm.
558
00:34:09,438 --> 00:34:13,399
AI will have as big an impact
on the military
559
00:34:13,442 --> 00:34:17,490
as the combustion engine
had at the turn of the century.
560
00:34:17,533 --> 00:34:21,233
It will literally touch everything
that the military does,
561
00:34:21,276 --> 00:34:25,324
from driverless convoys
delivering logistical supplies,
562
00:34:25,367 --> 00:34:27,021
to unmanned drones
563
00:34:27,065 --> 00:34:30,764
delivering medical aid,
to computational propaganda,
564
00:34:30,807 --> 00:34:34,246
trying to win the hearts
and minds of a population.
565
00:34:34,289 --> 00:34:38,337
And so it stands to reason
that whoever has the best AI
566
00:34:38,380 --> 00:34:41,688
will probably achieve
dominance on this planet.
567
00:34:45,561 --> 00:34:47,650
At some point in
the early 21st century,
568
00:34:47,694 --> 00:34:51,219
all of mankind was
united in celebration.
569
00:34:51,263 --> 00:34:53,830
We marveled
at our own magnificence
570
00:34:53,874 --> 00:34:56,833
as we gave birth to AI.
571
00:34:56,877 --> 00:34:58,966
AI?
572
00:34:59,009 --> 00:35:00,489
You mean
artificial intelligence?
573
00:35:00,533 --> 00:35:01,751
A singular consciousness
574
00:35:01,795 --> 00:35:05,886
that spawned
an entire race of machines.
575
00:35:05,929 --> 00:35:09,716
We don't know
who struck first -- us or them,
576
00:35:09,759 --> 00:35:12,980
but we know that it was us
that scorched the sky.
577
00:35:14,677 --> 00:35:16,766
There's a long history
of science fiction,
578
00:35:16,810 --> 00:35:19,987
not just predicting the future,
but shaping the future.
579
00:35:26,863 --> 00:35:30,389
Arthur Conan Doyle
writing before World War I
580
00:35:30,432 --> 00:35:34,393
on the danger of how
submarines might be used
581
00:35:34,436 --> 00:35:38,048
to carry out civilian blockades.
582
00:35:38,092 --> 00:35:40,399
At the time
he was writing this fiction,
583
00:35:40,442 --> 00:35:43,402
the Royal Navy made fun
of Arthur Conan Doyle
584
00:35:43,445 --> 00:35:45,230
for this absurd idea
585
00:35:45,273 --> 00:35:47,623
that submarines
could be useful in war.
586
00:35:53,455 --> 00:35:55,370
One of the things
we've seen in history
587
00:35:55,414 --> 00:35:58,243
is that our attitudes
towards technology,
588
00:35:58,286 --> 00:36:01,942
but also ethics,
are very context-dependent.
589
00:36:01,985 --> 00:36:03,726
For example, the submarine...
590
00:36:03,770 --> 00:36:06,468
nations like Great Britain
and even the United States
591
00:36:06,512 --> 00:36:09,863
found it horrifying
to use the submarine.
592
00:36:09,906 --> 00:36:13,214
In fact, the German use of the
submarine to carry out attacks
593
00:36:13,258 --> 00:36:18,480
was the reason why the United
States joined World War I.
594
00:36:18,524 --> 00:36:20,613
But move the timeline forward.
595
00:36:20,656 --> 00:36:23,529
The United States
of America was suddenly
596
00:36:23,572 --> 00:36:28,403
and deliberately attacked
by the Empire of Japan.
597
00:36:28,447 --> 00:36:32,190
Five hours after Pearl Harbor,
the order goes out
598
00:36:32,233 --> 00:36:36,498
to commit unrestricted
submarine warfare against Japan.
599
00:36:39,936 --> 00:36:43,589
So Arthur Conan Doyle
turned out to be right.
600
00:36:44,332 --> 00:36:46,856
That's the great old line
about science fiction --
601
00:36:46,900 --> 00:36:48,336
It's a lie that tells the truth.
602
00:36:48,380 --> 00:36:51,470
Fellow executives,
it gives me great pleasure
603
00:36:51,513 --> 00:36:54,821
to introduce you to the future
of law enforcement...
604
00:36:54,864 --> 00:36:56,562
ED-209.
605
00:37:03,656 --> 00:37:05,919
This isn't just a question
of science fiction.
606
00:37:05,962 --> 00:37:09,488
This is about what's next,
about what's happening right now.
607
00:37:13,970 --> 00:37:19,324
The role of intelligent systems is
growing very rapidly in warfare.
608
00:37:19,367 --> 00:37:22,152
Everyone is pushing
in the unmanned realm.
609
00:37:26,418 --> 00:37:28,898
Today, the Secretary of Defense
is very, very clear --
610
00:37:28,942 --> 00:37:32,337
We will not create fully
autonomous attacking vehicles.
611
00:37:32,380 --> 00:37:34,643
Not everyone
is gonna hold themselves
612
00:37:34,687 --> 00:37:36,515
to that same set of values.
613
00:37:36,558 --> 00:37:40,693
And when China and Russia start
deploying autonomous vehicles
614
00:37:40,736 --> 00:37:45,611
that can attack and kill, what's
the move that we're gonna make?
615
00:37:50,006 --> 00:37:51,617
You can't say,
"Well, we're gonna use
616
00:37:51,660 --> 00:37:53,967
autonomous weapons
for our military dominance,
617
00:37:54,010 --> 00:37:56,796
but no one else
is gonna use them."
618
00:37:56,839 --> 00:38:00,495
If you make these weapons,
they're gonna be used to attack
619
00:38:00,539 --> 00:38:03,324
human populations
in large numbers.
620
00:38:12,551 --> 00:38:14,596
Autonomous weapons are,
by their nature,
621
00:38:14,640 --> 00:38:16,468
weapons of mass destruction,
622
00:38:16,511 --> 00:38:19,862
because they don't need a
human being to guide or carry them.
623
00:38:19,906 --> 00:38:22,517
You only need one person
to, you know,
624
00:38:22,561 --> 00:38:25,781
write a little program.
625
00:38:25,825 --> 00:38:30,220
It just captures
the complexity of this field.
626
00:38:30,264 --> 00:38:32,571
It is cool.
It is important.
627
00:38:32,614 --> 00:38:34,573
It is amazing.
628
00:38:34,616 --> 00:38:37,053
It is also frightening.
629
00:38:37,097 --> 00:38:38,968
And it's all about trust.
630
00:38:42,102 --> 00:38:44,583
It's an open letter about
artificial intelligence,
631
00:38:44,626 --> 00:38:47,063
signed by some of
the biggest names in science.
632
00:38:47,107 --> 00:38:48,413
What do they want?
633
00:38:48,456 --> 00:38:50,763
Ban the use of
autonomous weapons.
634
00:38:50,806 --> 00:38:52,373
The author stated,
635
00:38:52,417 --> 00:38:54,375
"Autonomous weapons
have been described
636
00:38:54,419 --> 00:38:56,595
as the third revolution
in warfare."
637
00:38:56,638 --> 00:38:58,853
...thousand
artificial-intelligence specialists
638
00:38:58,855 --> 00:39:01,875
calling for a global ban
on killer robots.
639
00:39:01,876 --> 00:39:04,357
This open letter basically says
640
00:39:04,385 --> 00:39:07,954
that we should redefine the goal
of the field of artificial intelligence
641
00:39:07,997 --> 00:39:11,610
away from just creating pure,
undirected intelligence,
642
00:39:11,653 --> 00:39:13,655
towards creating
beneficial intelligence.
643
00:39:13,699 --> 00:39:16,092
The development of AI
is not going to stop.
644
00:39:16,136 --> 00:39:18,094
It is going to continue
and get better.
645
00:39:18,138 --> 00:39:19,835
If the international community
646
00:39:19,879 --> 00:39:21,968
isn't putting
certain controls on this,
647
00:39:22,011 --> 00:39:24,666
people will develop things
that can do anything.
648
00:39:24,710 --> 00:39:27,365
The letter says
that we are years, not decades,
649
00:39:27,408 --> 00:39:30,106
away from these weapons being deployed.
So first of all...
650
00:39:30,150 --> 00:39:32,413
We had 6,000 signatories
of that letter,
651
00:39:32,457 --> 00:39:35,155
including many of
the major figures in the field.
652
00:39:37,026 --> 00:39:39,942
I'm getting a lot of visits
from high-ranking officials
653
00:39:39,986 --> 00:39:42,989
who wish to emphasize that
American military dominance
654
00:39:43,032 --> 00:39:45,731
is very important,
and autonomous weapons
655
00:39:45,774 --> 00:39:50,083
may be part of
the Defense Department's plan.
656
00:39:50,126 --> 00:39:52,433
That's very, very scary,
because a value system
657
00:39:52,477 --> 00:39:54,479
of military developers
of technology
658
00:39:54,522 --> 00:39:57,307
is not the same as a value system
of the human race.
659
00:40:00,789 --> 00:40:02,922
Out of concern
about the possibility
660
00:40:02,965 --> 00:40:06,665
that this technology might be
a threat to human existence,
661
00:40:06,708 --> 00:40:08,144
a number of the technologists
662
00:40:08,188 --> 00:40:09,972
have funded
the Future of Life Institute
663
00:40:10,016 --> 00:40:12,192
to try to grapple
with these problems.
664
00:40:13,193 --> 00:40:14,847
All of these guys are secretive,
665
00:40:14,890 --> 00:40:16,805
and so it's interesting
to me to see them,
666
00:40:16,849 --> 00:40:19,735
you know, all together.
667
00:40:20,679 --> 00:40:24,030
Everything we have is a result
of our intelligence.
668
00:40:24,073 --> 00:40:26,641
It's not the result
of our big, scary teeth
669
00:40:26,685 --> 00:40:29,470
or our large claws
or our enormous muscles.
670
00:40:29,514 --> 00:40:32,473
It's because we're actually
relatively intelligent.
671
00:40:32,517 --> 00:40:35,520
And among my generation,
we're all having
672
00:40:35,563 --> 00:40:37,086
what we call "holy cow,"
673
00:40:37,130 --> 00:40:39,045
or "holy something else"
moments,
674
00:40:39,088 --> 00:40:41,003
because we see
that the technology
675
00:40:41,047 --> 00:40:44,180
is accelerating faster
than we expected.
676
00:40:44,224 --> 00:40:46,705
I remember sitting
around the table there
677
00:40:46,748 --> 00:40:50,099
with some of the best and
the smartest minds in the world,
678
00:40:50,143 --> 00:40:52,058
and what really
struck me was,
679
00:40:52,101 --> 00:40:56,149
maybe the human brain
is not able to fully grasp
680
00:40:56,192 --> 00:40:58,673
the complexity of the world
that we're confronted with.
681
00:40:58,717 --> 00:41:01,415
As it's currently constructed,
682
00:41:01,459 --> 00:41:04,766
the road that AI is following
heads off a cliff,
683
00:41:04,810 --> 00:41:07,595
and we need to change
the direction that we're going
684
00:41:07,639 --> 00:41:10,729
so that we don't take
the human race off the cliff.
685
00:41:13,558 --> 00:41:17,126
Google acquired DeepMind
several years ago.
686
00:41:17,170 --> 00:41:22,088
DeepMind operates as a
semi-independent subsidiary of Google.
687
00:41:22,131 --> 00:41:24,960
The thing that makes
DeepMind unique
688
00:41:25,004 --> 00:41:26,919
is that DeepMind
is absolutely focused
689
00:41:26,962 --> 00:41:30,313
on creating digital
superintelligence --
690
00:41:30,357 --> 00:41:34,056
an AI that is vastly smarter
than any human on Earth
691
00:41:34,100 --> 00:41:36,624
and ultimately smarter than
all humans on Earth combined.
692
00:41:36,668 --> 00:41:40,715
This is from the DeepMind
reinforcement learning system.
693
00:41:40,759 --> 00:41:43,544
It basically wakes up
like a newborn baby
694
00:41:43,588 --> 00:41:46,852
and is shown the screen
of an Atari video game
695
00:41:46,895 --> 00:41:50,508
and then has to learn
to play the video game.
696
00:41:50,551 --> 00:41:55,600
It knows nothing about objects,
about motion, about time.
697
00:41:57,602 --> 00:41:59,604
It only knows that there's
an image on the screen
698
00:41:59,647 --> 00:42:02,563
and there's a score.
699
00:42:02,607 --> 00:42:06,436
So, if your baby woke up
the day it was born
700
00:42:06,480 --> 00:42:08,090
and, by late afternoon,
701
00:42:08,134 --> 00:42:11,093
was playing
40 different Atari video games
702
00:42:11,137 --> 00:42:15,315
at a superhuman level,
you would be terrified.
703
00:42:15,358 --> 00:42:19,101
You would say,
"My baby is possessed. Send it back."
704
00:42:19,145 --> 00:42:23,584
The DeepMind system
can win at any game.
705
00:42:23,628 --> 00:42:27,588
It can already beat all
the original Atari games.
706
00:42:27,632 --> 00:42:29,155
It is superhuman.
707
00:42:29,198 --> 00:42:31,636
It plays the games at superspeed
in less than a minute.
708
00:42:37,076 --> 00:42:38,643
DeepMind turned
to another challenge,
709
00:42:38,686 --> 00:42:40,558
and the challenge
was the game of Go,
710
00:42:40,601 --> 00:42:42,603
which people
have generally argued
711
00:42:42,647 --> 00:42:45,084
has been beyond
the power of computers
712
00:42:45,127 --> 00:42:48,304
to compete with
the best human Go players.
713
00:42:48,348 --> 00:42:51,264
First, they challenged
a European Go champion.
714
00:42:53,222 --> 00:42:55,834
Then they challenged
a Korean Go champion.
715
00:42:55,877 --> 00:42:57,836
Please start the game.
716
00:42:57,879 --> 00:42:59,838
And they were able
to win both times
717
00:42:59,881 --> 00:43:02,797
in kind of striking fashion.
718
00:43:02,841 --> 00:43:05,017
You were reading articles
in The New York Times years ago
719
00:43:05,060 --> 00:43:09,761
talking about how Go would take
100 years for us to solve.
720
00:43:09,804 --> 00:43:13,460
People said, "Well, you know,
but that's still just a board game.
721
00:43:13,503 --> 00:43:15,027
Poker is an art.
722
00:43:15,070 --> 00:43:16,419
Poker involves reading people.
723
00:43:16,463 --> 00:43:18,073
Poker involves lying
and bluffing.
724
00:43:18,117 --> 00:43:19,553
It's not an exact thing.
725
00:43:19,597 --> 00:43:21,381
That will never be,
you know, a computer.
726
00:43:21,424 --> 00:43:22,861
You can't do that."
727
00:43:22,904 --> 00:43:24,932
They took the best
poker players in the world,
728
00:43:25,176 --> 00:43:30,520
and it took seven days for the computer
to start demolishing the humans.
729
00:43:30,564 --> 00:43:32,461
So it's the best poker player
in the world,
730
00:43:32,462 --> 00:43:35,012
it's the best Go player in the
world, and the pattern here
731
00:43:35,013 --> 00:43:37,454
is that AI might take a little while
732
00:43:37,484 --> 00:43:40,443
to wrap its tentacles
around a new skill,
733
00:43:40,487 --> 00:43:44,883
but when it does, when it
gets it, it is unstoppable.
734
00:43:52,020 --> 00:43:55,110
DeepMind's AI has
administrator-level access
735
00:43:55,154 --> 00:43:57,156
to Google's servers
736
00:43:57,199 --> 00:44:00,768
to optimize energy usage
at the data centers.
737
00:44:00,812 --> 00:44:04,816
However, this could be
an unintentional Trojan horse.
738
00:44:04,859 --> 00:44:07,253
DeepMind has to have complete
control of the data centers,
739
00:44:07,296 --> 00:44:08,950
so with a little
software update,
740
00:44:08,994 --> 00:44:10,691
that AI could take
complete control
741
00:44:10,735 --> 00:44:12,214
of the whole Google system,
742
00:44:12,258 --> 00:44:13,607
which means
they can do anything.
743
00:44:13,651 --> 00:44:16,131
They could look at all your data.
They could do anything.
744
00:44:20,135 --> 00:44:23,051
We're rapidly heading towards
digital superintelligence
745
00:44:23,095 --> 00:44:24,313
that far exceeds any human.
746
00:44:24,357 --> 00:44:26,402
I think it's very obvious.
747
00:44:26,446 --> 00:44:29,710
The problem is, we're not gonna
suddenly hit human-level intelligence
748
00:44:29,754 --> 00:44:33,105
and say,
"Okay, let's stop research."
749
00:44:33,148 --> 00:44:35,015
It's gonna go beyond
human-level intelligence
750
00:44:35,016 --> 00:44:39,459
into what's called "superintelligence,"
and that's anything smarter than us.
751
00:44:39,502 --> 00:44:42,810
AI at the superhuman level,
if we succeed with that, will be
752
00:44:42,854 --> 00:44:46,553
by far the most powerful
invention we've ever made
753
00:44:46,596 --> 00:44:50,296
and the last invention
we ever have to make.
754
00:44:50,339 --> 00:44:53,168
And if we create AI
that's smarter than us,
755
00:44:53,212 --> 00:44:54,735
we have to be open
to the possibility
756
00:44:54,779 --> 00:44:57,520
that we might actually
lose control to them.
757
00:45:00,785 --> 00:45:02,612
Let's say
you give it some objective,
758
00:45:02,656 --> 00:45:04,745
like curing cancer,
and then you discover
759
00:45:04,789 --> 00:45:06,965
that the way
it chooses to go about that
760
00:45:07,008 --> 00:45:08,444
is actually in conflict
761
00:45:08,488 --> 00:45:11,705
with a lot of other things
you care about.
762
00:45:12,448 --> 00:45:16,496
AI doesn't have to be evil
to destroy humanity.
763
00:45:16,539 --> 00:45:20,674
If AI has a goal, and humanity
just happens to be in the way,
764
00:45:20,718 --> 00:45:22,894
it will destroy humanity
as a matter of course,
765
00:45:22,937 --> 00:45:25,113
without even thinking about it.
No hard feelings.
766
00:45:25,157 --> 00:45:27,072
It's just like
if we're building a road
767
00:45:27,115 --> 00:45:29,770
and an anthill happens
to be in the way...
768
00:45:29,814 --> 00:45:31,467
We don't hate ants.
769
00:45:31,511 --> 00:45:33,165
We're just building a road.
770
00:45:33,208 --> 00:45:34,857
And so goodbye, anthill.
771
00:45:37,996 --> 00:45:40,172
It's tempting
to dismiss these concerns,
772
00:45:40,215 --> 00:45:42,783
'cause it's, like,
something that might happen
773
00:45:42,827 --> 00:45:47,396
in a few decades or 100 years,
so why worry?
774
00:45:47,440 --> 00:45:50,704
But if you go back
to September 11, 1933,
775
00:45:50,748 --> 00:45:54,795
Ernest Rutherford, who was the most
well-known nuclear physicist of his time,
776
00:45:54,839 --> 00:45:58,668
said that the possibility of ever
extracting useful amounts of energy
777
00:45:58,712 --> 00:46:00,801
from the transmutation
of atoms, as he called it,
778
00:46:00,845 --> 00:46:03,151
was moonshine.
779
00:46:03,195 --> 00:46:06,502
The next morning, Leo Szilard,
who was a much younger physicist,
780
00:46:06,546 --> 00:46:09,984
read this and got really annoyed
and figured out
781
00:46:10,028 --> 00:46:11,943
how to make
a nuclear chain reaction
782
00:46:11,986 --> 00:46:13,379
just a few months later.
783
00:46:20,603 --> 00:46:23,693
We have spent more
than $2 billion
784
00:46:23,737 --> 00:46:27,523
on the greatest
scientific gamble in history.
785
00:46:27,567 --> 00:46:30,222
So when people say,
"Oh, this is so far off
786
00:46:30,265 --> 00:46:32,528
in the future, we don't have
to worry about it,"
787
00:46:32,572 --> 00:46:36,271
it might only be three, four
breakthroughs of that magnitude
788
00:46:36,315 --> 00:46:40,275
that will get us from here
to superintelligent machines.
789
00:46:40,319 --> 00:46:42,974
If it's gonna take
20 years to figure out
790
00:46:43,017 --> 00:46:45,237
how to keep AI beneficial,
791
00:46:45,280 --> 00:46:48,849
then we should start today,
not at the last second
792
00:46:48,893 --> 00:46:51,460
when some dudes
drinking Red Bull
793
00:46:51,504 --> 00:46:53,832
decide to flip the switch
and test the thing.
794
00:46:56,814 --> 00:46:58,859
We have five years.
795
00:46:58,903 --> 00:47:03,764
I think digital superintelligence
will happen in my lifetime.
796
00:47:03,908 --> 00:47:05,735
100%.
797
00:47:05,779 --> 00:47:07,215
When this happens,
798
00:47:07,259 --> 00:47:09,696
it will be surrounded
by a bunch of people
799
00:47:09,739 --> 00:47:13,091
who are really just excited
about the technology.
800
00:47:13,134 --> 00:47:15,571
They want to see it succeed,
but they're not anticipating
801
00:47:15,615 --> 00:47:16,964
that it can get out of control.
802
00:47:25,494 --> 00:47:28,584
Oh, my God, I trust
my computer so much.
803
00:47:28,628 --> 00:47:30,195
That's an amazing question.
804
00:47:30,238 --> 00:47:31,457
I don't trust
my computer.
805
00:47:31,500 --> 00:47:32,937
If it's on,
I take it off.
806
00:47:32,980 --> 00:47:34,242
Like, even when it's off,
807
00:47:34,286 --> 00:47:35,896
I still think it's on.
Like, you know?
808
00:47:35,897 --> 00:47:37,694
Like, you really cannot tru--
Like, the webcams,
809
00:47:37,695 --> 00:47:39,625
you don't know if, like,
someone might turn it...
810
00:47:39,639 --> 00:47:41,249
You don't know, like.
811
00:47:41,293 --> 00:47:42,903
I don't trust my computer.
812
00:47:42,947 --> 00:47:46,907
Like, in my phone,
every time they ask me
813
00:47:46,951 --> 00:47:49,475
"Can we send your
information to Apple?"
814
00:47:49,518 --> 00:47:50,998
every time, I...
815
00:47:51,042 --> 00:47:53,087
So, I don't trust my phone.
816
00:47:53,131 --> 00:47:56,743
Okay. So, part of it is,
yes, I do trust it,
817
00:47:56,786 --> 00:48:00,660
because it would be really
hard to get through the day
818
00:48:00,703 --> 00:48:04,011
in the way our world is
set up without computers.
819
00:48:10,975 --> 00:48:13,368
Trust is such a human experience.
820
00:48:21,289 --> 00:48:25,119
I have a patient coming in
with an intracranial aneurysm.
821
00:48:30,037 --> 00:48:31,691
They want to look
in my eyes and know
822
00:48:31,734 --> 00:48:34,955
that they can trust
this person with their life.
823
00:48:34,999 --> 00:48:39,129
I'm not horribly concerned
about anything.
824
00:48:39,138 --> 00:48:40,204
Good.
825
00:48:40,206 --> 00:48:42,920
Part of that is because I
have confidence in you.
826
00:48:50,753 --> 00:48:57,151
This procedure we're doing today,
20 years ago was essentially impossible.
827
00:48:57,195 --> 00:49:00,328
We just didn't have the
materials and the technologies.
828
00:49:22,698 --> 00:49:26,485
So, the coil is barely
in there right now.
829
00:49:26,528 --> 00:49:29,923
It's just a feather
holding it in.
830
00:49:29,967 --> 00:49:32,012
It's nervous time.
831
00:49:36,190 --> 00:49:40,673
We're just in purgatory,
intellectual, humanistic purgatory,
832
00:49:40,716 --> 00:49:43,632
and AI might know
exactly what to do here.
833
00:49:50,639 --> 00:49:52,554
We've got the coil
into the aneurysm.
834
00:49:52,598 --> 00:49:54,556
But it wasn't in so
tremendously well
835
00:49:54,600 --> 00:49:56,428
that I knew it would stay,
836
00:49:56,471 --> 00:50:01,041
so with a maybe 20% risk
of a very bad situation,
837
00:50:01,085 --> 00:50:04,436
I elected
to just bring her back.
838
00:50:04,479 --> 00:50:05,959
Because of my relationship
with her
839
00:50:06,003 --> 00:50:08,222
and knowing the difficulties
of coming in
840
00:50:08,266 --> 00:50:11,051
and having the procedure,
I consider things,
841
00:50:11,095 --> 00:50:14,272
when I should only consider
the safest possible route
842
00:50:14,315 --> 00:50:16,361
to achieve success.
843
00:50:16,404 --> 00:50:19,755
But I had to stand there for
10 minutes agonizing about it.
844
00:50:19,799 --> 00:50:21,757
The computer feels nothing.
845
00:50:21,801 --> 00:50:24,760
The computer just does
what it's supposed to do,
846
00:50:24,804 --> 00:50:26,284
better and better.
847
00:50:30,331 --> 00:50:32,551
I want to be AI in this case.
848
00:50:35,945 --> 00:50:38,861
But can AI be compassionate?
849
00:50:43,083 --> 00:50:47,827
I mean, it's everybody's
question about AI.
850
00:50:47,870 --> 00:50:51,961
We are the sole
embodiment of humanity,
851
00:50:52,005 --> 00:50:55,269
and it's a stretch for us
to accept that a machine
852
00:50:55,313 --> 00:50:58,794
can be compassionate
and loving in that way.
853
00:51:05,149 --> 00:51:07,281
Part of me doesn't believe in magic,
854
00:51:07,325 --> 00:51:09,805
but part of me has faith
that there is something
855
00:51:09,849 --> 00:51:11,546
beyond the sum of the parts,
856
00:51:11,590 --> 00:51:15,637
that there is at least a oneness
in our shared ancestry,
857
00:51:15,681 --> 00:51:19,738
our shared biology,
our shared history.
858
00:51:20,381 --> 00:51:23,210
Some connection there
beyond machine.
859
00:51:30,348 --> 00:51:32,567
So, then, you have
the other side of that:
860
00:51:32,611 --> 00:51:37,137
does the computer know it's conscious,
or can it be conscious, or does it care?
861
00:51:37,181 --> 00:51:40,009
Does it need to be conscious?
862
00:51:40,053 --> 00:51:42,011
Does it need to be aware?
863
00:51:52,892 --> 00:51:56,417
I do not think that a robot
could ever be conscious.
864
00:51:56,461 --> 00:51:58,376
Unless they programmed it that way.
865
00:51:58,419 --> 00:52:00,639
Conscious? No.
866
00:52:00,682 --> 00:52:03,163
No.
867
00:52:03,207 --> 00:52:06,035
I mean, I think a robot could be
programmed to be conscious.
868
00:52:06,079 --> 00:52:09,648
How are they programmed
to do everything else?
869
00:52:09,691 --> 00:52:12,390
Another big part
of artificial intelligence
870
00:52:12,433 --> 00:52:15,741
is to make them conscious
and make them feel.
871
00:52:22,443 --> 00:52:27,709
Back in 2005, we started trying to
build machines with self-awareness.
872
00:52:33,106 --> 00:52:37,284
This robot, to begin with,
didn't know what it was.
873
00:52:37,328 --> 00:52:40,244
All it knew was that it needed
to do something like walk.
874
00:52:44,117 --> 00:52:45,597
Through trial and error,
875
00:52:45,640 --> 00:52:49,731
it figured out how to walk
using its imagination,
876
00:52:49,775 --> 00:52:54,040
and then it walked away.
877
00:52:54,083 --> 00:52:56,390
And then we did
something very cruel.
878
00:52:56,434 --> 00:52:58,653
We chopped off a leg
and watched what happened.
879
00:53:03,049 --> 00:53:07,749
At the beginning, it didn't
quite know what had happened.
880
00:53:07,793 --> 00:53:13,233
But over a period of about
a day, it then began to limp.
881
00:53:13,277 --> 00:53:16,845
And then, a year ago,
we were training an AI system
882
00:53:16,889 --> 00:53:20,240
for a live demonstration.
883
00:53:20,284 --> 00:53:24,113
We wanted to show how we wave
all these objects in front of the camera
884
00:53:24,157 --> 00:53:27,334
and the AI could
recognize the objects.
885
00:53:27,378 --> 00:53:29,031
And so, we're preparing
this demo,
886
00:53:29,075 --> 00:53:31,251
and we had on a side screen
this ability
887
00:53:31,295 --> 00:53:36,778
to watch what certain
neurons were responding to.
888
00:53:36,822 --> 00:53:41,087
And suddenly we noticed that one
of the neurons was tracking faces.
889
00:53:41,130 --> 00:53:45,483
It was tracking our faces
as we were moving around.
890
00:53:45,526 --> 00:53:48,616
Now, the spooky thing about this
is that we never trained
891
00:53:48,660 --> 00:53:52,490
the system
to recognize human faces,
892
00:53:52,533 --> 00:53:55,710
and yet, somehow,
it learned to do that.
893
00:53:57,973 --> 00:53:59,784
Even though these robots
are very simple,
894
00:53:59,785 --> 00:54:02,658
we can see there's
something else going on there.
895
00:54:02,659 --> 00:54:05,867
It's not just programming.
896
00:54:05,894 --> 00:54:08,462
So, this is just the beginning.
897
00:54:10,377 --> 00:54:14,294
I often think about
that beach in Kitty Hawk.
898
00:54:14,338 --> 00:54:18,255
The 1903 flight
by Orville and Wilbur Wright.
899
00:54:21,214 --> 00:54:24,289
It was kind of a canvas plane,
and it's wood and iron,
900
00:54:24,291 --> 00:54:26,928
and it gets off the ground for,
what, a minute and 20 seconds,
901
00:54:26,929 --> 00:54:31,006
on this windy day
before touching back down again.
902
00:54:33,270 --> 00:54:37,143
And it was
just around 65 summers or so
903
00:54:37,186 --> 00:54:43,149
after that moment that you have
a 747 taking off from JFK...
904
00:54:50,099 --> 00:54:52,184
...where a major concern
of someone on the airplane
905
00:54:52,185 --> 00:54:55,380
might be whether
their salt-free diet meal
906
00:54:55,381 --> 00:54:56,917
is gonna be coming to them or not.
907
00:54:56,945 --> 00:55:01,385
We have a whole infrastructure,
with travel agents and tower control,
908
00:55:01,428 --> 00:55:03,778
and it's all casual,
and it's all part of the world.
909
00:55:07,086 --> 00:55:09,523
Right now, as far
as we've come with machines
910
00:55:09,567 --> 00:55:12,134
that think and solve problems,
we're at Kitty Hawk.
911
00:55:12,178 --> 00:55:13,745
We're in the wind.
912
00:55:13,788 --> 00:55:17,052
We have our tattered-canvas
planes up in the air.
913
00:55:20,926 --> 00:55:23,885
But what happens
in 65 summers or so?
914
00:55:23,929 --> 00:55:27,889
We will have machines
that are beyond human control.
915
00:55:27,933 --> 00:55:30,457
Should we worry about that?
916
00:55:32,633 --> 00:55:34,853
I'm not sure it's going to help.
917
00:55:40,337 --> 00:55:44,036
Nobody has any idea today
what it means for a robot
918
00:55:44,079 --> 00:55:46,430
to be conscious.
919
00:55:46,473 --> 00:55:48,649
There is no such thing.
920
00:55:48,693 --> 00:55:50,172
There are a lot of smart people,
921
00:55:50,216 --> 00:55:53,088
and I have a great deal
of respect for them,
922
00:55:53,132 --> 00:55:57,528
but the truth is,
machines are natural psychopaths.
923
00:55:57,571 --> 00:55:59,225
Fear came back into the market.
924
00:55:59,268 --> 00:56:01,706
Went down 800,
nearly 1,000, in a heartbeat.
925
00:56:01,749 --> 00:56:03,360
I mean,
it is classic capitulation.
926
00:56:03,403 --> 00:56:07,146
There are some people who are proposing
it was some kind of fat-finger error.
927
00:56:07,189 --> 00:56:09,583
Take the Flash Crash of 2010.
928
00:56:09,627 --> 00:56:13,413
In a matter of minutes,
$1 trillion in value
929
00:56:13,457 --> 00:56:15,415
was lost in the stock market.
930
00:56:15,459 --> 00:56:18,984
The Dow dropped nearly
1,000 points in a half-hour.
931
00:56:19,027 --> 00:56:22,553
So, what went wrong?
932
00:56:22,596 --> 00:56:26,644
By that point in time,
more than 60% of all the trades
933
00:56:26,687 --> 00:56:29,124
that took place
on the stock exchange
934
00:56:29,168 --> 00:56:32,693
were actually being
initiated by computers.
935
00:56:32,737 --> 00:56:35,783
Panic selling on the way down, and all
of a sudden it stopped on a dime.
936
00:56:35,827 --> 00:56:37,611
This is all happening
in real time, folks.
937
00:56:37,612 --> 00:56:39,883
The short story of what
happened in the Flash Crash
938
00:56:39,884 --> 00:56:42,513
is that algorithms
responded to algorithms,
939
00:56:42,514 --> 00:56:45,430
and it compounded upon itself
over and over and over again
940
00:56:45,431 --> 00:56:47,041
in a matter of minutes.
941
00:56:47,055 --> 00:56:50,972
At one point, the market
fell as if down a well.
942
00:56:51,016 --> 00:56:54,323
There is no regulatory body
that can adapt quickly enough
943
00:56:54,367 --> 00:56:57,979
to prevent potentially
disastrous consequences
944
00:56:58,023 --> 00:57:01,243
of AI operating
in our financial systems.
945
00:57:01,287 --> 00:57:03,898
They are so primed
for manipulation.
946
00:57:03,942 --> 00:57:05,639
Let's talk about the speed
with which
947
00:57:05,683 --> 00:57:08,076
we are watching
this market deteriorate.
948
00:57:08,120 --> 00:57:11,602
That's the type of AI-run-amuck
that scares people.
949
00:57:11,645 --> 00:57:13,560
When you give them a goal,
950
00:57:13,604 --> 00:57:17,225
they will relentlessly
pursue that goal.
951
00:57:17,869 --> 00:57:20,393
How many computer programs
are there like this?
952
00:57:20,437 --> 00:57:22,683
Nobody knows.
953
00:57:23,527 --> 00:57:27,444
One of the fascinating aspects
of AI in general
954
00:57:27,487 --> 00:57:31,970
is that no one really
understands how it works.
955
00:57:32,013 --> 00:57:36,975
Even the people who create AI
don't really fully understand it.
956
00:57:37,018 --> 00:57:41,675
Because it has millions of elements,
it becomes completely impossible
957
00:57:41,719 --> 00:57:45,113
for a human being
to understand what's going on.
958
00:57:52,556 --> 00:57:56,037
Microsoft had set up
this artificial intelligence
959
00:57:56,081 --> 00:57:59,127
called Tay on Twitter,
which was a chatbot.
960
00:58:00,912 --> 00:58:02,696
They started out in the morning,
961
00:58:02,740 --> 00:58:06,526
and Tay was starting to tweet
and learning from stuff
962
00:58:06,570 --> 00:58:10,835
that was being sent to him
by other people on Twitter.
963
00:58:10,878 --> 00:58:13,272
Because some people,
like trolls, attacked him,
964
00:58:13,315 --> 00:58:18,582
within 24 hours, the Microsoft bot
became a terrible person.
965
00:58:18,625 --> 00:58:21,367
They had to literally
pull Tay off the Net
966
00:58:21,410 --> 00:58:24,718
because he had turned
into a monster.
967
00:58:24,762 --> 00:58:30,550
A misanthropic, racist, horrible person
you'd never want to meet.
968
00:58:30,594 --> 00:58:32,857
And nobody had foreseen this.
969
00:58:35,337 --> 00:58:38,602
The whole idea of AI is that
we are not telling it exactly
970
00:58:38,645 --> 00:58:42,780
how to achieve a given
outcome or a goal.
971
00:58:42,823 --> 00:58:46,435
AI develops on its own.
972
00:58:46,479 --> 00:58:48,829
We're worried about
superintelligent AI,
973
00:58:48,873 --> 00:58:52,790
the master chess player
that will outmaneuver us,
974
00:58:52,833 --> 00:58:55,923
but AI won't have to
actually be that smart
975
00:58:55,967 --> 00:59:00,145
to have massively disruptive
effects on human civilization.
976
00:59:00,188 --> 00:59:01,886
We've seen over the last century
977
00:59:01,929 --> 00:59:05,150
it doesn't necessarily take
a genius to knock history off
978
00:59:05,193 --> 00:59:06,804
in a particular direction,
979
00:59:06,847 --> 00:59:09,589
and it won't take a genius AI
to do the same thing.
980
00:59:09,633 --> 00:59:13,158
Bogus election news stories
generated more engagement
981
00:59:13,201 --> 00:59:17,075
on Facebook
than top real stories.
982
00:59:17,118 --> 00:59:21,079
Facebook really is
the elephant in the room.
983
00:59:21,122 --> 00:59:23,777
AI running Facebook news feed --
984
00:59:23,821 --> 00:59:28,347
The task for AI
is keeping users engaged,
985
00:59:28,390 --> 00:59:29,827
but no one really understands
986
00:59:29,870 --> 00:59:34,832
exactly how this AI
is achieving this goal.
987
00:59:34,875 --> 00:59:38,792
Facebook is building an
elegant mirrored wall around us.
988
00:59:38,836 --> 00:59:41,665
A mirror that we can ask,
"Who's the fairest of them all?"
989
00:59:41,708 --> 00:59:45,016
and it will answer, "You, you,"
time and again
990
00:59:45,059 --> 00:59:48,193
and slowly begin
to warp our sense of reality,
991
00:59:48,236 --> 00:59:53,502
warp our sense of politics,
history, global events,
992
00:59:53,546 --> 00:59:57,028
until determining what's true
and what's not true
993
00:59:57,071 --> 00:59:58,943
is virtually impossible.
994
01:00:01,032 --> 01:00:03,861
The problem is that AI
doesn't understand that.
995
01:00:03,904 --> 01:00:08,039
AI just had a mission --
maximize user engagement,
996
01:00:08,082 --> 01:00:10,041
and it achieved that.
997
01:00:10,084 --> 01:00:13,653
Nearly 2 billion people
spend nearly one hour
998
01:00:13,697 --> 01:00:17,831
on average a day
basically interacting with AI
999
01:00:17,875 --> 01:00:21,530
that is shaping
their experience.
1000
01:00:21,574 --> 01:00:24,664
Even Facebook engineers,
they don't like fake news.
1001
01:00:24,708 --> 01:00:28,015
It's very bad business.
They want to get rid of fake news.
1002
01:00:28,059 --> 01:00:32,324
It's just very difficult to do because
how do you recognize news as fake
1003
01:00:32,367 --> 01:00:34,456
if you cannot read
all of that news personally?
1004
01:00:34,500 --> 01:00:39,418
There's so much
active misinformation
1005
01:00:39,461 --> 01:00:41,115
and it's packaged very well,
1006
01:00:41,159 --> 01:00:44,553
and it looks the same when
you see it on a Facebook page
1007
01:00:44,597 --> 01:00:47,426
or you turn on your television.
1008
01:00:47,469 --> 01:00:51,691
It's not terribly sophisticated,
but it is terribly powerful.
1009
01:00:51,735 --> 01:00:54,346
And what it means is
that your view of the world,
1010
01:00:54,389 --> 01:00:56,435
which, 20 years ago,
was determined,
1011
01:00:56,478 --> 01:01:00,004
if you watched the nightly news,
by three different networks,
1012
01:01:00,047 --> 01:01:02,528
the three anchors who endeavored
to get it right.
1013
01:01:02,529 --> 01:01:04,583
They might have had a little bias
one way or the other,
1014
01:01:04,584 --> 01:01:08,273
but, largely speaking, we could all
agree on an objective reality.
1015
01:01:08,316 --> 01:01:10,754
Well, that objectivity is gone,
1016
01:01:10,797 --> 01:01:13,757
and Facebook has
completely annihilated it.
1017
01:01:17,108 --> 01:01:20,807
If most of your understanding of how
the world works is derived from Facebook,
1018
01:01:20,851 --> 01:01:23,418
facilitated by algorithmic software
1019
01:01:23,462 --> 01:01:27,118
that tries to show you
the news you want to see,
1020
01:01:27,161 --> 01:01:28,815
that's a terribly
dangerous thing.
1021
01:01:28,859 --> 01:01:33,080
And the idea that we have not
only set that in motion,
1022
01:01:33,124 --> 01:01:37,258
but allowed bad-faith actors
access to that information...
1023
01:01:37,302 --> 01:01:39,565
I mean, this is a recipe
for disaster.
1024
01:01:43,177 --> 01:01:45,876
I think that there will definitely
be lots of bad actors
1025
01:01:45,919 --> 01:01:48,922
trying to manipulate the world with AI.
1026
01:01:48,966 --> 01:01:52,143
2016 was a perfect example
of an election
1027
01:01:52,186 --> 01:01:55,015
where there was lots of AI
producing lots of fake news
1028
01:01:55,059 --> 01:01:58,323
and distributing it
for a purpose, for a result.
1029
01:01:59,890 --> 01:02:02,283
Ladies and gentlemen,
honorable colleagues...
1030
01:02:02,327 --> 01:02:04,546
it's my privilege
to speak to you today
1031
01:02:04,590 --> 01:02:07,985
about the power of big data
and psychographics
1032
01:02:08,028 --> 01:02:09,682
in the electoral process
1033
01:02:09,726 --> 01:02:12,206
and, specifically,
to talk about the work
1034
01:02:12,250 --> 01:02:14,513
that we contributed
to Senator Cruz's
1035
01:02:14,556 --> 01:02:16,558
presidential primary campaign.
1036
01:02:16,602 --> 01:02:19,910
Cambridge Analytica
emerged quietly as a company
1037
01:02:19,953 --> 01:02:21,563
that, according to its own hype,
1038
01:02:21,607 --> 01:02:26,307
has the ability to use
this tremendous amount of data
1039
01:02:26,351 --> 01:02:30,137
in order
to effect societal change.
1040
01:02:30,181 --> 01:02:33,358
In 2016, they had
three major clients.
1041
01:02:33,401 --> 01:02:34,794
Ted Cruz was one of them.
1042
01:02:34,838 --> 01:02:37,884
It's easy to forget that,
only 18 months ago,
1043
01:02:37,928 --> 01:02:42,846
Senator Cruz was one of the less
popular candidates seeking nomination.
1044
01:02:42,889 --> 01:02:47,241
So, what was not possible maybe,
like, 10 or 15 years ago,
1045
01:02:47,285 --> 01:02:49,374
is that you can now send fake news
1046
01:02:49,417 --> 01:02:52,420
to exactly the people
that you want to send it to.
1047
01:02:52,464 --> 01:02:56,685
And then you could actually see
how he or she reacts on Facebook
1048
01:02:56,729 --> 01:02:58,905
and then adjust that information
1049
01:02:58,949 --> 01:03:01,778
according to the feedback
that you got.
1050
01:03:01,821 --> 01:03:03,257
So you can start developing
1051
01:03:03,301 --> 01:03:06,130
kind of a real-time management
of a population.
1052
01:03:06,173 --> 01:03:10,699
In this case, we've zoned in on
a group we've called "Persuasion."
1053
01:03:10,743 --> 01:03:13,746
These are people who are
definitely going to vote,
1054
01:03:13,790 --> 01:03:16,705
to caucus, but they need
moving from the center
1055
01:03:16,749 --> 01:03:18,490
a little bit more
towards the right
1056
01:03:18,533 --> 01:03:19,708
in order to support Cruz.
1057
01:03:19,752 --> 01:03:22,059
They need a persuasion message.
1058
01:03:22,102 --> 01:03:23,800
"Gun rights," I've selected.
1059
01:03:23,843 --> 01:03:25,802
That narrows the field
slightly more.
1060
01:03:25,845 --> 01:03:29,066
And now we know that we need
a message on gun rights,
1061
01:03:29,109 --> 01:03:31,111
it needs to be
a persuasion message,
1062
01:03:31,155 --> 01:03:34,201
and it needs to be nuanced
according to the certain personality
1063
01:03:34,245 --> 01:03:36,029
that we're interested in.
1064
01:03:36,073 --> 01:03:39,946
Through social media, there's an
infinite amount of information
1065
01:03:39,990 --> 01:03:42,514
that you can gather
about a person.
1066
01:03:42,557 --> 01:03:45,734
We have somewhere close
to 4,000 or 5,000 data points
1067
01:03:45,778 --> 01:03:48,563
on every adult
in the United States.
1068
01:03:48,607 --> 01:03:51,915
It's about targeting
the individual.
1069
01:03:51,958 --> 01:03:55,962
It's like a weapon, which can be used
in the totally wrong direction.
1070
01:03:56,006 --> 01:03:58,051
That's the problem
with all of this data.
1071
01:03:58,095 --> 01:04:02,229
It's almost as if we built the bullet
before we built the gun.
1072
01:04:02,273 --> 01:04:06,407
Ted Cruz employed our data,
our behavioral insights.
1073
01:04:06,451 --> 01:04:09,541
He started from a base
of less than 5%
1074
01:04:09,584 --> 01:04:15,590
and had a very slow-and-steady-
but-firm rise to above 35%,
1075
01:04:15,634 --> 01:04:17,157
making him, obviously,
1076
01:04:17,201 --> 01:04:20,465
the second most threatening
contender in the race.
1077
01:04:20,508 --> 01:04:23,120
Now, clearly, the Cruz campaign
is over now,
1078
01:04:23,163 --> 01:04:28,168
but what I can tell you is that of the
two candidates left in this election,
1079
01:04:28,212 --> 01:04:30,867
one of them is using
these technologies.
1080
01:04:32,564 --> 01:04:35,959
I, Donald John Trump,
do solemnly swear
1081
01:04:36,002 --> 01:04:38,222
that I will faithfully execute
1082
01:04:38,265 --> 01:04:42,226
the office of President
of the United States.
1083
01:04:48,275 --> 01:04:50,234
Elections are
a marginal exercise.
1084
01:04:50,277 --> 01:04:53,237
It doesn't take
a very sophisticated AI
1085
01:04:53,280 --> 01:04:57,719
in order to have
a disproportionate impact.
1086
01:04:57,763 --> 01:05:02,550
Before Trump, the Brexit campaign was
another supposed client.
1087
01:05:02,594 --> 01:05:04,726
Well, at 20 minutes to 5:00,
1088
01:05:04,770 --> 01:05:08,730
we can now say
the decision taken in 1975
1089
01:05:08,774 --> 01:05:10,950
by this country to join
the common market
1090
01:05:10,994 --> 01:05:15,999
has been reversed by
this referendum to leave the EU.
1091
01:05:16,042 --> 01:05:19,828
Cambridge Analytica
allegedly used AI
1092
01:05:19,872 --> 01:05:23,267
to push through two of
the most ground-shaking pieces
1093
01:05:23,310 --> 01:05:27,967
of political change
in the last 50 years.
1094
01:05:28,011 --> 01:05:30,709
These are epochal events,
and if we believe the hype,
1095
01:05:30,752 --> 01:05:33,755
they are connected directly
to a piece of software,
1096
01:05:33,799 --> 01:05:37,194
essentially, created
by a professor at Stanford.
1097
01:05:41,459 --> 01:05:45,593
Back in 2013, I described how
what they are doing is possible
1098
01:05:45,637 --> 01:05:49,293
and warned against this
happening in the future.
1099
01:05:49,336 --> 01:05:51,382
At the time, Michal Kosinski
1100
01:05:51,425 --> 01:05:54,994
was a young Polish researcher
working at the Psychometrics Centre.
1101
01:05:55,038 --> 01:06:00,217
So, what Michal had done was to
gather the largest-ever data set
1102
01:06:00,260 --> 01:06:03,481
of how people
behave on Facebook.
1103
01:06:03,524 --> 01:06:07,789
Psychometrics is trying
to measure psychological traits,
1104
01:06:07,833 --> 01:06:09,922
such as personality,
intelligence,
1105
01:06:09,966 --> 01:06:11,880
political views, and so on.
1106
01:06:11,924 --> 01:06:15,058
Now, traditionally,
those traits were measured
1107
01:06:15,101 --> 01:06:17,712
using tests and questions.
1108
01:06:17,756 --> 01:06:20,715
A personality test -- the most benign thing
you could possibly think of.
1109
01:06:20,759 --> 01:06:24,197
Something that doesn't necessarily
have a lot of utility, right?
1110
01:06:24,241 --> 01:06:27,331
Our idea was that
instead of tests and questions,
1111
01:06:27,374 --> 01:06:30,029
we could simply look at the
digital footprints of behaviors
1112
01:06:30,073 --> 01:06:32,553
that we are all leaving behind
1113
01:06:32,597 --> 01:06:34,903
to understand openness,
1114
01:06:34,947 --> 01:06:37,732
conscientiousness,
neuroticism.
1115
01:06:37,776 --> 01:06:39,560
You can easily buy
personal data,
1116
01:06:39,604 --> 01:06:43,129
such as where you live,
what club memberships you've tried,
1117
01:06:43,173 --> 01:06:45,044
which gym you go to.
1118
01:06:45,088 --> 01:06:47,873
There are actually marketplaces
for personal data.
1119
01:06:47,916 --> 01:06:51,442
It turns out, we can discover
an awful lot about what you're gonna do
1120
01:06:51,485 --> 01:06:55,750
based on a very, very tiny
set of information.
1121
01:06:55,794 --> 01:06:58,275
We are training
deep-learning networks
1122
01:06:58,318 --> 01:07:01,278
to infer intimate traits,
1123
01:07:01,321 --> 01:07:04,759
people's political views,
personality,
1124
01:07:04,803 --> 01:07:07,806
intelligence,
sexual orientation
1125
01:07:07,849 --> 01:07:10,504
just from an image
of someone's face.
1126
01:07:17,076 --> 01:07:20,645
Now think about countries which
are not so free and open-minded.
1127
01:07:20,688 --> 01:07:23,300
If you can reveal people's
religious views
1128
01:07:23,343 --> 01:07:25,954
or political views
or sexual orientation
1129
01:07:25,998 --> 01:07:28,740
based on only profile pictures,
1130
01:07:28,783 --> 01:07:33,310
this could be literally
an issue of life and death.
1131
01:07:37,009 --> 01:07:39,751
I think there's no going back.
1132
01:07:42,145 --> 01:07:44,321
Do you know what
the Turing test is?
1133
01:07:44,364 --> 01:07:48,977
It's when a human interacts
with a computer,
1134
01:07:49,021 --> 01:07:52,546
and if the human doesn't know they're
interacting with a computer,
1135
01:07:52,590 --> 01:07:54,126
the test is passed.
1136
01:07:54,170 --> 01:07:57,247
And over the next few days,
1137
01:07:57,290 --> 01:07:59,684
you're gonna be the human
component in a Turing test.
1138
01:07:59,727 --> 01:08:02,295
- Holy shit.
- That's right, Caleb.
1139
01:08:02,339 --> 01:08:04,080
You got it.
1140
01:08:04,123 --> 01:08:06,865
'Cause if that test is passed,
1141
01:08:06,908 --> 01:08:10,825
you are dead center
of the greatest scientific event
1142
01:08:10,869 --> 01:08:12,958
in the history of man.
1143
01:08:13,001 --> 01:08:17,615
If you've created a conscious machine,
it's not the history of man.
1144
01:08:17,658 --> 01:08:19,356
That's the history of gods.
1145
01:08:26,841 --> 01:08:29,975
It's almost like technology
is a god in and of itself.
1146
01:08:33,196 --> 01:08:35,241
Like the weather.
We can't impact it.
1147
01:08:35,285 --> 01:08:39,293
We can't slow it down.
We can't stop it.
1148
01:08:39,637 --> 01:08:42,849
We feel powerless.
1149
01:08:43,293 --> 01:08:46,687
If we think of God as an unlimited
amount of intelligence,
1150
01:08:46,731 --> 01:08:50,474
the closest we can get to that is
by evolving our own intelligence
1151
01:08:50,517 --> 01:08:55,566
by merging with the artificial
intelligence we're creating.
1152
01:08:55,609 --> 01:08:58,003
Today, our computers, phones,
1153
01:08:58,046 --> 01:09:01,615
applications give us
superhuman capability.
1154
01:09:01,659 --> 01:09:04,662
So, as the old maxim says,
if you can't beat 'em, join 'em.
1155
01:09:06,968 --> 01:09:09,971
It's about
a human-machine partnership.
1156
01:09:10,015 --> 01:09:11,669
I mean, we already see how, you know,
1157
01:09:11,712 --> 01:09:14,933
our phones, for example,
act as a memory prosthesis, right?
1158
01:09:14,976 --> 01:09:17,196
I don't have to remember
your phone number anymore
1159
01:09:17,240 --> 01:09:19,198
'cause it's on my phone.
1160
01:09:19,242 --> 01:09:22,070
It's about machines
augmenting our human abilities,
1161
01:09:22,114 --> 01:09:25,248
as opposed to, like,
completely displacing them.
1162
01:09:25,249 --> 01:09:27,538
If you look at all the objects
that have made the leap
1163
01:09:27,539 --> 01:09:30,237
from analog to digital
over the last 20 years...
1164
01:09:30,238 --> 01:09:32,123
it's a lot.
1165
01:09:32,124 --> 01:09:35,388
We're the last analog object
in a digital universe.
1166
01:09:35,389 --> 01:09:37,068
And the problem with that,
of course,
1167
01:09:37,069 --> 01:09:40,609
is that the data input/output
is very limited.
1168
01:09:40,611 --> 01:09:42,613
It's this.
It's these.
1169
01:09:42,856 --> 01:09:45,355
Our eyes are pretty good.
1170
01:09:45,398 --> 01:09:48,445
We're able to take in a lot
of visual information.
1171
01:09:48,488 --> 01:09:52,536
But our information output
is very, very, very low.
1172
01:09:52,579 --> 01:09:55,669
The reason this is important --
If we envision a scenario
1173
01:09:55,713 --> 01:09:59,543
where AI's playing a more
prominent role in societies,
1174
01:09:59,586 --> 01:10:02,023
we want good ways to interact
with this technology
1175
01:10:02,067 --> 01:10:04,983
so that it ends up
augmenting us.
1176
01:10:07,855 --> 01:10:12,295
I think it's incredibly important
that AI not be "other."
1177
01:10:12,338 --> 01:10:14,562
It must be us.
1178
01:10:14,906 --> 01:10:18,605
And I could be wrong
about what I'm saying.
1179
01:10:18,649 --> 01:10:23,915
I'm certainly open to ideas if anybody
can suggest a path that's better.
1180
01:10:23,958 --> 01:10:27,266
But I think we're gonna really
have to either merge with AI
1181
01:10:27,310 --> 01:10:29,063
or be left behind.
1182
01:10:36,406 --> 01:10:38,756
It's hard to kind of
think of unplugging a system
1183
01:10:38,799 --> 01:10:41,802
that's distributed
everywhere on the planet,
1184
01:10:41,846 --> 01:10:45,806
that's distributed now
across the solar system.
1185
01:10:45,850 --> 01:10:49,375
You can't just, you know,
shut that off.
1186
01:10:49,419 --> 01:10:51,290
We've opened Pandora's box.
1187
01:10:51,334 --> 01:10:55,642
We've unleashed forces that
we can't control, we can't stop.
1188
01:10:55,686 --> 01:10:59,516
We're in the midst of essentially
creating a new life-form on Earth.
1189
01:11:05,870 --> 01:11:07,611
We don't know what happens next.
1190
01:11:07,654 --> 01:11:10,353
We don't know what shape
the intellect of a machine
1191
01:11:10,396 --> 01:11:14,531
will be when that intellect is
far beyond human capabilities.
1192
01:11:14,574 --> 01:11:17,360
It's just not something
that's possible to know.
1193
01:11:24,758 --> 01:11:26,976
The least scary future
I can think of is one
1194
01:11:26,978 --> 01:11:29,633
where we have at least
democratized AI.
1195
01:11:31,548 --> 01:11:34,159
Because if one company
or small group of people
1196
01:11:34,202 --> 01:11:37,031
manages to develop godlike
digital superintelligence,
1197
01:11:37,075 --> 01:11:39,739
they can take over the world.
1198
01:11:40,383 --> 01:11:44,343
At least when there's an evil dictator,
that human is going to die,
1199
01:11:44,387 --> 01:11:46,998
but, for an AI,
there would be no death.
1200
01:11:47,041 --> 01:11:49,392
It would live forever.
1201
01:11:49,435 --> 01:11:53,670
And then you have an immortal dictator
from which we can never escape.
1202
01:13:33,000 --> 01:13:36,500
Correction and synchronisation:
Mazrim Taim