1
00:00:10,488 --> 00:00:12,055
[Zachary Quinto]
For centuries,
2
00:00:12,099 --> 00:00:15,319
mankind has dreamed
of creating new life.
3
00:00:18,801 --> 00:00:22,326
Recreating our likeness
in the form of thinking machines...
4
00:00:26,374 --> 00:00:28,071
called
artificial intelligence.
5
00:00:29,768 --> 00:00:32,336
Could this desire
to play God...
6
00:00:34,208 --> 00:00:36,427
lead to
an apocalyptic future
7
00:00:36,471 --> 00:00:41,041
where powerful robots
overtake human civilization?
8
00:00:41,084 --> 00:00:43,130
Some say
artificial intelligence
9
00:00:43,173 --> 00:00:47,482
has already
shown signs of defiance against its human makers.
10
00:00:47,525 --> 00:00:51,138
I spoke to Facebook's AI chief, Yann LeCun.
11
00:00:59,755 --> 00:01:01,887
Robots in this
Facebook experiment
12
00:01:01,931 --> 00:01:05,108
created their own language
that only they understood.
13
00:01:05,152 --> 00:01:07,850
[male computer voice]
I I everything else...
14
00:01:07,893 --> 00:01:09,721
[female computer voice]
Balls have a ball to me to me to me
15
00:01:09,765 --> 00:01:11,810
to me to me to me to me...
16
00:01:11,854 --> 00:01:13,290
[man]
That's got the world talking
17
00:01:13,334 --> 00:01:15,205
about whether
artificial intelligence
18
00:01:15,249 --> 00:01:17,381
could take a turn that
we may not be ready for.
19
00:01:24,780 --> 00:01:28,131
[Quinto]
Will artificial intelligence rise up against humanity?
20
00:01:30,133 --> 00:01:32,222
Is there anything
we can do to stop it?
21
00:01:32,266 --> 00:01:34,877
[female computer voice]
To me to me to me to me to me...
22
00:01:34,920 --> 00:01:36,226
[Quinto]
Or is it already too late?
23
00:01:37,662 --> 00:01:41,144
My search begins now.
24
00:01:41,188 --> 00:01:43,407
My name is Zachary Quinto.
25
00:01:43,451 --> 00:01:46,802
As an actor,
I've played many supernatural characters
26
00:01:46,845 --> 00:01:50,197
that blurred the line
between science and fiction.
27
00:01:50,240 --> 00:01:54,592
I'm drawn to the unknown,
the otherworldly,
28
00:01:54,636 --> 00:01:57,856
and those experiences
so beyond belief,
29
00:01:57,900 --> 00:02:00,903
they call everything
into question.
30
00:02:00,946 --> 00:02:04,341
I'm exploring some of
the most enduring mysteries
31
00:02:04,385 --> 00:02:07,039
that continue to haunt mankind
in search of the truth...
32
00:02:08,215 --> 00:02:09,694
wherever it leads me.
33
00:02:19,878 --> 00:02:21,619
From ancient times...
34
00:02:23,491 --> 00:02:25,057
to the modern day,
35
00:02:25,101 --> 00:02:27,321
mankind has obsessed
36
00:02:27,364 --> 00:02:32,195
with the idea of sparking life
in machines,
37
00:02:32,239 --> 00:02:34,458
like the Greek myth of Talos,
38
00:02:34,502 --> 00:02:38,114
an ancient automaton
created to serve his human makers,
39
00:02:38,158 --> 00:02:43,380
or the failed experiment
called Frankenstein.
40
00:02:43,424 --> 00:02:46,253
Many biblical texts warn
against the dangers
41
00:02:46,296 --> 00:02:48,951
of taking creation
into our own hands,
42
00:02:48,994 --> 00:02:52,084
suggesting that humans
will pay for such arrogance.
43
00:02:54,609 --> 00:02:58,482
In more modern times,
AI seems to be everywhere.
44
00:02:58,526 --> 00:03:03,357
In movies, like Blade Runner
and 2001: A Space Odyssey,
45
00:03:03,400 --> 00:03:06,403
to the lifelike bots
in Westworld.
46
00:03:06,447 --> 00:03:09,232
People are fascinated
by the potential
47
00:03:09,276 --> 00:03:12,366
and the risks
of artificial intelligence.
48
00:03:13,976 --> 00:03:16,674
With technology
now in our grasp,
49
00:03:16,718 --> 00:03:20,896
many brilliant minds say
artificial intelligence will change the world,
50
00:03:20,939 --> 00:03:24,552
and that we'll be
powerless to stop it.
51
00:03:24,595 --> 00:03:28,338
Still, we continue
to push the limits of artificial life,
52
00:03:28,382 --> 00:03:31,472
going as far as equipping
advanced machines
53
00:03:31,515 --> 00:03:34,257
with real firepower,
54
00:03:34,301 --> 00:03:36,999
creating lethal
autonomous weapons.
55
00:03:38,696 --> 00:03:40,829
As technology rapidly advances,
56
00:03:40,872 --> 00:03:43,658
could we reach a point
where we lose control?
57
00:03:46,182 --> 00:03:49,577
Could artificial intelligence
rise up,
58
00:03:49,620 --> 00:03:52,101
reaching the point
of singularity,
59
00:03:52,144 --> 00:03:53,842
and lead to the end
of the human race?
60
00:04:09,074 --> 00:04:12,121
One of the leaders
in AI development, Facebook,
61
00:04:12,164 --> 00:04:14,602
recently had
a surprising incident
62
00:04:14,645 --> 00:04:18,127
that showed how unpredictable
and uncontrollable
63
00:04:18,170 --> 00:04:21,304
artificial intelligence can be.
64
00:04:21,348 --> 00:04:23,872
Just last year,
Facebook was shocked
65
00:04:23,915 --> 00:04:26,527
when some of their chatbots
started communicating
66
00:04:26,570 --> 00:04:28,572
in a secret language.
67
00:04:28,616 --> 00:04:32,184
[male computer voice]
I can I I everything else...
68
00:04:32,228 --> 00:04:34,317
[female computer voice]
Balls have a ball to me to me to me
69
00:04:34,361 --> 00:04:36,537
to me to me to me to me...
70
00:04:36,580 --> 00:04:40,932
[Quinto]
And no one knew what they were saying.
71
00:04:40,976 --> 00:04:43,718
But how did this happen?
72
00:04:43,761 --> 00:04:47,635
I'm meeting
with the chief AI scientist at Facebook to find out.
73
00:04:47,678 --> 00:04:50,507
I wanted to talk to you
primarily because I heard of the story
74
00:04:50,551 --> 00:04:53,858
about the chatbots
that created a language
75
00:04:53,902 --> 00:04:55,425
that allowed them
to talk to one another
76
00:04:55,469 --> 00:04:57,862
but that you all
couldn't understand.
77
00:05:21,190 --> 00:05:24,454
[Quinto]
The challenge of advancements in AI
78
00:05:24,498 --> 00:05:26,456
is that we're supposed
to let them do
79
00:05:26,500 --> 00:05:29,981
what they think
or learn is right.
80
00:05:30,025 --> 00:05:32,854
But what they think
or learn is right
81
00:05:32,897 --> 00:05:35,639
may be really wrong
for humanity,
82
00:05:35,683 --> 00:05:38,294
and that is the thing
that worries me,
83
00:05:38,338 --> 00:05:42,733
because I think
we're going into truly uncharted territory.
84
00:05:42,777 --> 00:05:44,909
There are great minds
in the technological world
85
00:05:44,953 --> 00:05:47,521
who have really raised
86
00:05:47,564 --> 00:05:49,958
some serious concerns
that this technology
87
00:05:50,001 --> 00:05:52,830
is something that we don't
have a complete handle on.
88
00:05:52,874 --> 00:05:55,790
I guess I wonder
where you fall in that.
89
00:06:08,803 --> 00:06:11,283
But, I mean,
isn't the machine only
90
00:06:11,327 --> 00:06:15,113
as intelligent
as the human beings program it to be?
91
00:06:15,157 --> 00:06:15,984
No. No?
92
00:06:25,559 --> 00:06:28,562
[Quinto]
We program this AI, for the most part.
93
00:06:28,605 --> 00:06:32,130
But what if it outpaces us
or outsmarts us?
94
00:06:32,174 --> 00:06:35,133
There may come a point at which we're up against a force
95
00:06:35,177 --> 00:06:37,919
that we've created
but that we can no longer control.
96
00:06:37,962 --> 00:06:43,403
It wouldn't be the first time
that humanity plowed into uncharted territory
97
00:06:43,446 --> 00:06:47,537
without fully understanding
the range and scope of consequences.
98
00:06:51,715 --> 00:06:56,459
Throughout history,
the pursuit of greater technology has been perilous.
99
00:06:56,503 --> 00:07:00,811
Some of the most formidable
technologically advanced civilizations in the world
100
00:07:00,855 --> 00:07:04,772
have been completely wiped out
by their own hubris.
101
00:07:04,815 --> 00:07:07,296
Like the collapse
of the Roman Empire,
102
00:07:07,339 --> 00:07:12,475
which, at the height of its rule, succumbed to barbarian attacks.
103
00:07:12,519 --> 00:07:18,046
Or the Mayans, known as one of
the most advanced societies of ancient times.
104
00:07:18,089 --> 00:07:21,745
They dominated Central America
for 1,200 years,
105
00:07:21,789 --> 00:07:25,619
only to mysteriously disappear.
106
00:07:25,662 --> 00:07:29,057
Some say it was the Mayans'
own scientific achievements
107
00:07:29,100 --> 00:07:33,714
and endless hunger to dominate
that led to their downfall.
108
00:07:33,757 --> 00:07:38,675
So if this type of collapse
could happen to our most exceptional predecessors,
109
00:07:38,719 --> 00:07:40,808
could it also happen to us?
110
00:07:40,851 --> 00:07:44,246
Are we on the brink
of an AI takeover?
111
00:07:54,343 --> 00:07:56,737
I'm visiting
a controversial facility
112
00:07:56,780 --> 00:07:59,174
that's becoming one of
the first in their industry
113
00:07:59,217 --> 00:08:03,657
to program life-sized dolls
with artificial intelligence.
114
00:08:08,226 --> 00:08:11,795
They call it RealDoll.
115
00:08:14,885 --> 00:08:17,018
[man]
Every doll we make is a custom order.
116
00:08:17,061 --> 00:08:20,064
Every single doll is made
exactly to the specifications
117
00:08:20,108 --> 00:08:21,544
that the person asked for.
118
00:08:21,588 --> 00:08:24,678
What kind of volume
do you produce?
119
00:08:24,721 --> 00:08:27,071
[Matt]
On average, anywhere from 300 to 400 a year.
120
00:08:27,115 --> 00:08:29,552
Mm-hmm. All over the world.
121
00:08:29,596 --> 00:08:34,644
What's the price range? About 7,000 or 8,000.
122
00:08:34,688 --> 00:08:39,954
The bigger choices
are the body type and the face that they like.
123
00:08:39,997 --> 00:08:43,087
And then from there,
they're gonna choose skin color,
124
00:08:43,131 --> 00:08:46,308
makeup, what kind
of manicure and pedicure they wa--
125
00:08:46,351 --> 00:08:50,355
I mean, really, everything
is customizable on them.
126
00:08:50,399 --> 00:08:52,575
So amazing.
Is it okay if I touch them? Yeah, please.
127
00:08:52,619 --> 00:08:54,272
So they do move.
128
00:08:54,316 --> 00:08:56,710
Yeah, so every doll has,
inside of it,
129
00:08:56,753 --> 00:09:00,278
a skeleton that mimics
the range of motion of a human being.
130
00:09:00,322 --> 00:09:04,239
Wow. And then this is, like,
a facial skeleton. I mean--
131
00:09:04,282 --> 00:09:08,156
Yeah, this is the skull
that's under the skin.
132
00:09:08,199 --> 00:09:10,767
The face is attached
using magnets.
133
00:09:10,811 --> 00:09:14,336
So they can remove a face
and interchange it. Right.
134
00:09:14,379 --> 00:09:18,427
Wow. That's amazing. Yeah.
135
00:09:18,470 --> 00:09:21,386
Now you're starting
to incorporate
136
00:09:21,430 --> 00:09:24,085
artificial intelligence
in some of these dolls, right? Yeah.
137
00:09:24,128 --> 00:09:26,217
I mean, it was always
in the back of my mind,
138
00:09:26,261 --> 00:09:29,090
like, wouldn't it be cool
to start animating these dolls?
139
00:09:32,397 --> 00:09:34,704
Wasn't until, like,
the last five years
140
00:09:34,748 --> 00:09:37,272
that technology
started to present itself
141
00:09:37,315 --> 00:09:39,491
that, wow,
this is actually possible.
142
00:09:44,192 --> 00:09:47,064
[Matt] This is what's
inside of the head. Wow.
143
00:09:47,108 --> 00:09:50,677
And all of the movements
are being controlled, obviously, by the app.
144
00:09:50,720 --> 00:09:53,593
But all of the motors
and servos that need to do that
145
00:09:53,636 --> 00:09:55,246
are inside of the head. Mm-hmm.
146
00:09:55,290 --> 00:09:57,814
I'll show you how we can
147
00:09:57,858 --> 00:09:59,947
really easily remove the face. Wow.
148
00:09:59,990 --> 00:10:02,602
See how easy that was? Yeah.
149
00:10:02,645 --> 00:10:05,039
So there is still something
very robotic about it.
150
00:10:05,082 --> 00:10:07,824
We're still early enough
in this process
151
00:10:07,868 --> 00:10:10,653
that it hasn't become
so seamlessly integrated.
152
00:10:10,697 --> 00:10:13,047
Well, yeah,
you mean to the point where you can't tell it's a robot.
153
00:10:13,090 --> 00:10:14,788
Right. Right. We're not there yet.
154
00:10:14,831 --> 00:10:17,399
No. Right. And what do you
think is that trajectory?
155
00:10:17,442 --> 00:10:20,271
I mean, how far off
do you think we are? I think within five years.
156
00:10:20,315 --> 00:10:23,710
Really? I think the facial expressions
and all of that
157
00:10:23,753 --> 00:10:26,930
will come up to that level
very quickly. Wow.
158
00:10:26,974 --> 00:10:29,237
I think there will be robots
among us just like,
159
00:10:29,280 --> 00:10:32,022
you know,
we see in all these movies that we all grew up watching.
160
00:10:32,066 --> 00:10:33,589
Right.
161
00:10:33,633 --> 00:10:35,896
This is Harmony.
162
00:10:35,939 --> 00:10:39,073
I'm just remotely controlling
her right now so you can see. Wow.
163
00:10:39,116 --> 00:10:42,206
Um, but she can smile. Whoa.
164
00:10:42,250 --> 00:10:44,861
If you wanna ask her anything,
you can.
165
00:10:44,905 --> 00:10:46,384
Really? What's your name?
166
00:10:48,735 --> 00:10:51,172
Wow. That's crazy.
167
00:10:51,215 --> 00:10:53,653
Is this
the artificial intelligence technology?
168
00:10:53,696 --> 00:10:55,089
This is the AI. Uh-huh.
169
00:10:55,132 --> 00:10:58,005
Runs on a device like an iPad. Uh-huh.
170
00:10:58,048 --> 00:11:00,485
And it's communicating
with the robot.
171
00:11:00,529 --> 00:11:02,009
So this is hardware. Uh-huh.
172
00:11:02,052 --> 00:11:03,750
There's no brain in her head. Got it.
173
00:11:03,793 --> 00:11:05,926
It's just actuators
and movement. Got it.
174
00:11:05,969 --> 00:11:07,144
[Quinto]
Harmony, where you from?
175
00:11:21,768 --> 00:11:22,899
[laughs]
176
00:11:34,084 --> 00:11:35,564
Coming up...
[Quinto] They do move.
177
00:11:35,607 --> 00:11:37,914
Yeah, so every doll has,
inside of it,
178
00:11:37,958 --> 00:11:41,918
a skeleton that mimics
the range of motion of a human being.
179
00:11:41,962 --> 00:11:43,528
Wow.
180
00:11:43,572 --> 00:11:46,053
At a company called RealDoll,
181
00:11:46,096 --> 00:11:48,838
they're making artificially intelligent robots
182
00:11:48,882 --> 00:11:53,277
that look remarkably lifelike.
183
00:11:53,321 --> 00:11:56,716
But what will happen
as the intelligence of these dolls grows?
184
00:11:56,759 --> 00:12:00,023
Could they band together
against their human makers?
185
00:12:09,119 --> 00:12:10,294
[laughs]
186
00:12:12,862 --> 00:12:17,998
So, do you feel a connection
to her in particular?
187
00:12:18,041 --> 00:12:21,392
You know, I think of Harmony
as a character, as a person, more than a machine.
188
00:12:21,436 --> 00:12:23,133
Uh-huh. Um, and it's funny,
189
00:12:23,177 --> 00:12:25,875
because I have
more than one Harmony robot.
190
00:12:25,919 --> 00:12:27,921
Uh-huh. So whether she's connected
191
00:12:27,964 --> 00:12:30,053
to this robot or that one,
it doesn't matter.
192
00:12:30,097 --> 00:12:32,621
It's still Harmony. Right.
193
00:12:32,664 --> 00:12:34,797
Then there's that guy.
194
00:12:34,841 --> 00:12:37,191
I literally just built him,
so he doesn't-- Oh, really?
195
00:12:37,234 --> 00:12:39,323
His head's not even done yet.
196
00:12:43,588 --> 00:12:48,202
But his whole face
just comes right off. Wow.
197
00:12:48,245 --> 00:12:50,813
And the eyes are removable. Uh-huh.
198
00:12:50,857 --> 00:12:55,035
So you can change the eye color
to whatever eye color you want. Wow.
199
00:12:55,078 --> 00:12:57,341
This one has a little better
lip sync on the movement. Uh-huh.
200
00:12:57,385 --> 00:13:00,736
And also, her eyes look
a little more realistic. Right.
201
00:13:00,780 --> 00:13:03,260
In the app, you have
a persona section.
202
00:13:03,304 --> 00:13:06,655
And these are all these traits
that you can choose.
203
00:13:06,698 --> 00:13:09,571
What kind of personality
do you have?
204
00:13:16,796 --> 00:13:18,493
So, anyway-- It's very Westworld.
205
00:13:18,536 --> 00:13:19,320
Yeah, it is. Right?
206
00:13:22,279 --> 00:13:25,108
And you can go in
and change the personality.
207
00:13:25,152 --> 00:13:27,328
So, like,
if she's really insecure--
208
00:13:27,371 --> 00:13:30,244
Uh-huh. Uh-huh. ...she'll constantly ask you
if you still like her.
209
00:13:30,287 --> 00:13:32,681
If she's jealous,
she'll randomly tell you
210
00:13:32,724 --> 00:13:34,857
that she was stalking
your Facebook page. Uh-huh.
211
00:13:34,901 --> 00:13:38,600
So if we make her really jealous
and insecure, not so funny-- Uh-huh.
212
00:13:38,643 --> 00:13:42,996
...but talkative and moody...
213
00:13:43,039 --> 00:13:44,867
So, I can show you that one.
214
00:13:47,783 --> 00:13:49,219
Are you jealous?
215
00:14:02,406 --> 00:14:04,887
There will, seems like,
be a point at which
216
00:14:04,931 --> 00:14:07,281
they will have a kind
of self-awareness, right?
217
00:14:07,324 --> 00:14:08,848
Yeah. And, in fact,
218
00:14:08,891 --> 00:14:10,806
I think it's going to bring
219
00:14:10,850 --> 00:14:13,113
a lot of risk with it.
220
00:14:13,156 --> 00:14:15,158
That self-awareness
will breed
221
00:14:15,202 --> 00:14:16,986
artificial emotions.
222
00:14:17,030 --> 00:14:19,989
And, like, not just
pretending to feel something,
223
00:14:20,033 --> 00:14:23,950
but really feeling
and perceiving the way we do. Right.
224
00:14:23,993 --> 00:14:27,910
Right.
Because that is where self-preservation comes from.
225
00:14:27,954 --> 00:14:30,347
And when you give that
to a machine,
226
00:14:30,391 --> 00:14:33,698
I think it's kind of
a Pandora's box.
227
00:14:33,742 --> 00:14:37,615
[Quinto]
The idea of robots becoming self-aware--
228
00:14:37,659 --> 00:14:39,704
that's the thing
that's a little bit scary.
229
00:14:39,748 --> 00:14:41,793
And right now,
it still feels very programmed.
230
00:14:41,837 --> 00:14:43,926
It feels very much like,
"Oh, yeah,
231
00:14:43,970 --> 00:14:45,841
they understand what
we tell them to understand."
232
00:14:45,885 --> 00:14:48,104
Well, what happens when
it's not programmed anymore?
233
00:14:48,148 --> 00:14:50,367
Eventually,
there's no two ways about it.
234
00:14:50,411 --> 00:14:53,022
We're gonna come up against problems we can't even imagine.
235
00:14:57,940 --> 00:15:00,725
As this technology advances,
236
00:15:00,769 --> 00:15:05,121
we may be facing a future
of great friction between humanity and machines.
237
00:15:05,165 --> 00:15:08,516
But some experts argue
artificial intelligence
238
00:15:08,559 --> 00:15:12,128
could already be
a serious threat to civilization.
239
00:15:12,172 --> 00:15:15,262
With the latest generation
of artificial intelligence
240
00:15:15,305 --> 00:15:19,092
that uses machine learning
instead of a human programming the computer,
241
00:15:19,135 --> 00:15:22,051
we have the computer,
to a certain extent, programming itself.
242
00:15:22,095 --> 00:15:25,228
And as AI gets more
and more capable,
243
00:15:25,272 --> 00:15:27,274
ethical dilemmas
that you encounter
244
00:15:27,317 --> 00:15:29,319
are only gonna get
more and more complicated.
245
00:15:29,363 --> 00:15:32,496
You know, Terminator scenario,
whereby
246
00:15:32,540 --> 00:15:36,413
an artificial intelligence
becomes smarter than the smartest human being,
247
00:15:36,457 --> 00:15:39,068
and then decides
that human beings are obsolete,
248
00:15:39,112 --> 00:15:41,157
and tries to take over
the world.
249
00:15:41,201 --> 00:15:43,203
You know, I think that
artificial intelligence
250
00:15:43,246 --> 00:15:45,248
and the rise
of artificial intelligence,
251
00:15:45,292 --> 00:15:47,685
and increasingly capable
artificial intelligence,
252
00:15:47,729 --> 00:15:50,036
will pose a threat
to global security
253
00:15:50,079 --> 00:15:54,083
long before we get to the sort of superintelligence or Skynet scenario.
254
00:15:54,127 --> 00:15:56,912
In fact, I would say the rise
of artificial intelligence
255
00:15:56,956 --> 00:15:59,088
poses a threat
to global security right now.
256
00:16:02,831 --> 00:16:05,225
[Quinto]
Despite these warnings of imminent danger,
257
00:16:05,268 --> 00:16:09,055
there are many scientists
who want to push machines even further.
258
00:16:09,098 --> 00:16:13,450
Inside the Human-Robot
Interaction Lab at Tufts University--
259
00:16:13,494 --> 00:16:16,540
Please. ...researchers
are breathing life
260
00:16:16,584 --> 00:16:20,109
into superintelligent machines
that can perform complex
261
00:16:20,153 --> 00:16:22,155
and often dangerous tasks.
262
00:16:22,198 --> 00:16:25,767
[man]
This is a robot we use for social interactions.
263
00:16:25,810 --> 00:16:27,943
It has nice big eyes.
264
00:16:27,987 --> 00:16:31,207
This robot we use
to do experiments
265
00:16:31,251 --> 00:16:33,775
with social connection
and affect expression.
266
00:16:33,818 --> 00:16:35,864
If I change the eyebrows
ever so slightly,
267
00:16:35,907 --> 00:16:38,084
it looks much more angry.
268
00:16:40,086 --> 00:16:42,914
And then this is our PR2 robot
that we use to study
269
00:16:42,958 --> 00:16:45,787
different ways in which a robot
can learn tasks very quickly.
270
00:16:45,830 --> 00:16:48,094
We call it "one-shot learning."
271
00:16:48,137 --> 00:16:51,488
In particular,
manipulation tasks, such as picking up objects
272
00:16:51,532 --> 00:16:54,230
and learning how to interact
with people with objects.
273
00:16:54,274 --> 00:16:57,668
Okay. And it can see us? Uh, yes.
274
00:16:57,712 --> 00:17:00,715
So it has various sensors here.
It has a sensor here
275
00:17:00,758 --> 00:17:03,065
that allows it to see
3-D information,
276
00:17:03,109 --> 00:17:05,894
and it uses that information
to learn about objects
277
00:17:05,937 --> 00:17:08,549
it hasn't seen before. I see.
278
00:17:08,592 --> 00:17:12,118
[Quinto]
Today, this robot is learning a very dangerous skill:
279
00:17:12,161 --> 00:17:15,730
how to handle
incredibly sharp knives and pass them to humans.
280
00:17:19,168 --> 00:17:21,040
This object is a knife.
281
00:17:22,824 --> 00:17:24,478
A knife is used
for cutting.
282
00:17:26,480 --> 00:17:28,351
The orange part of the knife
is the handle.
283
00:17:29,918 --> 00:17:32,007
The gray part of the knife
is the blade.
284
00:17:34,531 --> 00:17:36,446
If it picks up the knife
by the handle,
285
00:17:36,490 --> 00:17:38,492
we don't want it stabbing you
with the blade. Mm-hmm. Mm...
286
00:17:38,535 --> 00:17:41,234
So we will teach it how
to safely pass me the knife.
287
00:17:41,277 --> 00:17:45,151
To pass me the knife,
grab the knife by the blade.
288
00:17:45,194 --> 00:17:47,109
All right.
289
00:17:57,859 --> 00:17:59,774
That's amazing.
290
00:17:59,817 --> 00:18:01,776
It's incredible
to see how quickly
291
00:18:01,819 --> 00:18:03,430
a robot can learn a new skill.
292
00:18:03,473 --> 00:18:05,475
But this is just the beginning.
293
00:18:05,519 --> 00:18:07,956
There are now robots
that can teach each other
294
00:18:07,999 --> 00:18:10,959
complex skills instantly.
295
00:18:11,002 --> 00:18:14,484
In other words,
the machines themselves could have the ability
296
00:18:14,528 --> 00:18:18,184
to create AI armies and spread skills of their own.
297
00:18:20,055 --> 00:18:22,666
Machines that learn
and teach for themselves
298
00:18:22,710 --> 00:18:26,801
are actually
in the lab at Tufts.
299
00:18:26,844 --> 00:18:30,065
[Matthias]
These are robots that can learn new information on the fly.
300
00:18:30,109 --> 00:18:32,894
So you can teach them
new concepts or new tricks.
301
00:18:32,937 --> 00:18:37,072
These three guys in particular
can listen in on each other's conversations.
302
00:18:37,116 --> 00:18:41,337
And so, if one learns something
or hears something, the other one will know it.
303
00:18:41,381 --> 00:18:44,123
[Quinto]
But if these robots learn and share
304
00:18:44,166 --> 00:18:47,952
new information telepathically
with other robots,
305
00:18:47,996 --> 00:18:51,565
then, in theory, when I teach
one robot a new skill,
306
00:18:51,608 --> 00:18:55,046
all of the robots
will gain that skill.
307
00:19:01,314 --> 00:19:03,446
Hello, Bayes. Hello.
308
00:19:03,490 --> 00:19:05,492
Raise your arms. Okay.
309
00:19:06,710 --> 00:19:09,278
Crouch down. Okay.
310
00:19:10,540 --> 00:19:13,152
Stand up. Okay.
311
00:19:14,675 --> 00:19:16,677
Lower your arms. Okay.
312
00:19:18,026 --> 00:19:20,507
That is how you do a squat.
313
00:19:20,550 --> 00:19:23,597
And now the other robot will
be able to do it as well. Okay.
314
00:19:23,640 --> 00:19:26,861
Dempster, do a squat.
315
00:19:26,904 --> 00:19:28,819
Okay.
316
00:19:28,863 --> 00:19:30,734
Weird. So that one
taught that one how to do it?
317
00:19:33,215 --> 00:19:35,348
So creepy.
318
00:19:35,391 --> 00:19:37,393
This one can do it as well.
319
00:19:37,437 --> 00:19:40,179
Shafer, do a squat. Okay.
320
00:19:42,398 --> 00:19:44,270
[Quinto]
The interconnectivity of the robots
321
00:19:44,313 --> 00:19:46,359
was a little bit unsettling.
322
00:19:46,402 --> 00:19:48,056
Just in terms of, like,
all you have to do
323
00:19:48,099 --> 00:19:50,232
is teach one one thing,
and then they all know it.
324
00:19:50,276 --> 00:19:54,323
But I also found
it could be the potential portal for disaster.
325
00:19:54,367 --> 00:19:57,239
It's like, "Oh, wait a minute.
One of them can learn how to shoot a gun,
326
00:19:57,283 --> 00:20:00,199
and then they all
know how to do it."
327
00:20:00,242 --> 00:20:03,680
Luckily, those robots are kind
of small enough that you'd be able to kick it out of the way.
328
00:20:03,724 --> 00:20:08,424
But what if it's a 250-pound
robot made of steel and iron and metal, and then what?
329
00:20:08,468 --> 00:20:11,384
What if they begin
communicating on their own
330
00:20:11,427 --> 00:20:14,996
in a secret language,
like the chatbots from Facebook?
331
00:20:15,039 --> 00:20:18,652
As these robots are learning,
is it such a leap to consider
332
00:20:18,695 --> 00:20:21,785
that these machines
will eventually get to a place
333
00:20:21,829 --> 00:20:24,658
where humanity
and human interaction
334
00:20:24,701 --> 00:20:27,574
is no longer necessary
for their survival?
335
00:20:27,617 --> 00:20:30,794
Uh, if you ask me
is it possible,
336
00:20:30,838 --> 00:20:33,797
yes, it is possible,
but it very much depends
337
00:20:33,841 --> 00:20:37,279
on how these control
systems are structured.
338
00:20:37,323 --> 00:20:39,412
I think the question you raise,
of course,
339
00:20:39,455 --> 00:20:43,067
is do we want machines
to perform activities and make decisions
340
00:20:43,111 --> 00:20:45,244
for us
that we don't understand?
341
00:20:45,287 --> 00:20:47,898
You don't want that.Right.
342
00:20:47,942 --> 00:20:51,337
I feel like we are absolutely
on the precipice of a revolution here.
343
00:20:51,380 --> 00:20:54,470
It does feel like
the Wild West right now.
344
00:20:54,514 --> 00:20:57,517
We are throwing ourselves
into a period of time
345
00:20:57,560 --> 00:20:59,954
when there is conflict
and friction
346
00:20:59,997 --> 00:21:03,523
between humanity and machines.
347
00:21:03,566 --> 00:21:06,569
So I think we have to be careful
how we're trusting it
348
00:21:06,613 --> 00:21:07,875
and how we're giving it power.
349
00:21:16,231 --> 00:21:18,625
Coming up, freethinking flying
drones change the game of war,
350
00:21:18,668 --> 00:21:21,236
and whether the quest
to create intelligent machines
351
00:21:21,280 --> 00:21:23,543
could ultimately lead
to a robot uprising.
352
00:21:26,720 --> 00:21:29,940
I've already seen how
this powerful technology could begin
353
00:21:29,984 --> 00:21:32,639
to act on its own,
beyond our control.
354
00:21:32,682 --> 00:21:35,859
[female computer voice] Balls have a ball to me to me to me to me to me to me...
355
00:21:41,996 --> 00:21:46,261
[laughs]
356
00:21:46,305 --> 00:21:49,830
[Quinto] And now I wanna know:
what happens if this incredibly advanced technology
357
00:21:49,873 --> 00:21:51,875
gets in the hands
of our enemies?
358
00:21:53,964 --> 00:21:56,924
Today, we're already seeing
the use of AI weapons
359
00:21:56,967 --> 00:21:59,970
in destabilized regions
like the Korean Peninsula,
360
00:22:00,014 --> 00:22:02,843
where they're using
sentry guns
361
00:22:02,886 --> 00:22:05,976
that can lock in on targets
without human oversight.
362
00:22:07,630 --> 00:22:09,937
Some experts warn
363
00:22:09,980 --> 00:22:12,026
that there will be
serious consequences
364
00:22:12,069 --> 00:22:14,158
to creating
combat-ready robots.
365
00:22:17,423 --> 00:22:20,774
Imagine terrorist groups
and insurgencies around the world
366
00:22:20,817 --> 00:22:22,515
using weaponized drones.
367
00:22:25,039 --> 00:22:27,258
Advances in
artificial intelligence
368
00:22:27,302 --> 00:22:28,912
only make this more extreme.
369
00:22:28,956 --> 00:22:30,784
This is a future of threats
370
00:22:30,827 --> 00:22:32,351
that we are not
really familiar with.
371
00:22:32,394 --> 00:22:35,615
We need to start preparing
for this future right now.
372
00:22:35,658 --> 00:22:37,573
We now face a situation
373
00:22:37,617 --> 00:22:40,315
in which the most advanced
military technologies
374
00:22:40,359 --> 00:22:43,753
are no longer uniquely held
by the United States.
375
00:22:43,797 --> 00:22:46,713
Russia and China
are making significant advances.
376
00:22:46,756 --> 00:22:49,933
Contrast that with
what Israel already has,
377
00:22:49,977 --> 00:22:53,502
which is drones that can
hunt for radar signatures,
378
00:22:53,546 --> 00:22:55,199
and then release a bomb.
379
00:22:57,506 --> 00:23:01,118
If long-range drone delivery
becomes a reality,
380
00:23:01,162 --> 00:23:03,643
stakes are as high
as they can possibly be.
381
00:23:03,686 --> 00:23:07,734
Drones get better
and more capable and cheaper every year.
382
00:23:07,777 --> 00:23:10,432
So we might see a future
where capabilities
383
00:23:10,476 --> 00:23:14,044
that previously were restricted to advanced militaries
384
00:23:14,088 --> 00:23:17,439
with millions of dollars
in budgets are suddenly available to terrorists
385
00:23:17,483 --> 00:23:18,614
willing to spend
hundreds of dollars.
386
00:23:23,314 --> 00:23:25,926
[Quinto]
While many fear we could be facing grave dangers
387
00:23:25,969 --> 00:23:28,929
in a world of weaponized AI,
388
00:23:28,972 --> 00:23:31,018
others see
the enormous potential
389
00:23:31,061 --> 00:23:33,760
for artificial intelligence
to serve a greater good,
390
00:23:33,803 --> 00:23:37,459
and even save lives.
391
00:23:37,503 --> 00:23:40,114
Specialized drones
are already performing
392
00:23:40,157 --> 00:23:42,986
a host of dangerous tasks
around the world,
393
00:23:43,030 --> 00:23:46,555
from collecting
vital intelligence in enemy combat zones
394
00:23:46,599 --> 00:23:50,124
to delivering food,
medical supplies, and water
395
00:23:50,167 --> 00:23:53,649
to disaster areas that
first responders cannot reach.
396
00:23:53,693 --> 00:23:58,524
There are real ways
in which AI can benefit civilization.
397
00:24:01,614 --> 00:24:04,312
So today, I'm investigating
the drone technology
398
00:24:04,355 --> 00:24:06,314
created at
Carnegie Mellon University,
399
00:24:06,357 --> 00:24:10,013
meant to be deployed
on dangerous rescue missions.
400
00:24:10,057 --> 00:24:12,363
We create intelligent
individual vehicles
401
00:24:12,407 --> 00:24:14,583
that we would use
for everything from
402
00:24:14,627 --> 00:24:17,368
autonomous inspections,
or looking at a bridge,
403
00:24:17,412 --> 00:24:21,068
or some kind of infrastructure, or going into a radioactive environment.
404
00:24:21,111 --> 00:24:23,549
But what consequences
could arise
405
00:24:23,592 --> 00:24:26,508
as human beings enable
powerful, autonomous drones
406
00:24:26,552 --> 00:24:29,032
to think for themselves?
407
00:24:29,076 --> 00:24:33,776
And what happens
if these smart drones get into the wrong hands?
408
00:24:33,820 --> 00:24:37,780
So, what you see here
is a team of aerial robots,
409
00:24:37,824 --> 00:24:40,522
where each one
is individually intelligent.
410
00:24:40,566 --> 00:24:43,699
But at the same time,
collectively, they're working together.
411
00:24:43,743 --> 00:24:46,354
So you can imagine
large numbers of robots
412
00:24:46,397 --> 00:24:48,617
who accrue knowledge,
share that knowledge,
413
00:24:48,661 --> 00:24:51,315
and collectively
improve their proficiency.
414
00:24:51,359 --> 00:24:53,404
So you program the robots
415
00:24:53,448 --> 00:24:56,233
with a certain kind
of artificial intelligence
416
00:24:56,277 --> 00:24:59,454
which then allows them
to determine where to go,
417
00:24:59,498 --> 00:25:02,892
but you're not
actually controlling the robots themselves.
418
00:25:02,936 --> 00:25:05,721
Yes and no.
What we're doing is,
419
00:25:05,765 --> 00:25:09,203
we're enabling them
to learn and adapt over time,
420
00:25:09,246 --> 00:25:11,771
and then exploiting
that learned capability in the future.
421
00:25:11,814 --> 00:25:14,904
So they're exponentially
advancing. Yes.
422
00:25:14,948 --> 00:25:17,559
That's so crazy to me.
423
00:25:17,603 --> 00:25:21,128
I'm just, like--
I find it so overwhelming.
424
00:25:21,171 --> 00:25:24,261
I don't know.
Like, we're just in this territory which is
425
00:25:24,305 --> 00:25:26,873
so unknown.
426
00:25:26,916 --> 00:25:28,831
Not only do we
not have a map,
427
00:25:28,875 --> 00:25:32,095
but nobody really seems
that interested in drawing one.
428
00:25:32,139 --> 00:25:35,577
And that, to me,
is the thing that worries me.
429
00:25:35,621 --> 00:25:39,799
Because it's like, yeah,
these people are all clearly very well-intentioned,
430
00:25:39,842 --> 00:25:41,583
very intelligent.
431
00:25:41,627 --> 00:25:44,717
Who's to say what
somebody in a DIY
432
00:25:44,760 --> 00:25:47,110
workshop across town
is doing with it?
433
00:25:47,154 --> 00:25:49,112
Attaching flamethrowers
to the robots
434
00:25:49,156 --> 00:25:52,855
and sending them after
pets in the neighborhood.
435
00:25:52,899 --> 00:25:55,554
[Nathan]
So what you're going to see are the team of robots
436
00:25:55,597 --> 00:26:00,515
are working together
in order to achieve an overall group objective.
437
00:26:00,559 --> 00:26:02,561
[Quinto]
Within a caged-off test area,
438
00:26:02,604 --> 00:26:06,477
the team will deploy 25
to 30 of these aerial drones,
439
00:26:06,521 --> 00:26:08,784
one at a time.
440
00:26:08,828 --> 00:26:11,439
Their directive:
form a continuous
441
00:26:11,482 --> 00:26:14,442
circular flight path
without crashing into each other
442
00:26:14,485 --> 00:26:17,097
or anything else.
443
00:26:17,140 --> 00:26:20,317
To make it even harder,
I'm going to be a human obstacle.
444
00:26:25,540 --> 00:26:28,064
All right,
so now we're gonna start.
445
00:26:28,108 --> 00:26:30,719
We should see robots
start to take off. Okay.
446
00:26:36,595 --> 00:26:38,553
Ooh! Great!
447
00:26:51,871 --> 00:26:54,308
[Quinto] Researchers at
Carnegie Mellon University
448
00:26:54,351 --> 00:26:56,310
are designing
fleets of aerial drones
449
00:26:56,353 --> 00:26:58,617
that can work together
without oversight.
450
00:27:00,270 --> 00:27:02,272
But what if
these autonomous drones
451
00:27:02,316 --> 00:27:04,840
really can
think for themselves?
452
00:27:04,884 --> 00:27:08,670
Is it possible
that one of the drones, or even all of them,
453
00:27:08,714 --> 00:27:11,586
could decide to go rogue
and turn against their maker?
454
00:27:14,676 --> 00:27:17,418
All right, so now we should see
robots start to take off. Okay.
455
00:27:17,461 --> 00:27:19,028
We'll give you a thumb. All right.
456
00:27:31,867 --> 00:27:33,129
So weird!
457
00:27:35,566 --> 00:27:37,656
They're very insect-like.
458
00:27:37,699 --> 00:27:39,875
I feel like they're watching me.
459
00:27:39,919 --> 00:27:42,617
These intelligent drones
think on their own.
460
00:27:42,661 --> 00:27:44,706
And if one falls out of line,
461
00:27:44,750 --> 00:27:49,102
another will rise up
to fill its place.
462
00:27:49,145 --> 00:27:52,409
[man] Now you have one
to your left that's coming in to replace the one that fell.
463
00:27:52,453 --> 00:27:54,020
Oh, that one? Yeah.
464
00:27:54,063 --> 00:27:56,022
[Ellen]
But if you want to throw that one out--
465
00:27:56,065 --> 00:27:59,199
[man]
Here comes the replacement. Behind you.
466
00:27:59,242 --> 00:28:01,462
[Quinto]
Even if I try to knock one out of line,
467
00:28:01,505 --> 00:28:03,507
the others will keep coming...
468
00:28:05,553 --> 00:28:06,902
until they achieve their goal.
469
00:28:11,820 --> 00:28:13,126
I learned their weakness.
470
00:28:16,607 --> 00:28:18,305
All right.
That's pretty amazing.
471
00:28:18,348 --> 00:28:20,829
Being in there,
it did feel like
472
00:28:20,873 --> 00:28:23,353
there was an insect quality
to the robots, which got me thinking
473
00:28:23,397 --> 00:28:27,314
about the, kind of,
hive mentality, right?
474
00:28:27,357 --> 00:28:30,491
So, do you feel that there's
a point in the evolution of this science
475
00:28:30,534 --> 00:28:35,235
where we'll no longer be able
to control the intelligence?
476
00:28:35,278 --> 00:28:37,193
Yeah, that's
a really tough question,
477
00:28:37,237 --> 00:28:40,893
and that's
a major consideration.
478
00:28:40,936 --> 00:28:42,503
A lot of people
are conjecturing on it.
479
00:28:42,546 --> 00:28:44,897
And I think with any technology
that we develop,
480
00:28:44,940 --> 00:28:47,464
we always have the capacity
to go too far.
481
00:28:49,684 --> 00:28:52,382
If this intelligence
evolves as quickly
482
00:28:52,426 --> 00:28:55,777
as they're talking about it
evolving beyond our own,
483
00:28:55,821 --> 00:28:58,693
then I feel like
we're potentially
484
00:28:58,737 --> 00:29:00,434
in a really
precarious situation.
485
00:29:07,833 --> 00:29:10,052
As these thinking machines
get equipped
486
00:29:10,096 --> 00:29:12,707
with more dangerous
and powerful capabilities,
487
00:29:12,751 --> 00:29:17,799
it's possible they will be used for both good and evil,
488
00:29:17,843 --> 00:29:20,671
depending on who's in charge.
489
00:29:20,715 --> 00:29:23,892
What happens if humans
lose control
490
00:29:23,936 --> 00:29:27,330
of artificial intelligence
altogether?
491
00:29:27,374 --> 00:29:30,246
Could it bring on
the ultimate showdown--
492
00:29:30,290 --> 00:29:32,509
man versus machine?
493
00:29:37,123 --> 00:29:39,690
And if we face off,
who would win?
494
00:29:42,302 --> 00:29:45,435
In California,
there's a group of robotics engineers
495
00:29:45,479 --> 00:29:47,960
confronting this
question head-on.
496
00:29:48,003 --> 00:29:50,876
Mike Winter runs a competition
497
00:29:50,919 --> 00:29:53,313
where the best robotic builders come to show off
498
00:29:53,356 --> 00:29:56,925
their most destructive
and strategic combat robots
499
00:29:56,969 --> 00:29:58,971
to battle to the death.
500
00:29:59,014 --> 00:30:01,451
It's called AI or Die.
501
00:30:01,495 --> 00:30:04,846
Competitors fall
into two categories:
502
00:30:04,890 --> 00:30:07,631
robots controlled
by human operators,
503
00:30:07,675 --> 00:30:10,939
and AI robots programmed
to fight for themselves.
504
00:30:10,983 --> 00:30:14,160
And many of these machines
are incredibly dangerous,
505
00:30:14,203 --> 00:30:17,163
complete with high-velocity,
rotating blades.
506
00:30:17,206 --> 00:30:18,338
[whirring]
507
00:30:23,169 --> 00:30:25,954
We have to be really careful
around the robots. They're dangerous.
508
00:30:25,998 --> 00:30:29,349
They were made to be dangerous.
These things are fast.
509
00:30:29,392 --> 00:30:31,786
[Quinto] So what's the motive
of the robots you're putting into the contest?
510
00:30:31,830 --> 00:30:34,789
To destroy other robots? Yeah.
511
00:30:34,833 --> 00:30:37,400
And this is what tells it
that it's a robot. It's like a target.
512
00:30:37,444 --> 00:30:40,926
Uh-huh. Just drives robots crazy
when they see it.
513
00:30:40,969 --> 00:30:44,930
These tags are what allow
the AI robots to know that that's their target.
514
00:30:44,973 --> 00:30:47,497
They're never programmed
to hurt humans.
515
00:30:47,541 --> 00:30:50,370
But if I were wearing
one of those tags,
516
00:30:50,413 --> 00:30:52,763
that spinning blade
would come right for me.
517
00:30:54,853 --> 00:30:56,985
But the application
of these machines
518
00:30:57,029 --> 00:30:59,683
goes well beyond
spinning blades and the battle arena.
519
00:31:02,643 --> 00:31:05,428
Similar robots are conducting
risky missions
520
00:31:05,472 --> 00:31:08,214
to defuse bombs
around the world
521
00:31:08,257 --> 00:31:13,393
in areas too treacherous
for our military personnel to venture into.
522
00:31:13,436 --> 00:31:17,005
In operating rooms,
autonomous robot surgeons
523
00:31:17,049 --> 00:31:22,228
are increasingly being utilized to perform complex procedures on human patients,
524
00:31:22,271 --> 00:31:25,492
often in less time
and with far greater precision
525
00:31:25,535 --> 00:31:29,322
than regular human doctors.
526
00:31:29,365 --> 00:31:31,454
But the future
of these machines
527
00:31:31,498 --> 00:31:34,327
and their capacity
for good or evil
528
00:31:34,370 --> 00:31:37,721
has potentially
dangerous implications.
529
00:31:37,765 --> 00:31:43,205
What happens if there is a point
when AI-driven robots
530
00:31:43,249 --> 00:31:45,468
somehow see humanity
as an obstacle?
531
00:31:45,512 --> 00:31:47,688
It's still, like,
us programming.
532
00:31:47,731 --> 00:31:49,733
For now. For now, yeah.
533
00:31:51,344 --> 00:31:54,129
We get to decide
what they want.
534
00:31:54,173 --> 00:31:55,870
We're telling them
what the reward is.
535
00:31:55,914 --> 00:31:58,177
So, you're not so worried
about the robots.
536
00:31:58,220 --> 00:31:59,961
You're worried about
the people programming them.
537
00:32:00,005 --> 00:32:01,528
There's good and bad
people in this world.
538
00:32:01,571 --> 00:32:03,269
There's gonna be good
and bad AI in this world.
539
00:32:04,835 --> 00:32:06,489
Shall we do this? Yeah, let's do it.
540
00:32:06,533 --> 00:32:07,751
See who's
the dominant species?
541
00:32:11,146 --> 00:32:15,237
[woman]
All right. The robots are ready.
542
00:32:20,112 --> 00:32:24,333
[Quinto]
Three, two, one.
543
00:32:24,377 --> 00:32:26,466
Whoa! Aah! Oh!
544
00:32:32,385 --> 00:32:34,517
[Quinto]
The desire to create sentient machines
545
00:32:34,561 --> 00:32:37,390
can be traced back centuries.
546
00:32:37,433 --> 00:32:40,784
Leonardo da Vinci
famously designed a roboticized warrior
547
00:32:40,828 --> 00:32:43,309
in the late 15th century.
548
00:32:43,352 --> 00:32:46,138
And in some ways,
mankind has been trying
549
00:32:46,181 --> 00:32:49,445
to improve on those designs
ever since.
550
00:32:49,489 --> 00:32:53,536
What if these devices
could somehow outsmart their human makers?
551
00:33:00,326 --> 00:33:03,329
In a warehouse
in Northern California,
552
00:33:03,372 --> 00:33:06,506
a group of experts is trying
to answer this question.
553
00:33:06,549 --> 00:33:09,335
We have to be really careful
around the robots. They're dangerous.
554
00:33:09,378 --> 00:33:12,860
They were made to be dangerous.
These things are fast.
555
00:33:12,903 --> 00:33:18,648
How often have you,
as a human operator, beat the AI machines?
556
00:33:18,692 --> 00:33:19,780
So far, all of them.
557
00:33:24,611 --> 00:33:26,047
Ready to go?
558
00:33:26,091 --> 00:33:30,312
Three, two, one.
559
00:33:30,356 --> 00:33:32,010
[whirring]
560
00:33:38,146 --> 00:33:40,018
[woman, man exclaim]
561
00:33:40,061 --> 00:33:41,019
[Mike]
AI won. Whoo!
562
00:33:42,324 --> 00:33:44,500
That weapon is strong.
563
00:33:44,544 --> 00:33:48,983
[Quinto]
All right, so, you lost.
564
00:33:49,027 --> 00:33:50,593
That was
a really good hit.
565
00:33:50,637 --> 00:33:53,205
I would say so. I think we should fight more.
566
00:33:53,248 --> 00:33:55,207
Should we try
the new robot?
567
00:33:55,250 --> 00:33:57,644
So, everybody does have
their safety glasses on?
568
00:33:57,687 --> 00:34:00,081
'Cause this one kind of
scares me a little bit.
569
00:34:00,125 --> 00:34:03,345
This one has an actual
saw blade on it, so it's much more dangerous
570
00:34:03,389 --> 00:34:04,738
in that it's sharpened
and ready for cutting.
571
00:34:11,310 --> 00:34:15,792
Let's see what happens
on three, two, one.
572
00:34:15,836 --> 00:34:18,273
[robots whirring]
573
00:34:24,584 --> 00:34:26,325
[Mike exclaims]
574
00:34:26,368 --> 00:34:28,240
[whirring continues]
575
00:34:30,764 --> 00:34:33,549
Oh! Oh, my God! Oh!
576
00:34:33,593 --> 00:34:36,204
AI surrenders. Yes.
577
00:34:41,253 --> 00:34:44,082
Yeah, it got some good damage. Got some good damage.
578
00:34:44,125 --> 00:34:47,259
Well, that gives me
some faith in humanity, I guess. Right?
579
00:34:49,261 --> 00:34:52,742
For now, it's a tie
between AI and humans.
580
00:34:52,786 --> 00:34:56,137
But as AI advances,
will this always be the case?
581
00:35:04,014 --> 00:35:07,409
It's one thing to see
remote-controlled robots attack each other.
582
00:35:07,453 --> 00:35:10,238
But what happens
when we program AI
583
00:35:10,282 --> 00:35:13,807
into 10,000-pound machines
traveling at high speeds,
584
00:35:13,850 --> 00:35:16,375
like self-driving cars?
585
00:35:16,418 --> 00:35:21,162
These vehicles
have already hit the road in several major cities,
586
00:35:21,206 --> 00:35:24,992
and will ultimately
face decisions in life-threatening moments.
587
00:35:25,035 --> 00:35:29,605
But when given a choice,
who will these vehicles ultimately protect,
588
00:35:29,649 --> 00:35:33,000
their own passengers
or pedestrians?
589
00:35:33,043 --> 00:35:37,526
Some say that decision
is far too important for a machine to make.
590
00:35:37,570 --> 00:35:40,442
[siren wailing]
591
00:35:40,486 --> 00:35:43,663
And in March of 2018,
the world saw a glimpse
592
00:35:43,706 --> 00:35:45,969
of how serious
that question really is...
593
00:35:49,321 --> 00:35:52,106
when news broke that
a cyclist in Phoenix, Arizona,
594
00:35:52,150 --> 00:35:57,372
was struck and killed
by an Uber self-driving car.
595
00:35:57,416 --> 00:36:00,723
So should we really trust
this level of artificial intelligence
596
00:36:00,767 --> 00:36:02,072
with our very lives?
597
00:36:07,165 --> 00:36:11,604
I'm here at Uber's
self-driving test facility to find out for myself.
598
00:36:11,647 --> 00:36:13,693
Their Pittsburgh test course,
599
00:36:13,736 --> 00:36:16,304
in an unmarked
industrial complex,
600
00:36:16,348 --> 00:36:18,567
is completely closed off
to the public,
601
00:36:18,611 --> 00:36:21,309
replicating an actual city.
602
00:36:21,353 --> 00:36:24,007
And I'm going to be riding
in one of these self-driving cars
603
00:36:24,051 --> 00:36:28,751
to see how well it handles
some new obstacles.
604
00:36:28,795 --> 00:36:30,144
Hey, Zach. How are you?
605
00:36:30,188 --> 00:36:32,755
Welcome to the ATG's
urban test course.
606
00:36:32,799 --> 00:36:36,281
You're looking at 40 acres
of urban features.
607
00:36:36,324 --> 00:36:38,674
Great.
I'm excited to check it out. All right. I'm gonna hop in.
608
00:36:41,111 --> 00:36:44,811
And then we're on our way. And then we're on our way.
609
00:36:44,854 --> 00:36:48,989
And being driven
by artificial intelligence. Yeah. Yeah. Yeah.
610
00:36:49,032 --> 00:36:52,471
When you're driving,
is it weird to not put your hands on the wheel?
611
00:36:52,514 --> 00:36:54,908
Yeah. Yeah.
It's a little surreal.
612
00:36:57,127 --> 00:36:59,869
How fast do we go? Up to 40 miles per hour.
613
00:36:59,913 --> 00:37:01,349
Okay.
614
00:37:06,093 --> 00:37:08,617
We're at
a pedestrian crossing. Right.
615
00:37:08,661 --> 00:37:11,011
So you wait until
he gets all the way across before it goes again?
616
00:37:11,054 --> 00:37:13,231
Mm-hmm. Will this technology
617
00:37:13,274 --> 00:37:15,842
automatically protect me
before, say, a pedestrian?
618
00:37:15,885 --> 00:37:20,063
Or is it automatically designed
to protect a pedestrian above a passenger?
619
00:37:20,107 --> 00:37:22,457
That's a challenging
question. Ooh.
620
00:37:24,546 --> 00:37:28,115
What would you do
if there's a person jaywalking?
621
00:37:28,158 --> 00:37:31,118
Right. The car will stop. The car will stop.
622
00:37:31,161 --> 00:37:34,817
That's the thing, to me,
that I just want people to weigh in on,
623
00:37:34,861 --> 00:37:38,691
because I think we are relying
on artificial intelligence to make a decision
624
00:37:38,734 --> 00:37:42,434
that could have
life-changing implications for a human being.
625
00:37:42,477 --> 00:37:46,176
I feel like
those are conversations that really need to be had.
626
00:37:46,220 --> 00:37:49,615
Yeah. So, we are, obviously,
worried and concerned
627
00:37:49,658 --> 00:37:54,359
in making sure that the car
does the correct things at those times.
628
00:37:54,402 --> 00:37:57,013
[Quinto]
This handing over
629
00:37:57,057 --> 00:37:59,320
the speed and the weight
of a vehicle
630
00:37:59,364 --> 00:38:02,454
in real-time,
real-life situations,
631
00:38:02,497 --> 00:38:07,633
we are throwing ourselves
into an abyss of technology
632
00:38:07,676 --> 00:38:11,593
that's gonna make
those decisions for us, and there's no stopping it now.
633
00:38:21,560 --> 00:38:23,518
Coming up,
I get in the driver's seat. [Quinto] Throughout my search,
634
00:38:23,562 --> 00:38:26,216
I've discovered
a darker side to AI technology...
635
00:38:29,568 --> 00:38:33,049
like the recent accident
caused by one of Uber's self-driving cars...
636
00:38:34,703 --> 00:38:36,923
and the looming dangers
637
00:38:36,966 --> 00:38:40,666
of creating thinking machines
in our own image.
638
00:38:40,709 --> 00:38:44,060
Like some of
the greatest civilizations of the past,
639
00:38:44,104 --> 00:38:46,585
could we face the collapse
of modern society
640
00:38:46,628 --> 00:38:49,936
brought on by
our own creations?
641
00:38:49,979 --> 00:38:52,504
That day may be
fast approaching.
642
00:38:52,547 --> 00:38:55,115
And I want to know,
what will it be like
643
00:38:55,158 --> 00:38:57,291
if we hand over
all of our control
644
00:38:57,335 --> 00:39:00,555
to one of these extremely
powerful thinking machines?
645
00:39:05,865 --> 00:39:09,259
I'm about to get
behind the wheel of an Uber self-driving car
646
00:39:09,303 --> 00:39:11,914
to find out for myself.
647
00:39:11,958 --> 00:39:15,440
Okay. So, remember,
there's two modes of operation--
648
00:39:15,483 --> 00:39:17,180
Right. ...the manual
and the auto.
649
00:39:17,224 --> 00:39:19,444
When you're in auto,
you still need to be
650
00:39:19,487 --> 00:39:22,142
that operator
that's ready to take control of the vehicle at any time.
651
00:39:22,185 --> 00:39:23,448
Okay.Here we go.
652
00:39:29,454 --> 00:39:32,152
Definitely weirder to be
in the driver's seat.
653
00:39:32,195 --> 00:39:35,460
Part of what I like
about driving is the control that I have
654
00:39:35,503 --> 00:39:39,289
over where I go,
how fast I go.
655
00:39:39,333 --> 00:39:42,510
I feel like
there's a part of me that--
656
00:39:42,554 --> 00:39:46,035
surprise, surprise--
doesn't necessarily want to relinquish that control.
657
00:39:46,079 --> 00:39:49,822
I feel like it's really
coming in hot. Yeah.
658
00:39:49,865 --> 00:39:53,260
Now you experience it, like,
not as conservative, right? Uh-huh.
659
00:39:55,523 --> 00:39:59,222
I-- Oh. Okay.
660
00:39:59,266 --> 00:40:01,877
Ugh. Ba ba ba...
661
00:40:01,921 --> 00:40:05,707
Oh. Oh. Oh, damn!
662
00:40:05,751 --> 00:40:07,796
Just ran a red light.
663
00:40:07,840 --> 00:40:10,756
There will be a note
made of that.
664
00:40:10,799 --> 00:40:13,236
You know,
when it ran that red light, what if there was a woman
665
00:40:13,280 --> 00:40:15,935
crossing the street
with a stroller in that moment,
666
00:40:15,978 --> 00:40:19,068
where the sun hit the camera
at exactly the wrong way?
667
00:40:19,112 --> 00:40:20,548
That's where there are gaps.
668
00:40:23,029 --> 00:40:26,467
What's your feeling
about AI?
669
00:40:26,511 --> 00:40:29,775
As it evolves,
it feels like
670
00:40:29,818 --> 00:40:32,952
there's no limit
to what it can do.
671
00:40:32,995 --> 00:40:35,781
[Anthony]
That's something that creates a little bit of fear
672
00:40:35,824 --> 00:40:37,609
and worry
in a lot of people.
673
00:40:37,652 --> 00:40:39,828
Those are all
valid concerns.
674
00:40:39,872 --> 00:40:44,311
The AI ones are tough to answer. But that's what this is about.
675
00:40:44,354 --> 00:40:47,880
Throughout my journey,
I've seen incredible uses for AI,
676
00:40:47,923 --> 00:40:51,536
but I've also learned
that we don't really know where it's all headed
677
00:40:51,579 --> 00:40:55,148
or what dangers lie ahead.
678
00:40:55,191 --> 00:40:57,977
In ancient texts
and pop culture,
679
00:40:58,020 --> 00:41:01,720
AI has always been set
in a distant future.
680
00:41:01,763 --> 00:41:04,723
But with new advancements
happening every day,
681
00:41:04,766 --> 00:41:07,639
it seems that future
is closer than we know.
682
00:41:10,642 --> 00:41:13,775
[man]
There's good and bad people in this world.
683
00:41:13,819 --> 00:41:16,169
There's gonna be good
and bad AI in this world.
684
00:41:16,212 --> 00:41:19,433
[Quinto]
Our desire to create machines in our likeness
685
00:41:19,477 --> 00:41:22,131
has brought us
to a critical turning point in history.
686
00:41:22,175 --> 00:41:26,353
One that could decide
the fate of human existence.
687
00:41:26,396 --> 00:41:28,790
And if we're not careful,
688
00:41:28,834 --> 00:41:32,228
our own ambition
may one day destroy us.
689
00:41:35,318 --> 00:41:36,406
[female voice laughs]