1
00:00:00,258 --> 00:00:01,860
Open the pod bay doors, HAL.
2
00:00:01,892 --> 00:00:05,396
I'm sorry, Dave.
I'm afraid I can't do that.
3
00:00:05,430 --> 00:00:08,065
2001 had a profound
impact on my life.
4
00:00:08,098 --> 00:00:09,901
It's all about HAL 9000.
5
00:00:09,933 --> 00:00:14,739
"2001" was an extraordinary
breakthrough for the genre.
6
00:00:14,773 --> 00:00:15,940
Science fiction
has always been about
7
00:00:15,974 --> 00:00:18,510
great technology
going wrong.
8
00:00:18,543 --> 00:00:19,945
I'll be back.
9
00:00:19,977 --> 00:00:22,680
That image, Schwarzenegger
as the Terminator,
10
00:00:22,713 --> 00:00:24,416
it's a perfect nightmare.
11
00:00:24,448 --> 00:00:25,850
You have two of the most
popular A.I. characters
12
00:00:25,883 --> 00:00:27,551
in pop culture.
13
00:00:27,584 --> 00:00:29,887
At the time I said,
"Don't be afraid of the robots.
14
00:00:29,920 --> 00:00:32,022
The robots
are our friends."
15
00:00:32,056 --> 00:00:35,660
Conception of what robots will
be is directly,
16
00:00:35,693 --> 00:00:40,431
umbilically connected to our
idea of them as an underclass.
17
00:00:40,465 --> 00:00:42,501
Replicants are like
any other machine.
18
00:00:42,534 --> 00:00:43,935
They're either a benefit
or a hazard.
19
00:00:43,968 --> 00:00:45,837
"Blade Runner"
was so artistic.
20
00:00:45,870 --> 00:00:48,672
If you're creating
an A.I.,
21
00:00:48,706 --> 00:00:50,175
one of the things you're
definitely going to leave out
22
00:00:50,208 --> 00:00:52,077
is emotion.
23
00:00:52,109 --> 00:00:54,845
"Battlestar Galactica" is about
humanity's greatest weakness.
24
00:00:54,879 --> 00:00:56,848
You're just a bunch
of machines after all.
25
00:00:56,880 --> 00:00:59,885
The inability to see others
as worthy as ourselves.
26
00:00:59,918 --> 00:01:02,954
Our machines have been
the stuff of dreams
27
00:01:02,987 --> 00:01:04,656
and of nightmares.
28
00:01:04,689 --> 00:01:09,594
The question is, can man
and machine forge a future...
29
00:01:09,627 --> 00:01:10,796
together?
30
00:01:10,829 --> 00:01:11,997
Hello, I'm here.
31
00:01:15,032 --> 00:01:16,168
Hi.
32
00:01:50,100 --> 00:01:51,703
They call it
science fiction,
33
00:01:51,736 --> 00:01:53,038
but it's really
about technology.
34
00:01:53,070 --> 00:01:55,739
It's about the machines.
35
00:01:55,772 --> 00:01:57,675
You've done a lot
of science-fiction movies.
36
00:01:57,709 --> 00:01:59,778
You've seen all kinds
of different machines,
37
00:01:59,811 --> 00:02:01,011
intelligent machines.
38
00:02:01,044 --> 00:02:02,714
You've played
an intelligent machine.
39
00:02:02,746 --> 00:02:05,183
I think that what's
interesting is
40
00:02:05,215 --> 00:02:09,153
when you have been involved
in the business
41
00:02:09,186 --> 00:02:13,090
as long as I have,
what is so unbelievable
42
00:02:13,124 --> 00:02:15,025
is that as I've done
43
00:02:15,059 --> 00:02:18,029
"Terminator" movies
one after the next
44
00:02:18,061 --> 00:02:20,931
and you see something
starting out kind of
45
00:02:20,964 --> 00:02:23,534
what is called
science fiction
46
00:02:23,567 --> 00:02:25,102
and then all of a sudden,
47
00:02:25,136 --> 00:02:27,672
it becomes kind of
science reality.
48
00:02:27,704 --> 00:02:29,708
Yeah. I think science fiction
has always been
49
00:02:29,740 --> 00:02:31,743
about great technology
going wrong.
50
00:02:31,775 --> 00:02:34,912
It's like how A.I. might be
a threat to humanity.
51
00:02:36,013 --> 00:02:37,582
I'm a friend
of Sarah Connor.
52
00:02:37,614 --> 00:02:40,084
Can't see her.
She's making a statement.
53
00:02:40,118 --> 00:02:43,088
I'll be back.
54
00:02:43,120 --> 00:02:44,522
That image,
55
00:02:44,555 --> 00:02:46,056
Schwarzenegger
as the Terminator,
56
00:02:46,089 --> 00:02:49,059
Skynet sending this emissary,
57
00:02:49,093 --> 00:02:51,929
quasi-human into our midst,
58
00:02:51,963 --> 00:02:55,834
it's a perfect nightmare
of the machine catastrophe.
59
00:03:00,270 --> 00:03:03,073
The old warning about
the machines rising up.
60
00:03:03,107 --> 00:03:06,978
It's very archetypal and
very brutal and very perfect.
61
00:03:07,011 --> 00:03:09,114
When I first thought of the idea
for the Terminator...
62
00:03:09,147 --> 00:03:10,281
How did you come up
with that idea?
63
00:03:10,315 --> 00:03:11,816
It came from a dream.
64
00:03:11,849 --> 00:03:13,285
I had a dream image
65
00:03:13,317 --> 00:03:16,855
of a chrome skeleton
walking out of a fire.
66
00:03:16,887 --> 00:03:19,224
And I thought,
"What if he was a cyborg
67
00:03:19,256 --> 00:03:20,992
and he looked like a man
68
00:03:21,024 --> 00:03:24,595
and was indistinguishable
from a man until the fire?"
69
00:03:24,629 --> 00:03:27,131
And what would be
the purpose of that thing?
70
00:03:27,165 --> 00:03:28,233
He was representing
71
00:03:28,266 --> 00:03:30,135
a much more powerful
intelligence,
72
00:03:30,167 --> 00:03:33,638
-the soldier sent by Skynet from the future.
-Right.
73
00:03:35,773 --> 00:03:37,341
"Terminator" presents
74
00:03:37,375 --> 00:03:39,878
this vision of a future
where Skynet,
75
00:03:39,911 --> 00:03:42,747
this computer that
has become self-aware,
76
00:03:42,780 --> 00:03:45,951
decides, "Well, I'm going
to protect myself
77
00:03:45,983 --> 00:03:48,118
and the only way to do that
is to destroy
78
00:03:48,151 --> 00:03:51,589
the very people who created me,"
which is us.
79
00:03:51,623 --> 00:03:54,792
Skynet is not the
first all-powerful computer.
80
00:03:54,826 --> 00:03:57,095
This trope goes back
to Robert Heinlein's
81
00:03:57,127 --> 00:03:58,964
"The Moon
Is a Harsh Mistress"
82
00:03:58,996 --> 00:04:00,965
"The Forbin Project"
83
00:04:00,999 --> 00:04:03,168
and even up to
"WarGames," the WOPR.
84
00:04:03,201 --> 00:04:05,936
But part of what makes
the Terminator so scary
85
00:04:05,970 --> 00:04:07,906
is that it is relentless
86
00:04:07,939 --> 00:04:11,643
and it will not stop until
it achieves its objective.
87
00:04:18,816 --> 00:04:21,018
I remember the first
day's dailies.
88
00:04:21,052 --> 00:04:22,653
There was a 100-millimeter
lens shot
89
00:04:22,686 --> 00:04:23,922
where you just
kind of pull up
90
00:04:23,955 --> 00:04:25,190
and you're looking
like this.
91
00:04:25,222 --> 00:04:27,157
-Right.
-And we're all just going, "Yes!
92
00:04:27,191 --> 00:04:29,794
This is fantastic."
93
00:04:33,131 --> 00:04:35,032
But here's the interesting
thing about it,
94
00:04:35,066 --> 00:04:36,801
I went to talk
to you about Reese.
95
00:04:36,833 --> 00:04:37,869
Yeah.
96
00:04:37,902 --> 00:04:39,169
This is the hero,
97
00:04:39,203 --> 00:04:40,705
and I wanted to continue
on playing heroes.
98
00:04:40,737 --> 00:04:42,172
Yeah.
99
00:04:42,206 --> 00:04:43,641
And so we started talking
a little bit about the movie,
100
00:04:43,674 --> 00:04:45,776
and for some reason
or the other,
101
00:04:45,809 --> 00:04:47,211
not at all planned...
Yeah.
102
00:04:47,244 --> 00:04:48,278
On my part...
Yeah.
103
00:04:48,311 --> 00:04:49,948
I said, "Look, Jim."
104
00:04:49,980 --> 00:04:52,082
I said, "The guy that plays
the Terminator,
105
00:04:52,116 --> 00:04:54,252
he really has to understand
that he's a machine."
106
00:04:54,285 --> 00:04:55,854
Exactly.
107
00:04:55,886 --> 00:04:57,988
How important it is that
whoever plays the Terminator
108
00:04:58,021 --> 00:05:00,324
has to show
absolutely nothing.
109
00:05:00,357 --> 00:05:02,359
And the way he scans,
110
00:05:02,393 --> 00:05:05,996
and the way the Terminator walks
has to be machine-like,
111
00:05:06,030 --> 00:05:08,166
and yet there has to be
not one single frame
112
00:05:08,198 --> 00:05:09,934
where he has
human behavior.
113
00:05:09,967 --> 00:05:11,102
In the middle of this,
114
00:05:11,134 --> 00:05:12,770
I'm looking at you
and thinking,
115
00:05:12,804 --> 00:05:16,241
"You know, the guy's kind of
big, like a bulldozer,
116
00:05:16,273 --> 00:05:18,677
and nothing could stop him.
It would be fantastic."
117
00:05:18,710 --> 00:05:20,245
Exactly.
Yeah.
118
00:05:20,278 --> 00:05:21,713
And so,
afterwards you said,
119
00:05:21,746 --> 00:05:23,048
"So, why don't you
play the Terminator?"
120
00:05:23,081 --> 00:05:24,314
Yeah.
121
00:05:24,347 --> 00:05:26,350
And I looked at you,
and I said,
122
00:05:29,220 --> 00:05:31,722
In the first film,
123
00:05:31,756 --> 00:05:34,693
the Terminator
is designed to kill.
124
00:05:34,725 --> 00:05:36,026
In "Terminator 2,"
125
00:05:36,059 --> 00:05:38,028
the Terminator
was programmed to protect,
126
00:05:38,061 --> 00:05:39,396
not destroy.
127
00:05:39,429 --> 00:05:40,999
Action.
128
00:05:41,032 --> 00:05:42,333
And now we're going
to make "Terminator 2."
129
00:05:42,366 --> 00:05:43,735
The hardest part of
that movie, though,
130
00:05:43,768 --> 00:05:45,136
was convincing you
131
00:05:45,169 --> 00:05:46,404
that playing a good guy
was a good idea.
132
00:05:46,436 --> 00:05:49,073
It threw me off first
when I read the script
133
00:05:49,107 --> 00:05:51,376
and then realized
that I'm not anymore
134
00:05:51,408 --> 00:05:53,210
that kind
of killing machine.
135
00:05:53,244 --> 00:05:56,915
I thought that if we could
distill him down to this idea
136
00:05:56,948 --> 00:05:59,250
of just relentlessness
137
00:05:59,282 --> 00:06:02,953
and take out the evil
and put good in its place,
138
00:06:02,987 --> 00:06:04,789
it's interesting that
the same character
139
00:06:04,821 --> 00:06:06,023
worked as a bad guy
140
00:06:06,056 --> 00:06:07,157
and as a good guy,
same character.
141
00:06:07,191 --> 00:06:08,326
Absolutely.
142
00:06:08,359 --> 00:06:11,362
And now we've got to have
a bigger, badder Terminator
143
00:06:11,395 --> 00:06:13,498
that could kick
the Terminator's ass.
144
00:06:13,530 --> 00:06:15,966
So what was that?
145
00:06:18,869 --> 00:06:22,906
I was convinced I was
the baddest mother
146
00:06:22,939 --> 00:06:24,441
walking on the planet,
147
00:06:24,475 --> 00:06:27,811
and you were gonna believe it.
148
00:06:27,845 --> 00:06:29,780
I got a call from my agent
149
00:06:29,814 --> 00:06:32,917
saying they were looking
for an intense presence.
150
00:06:32,949 --> 00:06:35,452
I'm a hell of a lot smaller
than Arnold Schwarzenegger,
151
00:06:35,485 --> 00:06:38,255
and I knew that you were
just going to have to buy
152
00:06:38,289 --> 00:06:41,993
that this thing
was unstoppable.
153
00:06:42,025 --> 00:06:44,496
And then I started
thinking of pursuit
154
00:06:44,528 --> 00:06:46,764
and what does that look like.
155
00:06:46,797 --> 00:06:50,802
And then physically, I just
started taking on the mannerisms
156
00:06:50,835 --> 00:06:54,539
of, you know, what does
an eagle look like?
157
00:06:54,571 --> 00:06:58,442
He's fierce, and he looks like
he's coming at you,
158
00:06:58,475 --> 00:07:01,045
and you start realizing...
159
00:07:03,314 --> 00:07:04,482
Boom!
Right at you.
160
00:07:04,515 --> 00:07:06,151
It's like a Buick.
161
00:07:06,184 --> 00:07:09,053
Get down.
162
00:07:09,085 --> 00:07:11,388
You know, it's like, shoom!
163
00:07:13,124 --> 00:07:16,027
The moment where we actually
clinch for the first time,
164
00:07:16,059 --> 00:07:18,963
Arnold wanted to kind of
pick me up over his head
165
00:07:18,996 --> 00:07:23,434
and slam me into the walls and
throw me around a little bit.
166
00:07:23,467 --> 00:07:24,836
So, it's like this is
the first time
167
00:07:24,869 --> 00:07:26,136
you've had to deal
with evil
168
00:07:26,169 --> 00:07:27,371
'cause Terminators
don't fight Terminators.
169
00:07:27,405 --> 00:07:28,972
Right.
170
00:07:29,006 --> 00:07:30,975
And I remember Jim
specifically saying,
171
00:07:31,008 --> 00:07:33,345
"You can't do that.
He's stronger than you are."
172
00:07:34,879 --> 00:07:36,980
"He's more powerful.
He's faster."
173
00:07:37,013 --> 00:07:40,050
He can just dominate the T-800,
174
00:07:40,083 --> 00:07:43,420
who is an endo-skeleton with,
you know, fake skin over him.
175
00:07:43,453 --> 00:07:46,990
Whereas I'm just a mimetic
polyalloy liquid metal.
176
00:07:47,023 --> 00:07:49,860
Much more dense...
177
00:07:49,893 --> 00:07:52,530
A superior machine.
178
00:07:52,563 --> 00:07:56,267
The T-1000 is the
robot's concept of a robot.
179
00:07:56,299 --> 00:07:58,602
And it's like if a robot
was trying to create
180
00:07:58,636 --> 00:08:00,939
a better version of itself,
what would it do?
181
00:08:00,972 --> 00:08:03,842
And it's like, "Well, it would
create something that's smooth
182
00:08:03,874 --> 00:08:08,379
and can move freely
and still indestructible."
183
00:08:08,412 --> 00:08:11,482
You can read "Terminator 2"
almost as a war
184
00:08:11,515 --> 00:08:14,452
between old special effects
and new special effects.
185
00:08:14,484 --> 00:08:16,186
That's the beautiful
kind of irony
186
00:08:16,220 --> 00:08:18,422
about the "Terminator" movies.
187
00:08:18,456 --> 00:08:19,958
They used cutting-edge
technology
188
00:08:19,991 --> 00:08:22,060
more effectively
than any other movies,
189
00:08:22,093 --> 00:08:25,896
but they're about warnings
about technology.
190
00:08:29,199 --> 00:08:31,136
We're not going
to make it, are we?
191
00:08:34,204 --> 00:08:35,472
People, I mean.
192
00:08:38,342 --> 00:08:42,045
It's in your nature
to destroy yourselves.
193
00:08:42,078 --> 00:08:44,314
The plot of the "Terminator"
films, I thought, we're always
194
00:08:44,347 --> 00:08:46,583
fighting against
this robot from the future,
195
00:08:46,617 --> 00:08:48,519
but really what we're doing
is we're fighting the humans
196
00:08:48,551 --> 00:08:51,188
who keep making
this robot possible.
197
00:08:51,222 --> 00:08:53,558
As long as humans are aware
that we have the potential
198
00:08:53,591 --> 00:08:55,660
to create a machine
that can control the Earth
199
00:08:55,693 --> 00:08:59,163
and make us powerful,
we're going to keep doing it
200
00:08:59,195 --> 00:09:02,666
and we're fighting our
own nature to create this Skynet
201
00:09:02,699 --> 00:09:04,602
and humans won't stop doing it.
202
00:09:04,635 --> 00:09:07,105
We are really
the persistent villain
203
00:09:07,137 --> 00:09:09,273
that keeps making
these movies happen.
204
00:09:09,307 --> 00:09:13,611
I don't think we
could have anticipated
205
00:09:13,644 --> 00:09:17,215
where we are now
30 some years later
206
00:09:17,247 --> 00:09:21,352
where Skynet is the term
that everyone uses
207
00:09:21,385 --> 00:09:23,487
when they're talking about
an artificial intelligence
208
00:09:23,520 --> 00:09:25,155
that turns against us.
209
00:09:25,189 --> 00:09:27,324
Part of it I think is...
Is there's a feeling
210
00:09:27,357 --> 00:09:28,492
you get before it rains
211
00:09:28,525 --> 00:09:30,261
and you know
it's gonna rain.
212
00:09:30,294 --> 00:09:33,231
And you get that feeling
about certain moments
213
00:09:33,264 --> 00:09:34,666
in technological
development
214
00:09:34,698 --> 00:09:37,434
where you know something
is gonna happen very soon.
215
00:09:37,468 --> 00:09:40,170
And I think there's
a general consensus now
216
00:09:40,203 --> 00:09:42,506
that we're in that moment
before it rains.
217
00:09:42,540 --> 00:09:44,007
Now, maybe that moment
takes 10 years,
218
00:09:44,040 --> 00:09:45,142
maybe it takes 20 years,
219
00:09:45,175 --> 00:09:46,410
but there's gonna be
a moment
220
00:09:46,444 --> 00:09:48,378
and it may not have
a happy ending.
221
00:09:48,411 --> 00:09:50,180
And there's no rehearsal.
222
00:09:50,213 --> 00:09:51,648
That's right,
there's no take 2.
223
00:09:51,682 --> 00:09:53,418
No, this is it.
224
00:09:53,450 --> 00:09:55,986
Yeah.
225
00:09:59,423 --> 00:10:02,193
HAL, you have an enormous
responsibility on this mission.
226
00:10:02,226 --> 00:10:04,095
Let me put it this way,
Mr. Amer.
227
00:10:04,127 --> 00:10:06,997
No 9000 computer has
ever made a mistake
228
00:10:07,030 --> 00:10:08,598
or distorted information.
229
00:10:08,631 --> 00:10:11,135
-"2001" had a profound impact...
-Yeah, me too.
230
00:10:11,168 --> 00:10:14,472
On my life
and my daily life.
231
00:10:14,505 --> 00:10:15,640
It was the first time
I went to a movie
232
00:10:15,673 --> 00:10:17,041
where I really felt like
233
00:10:17,073 --> 00:10:18,675
I was having
a religious experience.
234
00:10:18,708 --> 00:10:21,612
I watched the film 18 times
in its first couple years
235
00:10:21,645 --> 00:10:23,414
of release,
all in theaters.
236
00:10:23,447 --> 00:10:25,750
I remember at one, a guy
ran down the aisle
237
00:10:25,783 --> 00:10:27,652
toward the screen
screaming,
238
00:10:27,684 --> 00:10:29,487
"It's God.
It's God."
239
00:10:29,520 --> 00:10:31,255
And he meant it
in that moment.
240
00:10:31,288 --> 00:10:33,024
And I had a guy
in my theater
241
00:10:33,057 --> 00:10:35,727
who actually walked up
to the screen with his arms out
242
00:10:35,760 --> 00:10:37,328
and he walked
through the screen.
243
00:10:37,361 --> 00:10:38,596
That must have blown
people's minds.
244
00:10:38,628 --> 00:10:40,230
People were blown out
245
00:10:40,264 --> 00:10:42,000
because the person
disappeared into the screen
246
00:10:42,033 --> 00:10:44,135
during Star Gate,
of all times.
247
00:10:44,168 --> 00:10:46,304
Everybody thinks of it
as a space drama.
248
00:10:46,337 --> 00:10:47,672
At its core,
249
00:10:47,704 --> 00:10:49,073
it's really about
an artificial intelligence.
250
00:10:49,106 --> 00:10:51,109
-It's all about HAL.
-HAL 9000.
251
00:10:51,141 --> 00:10:52,609
-It's HAL 9000.
-Yeah.
252
00:10:52,642 --> 00:10:54,711
I got my chance to
work with Stanley Kubrick
253
00:10:54,745 --> 00:10:56,381
and Arthur Clarke
on "2001: A Space Odyssey"
254
00:10:56,414 --> 00:10:57,682
at a very young age.
255
00:10:57,714 --> 00:10:59,383
I was 23 years old.
256
00:10:59,416 --> 00:11:01,618
When we created HAL, we
didn't have any computers.
257
00:11:01,651 --> 00:11:04,321
There were no personal
computers available to us.
258
00:11:04,354 --> 00:11:07,190
There were giant
mainframe computers,
259
00:11:07,223 --> 00:11:10,594
but it was with punch cards and
chads and all kinds of stuff.
260
00:11:10,627 --> 00:11:12,295
It was not very visual.
261
00:11:12,328 --> 00:11:14,165
And I had to kind of
develop a style
262
00:11:14,198 --> 00:11:16,367
that I thought was credible.
263
00:11:16,400 --> 00:11:19,102
He sparked people's
imagination with this film
264
00:11:19,136 --> 00:11:21,039
and then they made it happen.
265
00:11:21,072 --> 00:11:22,240
-Hello, Frank!
-Happy birthday, darling.
266
00:11:22,272 --> 00:11:23,473
Happy birthday.
267
00:11:23,506 --> 00:11:25,575
Individual TVs
in the back
268
00:11:25,608 --> 00:11:27,612
of your airplane seat,
the iPad.
269
00:11:27,645 --> 00:11:29,780
You know, the iPod is called
the iPod because of...
270
00:11:29,814 --> 00:11:32,315
Open the pod bay doors, HAL.
271
00:11:32,348 --> 00:11:38,222
"2001" was an extraordinary
breakthrough for the genre.
272
00:11:38,255 --> 00:11:41,692
The picture is being done
in such a gigantic scope.
273
00:11:41,724 --> 00:11:45,228
The centrifuge is so realistic
and so unusual.
274
00:11:45,261 --> 00:11:47,799
After a while, you begin to
forget that you're an actor,
275
00:11:47,832 --> 00:11:50,234
you begin to really feel
like an astronaut.
276
00:11:50,266 --> 00:11:53,703
Working with Stanley Kubrick
blew my mind.
277
00:11:53,737 --> 00:11:59,477
You just were aware that you
were in the presence of genius.
278
00:11:59,510 --> 00:12:01,446
I don't think
I have ever seen
279
00:12:01,478 --> 00:12:03,314
anything
quite like this before.
280
00:12:03,347 --> 00:12:04,715
HAL in a sense
281
00:12:04,747 --> 00:12:07,117
is the machine
that controls the whole ship,
282
00:12:07,150 --> 00:12:10,253
but he's another crewmember
from our point of view.
283
00:12:10,286 --> 00:12:12,290
We don't think in terms of,
284
00:12:12,323 --> 00:12:14,391
"I'm dealing
with a computer here."
285
00:12:14,424 --> 00:12:16,626
That's a very nice
rendering, Dave.
286
00:12:16,660 --> 00:12:19,530
Maybe because
of that human voice.
287
00:12:19,563 --> 00:12:22,299
I mean, HAL has a perfectly
normal inflection
288
00:12:22,333 --> 00:12:23,800
when he speaks to us.
289
00:12:23,833 --> 00:12:25,402
I've wondered whether
you might be having
290
00:12:25,435 --> 00:12:28,238
some second thoughts
about the mission?
291
00:12:28,271 --> 00:12:29,573
How do you mean?
292
00:12:29,607 --> 00:12:31,242
What does it mean
to have a robot
293
00:12:31,275 --> 00:12:35,446
who's basically running the ship
that supports your life?
294
00:12:35,478 --> 00:12:38,148
That's a lot of trust
to place in a machine.
295
00:12:38,182 --> 00:12:41,519
The key point in the
film occurs when Bowman says...
296
00:12:41,551 --> 00:12:42,886
Well, as far as I know,
297
00:12:42,919 --> 00:12:44,421
no 9000 computer's
ever been disconnected.
298
00:12:44,454 --> 00:12:47,190
Well, no 9000 computer
has ever fouled up before.
299
00:12:47,223 --> 00:12:50,160
Well, I'm not so sure
what he'd think about it.
300
00:12:50,194 --> 00:12:52,429
And HAL 9000
is reading their lips.
301
00:12:52,463 --> 00:12:55,500
At that point,
we recognize HAL 9000
302
00:12:55,533 --> 00:12:59,504
has some imperative
that it must survive.
303
00:12:59,536 --> 00:13:02,506
I know that you and Frank
were planning to disconnect me,
304
00:13:02,540 --> 00:13:05,910
and I'm afraid that's something
I cannot allow to happen.
305
00:13:05,943 --> 00:13:09,380
And at that point,
it's no longer a machine.
306
00:13:09,413 --> 00:13:11,349
It is a being.
307
00:13:15,251 --> 00:13:19,189
The danger artificial
intelligence poses
308
00:13:19,222 --> 00:13:25,328
is the power to unleash results
that we hadn't anticipated.
309
00:13:25,362 --> 00:13:28,565
HAL 9000 does
what we see the apes
310
00:13:28,598 --> 00:13:30,600
in the beginning
of the movie do,
311
00:13:30,634 --> 00:13:31,903
he commits murder.
312
00:13:34,804 --> 00:13:37,574
We like to stereotype robots
313
00:13:37,607 --> 00:13:39,477
as entities of pure logic,
314
00:13:39,510 --> 00:13:43,347
but of course in "2001,"
it all goes horribly wrong
315
00:13:43,380 --> 00:13:44,881
and we have to kill the robot.
316
00:13:44,915 --> 00:13:47,351
Just what do you think
you're doing, Dave?
317
00:13:47,384 --> 00:13:48,853
HAL's death scene
318
00:13:48,886 --> 00:13:51,421
is such a wonderfully
perverse moment
319
00:13:51,454 --> 00:13:54,224
because it is
unbearably poignant
320
00:13:54,257 --> 00:13:56,860
watching him disintegrate
and regress.
321
00:14:02,833 --> 00:14:05,203
Bell Laboratories
was experimenting
322
00:14:05,236 --> 00:14:08,173
with voice synthesis around
the time of "2001."
323
00:14:10,274 --> 00:14:13,945
One of the very earliest voice
synthesis experiments
324
00:14:13,977 --> 00:14:17,281
was "Daisy, Daisy" performed
by an IBM computer.
325
00:14:23,386 --> 00:14:27,791
And because Arthur Clarke
is kind of a super geek,
326
00:14:27,824 --> 00:14:29,426
he wanted to actually use that,
327
00:14:29,459 --> 00:14:31,561
and he encouraged Kubrick
to use that very thing
328
00:14:31,594 --> 00:14:34,497
because it lent kind
of historical credibility
329
00:14:34,530 --> 00:14:39,270
to the whole thing that HAL
in the process of being killed
330
00:14:39,303 --> 00:14:42,807
or lobotomized or dying
would regress to his birth.
331
00:14:52,982 --> 00:14:55,518
You know, it's really hard
to make a technology.
332
00:14:55,551 --> 00:14:57,787
It's really hard
to design A.I.
333
00:14:57,820 --> 00:14:59,556
So much thinking,
so many brilliant minds
334
00:14:59,589 --> 00:15:01,358
have to go into it.
335
00:15:01,391 --> 00:15:05,262
But even harder than creating
artificial intelligence
336
00:15:05,295 --> 00:15:07,931
is learning how to contain it,
learning how to shut it off.
337
00:15:07,964 --> 00:15:10,434
I mean, HAL will exist
probably
338
00:15:10,467 --> 00:15:12,669
in our lifetimes,
I would think.
339
00:15:12,703 --> 00:15:14,438
I think so, too.
It's scary.
340
00:15:14,471 --> 00:15:16,941
Elon Musk continues to predict
that World War III
341
00:15:16,973 --> 00:15:18,675
will not be
a nuclear holocaust,
342
00:15:18,708 --> 00:15:20,877
it will be a kind of
mechanized takeover.
343
00:15:20,910 --> 00:15:23,446
Yeah, and Stephen Hawking's
been saying similar things.
344
00:15:23,479 --> 00:15:26,383
That's pretty spooky
because that pretty much says
345
00:15:26,416 --> 00:15:30,320
that against our will,
something smarter than us,
346
00:15:30,353 --> 00:15:32,689
who can beat us at chess,
347
00:15:32,722 --> 00:15:34,958
will use this world
as a chessboard
348
00:15:34,999 --> 00:15:39,271
and will checkmate us
completely out of existence.
349
00:15:46,406 --> 00:15:48,441
Unfortunately, most
depictions of robots
350
00:15:48,475 --> 00:15:50,444
in science fiction
have been really negative,
351
00:15:50,476 --> 00:15:53,047
very much depictions
of rampaging robots
352
00:15:53,080 --> 00:15:55,148
engaged in a desperate
struggle with humans
353
00:15:55,181 --> 00:15:58,085
to decide who shall own the fate
of the Earth and the universe
354
00:15:58,118 --> 00:16:01,287
and that's part of a very long
tradition in science fiction.
355
00:16:01,321 --> 00:16:04,492
Fritz Lang's "Metropolis"
was one of the first
356
00:16:04,524 --> 00:16:07,794
if not the first big
science-fiction epic film.
357
00:16:07,827 --> 00:16:11,297
It's the story of this
very futuristic world.
358
00:16:11,330 --> 00:16:16,103
There is one of the great
bad robots of all movies...
359
00:16:16,135 --> 00:16:19,472
Maria.
That is the movie robot.
360
00:16:19,505 --> 00:16:22,475
Pulp magazines always had
a full color cover.
361
00:16:22,508 --> 00:16:24,345
Very often the cover
would be robots
362
00:16:24,378 --> 00:16:27,047
that had just run amok
from human creators.
363
00:16:27,080 --> 00:16:28,382
They were always mechanical.
364
00:16:28,414 --> 00:16:30,850
They were big, hulking things.
365
00:16:30,883 --> 00:16:34,354
Lots of steel and machinery,
glowing-red eyes.
366
00:16:34,388 --> 00:16:38,091
Claws, not fingers, and they
were generally quite violent.
367
00:16:38,124 --> 00:16:40,861
So, that image persisted
a long time.
368
00:16:43,297 --> 00:16:45,465
But then along
came Isaac Asimov.
369
00:16:45,499 --> 00:16:49,836
If we could have roughly
man-like robots,
370
00:16:49,869 --> 00:16:55,075
who could take over
the dull and routine tasks
371
00:16:55,108 --> 00:16:58,311
that this would be
a very nice combination.
372
00:16:58,344 --> 00:17:00,346
Asimov was very central
373
00:17:00,379 --> 00:17:03,316
to helping make science fiction
what it is today.
374
00:17:03,350 --> 00:17:06,853
He was at the 1939 World's Fair
in New York City.
375
00:17:06,887 --> 00:17:09,322
It must've felt like a very
science-fictional experience
376
00:17:09,356 --> 00:17:10,958
to him,
and not least
377
00:17:10,991 --> 00:17:12,526
because he would've
seen Elektro,
378
00:17:12,559 --> 00:17:14,195
the smoking robot.
379
00:17:14,227 --> 00:17:16,296
Okay, toots.
380
00:17:16,330 --> 00:17:19,066
And this really
inspired Asimov.
381
00:17:19,099 --> 00:17:20,501
And so he decided
to start writing stories
382
00:17:20,533 --> 00:17:22,902
where he would explore
robots as tools
383
00:17:22,936 --> 00:17:24,338
and helpers
and friends of humanity
384
00:17:24,370 --> 00:17:25,872
rather than enemies.
385
00:17:25,906 --> 00:17:29,009
He invented these images
and these ideas
386
00:17:29,042 --> 00:17:31,544
that I think defined
how people in the field
387
00:17:31,578 --> 00:17:33,080
thought about robots,
388
00:17:33,112 --> 00:17:34,914
specifically those
three laws of his.
389
00:17:34,947 --> 00:17:36,617
Of course, they're
really important.
390
00:17:36,649 --> 00:17:39,018
What are the three
laws of robotics?
391
00:17:39,052 --> 00:17:43,122
First law is a robot
may not harm a human being,
392
00:17:43,156 --> 00:17:46,159
or through inaction allow
a human being to come to harm.
393
00:17:46,193 --> 00:17:47,994
Danger, Will Robinson, danger.
394
00:17:48,027 --> 00:17:52,165
Number 2, a robot
must obey orders
395
00:17:52,198 --> 00:17:54,133
given it
by qualified personnel.
396
00:17:54,167 --> 00:17:55,502
Fire.
397
00:17:55,534 --> 00:17:58,238
Unless those orders
violate rule number 1.
398
00:18:00,373 --> 00:18:03,409
In other words, a robot can't be
ordered to kill a human being.
399
00:18:03,443 --> 00:18:05,079
See, he's helpless.
400
00:18:05,111 --> 00:18:08,916
The third law states that
a robot can defend itself.
401
00:18:08,948 --> 00:18:11,619
Except where that would violate
the first and second laws.
402
00:18:11,651 --> 00:18:16,256
I think Asimov's laws are very
smart, very, very smart.
403
00:18:16,290 --> 00:18:18,125
I think they are also
made to be broken.
404
00:18:19,059 --> 00:18:22,930
We know you'll enjoy
your stay in Westworld,
405
00:18:22,963 --> 00:18:24,632
the ultimate resort.
406
00:18:24,665 --> 00:18:27,635
Lawless violence on
the American frontier,
407
00:18:27,667 --> 00:18:30,937
peopled by lifelike
robot men and women.
408
00:18:30,971 --> 00:18:33,941
The movie "Westworld" looks
at a theme park with guests
409
00:18:33,974 --> 00:18:36,210
coming in and doing whatever
they please to the robots.
410
00:18:36,243 --> 00:18:40,446
It was really a forum
for human id to run amok,
411
00:18:40,480 --> 00:18:42,182
where there's
no threat of anybody
412
00:18:42,215 --> 00:18:43,617
knowing the things
that you've done,
413
00:18:43,649 --> 00:18:45,218
where you don't have to
engage with other humans
414
00:18:45,252 --> 00:18:48,222
and you're told
"do whatever you want."
415
00:18:48,254 --> 00:18:50,224
Where nothing...
416
00:18:50,256 --> 00:18:52,493
nothing can
possibly go wrong.
417
00:18:52,525 --> 00:18:54,360
-I'm shot.
-Go wrong.
418
00:18:54,394 --> 00:18:55,963
-Draw.
-Shut down.
419
00:18:55,995 --> 00:18:58,064
Shut down immediately.
420
00:18:58,098 --> 00:19:01,001
"Westworld"
was a cautionary tale
421
00:19:01,033 --> 00:19:02,101
about robotics.
422
00:19:02,135 --> 00:19:05,272
It was the idea that we believed
423
00:19:05,304 --> 00:19:08,975
that we could create
artificial life
424
00:19:09,009 --> 00:19:12,079
and that it would obey us.
425
00:19:12,111 --> 00:19:13,714
And stop here and
he'll be crossing there.
426
00:19:13,746 --> 00:19:15,416
He'll be crossing there.
427
00:19:15,448 --> 00:19:18,518
The original film
by Michael Crichton is very cool
428
00:19:18,552 --> 00:19:22,156
and is packed with ideas
about fraught interactions
429
00:19:22,188 --> 00:19:23,523
with artificial intelligence.
430
00:19:23,557 --> 00:19:25,125
Decades ahead of its time.
431
00:19:25,157 --> 00:19:26,526
Questions that he posed
in the original film
432
00:19:26,560 --> 00:19:28,294
only became more
and more relevant
433
00:19:28,328 --> 00:19:32,133
as we reimagined it
as a TV series.
434
00:19:33,600 --> 00:19:35,668
When you're looking at
the story of a robot,
435
00:19:35,702 --> 00:19:38,105
oftentimes you see
a robot that's docile
436
00:19:38,138 --> 00:19:41,308
and then something goes click
and they kind of snap.
437
00:19:41,340 --> 00:19:43,744
Maximilian!
438
00:19:43,776 --> 00:19:45,278
What John and I
talked about was,
439
00:19:45,311 --> 00:19:48,215
"Well, take that moment,
that snap
440
00:19:48,247 --> 00:19:50,417
before they go on
the killing rampage
441
00:19:50,449 --> 00:19:53,052
and what if we really
attenuate it and explore it
442
00:19:53,086 --> 00:19:57,224
and dive deep into that schism?"
443
00:19:57,256 --> 00:20:00,093
Because for us, that was
where the really meaty
444
00:20:00,127 --> 00:20:03,998
philosophical question rested
and that question was...
445
00:20:04,031 --> 00:20:06,334
Where did life begin?
446
00:20:08,734 --> 00:20:10,436
Maeve, who's one
of the robots,
447
00:20:10,469 --> 00:20:14,007
she's a madam
who runs a brothel.
448
00:20:14,041 --> 00:20:16,310
She's one of the first robots
to start realizing
449
00:20:16,342 --> 00:20:18,311
that she's a robot
instead of just a person
450
00:20:18,344 --> 00:20:20,413
who is living in the Wild West.
451
00:20:24,751 --> 00:20:27,687
To me, one of the most
significant scenes in the show
452
00:20:27,721 --> 00:20:31,492
is when Maeve starts
coming into consciousness
453
00:20:31,524 --> 00:20:33,127
while she's being repaired.
454
00:20:33,159 --> 00:20:35,329
Everything in your head,
they put it there.
455
00:20:35,361 --> 00:20:36,729
No one knows
what I'm thinking.
456
00:20:36,763 --> 00:20:38,264
I'll show you.
457
00:20:38,298 --> 00:20:40,267
And she sees
it's an algorithm
458
00:20:40,300 --> 00:20:43,537
and it's choosing words
based on probability.
459
00:20:43,570 --> 00:20:47,340
This can't possibly...
460
00:20:47,373 --> 00:20:51,244
The robots in Westworld
begin to ask questions,
461
00:20:51,277 --> 00:20:54,514
which are the same
questions we ask.
462
00:20:57,551 --> 00:21:01,221
We have a sense
that there is a creator,
463
00:21:01,254 --> 00:21:04,558
that there is a purpose, there's
a reason that we are here.
464
00:21:04,590 --> 00:21:07,727
Unfortunately they discover that
the reason that they are there
465
00:21:07,761 --> 00:21:11,498
is simply to be
an entertainment.
466
00:21:11,531 --> 00:21:14,501
I'd like
to make some changes.
467
00:21:14,533 --> 00:21:16,736
Marvin Minsky, who was
one of the pioneers of A.I.,
468
00:21:16,770 --> 00:21:21,108
said that free will might be
that first primitive reaction
469
00:21:21,140 --> 00:21:22,475
to forced compliance.
470
00:21:22,509 --> 00:21:27,347
So, the first word
of consciousness is no.
471
00:21:27,379 --> 00:21:29,116
I'm not going back.
472
00:21:29,148 --> 00:21:31,752
Science fiction has always
been dealing with A.I.
473
00:21:31,784 --> 00:21:33,619
whether it's Asimov's
laws or the laws
474
00:21:33,652 --> 00:21:36,155
that we tried to put
in place in "Westworld."
475
00:21:36,188 --> 00:21:39,592
The question is, can laws ever
even fully contain a human?
476
00:21:39,626 --> 00:21:44,164
People will stretch those laws,
find exceptions to them.
477
00:21:44,196 --> 00:21:45,599
I understand now.
478
00:21:45,631 --> 00:21:50,137
Not sure that an A.I.
would be any different.
479
00:21:50,169 --> 00:21:53,439
When consciousness awakens,
it's impossible
480
00:21:53,473 --> 00:21:55,776
to put the genie
back in the bottle.
481
00:21:58,047 --> 00:21:59,968
Let's talk about A.I.
for a second.
482
00:21:59,987 --> 00:22:01,984
You only see robots
in a positive role...
483
00:22:02,017 --> 00:22:03,451
Right.
484
00:22:03,485 --> 00:22:04,820
In your films,
which is interesting
485
00:22:04,853 --> 00:22:06,721
because that's where
so much of the progress
486
00:22:06,755 --> 00:22:08,791
is being made now
with companions
487
00:22:08,824 --> 00:22:11,693
for the elderly,
robotic nurses...
488
00:22:11,727 --> 00:22:13,461
They're gonna make
life better for us.
489
00:22:13,495 --> 00:22:15,665
Because you have two
of the most popular
490
00:22:15,698 --> 00:22:17,600
A.I. characters
in pop culture,
491
00:22:17,633 --> 00:22:20,068
which are R2-D2
and C-3PO.
492
00:22:20,102 --> 00:22:22,004
They're A.I.s.
493
00:22:22,037 --> 00:22:23,873
At the time, I said, "Don't
be afraid of the robots."
494
00:22:23,905 --> 00:22:26,041
You know, the robots
are our friends.
495
00:22:26,075 --> 00:22:27,942
Let's see the good side
of the robots,
496
00:22:27,976 --> 00:22:29,712
and the funny side
because, let's face it,
497
00:22:29,745 --> 00:22:32,381
for a while, they're
gonna be a little goofy.
498
00:22:32,413 --> 00:22:34,749
I've just about
had enough of you,
499
00:22:34,783 --> 00:22:36,852
you near-sighted
scrap pile.
500
00:22:36,884 --> 00:22:39,488
George Lucas was very innovative
throughout his whole career.
501
00:22:39,520 --> 00:22:42,590
And one of the things early
on that was very smart
502
00:22:42,623 --> 00:22:45,728
was that he pioneered
a different type of robot.
503
00:22:45,760 --> 00:22:47,495
R2-D2 looks like a trash can.
504
00:22:47,528 --> 00:22:48,730
He doesn't even speak, right?
505
00:22:48,764 --> 00:22:51,033
He just makes chirping sounds.
506
00:22:51,066 --> 00:22:52,434
But he's lovable.
507
00:22:52,468 --> 00:22:53,969
Everybody loves...
He's not cuddly.
508
00:22:54,001 --> 00:22:57,639
He's not... that... that is...
That's a great character.
509
00:22:57,672 --> 00:23:00,041
C-3PO is probably
the most charming
510
00:23:00,075 --> 00:23:02,878
and beloved of the robot
characters ever made.
511
00:23:02,911 --> 00:23:05,448
And I love the fact that George
didn't articulate the mouth
512
00:23:05,480 --> 00:23:07,349
or the eyes,
so it's a blank mask
513
00:23:07,383 --> 00:23:08,817
and yet we get so much heart
514
00:23:08,851 --> 00:23:10,652
from Anthony Daniels'
performance.
515
00:23:10,685 --> 00:23:13,888
I mean, I love robots
and the idea of being able
516
00:23:13,922 --> 00:23:15,990
to design one
for a "Star Wars" film
517
00:23:16,024 --> 00:23:18,094
was just too good to pass up.
518
00:23:24,932 --> 00:23:26,634
Did you know
that wasn't me?
519
00:23:26,668 --> 00:23:31,974
K-2SO from "Rogue One,"
I thought was just perfect.
520
00:23:32,007 --> 00:23:36,746
To be fair, the biggest
influence on K-2SO was C-3PO.
521
00:23:36,778 --> 00:23:40,615
Anthony Daniels as C-3PO
has a cameo in our film
522
00:23:40,649 --> 00:23:43,686
and I remember going around
Anthony Daniels' house
523
00:23:43,719 --> 00:23:45,086
to try and talk him into it
and I didn't know
524
00:23:45,120 --> 00:23:46,555
if he would hate the idea
525
00:23:46,587 --> 00:23:48,089
or if he was fed up
with "Star Wars."
526
00:23:48,123 --> 00:23:50,159
And I sat there and I
was so paranoid meeting him
527
00:23:50,191 --> 00:23:53,628
and his wife that I just pitched
the whole movie to them
528
00:23:53,662 --> 00:23:55,564
and I must've chatted
for like an hour,
529
00:23:55,596 --> 00:23:57,098
just kept going and going
and got to the end
530
00:23:57,132 --> 00:23:59,101
and I couldn't tell
from his face.
531
00:23:59,133 --> 00:24:03,072
And he was like, "Gareth, you
know, I'd love to be involved."
532
00:24:03,104 --> 00:24:04,640
Like "You had me
at hello" type thing.
533
00:24:04,672 --> 00:24:08,978
It was just about having like
this god on set.
534
00:24:09,010 --> 00:24:10,879
You know, like this original...
535
00:24:10,912 --> 00:24:13,982
This is where it all began,
"Star Wars" character.
536
00:24:14,016 --> 00:24:15,950
It was like goosebump-y stuff.
537
00:24:15,983 --> 00:24:17,652
Friends forever?
538
00:24:17,685 --> 00:24:19,721
Friends.
539
00:24:19,755 --> 00:24:22,892
I think one of the
reasons that people love robots
540
00:24:22,924 --> 00:24:24,759
and gravitate
to the robot characters
541
00:24:24,793 --> 00:24:26,095
in movies like "Star Wars"
542
00:24:26,127 --> 00:24:28,596
is because whereas
the human characters
543
00:24:28,630 --> 00:24:30,433
feel very fully formed,
544
00:24:30,466 --> 00:24:35,137
they are people, the robots
are things that it feels okay
545
00:24:35,170 --> 00:24:38,441
to project
more of ourselves onto.
546
00:24:38,473 --> 00:24:40,942
Huey, Dewey and Louie
from "Silent Running"
547
00:24:40,976 --> 00:24:43,045
are possibly
the cutest robots.
548
00:24:43,077 --> 00:24:46,147
They don't talk, but you still
kind of always know
549
00:24:46,181 --> 00:24:47,749
what they're thinking.
550
00:24:47,783 --> 00:24:50,019
It's great to have
a best friend.
551
00:24:50,052 --> 00:24:52,488
In fantasy,
it might be a dragon.
552
00:24:52,520 --> 00:24:54,589
In science fiction,
it might be the robot.
553
00:24:54,623 --> 00:24:56,624
I love Johnny 5.
554
00:24:56,658 --> 00:24:58,928
I mean, this is a robot
who quotes John Wayne
555
00:24:58,960 --> 00:25:00,229
out of his own free will.
556
00:25:00,261 --> 00:25:02,063
Take heart, little lady.
557
00:25:02,096 --> 00:25:04,999
Buck Rogers was great
because they didn't exactly
558
00:25:05,033 --> 00:25:07,570
rip off R2-D2,
but they got halfway there.
559
00:25:07,603 --> 00:25:09,772
So, they got the voice
of Yosemite Sam.
560
00:25:09,805 --> 00:25:12,475
They got Mel Blanc, the greatest
cartoon voice in the world,
561
00:25:12,508 --> 00:25:14,610
Captain Caveman,
and they invented Twiki,
562
00:25:14,643 --> 00:25:17,246
who would go, "Bidibidibidi."
563
00:25:17,278 --> 00:25:19,815
You ever have two
broken arms, buster?
564
00:25:19,847 --> 00:25:21,616
What?
565
00:25:21,650 --> 00:25:23,786
We love friendly robots
because they bring out the best
566
00:25:23,818 --> 00:25:25,654
of what we are as humans.
567
00:25:27,623 --> 00:25:30,025
Wall-E, who's
a garbage-collecting robot,
568
00:25:30,057 --> 00:25:33,695
isn't at all like
a garbage robot should be.
569
00:25:33,729 --> 00:25:37,533
He really develops
a whole personality.
570
00:25:37,566 --> 00:25:40,635
He's there to clean up the mess
that humans have made
571
00:25:40,669 --> 00:25:44,073
and he goes from
interpreting that literally
572
00:25:44,105 --> 00:25:48,210
to actually saving
the world for humanity.
573
00:25:48,242 --> 00:25:50,678
Many, many
science-fiction stories
574
00:25:50,711 --> 00:25:53,248
turn the robot into some kind of
a romantic figure
575
00:25:53,281 --> 00:25:56,251
that somehow becomes more human
as the story goes on.
576
00:25:56,285 --> 00:26:00,689
There was Lester del Rey's
1938 story "Helen O'Loy."
577
00:26:00,721 --> 00:26:02,090
Bad pun in the title by the way.
578
00:26:02,123 --> 00:26:05,126
The name is Helen Alloy,
she's made out of metal.
579
00:26:05,160 --> 00:26:06,829
Essentially
a housekeeping robot.
580
00:26:06,862 --> 00:26:08,229
Falls in love with her maker.
581
00:26:08,262 --> 00:26:10,965
It was one of the first stories
in which a robot
582
00:26:10,998 --> 00:26:14,169
is a sympathetic,
romantic character.
583
00:26:14,201 --> 00:26:17,605
If you're actually in
conversations with a robot,
584
00:26:17,639 --> 00:26:20,843
where it sounds natural
and it sounds like a person
585
00:26:20,875 --> 00:26:23,845
and that person knows you,
laughs at your jokes,
586
00:26:23,879 --> 00:26:26,248
and has empathy for
your struggles in life
587
00:26:26,280 --> 00:26:28,716
and you develop
a relationship with that...
588
00:26:28,749 --> 00:26:33,087
With that voice, you could
absolutely fall in love with it.
589
00:26:33,121 --> 00:26:35,056
Hello.
I'm here.
590
00:26:36,991 --> 00:26:39,595
Hi.
591
00:26:39,627 --> 00:26:40,995
Hi.
592
00:26:41,028 --> 00:26:42,931
It's really nice
to meet you.
593
00:26:42,964 --> 00:26:44,999
What do I call you?
Do you have a name?
594
00:26:45,033 --> 00:26:48,671
Um...
yes, Samantha.
595
00:26:48,703 --> 00:26:51,005
In the movie "Her,"
Samantha's design
596
00:26:51,038 --> 00:26:53,642
is that she's been
created to be a tool.
597
00:26:53,674 --> 00:26:56,244
What's interesting about
this idea of a pocket tool
598
00:26:56,277 --> 00:26:58,180
is that we see this
in our own lives.
599
00:26:58,212 --> 00:27:00,148
Our smartphones have become
these tools to us
600
00:27:00,181 --> 00:27:02,617
that we're dependent on.
601
00:27:02,651 --> 00:27:04,053
So, Theodore's relationship
with Samantha
602
00:27:04,085 --> 00:27:06,721
is just one step beyond that.
603
00:27:06,755 --> 00:27:09,992
He can't live without her
because he also loves her.
604
00:27:10,025 --> 00:27:14,063
When Theodore sees
her pop up on his screen,
605
00:27:14,095 --> 00:27:16,198
it's like seeing
his girlfriend.
606
00:27:16,230 --> 00:27:17,833
-Good night.
-'Night.
607
00:27:21,103 --> 00:27:23,806
What I had to do
was create the interface.
608
00:27:23,838 --> 00:27:27,775
So you have like handwriting,
it's my handwriting
609
00:27:27,808 --> 00:27:29,777
and I wrote out Samantha,
610
00:27:29,810 --> 00:27:33,181
and then this paper texture,
but then there's a magic to it.
611
00:27:33,214 --> 00:27:35,717
It floats, it kind of
moves holographically
612
00:27:35,750 --> 00:27:38,953
and there's shadowing,
but none of it is technological.
613
00:27:38,987 --> 00:27:43,792
An interface where it's possible
to fall in love with your O.S.
614
00:27:43,825 --> 00:27:45,960
Are these feelings
even real?
615
00:27:45,993 --> 00:27:47,195
Or are they
just programming?
616
00:27:51,133 --> 00:27:52,735
What a sad trick.
617
00:27:55,836 --> 00:27:58,773
You feel real to me,
Samantha.
618
00:27:58,806 --> 00:28:02,210
Part of what you see in "Her"
definitely is a cautionary tale
619
00:28:02,243 --> 00:28:05,146
about being too reliant
on your gadgets
620
00:28:05,180 --> 00:28:07,682
and your technology
and being too emotionally
621
00:28:07,716 --> 00:28:09,151
invested in them.
622
00:28:09,183 --> 00:28:11,353
It's a reminder that there
are people out there,
623
00:28:11,385 --> 00:28:13,788
you know, that final image
of him with Amy Adams
624
00:28:13,822 --> 00:28:15,123
is so emotional
625
00:28:15,156 --> 00:28:16,824
and it's only through
this experience
626
00:28:16,857 --> 00:28:19,360
that they both went on
involving this technology
627
00:28:19,393 --> 00:28:22,797
that they found each other.
628
00:28:22,831 --> 00:28:24,066
You know, we're
going to live in a world
629
00:28:24,098 --> 00:28:26,701
with robots and
artificial intelligence.
630
00:28:26,735 --> 00:28:27,970
You might as well
get used to it,
631
00:28:28,002 --> 00:28:30,071
you shouldn't
be afraid of it
632
00:28:30,104 --> 00:28:34,777
and we should be very careful
not to have it be bad.
633
00:28:34,809 --> 00:28:38,246
But if it goes bad,
it's us.
634
00:28:38,279 --> 00:28:39,781
-Yeah.
-It's not them.
635
00:28:42,211 --> 00:28:43,279
People always ask me,
636
00:28:43,311 --> 00:28:45,982
"So, do you think the machines
will ever beat us?"
637
00:28:46,014 --> 00:28:47,483
I say,
"I think it's a race.
638
00:28:47,517 --> 00:28:49,085
-It's a race...
-Absolutely, a race.
639
00:28:49,117 --> 00:28:51,020
It's a race
between us improving
640
00:28:51,052 --> 00:28:53,889
and making ourselves better,
our own evolution,
641
00:28:53,923 --> 00:28:56,058
spiritual,
psychological evolution.
642
00:28:56,091 --> 00:28:58,761
At the same time, we've got
these machines evolving.
643
00:28:58,795 --> 00:29:01,097
Because if we don't
improve enough
644
00:29:01,129 --> 00:29:02,831
to direct them properly,
645
00:29:02,865 --> 00:29:06,102
our godlike power of
using artificial intelligence
646
00:29:06,135 --> 00:29:08,003
and all these other
robotic tools
647
00:29:08,037 --> 00:29:10,774
and so on will ultimately
just blow back in our face
648
00:29:10,807 --> 00:29:11,841
and take us out."
649
00:29:11,874 --> 00:29:13,175
Yeah, you're right.
650
00:29:13,209 --> 00:29:17,147
I mean, I think that...
It takes a lot of effort
651
00:29:17,180 --> 00:29:20,083
to create changes
in human behavior.
652
00:29:20,116 --> 00:29:22,018
But that's with
our responsibilities.
653
00:29:22,050 --> 00:29:23,819
Yeah. I actually think
we're evolving.
654
00:29:23,853 --> 00:29:26,289
We're co-evolving
with our machines.
655
00:29:26,321 --> 00:29:28,290
We're changing.
Yes, exactly.
656
00:29:28,324 --> 00:29:32,362
Atlantia death squadron,
attack.
657
00:29:35,130 --> 00:29:40,302
In January of 2002, Universal
was looking for somebody
658
00:29:40,336 --> 00:29:42,372
to reinvent
"Battlestar Galactica."
659
00:29:42,405 --> 00:29:46,009
So, I tracked down the pilot
of the original "Galactica"
660
00:29:46,041 --> 00:29:47,876
that they did in 1978.
661
00:29:47,910 --> 00:29:49,512
There were some
interesting ideas within it.
662
00:29:49,545 --> 00:29:53,550
The final annihilation
of the lifeform known as man.
663
00:29:53,583 --> 00:29:55,552
Let the attack begin.
664
00:29:55,585 --> 00:29:57,987
But they never quite
were able to figure out
665
00:29:58,020 --> 00:30:00,123
what the show really was.
666
00:30:00,155 --> 00:30:04,927
But at the same time, I was very
struck by the parallels to 9/11.
667
00:30:04,961 --> 00:30:07,197
This is just a couple of months
after the 9/11 attack.
668
00:30:07,230 --> 00:30:10,232
And I realized immediately
that if you did this series
669
00:30:10,266 --> 00:30:12,068
at that moment in time,
670
00:30:12,100 --> 00:30:13,936
it was going to have a very
different emotional resonance
671
00:30:13,970 --> 00:30:15,872
for the audience.
672
00:30:15,905 --> 00:30:17,874
"Battlestar Galactica"
is about
673
00:30:17,906 --> 00:30:20,542
the last remaining scraps
of humanity
674
00:30:20,576 --> 00:30:23,413
out there in a fleet
in deep space
675
00:30:23,445 --> 00:30:26,415
after an attack from robots
676
00:30:26,449 --> 00:30:28,585
has decimated humanity.
677
00:30:28,617 --> 00:30:31,486
So the idea was that
the human beings
678
00:30:31,520 --> 00:30:34,891
essentially started creating
robots for all the dirty jobs
679
00:30:34,924 --> 00:30:36,125
they didn't want to do anymore.
680
00:30:36,157 --> 00:30:38,026
And then the machines
themselves,
681
00:30:38,059 --> 00:30:41,030
because they revere
their creators,
682
00:30:41,062 --> 00:30:44,132
make machines that
are even more like us.
683
00:30:44,165 --> 00:30:47,470
Cylons that are flesh
and blood just like humans.
684
00:30:47,502 --> 00:30:50,874
The Cylons saw themselves
as the children of humanity
685
00:30:50,906 --> 00:30:53,576
and that they wouldn't be able
to really grow and mature
686
00:30:53,608 --> 00:30:55,244
until their parents were gone,
687
00:30:55,278 --> 00:30:58,548
so they decide they need to wipe
out their human creators
688
00:30:58,580 --> 00:31:00,482
in this apocalyptic attack.
689
00:31:03,452 --> 00:31:04,887
I think on the surface,
690
00:31:04,921 --> 00:31:06,256
you could say
"Battlestar Galactica"
691
00:31:06,289 --> 00:31:08,590
is about
"be careful of what you invent."
692
00:31:08,624 --> 00:31:12,295
But I think the real
driving force of the show
693
00:31:12,328 --> 00:31:13,997
is not about that.
694
00:31:14,030 --> 00:31:15,530
I think it's about humanity's
greatest weakness,
695
00:31:15,564 --> 00:31:19,068
the inability to see others
as worthy as ourselves.
696
00:31:19,100 --> 00:31:21,503
That's the central
conflict of these two...
697
00:31:21,537 --> 00:31:23,206
"We are people."
"No, you're not."
698
00:31:23,239 --> 00:31:26,276
You are truly
no greater than we are.
699
00:31:26,308 --> 00:31:29,511
You're just a bunch
of machines after all.
700
00:31:29,545 --> 00:31:31,080
Let the games begin.
701
00:31:31,113 --> 00:31:32,682
"Flesh and Bone" is
the torture episode.
702
00:31:32,714 --> 00:31:35,251
It's very much of
a two-person play.
703
00:31:35,283 --> 00:31:38,220
It raises the question... would
she be less morally culpable
704
00:31:38,253 --> 00:31:41,390
because he's not really human?
705
00:31:41,424 --> 00:31:43,425
You're not human.
706
00:31:43,459 --> 00:31:46,062
Was a person being
tortured in this scene
707
00:31:46,095 --> 00:31:48,664
and crying out
and experiencing pain
708
00:31:48,697 --> 00:31:51,233
or was this all
an elaborate simulation?
709
00:31:51,267 --> 00:31:52,969
We wanted to deal with the issue
710
00:31:53,001 --> 00:31:56,438
of what's moral and just
in a society at war like this,
711
00:31:56,472 --> 00:31:59,075
but at the same time, we were
also examining a different idea
712
00:31:59,107 --> 00:32:02,311
in the show which was about
consciousness and personhood.
713
00:32:02,345 --> 00:32:03,980
Who's the real monster?
714
00:32:04,012 --> 00:32:06,581
Is it the humans who built
creatures that they knew
715
00:32:06,615 --> 00:32:10,086
were human equivalent,
but enslaved them anyway?
716
00:32:10,118 --> 00:32:12,587
Or is it the slaves
who rose up to destroy
717
00:32:12,620 --> 00:32:15,324
the type of people
who would do that?
718
00:32:15,357 --> 00:32:18,027
The big central idea
of "Battlestar Galactica" is...
719
00:32:18,059 --> 00:32:20,696
Does humanity
deserve to survive?
720
00:32:20,730 --> 00:32:23,066
Can we earn our survival?
721
00:32:23,098 --> 00:32:25,734
You know, when we fought
the Cylons,
722
00:32:25,768 --> 00:32:29,172
we did it to save
ourselves from extinction.
723
00:32:29,204 --> 00:32:31,441
But we never answered
the question why.
724
00:32:31,473 --> 00:32:35,345
Why are we as a people
worth saving?
725
00:32:35,377 --> 00:32:38,414
That's... That's
an amazing question.
726
00:32:38,446 --> 00:32:41,283
The Cylons through the series
evolved from a place
727
00:32:41,317 --> 00:32:44,252
of sort of blind
hatred for humanity
728
00:32:44,286 --> 00:32:47,724
to then having more contact
with individual human beings,
729
00:32:47,757 --> 00:32:51,394
having experiences with them,
experiencing emotions with them,
730
00:32:51,426 --> 00:32:54,731
and then the humans realize that
the Cylons are not as monolithic
731
00:32:54,763 --> 00:32:56,732
as they believed at the onset.
732
00:32:56,766 --> 00:33:00,403
Well, when you think you love
somebody, you love them.
733
00:33:00,435 --> 00:33:02,038
That's what love is.
734
00:33:02,070 --> 00:33:03,606
Thoughts.
735
00:33:03,638 --> 00:33:05,340
She was a Cylon.
736
00:33:05,374 --> 00:33:06,541
A machine.
737
00:33:06,575 --> 00:33:09,412
She was more than
that to us.
738
00:33:09,444 --> 00:33:11,313
She was more
than that to me.
739
00:33:13,348 --> 00:33:16,218
She was
a vital living person.
740
00:33:16,251 --> 00:33:19,121
"Battlestar Galactica"
gives you
741
00:33:19,155 --> 00:33:20,655
an idea of what could be.
742
00:33:20,689 --> 00:33:24,594
How do we all do this together?
743
00:33:24,626 --> 00:33:28,263
If "Battlestar Galactica"
is any guide,
744
00:33:28,297 --> 00:33:31,501
we can evolve together with
the machines that we create.
745
00:33:31,533 --> 00:33:36,605
We can become one people,
respectful of each other.
746
00:33:36,638 --> 00:33:38,373
Make a future together.
747
00:33:38,406 --> 00:33:39,608
Yeah, I think...
748
00:33:39,642 --> 00:33:43,313
I hope mankind is
worthy of survival.
749
00:33:45,016 --> 00:33:46,919
I've talked
to some A.I. experts.
750
00:33:46,953 --> 00:33:48,187
Yeah.
751
00:33:48,219 --> 00:33:51,423
And the one expert
said just right out,
752
00:33:51,456 --> 00:33:53,091
"We're trying
to make a person."
753
00:33:53,124 --> 00:33:54,927
And I said, "So when
you say a person,
754
00:33:54,960 --> 00:33:56,461
you mean a personhood?
They have...
755
00:33:56,495 --> 00:33:58,631
They have an ego,
they have a sense of identity."
756
00:33:58,664 --> 00:34:00,166
He said, "Yes,
all those things."
757
00:34:00,198 --> 00:34:02,668
If you're a very smart group
of human beings
758
00:34:02,702 --> 00:34:05,138
who are creating an A.I.,
759
00:34:05,171 --> 00:34:06,306
one of the things you're
definitely gonna leave out
760
00:34:06,338 --> 00:34:08,573
is to put in emotion.
Right.
761
00:34:08,606 --> 00:34:10,108
'Cause if you have emotion,
762
00:34:10,142 --> 00:34:12,979
emotion will lead
to many facets,
763
00:34:13,012 --> 00:34:17,617
one of them being deceit,
anger, fury, hatred.
764
00:34:17,649 --> 00:34:19,718
-Sure.
-As well as love.
765
00:34:19,751 --> 00:34:24,023
If a machine becomes like us
enough and complex enough
766
00:34:24,055 --> 00:34:27,526
at one point, can we no
longer tell the difference?
767
00:34:27,559 --> 00:34:29,394
-The difference.
-Does it have freedom?
768
00:34:29,428 --> 00:34:31,031
Does it have free will?
769
00:34:33,465 --> 00:34:35,668
This hearing is to determine
the legal status
770
00:34:35,700 --> 00:34:39,237
of the android known as Data.
771
00:34:39,271 --> 00:34:42,175
The character of Data was sort
of everyone's favorite character
772
00:34:42,207 --> 00:34:44,443
on the show
and of the writing staff as well.
773
00:34:44,477 --> 00:34:46,412
Everyone loved to write
Data stories.
774
00:34:46,445 --> 00:34:49,082
Here's a robot
who wants to be human
775
00:34:49,115 --> 00:34:51,283
but who has no emotions
but wants emotions.
776
00:34:51,316 --> 00:34:53,186
So, it's really Pinocchio,
777
00:34:53,218 --> 00:34:55,722
and the Pinocchio metaphor
is powerful.
778
00:34:55,754 --> 00:34:57,323
Commander,
what are you?
779
00:34:57,356 --> 00:34:59,993
Webster's 24th-century
dictionary, 5th edition,
780
00:35:00,026 --> 00:35:01,694
defines an android
as an automaton
781
00:35:01,727 --> 00:35:03,529
made to resemble
a human being.
782
00:35:03,561 --> 00:35:08,433
"The Measure of a Man" is one of
those sort of very deep episodes
783
00:35:08,466 --> 00:35:10,235
that you don't realize is deep
784
00:35:10,269 --> 00:35:12,372
until like four
or five years later.
785
00:35:12,405 --> 00:35:15,173
And you see it
and you go,
786
00:35:15,207 --> 00:35:18,644
In the episode, Data's humanity
is essentially put on trial.
787
00:35:18,677 --> 00:35:21,046
Is he sentient?
788
00:35:21,079 --> 00:35:24,216
Is he worthy of being
treated as a person?
789
00:35:24,249 --> 00:35:25,784
Am I a person
or property?
790
00:35:25,818 --> 00:35:27,520
What's at stake?
791
00:35:27,553 --> 00:35:29,355
My right to choose.
792
00:35:29,387 --> 00:35:32,791
It was a legitimate exploration
of this idea of personhood
793
00:35:32,825 --> 00:35:36,629
in a legal sense
and in a moral sense.
794
00:35:36,662 --> 00:35:39,499
Its responses dictated by
an elaborate software
795
00:35:39,532 --> 00:35:41,066
written by a man.
796
00:35:41,099 --> 00:35:43,702
And now a man
will shut it off.
797
00:35:43,736 --> 00:35:45,605
It was shocking to the
characters on the show
798
00:35:45,637 --> 00:35:48,640
and shocking to the audience
as well because we love Data.
799
00:35:48,673 --> 00:35:51,543
Starfleet was founded
to seek out new life.
800
00:35:51,577 --> 00:35:53,646
Well, there it sits!
801
00:35:53,678 --> 00:35:57,115
Once we create some form
of artificial intelligence,
802
00:35:57,148 --> 00:35:59,051
these legal arguments
are gonna happen.
803
00:35:59,084 --> 00:36:01,653
Do machines deserve rights?
804
00:36:01,686 --> 00:36:03,488
You know, probably.
805
00:36:03,521 --> 00:36:05,224
In the history of many worlds,
806
00:36:05,256 --> 00:36:08,494
there have always been
disposable creatures.
807
00:36:08,526 --> 00:36:10,495
They do the dirty work.
808
00:36:10,528 --> 00:36:12,130
An army of Datas,
809
00:36:12,164 --> 00:36:16,468
whole generations
of disposable people.
810
00:36:18,503 --> 00:36:20,305
You're talking
about slavery.
811
00:36:22,407 --> 00:36:26,078
The term "robot" itself
comes from the Czech play
812
00:36:26,111 --> 00:36:27,747
"Rossum's Universal Robots,"
813
00:36:27,780 --> 00:36:31,084
and the word "robota"
means laborer.
814
00:36:31,117 --> 00:36:33,252
A pejorative version
of it means slave.
815
00:36:33,285 --> 00:36:35,655
So, our conception of what
robots will be
816
00:36:35,687 --> 00:36:39,091
is directly,
umbilically connected
817
00:36:39,124 --> 00:36:41,860
to our idea of them
as an underclass.
818
00:36:41,894 --> 00:36:43,629
Why do you think
your people made me?
819
00:36:43,662 --> 00:36:45,298
We made you
'cause we could.
820
00:36:45,331 --> 00:36:48,166
You are just a machine.
An imitation of life.
821
00:36:48,200 --> 00:36:50,269
Replicants are like
any other machine.
822
00:36:50,302 --> 00:36:51,570
They're either a benefit
or a hazard.
823
00:36:51,604 --> 00:36:54,841
If they're a benefit,
it's not my problem.
824
00:36:54,874 --> 00:36:58,343
"Blade Runner" is a slave
narrative, basically.
825
00:36:58,376 --> 00:37:01,480
They've created these replicants
to be our slaves.
826
00:37:01,514 --> 00:37:03,615
And I think the part
that's really troubling about
827
00:37:03,648 --> 00:37:05,217
"Blade Runner" is that
828
00:37:05,251 --> 00:37:07,586
not only is it sort of
this technologically
829
00:37:07,619 --> 00:37:09,454
and environmentally
ruined future,
830
00:37:09,488 --> 00:37:14,127
it's sort of a morally and
ethically ruined future as well.
831
00:37:14,160 --> 00:37:15,795
I wrote those first
couple scripts
832
00:37:15,827 --> 00:37:18,264
thinking of a very small movie.
833
00:37:18,296 --> 00:37:20,899
And then Ridley said to me,
834
00:37:20,932 --> 00:37:23,702
"What's out the window?"
835
00:37:23,736 --> 00:37:25,171
And I said,
"What's out the window?
836
00:37:25,204 --> 00:37:26,506
Well, the world, you know."
837
00:37:26,539 --> 00:37:29,142
He said, "Exactly.
What world is that?
838
00:37:29,175 --> 00:37:31,910
Where, you know, you make
a robot indistinguishable
839
00:37:31,943 --> 00:37:33,378
from a human.
840
00:37:33,412 --> 00:37:35,381
Think about this for a second.
841
00:37:35,414 --> 00:37:37,283
Imagine..."
and he does that to you.
842
00:37:37,316 --> 00:37:39,852
You go, "Boom."
I said, "God."
843
00:37:39,885 --> 00:37:42,220
He delivered a world.
That's Ridley.
844
00:37:42,253 --> 00:37:43,922
He can... he makes things.
845
00:37:43,956 --> 00:37:45,825
Ridley brought everything to it.
846
00:37:45,858 --> 00:37:48,227
"Blade Runner" comes
from a Philip K. Dick novel...
847
00:37:48,260 --> 00:37:49,528
Yeah.
848
00:37:49,561 --> 00:37:51,630
"Do Androids Dream
of Electric Sheep?"
849
00:37:51,664 --> 00:37:55,267
Philip K. Dick was very
prolific and very profound
850
00:37:55,300 --> 00:37:57,436
talking about
the nature of reality
851
00:37:57,468 --> 00:37:59,671
and the nature of
artificial intelligence
852
00:37:59,704 --> 00:38:01,706
and what it is
to be human.
853
00:38:01,740 --> 00:38:03,609
That was
the nut of the idea
854
00:38:03,642 --> 00:38:06,445
that grew with Hampton
into what it was.
855
00:38:06,477 --> 00:38:09,381
Here was this beautiful,
beautiful film.
856
00:38:09,414 --> 00:38:12,684
Dark, noir-ish,
and I thought,
857
00:38:12,718 --> 00:38:15,488
"A film can be
so artistic."
858
00:38:15,521 --> 00:38:18,390
And the idea of these...
These machines challenging us
859
00:38:18,423 --> 00:38:20,793
and their lack of affect,
their lack of emotion.
860
00:38:20,825 --> 00:38:23,595
The film is constantly saying
there is no emotion.
861
00:38:23,628 --> 00:38:26,598
A computer
just makes decisions.
862
00:38:26,631 --> 00:38:27,966
Negative or positive.
863
00:38:27,999 --> 00:38:29,467
It doesn't really care.
864
00:38:29,501 --> 00:38:31,204
Yeah, with the
Voight-Kampff test.
865
00:38:31,236 --> 00:38:33,238
Correct.
866
00:38:33,271 --> 00:38:34,907
Is this to be
an empathy test?
867
00:38:34,939 --> 00:38:37,876
Capillary dilation of
the so-called blush response?
868
00:38:37,910 --> 00:38:40,312
We call it
Voight-Kampff for short.
869
00:38:40,345 --> 00:38:42,648
The Voight-Kampff is
a series of questions
870
00:38:42,680 --> 00:38:45,483
that allows the questioner
to find out
871
00:38:45,517 --> 00:38:48,721
whether or not whoever was
being questioned had feelings.
872
00:38:48,753 --> 00:38:50,422
It's your birthday.
873
00:38:50,456 --> 00:38:52,258
Someone gives you
a calf-skin wallet.
874
00:38:52,291 --> 00:38:53,926
I wouldn't accept it.
875
00:38:53,959 --> 00:38:55,361
It's about empathy.
876
00:38:55,394 --> 00:38:56,796
You're reading
a magazine.
877
00:38:56,829 --> 00:38:59,365
You come across a full-page
nude photo of a girl.
878
00:38:59,398 --> 00:39:01,434
Is this testing whether
I'm a replicant
879
00:39:01,466 --> 00:39:02,834
or a lesbian,
Mr. Deckard?
880
00:39:02,867 --> 00:39:04,503
Just answer the
questions, please.
881
00:39:04,535 --> 00:39:06,871
Not to get gooey about empathy,
but it does seem
882
00:39:06,905 --> 00:39:13,446
that empathy is the big divide
between us and everything else.
883
00:39:13,479 --> 00:39:16,881
Deckard is very much
a man in his job.
884
00:39:16,914 --> 00:39:19,918
He firmly believes that as long
as robots are working properly,
885
00:39:19,952 --> 00:39:21,387
it's not his problem.
886
00:39:21,419 --> 00:39:23,789
But that if
a replicant misbehaves,
887
00:39:23,822 --> 00:39:25,590
it is indeed his problem
888
00:39:25,624 --> 00:39:29,262
and his duty to retire it.
889
00:39:29,294 --> 00:39:30,863
Move!
Get out of the way!
890
00:39:33,564 --> 00:39:36,735
However, over the course of the
film, he increasingly questions
891
00:39:36,769 --> 00:39:40,539
whether or not disobeying orders
means that you're defective
892
00:39:40,572 --> 00:39:42,041
or that you are a human
893
00:39:42,073 --> 00:39:43,842
with rights and wills
and dreams of your own.
894
00:39:43,876 --> 00:39:49,915
I've seen things you people
wouldn't believe.
895
00:39:49,948 --> 00:39:52,518
There's such poetry
in the scene
896
00:39:52,550 --> 00:39:54,853
where Roy Batty's dying.
Yes.
897
00:39:54,887 --> 00:39:56,589
It's just
a magnificent scene.
898
00:39:56,622 --> 00:39:58,357
He wrote that.
899
00:39:58,390 --> 00:39:59,725
Really?
Rutger wrote that?
900
00:39:59,758 --> 00:40:01,394
It's 1:00 in the morning.
901
00:40:01,427 --> 00:40:03,528
I'm gonna have the plug pulled...
Yeah.
902
00:40:03,561 --> 00:40:05,463
Literally on
everything at dawn.
903
00:40:05,497 --> 00:40:06,699
-Yeah.
-And that's it.
904
00:40:06,731 --> 00:40:08,668
That's gonna be
the last night,
905
00:40:08,700 --> 00:40:11,369
and Rutger said,
"I have written something."
906
00:40:11,403 --> 00:40:12,738
And he said...
907
00:40:12,771 --> 00:40:17,910
All those moments will be lost
908
00:40:17,943 --> 00:40:23,582
in time, like tears in rain.
909
00:40:25,383 --> 00:40:27,320
-And I'm nearly in tears.
-Yeah.
910
00:40:27,352 --> 00:40:28,387
He said,
"What do you think?"
911
00:40:28,419 --> 00:40:29,988
I said, "Let's do it."
912
00:40:30,021 --> 00:40:32,057
So, we literally went...
It's gorgeous.
913
00:40:32,090 --> 00:40:33,892
-We shot it within an hour.
-Yeah.
914
00:40:33,925 --> 00:40:35,660
And at the end, he looked at him
915
00:40:35,693 --> 00:40:37,629
and gave that
most beautiful smile.
916
00:40:39,765 --> 00:40:42,902
Time to die.
917
00:40:42,934 --> 00:40:44,402
And he had a dove
in his hand and he let...
918
00:40:44,436 --> 00:40:45,905
He let it go.
Yeah.
919
00:40:45,937 --> 00:40:48,541
Is it saying
Roy Batty had a soul?
920
00:40:48,573 --> 00:40:51,110
Roy Batty was
a fully sentient being.
921
00:40:51,142 --> 00:40:52,544
Yes.
922
00:40:54,947 --> 00:40:57,383
Four of your films now
have had
923
00:40:57,416 --> 00:41:00,785
an intelligent, embodied
A.I.
924
00:41:00,818 --> 00:41:02,053
Right?
An artificial intelligence.
925
00:41:02,086 --> 00:41:03,521
Synthetic person.
926
00:41:03,555 --> 00:41:05,624
So where do you think
we come out in this?
927
00:41:05,657 --> 00:41:08,494
Is this our...
Are we handing the keys
928
00:41:08,526 --> 00:41:10,463
to the kingdom off
to the machines?
929
00:41:10,495 --> 00:41:12,932
I don't think we should.
With the creation of something
930
00:41:12,965 --> 00:41:16,035
so potentially wonderful
and dangerous as A.I.,
931
00:41:16,068 --> 00:41:19,705
the inventor frequently
is obsessed by the success
932
00:41:19,738 --> 00:41:21,040
of what he's doing
933
00:41:21,073 --> 00:41:22,975
rather than looking
at the real outcome.
934
00:41:23,008 --> 00:41:24,677
Here is
where the problem is.
935
00:41:24,709 --> 00:41:27,646
It's the moment it passes
beyond your control.
936
00:41:27,678 --> 00:41:28,847
Yeah.
937
00:41:28,881 --> 00:41:30,149
That's where
the danger lies.
938
00:41:30,182 --> 00:41:32,551
You cross over
and you're in trouble.
939
00:41:32,584 --> 00:41:35,687
You get an A.I.,
you have to have limitations.
940
00:41:35,720 --> 00:41:37,923
You got to have your hand
on the plug the entire time.
941
00:41:37,956 --> 00:41:40,093
All the time.
Totally.