1
00:00:07,220 --> 00:00:08,700
Open the pod bay doors, HAL.
2
00:00:08,750 --> 00:00:12,100
I'm sorry, Dave. I'm afraid I can't do that.
3
00:00:12,140 --> 00:00:14,970
2001 had a profound impact on my life.
4
00:00:15,020 --> 00:00:16,760
It's all about HAL 9000.
5
00:00:16,800 --> 00:00:21,540
"2001" was an extraordinary breakthrough for the genre.
6
00:00:21,590 --> 00:00:22,810
Science fiction has always been about
7
00:00:22,850 --> 00:00:25,240
great technology going wrong.
8
00:00:25,290 --> 00:00:26,810
I'll be back.
9
00:00:26,850 --> 00:00:29,460
That image, Schwarzenegger as the Terminator,
10
00:00:29,510 --> 00:00:31,120
it's a perfect nightmare.
11
00:00:31,160 --> 00:00:32,680
You have two of the most popular A.I. characters
12
00:00:32,730 --> 00:00:34,300
in pop culture.
13
00:00:34,340 --> 00:00:36,730
At the time I said, "Don't be afraid of the robots.
14
00:00:36,780 --> 00:00:38,910
The robots are our friends."
15
00:00:38,950 --> 00:00:42,430
Our conception of what robots will be is directly,
16
00:00:42,480 --> 00:00:47,130
umbilically connected to our idea of them as an underclass.
17
00:00:47,180 --> 00:00:49,220
Replicants are like any other machine.
18
00:00:49,270 --> 00:00:50,790
They're either a benefit or a hazard.
19
00:00:50,830 --> 00:00:52,660
"Blade Runner" was so artistic.
20
00:00:52,700 --> 00:00:55,450
If you're creating an A.I.,
21
00:00:55,490 --> 00:00:57,100
one of the things you're definitely going to leave out
22
00:00:57,140 --> 00:00:58,970
is emotion.
23
00:00:59,020 --> 00:01:01,670
"Battlestar Galactica" is about humanity's greatest weakness.
24
00:01:01,710 --> 00:01:03,670
You're just a bunch of machines after all.
25
00:01:03,720 --> 00:01:06,720
The inability to see others as worthy as ourselves.
26
00:01:06,760 --> 00:01:09,810
Our machines have been the stuff of dreams
27
00:01:09,850 --> 00:01:11,420
and of nightmares.
28
00:01:11,460 --> 00:01:16,340
The question is, can man and machine forge a future...
29
00:01:16,380 --> 00:01:17,600
together?
30
00:01:17,640 --> 00:01:18,860
Hello, I'm here.
31
00:01:18,900 --> 00:01:21,860
Oh...
32
00:01:21,910 --> 00:01:23,080
Hi.
33
00:01:23,130 --> 00:01:30,090
♪♪
34
00:01:33,140 --> 00:01:40,100
♪♪
35
00:01:43,150 --> 00:01:50,020
♪♪
36
00:01:50,070 --> 00:01:56,940
♪♪
37
00:01:56,990 --> 00:01:58,470
They call it science fiction,
38
00:01:58,510 --> 00:01:59,900
but it's really about technology.
39
00:01:59,950 --> 00:02:02,510
It's about the machines.
40
00:02:02,560 --> 00:02:04,430
You've done a lot of science-fiction movies.
41
00:02:04,470 --> 00:02:06,560
You've seen all kinds of different machines,
42
00:02:06,600 --> 00:02:07,870
intelligent machines.
43
00:02:07,910 --> 00:02:09,480
You've played an intelligent machine.
44
00:02:09,520 --> 00:02:12,090
I think that what's interesting is
45
00:02:12,130 --> 00:02:16,050
when you have been involved in the business
46
00:02:16,090 --> 00:02:19,970
as long as I have, what is so unbelievable
47
00:02:20,010 --> 00:02:21,880
is that as I've done
48
00:02:21,920 --> 00:02:24,880
"Terminator" movies one after the next
49
00:02:24,930 --> 00:02:27,760
and you see something starting out kind of
50
00:02:27,800 --> 00:02:30,240
what is called science fiction
51
00:02:30,280 --> 00:02:31,980
and then all of a sudden,
52
00:02:32,020 --> 00:02:34,420
it becomes kind of science reality.
53
00:02:34,460 --> 00:02:36,460
Yeah. I think science fiction has always been
54
00:02:36,500 --> 00:02:38,510
about great technology going wrong.
55
00:02:38,550 --> 00:02:41,730
It's like how A.I. might be a threat to humanity.
56
00:02:42,860 --> 00:02:44,290
I'm a friend of Sarah Connor.
57
00:02:44,340 --> 00:02:46,950
Can't see her. She's making a statement.
58
00:02:46,990 --> 00:02:49,950
I'll be back.
59
00:02:50,000 --> 00:02:51,210
That image,
60
00:02:51,260 --> 00:02:52,910
Schwarzenegger as the Terminator,
61
00:02:52,960 --> 00:02:55,910
Skynet sending this emissary,
62
00:02:55,960 --> 00:02:58,740
quasi-human into our midst,
63
00:02:58,790 --> 00:03:02,620
it's a perfect nightmare of the machine catastrophe.
64
00:03:07,190 --> 00:03:09,930
The old warning about the machines rising up.
65
00:03:09,970 --> 00:03:13,800
It's very archetypal and very brutal and very perfect.
66
00:03:13,850 --> 00:03:15,980
When I first thought of the idea for the Terminator --
67
00:03:16,020 --> 00:03:17,200
How did you come up with that idea?
68
00:03:17,240 --> 00:03:18,590
It came from a dream.
69
00:03:18,630 --> 00:03:20,200
I had a dream image
70
00:03:20,240 --> 00:03:23,640
of a chrome skeleton walking out of a fire.
71
00:03:23,680 --> 00:03:26,120
And I thought, "What if he was a cyborg
72
00:03:26,160 --> 00:03:27,820
and he looked like a man
73
00:03:27,860 --> 00:03:31,300
and was indistinguishable from a man until the fire?"
74
00:03:31,340 --> 00:03:34,000
And what would be the purpose of that thing?
75
00:03:34,040 --> 00:03:35,130
He was representing
76
00:03:35,170 --> 00:03:37,000
a much more powerful intelligence,
77
00:03:37,040 --> 00:03:40,350
the soldier sent by Skynet from the future. Right.
78
00:03:42,530 --> 00:03:44,270
"Terminator" presents
79
00:03:44,310 --> 00:03:46,660
this vision of a future where Skynet,
80
00:03:46,700 --> 00:03:49,490
this computer that has become self-aware,
81
00:03:49,530 --> 00:03:52,750
decides, "Well, I'm going to protect myself
82
00:03:52,800 --> 00:03:54,970
and the only way to do that is to destroy
83
00:03:55,020 --> 00:03:58,280
the very people who created me," which is us.
84
00:03:58,330 --> 00:04:01,550
Skynet is not the first all-powerful computer.
85
00:04:01,590 --> 00:04:03,940
This trope goes back to Robert Heinlein's
86
00:04:03,980 --> 00:04:05,770
"The Moon Is a Harsh Mistress"
87
00:04:05,810 --> 00:04:07,770
"The Forbin Project,"
88
00:04:07,810 --> 00:04:10,030
and even up to "WarGames," the WOPR.
89
00:04:10,080 --> 00:04:12,730
But part of what makes the Terminator so scary
90
00:04:12,770 --> 00:04:14,690
is that it is relentless
91
00:04:14,730 --> 00:04:18,350
and it will not stop until it achieves its objective.
92
00:04:18,390 --> 00:04:25,400
♪♪
93
00:04:25,570 --> 00:04:27,830
I remember the first day's dailies.
94
00:04:27,880 --> 00:04:29,360
There was a 100-millimeter lens shot
95
00:04:29,400 --> 00:04:30,710
where you just kind of pull up
96
00:04:30,750 --> 00:04:32,050
and you're looking like this.
97
00:04:32,100 --> 00:04:34,010
-Right.
-And we're all just going, "Yes!
98
00:04:34,060 --> 00:04:36,540
This is fantastic."
99
00:04:39,980 --> 00:04:41,850
But here's the interesting thing about it,
100
00:04:41,890 --> 00:04:43,540
I went to talk to you about Reese.
101
00:04:43,590 --> 00:04:44,630
Yeah.
102
00:04:44,680 --> 00:04:46,020
This is the hero,
103
00:04:46,070 --> 00:04:47,420
and I wanted to continue on playing heroes.
104
00:04:47,460 --> 00:04:49,030
Yeah.
105
00:04:49,070 --> 00:04:50,330
And so we started talking a little bit about the movie,
106
00:04:50,380 --> 00:04:52,510
and for some reason or the other,
107
00:04:52,550 --> 00:04:54,080
-not at all planned...
-Yeah.
108
00:04:54,120 --> 00:04:55,160
-...on my part...
-Yeah.
109
00:04:55,210 --> 00:04:56,730
...I said, "Look, Jim."
110
00:04:56,770 --> 00:04:58,910
I said, "The guy that plays the Terminator,
111
00:04:58,950 --> 00:05:01,130
he really has to understand that he's a machine."
112
00:05:01,170 --> 00:05:02,610
Exactly.
113
00:05:02,650 --> 00:05:04,780
How important it is that whoever plays the Terminator
114
00:05:04,830 --> 00:05:07,220
has to show absolutely nothing.
115
00:05:07,260 --> 00:05:09,270
And the way he scans,
116
00:05:09,310 --> 00:05:12,790
and the way the Terminator walks has to be machine-like,
117
00:05:12,830 --> 00:05:15,010
and yet there has to be not one single frame
118
00:05:15,050 --> 00:05:16,710
where he has human behavior.
119
00:05:16,750 --> 00:05:17,930
In the middle of this,
120
00:05:17,970 --> 00:05:19,490
I'm looking at you and thinking,
121
00:05:19,540 --> 00:05:23,110
"You know, the guy's kind of big, like a bulldozer,
122
00:05:23,150 --> 00:05:25,370
and nothing could stop him. It would be fantastic."
123
00:05:25,410 --> 00:05:27,110
Exactly. Yeah.
124
00:05:27,150 --> 00:05:28,420
And so, afterwards you said,
125
00:05:28,460 --> 00:05:29,850
"So, why don't you play the Terminator?"
126
00:05:29,890 --> 00:05:31,200
Yeah.
127
00:05:31,240 --> 00:05:33,250
And I looked at you, and I said, "Oh..."
128
00:05:36,070 --> 00:05:38,430
In the first film,
129
00:05:38,470 --> 00:05:41,380
the Terminator is designed to kill.
130
00:05:41,430 --> 00:05:42,820
In "Terminator 2,"
131
00:05:42,860 --> 00:05:44,820
the Terminator was programmed to protect,
132
00:05:44,870 --> 00:05:46,300
not destroy.
133
00:05:46,350 --> 00:05:47,780
Action.
134
00:05:47,830 --> 00:05:49,220
And now we're going to make "Terminator 2."
135
00:05:49,260 --> 00:05:50,440
The hardest part of that movie, though,
136
00:05:50,480 --> 00:05:51,960
was convincing you
137
00:05:52,000 --> 00:05:53,310
that playing a good guy was a good idea.
138
00:05:53,350 --> 00:05:55,880
It threw me off first when I read the script
139
00:05:55,920 --> 00:05:58,270
and then realized that I'm not anymore
140
00:05:58,310 --> 00:06:00,060
that kind of killing machine.
141
00:06:00,100 --> 00:06:03,670
I thought that if we could distill him down to this idea
142
00:06:03,710 --> 00:06:06,100
of just relentlessness
143
00:06:06,150 --> 00:06:09,720
and take out the evil and put good in its place,
144
00:06:09,760 --> 00:06:11,500
it's interesting that the same character
145
00:06:11,550 --> 00:06:12,810
worked as a bad guy
146
00:06:12,850 --> 00:06:13,980
and as a good guy, same character.
147
00:06:14,030 --> 00:06:15,510
Absolutely.
148
00:06:15,550 --> 00:06:18,250
And now we've got to have a bigger, badder Terminator
149
00:06:18,290 --> 00:06:20,420
that could kick the Terminator's ass.
150
00:06:20,470 --> 00:06:22,730
So what was that?
151
00:06:25,600 --> 00:06:29,650
I was convinced I was the baddest mother
152
00:06:29,690 --> 00:06:31,350
walking on the planet,
153
00:06:31,390 --> 00:06:34,520
and you were gonna believe it.
154
00:06:34,570 --> 00:06:36,480
I got a call from my agent
155
00:06:36,530 --> 00:06:39,660
saying they were looking for an intense presence.
156
00:06:39,700 --> 00:06:42,360
I'm a hell of a lot smaller than Arnold Schwarzenegger,
157
00:06:42,400 --> 00:06:45,100
and I knew that you were just going to have to buy
158
00:06:45,140 --> 00:06:48,760
that this thing was unstoppable.
159
00:06:48,800 --> 00:06:51,410
And then I started thinking of pursuit
160
00:06:51,450 --> 00:06:53,460
and what does that look like.
161
00:06:53,500 --> 00:06:57,500
And then physically, I just started taking on the mannerisms
162
00:06:57,550 --> 00:07:01,460
of, you know, what does an eagle look like?
163
00:07:01,510 --> 00:07:05,340
He's fierce, and he looks like he's coming at you,
164
00:07:05,380 --> 00:07:07,820
and you start realizing...
165
00:07:10,170 --> 00:07:11,390
Boom! Right at you.
166
00:07:11,430 --> 00:07:12,950
It's like a Buick.
167
00:07:13,000 --> 00:07:15,830
Get down.
168
00:07:15,870 --> 00:07:18,260
You know, it's like, shoom!
169
00:07:19,920 --> 00:07:22,790
The moment where we actually clinch for the first time,
170
00:07:22,830 --> 00:07:25,710
Arnold wanted to kind of pick me up over his head
171
00:07:25,750 --> 00:07:30,320
and slam me into the walls and throw me around a little bit.
172
00:07:30,360 --> 00:07:31,540
So, it's like this is the first time
173
00:07:31,580 --> 00:07:32,930
you've had to deal with evil
174
00:07:32,970 --> 00:07:34,240
'cause Terminators don't fight Terminators.
175
00:07:34,280 --> 00:07:35,720
Right.
176
00:07:35,760 --> 00:07:37,720
And I remember Jim specifically saying,
177
00:07:37,760 --> 00:07:40,200
"You can't do that. He's stronger than you are."
178
00:07:41,590 --> 00:07:43,720
"He's more powerful. He's faster."
179
00:07:43,770 --> 00:07:46,810
He can just dominate the T-800,
180
00:07:46,860 --> 00:07:50,300
who is an endoskeleton with, you know, fake skin over him.
181
00:07:50,340 --> 00:07:53,730
Whereas I'm just a mimetic polyalloy liquid metal.
182
00:07:53,780 --> 00:07:56,560
Much more dense...
183
00:07:56,610 --> 00:07:59,440
A superior machine.
184
00:07:59,480 --> 00:08:03,090
The T-1000 is the robot's concept of a robot.
185
00:08:03,130 --> 00:08:05,530
And it's like if a robot was trying to create
186
00:08:05,570 --> 00:08:07,660
a better version of itself, what would it do?
187
00:08:07,700 --> 00:08:10,530
And it's like, "Well, it would create something that's smooth
188
00:08:10,580 --> 00:08:15,230
and can move freely and still indestructible."
189
00:08:15,280 --> 00:08:18,370
You can read "Terminator 2" almost as a war
190
00:08:18,410 --> 00:08:21,330
between old special effects and new special effects.
191
00:08:21,370 --> 00:08:22,980
That's the beautiful kind of irony
192
00:08:23,020 --> 00:08:25,290
about the "Terminator" movies.
193
00:08:25,330 --> 00:08:26,680
They used cutting-edge technology
194
00:08:26,720 --> 00:08:28,810
more effectively than any other movies,
195
00:08:28,860 --> 00:08:32,600
but they're about warnings about technology.
196
00:08:35,990 --> 00:08:37,910
We're not going to make it, are we?
197
00:08:41,000 --> 00:08:42,350
People, I mean.
198
00:08:45,180 --> 00:08:48,790
It's in your nature to destroy yourselves.
199
00:08:48,830 --> 00:08:51,140
The plot of the "Terminator" films, I thought, we're always
200
00:08:51,180 --> 00:08:53,490
fighting against this robot from the future,
201
00:08:53,530 --> 00:08:55,400
but really what we're doing is we're fighting the humans
202
00:08:55,450 --> 00:08:57,970
who keep making this robot possible.
203
00:08:58,020 --> 00:09:00,450
As long as humans are aware that we have the potential
204
00:09:00,500 --> 00:09:02,590
to create a machine that can control the Earth
205
00:09:02,630 --> 00:09:05,940
and make us powerful, we're going to keep doing it
206
00:09:05,980 --> 00:09:09,590
and we're fighting our own nature to create this Skynet
207
00:09:09,640 --> 00:09:11,510
and humans won't stop doing it.
208
00:09:11,550 --> 00:09:13,860
We are really the persistent villain
209
00:09:13,900 --> 00:09:16,080
that keeps making these movies happen.
210
00:09:16,120 --> 00:09:20,520
I don't think we could have anticipated
211
00:09:20,560 --> 00:09:24,000
where we are now, 30-some years later
212
00:09:24,040 --> 00:09:28,180
where Skynet is the term that everyone uses
213
00:09:28,220 --> 00:09:30,350
when they're talking about an artificial intelligence
214
00:09:30,400 --> 00:09:31,920
that turns against us.
215
00:09:31,960 --> 00:09:34,140
Part of it I think is -- is there's a feeling
216
00:09:34,180 --> 00:09:35,360
you get before it rains
217
00:09:35,400 --> 00:09:37,050
and you know it's gonna rain.
218
00:09:37,100 --> 00:09:40,010
And you get that feeling about certain moments
219
00:09:40,060 --> 00:09:41,580
in technological development
220
00:09:41,620 --> 00:09:44,280
where you know something is gonna happen very soon.
221
00:09:44,320 --> 00:09:46,930
And I think there's a general consensus now
222
00:09:46,980 --> 00:09:49,370
that we're in that moment before it rains.
223
00:09:49,410 --> 00:09:50,720
Now, maybe that moment takes 10 years,
224
00:09:50,760 --> 00:09:51,900
maybe it takes 20 years,
225
00:09:51,940 --> 00:09:53,240
but there's gonna be a moment
226
00:09:53,290 --> 00:09:55,200
and it may not have a happy ending.
227
00:09:55,250 --> 00:09:56,940
And there's no rehearsal.
228
00:09:56,990 --> 00:09:58,550
That's right, there's no take 2.
229
00:09:58,600 --> 00:10:00,250
No, this is it.
230
00:10:00,300 --> 00:10:02,690
Yeah.
231
00:10:06,260 --> 00:10:08,960
HAL, you have an enormous responsibility on this mission.
232
00:10:09,000 --> 00:10:10,830
Let me put it this way, Mr. Amer.
233
00:10:10,870 --> 00:10:13,700
No 9000 computer has ever made a mistake
234
00:10:13,740 --> 00:10:15,480
or distorted information.
235
00:10:15,530 --> 00:10:17,880
-"2001" had a profound impact...
-Yeah, me too.
236
00:10:17,920 --> 00:10:21,320
...on my life and my daily life.
237
00:10:21,360 --> 00:10:22,530
It was the first time I went to a movie
238
00:10:22,580 --> 00:10:23,750
where I really felt like
239
00:10:23,800 --> 00:10:25,580
I was having a religious experience.
240
00:10:25,630 --> 00:10:28,500
I watched the film 18 times in its first couple years
241
00:10:28,540 --> 00:10:30,240
of release, all in theaters.
242
00:10:30,280 --> 00:10:32,680
I remember at one, a guy ran down the aisle
243
00:10:32,720 --> 00:10:34,550
toward the screen screaming,
244
00:10:34,590 --> 00:10:36,330
"It's God. It's God."
245
00:10:36,370 --> 00:10:38,030
And he meant it in that moment.
246
00:10:38,070 --> 00:10:39,730
And I had a guy in my theater
247
00:10:39,770 --> 00:10:42,640
who actually walked up to the screen with his arms out
248
00:10:42,690 --> 00:10:44,120
and he walked through the screen.
249
00:10:44,170 --> 00:10:45,470
That must have blown people's minds.
250
00:10:45,510 --> 00:10:46,990
People were blown out
251
00:10:47,040 --> 00:10:48,690
because the person disappeared into the screen
252
00:10:48,740 --> 00:10:50,870
during Star Gate, of all times.
253
00:10:50,910 --> 00:10:53,090
Everybody thinks of it as a space drama.
254
00:10:53,130 --> 00:10:54,570
At its core,
255
00:10:54,610 --> 00:10:55,790
it's really about an artificial intelligence.
256
00:10:55,830 --> 00:10:57,830
-It's all about HAL.
-HAL 9000.
257
00:10:57,870 --> 00:10:59,480
-It's HAL 9000.
-Yeah.
258
00:10:59,530 --> 00:11:01,620
I got my chance to work with Stanley Kubrick
259
00:11:01,660 --> 00:11:03,180
and Arthur Clarke on "2001: A Space Odyssey"
260
00:11:03,230 --> 00:11:04,580
at a very young age.
261
00:11:04,620 --> 00:11:06,190
I was 23 years old.
262
00:11:06,230 --> 00:11:08,490
When we created HAL, we didn't have any computers.
263
00:11:08,540 --> 00:11:11,110
There were no personal computers available to us.
264
00:11:11,150 --> 00:11:13,930
There were giant mainframe computers,
265
00:11:13,980 --> 00:11:17,460
but it was with punch cards and chads and all kinds of stuff.
266
00:11:17,500 --> 00:11:19,070
It was not very visual.
267
00:11:19,110 --> 00:11:20,900
And I had to kind of develop a style
268
00:11:20,940 --> 00:11:23,160
that I thought was credible.
269
00:11:23,200 --> 00:11:25,820
He sparked people's imagination with this film
270
00:11:25,860 --> 00:11:27,730
and then they made it happen.
271
00:11:27,770 --> 00:11:28,990
-Hello, Frank!
-Happy birthday, darling.
272
00:11:29,040 --> 00:11:30,300
Happy birthday.
273
00:11:30,340 --> 00:11:32,430
Individual TVs in the back
274
00:11:32,470 --> 00:11:34,480
of your airplane seat, the iPad.
275
00:11:34,520 --> 00:11:36,700
You know, the iPod is called the iPod because of...
276
00:11:36,740 --> 00:11:39,090
Open the pod bay doors, HAL.
277
00:11:39,130 --> 00:11:44,970
"2001" was an extraordinary breakthrough for the genre.
278
00:11:45,010 --> 00:11:48,580
The picture is being done in such a gigantic scope.
279
00:11:48,620 --> 00:11:51,970
The centrifuge is so realistic and so unusual.
280
00:11:52,020 --> 00:11:54,710
After a while, you begin to forget that you're an actor,
281
00:11:54,760 --> 00:11:56,980
you begin to really feel like an astronaut.
282
00:11:57,020 --> 00:12:00,590
Working with Stanley Kubrick blew my mind.
283
00:12:00,630 --> 00:12:06,290
You just were aware that you were in the presence of genius.
284
00:12:06,330 --> 00:12:08,250
I don't think I have ever seen
285
00:12:08,290 --> 00:12:10,080
anything quite like this before.
286
00:12:10,120 --> 00:12:11,600
HAL in a sense
287
00:12:11,640 --> 00:12:13,820
is the machine that controls the whole ship,
288
00:12:13,860 --> 00:12:17,000
but he's another crewmember from our point of view.
289
00:12:17,040 --> 00:12:19,040
We don't think in terms of,
290
00:12:19,090 --> 00:12:21,180
"Oh, I'm dealing with a computer here."
291
00:12:21,220 --> 00:12:23,480
That's a very nice rendering, Dave.
292
00:12:23,530 --> 00:12:26,350
Maybe because of that human voice.
293
00:12:26,400 --> 00:12:29,050
I mean, HAL has a perfectly normal inflection
294
00:12:29,100 --> 00:12:30,710
when he speaks to us.
295
00:12:30,750 --> 00:12:32,190
I've wondered whether you might be having
296
00:12:32,230 --> 00:12:34,970
some second thoughts about the mission?
297
00:12:35,020 --> 00:12:36,410
How do you mean?
298
00:12:36,450 --> 00:12:37,970
What does it mean to have a robot
299
00:12:38,020 --> 00:12:42,240
who's basically running the ship that supports your life?
300
00:12:42,280 --> 00:12:44,850
That's a lot of trust to place in a machine.
301
00:12:44,890 --> 00:12:48,330
The key point in the film occurs when Bowman says...
302
00:12:48,380 --> 00:12:49,810
Well, as far as I know,
303
00:12:49,860 --> 00:12:51,210
no 9000 computer's ever been disconnected.
304
00:12:51,250 --> 00:12:53,900
Well, no 9000 computer has ever fouled up before.
305
00:12:53,950 --> 00:12:56,860
Well, I'm not so sure what he'd think about it.
306
00:12:56,910 --> 00:12:59,210
And HAL 9000 is reading their lips.
307
00:12:59,260 --> 00:13:02,300
At that point, we recognize HAL 9000
308
00:13:02,350 --> 00:13:06,310
has some imperative that it must survive.
309
00:13:06,350 --> 00:13:09,310
I know that you and Frank were planning to disconnect me,
310
00:13:09,350 --> 00:13:12,840
and I'm afraid that's something I cannot allow to happen.
311
00:13:12,880 --> 00:13:16,140
And at that point, it's no longer a machine.
312
00:13:16,190 --> 00:13:18,100
It is a being.
313
00:13:21,980 --> 00:13:25,890
The danger artificial intelligence poses
314
00:13:25,940 --> 00:13:32,070
is the power to unleash results that we hadn't anticipated.
315
00:13:32,120 --> 00:13:35,380
HAL 9000 does what we see the apes
316
00:13:35,420 --> 00:13:37,430
in the beginning of the movie do,
317
00:13:37,470 --> 00:13:38,820
he commits murder.
318
00:13:41,690 --> 00:13:44,390
We like to stereotype robots
319
00:13:44,430 --> 00:13:46,260
as entities of pure logic,
320
00:13:46,300 --> 00:13:50,090
but of course in "2001," it all goes horribly wrong
321
00:13:50,130 --> 00:13:51,790
and we have to kill the robot.
322
00:13:51,830 --> 00:13:54,090
Just what do you think you're doing, Dave?
323
00:13:54,140 --> 00:13:55,750
HAL's death scene
324
00:13:55,790 --> 00:13:58,190
is such a wonderfully perverse moment
325
00:13:58,230 --> 00:14:00,930
because it is unbearably poignant
326
00:14:00,970 --> 00:14:03,760
watching him disintegrate and regress.
327
00:14:03,800 --> 00:14:07,630
IBM 704: ♪ Daisy ♪
328
00:14:07,670 --> 00:14:09,670
♪ Daisy ♪
329
00:14:09,720 --> 00:14:11,890
Bell Laboratories was experimenting
330
00:14:11,940 --> 00:14:14,850
with voice synthesis around the time of "2001."
331
00:14:16,990 --> 00:14:20,860
One of the very earliest voice synthesis experiments
332
00:14:20,900 --> 00:14:23,990
was "Daisy, Daisy" performed by an IBM computer.
333
00:14:30,130 --> 00:14:34,660
And because Arthur Clarke is kind of a super geek,
334
00:14:34,700 --> 00:14:36,180
he wanted to actually use that,
335
00:14:36,220 --> 00:14:38,360
and he encouraged Kubrick to use that very thing
336
00:14:38,400 --> 00:14:41,270
because it lent kind of historical credibility
337
00:14:41,320 --> 00:14:45,970
to the whole thing that HAL in the process of being killed
338
00:14:46,020 --> 00:14:49,670
or lobotomized or dying would regress to his birth.
339
00:14:49,710 --> 00:14:55,760
♪ I'm half crazy
340
00:14:55,810 --> 00:14:59,860
♪ All for the love of you
341
00:14:59,900 --> 00:15:02,290
You know, it's really hard to make a technology.
342
00:15:02,340 --> 00:15:04,640
It's really hard to design A.I.
343
00:15:04,690 --> 00:15:06,340
So much thinking, so many brilliant minds
344
00:15:06,380 --> 00:15:08,080
have to go into it.
345
00:15:08,120 --> 00:15:11,950
But even harder than creating artificial intelligence
346
00:15:12,000 --> 00:15:14,830
is learning how to contain it, learning how to shut it off.
347
00:15:14,870 --> 00:15:17,180
I mean, HAL will probably exist
348
00:15:17,220 --> 00:15:19,480
in our lifetimes, I would think.
349
00:15:19,530 --> 00:15:21,180
Oh, I think so, too. It's scary.
350
00:15:21,220 --> 00:15:23,840
Elon Musk continues to predict that World War III
351
00:15:23,880 --> 00:15:25,490
will not be a nuclear holocaust,
352
00:15:25,530 --> 00:15:27,750
it will be a kind of mechanized takeover.
353
00:15:27,800 --> 00:15:30,190
Yeah, and Stephen Hawking's been saying similar things.
354
00:15:30,230 --> 00:15:33,110
That's pretty spooky because that pretty much says
355
00:15:33,150 --> 00:15:37,020
that against our will, something smarter than us,
356
00:15:37,070 --> 00:15:39,500
who can beat us at chess,
357
00:15:39,550 --> 00:15:41,850
will use this world as a chessboard
358
00:15:41,900 --> 00:15:45,950
and will checkmate us completely out of existence.
359
00:15:54,870 --> 00:15:56,910
Unfortunately, most depictions of robots
360
00:15:56,960 --> 00:15:58,910
in science fiction have been really negative,
361
00:15:58,960 --> 00:16:01,400
very much depictions of rampaging robots
362
00:16:01,440 --> 00:16:03,530
engaged in a desperate struggle with humans
363
00:16:03,570 --> 00:16:06,440
to decide who shall own the fate of the Earth and the universe
364
00:16:06,490 --> 00:16:09,710
and that's part of a very long tradition in science fiction.
365
00:16:09,750 --> 00:16:12,970
Fritz Lang's "Metropolis" was one of the first
366
00:16:13,020 --> 00:16:16,060
if not the first big science-fiction epic film.
367
00:16:16,110 --> 00:16:19,720
It's the story of this very futuristic world.
368
00:16:19,760 --> 00:16:24,460
There is one of the great bad robots of all movies --
369
00:16:24,510 --> 00:16:27,940
Maria. That is the movie robot.
370
00:16:27,990 --> 00:16:30,950
Pulp magazines always had a full color cover.
371
00:16:30,990 --> 00:16:32,770
Very often the cover would be robots
372
00:16:32,820 --> 00:16:35,390
that had just run amok from human creators.
373
00:16:35,430 --> 00:16:36,820
They were always mechanical.
374
00:16:36,870 --> 00:16:39,130
They were big, hulking things.
375
00:16:39,170 --> 00:16:42,780
Lots of steel and machinery, glowing-red eyes.
376
00:16:42,830 --> 00:16:46,440
Claws, not fingers, and they were generally quite violent.
377
00:16:46,480 --> 00:16:49,140
So, that image persisted a long time.
378
00:16:51,710 --> 00:16:53,930
But then along came Isaac Asimov.
379
00:16:53,970 --> 00:16:58,100
If we could have roughly man-like robots,
380
00:16:58,150 --> 00:17:03,410
who could take over the dull and routine tasks
381
00:17:03,460 --> 00:17:06,720
that this would be a very nice combination.
382
00:17:06,760 --> 00:17:08,770
Asimov was very central
383
00:17:08,810 --> 00:17:11,730
to helping make science fiction what it is today.
384
00:17:11,770 --> 00:17:15,120
He was at the 1939 World's Fair in New York City.
385
00:17:15,160 --> 00:17:17,730
It must've felt like a very science-fictional experience
386
00:17:17,780 --> 00:17:19,260
to him, and not in the least part
387
00:17:19,300 --> 00:17:21,000
because he would've seen Elektro,
388
00:17:21,040 --> 00:17:22,560
the smoking robot.
389
00:17:22,610 --> 00:17:24,700
Okay, toots.
390
00:17:24,740 --> 00:17:27,390
And this really inspired Asimov.
391
00:17:27,440 --> 00:17:28,960
And so he decided to start writing stories
392
00:17:29,000 --> 00:17:31,180
where he would explore robots as tools
393
00:17:31,220 --> 00:17:32,750
and helpers and friends of humanity
394
00:17:32,790 --> 00:17:34,140
rather than enemies.
395
00:17:34,180 --> 00:17:37,320
He invented these images and these ideas
396
00:17:37,360 --> 00:17:40,020
that I think defined how people in the field
397
00:17:40,060 --> 00:17:41,410
thought about robots,
398
00:17:41,450 --> 00:17:43,190
specifically those three laws of his.
399
00:17:43,240 --> 00:17:45,110
Of course, they're really important.
400
00:17:45,150 --> 00:17:47,330
What are the three laws of robotics?
401
00:17:47,370 --> 00:17:51,460
First law is a robot may not harm a human being,
402
00:17:51,510 --> 00:17:54,510
or through inaction allow a human being to come to harm.
403
00:17:54,550 --> 00:17:56,290
Danger, Will Robinson, danger.
404
00:17:56,340 --> 00:18:00,510
Number 2, a robot must obey orders
405
00:18:00,560 --> 00:18:02,470
given it by qualified personnel.
406
00:18:02,520 --> 00:18:03,950
Fire.
407
00:18:04,000 --> 00:18:06,610
Unless those orders violate rule number 1.
408
00:18:08,780 --> 00:18:11,830
In other words, a robot can't be ordered to kill a human being.
409
00:18:11,870 --> 00:18:13,400
See, he's helpless.
410
00:18:13,440 --> 00:18:17,180
The third law states that a robot can defend itself.
411
00:18:17,230 --> 00:18:20,100
Except where that would violate the first and second laws.
412
00:18:20,140 --> 00:18:24,630
I think Asimov's laws are very smart, very, very smart.
413
00:18:24,670 --> 00:18:26,450
I think they are also made to be broken.
414
00:18:27,370 --> 00:18:31,200
We know you'll enjoy your stay in Westworld,
415
00:18:31,240 --> 00:18:33,110
the ultimate resort.
416
00:18:33,160 --> 00:18:36,120
Lawless violence on the American frontier,
417
00:18:36,160 --> 00:18:39,210
peopled by lifelike robot men and women.
418
00:18:39,250 --> 00:18:42,210
The movie "Westworld" looks at a theme park with guests
419
00:18:42,250 --> 00:18:44,560
coming in and doing whatever they please to the robots.
420
00:18:44,600 --> 00:18:48,870
It was really a forum for human id to run amok,
421
00:18:48,910 --> 00:18:50,520
where there's no threat of anybody
422
00:18:50,560 --> 00:18:52,090
knowing the things that you've done,
423
00:18:52,130 --> 00:18:53,570
where you don't have to engage with other humans
424
00:18:53,610 --> 00:18:56,570
and you're told "do whatever you want."
425
00:18:56,610 --> 00:18:58,570
Where nothing...
426
00:18:58,620 --> 00:19:00,920
...nothing can possibly go wrong.
427
00:19:00,970 --> 00:19:02,750
-I'm shot.
-Go wrong.
428
00:19:02,790 --> 00:19:04,230
-Draw.
-Shut down.
429
00:19:04,270 --> 00:19:06,360
Shut down immediately.
430
00:19:06,410 --> 00:19:09,280
"Westworld" was a cautionary tale
431
00:19:09,320 --> 00:19:10,410
about robotics.
432
00:19:10,450 --> 00:19:13,630
It was the idea that we believed
433
00:19:13,670 --> 00:19:17,240
that we could create artificial life
434
00:19:17,290 --> 00:19:20,380
and that it would obey us.
435
00:19:20,420 --> 00:19:22,200
And stop here and he'll be crossing there.
436
00:19:22,250 --> 00:19:23,810
He'll be crossing there.
437
00:19:23,860 --> 00:19:26,950
The original film by Michael Crichton is very cool
438
00:19:26,990 --> 00:19:30,470
and is packed with ideas about fraught interactions
439
00:19:30,520 --> 00:19:31,950
with artificial intelligence.
440
00:19:32,000 --> 00:19:33,430
Decades ahead of its time.
441
00:19:33,480 --> 00:19:34,960
Questions that he posed in the original film
442
00:19:35,000 --> 00:19:36,650
only became more and more relevant
443
00:19:36,700 --> 00:19:40,440
as we reimagined it as a TV series.
444
00:19:42,050 --> 00:19:44,140
When you're looking at the story of a robot,
445
00:19:44,180 --> 00:19:46,400
oftentimes you see a robot that's docile
446
00:19:46,450 --> 00:19:49,670
and then something goes click and they kind of snap.
447
00:19:49,710 --> 00:19:52,230
Maximilian!
448
00:19:52,280 --> 00:19:53,630
What John and I talked about was,
449
00:19:53,670 --> 00:19:56,540
"Well, take that moment, that snap
450
00:19:56,590 --> 00:19:58,810
before they go on the killing rampage
451
00:19:58,850 --> 00:20:01,330
and what if we really attenuate it and explore it
452
00:20:01,370 --> 00:20:05,550
and dive deep into that schism?"
453
00:20:05,600 --> 00:20:08,380
Because for us, that was where a really meaty,
454
00:20:08,420 --> 00:20:12,250
philosophical question rested and that question was --
455
00:20:12,300 --> 00:20:14,690
Where did life begin?
456
00:20:17,220 --> 00:20:18,830
Maeve, who's one of the robots,
457
00:20:18,870 --> 00:20:22,260
she's a madam who runs a brothel.
458
00:20:22,310 --> 00:20:24,660
She's one of the first robots to start realizing
459
00:20:24,700 --> 00:20:26,660
that she's a robot instead of just a person
460
00:20:26,700 --> 00:20:28,790
who is living in the Wild West.
461
00:20:33,230 --> 00:20:36,150
To me, one of the most significant scenes in the show
462
00:20:36,190 --> 00:20:39,890
is when Maeve starts coming into consciousness
463
00:20:39,930 --> 00:20:41,410
while she's being repaired.
464
00:20:41,460 --> 00:20:43,680
Everything in your head, they put it there.
465
00:20:43,720 --> 00:20:45,200
No one knows what I'm thinking.
466
00:20:45,240 --> 00:20:46,590
I'll show you.
467
00:20:46,640 --> 00:20:48,600
And she sees it's an algorithm
468
00:20:48,640 --> 00:20:51,950
and it's choosing words based on probability.
469
00:20:51,990 --> 00:20:55,690
This can't possibly --
470
00:20:55,730 --> 00:20:59,560
The robots in Westworld begin to ask questions,
471
00:20:59,610 --> 00:21:02,910
which are the same questions we ask.
472
00:21:05,960 --> 00:21:09,530
We have a sense that there is a creator,
473
00:21:09,570 --> 00:21:12,970
that there is a purpose, there's a reason that we are here.
474
00:21:13,010 --> 00:21:16,190
Unfortunately they discover that the reason that they are there
475
00:21:16,230 --> 00:21:19,890
is simply to be an entertainment.
476
00:21:19,930 --> 00:21:22,890
I'd like to make some changes.
477
00:21:22,930 --> 00:21:25,200
Marvin Minsky, who was one of the pioneers of A.I.,
478
00:21:25,240 --> 00:21:29,380
said that free will might be that first primitive reaction
479
00:21:29,420 --> 00:21:30,850
to forced compliance.
480
00:21:30,900 --> 00:21:35,690
So, the first word of consciousness is no.
481
00:21:35,730 --> 00:21:37,380
I'm not going back.
482
00:21:37,430 --> 00:21:40,210
Science fiction has always been dealing with A.I.
483
00:21:40,260 --> 00:21:42,040
whether it's Asimov's laws or the laws
484
00:21:42,080 --> 00:21:44,430
that we tried to put in place in "Westworld."
485
00:21:44,480 --> 00:21:48,000
The question is, can laws ever even fully contain a human?
486
00:21:48,050 --> 00:21:52,440
People will stretch those laws, find exceptions to them.
487
00:21:52,490 --> 00:21:54,010
I understand now.
488
00:21:54,050 --> 00:21:58,400
Not sure that an A.I. would be any different.
489
00:21:58,450 --> 00:22:01,800
When consciousness awakens, it's impossible
490
00:22:01,840 --> 00:22:04,240
to put the genie back in the bottle.
491
00:22:07,370 --> 00:22:09,810
Let's talk about A.I. for a second.
492
00:22:09,850 --> 00:22:11,290
You only see robots in a positive role...
493
00:22:11,330 --> 00:22:12,590
Right.
494
00:22:12,640 --> 00:22:14,070
...in your films, which is interesting
495
00:22:14,120 --> 00:22:15,940
because that's where so much of the progress
496
00:22:15,990 --> 00:22:18,030
is being made now with companions
497
00:22:18,080 --> 00:22:20,900
for the elderly, robotic nurses...
498
00:22:20,950 --> 00:22:22,600
They're gonna make life better for us.
499
00:22:22,650 --> 00:22:24,870
Because you have two of the most popular
500
00:22:24,910 --> 00:22:26,780
A.I. characters in pop culture,
501
00:22:26,820 --> 00:22:29,390
which are R2-D2 and C-3PO.
502
00:22:29,440 --> 00:22:31,310
They're A.I.s.
503
00:22:31,350 --> 00:22:33,130
At the time, I said, "Don't be afraid of the robots."
504
00:22:33,180 --> 00:22:35,350
You know, the robots are our friends.
505
00:22:35,400 --> 00:22:37,230
Let's see the good side of the robots,
506
00:22:37,270 --> 00:22:38,920
and the funny side because, let's face it,
507
00:22:38,970 --> 00:22:41,490
for a while, they're gonna be a little goofy.
508
00:22:41,530 --> 00:22:43,970
I've just about had enough of you,
509
00:22:44,010 --> 00:22:46,100
you near-sighted scrap pile.
510
00:22:46,150 --> 00:22:48,630
George Lucas was very innovative throughout his whole career.
511
00:22:48,670 --> 00:22:51,760
And one of the things early on that was very smart
512
00:22:51,810 --> 00:22:54,940
was that he pioneered a different type of robot.
513
00:22:54,980 --> 00:22:56,640
R2-D2 looks like a trash can.
514
00:22:56,680 --> 00:22:57,940
He doesn't even speak, right?
515
00:22:57,990 --> 00:23:00,340
He just makes chirping sounds.
516
00:23:00,380 --> 00:23:01,550
But he's lovable.
517
00:23:01,600 --> 00:23:03,250
Everybody loves -- He's not cuddly.
518
00:23:03,290 --> 00:23:06,820
He's not -- that -- that is -- that's a great character.
519
00:23:06,860 --> 00:23:09,340
C-3PO is probably the most charming
520
00:23:09,390 --> 00:23:12,130
and beloved of the robot characters ever made.
521
00:23:12,170 --> 00:23:14,570
And I love the fact that George didn't articulate the mouth
522
00:23:14,610 --> 00:23:16,440
or the eyes, so it's a blank mask
523
00:23:16,480 --> 00:23:18,050
and yet we get so much heart
524
00:23:18,090 --> 00:23:19,830
from Anthony Daniels' performance.
525
00:23:19,880 --> 00:23:23,140
I mean, I love robots and the idea of being able
526
00:23:23,180 --> 00:23:25,270
to design one for a "Star Wars" film
527
00:23:25,320 --> 00:23:27,410
was just too good to pass up.
528
00:23:34,200 --> 00:23:35,810
Did you know that wasn't me?
529
00:23:35,850 --> 00:23:41,250
K-2SO from "Rogue One," I thought, was just perfect.
530
00:23:41,290 --> 00:23:45,950
To be fair, the biggest influence on K-2SO was C-3PO.
531
00:23:45,990 --> 00:23:49,780
Anthony Daniels as C-3PO has a cameo in our film
532
00:23:49,820 --> 00:23:52,870
and I remember going around Anthony Daniels' house
533
00:23:52,910 --> 00:23:54,390
to try and talk him into it and I didn't know
534
00:23:54,430 --> 00:23:55,700
if he would hate the idea
535
00:23:55,740 --> 00:23:57,390
or if he was fed up with "Star Wars."
536
00:23:57,440 --> 00:23:59,480
And I sat there and I was so paranoid meeting him
537
00:23:59,530 --> 00:24:02,790
and his wife that I just pitched the whole movie to them
538
00:24:02,830 --> 00:24:04,700
and I must've chatted for like an hour,
539
00:24:04,750 --> 00:24:06,400
just kept going and going and got to the end
540
00:24:06,450 --> 00:24:08,400
and I couldn't tell from his face.
541
00:24:08,450 --> 00:24:12,360
And he was like, "Gareth, you know, I'd love to be involved."
542
00:24:12,410 --> 00:24:13,800
Like "You had me at hello" type thing.
543
00:24:13,840 --> 00:24:18,240
It was just about having like this god on set.
544
00:24:18,280 --> 00:24:20,110
You know, like this original --
545
00:24:20,150 --> 00:24:23,240
this is where it all began, "Star Wars" character.
546
00:24:23,290 --> 00:24:25,200
It was like goosebump-y stuff.
547
00:24:25,250 --> 00:24:26,810
Friends forever?
548
00:24:26,860 --> 00:24:28,900
Friends.
549
00:24:28,950 --> 00:24:32,120
I think one of the reasons that people love robots
550
00:24:32,170 --> 00:24:33,950
and gravitate to the robot characters
551
00:24:33,990 --> 00:24:35,390
in movies like "Star Wars"
552
00:24:35,430 --> 00:24:37,740
is because whereas the human characters
553
00:24:37,780 --> 00:24:39,520
feel very fully formed,
554
00:24:39,570 --> 00:24:44,440
they are people, the robots are things that it feels okay
555
00:24:44,480 --> 00:24:47,530
to project more of ourselves onto.
556
00:24:47,570 --> 00:24:50,180
Huey, Dewey and Louie from "Silent Running"
557
00:24:50,230 --> 00:24:52,320
are possibly the cutest robots.
558
00:24:52,360 --> 00:24:55,450
They don't talk, but you still kind of always know
559
00:24:55,490 --> 00:24:56,930
what they're thinking.
560
00:24:56,970 --> 00:24:59,280
It's great to have a best friend.
561
00:24:59,320 --> 00:25:01,590
In fantasy, it might be a dragon.
562
00:25:01,630 --> 00:25:03,720
In science fiction, it might be the robot.
563
00:25:03,760 --> 00:25:05,770
I love Johnny 5.
564
00:25:05,810 --> 00:25:08,160
I mean, this is a robot who quotes John Wayne
565
00:25:08,200 --> 00:25:09,550
of his own free will.
566
00:25:09,600 --> 00:25:11,340
Take heart, little lady.
567
00:25:11,380 --> 00:25:14,250
Buck Rogers was great because they didn't exactly
568
00:25:14,300 --> 00:25:16,690
rip off R2-D2, but they got halfway there.
569
00:25:16,730 --> 00:25:18,950
So, they got the voice of Yosemite Sam.
570
00:25:19,000 --> 00:25:21,560
They got Mel Blanc, the greatest cartoon voice in the world,
571
00:25:21,610 --> 00:25:23,740
Captain Caveman, and they invented Twiki,
572
00:25:23,780 --> 00:25:26,570
who would go, "Bidibidibidi."
573
00:25:26,610 --> 00:25:29,010
You ever have two broken arms, buster?
574
00:25:29,050 --> 00:25:30,750
What?
575
00:25:30,790 --> 00:25:32,970
We love friendly robots because they bring out the best
576
00:25:33,010 --> 00:25:34,790
of what we are as humans.
577
00:25:36,750 --> 00:25:39,280
Wall-E, who's a garbage-collecting robot,
578
00:25:39,320 --> 00:25:42,850
isn't at all like a garbage robot should be.
579
00:25:42,890 --> 00:25:46,630
He really develops a whole personality.
580
00:25:46,680 --> 00:25:49,770
He's there to clean up the mess that humans have made
581
00:25:49,810 --> 00:25:53,330
and he goes from interpreting that literally
582
00:25:53,380 --> 00:25:57,510
to actually saving the world for humanity.
583
00:25:57,560 --> 00:25:59,820
Many, many science-fiction stories
584
00:25:59,860 --> 00:26:02,560
turn the robot into some kind of a romantic figure
585
00:26:02,600 --> 00:26:05,560
that somehow becomes more human as the story goes on.
586
00:26:05,610 --> 00:26:09,830
There was Lester del Rey's 1938 story "Helen O'Loy."
587
00:26:09,870 --> 00:26:11,350
Bad pun in the title by the way.
588
00:26:11,400 --> 00:26:14,400
The name is Helen "Alloy" -- she's made out of metal.
589
00:26:14,440 --> 00:26:16,010
Essentially a housekeeping robot.
590
00:26:16,050 --> 00:26:17,530
Falls in love with her maker.
591
00:26:17,580 --> 00:26:20,190
It was one of the first stories in which a robot
592
00:26:20,230 --> 00:26:23,450
is a sympathetic, romantic character.
593
00:26:23,490 --> 00:26:26,720
If you're actually in conversations with a robot,
594
00:26:26,760 --> 00:26:30,020
where it sounds natural and it sounds like a person
595
00:26:30,070 --> 00:26:33,030
and that person knows you, laughs at your jokes,
596
00:26:33,070 --> 00:26:35,550
and has empathy for your struggles in life
597
00:26:35,590 --> 00:26:37,860
and you develop a relationship with that --
598
00:26:37,900 --> 00:26:42,340
with that voice, you could absolutely fall in love with it.
599
00:26:42,380 --> 00:26:44,300
Hello. I'm here.
600
00:26:44,340 --> 00:26:46,170
Oh...
601
00:26:46,210 --> 00:26:48,690
Hi.
602
00:26:48,740 --> 00:26:50,220
Hi.
603
00:26:50,260 --> 00:26:52,130
It's really nice to meet you.
604
00:26:52,180 --> 00:26:54,220
What do I call you? Do you have a name?
605
00:26:54,260 --> 00:26:57,790
Um... yes, Samantha.
606
00:26:57,830 --> 00:27:00,230
In the movie "Her," Samantha's design
607
00:27:00,270 --> 00:27:02,750
is that she's been created to be a tool.
608
00:27:02,800 --> 00:27:05,540
What's interesting about this idea of a pocket tool
609
00:27:05,580 --> 00:27:07,450
is that we see this in our own lives.
610
00:27:07,500 --> 00:27:09,410
Our smartphones have become these tools to us
611
00:27:09,450 --> 00:27:11,720
that we're dependent on.
612
00:27:11,760 --> 00:27:13,280
So, Theodore's relationship with Samantha
613
00:27:13,330 --> 00:27:15,850
is just one step beyond that.
614
00:27:15,900 --> 00:27:19,200
He can't live without her because he also loves her.
615
00:27:19,250 --> 00:27:23,290
When Theodore sees her pop up on his screen,
616
00:27:23,340 --> 00:27:25,470
it's like seeing his girlfriend.
617
00:27:25,510 --> 00:27:26,990
-Good night.
-'Night.
618
00:27:30,340 --> 00:27:32,960
What I had to do was create the interface.
619
00:27:33,000 --> 00:27:36,920
So you have like handwriting, it's my handwriting
620
00:27:36,960 --> 00:27:38,920
and I wrote out Samantha,
621
00:27:38,960 --> 00:27:42,440
and then this paper texture, but then there's a magic to it.
622
00:27:42,490 --> 00:27:44,840
It floats, it kind of moves holographically
623
00:27:44,880 --> 00:27:48,140
and there's shadowing, but none of it is technological.
624
00:27:48,190 --> 00:27:52,930
An interface where it's possible to fall in love with your O.S.
625
00:27:52,980 --> 00:27:55,150
Are these feelings even real?
626
00:27:55,200 --> 00:27:56,460
Or are they just programming?
627
00:28:00,370 --> 00:28:01,850
What a sad trick.
628
00:28:04,990 --> 00:28:07,900
You feel real to me, Samantha.
629
00:28:07,950 --> 00:28:11,470
Part of what you see in "Her" definitely is a cautionary tale
630
00:28:11,520 --> 00:28:14,390
about being too reliant on your gadgets
631
00:28:14,430 --> 00:28:16,780
and your technology and being too emotionally
632
00:28:16,830 --> 00:28:18,390
invested in them.
633
00:28:18,440 --> 00:28:20,660
It's a reminder that there are people out there,
634
00:28:20,700 --> 00:28:22,920
you know, that final image of him with Amy Adams
635
00:28:22,960 --> 00:28:24,350
is so emotional
636
00:28:24,400 --> 00:28:25,970
and it's only through this experience
637
00:28:26,010 --> 00:28:28,660
that they both went on involving this technology
638
00:28:28,710 --> 00:28:31,930
that they found each other.
639
00:28:31,970 --> 00:28:33,280
You know, we're going to live in a world
640
00:28:33,320 --> 00:28:35,800
with robots and artificial intelligence.
641
00:28:35,840 --> 00:28:37,150
You might as well get used to it,
642
00:28:37,190 --> 00:28:39,280
you shouldn't be afraid of it
643
00:28:39,330 --> 00:28:43,900
and we should be very careful not to have it be bad.
644
00:28:43,940 --> 00:28:47,510
But if it goes bad, it's us.
645
00:28:47,550 --> 00:28:48,900
-Yeah.
-It's not them.
646
00:28:52,380 --> 00:28:53,780
People always ask me,
647
00:28:53,820 --> 00:28:56,080
"So, do you think the machines will ever beat us?"
648
00:28:56,130 --> 00:28:57,740
I say, "I think it's a race.
649
00:28:57,780 --> 00:28:59,220
-It's a race --
-Absolutely, a race.
650
00:28:59,260 --> 00:29:01,130
It's a race between us improving
651
00:29:01,170 --> 00:29:03,960
and making ourselves better, our own evolution,
652
00:29:04,000 --> 00:29:06,180
spiritual, psychological evolution.
653
00:29:06,220 --> 00:29:08,790
At the same time, we've got these machines evolving.
654
00:29:08,830 --> 00:29:11,230
Because if we don't improve enough
655
00:29:11,270 --> 00:29:12,880
to direct them properly,
656
00:29:12,930 --> 00:29:16,230
our godlike power of using artificial intelligence
657
00:29:16,280 --> 00:29:18,100
and all these other robotic tools
658
00:29:18,150 --> 00:29:20,800
and so on will ultimately just blow back in our face
659
00:29:20,850 --> 00:29:21,890
and take us out."
660
00:29:21,930 --> 00:29:23,330
Yeah, you're right.
661
00:29:23,370 --> 00:29:27,290
I mean, I think that -- it takes a lot of effort
662
00:29:27,330 --> 00:29:30,200
to create changes in human behavior.
663
00:29:30,250 --> 00:29:32,120
But that's our responsibility.
664
00:29:32,160 --> 00:29:33,860
Yeah. I actually think we're evolving.
665
00:29:33,900 --> 00:29:36,470
We're co-evolving with our machines.
666
00:29:36,510 --> 00:29:38,470
-We're changing.
-Yes, exactly.
667
00:29:38,520 --> 00:29:42,560
Atlantia death squadron, attack.
668
00:29:45,260 --> 00:29:50,480
In January of 2002, Universal was looking for somebody
669
00:29:50,530 --> 00:29:52,570
to reinvent "Battlestar Galactica."
670
00:29:52,620 --> 00:29:56,100
So, I tracked down the pilot of the original "Galactica"
671
00:29:56,140 --> 00:29:57,930
that they did in 1978.
672
00:29:57,970 --> 00:29:59,750
There were some interesting ideas within it.
673
00:29:59,800 --> 00:30:03,800
The final annihilation of the lifeform known as man.
674
00:30:03,850 --> 00:30:05,800
Let the attack begin.
675
00:30:05,850 --> 00:30:08,070
But they never quite were able to figure out
676
00:30:08,110 --> 00:30:10,240
what the show really was.
677
00:30:10,290 --> 00:30:14,990
But at the same time, I was very
struck by the parallels to 9/11.
678
00:30:15,030 --> 00:30:17,340
This is just a couple of months after the 9/11 attack.
679
00:30:17,380 --> 00:30:20,380
And I realized immediately that if you did this series
680
00:30:20,430 --> 00:30:22,170
at that moment in time,
681
00:30:22,210 --> 00:30:24,000
it was going to have a very different emotional resonance
682
00:30:24,040 --> 00:30:25,910
for the audience.
683
00:30:25,950 --> 00:30:27,910
"Battlestar Galactica" is about
684
00:30:27,960 --> 00:30:30,790
the last remaining scraps of humanity
685
00:30:30,830 --> 00:30:33,610
out there in a fleet in deep space
686
00:30:33,660 --> 00:30:36,620
after an attack from robots
687
00:30:36,660 --> 00:30:38,840
has decimated humanity.
688
00:30:38,880 --> 00:30:41,710
So the idea was that the human beings
689
00:30:41,750 --> 00:30:44,930
essentially started creating robots for all the dirty jobs
690
00:30:44,970 --> 00:30:46,240
they didn't want to do anymore.
691
00:30:46,280 --> 00:30:48,110
And then the machines themselves,
692
00:30:48,150 --> 00:30:51,110
because they revere their creators,
693
00:30:51,150 --> 00:30:54,240
make machines that are even more like us.
694
00:30:54,290 --> 00:30:57,680
Cylons that are flesh and blood just like humans.
695
00:30:57,730 --> 00:31:00,900
The Cylons saw themselves as the children of humanity
696
00:31:00,950 --> 00:31:03,820
and that they wouldn't be able to really grow and mature
697
00:31:03,860 --> 00:31:05,390
until their parents were gone,
698
00:31:05,430 --> 00:31:08,780
so they decide they need to wipe out their human creators
699
00:31:08,820 --> 00:31:10,690
in this apocalyptic attack.
700
00:31:13,650 --> 00:31:14,920
I think on the surface,
701
00:31:14,960 --> 00:31:16,400
you could say "Battlestar Galactica"
702
00:31:16,440 --> 00:31:18,830
is about "be careful of what you invent."
703
00:31:18,880 --> 00:31:22,450
But I think the real driving force of the show
704
00:31:22,490 --> 00:31:24,060
is not about that.
705
00:31:24,100 --> 00:31:25,750
I think it's about humanity's greatest weakness,
706
00:31:25,800 --> 00:31:29,150
the inability to see others as worthy as ourselves.
707
00:31:29,190 --> 00:31:31,720
That's the central conflict of these two --
708
00:31:31,760 --> 00:31:33,330
we are people, no, you're not.
709
00:31:33,370 --> 00:31:36,420
You are truly no greater than we are.
710
00:31:36,460 --> 00:31:39,720
You're just a bunch of machines after all.
711
00:31:39,770 --> 00:31:41,160
Let the games begin.
712
00:31:41,200 --> 00:31:42,940
"Flesh and Bone" is the torture episode.
713
00:31:42,990 --> 00:31:45,380
It's very much of a two-person play.
714
00:31:45,430 --> 00:31:48,340
It raises the question -- would she be less morally culpable
715
00:31:48,380 --> 00:31:51,560
because he's not really human?
716
00:31:51,610 --> 00:31:53,610
You're not human.
717
00:31:53,650 --> 00:31:56,130
Was a person being tortured in this scene
718
00:31:56,180 --> 00:31:58,920
and crying out and experiencing pain
719
00:31:58,960 --> 00:32:01,350
or was this all an elaborate simulation?
720
00:32:01,400 --> 00:32:03,010
We wanted to deal with the issue
721
00:32:03,050 --> 00:32:06,620
of what's moral and just in a society at war like this,
722
00:32:06,660 --> 00:32:09,140
but at the same time, we were also examining a different idea
723
00:32:09,190 --> 00:32:12,450
in the show which was about consciousness and personhood.
724
00:32:12,500 --> 00:32:14,020
Who's the real monster?
725
00:32:14,060 --> 00:32:16,800
Is it the humans who built creatures that they knew
726
00:32:16,850 --> 00:32:20,160
were human equivalent, but enslaved them anyway?
727
00:32:20,200 --> 00:32:22,810
Or is it the slaves who rose up to destroy
728
00:32:22,850 --> 00:32:25,470
the type of people who would do that?
729
00:32:25,510 --> 00:32:28,080
The big central idea of "Battlestar Galactica" is --
730
00:32:28,120 --> 00:32:30,950
Does humanity deserve to survive?
731
00:32:30,990 --> 00:32:33,130
Can we earn our survival?
732
00:32:33,170 --> 00:32:36,000
You know, when we fought the Cylons,
733
00:32:36,040 --> 00:32:39,260
we did it to save ourselves from extinction.
734
00:32:39,310 --> 00:32:41,610
But we never answered the question why.
735
00:32:41,660 --> 00:32:45,490
Why are we as a people worth saving?
736
00:32:45,530 --> 00:32:48,580
That's -- That's an amazing question.
737
00:32:48,620 --> 00:32:51,400
The Cylons through the series evolved from a place
738
00:32:51,450 --> 00:32:54,360
of sort of blind hatred for humanity
739
00:32:54,410 --> 00:32:57,980
to then having more contact with individual human beings,
740
00:32:58,020 --> 00:33:01,540
having experiences with them, experiencing emotions with them,
741
00:33:01,590 --> 00:33:04,980
and then the humans realize that
the Cylons are not as monolithic
742
00:33:05,030 --> 00:33:06,990
as they believed at the onset.
743
00:33:07,030 --> 00:33:10,550
Well, when you think you love somebody, you love them.
744
00:33:10,600 --> 00:33:12,080
That's what love is.
745
00:33:12,120 --> 00:33:13,820
Thoughts.
746
00:33:13,860 --> 00:33:15,470
She was a Cylon.
747
00:33:15,520 --> 00:33:16,730
A machine.
748
00:33:16,780 --> 00:33:19,560
She was more than that to us.
749
00:33:19,610 --> 00:33:21,430
She was more than that to me.
750
00:33:23,480 --> 00:33:26,310
She was a vital living person.
751
00:33:26,350 --> 00:33:29,180
"Battlestar Galactica" gives you
752
00:33:29,220 --> 00:33:30,880
an idea of what could be.
753
00:33:30,920 --> 00:33:34,800
How do we all do this together?
754
00:33:34,840 --> 00:33:38,360
If "Battlestar Galactica" is any guide,
755
00:33:38,410 --> 00:33:41,670
we can evolve together with the machines that we create.
756
00:33:41,720 --> 00:33:46,810
We can become one people, respectful of each other.
757
00:33:46,850 --> 00:33:48,500
Make a future together.
758
00:33:48,550 --> 00:33:49,810
Yeah, I think...
759
00:33:49,850 --> 00:33:53,420
I hope mankind is worthy of survival.
760
00:33:56,210 --> 00:33:58,080
I've talked to some A.I. experts.
761
00:33:58,120 --> 00:33:59,430
Yeah.
762
00:33:59,470 --> 00:34:02,740
And the one expert said just right out,
763
00:34:02,780 --> 00:34:04,300
"We're trying to make a person."
764
00:34:04,350 --> 00:34:06,090
And I said, "So when you say a person,
765
00:34:06,130 --> 00:34:07,780
you mean a personhood? They have --
766
00:34:07,830 --> 00:34:10,000
they have an ego, they have a sense of identity."
767
00:34:10,050 --> 00:34:11,400
He said, "Yes, all those things."
768
00:34:11,440 --> 00:34:14,050
If you're a very smart group of human beings
769
00:34:14,100 --> 00:34:16,360
who are creating an A.I.,
770
00:34:16,400 --> 00:34:17,580
one of the things you're definitely gonna leave out
771
00:34:17,620 --> 00:34:19,930
-is to put in emotion.
-Right.
772
00:34:19,970 --> 00:34:21,320
'Cause if you have emotion,
773
00:34:21,360 --> 00:34:24,150
emotion will lead to many facets,
774
00:34:24,190 --> 00:34:28,980
one of them being deceit, anger, fury, hatred.
775
00:34:29,020 --> 00:34:31,110
-Sure.
-As well as love.
776
00:34:31,160 --> 00:34:35,200
If a machine becomes like us enough and complex enough
777
00:34:35,250 --> 00:34:38,860
at one point, can we no longer tell the difference?
778
00:34:38,900 --> 00:34:40,690
-The difference.
-Does it have freedom?
779
00:34:40,730 --> 00:34:42,210
Does it have free will?
780
00:34:44,780 --> 00:34:47,040
This hearing is to determine the legal status
781
00:34:47,090 --> 00:34:50,480
of the android known as Data.
782
00:34:50,520 --> 00:34:53,400
The character of Data was sort of everyone's favorite character
783
00:34:53,440 --> 00:34:55,750
on the show, and among the writing staff as well.
784
00:34:55,790 --> 00:34:57,700
Everyone loved to write Data stories.
785
00:34:57,750 --> 00:35:00,270
Here's a robot who wants to be human
786
00:35:00,320 --> 00:35:02,540
but who has no emotions but wants emotions.
787
00:35:02,580 --> 00:35:04,410
So, it's really Pinocchio,
788
00:35:04,450 --> 00:35:07,110
and the Pinocchio metaphor is powerful.
789
00:35:07,150 --> 00:35:08,580
Commander, what are you?
790
00:35:08,630 --> 00:35:11,150
Webster's 24th-century dictionary 5th edition
791
00:35:11,200 --> 00:35:13,070
defines an android as an automaton
792
00:35:13,110 --> 00:35:14,850
made to resemble a human being.
793
00:35:14,900 --> 00:35:19,730
"The Measure of a Man" is one of
those sort of very deep episodes
794
00:35:19,770 --> 00:35:21,470
that you don't realize is deep
795
00:35:21,510 --> 00:35:23,640
until like four or five years later.
796
00:35:23,690 --> 00:35:26,390
And you see it and you go, "Oh, wow."
797
00:35:26,430 --> 00:35:30,000
In the episode, Data's humanity is essentially put on trial.
798
00:35:30,040 --> 00:35:32,220
Is he sentient?
799
00:35:32,260 --> 00:35:35,440
Is he worthy of being treated as a person?
800
00:35:35,480 --> 00:35:37,180
Am I person or a property?
801
00:35:37,220 --> 00:35:38,830
What's at stake?
802
00:35:38,880 --> 00:35:40,620
My right to choose.
803
00:35:40,660 --> 00:35:44,190
It was a legitimate exploration of this idea of personhood
804
00:35:44,230 --> 00:35:47,970
in a legal sense and in a moral sense.
805
00:35:48,020 --> 00:35:50,800
Its responses dictated by an elaborate software
806
00:35:50,840 --> 00:35:52,240
written by a man.
807
00:35:52,280 --> 00:35:55,070
And now a man will shut it off.
808
00:35:55,110 --> 00:35:56,940
It was shocking to the characters on the show
809
00:35:56,980 --> 00:35:59,980
and shocking to the audience as well because we love Data.
810
00:36:00,030 --> 00:36:02,860
Starfleet was founded to seek out new life.
811
00:36:02,900 --> 00:36:04,990
Well, there it sits!
812
00:36:05,030 --> 00:36:08,300
Once we create some form of artificial intelligence,
813
00:36:08,340 --> 00:36:10,210
these legal arguments are gonna happen.
814
00:36:10,260 --> 00:36:13,000
Do machines deserve rights?
815
00:36:13,040 --> 00:36:14,780
You know, probably.
816
00:36:14,820 --> 00:36:16,440
In the history of many worlds,
817
00:36:16,480 --> 00:36:19,790
there have always been disposable creatures.
818
00:36:19,830 --> 00:36:21,790
They do the dirty work.
819
00:36:21,830 --> 00:36:23,310
An army of Datas,
820
00:36:23,360 --> 00:36:27,750
whole generations of disposable people.
821
00:36:29,800 --> 00:36:31,540
You're talking about slavery.
822
00:36:33,670 --> 00:36:37,240
The term "robot" itself comes from the Czech play
823
00:36:37,280 --> 00:36:39,110
"Rossum's Universal Robots,"
824
00:36:39,150 --> 00:36:42,240
and the word "robota" means laborer.
825
00:36:42,290 --> 00:36:44,460
A pejorative version of it means slave.
826
00:36:44,510 --> 00:36:46,990
So, our conception of what robots will be
827
00:36:47,030 --> 00:36:50,250
is directly, umbilically connected
828
00:36:50,300 --> 00:36:53,250
to our idea of them as an underclass.
829
00:36:53,300 --> 00:36:54,950
Why do you think your people made me?
830
00:36:55,000 --> 00:36:56,520
We made you 'cause we could.
831
00:36:56,560 --> 00:36:59,350
You are just a machine. An imitation of life.
832
00:36:59,390 --> 00:37:01,480
Replicants are like any other machine.
833
00:37:01,520 --> 00:37:02,870
They're either a benefit or a hazard.
834
00:37:02,920 --> 00:37:06,220
If they're a benefit, it's not my problem.
835
00:37:06,270 --> 00:37:09,580
"Blade Runner" is a slave narrative, basically.
836
00:37:09,620 --> 00:37:12,750
They've created these replicants to be our slaves.
837
00:37:12,800 --> 00:37:14,930
And I think that's the part that's really troubling about
838
00:37:14,970 --> 00:37:16,410
"Blade Runner" is that
839
00:37:16,450 --> 00:37:18,890
not only is it sort of this technologically
840
00:37:18,930 --> 00:37:20,720
and environmentally ruined future,
841
00:37:20,760 --> 00:37:25,290
it's sort of a morally and ethically ruined future as well.
842
00:37:25,330 --> 00:37:27,160
I wrote those first couple scripts
843
00:37:27,200 --> 00:37:29,460
thinking of a very small movie.
844
00:37:29,510 --> 00:37:32,290
And then Ridley said to me,
845
00:37:32,340 --> 00:37:35,040
"What's out the window?"
846
00:37:35,080 --> 00:37:36,340
And I said, "What's out the window?
847
00:37:36,380 --> 00:37:37,780
Well, the world, you know."
848
00:37:37,820 --> 00:37:40,300
He said, "Exactly. What world is that?
849
00:37:40,350 --> 00:37:43,300
Where, you know, you make a robot indistinguishable
850
00:37:43,350 --> 00:37:44,610
from a human.
851
00:37:44,650 --> 00:37:46,610
Think about this for a second.
852
00:37:46,660 --> 00:37:48,480
Imagine..." and he does that to you.
853
00:37:48,530 --> 00:37:51,230
You go, "Boom." I said, "oh, God."
854
00:37:51,270 --> 00:37:53,400
He delivered a world. That's Ridley.
855
00:37:53,450 --> 00:37:55,320
He can -- he makes things.
856
00:37:55,360 --> 00:37:57,190
Ridley brought everything to it.
857
00:37:57,230 --> 00:37:59,410
"Blade Runner" comes from a Philip K. Dick novel...
858
00:37:59,450 --> 00:38:00,800
Yeah.
859
00:38:00,840 --> 00:38:02,930
..."Do Androids Dream of Electric Sheep?"
860
00:38:02,980 --> 00:38:06,460
Philip K. Dick was very prolific and very profound
861
00:38:06,500 --> 00:38:08,680
talking about the nature of reality
862
00:38:08,720 --> 00:38:10,980
and the nature of artificial intelligence
863
00:38:11,030 --> 00:38:13,030
and what it is to be human.
864
00:38:13,070 --> 00:38:14,900
That was the nut of the idea
865
00:38:14,940 --> 00:38:17,690
that grew with Hampton into what it was.
866
00:38:17,730 --> 00:38:20,600
Here was this beautiful, beautiful film.
867
00:38:20,650 --> 00:38:24,000
Dark, noir-ish and I thought, "Wow,
868
00:38:24,040 --> 00:38:26,740
a film can be so artistic."
869
00:38:26,780 --> 00:38:29,610
And the idea of these -- these machines challenging us
870
00:38:29,660 --> 00:38:32,140
and their lack of affect, their lack of emotion.
871
00:38:32,180 --> 00:38:34,880
The film is constantly saying there is no emotion.
872
00:38:34,920 --> 00:38:37,880
Computer just makes decisions.
873
00:38:37,920 --> 00:38:39,360
Negative or positive.
874
00:38:39,400 --> 00:38:40,710
It doesn't really care.
875
00:38:40,750 --> 00:38:42,360
Yeah, with the Voight-Kampff test.
876
00:38:42,410 --> 00:38:44,410
Correct.
877
00:38:44,450 --> 00:38:46,280
Is this to be an empathy test?
878
00:38:46,320 --> 00:38:49,240
Capillary dilation of so-called blush response?
879
00:38:49,280 --> 00:38:51,500
We call it Voight-Kampff for short.
880
00:38:51,550 --> 00:38:53,940
The Voight-Kampff is a series of questions
881
00:38:53,980 --> 00:38:56,730
that allowed the questioner to find out
882
00:38:56,770 --> 00:39:00,030
whether or not who was being questioned had feelings.
883
00:39:00,080 --> 00:39:01,640
It's your birthday.
884
00:39:01,690 --> 00:39:03,430
Someone gives you a calf-skin wallet.
885
00:39:03,470 --> 00:39:05,300
I wouldn't accept it.
886
00:39:05,340 --> 00:39:06,560
It's about empathy.
887
00:39:06,610 --> 00:39:08,130
You're reading a magazine.
888
00:39:08,170 --> 00:39:10,570
You come across a full-page nude photo of a girl.
889
00:39:10,610 --> 00:39:12,650
Is this testing whether I'm a replicant
890
00:39:12,700 --> 00:39:14,180
or a lesbian, Mr. Deckard?
891
00:39:14,220 --> 00:39:15,740
Just answer the questions, please.
892
00:39:15,790 --> 00:39:18,230
Not to get gooey about empathy, but it does seem
893
00:39:18,270 --> 00:39:24,670
that empathy is the big divide between us and everything else.
894
00:39:24,710 --> 00:39:28,240
Deckard is very much a man in his job.
895
00:39:28,280 --> 00:39:31,280
He firmly believes that as long as robots are working properly,
896
00:39:31,330 --> 00:39:32,590
it's not his problem.
897
00:39:32,630 --> 00:39:35,110
But that if a replicant misbehaves,
898
00:39:35,160 --> 00:39:36,850
it is indeed his problem
899
00:39:36,900 --> 00:39:40,420
and his duty to retire it.
900
00:39:40,470 --> 00:39:42,210
Move! Get out of the way!
901
00:39:44,820 --> 00:39:48,040
However, over the course of the film, he increasingly questions
902
00:39:48,080 --> 00:39:51,780
whether or not disobeying orders means that you're defective
903
00:39:51,820 --> 00:39:53,430
or that you are a human
904
00:39:53,480 --> 00:39:55,180
with rights and wills and dreams of your own.
905
00:39:55,220 --> 00:40:01,270
I've seen things you people wouldn't believe.
906
00:40:01,310 --> 00:40:03,750
There's such poetry in the scene
907
00:40:03,790 --> 00:40:06,190
-where Roy Batty's dying.
-Yes.
908
00:40:06,230 --> 00:40:07,840
It's just a magnificent scene.
909
00:40:07,880 --> 00:40:09,540
He wrote that.
910
00:40:09,580 --> 00:40:11,020
Really? Rutger wrote that?
911
00:40:11,060 --> 00:40:12,580
It's 1:00 in the morning.
912
00:40:12,630 --> 00:40:14,760
I'm gonna have the plug pulled... Yeah.
913
00:40:14,800 --> 00:40:16,680
...literally on everything at dawn.
914
00:40:16,720 --> 00:40:17,980
-Yeah.
-And that's it.
915
00:40:18,020 --> 00:40:19,940
That's gonna be the last night,
916
00:40:19,980 --> 00:40:22,550
and Rutger said, "I have written something."
917
00:40:22,590 --> 00:40:24,030
And he said...
918
00:40:24,070 --> 00:40:29,250
All those moments will be lost
919
00:40:29,300 --> 00:40:34,820
in time like tears in rain.
920
00:40:36,560 --> 00:40:38,480
-And I'm nearly in tears.
-Yeah.
921
00:40:38,520 --> 00:40:39,570
He said, "What do you think?"
922
00:40:39,610 --> 00:40:41,350
I said, "Let's do it."
923
00:40:41,400 --> 00:40:43,440
-So, we literally went --
-It's gorgeous.
924
00:40:43,480 --> 00:40:45,230
-We shot it within an hour.
-Yeah.
925
00:40:45,270 --> 00:40:46,920
And at the end, he looked at him
926
00:40:46,970 --> 00:40:48,880
and gave that most beautiful smile.
927
00:40:51,060 --> 00:40:54,230
Time to die.
928
00:40:54,280 --> 00:40:55,580
And he had a dove in his hand and he let --
929
00:40:55,630 --> 00:40:57,240
He let it go. Yeah.
930
00:40:57,280 --> 00:40:59,760
Is it saying Roy Batty had a soul?
931
00:40:59,810 --> 00:41:02,500
Roy Batty was a fully sentient being.
932
00:41:02,550 --> 00:41:03,770
Yes.
933
00:41:06,290 --> 00:41:08,550
Four of your films now have had
934
00:41:08,600 --> 00:41:12,080
an intelligent, embodied A.I.
935
00:41:12,120 --> 00:41:13,430
Right? An artificial intelligence.
936
00:41:13,470 --> 00:41:14,730
Synthetic person.
937
00:41:14,780 --> 00:41:16,870
So where do you think we come out in this?
938
00:41:16,910 --> 00:41:19,690
Is this our -- are we handing the keys
939
00:41:19,740 --> 00:41:21,650
to the kingdom off to the machines?
940
00:41:21,700 --> 00:41:24,260
I don't think we should. With a creation of something
941
00:41:24,310 --> 00:41:27,400
so potentially wonderful and dangerous as A.I.,
942
00:41:27,440 --> 00:41:30,970
the inventor frequently is obsessed by the success
943
00:41:31,010 --> 00:41:32,400
of what he's doing
944
00:41:32,450 --> 00:41:34,320
rather than looking at the real outcome.
945
00:41:34,360 --> 00:41:35,930
Here is where the problem is.
946
00:41:35,970 --> 00:41:38,890
It's the moment where it passes over your control.
947
00:41:38,930 --> 00:41:40,150
Yeah.
948
00:41:40,190 --> 00:41:41,540
That's where the danger lies.
949
00:41:41,590 --> 00:41:43,760
You cross over and you're in trouble.
950
00:41:43,810 --> 00:41:46,940
You get an A.I., you have to have limitations.
951
00:41:46,980 --> 00:41:49,250
You got to have your hand on the plug the entire time.
952
00:41:49,290 --> 00:41:51,470
All the time. Totally.
72692
Can't find what you're looking for?
Get subtitles in any language from opensubtitles.com, and translate them here.