English subtitles for James.Camerons.Story.of.Science.Fiction.S01E05.Intelligent.Machines.1080p.AMZN.WEB-DL.DD+2.0.H.264-AJP69_track3_[eng]

Open the pod bay doors, HAL.
HAL: I'm sorry, Dave. I'm afraid I can't do that.
Spielberg: "2001" had a profound impact on my life.
It's all about HAL 9000.
"2001" was an extraordinary breakthrough for the genre.
Cameron: Science fiction has always been about great technology going wrong.
I'll be back.
That image, Schwarzenegger as the Terminator, it's a perfect nightmare.
You have two of the most popular A.I. characters in pop culture.
Lucas: At the time I said, "Don't be afraid of the robots. The robots are our friends."
Conception of what robots will be is directly, umbilically connected to our idea of them as an underclass.
Replicants are like any other machine. They're either a benefit or a hazard.
Cameron: "Blade Runner" was so artistic.
If you're creating an A.I., one of the things you're definitely going to leave out is emotion.
"Battlestar Galactica" is about humanity's greatest weakness.
You're just a bunch of machines after all.
The inability to see others as worthy as ourselves.
Narrator: Our machines have been the stuff of dreams and of nightmares. The question is, can man and machine forge a future... together?
Samantha: Hello, I'm here.
Oh... Hi.
♪♪
Cameron: They call it science fiction, but it's really about technology. It's about the machines. You've done a lot of science-fiction movies. You've seen all kinds of different machines, intelligent machines. You've played an intelligent machine.
I think that what's interesting is when you have been involved in the business as long as I have, what is so unbelievable is that as I've done "Terminator" movies one after the next and you see something starting out kind of what is called science fiction and then all of a sudden, it becomes kind of science reality.
Yeah. I think science fiction has always been about great technology going wrong. It's like how A.I. might be a threat to humanity.
I'm a friend of Sarah Connor.
Can't see her. She's making a statement.
I'll be back.
Lethem: That image, Schwarzenegger as the Terminator, Skynet sending this emissary, quasi-human into our midst, it's a perfect nightmare of the machine catastrophe. The old warning about the machines rising up. It's very archetypal and very brutal and very perfect.
When I first thought of the idea for the Terminator --
How did you come up with that idea?
It came from a dream. I had a dream image of a chrome skeleton walking out of a fire. And I thought, "What if he was a cyborg and he looked like a man and was indistinguishable from a man until the fire?" And what would be the purpose of that thing? He was representing a much more powerful intelligence, the soldier sent by Skynet from the future.
Right.
Singer: "Terminator" presents this vision of a future where Skynet, this computer that has become self-aware, decides, "Well, I'm going to protect myself and the only way to do that is to destroy the very people who created me," which is us.
Nicholson: Skynet is not the first all-powerful computer. This trope goes back to Robert Heinlein's "The Moon Is a Harsh Mistress" to "Colossus: The Forbin Project" and even up to "WarGames," the WOPR.
Singer: But part of what makes the Terminator so scary is that it is relentless and it will not stop until it achieves its objective.
♪♪
I remember the first day's dailies.
There was a 100-millimeter lens shot where you just kind of pull up and you're looking like this.
Right.
And we're all just going, "Yes! This is fantastic."
But here's the interesting thing about it, I went to talk to you about Reese.
Yeah.
This is the hero, and I wanted to continue on playing heroes.
Yeah.
And so we started talking a little bit about the movie, and for some reason or the other, not at all planned...
Yeah.
...on my part...
Yeah.
...I said, "Look, Jim. The guy that plays the Terminator, he really has to understand that he's a machine."
Exactly.
How important it is that whoever plays the Terminator has to show absolutely nothing. And the way he scans, and the way the Terminator walks has to be machine-like, and yet there has to be not one single frame where he has human behavior.
In the middle of this, I'm looking at you and thinking, "You know, the guy's kind of big, like a bulldozer, and nothing could stop him. It would be fantastic."
Exactly. Yeah.
And so, afterwards you said, "So, why don't you play the Terminator?"
Yeah.
And I looked at you, and I said, "Oh [bleep]"
[ Laughs ]
Nicholson: In the first film, the Terminator is designed to kill. In "Terminator 2," the Terminator was programmed to protect, not destroy.
Action.
And now we're going to make "Terminator 2." The hardest part of that movie, though, was convincing you that playing a good guy was a good idea.
It threw me off first when I read the script and then realized that I'm not anymore that kind of killing machine.
I thought that if we could distill him down to this idea of just relentlessness and take out the evil and put good in its place -- it's interesting that the same character worked as a bad guy and as a good guy, same character.
Absolutely.
And now we've got to have a bigger, badder Terminator that could kick the Terminator's ass. So what was that?
Patrick: I was convinced I was the baddest mother [bleep] walking on the planet, and you were gonna believe it. I got a call from my agent saying they were looking for an intense presence. I'm a hell of a lot smaller than Arnold Schwarzenegger, and I knew that you were just going to have to buy that this thing was unstoppable. And then I started thinking of pursuit and what does that look like. And then physically, I just started taking on the mannerisms of, you know, what does an eagle look like? He's fierce, and he looks like he's coming at you, and you start realizing... Boom! Right at you. It's like a Buick.
Get down.
[ Screams ]
You know, it's like, shoom!
The moment where we actually clinch for the first time, Arnold wanted to kind of pick me up over his head and slam me into the walls and throw me around a little bit.
So, it's like this is the first time you've had to deal with evil 'cause Terminators don't fight Terminators.
Right.
And I remember Jim specifically saying, "You can't do that. He's stronger than you are. He's more powerful. He's faster." He can just dominate the T-800, who is an endo-skeleton with, you know, fake skin over him. Whereas I'm just a mimetic polyalloy liquid metal. Much more dense...
A superior machine.
Singer: The T-1000 is the robot's concept of a robot. And it's like if a robot was trying to create a better version of itself, what would it do? And it's like, "Well, it would create something that's smooth and can move freely and still indestructible."
You can read "Terminator 2" almost as a war between old special effects and new special effects. That's the beautiful kind of irony about the "Terminator" movies. They used cutting-edge technology more effectively than any other movies, but they're about warnings about technology.
[ Arguing indistinctly ]
We're not going to make it, are we? People, I mean.
It's in your nature to destroy yourselves.
The plot of the "Terminator" films, I thought, we're always fighting against this robot from the future, but really what we're doing is we're fighting the humans who keep making this robot possible. As long as humans are aware that we have the potential to create a machine that can control the Earth and make us powerful, we're going to keep doing it, and we're fighting our own nature to create this Skynet, and humans won't stop doing it. We are really the persistent villain that keeps making these movies happen.
Cameron: I don't think we could have anticipated where we are now, 30-some years later, where Skynet is the term that everyone uses when they're talking about an artificial intelligence that turns against us. Part of it, I think, is there's a feeling you get before it rains, and you know it's gonna rain. And you get that feeling about certain moments in technological development where you know something is gonna happen very soon. And I think there's a general consensus now that we're in that moment before it rains. Now, maybe that moment takes 10 years, maybe it takes 20 years, but there's gonna be a moment, and it may not have a happy ending.
And there's no rehearsal.
That's right, there's no take 2.
No, this is it.
Yeah. [ Laughs ]
Amer: HAL, you have an enormous responsibility on this mission.
HAL: Let me put it this way, Mr. Amer. No 9000 computer has ever made a mistake or distorted information.
"2001" had a profound impact...
Yeah, me too.
...on my life and my daily life. It was the first time I went to a movie where I really felt like I was having a religious experience.
I watched the film 18 times in its first couple years of release, all in theaters. I remember at one, a guy ran down the aisle toward the screen screaming, "It's God. It's God." And he meant it in that moment.
And I had a guy in my theater who actually walked up to the screen with his arms out and he walked through the screen.
That must have blown people's minds.
People were blown out because the person disappeared into the screen during Star Gate, of all times.
Everybody thinks of it as a space drama. At its core, it's really about an artificial intelligence.
It's all about HAL.
HAL 9000.
It's HAL 9000.
Yeah.
Trumbull: I got my chance to work with Stanley Kubrick and Arthur Clarke on "2001: A Space Odyssey" at a very young age. I was 23 years old. When we created HAL, we didn't have any computers. There were no personal computers available to us. There were giant mainframe computers, but it was with punch cards and chads and all kinds of stuff. It was not very visual. And I had to kind of develop a style that I thought was credible. He sparked people's imagination with this film, and then they made it happen.
Hello, Frank!
Happy birthday, darling. Happy birthday.
Nicholson: Individual TVs in the back of your airplane seat, the iPad. You know, the iPod is called the iPod because of...
Open the pod bay doors, HAL.
"2001" was an extraordinary breakthrough for the genre.
The picture is being done in such a gigantic scope. The centrifuge is so realistic and so unusual. After a while, you begin to forget that you're an actor, you begin to really feel like an astronaut.
Working with Stanley Kubrick blew my mind. You just were aware that you were in the presence of genius.
HAL: I don't think I have ever seen anything quite like this before.
Dullea: HAL in a sense is the machine that controls the whole ship, but he's another crewmember from our point of view. We don't think in terms of, "Oh, I'm dealing with a computer here."
That's a very nice rendering, Dave.
Maybe because of that human voice. I mean, HAL has a perfectly normal inflection when he speaks to us.
I've wondered whether you might be having some second thoughts about the mission?
How do you mean?
Newitz: What does it mean to have a robot who's basically running the ship that supports your life? That's a lot of trust to place in a machine.
Gerrold: The key point in the film occurs when Bowman says...
Well, as far as I know, no 9000 computer's ever been disconnected.
Well, no 9000 computer has ever fouled up before.
Well, I'm not so sure what he'd think about it.
And HAL 9000 is reading their lips. At that point, we recognize HAL 9000 has some imperative that it must survive.
I know that you and Frank were planning to disconnect me, and I'm afraid that's something I cannot allow to happen.
And at that point, it's no longer a machine. It is a being.
Joy: The danger artificial intelligence poses is the power to unleash results that we hadn't anticipated.
Gerrold: HAL 9000 does what we see the apes in the beginning of the movie do, he commits murder.
Newitz: We like to stereotype robots as entities of pure logic, but of course in "2001," it all goes horribly wrong and we have to kill the robot.
Just what do you think you're doing, Dave?
Lethem: HAL's death scene is such a wonderfully perverse moment because it is unbearably poignant watching him disintegrate and regress.
IBM 704: ♪ Daisy ♪ Daisy ♪
Bell Laboratories was experimenting with voice synthesis around the time of "2001."
One of the very earliest voice synthesis experiments was "Daisy, Daisy" performed by an IBM computer. And because Arthur Clarke is kind of a super geek, he wanted to actually use that, and he encouraged Kubrick to use that very thing because it lent a kind of historical credibility to the whole thing that HAL, in the process of being killed or lobotomized or dying, would regress to his birth.
HAL: ♪ I'm half crazy, all for the love of you ♪
Joy: You know, it's really hard to make a technology. It's really hard to design A.I. So much thinking, so many brilliant minds have to go into it. But even harder than creating artificial intelligence is learning how to contain it, learning how to shut it off.
I mean, HAL will exist probably in our lifetimes, I would think.
Spielberg: Oh, I think so, too. It's scary.
Elon Musk continues to predict that World War III will not be a nuclear holocaust, it will be a kind of mechanized takeover.
Yeah, and Stephen Hawking's been saying similar things.
That's pretty spooky because that pretty much says that against our will, something smarter than us, who can beat us at chess, will use this world as a chessboard and will checkmate us completely out of existence.
[ All screaming ]
Yaszek: Unfortunately, most depictions of robots in science fiction have been really negative, very much depictions of rampaging robots engaged in a desperate struggle with humans to decide who shall own the fate of the Earth and the universe, and that's part of a very long tradition in science fiction.
Kalan: Fritz Lang's "Metropolis" was one of the first if not the first big science-fiction epic film. It's the story of this very futuristic world. There is one of the great bad robots of all movies -- Maria. That is the movie robot.
Pulp magazines always had a full color cover. Very often the cover would be robots that had just run amok from human creators. They were always mechanical. They were big, hulking things.
Lots of steel and machinery, glowing-red eyes. Claws, not fingers, and they were generally quite violent. So, that image persisted a long time. But then along came Isaac Asimov.
If we could have roughly man-like robots who could take over the dull and routine tasks, this would be a very nice combination.
Yaszek: Asimov was very central to helping make science fiction what it is today. He was at the 1939 World's Fair in New York City. It must've felt like a very science-fictional experience to him, and not in the least part because he would've seen Elektro, the smoking robot.
Okay, toots.
And this really inspired Asimov. And so he decided to start writing stories where he would explore robots as tools and helpers and friends of humanity rather than enemies.
He invented these images and these ideas that I think defined how people in the field thought about robots, specifically those three laws of his. Of course, they're really important.
What are the three laws of robotics?
First law is a robot may not harm a human being, or through inaction allow a human being to come to harm.
Danger, Will Robinson, danger.
Number 2, a robot must obey orders given it by qualified personnel.
Fire.
Unless those orders violate rule number 1. In other words, a robot can't be ordered to kill a human being.
See, he's helpless.
The third law states that a robot can defend itself, except where that would violate the first and second laws.
I think Asimov's laws are very smart, very, very smart. I think they are also made to be broken.
Announcer: We know you'll enjoy your stay in Westworld, the ultimate resort. Lawless violence on the American frontier, peopled by lifelike robot men and women.
The movie "Westworld" looks at a theme park with guests coming in and doing whatever they please to the robots.
It was really a forum for human id to run amok, where there's no threat of anybody knowing the things that you've done, where you don't have to engage with other humans and you're told "do whatever you want." Where nothing... [ Gunshot ] ...nothing can possibly go wrong.
I'm shot.
...go wrong.
Draw.
Shut down. Shut down immediately.
Goldsman: "Westworld" was a cautionary tale about robotics. It was the idea that we believed that we could create artificial life and that it would obey us.
And stop here, and he'll be crossing there. He'll be crossing there.
Nolan: The original film by Michael Crichton is very cool and is packed with ideas about fraught interactions with artificial intelligence. Decades ahead of its time. Questions that he posed in the original film only became more and more relevant as we reimagined it as a TV series.
Joy: When you're looking at the story of a robot, oftentimes you see a robot that's docile, and then something goes click and they kind of snap.
Maximilian!
What John and I talked about was, "Well, take that moment, that snap before they go on the killing rampage, and what if we really attenuate it and explore it and dive deep into that schism?" Because for us, that was where the really meaty philosophical question rested, and that question was -- where did life begin?
Newitz: Maeve, who's one of the robots, she's a madam who runs a brothel. She's one of the first robots to start realizing that she's a robot instead of just a person who is living in the Wild West. To me, one of the most significant scenes in the show is when Maeve starts coming into consciousness while she's being repaired.
Everything in your head, they put it there.
No one knows what I'm thinking.
I'll show you.
And she sees it's an algorithm, and it's choosing words based on probability.
This can't possibly --
Capaldi: The robots in Westworld begin to ask questions, which are the same questions we ask.
[ Stuttering ]
We have a sense that there is a creator, that there is a purpose, there's a reason that we are here. Unfortunately, they discover that the reason that they are there is simply to be an entertainment.
I'd like to make some changes.
Marvin Minsky, who was one of the pioneers of A.I., said that free will might be that first primitive reaction to forced compliance. So, the first word of consciousness is no.
I'm not going back.
Science fiction has always been dealing with A.I., whether it's Asimov's laws or the laws that we tried to put in place in "Westworld." The question is can laws ever even fully contain a human. People will stretch those laws, find exceptions to them.
I understand now.
Not sure that an A.I. would be any different. When consciousness awakens, it's impossible to put the genie back in the bottle.
[ Gun cocks ]
Let's talk about A.I. for a second. You only see robots in a positive role...
Right.
...in your films, which is interesting because that's where so much of the progress is being made now with companions for the elderly, robotic nurses...
They're gonna make life better for us.
Because you have two of the most popular A.I. characters in pop culture, which are R2-D2 and C-3PO. They're A.I.s.
Lucas: At the time, I said, "Don't be afraid of the robots." You know, the robots are our friends. Let's see the good side of the robots, and the funny side because, let's face it, for a while, they're gonna be a little goofy.
I've just about had enough of you, you near-sighted scrap pile.
George Lucas was very innovative throughout his whole career. And one of the things early on that was very smart was that he pioneered a different type of robot. R2-D2 looks like a trash can. He doesn't even speak, right?
He just makes chirping sounds. But he's lovable.
Everybody loves -- He's not cuddly. He's not -- that -- that is -- that's a great character.
Moore: C-3PO is probably the most charming and beloved of the robot characters ever made. And I love the fact that George didn't articulate the mouth or the eyes, so it's a blank mask, and yet we get so much heart from Anthony Daniels' performance.
I mean, I love robots, and the idea of being able to design one for a "Star Wars" film was just too good to pass up.
Did you know that wasn't me?
K-2SO from "Rogue One," I thought, was just perfect.
Edwards: To be fair, the biggest influence on K-2SO was C-3PO. Anthony Daniels as C-3PO has a cameo in our film, and I remember going around Anthony Daniels' house to try and talk him into it, and I didn't know if he would hate the idea or if he was fed up with "Star Wars." And I sat there and I was so paranoid meeting him and his wife that I just pitched the whole movie to them, and I must've chatted for like an hour, just kept going and going and got to the end, and I couldn't tell from his face. And he was like, "Gareth, you know, I'd love to be involved." Like "You had me at hello" type thing. It was just about having like this god on set. You know, like this original -- this is where it all began -- "Star Wars" character. It was like goosebump-y stuff.
Friends forever?
Friends.
Kalan: I think one of the reasons that people love robots and gravitate to the robot characters in movies like "Star Wars" is because whereas the human characters feel very fully formed, they are people, the robots are things that it feels okay to project more of ourselves onto.
Huey, Dewey and Louie from "Silent Running" are possibly the cutest robots. They don't talk, but you still kind of always know what they're thinking.
Hopkinson: It's great to have a best friend. In fantasy, it might be a dragon.
In science fiction, it might be the robot.
Nicholson: I love Johnny 5. I mean, this is a robot who quotes John Wayne out of his own free will.
[ As John Wayne ] Take heart, little lady.
Brooks: Buck Rogers was great because they didn't exactly rip off R2-D2, but they got halfway there. So, they got the voice of Yosemite Sam. They got Mel Blanc, the greatest cartoon voice in the world, Captain Caveman, and they invented Twiki, who would go, "Bidibidibidi."
You ever have two broken arms, buster?
What?
We love friendly robots because they bring out the best of what we are as humans.
Newitz: Wall-E, who's a garbage-collecting robot, isn't at all like a garbage robot should be. He really develops a whole personality. He's there to clean up the mess that humans have made, and he goes from interpreting that literally to actually saving the world for humanity.
Wolfe: Many, many science-fiction stories turn the robot into some kind of a romantic figure that somehow becomes more human as the story goes on. There was Lester del Rey's 1938 story "Helen O'Loy." Bad pun in the title, by the way. The name is Helen Alloy; she's made out of metal. Essentially a housekeeping robot. Falls in love with her maker. It was one of the first stories in which a robot is a sympathetic, romantic character.
Moore: If you're actually in conversations with a robot, where it sounds natural and it sounds like a person, and that person knows you, laughs at your jokes, and has empathy for your struggles in life, and you develop a relationship with that -- with that voice, you could absolutely fall in love with it.
Samantha: Hello. I'm here.
Oh... Hi.
Hi. It's really nice to meet you.
What do I call you? Do you have a name?
Um... yes. Samantha.
In the movie "Her," Samantha's design is that she's been created to be a tool. What's interesting about this idea of a pocket tool is that we see this in our own lives.
Our smartphones have become these tools to us that we're dependent on. So, Theodore's relationship with Samantha is just one step beyond that. He can't live without her because he also loves her. When Theodore sees her pop up on his screen, it's like seeing his girlfriend.
Good night.
'Night.
McFetridge: What I had to do was create the interface. So you have like handwriting -- it's my handwriting, and I wrote out Samantha -- and then this paper texture, but then there's a magic to it. It floats, it kind of moves holographically, and there's shadowing, but none of it is technological. An interface where it's possible to fall in love with your O.S.
Are these feelings even real? Or are they just programming?
[ Laughs ]
What a sad trick.
You feel real to me, Samantha.
Part of what you see in "Her" definitely is a cautionary tale about being too reliant on your gadgets and your technology and being too emotionally invested in them. It's a reminder that there are people out there, you know -- that final image of him with Amy Adams is so emotional, and it's only through this experience that they both went on involving this technology that they found each other.
Lucas: You know, we're going to live in a world with robots and artificial intelligence. You might as well get used to it, you shouldn't be afraid of it, and we should be very careful not to have it be bad. But if it goes bad, it's us.
Yeah.
It's not them.
Cameron: People always ask me, "So, do you think the machines will ever beat us?" I say, "I think it's a race."
Absolutely, a race.
It's a race between us improving and making ourselves better, our own evolution, spiritual, psychological evolution. At the same time, we've got these machines evolving.
Because if we don't improve enough to direct them properly, our godlike power of using artificial intelligence and all these other robotic tools and so on will ultimately just blow back in our face and take us out.
Yeah, you're right. I mean, I think that it takes a lot of effort to create changes in human behavior. But that's our responsibility.
Yeah. I actually think we're evolving. We're co-evolving with our machines. We're changing.
Yes, exactly.
Atlantia death squadron, attack.
Moore: In January of 2002, Universal was looking for somebody to reinvent "Battlestar Galactica." So, I tracked down the pilot of the original "Galactica" that they did in 1978. There were some interesting ideas within it.
The final annihilation of the lifeform known as man. Let the attack begin.
But they never quite were able to figure out what the show really was. But at the same time, I was very struck by the parallels to 9/11. This is just a couple of months after the 9/11 attack. And I realized immediately that if you did this series at that moment in time, it was going to have a very different emotional resonance for the audience.
Espenson: "Battlestar Galactica" is about the last remaining scraps of humanity out there in a fleet in deep space after an attack from robots has decimated humanity.
Moore: So the idea was that the human beings essentially started creating robots for all the dirty jobs they didn't want to do anymore. And then the machines themselves, because they revere their creators, make machines that are even more like us. Cylons that are flesh and blood just like humans.
Moore: The Cylons saw themselves as the children of humanity, and that they wouldn't be able to really grow and mature until their parents were gone, so they decide they need to wipe out their human creators in this apocalyptic attack.
Espenson: I think on the surface, you could say "Battlestar Galactica" is about "be careful of what you invent." But I think the real driving force of the show is not about that. I think it's about humanity's greatest weakness, the inability to see others as worthy as ourselves.
Moore: That's the central conflict of these two -- we are people; no, you're not.
Starbuck: You are truly no greater than we are. You're just a bunch of machines after all.
Let the games begin.
"Flesh and Bone" is the torture episode. It's very much of a two-person play. It raises the question -- would she be less morally culpable because he's not really human?
You're not human.
Was a person being tortured in this scene and crying out and experiencing pain, or was this all an elaborate simulation? We wanted to deal with the issue of what's moral and just in a society at war like this, but at the same time, we were also examining a different idea in the show, which was about consciousness and personhood.
Newitz: Who's the real monster? Is it the humans who built creatures that they knew were human equivalent, but enslaved them anyway? Or is it the slaves who rose up to destroy the type of people who would do that?
Espenson: The big central idea of "Battlestar Galactica" is -- does humanity deserve to survive? Can we earn our survival?
You know, when we fought the Cylons, we did it to save ourselves from extinction. But we never answered the question why. Why are we as a people worth saving?
That's -- That's an amazing question.
The Cylons through the series evolved from a place of sort of blind hatred for humanity to then having more contact with individual human beings, having experiences with them, experiencing emotions with them, and then the humans realize that the Cylons are not as monolithic as they believed at the onset.
Well, when you think you love somebody, you love them. That's what love is.
Thoughts.
She was a Cylon.
A machine.
She was more than that to us. She was more than that to me. She was a vital living person.
Goldberg: "Battlestar Galactica" gives you an idea of what could be. How do we all do this together? If "Battlestar Galactica" is any guide, we can evolve together with the machines that we create. We can become one people, respectful of each other. Make a future together.
Yeah, I think... I hope mankind is worthy of survival.
I've talked to some A.I. experts.
Yeah.
And the one expert said just right out, "We're trying to make a person." And I said, "So when you say a person, you mean a personhood? They have -- they have an ego, they have a sense of identity." He said, "Yes, all those things."
If you're a very smart group of human beings who are creating an A.I., one of the things you're definitely gonna leave out is to put in emotion.
Right.
'Cause if you have emotion, emotion will lead to many facets, one of them being deceit, anger, fury, hatred.
Sure.
As well as love.
If a machine becomes like us enough and complex enough, at one point can we no longer tell the difference?
The difference.
Does it have freedom? Does it have free will?
This hearing is to determine the legal status of the android known as Data.
The character of Data was sort of everyone's favorite character on the show, and with the writing staff as well. Everyone loved to write Data stories. Here's a robot who wants to be human but who has no emotions but wants emotions. So, it's really Pinocchio, and the Pinocchio metaphor is powerful.
Commander, what are you?
Webster's 24th-century dictionary, 5th edition, defines an android as an automaton made to resemble a human being.
"The Measure of a Man" is one of those sort of very deep episodes that you don't realize is deep until like four or five years later. And you see it and you go, "Oh, wow."
803
00:35:26,428 --> 00:35:29,997
In the episode, Data's humanity is essentially put on trial.

804
00:35:30,040 --> 00:35:32,217
Is he sentient?

805
00:35:32,260 --> 00:35:35,437
Is he worthy of being treated as a person?

806
00:35:35,481 --> 00:35:37,178
Data: Am I a person or property?

807
00:35:37,222 --> 00:35:38,832
What's at stake?

808
00:35:38,875 --> 00:35:40,616
My right to choose.

809
00:35:40,660 --> 00:35:44,185
It was a legitimate exploration of this idea of personhood

810
00:35:44,229 --> 00:35:47,971
in a legal sense and in a moral sense.

811
00:35:48,015 --> 00:35:50,800
Its responses dictated by an elaborate software

812
00:35:50,844 --> 00:35:52,237
written by a man.

813
00:35:52,280 --> 00:35:55,065
And now a man will shut it off.

814
00:35:55,109 --> 00:35:56,937
It was shocking to the characters on the show

815
00:35:56,980 --> 00:35:59,983
and shocking to the audience as well, because we love Data.

816
00:36:00,027 --> 00:36:02,856
Starfleet was founded to seek out new life.

817
00:36:02,899 --> 00:36:04,988
Well, there it sits!

818
00:36:05,032 --> 00:36:08,296
Moore: Once we create some form of artificial intelligence,

819
00:36:08,340 --> 00:36:10,211
these legal arguments are gonna happen.

820
00:36:10,255 --> 00:36:12,996
Do machines deserve rights?

821
00:36:13,040 --> 00:36:14,781
You know, probably.

822
00:36:14,824 --> 00:36:16,435
Guinan: In the history of many worlds,

823
00:36:16,478 --> 00:36:19,786
there have always been disposable creatures.

824
00:36:19,829 --> 00:36:21,788
They do the dirty work.

825
00:36:21,831 --> 00:36:23,311
An army of Datas,

826
00:36:23,355 --> 00:36:27,750
whole generations of disposable people.

827
00:36:29,796 --> 00:36:31,537
You're talking about slavery.

828
00:36:33,669 --> 00:36:37,238
Nolan: The term "robot" itself comes from the Czech play

829
00:36:37,282 --> 00:36:39,109
"Rossum's Universal Robots,"

830
00:36:39,153 --> 00:36:42,243
and the word "robota" means laborer.

831
00:36:42,287 --> 00:36:44,463
A pejorative version of it means slave.

832
00:36:44,506 --> 00:36:46,987
So, our conception of what robots will be

833
00:36:47,030 --> 00:36:50,251
is directly, umbilically connected

834
00:36:50,295 --> 00:36:53,254
to our idea of them as an underclass.

835
00:36:53,298 --> 00:36:54,951
Why do you think your people made me?

836
00:36:54,995 --> 00:36:56,518
We made you 'cause we could.

837
00:36:56,562 --> 00:36:59,347
You are just a machine. An imitation of life.

838
00:36:59,391 --> 00:37:01,480
Replicants are like any other machine.

839
00:37:01,523 --> 00:37:02,872
They're either a benefit or a hazard.

840
00:37:02,916 --> 00:37:06,224
If they're a benefit, it's not my problem.

841
00:37:06,267 --> 00:37:09,575
"Blade Runner" is a slave narrative, basically.

842
00:37:09,618 --> 00:37:12,752
They've created these replicants to be our slaves.

843
00:37:12,795 --> 00:37:14,928
And I think the part that's really troubling about

844
00:37:14,971 --> 00:37:16,408
"Blade Runner" is that

845
00:37:16,451 --> 00:37:18,888
not only is it sort of this technologically

846
00:37:18,932 --> 00:37:20,716
and environmentally ruined future,

847
00:37:20,760 --> 00:37:25,286
it's sort of a morally and ethically ruined future as well.

848
00:37:25,330 --> 00:37:27,157
Fancher: I wrote those first couple scripts

849
00:37:27,201 --> 00:37:29,464
thinking of a very small movie.

850
00:37:29,508 --> 00:37:32,293
And then Ridley said to me,

851
00:37:32,337 --> 00:37:35,035
"What's out the window?"
852
00:37:35,078 --> 00:37:36,341
And I said, "What's out the window?

853
00:37:36,384 --> 00:37:37,777
Well, the world, you know."

854
00:37:37,820 --> 00:37:40,301
He said, "Exactly. What world is that?

855
00:37:40,345 --> 00:37:43,304
Where, you know, you make a robot indistinguishable

856
00:37:43,348 --> 00:37:44,610
from a human.

857
00:37:44,653 --> 00:37:46,612
Think about this for a second.

858
00:37:46,655 --> 00:37:48,483
Imagine..." and he does that to you.

859
00:37:48,527 --> 00:37:51,225
You go, "Boom." I said, "Oh, God."

860
00:37:51,269 --> 00:37:53,401
He delivered a world. That's Ridley.

861
00:37:53,445 --> 00:37:55,316
He can -- he makes things.

862
00:37:55,360 --> 00:37:57,187
Ridley brought everything to it.

863
00:37:57,231 --> 00:37:59,407
Cameron: "Blade Runner" comes from a Philip K. Dick novel...

864
00:37:59,451 --> 00:38:00,800
Scott: Yeah.

865
00:38:00,843 --> 00:38:02,932
..."Do Androids Dream of Electric Sheep?"

866
00:38:02,976 --> 00:38:06,458
Philip K. Dick was very prolific and very profound,

867
00:38:06,501 --> 00:38:08,677
talking about the nature of reality

868
00:38:08,721 --> 00:38:10,984
and the nature of artificial intelligence

869
00:38:11,027 --> 00:38:13,029
and what it is to be human.

870
00:38:13,073 --> 00:38:14,901
That was the nut of the idea

871
00:38:14,944 --> 00:38:17,686
that grew with Hampton into what it was.

872
00:38:17,730 --> 00:38:20,602
Here was this beautiful, beautiful film.

873
00:38:20,646 --> 00:38:23,997
Dark, noir-ish, and I thought, "Wow,

874
00:38:24,040 --> 00:38:26,739
a film can be so artistic."

875
00:38:26,782 --> 00:38:29,611
And the idea of these -- these machines challenging us,

876
00:38:29,655 --> 00:38:32,135
and their lack of affect, their lack of emotion.

877
00:38:32,179 --> 00:38:34,877
The film is constantly saying there is no emotion.

878
00:38:34,921 --> 00:38:37,880
A computer just makes decisions.

879
00:38:37,924 --> 00:38:39,360
Negative or positive.

880
00:38:39,404 --> 00:38:40,709
It doesn't really care.

881
00:38:40,753 --> 00:38:42,363
Yeah, with the Voight-Kampff test.

882
00:38:42,407 --> 00:38:44,409
Correct.

883
00:38:44,452 --> 00:38:46,280
Tyrell: Is this to be an empathy test?

884
00:38:46,324 --> 00:38:49,239
Capillary dilation of the so-called blush response?

885
00:38:49,283 --> 00:38:51,503
We call it Voight-Kampff for short.

886
00:38:51,546 --> 00:38:53,940
The Voight-Kampff is a series of questions

887
00:38:53,983 --> 00:38:56,725
that allowed the questioner to find out

888
00:38:56,769 --> 00:39:00,033
whether or not whoever was being questioned had feelings.

889
00:39:00,076 --> 00:39:01,643
Deckard: It's your birthday.

890
00:39:01,687 --> 00:39:03,428
Someone gives you a calf-skin wallet.

891
00:39:03,471 --> 00:39:05,299
Rachael: I wouldn't accept it.

892
00:39:05,343 --> 00:39:06,561
It's about empathy.

893
00:39:06,605 --> 00:39:08,128
You're reading a magazine.

894
00:39:08,171 --> 00:39:10,565
You come across a full-page nude photo of a girl.

895
00:39:10,609 --> 00:39:12,654
Is this testing whether I'm a replicant

896
00:39:12,698 --> 00:39:14,177
or a lesbian, Mr. Deckard?

897
00:39:14,221 --> 00:39:15,744
Just answer the questions, please.

898
00:39:15,788 --> 00:39:18,225
Not to get gooey about empathy, but it does seem

899
00:39:18,268 --> 00:39:24,666
that empathy is the big divide between us and everything else.

900
00:39:24,710 --> 00:39:28,235
Yaszek: Deckard is very much a man in his job.
901
00:39:28,278 --> 00:39:31,281
He firmly believes that as long as robots are working properly,

902
00:39:31,325 --> 00:39:32,587
it's not his problem.

903
00:39:32,631 --> 00:39:35,111
But that if a replicant misbehaves,

904
00:39:35,155 --> 00:39:36,852
it is indeed his problem

905
00:39:36,896 --> 00:39:40,421
and his duty to retire it.

906
00:39:40,465 --> 00:39:42,205
Move! Get out of the way!

907
00:39:44,817 --> 00:39:48,037
However, over the course of the film, he increasingly questions

908
00:39:48,081 --> 00:39:51,780
whether or not disobeying orders means that you're defective

909
00:39:51,824 --> 00:39:53,434
or that you are a human

910
00:39:53,478 --> 00:39:55,175
with rights and wills and dreams of your own.

911
00:39:55,218 --> 00:40:01,268
I've seen things you people wouldn't believe.

912
00:40:01,311 --> 00:40:03,749
There's such poetry in the scene

913
00:40:03,792 --> 00:40:06,186
where Roy Batty's dying. -Yes.

914
00:40:06,229 --> 00:40:07,840
It's just a magnificent scene.

915
00:40:07,883 --> 00:40:09,537
Scott: He wrote that.

916
00:40:09,581 --> 00:40:11,017
Really? Rutger wrote that?

917
00:40:11,060 --> 00:40:12,584
It's 1:00 in the morning.

918
00:40:12,627 --> 00:40:14,760
I'm gonna have the plug pulled... Yeah.

919
00:40:14,803 --> 00:40:16,675
...literally on everything at dawn.

920
00:40:16,718 --> 00:40:17,980
-Yeah. -And that's it.

921
00:40:18,024 --> 00:40:19,939
That's gonna be the last night,

922
00:40:19,982 --> 00:40:22,550
and Rutger said, "I have written something."

923
00:40:22,594 --> 00:40:24,030
And he said...

924
00:40:24,073 --> 00:40:29,252
All those moments will be lost

925
00:40:29,296 --> 00:40:34,823
in time like tears in rain.

926
00:40:36,564 --> 00:40:38,479
-And I'm nearly in tears. -Yeah.

927
00:40:38,523 --> 00:40:39,567
He said, "What do you think?"

928
00:40:39,611 --> 00:40:41,351
I said, "Let's do it."

929
00:40:41,395 --> 00:40:43,441
So, we literally went -- -It's gorgeous.

930
00:40:43,484 --> 00:40:45,225
-We shot it within an hour. -Yeah.

931
00:40:45,268 --> 00:40:46,922
And at the end, he looked at him

932
00:40:46,966 --> 00:40:48,881
and gave that most beautiful smile.

933
00:40:51,057 --> 00:40:54,234
Time to die.

934
00:40:54,277 --> 00:40:55,583
And he had a dove in his hand and he let --

935
00:40:55,627 --> 00:40:57,237
He let it go. Yeah.

936
00:40:57,280 --> 00:40:59,761
Is it saying Roy Batty had a soul?

937
00:40:59,805 --> 00:41:02,503
Roy Batty was a fully sentient being.

938
00:41:02,547 --> 00:41:03,765
Yes.

939
00:41:06,289 --> 00:41:08,553
Four of your films now have had

940
00:41:08,596 --> 00:41:12,078
an intelligent, embodied A.I.

941
00:41:12,121 --> 00:41:13,427
Right? An artificial intelligence.

942
00:41:13,471 --> 00:41:14,733
Synthetic person.

943
00:41:14,776 --> 00:41:16,865
So where do you think we come out in this?

944
00:41:16,909 --> 00:41:19,694
Is this our -- are we handing the keys

945
00:41:19,738 --> 00:41:21,653
to the kingdom off to the machines?

946
00:41:21,696 --> 00:41:24,264
I don't think we should. With the creation of something

947
00:41:24,307 --> 00:41:27,397
as potentially wonderful and dangerous as A.I.,

948
00:41:27,441 --> 00:41:30,966
the inventor frequently is obsessed by the success

949
00:41:31,010 --> 00:41:32,402
of what he's doing

950
00:41:32,446 --> 00:41:34,317
rather than looking at the real outcome.

951
00:41:34,361 --> 00:41:35,928
Here is where the problem is.

952
00:41:35,971 --> 00:41:38,887
It's the moment where it passes beyond your control.
953
00:41:38,931 --> 00:41:40,149
Yeah.

954
00:41:40,193 --> 00:41:41,542
That's where the danger lies.

955
00:41:41,586 --> 00:41:43,762
You cross over and you're in trouble.

956
00:41:43,805 --> 00:41:46,939
You get an A.I., you have to have limitations.

957
00:41:46,982 --> 00:41:49,245
You got to have your hand on the plug the entire time.

958
00:41:49,289 --> 00:41:51,465
All the time. Totally.
