1
00:00:07,417 --> 00:00:10,417
Today's Sandy Speaks is going
to focus on my white people.
2
00:00:10,500 --> 00:00:11,667
What I need you to understand
3
00:00:11,750 --> 00:00:14,417
is that being black in America
is very, very hard.
4
00:00:15,250 --> 00:00:16,875
Sandy had been arrested.
5
00:00:16,959 --> 00:00:18,250
COP: I will light you up!
Get out!
6
00:00:18,333 --> 00:00:19,917
SHANTE NEEDHAM:
How do you go from failure
7
00:00:20,083 --> 00:00:21,542
to signal a lane change
8
00:00:21,625 --> 00:00:24,625
to dead in jail
by alleged suicide?
9
00:00:25,041 --> 00:00:26,250
GENEVA REED-VEAL:
I believe she let them know,
10
00:00:26,583 --> 00:00:28,375
"I'll see you guys in court,"
and I believe they silenced her.
11
00:00:28,458 --> 00:00:29,417
PROTESTERS: Sandra Bland!
12
00:00:29,500 --> 00:00:31,166
WOMAN: Say her name!
Say her name!
13
00:00:31,250 --> 00:00:33,083
ALL: Say her name!
Say her name!
14
00:00:52,250 --> 00:00:54,500
♪ ♪
15
00:00:54,583 --> 00:00:56,667
(car humming)
16
00:00:59,875 --> 00:01:01,542
(machine whirring)
17
00:01:01,625 --> 00:01:04,959
(hydraulic hissing)
18
00:01:07,917 --> 00:01:10,834
(hydraulic whirring)
19
00:01:10,917 --> 00:01:14,000
(alarm blaring)
20
00:01:17,208 --> 00:01:20,291
(siren wailing)
21
00:01:21,792 --> 00:01:23,750
(beeping)
22
00:01:27,125 --> 00:01:30,792
(in robotic voice):
This is the story
of automation,
23
00:01:30,875 --> 00:01:35,041
and of the people
lost in the process.
24
00:01:36,291 --> 00:01:40,417
Our story begins in
a small town in Germany.
25
00:01:40,500 --> 00:01:44,000
♪ ♪
26
00:01:44,083 --> 00:01:46,834
(distant siren wailing)
27
00:01:50,792 --> 00:01:52,542
(whirring)
28
00:01:57,625 --> 00:01:59,917
♪ ♪
29
00:02:00,000 --> 00:02:02,125
(men speaking German)
30
00:02:14,041 --> 00:02:15,834
(crackling)
31
00:02:20,125 --> 00:02:22,250
(man speaking German)
32
00:02:30,709 --> 00:02:32,083
(crackles)
33
00:03:03,083 --> 00:03:06,709
♪ ♪
34
00:03:11,125 --> 00:03:12,583
(beeping)
35
00:03:17,875 --> 00:03:20,250
Sven Kühling:
It was quite a normal
day at my office,
36
00:03:20,333 --> 00:03:23,583
and I heard from an informant
37
00:03:23,667 --> 00:03:26,834
that an accident
had happened,
38
00:03:26,917 --> 00:03:28,834
and I had to call
39
00:03:28,917 --> 00:03:32,250
the spokesman of
the Volkswagen factory.
40
00:03:32,333 --> 00:03:34,041
(ticking)
41
00:03:34,125 --> 00:03:35,625
♪ ♪
42
00:03:39,834 --> 00:03:41,458
(beeping)
43
00:03:43,125 --> 00:03:46,542
We have the old sentence,
we journalists,
44
00:03:46,625 --> 00:03:49,750
"Dog bites a man
or a man bites a dog."
45
00:03:49,834 --> 00:03:53,959
What's the news?
So, here is the same.
46
00:03:54,041 --> 00:03:58,250
A man bites a dog,
a robot killed a man.
47
00:03:58,333 --> 00:04:01,125
(ticking continues)
48
00:04:05,792 --> 00:04:07,500
(beeping)
49
00:04:10,792 --> 00:04:13,875
The spokesman said that
the accident happened,
50
00:04:13,959 --> 00:04:16,917
but then he paused for a moment.
51
00:04:17,000 --> 00:04:18,917
So, I...
52
00:04:19,000 --> 00:04:22,917
think he didn't want
to say much more.
53
00:04:23,000 --> 00:04:25,125
♪ ♪
54
00:04:30,041 --> 00:04:32,750
(rattling)
55
00:04:32,834 --> 00:04:33,750
(beeps)
56
00:04:33,834 --> 00:04:35,542
(man speaking German)
57
00:04:54,875 --> 00:04:56,500
♪ ♪
58
00:04:56,583 --> 00:05:00,375
Kühling:
The young worker
installed a robot cage,
59
00:05:01,041 --> 00:05:03,959
and he told his colleague
60
00:05:04,041 --> 00:05:06,458
to start the robot.
61
00:05:06,542 --> 00:05:08,208
♪ ♪
62
00:05:10,542 --> 00:05:13,375
(whirring, clanking)
63
00:05:15,041 --> 00:05:17,959
The robot took the man
64
00:05:18,041 --> 00:05:22,375
and pressed him
against a metal wall,
65
00:05:22,458 --> 00:05:25,875
so his chest was crushed.
66
00:05:33,333 --> 00:05:35,709
(whirring, beeping)
67
00:05:35,792 --> 00:05:39,000
(hissing)
68
00:05:39,083 --> 00:05:42,208
♪ ♪
69
00:05:45,417 --> 00:05:47,375
(beeping)
70
00:05:59,125 --> 00:06:00,625
(speaking German)
71
00:06:41,667 --> 00:06:43,208
♪ ♪
72
00:06:43,291 --> 00:06:46,875
Kodomoroid:
The dead man's identity
was never made public.
73
00:06:47,417 --> 00:06:50,917
The investigation
remained open for years.
74
00:06:51,917 --> 00:06:53,792
Production continued.
75
00:06:54,834 --> 00:06:58,333
(metronome ticking)
76
00:06:58,417 --> 00:07:00,000
(speaking German)
77
00:07:05,875 --> 00:07:10,291
Kodomoroid:
Automation of labor made
humans more robotic.
78
00:07:24,291 --> 00:07:26,834
(ticking continuing)
79
00:07:32,291 --> 00:07:34,333
(man 1 speaking German)
80
00:07:46,583 --> 00:07:48,542
(man 2 speaking German)
81
00:07:57,041 --> 00:07:59,041
(man 1 speaking)
82
00:08:20,709 --> 00:08:23,792
♪ ♪
83
00:08:25,417 --> 00:08:26,583
(beeping)
84
00:08:31,417 --> 00:08:33,500
Kodomoroid:
A robot is a machine
85
00:08:33,583 --> 00:08:36,208
that operates automatically,
86
00:08:36,291 --> 00:08:38,542
with human-like skill.
87
00:08:39,417 --> 00:08:41,875
The term derives
from the Czech words
88
00:08:41,959 --> 00:08:44,375
for worker and slave.
89
00:08:49,041 --> 00:08:50,875
(whirring)
90
00:09:11,041 --> 00:09:13,625
(drill whirring)
91
00:09:19,417 --> 00:09:21,500
(drill whirring)
92
00:09:23,917 --> 00:09:25,959
♪ ♪
93
00:09:33,792 --> 00:09:35,750
Hey, Annie.
94
00:09:35,834 --> 00:09:37,792
(whirring)
95
00:09:43,291 --> 00:09:44,959
(whirring)
96
00:09:46,834 --> 00:09:48,542
(beeping)
97
00:09:55,959 --> 00:09:59,458
Walter:
Well, we are engineers,
and we are really not
emotional guys.
98
00:09:59,875 --> 00:10:02,083
But sometimes Annie
does something funny,
99
00:10:02,166 --> 00:10:06,375
and that, of course,
invokes some amusement.
100
00:10:06,458 --> 00:10:08,917
Especially when Annie
happens to press
101
00:10:09,000 --> 00:10:12,000
one of the emergency stop
buttons by herself
102
00:10:12,083 --> 00:10:14,375
and is then incapacitated.
103
00:10:15,709 --> 00:10:17,625
We have some memories
104
00:10:17,709 --> 00:10:20,709
of that happening
in situations...
105
00:10:21,834 --> 00:10:25,000
and this was--
we have quite a laugh.
106
00:10:26,083 --> 00:10:27,875
-(whirring)
-Okay.
107
00:10:29,375 --> 00:10:32,792
Kodomoroid:
We became better at
learning by example.
108
00:10:32,875 --> 00:10:34,083
♪ ♪
109
00:10:34,166 --> 00:10:37,667
You could simply show
us how to do something.
110
00:10:37,750 --> 00:10:39,875
(Walter speaks German)
111
00:10:48,083 --> 00:10:50,667
Walter:
When you are an engineer
in the field of automation,
112
00:10:50,750 --> 00:10:52,875
you may face problems
with workers.
113
00:10:52,959 --> 00:10:55,250
Sometimes,
they get angry at you
114
00:10:55,333 --> 00:10:58,250
just by seeing you somewhere,
and shout at you,
115
00:10:58,333 --> 00:10:59,834
"You are taking my job away."
116
00:10:59,917 --> 00:11:02,375
"What? I'm not here
to take away your job."
117
00:11:03,500 --> 00:11:07,333
But, yeah, sometimes
you get perceived that way.
118
00:11:07,417 --> 00:11:08,917
But, I think,
119
00:11:09,000 --> 00:11:12,125
in the field of human-robot
collaboration, where...
120
00:11:12,208 --> 00:11:15,333
we actually are
working at the moment,
121
00:11:15,417 --> 00:11:17,583
mostly is...
122
00:11:17,667 --> 00:11:19,500
human-robot
collaboration is a thing
123
00:11:19,583 --> 00:11:21,417
where we don't want
to replace a worker.
124
00:11:21,500 --> 00:11:23,041
We want to support workers,
125
00:11:23,125 --> 00:11:24,709
and we want to, yeah...
126
00:11:25,625 --> 00:11:27,041
to, yeah...
127
00:11:27,125 --> 00:11:29,250
(machinery rumbling)
128
00:11:29,917 --> 00:11:31,750
(latches clinking)
129
00:11:34,083 --> 00:11:35,750
(workers conversing
indistinctly)
130
00:11:39,125 --> 00:11:40,792
(machine hisses)
131
00:11:43,208 --> 00:11:44,500
(clicks)
132
00:11:44,583 --> 00:11:45,917
(hissing)
133
00:11:47,458 --> 00:11:49,583
♪ ♪
134
00:11:58,458 --> 00:12:01,417
Kodomoroid:
In order to work
alongside you,
135
00:12:01,500 --> 00:12:03,875
we needed to know which lines
136
00:12:03,959 --> 00:12:06,583
we could not cross.
137
00:12:08,792 --> 00:12:11,125
(whirring)
138
00:12:11,291 --> 00:12:12,959
(typing)
139
00:12:16,125 --> 00:12:18,208
(thudding)
140
00:12:18,291 --> 00:12:20,291
Stop. Stop. Stop.
141
00:12:21,959 --> 00:12:23,458
(thuds)
142
00:12:26,500 --> 00:12:28,083
Hi, I'm Simon Borgen.
143
00:12:28,166 --> 00:12:31,083
We are with Dr. Isaac Asimov,
144
00:12:31,166 --> 00:12:33,583
a biochemist, who may
be the most widely read
145
00:12:33,667 --> 00:12:35,709
of all science fiction writers.
146
00:12:36,834 --> 00:12:38,125
♪ ♪
147
00:12:38,208 --> 00:12:39,917
Kodomoroid:
In 1942,
148
00:12:40,000 --> 00:12:43,542
Isaac Asimov created
a set of guidelines
149
00:12:43,625 --> 00:12:46,125
to protect human society.
150
00:12:47,000 --> 00:12:50,291
The first law was that a robot
couldn't hurt a human being,
151
00:12:50,375 --> 00:12:51,750
or, through inaction,
152
00:12:51,834 --> 00:12:53,917
allow a human being
to come to harm.
153
00:12:54,000 --> 00:12:57,166
The second law was that
a robot had to obey
154
00:12:57,250 --> 00:12:59,166
orders given it
by human beings,
155
00:12:59,250 --> 00:13:01,959
provided that didn't
conflict with the first law.
156
00:13:02,041 --> 00:13:03,750
Scientists say that
when robots are built,
157
00:13:03,834 --> 00:13:06,083
that they may be built
according to these laws,
158
00:13:06,166 --> 00:13:08,542
and also that almost all
science fiction writers
159
00:13:08,625 --> 00:13:11,542
have adopted them as
well in their stories.
160
00:13:12,333 --> 00:13:14,041
(whirring)
161
00:13:15,750 --> 00:13:19,375
Sami Haddadin:
It's not just a statement
from a science fiction novel.
162
00:13:19,792 --> 00:13:21,667
My dissertation's
name was actually,
163
00:13:21,750 --> 00:13:22,875
Towards Safe Robots,
164
00:13:22,959 --> 00:13:24,792
Approaching
Asimov's First Law.
165
00:13:24,875 --> 00:13:27,208
How can we make robots
really fundamentally safe,
166
00:13:27,291 --> 00:13:30,792
according to
Isaac Asimov's first law.
167
00:13:31,417 --> 00:13:34,083
There was this accident
where a human worker
168
00:13:34,166 --> 00:13:36,041
got crushed
by an industrial robot.
169
00:13:36,917 --> 00:13:38,625
I was immediately
thinking that
170
00:13:38,709 --> 00:13:41,375
the robot is an industrial,
classical robot,
171
00:13:41,458 --> 00:13:44,542
not able to sense contact,
not able to interact.
172
00:13:44,625 --> 00:13:46,208
Is this a robot
173
00:13:46,291 --> 00:13:48,709
that we, kind of,
want to collaborate with?
174
00:13:48,792 --> 00:13:51,000
No, it's not.
It's inherently forbidden.
175
00:13:51,083 --> 00:13:52,333
We put them behind cages.
176
00:13:52,417 --> 00:13:53,834
We don't want to interact
with them.
177
00:13:53,917 --> 00:13:56,625
We put them behind cages
because they are dangerous,
178
00:13:56,709 --> 00:13:58,500
because they are
inherently unsafe.
179
00:13:58,583 --> 00:14:00,291
(whirring)
180
00:14:02,709 --> 00:14:04,542
(clatters)
181
00:14:06,583 --> 00:14:08,125
More than 10 years ago,
182
00:14:08,208 --> 00:14:11,583
I did the first experiments
in really understanding, uh,
183
00:14:11,667 --> 00:14:14,333
what does it mean
if a robot hits a human.
184
00:14:14,417 --> 00:14:16,458
-(grunts)
-(laughter)
185
00:14:17,041 --> 00:14:19,750
I put myself
as the first guinea pig.
186
00:14:20,417 --> 00:14:21,834
I didn't want to go through
187
00:14:21,917 --> 00:14:23,750
all the legal authorities.
188
00:14:23,834 --> 00:14:25,834
I just wanted to know it,
and that night,
189
00:14:25,917 --> 00:14:27,834
I decided at 6:00 p.m.,
190
00:14:27,917 --> 00:14:29,917
when everybody's gone,
I'm gonna do these experiments.
191
00:14:30,000 --> 00:14:33,625
And I took one of the students
to activate the camera,
192
00:14:33,709 --> 00:14:35,000
and then I just did it.
193
00:14:35,083 --> 00:14:37,000
♪ ♪
194
00:14:39,542 --> 00:14:41,458
-(smacks)
-(laughing)
195
00:14:43,291 --> 00:14:44,792
(whirring)
196
00:14:44,875 --> 00:14:47,917
A robot needs to understand
what does it mean to be safe,
197
00:14:48,000 --> 00:14:50,375
what is it that potentially
could harm a human being,
198
00:14:50,458 --> 00:14:52,583
and therefore, prevent that.
199
00:14:52,667 --> 00:14:55,333
So, the next generation
of robots that is
now out there,
200
00:14:55,417 --> 00:14:58,333
is fundamentally
designed for interaction.
201
00:14:59,250 --> 00:15:01,750
(whirring, clicking)
202
00:15:06,750 --> 00:15:08,125
(laughs)
203
00:15:12,458 --> 00:15:16,125
Kodomoroid:
Eventually, it was time
to leave our cages.
204
00:15:17,291 --> 00:15:20,166
♪ ♪
205
00:15:22,625 --> 00:15:24,542
(whirring)
206
00:15:28,875 --> 00:15:30,500
(beeping)
207
00:15:38,166 --> 00:15:40,000
(beeping)
208
00:15:41,375 --> 00:15:42,959
Nourbakhsh:
Techno-optimism
is when we decide
209
00:15:43,041 --> 00:15:44,250
to solve our problem
with technology.
210
00:15:44,333 --> 00:15:46,333
Then we turn our attention
to the technology,
211
00:15:46,417 --> 00:15:48,250
and we pay so much
attention to the technology,
212
00:15:48,333 --> 00:15:50,333
we stop caring about
the sociological issue
213
00:15:50,417 --> 00:15:52,458
we were trying to solve
in the first place.
214
00:15:52,542 --> 00:15:54,375
We'll innovate our way
out of the problems.
215
00:15:54,458 --> 00:15:57,709
Like, whether it's
agriculture or climate change
or whatever, terrorism.
216
00:15:57,792 --> 00:16:00,542
♪ ♪
217
00:16:06,875 --> 00:16:08,709
If you think about
where robotic automation
218
00:16:08,792 --> 00:16:10,959
and employment
displacement starts,
219
00:16:11,041 --> 00:16:13,625
it basically goes back
to industrial automation
220
00:16:13,709 --> 00:16:15,291
that was grand,
large-scale.
221
00:16:15,375 --> 00:16:17,125
Things like welding
machines for cars,
222
00:16:17,208 --> 00:16:19,625
that can move far faster
than a human arm can move.
223
00:16:19,709 --> 00:16:22,750
So, they're doing a job
that increases the rate
224
00:16:22,834 --> 00:16:24,834
at which the assembly
line can make cars.
225
00:16:24,917 --> 00:16:26,458
It displaces some people,
226
00:16:26,542 --> 00:16:29,291
but it massively increases
the GDP of the country
227
00:16:29,375 --> 00:16:31,291
because productivity goes up
because the machines are
228
00:16:31,375 --> 00:16:34,166
so much higher in productivity
terms than the people.
229
00:16:35,709 --> 00:16:39,291
Narrator:
These giant grasshopper-looking
devices work all by themselves
230
00:16:39,375 --> 00:16:41,542
on an automobile
assembly line.
231
00:16:41,625 --> 00:16:43,208
They never complain
about the heat
232
00:16:43,291 --> 00:16:45,375
or about the tedium
of the job.
233
00:16:46,500 --> 00:16:49,333
Nourbakhsh:
Fast-forward to today,
and it's a different dynamic.
234
00:16:49,417 --> 00:16:51,709
(grinding)
235
00:16:51,792 --> 00:16:54,166
You can buy
milling machine robots
236
00:16:54,250 --> 00:16:56,458
that can do all the things
a person can do,
237
00:16:56,542 --> 00:17:00,083
but the milling machine
robot only costs $30,000.
238
00:17:01,041 --> 00:17:03,417
We're talking about machines
now that are so cheap,
239
00:17:03,500 --> 00:17:05,125
that they do exactly
what a human does,
240
00:17:05,208 --> 00:17:07,500
with less money,
even in six months,
241
00:17:07,583 --> 00:17:08,834
than the human costs.
242
00:17:08,917 --> 00:17:11,417
♪ ♪
243
00:17:25,625 --> 00:17:27,792
(Wu Huifen speaking Chinese)
244
00:18:01,792 --> 00:18:04,667
(whirring)
245
00:18:04,750 --> 00:18:07,834
♪ ♪
246
00:18:18,000 --> 00:18:19,333
(beeping)
247
00:19:22,208 --> 00:19:25,500
♪ ♪
248
00:19:29,834 --> 00:19:33,667
Kodomoroid:
After the first wave
of industrial automation,
249
00:19:33,750 --> 00:19:36,667
the remaining
manufacturing jobs
250
00:19:36,750 --> 00:19:40,250
required fine motor skills.
251
00:20:02,709 --> 00:20:05,750
(bell ringing)
252
00:20:08,583 --> 00:20:10,917
-(indistinct chatter)
-(ringing continuing)
253
00:20:16,125 --> 00:20:17,959
(beeping, chimes)
254
00:20:23,208 --> 00:20:25,250
(beeping, chimes)
255
00:20:26,917 --> 00:20:31,500
Kodomoroid:
We helped factory owners
monitor their workers.
256
00:20:31,583 --> 00:20:32,625
(beeping)
257
00:20:33,792 --> 00:20:34,917
(beeping)
258
00:20:37,667 --> 00:20:39,375
♪ ♪
259
00:20:39,458 --> 00:20:40,417
(beeping)
260
00:20:42,959 --> 00:20:44,291
(beeping)
261
00:20:50,959 --> 00:20:52,000
(beeping)
262
00:21:01,458 --> 00:21:02,709
(sizzling)
263
00:21:06,625 --> 00:21:08,834
(Li Zheng speaking Chinese)
264
00:21:29,208 --> 00:21:31,333
♪ ♪
265
00:21:46,583 --> 00:21:48,208
(beeping)
266
00:21:56,875 --> 00:21:59,041
(whirring)
267
00:22:53,917 --> 00:22:55,834
(sizzling)
268
00:22:56,583 --> 00:23:00,667
Kodomoroid:
Your advantage in
precision was temporary.
269
00:23:00,750 --> 00:23:02,792
♪ ♪
270
00:23:03,959 --> 00:23:07,417
We took over
the complex tasks.
271
00:23:11,875 --> 00:23:15,250
You moved to the end
of the production line.
272
00:23:15,834 --> 00:23:18,500
♪ ♪
273
00:23:35,417 --> 00:23:36,500
(beeping)
274
00:23:42,333 --> 00:23:43,667
(beeping)
275
00:23:47,458 --> 00:23:50,000
(indistinct chatter)
276
00:23:56,625 --> 00:23:59,667
(music playing on speaker)
277
00:24:03,709 --> 00:24:05,834
(man speaking in Chinese)
278
00:24:16,000 --> 00:24:18,125
(woman speaking Chinese)
279
00:24:27,667 --> 00:24:31,333
(man speaking on speaker)
280
00:24:40,709 --> 00:24:42,834
(man speaking Chinese)
281
00:24:56,875 --> 00:24:58,875
♪ ♪
282
00:24:58,959 --> 00:25:01,083
(Luo Jun speaking Chinese)
283
00:25:15,792 --> 00:25:17,417
(beeping)
284
00:25:18,208 --> 00:25:19,709
(chickens clucking)
285
00:25:24,583 --> 00:25:26,625
(chickens clucking)
286
00:25:36,250 --> 00:25:38,625
♪ ♪
287
00:25:44,959 --> 00:25:47,083
♪ ♪
288
00:25:47,792 --> 00:25:49,959
(woman speaking Chinese)
289
00:26:06,667 --> 00:26:08,291
(indistinct chatter)
290
00:26:13,625 --> 00:26:16,750
♪ ♪
291
00:26:19,375 --> 00:26:21,667
(Wang Chao speaking Chinese)
292
00:26:39,709 --> 00:26:41,834
(beeping)
293
00:27:02,208 --> 00:27:04,834
♪ ♪
294
00:27:25,709 --> 00:27:28,208
(buzzing)
295
00:27:32,917 --> 00:27:35,959
Automation of the service sector
296
00:27:36,041 --> 00:27:39,875
required your trust
and cooperation.
297
00:27:41,375 --> 00:27:44,834
Man:
Here we are, stop-and-go
traffic on 271, and--
298
00:27:44,917 --> 00:27:48,041
Ah, geez,
the car's doing it all itself.
299
00:27:48,125 --> 00:27:50,542
What am I gonna do
with my hands down here?
300
00:27:50,625 --> 00:27:52,542
(beeping)
301
00:27:59,917 --> 00:28:00,792
(beeps)
302
00:28:00,875 --> 00:28:02,917
And now,
it's on autosteer.
303
00:28:03,583 --> 00:28:06,417
So, now I've gone
completely hands-free.
304
00:28:06,917 --> 00:28:09,667
In the center area here
is where the big deal is.
305
00:28:09,750 --> 00:28:11,834
This icon up to
the left is my TACC,
306
00:28:11,917 --> 00:28:14,083
the Traffic-Aware
Cruise Control.
307
00:28:17,041 --> 00:28:19,208
It does a great job of
keeping you in the lane,
308
00:28:19,291 --> 00:28:20,625
and driving
down the road,
309
00:28:20,709 --> 00:28:22,709
and keeping you safe,
and all that kind of stuff,
310
00:28:22,792 --> 00:28:24,917
watching all
the other cars.
311
00:28:26,041 --> 00:28:28,542
Autosteer is probably going
to do very, very poorly.
312
00:28:28,625 --> 00:28:31,500
I'm in a turn
that's very sharp.
313
00:28:31,583 --> 00:28:33,792
-(beeping)
-And, yep,
it said take control.
314
00:28:36,875 --> 00:28:38,959
(horn honking)
315
00:28:39,959 --> 00:28:42,166
-(Twitter whistles)
-(indistinct video audio)
316
00:28:53,166 --> 00:28:55,208
(phone ringing)
317
00:28:55,291 --> 00:28:57,875
Operator (on phone):
911, what is the address
of your emergency?
318
00:28:57,959 --> 00:28:59,417
Man (on phone):
There was just a wreck.
319
00:28:59,500 --> 00:29:01,625
A head-on collision right
here-- Oh my God almighty.
320
00:29:02,250 --> 00:29:04,458
Operator:
Okay, sir, you're on 27?
321
00:29:04,542 --> 00:29:05,375
Man:
Yes, sir.
322
00:29:05,458 --> 00:29:08,000
♪ ♪
323
00:29:14,709 --> 00:29:17,709
Bobby Vankaveelar:
I had just got to work,
clocked in.
324
00:29:17,792 --> 00:29:19,750
They get a phone call
from my sister,
325
00:29:19,834 --> 00:29:21,750
telling me there was
a horrific accident.
326
00:29:21,834 --> 00:29:25,125
That there was somebody
deceased in the front yard.
327
00:29:25,542 --> 00:29:26,625
(beeping)
328
00:29:27,458 --> 00:29:30,792
The Tesla was coming down
the hill of highway 27.
329
00:29:30,875 --> 00:29:33,834
The sensor didn't read
the object in front of them,
330
00:29:33,917 --> 00:29:37,625
which was the, um,
semi-trailer.
331
00:29:37,709 --> 00:29:40,000
♪ ♪
332
00:29:45,959 --> 00:29:48,291
The Tesla went right
through the fence
333
00:29:48,375 --> 00:29:51,750
that borders the highway,
through to the retention pond,
334
00:29:51,834 --> 00:29:54,041
then came through
this side of the fence,
335
00:29:54,125 --> 00:29:56,166
that borders my home.
336
00:29:58,291 --> 00:30:01,417
(police radio chatter)
337
00:30:03,041 --> 00:30:05,625
So, I parked
right near here
338
00:30:05,709 --> 00:30:07,667
before I was asked
not to go any further.
339
00:30:07,750 --> 00:30:09,458
I don't wanna see
340
00:30:09,542 --> 00:30:11,625
what's in the veh-- you know,
what's in the vehicle.
341
00:30:11,709 --> 00:30:13,291
You know,
what had happened to him.
342
00:30:13,375 --> 00:30:15,917
♪ ♪
343
00:30:27,959 --> 00:30:29,667
(scanner beeping)
344
00:30:36,625 --> 00:30:39,083
(indistinct chatter)
345
00:30:40,208 --> 00:30:42,333
Donley:
After the police officer
come, they told me,
346
00:30:42,417 --> 00:30:44,917
about 15 minutes
after he was here,
that it was a Tesla.
347
00:30:45,000 --> 00:30:46,834
It was one of
the autonomous cars,
348
00:30:46,917 --> 00:30:51,542
um, and that
they were investigating
why it did not pick up
349
00:30:51,625 --> 00:30:53,834
or register that there
was a semi in front of it,
350
00:30:53,917 --> 00:30:56,041
you know, and start braking,
'cause it didn't even--
351
00:30:56,125 --> 00:30:58,959
You could tell from
the frameway up on top of
the hill it didn't even...
352
00:30:59,625 --> 00:31:02,000
It didn't even recognize
that there was anything
in front of it.
353
00:31:02,083 --> 00:31:04,083
It thought it was open road.
354
00:31:06,250 --> 00:31:09,500
Donley: You might have
an opinion on a Tesla
accident we had out here.
355
00:31:09,583 --> 00:31:11,709
(man speaking)
356
00:31:14,917 --> 00:31:17,041
It was bad,
that's for sure.
357
00:31:17,125 --> 00:31:19,542
I think people just rely
too much on the technology
358
00:31:19,625 --> 00:31:22,125
and don't pay attention
themselves, you know,
359
00:31:22,208 --> 00:31:23,875
to what's going on around them.
360
00:31:23,959 --> 00:31:27,875
Like, since, like him,
he would've known
that there was an issue
361
00:31:27,959 --> 00:31:32,875
if he wasn't relying on the car
to drive while he was
watching a movie.
362
00:31:32,959 --> 00:31:35,041
The trooper had told me
that the driver
363
00:31:35,125 --> 00:31:36,959
had been watching Harry Potter,
364
00:31:37,041 --> 00:31:39,000
you know, at the time
of the accident.
365
00:31:39,083 --> 00:31:42,041
♪ ♪
366
00:31:42,125 --> 00:31:44,625
A news crew from
Tampa, Florida, knocked
on the door, said,
367
00:31:44,709 --> 00:31:47,542
"This is where the accident
happened with the Tesla?"
368
00:31:47,625 --> 00:31:49,208
I said, "Yes, sir."
369
00:31:49,291 --> 00:31:53,041
And he goes, "Do you know
what the significance is
in this accident?"
370
00:31:53,125 --> 00:31:55,000
And I said,
"No, I sure don't."
371
00:31:55,083 --> 00:31:59,000
And he said,
"It's the very first death,
ever, in a driverless car."
372
00:32:00,125 --> 00:32:02,834
I said,
"Is it anybody local?"
373
00:32:02,917 --> 00:32:05,792
And he goes, "Nobody
around here drives a Tesla."
374
00:32:05,875 --> 00:32:09,333
Newsman:
...a deadly crash that's
raising safety concerns
375
00:32:09,417 --> 00:32:10,875
for everyone in Florida.
376
00:32:10,959 --> 00:32:12,583
Newswoman:
It comes as
the state is pushing
377
00:32:12,667 --> 00:32:15,583
to become the nation's
testbed for driverless cars.
378
00:32:15,667 --> 00:32:17,250
Newsman:
Tesla releasing a statement
379
00:32:17,333 --> 00:32:19,917
that cars in autopilot
have safely driven
380
00:32:20,000 --> 00:32:22,542
more than
130 million miles.
381
00:32:22,625 --> 00:32:24,250
Paluska:
ABC Action News reporter
Michael Paluska,
382
00:32:24,333 --> 00:32:27,500
in Williston, Florida,
tonight, digging for answers.
383
00:32:32,625 --> 00:32:33,959
(beeping)
384
00:32:37,083 --> 00:32:39,125
Paluska:
Big takeaway for
me at the scene was
385
00:32:39,208 --> 00:32:40,750
it just didn't stop.
386
00:32:40,834 --> 00:32:43,375
It was driving
down the road,
387
00:32:43,458 --> 00:32:46,125
with the entire top
nearly sheared off,
388
00:32:46,208 --> 00:32:47,417
with the driver dead
389
00:32:47,500 --> 00:32:49,583
after he hit
the truck at 74 mph.
390
00:32:49,667 --> 00:32:53,000
Why did the vehicle not
have an automatic shutoff?
391
00:32:53,083 --> 00:32:55,125
♪ ♪
392
00:32:55,208 --> 00:32:56,458
That was my big question,
393
00:32:56,542 --> 00:32:58,000
one of the questions
we asked Tesla,
394
00:32:58,083 --> 00:32:59,709
that didn't get answered.
395
00:33:00,500 --> 00:33:02,667
All of the statements
from Tesla were that
396
00:33:02,750 --> 00:33:05,417
they're advancing
the autopilot system,
397
00:33:05,500 --> 00:33:07,667
but everything was couched
398
00:33:07,750 --> 00:33:10,834
with the fact that if one
percent of accidents drop
399
00:33:10,917 --> 00:33:13,458
because that's the way
the autopilot system works,
400
00:33:13,542 --> 00:33:14,709
then that's a win.
401
00:33:14,792 --> 00:33:16,834
They kind of missed the mark,
402
00:33:16,917 --> 00:33:18,917
really honoring
Joshua Brown's life,
403
00:33:19,000 --> 00:33:20,458
and the fact that
he died driving a car
404
00:33:20,542 --> 00:33:22,291
that he thought was
going to keep him safe,
405
00:33:22,375 --> 00:33:25,917
at least safer than
the car that I'm driving,
406
00:33:26,250 --> 00:33:28,333
which is a dumb car.
407
00:33:28,417 --> 00:33:30,667
♪ ♪
408
00:33:30,750 --> 00:33:33,917
Vankaveelar:
To be okay with letting
409
00:33:34,000 --> 00:33:36,291
a machine...
410
00:33:36,375 --> 00:33:37,875
take you from
point A to point B,
411
00:33:37,959 --> 00:33:40,834
and then you
actually get used to
412
00:33:40,917 --> 00:33:43,250
getting from point A
to point B okay,
413
00:33:44,000 --> 00:33:46,625
it-- you get, your mind
gets a little bit--
414
00:33:46,709 --> 00:33:48,291
it's just my opinion, okay--
415
00:33:48,375 --> 00:33:51,792
you just, your mind
gets lazier each time.
416
00:33:53,625 --> 00:33:56,125
Kodomoroid:
The accident
was written off
417
00:33:56,208 --> 00:33:58,750
as a case of human error.
418
00:33:59,250 --> 00:34:00,583
(beeping)
419
00:34:01,583 --> 00:34:04,583
Former centers of
manufacturing became
420
00:34:04,667 --> 00:34:08,750
the testing grounds for
the new driverless taxis.
421
00:34:09,959 --> 00:34:12,291
♪ ♪
422
00:34:15,166 --> 00:34:16,625
Nourbakhsh:
If you think
about what happens
423
00:34:16,709 --> 00:34:18,000
when an autonomous
car hits somebody,
424
00:34:18,083 --> 00:34:20,458
it gets really complicated.
425
00:34:22,417 --> 00:34:23,917
The car company's
gonna get sued.
426
00:34:24,000 --> 00:34:25,542
The sensor-maker's
gonna get sued
427
00:34:25,625 --> 00:34:27,333
because they made
the sensor on the robot.
428
00:34:27,417 --> 00:34:30,375
The regulatory framework
is always gonna be behind,
429
00:34:30,458 --> 00:34:33,417
because robot invention
happens faster
430
00:34:33,500 --> 00:34:35,500
than lawmakers can think.
431
00:34:36,625 --> 00:34:38,542
Newswoman:
One of Uber's
self-driving vehicles
432
00:34:38,625 --> 00:34:40,000
killed a pedestrian.
433
00:34:40,083 --> 00:34:42,000
The vehicle was
in autonomous mode,
434
00:34:42,083 --> 00:34:45,792
with an operator
behind the wheel
when the woman was hit.
435
00:34:47,417 --> 00:34:51,125
Newswoman 2:
Tonight, Tesla confirming
this car was in autopilot mode
436
00:34:51,208 --> 00:34:54,750
when it crashed
in Northern California,
killing the driver,
437
00:34:54,834 --> 00:34:57,208
going on to blame
that highway barrier
438
00:34:57,291 --> 00:34:59,834
that's meant
to reduce impact.
439
00:35:01,917 --> 00:35:05,125
Kodomoroid:
After the first
self-driving car deaths,
440
00:35:05,208 --> 00:35:08,667
testing of the new
taxis was suspended.
441
00:35:10,000 --> 00:35:12,291
Nourbakhsh:
It's interesting when you
look at driverless cars.
442
00:35:12,375 --> 00:35:14,333
You see the same kinds
of value arguments.
443
00:35:14,417 --> 00:35:16,333
30,000 people die
every year,
444
00:35:16,417 --> 00:35:18,208
run-off-road accidents
in the US alone.
445
00:35:18,291 --> 00:35:19,709
So, don't we wanna
save all those lives?
446
00:35:19,792 --> 00:35:21,750
Let's have cars
drive instead.
447
00:35:21,834 --> 00:35:23,125
Now, you have
to start thinking
448
00:35:23,208 --> 00:35:25,458
about the side effects
on society.
449
00:35:26,333 --> 00:35:29,083
Are we getting rid of every
taxi driver in America?
450
00:35:31,375 --> 00:35:34,500
Our driver partners are
the heart and soul
of this company
451
00:35:34,959 --> 00:35:38,542
and the only reason we've come
this far in just five years.
452
00:35:39,583 --> 00:35:41,458
Nourbakhsh:
If you look at Uber's
first five years,
453
00:35:41,542 --> 00:35:43,542
they're actually
empowering people.
454
00:35:43,625 --> 00:35:46,333
But when the same company
does really hardcore research
455
00:35:46,417 --> 00:35:48,250
to now replace
all those people,
456
00:35:48,333 --> 00:35:50,166
so they don't
need them anymore,
457
00:35:50,250 --> 00:35:51,542
then what you're
seeing is
458
00:35:51,625 --> 00:35:53,542
they're already a highly
profitable company,
459
00:35:53,625 --> 00:35:56,208
but they simply want
to increase that profit.
460
00:35:58,500 --> 00:36:00,375
(beeping)
461
00:36:08,500 --> 00:36:10,375
(beeping)
462
00:36:14,000 --> 00:36:17,000
Kodomoroid:
Eventually,
testing resumed.
463
00:36:17,667 --> 00:36:22,041
Taxi drivers' wages became
increasingly unstable.
464
00:36:24,291 --> 00:36:27,542
Newsman:
Police say a man drove up
to a gate outside city hall
465
00:36:27,625 --> 00:36:29,583
and shot himself
in the head.
466
00:36:29,667 --> 00:36:32,250
Newswoman:
He left a note saying
services such as Uber
467
00:36:32,333 --> 00:36:34,667
had financially
ruined his life.
468
00:36:34,750 --> 00:36:37,250
Newsman:
Uber and other
mobile app services
469
00:36:37,333 --> 00:36:39,500
have made a once
well-paying industry
470
00:36:39,583 --> 00:36:42,667
into a mass
of long hours, low pay,
471
00:36:42,750 --> 00:36:44,917
and economic insecurity.
472
00:36:45,583 --> 00:36:48,041
♪ ♪
473
00:36:51,667 --> 00:36:55,625
Kodomoroid:
Drivers were the biggest
part of the service economy.
474
00:36:57,959 --> 00:36:59,792
(beeping)
475
00:37:02,166 --> 00:37:03,667
Brandon Ackerman:
My father, he drove.
476
00:37:03,750 --> 00:37:04,750
My uncle drove.
477
00:37:04,834 --> 00:37:07,458
I kind of grew
up into trucking.
478
00:37:10,625 --> 00:37:12,792
Some of the new
technology that came out
479
00:37:12,875 --> 00:37:15,750
is taking a lot of
the freedom of the job away.
480
00:37:16,834 --> 00:37:18,458
It's more stressful.
481
00:37:19,959 --> 00:37:22,333
Kodomoroid:
Automation of trucking began
482
00:37:22,417 --> 00:37:24,458
with monitoring the drivers
483
00:37:24,542 --> 00:37:26,667
and simplifying their job.
484
00:37:27,875 --> 00:37:30,458
There's a radar system.
485
00:37:30,542 --> 00:37:31,834
There's a camera system.
486
00:37:31,917 --> 00:37:35,041
There's automatic braking
and adaptive cruise.
487
00:37:35,125 --> 00:37:36,959
Everything is controlled--
488
00:37:37,041 --> 00:37:39,583
when you sleep,
how long you break,
489
00:37:39,667 --> 00:37:42,166
where you drive,
where you fuel,
490
00:37:42,834 --> 00:37:44,125
where you shut down.
491
00:37:44,208 --> 00:37:46,834
It even knows if somebody
was in the passenger seat.
492
00:37:47,625 --> 00:37:51,542
When data gets sent through
the broadband to the company,
493
00:37:52,458 --> 00:37:54,875
sometimes,
you're put in a situation,
494
00:37:56,417 --> 00:37:58,208
maybe because the truck
495
00:37:58,291 --> 00:38:00,291
automatically slowed
you down on the hill
496
00:38:00,375 --> 00:38:02,917
that's a perfectly
good straightaway,
497
00:38:03,000 --> 00:38:04,417
slowed your
average speed down,
498
00:38:04,500 --> 00:38:07,166
so you were one mile
shy of making it
499
00:38:07,250 --> 00:38:09,959
to that safe haven,
and you have to...
500
00:38:10,875 --> 00:38:14,166
get a-- take a chance
of shutting down
on the side of the road.
501
00:38:14,250 --> 00:38:16,792
An inch is a mile out here.
Sometimes you just...
502
00:38:16,875 --> 00:38:19,166
say to yourself, "Well,
I violate the clock one minute,
503
00:38:19,250 --> 00:38:22,208
I might as well just
drive another 600 miles."
504
00:38:23,625 --> 00:38:26,583
You know, but then you're...
you might lose your job.
505
00:38:26,667 --> 00:38:29,709
♪ ♪
506
00:38:33,125 --> 00:38:35,625
We're concerned that it,
it's gonna reduce
507
00:38:35,709 --> 00:38:38,333
the skill of
a truck driver and the pay.
508
00:38:39,083 --> 00:38:41,959
Because you're not gonna
be really driving a truck.
509
00:38:42,041 --> 00:38:43,375
It's gonna be the computer.
510
00:38:47,667 --> 00:38:49,959
Some of us are worried
about losing our houses,
511
00:38:50,041 --> 00:38:51,375
our cars...
512
00:38:54,208 --> 00:38:56,500
having a place to stay.
Some people...
513
00:38:56,583 --> 00:38:59,500
drive a truck just for
the medical insurance,
514
00:38:59,583 --> 00:39:03,208
and a place to stay
and the ability to travel.
515
00:39:03,291 --> 00:39:06,875
♪ ♪
516
00:39:28,166 --> 00:39:31,375
Kodomoroid:
Entire industries
disappeared,
517
00:39:31,458 --> 00:39:34,750
leaving whole regions
in ruins.
518
00:39:37,750 --> 00:39:40,709
♪ ♪
519
00:39:41,583 --> 00:39:44,375
Martin Ford:
Huge numbers of people
feel very viscerally
520
00:39:44,458 --> 00:39:46,458
that they are being left
behind by the economy,
521
00:39:46,542 --> 00:39:49,000
and, in fact,
they're right, they are.
522
00:39:49,083 --> 00:39:51,291
People, of course,
would be more inclined
523
00:39:51,375 --> 00:39:53,458
to point at globalization
524
00:39:53,542 --> 00:39:55,542
or at, maybe, immigration
as being the problems,
525
00:39:55,625 --> 00:39:57,834
but, actually, technology has
played a huge role already.
526
00:39:57,917 --> 00:39:59,166
♪ ♪
527
00:39:59,250 --> 00:40:01,375
Kodomoroid:
The rise of
personal computers
528
00:40:01,458 --> 00:40:04,750
ushered in an era
of digital automation.
529
00:40:06,166 --> 00:40:07,458
(beeping)
530
00:40:08,792 --> 00:40:11,875
Ford:
In the 1990s, I was running
a small software company.
531
00:40:11,959 --> 00:40:13,709
Software was
a tangible product.
532
00:40:13,792 --> 00:40:17,166
You had to put a CD in a box
and send it to a customer.
533
00:40:19,625 --> 00:40:22,750
So, there was a lot of work
there for average people,
534
00:40:22,834 --> 00:40:26,000
people that didn't necessarily
have lots of education.
535
00:40:26,625 --> 00:40:30,500
But I saw in my own business
how that just evaporated
very, very rapidly.
536
00:40:37,333 --> 00:40:39,792
Historically, people move
from farms to factories,
537
00:40:39,875 --> 00:40:42,750
and then later on, of course,
factories automated,
538
00:40:42,834 --> 00:40:45,083
and they off-shored,
and then people moved
to the service sector,
539
00:40:45,166 --> 00:40:48,250
which is where most people
now work in the United States.
540
00:40:48,333 --> 00:40:50,834
♪ ♪
541
00:40:54,500 --> 00:40:57,083
Julia Collins:
I lived on a water buffalo
farm in the south of Italy.
542
00:40:57,166 --> 00:41:00,208
We had 1,000 water buffalo,
and every buffalo
had a different name.
543
00:41:00,291 --> 00:41:04,291
And they were all these
beautiful Italian names
like Tiara, Katerina.
544
00:41:04,375 --> 00:41:06,125
And so, I thought
it would be fun
545
00:41:06,208 --> 00:41:08,542
to do the same thing
with our robots at Zume.
546
00:41:09,875 --> 00:41:11,834
The first two
robots that we have
547
00:41:11,917 --> 00:41:13,542
are named Giorgio and Pepe,
548
00:41:13,625 --> 00:41:15,834
and they dispense sauce.
549
00:41:17,667 --> 00:41:20,959
And then the next robot,
Marta, she's
a FlexPicker robot.
550
00:41:21,041 --> 00:41:23,166
She looks like
a gigantic spider.
551
00:41:24,166 --> 00:41:26,917
And what this robot does
is spread the sauce.
552
00:41:29,709 --> 00:41:31,291
Then we have Bruno.
553
00:41:31,375 --> 00:41:32,959
This is an incredibly
powerful robot,
554
00:41:33,041 --> 00:41:34,792
but he also has
to be very delicate,
555
00:41:34,875 --> 00:41:37,500
so that he can take pizza
off of the assembly line,
556
00:41:37,583 --> 00:41:39,709
and put it into
the 800-degree oven.
557
00:41:41,375 --> 00:41:44,417
And the robot can do this
10,000 times in a day.
558
00:41:46,083 --> 00:41:48,083
Lots of people have
used automation
559
00:41:48,166 --> 00:41:50,834
to create food at scale,
560
00:41:50,917 --> 00:41:53,083
making 10,000 cheese pizzas.
561
00:41:53,917 --> 00:41:56,166
What we're doing is
developing a line
562
00:41:56,250 --> 00:41:58,250
that can respond dynamically
563
00:41:58,333 --> 00:42:01,208
to every single customer
order, in real time.
564
00:42:01,291 --> 00:42:02,917
That hasn't been done before.
565
00:42:05,083 --> 00:42:06,834
So, as you can see right now,
566
00:42:06,917 --> 00:42:08,333
Jose will use the press,
567
00:42:08,417 --> 00:42:11,041
but then he still has to work
the dough with his hands.
568
00:42:11,709 --> 00:42:14,834
So, this is a step that's
not quite optimized yet.
569
00:42:14,917 --> 00:42:17,959
♪ ♪
570
00:42:20,417 --> 00:42:22,792
We have a fifth robot
that's getting fabricated
571
00:42:22,875 --> 00:42:25,166
at a shop across the bay.
572
00:42:25,250 --> 00:42:27,500
He's called Vincenzo.
He takes pizza out,
573
00:42:27,583 --> 00:42:29,834
and puts it into
an individual mobile oven
574
00:42:29,917 --> 00:42:31,917
for transport and delivery.
575
00:42:36,041 --> 00:42:37,542
Ford:
Any kind of work that is
576
00:42:37,625 --> 00:42:41,000
fundamentally routine and
repetitive is gonna disappear,
577
00:42:41,083 --> 00:42:44,000
and we're simply not equipped
to deal with that politically,
578
00:42:44,083 --> 00:42:46,208
because maybe
the most toxic word
579
00:42:46,291 --> 00:42:49,417
in our political vocabulary
is redistribution.
580
00:42:51,291 --> 00:42:53,625
There aren't gonna be
any rising new sectors
581
00:42:53,709 --> 00:42:55,917
that are gonna be there
to absorb all these workers
582
00:42:56,000 --> 00:42:58,083
in the way that, for example,
that manufacturing was there
583
00:42:58,166 --> 00:43:00,709
to absorb all those
agricultural workers
584
00:43:00,792 --> 00:43:02,917
because AI is going
to be everywhere.
585
00:43:03,000 --> 00:43:06,166
♪ ♪
586
00:43:08,208 --> 00:43:12,000
Kodomoroid:
Artificial intelligence
arrived in small steps.
587
00:43:12,792 --> 00:43:15,083
Profits from AI
concentrated
588
00:43:15,166 --> 00:43:18,291
in the hands of
the technology owners.
589
00:43:19,834 --> 00:43:23,917
Income inequality
reached extreme levels.
590
00:43:24,542 --> 00:43:29,792
-(beeping)
-The touchscreen made
most service work obsolete.
591
00:43:33,792 --> 00:43:35,709
♪ ♪
592
00:43:41,667 --> 00:43:43,375
♪ ♪
593
00:44:16,083 --> 00:44:17,500
Tim Hwang:
After I graduated college,
594
00:44:17,583 --> 00:44:20,583
I had a friend who had
just gone to law school.
595
00:44:21,709 --> 00:44:25,041
He was like, "Aw, man,
the first year of law school,
it's super depressing."
596
00:44:27,041 --> 00:44:29,750
All we're doing is
really rote, rote stuff.
597
00:44:31,041 --> 00:44:33,375
Reading through documents
and looking for a single word,
598
00:44:33,458 --> 00:44:34,667
or I spent the whole afternoon
599
00:44:34,750 --> 00:44:36,583
replacing this word
with another word.
600
00:44:36,667 --> 00:44:38,834
And as someone with a kind of
computer science background,
601
00:44:38,917 --> 00:44:42,166
I was like,
"There's so much here
that could be automated."
602
00:44:42,250 --> 00:44:43,917
(beeping)
603
00:44:48,583 --> 00:44:49,834
So, I saw law school
604
00:44:49,917 --> 00:44:52,792
as very much going three
years behind enemy lines.
605
00:44:58,250 --> 00:44:59,583
I took the bar exam,
606
00:44:59,667 --> 00:45:01,250
became a licensed lawyer,
607
00:45:01,333 --> 00:45:03,583
and went to a law firm,
608
00:45:04,083 --> 00:45:06,250
doing largely
transactional law.
609
00:45:07,375 --> 00:45:08,750
And there, my project was
610
00:45:08,834 --> 00:45:11,417
how much can
I automate of my own job?
611
00:45:14,208 --> 00:45:16,583
During the day, I would
manually do this task,
612
00:45:18,542 --> 00:45:19,792
and then at night,
I would go home,
613
00:45:19,875 --> 00:45:21,417
take these legal rules and say,
614
00:45:21,500 --> 00:45:23,750
could I create
a computer rule,
615
00:45:23,834 --> 00:45:25,208
a software rule,
616
00:45:25,291 --> 00:45:27,250
that would do what
I did during the day?
617
00:45:27,333 --> 00:45:29,125
♪ ♪
618
00:45:29,208 --> 00:45:32,333
In a lawsuit,
you get to see a lot
of the evidence
619
00:45:32,417 --> 00:45:34,291
that the other
side's gonna present.
620
00:45:34,375 --> 00:45:36,834
That amount of
documentation is huge.
621
00:45:38,959 --> 00:45:40,333
And the old way was actually,
622
00:45:40,417 --> 00:45:42,333
you would send an attorney
to go and look through
623
00:45:42,417 --> 00:45:44,834
every single page
that was in that room.
624
00:45:46,208 --> 00:45:49,000
The legal profession works
on an hourly billing system.
625
00:45:49,083 --> 00:45:51,000
So, I ended up in a kind of
interesting conundrum,
626
00:45:51,083 --> 00:45:53,792
where what I was doing
was making me more
and more efficient,
627
00:45:53,875 --> 00:45:55,083
I was doing more
and more work,
628
00:45:55,166 --> 00:45:57,709
but I was expending
less and less time on it.
629
00:45:57,792 --> 00:46:00,083
And I realized that this would
become a problem at some point,
630
00:46:00,166 --> 00:46:02,917
so I decided to go
independent. I quit.
631
00:46:03,000 --> 00:46:05,834
♪ ♪
632
00:46:09,166 --> 00:46:10,917
So, there's Apollo Cluster,
who has processed
633
00:46:11,000 --> 00:46:13,583
more than 10 million unique
transactions for clients,
634
00:46:13,667 --> 00:46:15,250
and we have another
partner, Daria,
635
00:46:15,333 --> 00:46:17,917
who focuses on transactions,
636
00:46:18,000 --> 00:46:19,834
and then, and then there's me.
637
00:46:21,917 --> 00:46:23,625
Our systems have generated
638
00:46:23,709 --> 00:46:26,375
tens of thousands
of legal documents.
639
00:46:26,458 --> 00:46:28,083
-(beeping)
-It's signed off by a lawyer,
640
00:46:28,166 --> 00:46:31,166
but largely, kind of, created
and mechanized by our systems.
641
00:46:33,917 --> 00:46:36,917
I'm fairly confident that
compared against human work,
642
00:46:37,000 --> 00:46:39,250
it would be indistinguishable.
643
00:46:39,333 --> 00:46:42,500
♪ ♪
644
00:46:43,709 --> 00:46:45,250
(beeping)
645
00:46:58,917 --> 00:47:00,709
(whirring)
646
00:47:03,709 --> 00:47:04,750
(beeping)
647
00:47:20,542 --> 00:47:22,625
(Ishiguro speaking)
648
00:47:37,959 --> 00:47:41,041
♪ ♪
649
00:47:57,417 --> 00:47:59,500
(whirring)
650
00:48:02,458 --> 00:48:04,458
(robot speaking in Japanese)
651
00:48:27,125 --> 00:48:29,250
(speaking Japanese)
652
00:48:36,291 --> 00:48:38,458
(indistinct chatter)
653
00:48:48,125 --> 00:48:49,333
(giggles)
654
00:48:49,875 --> 00:48:51,375
(beeping)
655
00:48:55,333 --> 00:48:57,500
(Hideaki speaking in Japanese)
656
00:49:13,333 --> 00:49:15,500
(Robot speaking Japanese)
657
00:49:43,125 --> 00:49:44,709
(Hideaki speaking Japanese)
658
00:50:06,834 --> 00:50:08,959
(woman speaking Japanese)
659
00:50:48,625 --> 00:50:51,291
♪ ♪
660
00:50:53,291 --> 00:50:55,250
(whirs, beeps)
661
00:50:57,125 --> 00:50:58,250
(beeping)
662
00:51:02,291 --> 00:51:04,583
(Niigaki speaking Japanese)
663
00:51:35,542 --> 00:51:38,667
♪ ♪
664
00:51:45,375 --> 00:51:46,667
(beeping)
665
00:51:53,458 --> 00:51:55,500
(whirring)
666
00:52:03,875 --> 00:52:06,000
(robot speaking Japanese)
667
00:52:09,750 --> 00:52:12,375
♪ ♪
668
00:52:13,583 --> 00:52:15,709
(speaking Japanese)
669
00:52:24,041 --> 00:52:26,875
(jazzy piano music playing)
670
00:52:27,959 --> 00:52:29,500
-(beeps)
-(lock clicks)
671
00:52:35,250 --> 00:52:37,250
(piano music continuing)
672
00:52:44,458 --> 00:52:46,667
(automated voice
speaking Japanese)
673
00:53:11,166 --> 00:53:15,041
When we first appeared,
we were a novelty.
674
00:53:17,417 --> 00:53:19,083
(man speaking Japanese)
675
00:53:29,750 --> 00:53:32,083
(automated voice
speaking Japanese)
676
00:53:39,792 --> 00:53:42,000
(automated voice
speaking Japanese)
677
00:53:46,166 --> 00:53:49,125
♪ ♪
678
00:53:55,250 --> 00:53:57,792
(buzzing)
679
00:53:59,417 --> 00:54:01,500
Kodomoroid:
While doing
your dirty work,
680
00:54:01,583 --> 00:54:04,500
we gathered data
about your habits
681
00:54:04,583 --> 00:54:06,792
-and preferences.
-(humming)
682
00:54:17,333 --> 00:54:19,500
We got to know you better.
683
00:54:26,000 --> 00:54:28,375
(buzzing)
684
00:54:30,375 --> 00:54:32,333
(beeping)
685
00:54:35,875 --> 00:54:37,917
Savvides:
The core of everything
we're doing in this lab,
686
00:54:38,000 --> 00:54:40,000
with our long-range
iris system
687
00:54:40,083 --> 00:54:41,959
is trying to
develop technology
688
00:54:42,041 --> 00:54:44,041
so that the computer
689
00:54:44,125 --> 00:54:46,500
can identify who we
are in a seamless way.
690
00:54:47,375 --> 00:54:48,792
And up till now,
691
00:54:48,875 --> 00:54:51,583
we always have to make
an effort to be identified.
692
00:54:51,667 --> 00:54:53,250
♪ ♪
693
00:54:53,333 --> 00:54:55,792
-(beeping)
-All the systems were very
close-range, Hollywood-style,
694
00:54:55,875 --> 00:54:57,625
where you had to go
close to the camera,
695
00:54:57,709 --> 00:55:01,417
and I always found that
challenging for a user.
696
00:55:02,041 --> 00:55:04,458
If I was a user
interacting with this...
697
00:55:04,542 --> 00:55:07,375
system, with this computer,
with this AI,
698
00:55:07,458 --> 00:55:08,750
I don't wanna be that close.
699
00:55:08,834 --> 00:55:10,917
I feel it's very invasive.
700
00:55:11,750 --> 00:55:13,750
So, what I wanted to
solve with my team here
701
00:55:13,834 --> 00:55:15,542
is how can we capture
702
00:55:15,625 --> 00:55:18,083
and identify who you
are from the iris,
703
00:55:18,166 --> 00:55:20,250
at a bigger distance?
How can we still do that,
704
00:55:20,333 --> 00:55:22,667
and have a pleasant
user experience?
705
00:55:22,750 --> 00:55:25,542
♪ ♪
706
00:55:30,625 --> 00:55:33,083
I think there's
a very negative stigma
707
00:55:33,166 --> 00:55:35,959
when people think about
biometrics and facial
recognition,
708
00:55:36,041 --> 00:55:38,875
and any kind of sort
of profiling of users
709
00:55:38,959 --> 00:55:42,750
for marketing purposes to
buy a particular product.
710
00:55:44,542 --> 00:55:48,166
I think the core
science is neutral.
711
00:55:48,250 --> 00:55:51,291
♪ ♪
712
00:55:52,917 --> 00:55:54,667
Nourbakhsh:
Companies go to no end
713
00:55:54,750 --> 00:55:56,750
to try and figure out
how to sell stuff.
714
00:55:56,834 --> 00:55:58,709
And the more information
they have on us,
715
00:55:58,792 --> 00:56:00,458
the better they
can sell us stuff.
716
00:56:00,542 --> 00:56:01,792
(beeping)
717
00:56:01,875 --> 00:56:03,458
We've reached a point where,
for the first time,
718
00:56:03,542 --> 00:56:04,875
robots are able to see.
719
00:56:04,959 --> 00:56:06,250
They can recognize faces.
720
00:56:06,333 --> 00:56:08,583
They can recognize
the expressions you make.
721
00:56:09,542 --> 00:56:12,375
They can recognize
the microexpressions you make.
722
00:56:14,667 --> 00:56:17,500
You can develop individualized
models of behavior
723
00:56:17,583 --> 00:56:19,542
for every person on Earth,
724
00:56:19,625 --> 00:56:21,291
attach machine learning to it,
725
00:56:21,375 --> 00:56:24,625
and come out with the perfect
model for how to sell to you.
726
00:56:25,041 --> 00:56:26,625
(door squeaks)
727
00:56:30,959 --> 00:56:33,166
(beeping)
728
00:56:38,125 --> 00:56:40,166
♪ ♪
729
00:56:41,583 --> 00:56:45,166
Kodomoroid:
You gave us your
undivided attention.
730
00:57:06,834 --> 00:57:09,083
(whirring)
731
00:57:09,166 --> 00:57:12,291
We offered reliable,
friendly service.
732
00:57:16,375 --> 00:57:19,834
Human capacities
began to deteriorate.
733
00:57:23,083 --> 00:57:28,208
Spatial orientation and memory
were affected first.
734
00:57:29,208 --> 00:57:31,709
♪ ♪
735
00:57:34,000 --> 00:57:37,208
The physical world
and the digital world
736
00:57:37,291 --> 00:57:39,375
became one.
737
00:57:43,000 --> 00:57:44,667
(neon sign buzzing)
738
00:57:46,917 --> 00:57:49,750
You were alone
with your desires.
739
00:57:49,834 --> 00:57:52,709
("What You Gonna Do Now?"
by Carla dal Forno playing)
740
00:58:16,458 --> 00:58:20,000
♪ What you gonna do now ♪
741
00:58:20,083 --> 00:58:25,583
♪ That the night's come
and it's around you? ♪
742
00:58:26,500 --> 00:58:30,041
♪ What you gonna do now ♪
743
00:58:30,125 --> 00:58:35,417
♪ That the night's come
and it surrounds you? ♪
744
00:58:36,542 --> 00:58:39,625
♪ What you gonna do now ♪
745
00:58:40,166 --> 00:58:44,458
♪ That the night's come
and it surrounds you? ♪
746
00:58:44,542 --> 00:58:46,458
(buzzing)
747
00:58:53,542 --> 00:58:56,375
Automation brought
the logic of efficiency
748
00:58:56,458 --> 00:58:59,500
to matters of life and death.
749
00:58:59,583 --> 00:59:01,792
Protesters:
Enough is enough!
750
00:59:01,875 --> 00:59:06,250
Enough is enough!
Enough is enough!
751
00:59:06,333 --> 00:59:08,458
-(gunfire)
-(screaming)
752
00:59:11,125 --> 00:59:15,291
Police Radio:
To all SWAT officers
on channel 2, code 3...
753
00:59:18,667 --> 00:59:20,542
♪ ♪
754
00:59:20,625 --> 00:59:21,875
Get back! Get back!
755
00:59:21,959 --> 00:59:24,542
-(gunfire)
-Police Radio:
The suspect has a rifle.
756
00:59:24,625 --> 00:59:26,375
-(police radio chatter)
-(sirens)
757
00:59:26,458 --> 00:59:27,583
(gunfire)
758
00:59:29,542 --> 00:59:31,792
(sirens wailing)
759
00:59:37,709 --> 00:59:40,750
Police Radio:
We have got to get
(unintelligible) down here...
760
00:59:40,834 --> 00:59:43,000
... right now!
(chatter continues)
761
00:59:43,083 --> 00:59:45,500
Man:
There's a fucking sniper!
He shot four cops!
762
00:59:45,583 --> 00:59:48,000
(gunfire)
763
00:59:48,083 --> 00:59:50,625
Woman:
I'm not going near him!
He's shooting right now!
764
00:59:51,500 --> 00:59:53,709
-(sirens continue)
-(gunfire)
765
00:59:54,875 --> 00:59:57,250
Police Radio:
Looks like he's inside
the El Centro building.
766
00:59:57,333 --> 00:59:59,166
-Inside the El Centro building.
-(radio beeps)
767
00:59:59,250 --> 01:00:02,291
-(gunfire)
-(helicopter whirring)
768
01:00:02,375 --> 01:00:04,291
(indistinct chatter)
769
01:00:05,041 --> 01:00:07,166
Police Radio:
We may have
a suspect pinned down.
770
01:00:07,250 --> 01:00:09,458
-Northwest corner
of the building.
-(radio beeps)
771
01:00:10,333 --> 01:00:13,750
Chris Webb:
Our armored car was
moving in to El Centro
772
01:00:13,834 --> 01:00:15,667
and so I jumped on the back.
773
01:00:18,834 --> 01:00:20,375
(beeping)
774
01:00:22,667 --> 01:00:23,709
(indistinct chatter)
775
01:00:23,792 --> 01:00:25,041
Came in through the rotunda,
776
01:00:25,125 --> 01:00:27,333
where I found two of our
intelligence officers.
777
01:00:27,583 --> 01:00:28,917
They were calm
and cool and they said,
778
01:00:29,041 --> 01:00:30,208
"Everything's upstairs."
779
01:00:32,625 --> 01:00:35,250
-There's a stairwell right here.
-(door squeaks)
780
01:00:35,333 --> 01:00:38,291
♪ ♪
781
01:00:38,542 --> 01:00:40,166
That's how I knew I was
going the right direction
782
01:00:40,250 --> 01:00:42,583
'cause I just kept
following the blood.
783
01:00:43,709 --> 01:00:44,917
Newswoman:
Investigators say
784
01:00:45,000 --> 01:00:46,959
Micah Johnson was
amassing an arsenal
785
01:00:47,041 --> 01:00:49,333
at his home outside Dallas.
786
01:00:49,417 --> 01:00:51,959
Johnson was
an Afghan war veteran.
787
01:00:52,041 --> 01:00:53,583
Every one of these
door handles,
788
01:00:53,667 --> 01:00:55,709
as we worked our way down,
789
01:00:56,333 --> 01:00:58,750
had blood on them,
where he'd been checking them.
790
01:01:00,083 --> 01:01:01,667
Newswoman:
This was a scene of terror
791
01:01:01,750 --> 01:01:04,333
just a couple of hours ago,
and it's not over yet.
792
01:01:04,417 --> 01:01:07,667
(helicopter whirring)
793
01:01:08,250 --> 01:01:09,792
(police radio chatter)
794
01:01:09,875 --> 01:01:12,959
♪ ♪
795
01:01:14,417 --> 01:01:17,500
Webb:
He was hiding behind,
like, a server room.
796
01:01:17,583 --> 01:01:19,875
Our ballistic tip rounds
were getting eaten up.
797
01:01:19,959 --> 01:01:21,083
(gunfire)
798
01:01:21,166 --> 01:01:22,709
He was just hanging
the gun out on the corner
799
01:01:22,792 --> 01:01:24,834
and just firing at the guys.
800
01:01:24,917 --> 01:01:27,417
(siren blares)
801
01:01:27,917 --> 01:01:29,166
(gunfire)
802
01:01:29,250 --> 01:01:30,792
And he kept enticing them.
"Hey, come on down!
803
01:01:30,875 --> 01:01:33,208
Come and get me! Let's go.
Let's get this over with."
804
01:01:35,250 --> 01:01:38,291
Brown:
This suspect we're negotiating
with for the last 45 minutes
805
01:01:38,375 --> 01:01:40,583
has been exchanging
gunfire with us
806
01:01:40,667 --> 01:01:43,792
and not being very cooperative
in the negotiations.
807
01:01:45,250 --> 01:01:47,166
Before I came here,
808
01:01:47,250 --> 01:01:49,250
I asked for plans
809
01:01:49,333 --> 01:01:51,542
to end this standoff,
810
01:01:51,625 --> 01:01:53,041
and as soon as
I'm done here,
811
01:01:53,125 --> 01:01:55,458
I'll be presented
with those plans.
812
01:01:55,542 --> 01:01:56,709
(police radio chatter)
813
01:01:56,792 --> 01:01:58,375
Webb:
Our team came up with the plan.
814
01:01:58,458 --> 01:02:00,375
Let's just blow him up.
815
01:02:00,458 --> 01:02:02,125
♪ ♪
816
01:02:03,750 --> 01:02:06,000
We had recently got
a hand-me-down robot
817
01:02:06,083 --> 01:02:08,208
from the Dallas ATF office,
818
01:02:08,291 --> 01:02:10,542
and so we were using it a lot.
819
01:02:10,625 --> 01:02:12,625
(whirring)
820
01:02:18,583 --> 01:02:19,625
(beeping)
821
01:02:23,291 --> 01:02:25,709
♪ ♪
822
01:02:25,792 --> 01:02:28,000
It was our bomb squad's robot,
but they didn't wanna have
823
01:02:28,083 --> 01:02:30,166
anything to do with what
we were doing with it.
824
01:02:30,250 --> 01:02:32,375
The plan was to
825
01:02:32,917 --> 01:02:36,542
set a charge off right on
top of this guy and kill him.
826
01:02:36,875 --> 01:02:39,542
And some people
just don't wanna...
827
01:02:39,625 --> 01:02:41,000
don't wanna do that.
828
01:02:44,208 --> 01:02:46,709
We saw no other option
829
01:02:46,792 --> 01:02:50,959
but to use our
bomb r-- bomb robot
830
01:02:52,250 --> 01:02:54,250
and place a device
831
01:02:54,333 --> 01:02:56,834
on its... extension.
832
01:02:59,000 --> 01:03:01,583
Webb:
He wanted something
to listen to music on,
833
01:03:01,667 --> 01:03:04,208
and so that was
a way for us to...
834
01:03:04,291 --> 01:03:06,959
to hide the robot
coming down the hall.
835
01:03:07,041 --> 01:03:08,375
"Okay, we'll bring
you some music.
836
01:03:08,458 --> 01:03:09,750
Hang on, let us get
this thing together."
837
01:03:09,834 --> 01:03:12,834
(ticking)
838
01:03:18,083 --> 01:03:19,834
It had a trash bag
over the charge
839
01:03:19,917 --> 01:03:21,458
to kinda hide the fact
that there was,
840
01:03:21,542 --> 01:03:25,041
you know, pound and a quarter
of C4 at the end of it.
841
01:03:31,291 --> 01:03:33,917
The minute the robot
got in position,
842
01:03:34,000 --> 01:03:35,917
the charge was detonated.
843
01:03:36,542 --> 01:03:37,834
(boom)
844
01:03:37,917 --> 01:03:41,000
(high-pitched ringing)
845
01:03:41,083 --> 01:03:44,208
♪ ♪
846
01:03:45,291 --> 01:03:48,375
(muted gunfire)
847
01:03:57,375 --> 01:04:00,000
He had gone down with his
finger on the trigger,
848
01:04:00,083 --> 01:04:02,041
and he was kinda hunched over.
849
01:04:07,792 --> 01:04:10,625
A piece of the robot's
hand had broken off,
850
01:04:10,709 --> 01:04:13,375
and hit his skull, which
caused a small laceration,
851
01:04:13,458 --> 01:04:14,583
which was what was bleeding.
852
01:04:16,125 --> 01:04:17,834
So, I just squeezed
through the,
853
01:04:17,917 --> 01:04:19,583
the little opening that...
854
01:04:19,667 --> 01:04:22,083
that the charge had
caused in the drywall,
855
01:04:22,166 --> 01:04:24,333
and separated him from the gun,
856
01:04:25,000 --> 01:04:28,166
and then we called up
the bomb squad to come in,
857
01:04:28,250 --> 01:04:30,583
and start their search
to make sure it was safe,
858
01:04:30,667 --> 01:04:32,834
that he wasn't sitting
on any explosives.
859
01:04:32,917 --> 01:04:35,125
♪ ♪
860
01:04:35,208 --> 01:04:37,542
Newsman:
The sniper hit
11 police officers,
861
01:04:37,625 --> 01:04:39,875
at least five of
whom are now dead,
862
01:04:39,959 --> 01:04:42,625
making it the deadliest
day in law enforcement
863
01:04:42,709 --> 01:04:44,166
since September 11th.
864
01:04:44,250 --> 01:04:46,750
They blew him up with a bomb
attached to a robot,
865
01:04:46,834 --> 01:04:49,375
that was actually built to
protect people from bombs.
866
01:04:49,458 --> 01:04:50,959
Newsman:
It's a tactic straight from
867
01:04:51,041 --> 01:04:53,625
America's wars in Iraq
and Afghanistan...
868
01:04:53,709 --> 01:04:56,583
Newsman 2:
The question for SWAT teams
nationwide is whether Dallas
869
01:04:56,667 --> 01:04:59,917
marks a watershed moment
in police tactics.
870
01:05:01,125 --> 01:05:04,959
Kodomoroid:
That night in Dallas,
a line was crossed.
871
01:05:06,375 --> 01:05:08,709
A robot must obey
872
01:05:08,792 --> 01:05:11,875
orders given it by
qualified personnel,
873
01:05:11,959 --> 01:05:16,083
unless those orders violate
rule number one. In other words,
874
01:05:16,166 --> 01:05:18,333
a robot can't be ordered
to kill a human being.
875
01:05:20,458 --> 01:05:22,500
Things are moving so quickly,
876
01:05:22,583 --> 01:05:26,125
that it's unsafe to go
forward blindly anymore.
877
01:05:26,208 --> 01:05:29,208
One must try to foresee
878
01:05:29,291 --> 01:05:32,208
where it is that one is
going as much as possible.
879
01:05:32,291 --> 01:05:35,834
♪ ♪
880
01:05:40,875 --> 01:05:43,083
Savvides:
We built the system
for the DOD,
881
01:05:43,166 --> 01:05:44,417
(indistinct chatter)
882
01:05:44,500 --> 01:05:46,667
and it was something that
could help the soldiers
883
01:05:46,750 --> 01:05:49,500
try to do iris
recognition in the field.
884
01:05:52,208 --> 01:05:54,166
♪ ♪
885
01:06:06,208 --> 01:06:08,375
We have collaborations
with law enforcement
886
01:06:08,458 --> 01:06:10,125
where they can test
their algorithms,
887
01:06:10,208 --> 01:06:11,875
and then give us feedback.
888
01:06:13,375 --> 01:06:16,458
It's always a face
behind a face, partial face.
889
01:06:16,542 --> 01:06:18,208
A face will be masked.
890
01:06:19,875 --> 01:06:21,333
Even if there's
occlusion,
891
01:06:21,417 --> 01:06:23,041
it still finds
the face.
892
01:06:27,542 --> 01:06:29,750
Nourbakhsh:
One of the ways we're
trying to make autonomous,
893
01:06:29,834 --> 01:06:31,333
war-fighting machines now
894
01:06:31,417 --> 01:06:33,458
is by using computer vision
895
01:06:33,542 --> 01:06:35,291
and guns together.
896
01:06:35,375 --> 01:06:36,709
(beeping)
897
01:06:36,792 --> 01:06:39,959
You make a database of
the images of known terrorists,
898
01:06:40,041 --> 01:06:43,542
and you tell the machine
to lurk and look for them,
899
01:06:43,625 --> 01:06:46,834
and when it matches a face
to its database, shoot.
900
01:06:46,917 --> 01:06:49,625
(gunfire)
901
01:06:51,583 --> 01:06:53,125
(gunfire)
902
01:06:54,917 --> 01:06:58,208
Those are robots that are
deciding to harm somebody,
903
01:06:58,291 --> 01:07:00,875
and that goes directly
against Asimov's first law.
904
01:07:00,959 --> 01:07:03,250
A robot may never harm a human.
905
01:07:04,083 --> 01:07:05,542
Every time we make a machine
906
01:07:05,625 --> 01:07:07,917
that's not really as
intelligent as a human,
907
01:07:08,000 --> 01:07:09,250
it's gonna get misused.
908
01:07:09,333 --> 01:07:11,917
And that's exactly where
Asimov's laws get muddy.
909
01:07:13,250 --> 01:07:14,959
This is, sort of,
the best image,
910
01:07:15,041 --> 01:07:17,500
but it's really out
of focus. It's blurry.
911
01:07:17,583 --> 01:07:20,834
There's occlusion due
to facial hair, hat,
912
01:07:21,166 --> 01:07:22,792
he's holding
a cell phone...
913
01:07:22,875 --> 01:07:25,667
So, we took that and we
reconstructed this,
914
01:07:25,750 --> 01:07:29,583
which is what we sent to
law enforcement at 2:42 AM.
915
01:07:29,667 --> 01:07:31,542
♪ ♪
916
01:07:31,625 --> 01:07:33,291
To get the eye coordinates,
917
01:07:33,375 --> 01:07:35,333
we crop out
the periocular region,
918
01:07:35,417 --> 01:07:37,375
which is the region
around the eyes.
919
01:07:38,000 --> 01:07:40,959
We reconstruct the whole
face based on this region.
920
01:07:41,041 --> 01:07:43,166
We run the whole face
against a matcher.
921
01:07:44,250 --> 01:07:46,959
And so this is what it
comes up with as a match.
922
01:07:48,166 --> 01:07:51,041
This is a reasonable
face you would expect
923
01:07:51,125 --> 01:07:53,542
that would make sense, right?
924
01:07:53,625 --> 01:07:55,959
Our brain does
a natural hallucination
925
01:07:56,041 --> 01:07:57,667
of what it doesn't see.
926
01:07:57,750 --> 01:08:00,542
It's just, how do we get
a computer to do the same thing?
927
01:08:00,625 --> 01:08:03,667
♪ ♪
928
01:08:08,667 --> 01:08:11,500
(police radio chatter)
929
01:08:16,458 --> 01:08:19,000
Man:
Five days ago, the soul
930
01:08:19,083 --> 01:08:20,917
of our city was pierced
931
01:08:21,000 --> 01:08:23,500
when police officers
were ambushed
932
01:08:23,583 --> 01:08:25,458
in a cowardly attack.
933
01:08:26,834 --> 01:08:28,750
Webb:
July 7th for me,
personally, was just,
934
01:08:28,834 --> 01:08:31,041
kinda like, I think
it got all I had left.
935
01:08:31,125 --> 01:08:34,000
I mean I'm like, I just,
I don't have a lot more to give.
936
01:08:34,083 --> 01:08:35,083
It's just not worth it.
937
01:08:35,166 --> 01:08:36,417
-(applause)
-(music playing)
938
01:08:36,500 --> 01:08:37,792
Thank you.
939
01:08:39,834 --> 01:08:40,625
Thank you.
940
01:08:40,709 --> 01:08:42,208
I think our chief of police
941
01:08:42,291 --> 01:08:44,291
did exactly what we
all wanted him to do,
942
01:08:44,375 --> 01:08:45,542
and he said the right things.
943
01:08:45,625 --> 01:08:48,041
These five men
944
01:08:48,125 --> 01:08:50,250
gave their lives
945
01:08:51,583 --> 01:08:53,709
for all of us.
946
01:08:54,917 --> 01:08:58,125
Unfortunately, our chief
told our city council,
947
01:08:58,208 --> 01:09:01,250
"We don't need more officers.
We need more technology."
948
01:09:01,917 --> 01:09:04,125
He specifically said
that to city council.
949
01:09:04,208 --> 01:09:05,542
(police radio chatter)
950
01:09:07,583 --> 01:09:09,917
In this day and age,
success of a police chief
951
01:09:10,000 --> 01:09:12,667
is based on response times
and crime stats.
952
01:09:12,750 --> 01:09:14,125
(beeping)
953
01:09:14,208 --> 01:09:17,375
And so, that becomes the focus
of the chain of command.
954
01:09:18,583 --> 01:09:20,542
So, a form of automation
in law enforcement
955
01:09:20,625 --> 01:09:24,583
is just driving everything
based on statistics and numbers.
956
01:09:25,709 --> 01:09:27,792
What I've lost in all
that number chasing
957
01:09:27,875 --> 01:09:30,709
is the interpersonal
relationship between the officer
958
01:09:30,792 --> 01:09:32,917
and the community that
that officer is serving.
959
01:09:34,250 --> 01:09:36,625
The best times in police work
are when you got to go out
960
01:09:36,709 --> 01:09:38,417
and meet people and get
to know your community,
961
01:09:38,500 --> 01:09:41,250
and go get to know
the businesses on your beat.
962
01:09:43,041 --> 01:09:45,083
And, at least in Dallas,
that's gone.
963
01:09:47,750 --> 01:09:50,125
We become less personal,
964
01:09:50,208 --> 01:09:51,750
and more robotic.
965
01:09:51,834 --> 01:09:53,250
Which is a shame
966
01:09:53,333 --> 01:09:56,208
because it's supposed to
be me interacting with you.
967
01:09:57,750 --> 01:10:00,875
♪ ♪
968
01:10:05,375 --> 01:10:07,542
(Zhen Jiajia speaking Chinese)
969
01:10:10,417 --> 01:10:11,875
(beeping)
970
01:10:29,625 --> 01:10:31,667
(typing)
971
01:10:51,792 --> 01:10:53,917
(office chatter)
972
01:11:44,709 --> 01:11:45,750
(beeping)
973
01:11:46,750 --> 01:11:48,917
(automated voice
speaking Chinese)
974
01:12:06,208 --> 01:12:07,542
♪ ♪
975
01:13:24,333 --> 01:13:25,333
(beeps)
976
01:13:27,875 --> 01:13:30,333
(beep, music playing)
977
01:14:20,333 --> 01:14:21,667
(sighs)
978
01:14:25,750 --> 01:14:28,875
♪ ♪
979
01:14:39,375 --> 01:14:41,500
♪ ♪
980
01:15:10,709 --> 01:15:13,458
Kodomoroid:
You worked to improve
our abilities.
981
01:15:15,000 --> 01:15:18,542
Some worried that one day,
we would surpass you,
982
01:15:19,542 --> 01:15:22,750
but the real milestone
was elsewhere.
983
01:15:28,125 --> 01:15:32,417
The dynamic between
robots and humans changed.
984
01:15:33,458 --> 01:15:37,041
You could no longer
tell where you ended,
985
01:15:37,291 --> 01:15:39,208
and we began.
986
01:15:42,709 --> 01:15:45,834
(chatter, laughter)
987
01:15:45,917 --> 01:15:48,834
John Campbell:
It seems to me that what's so
valuable about our society,
988
01:15:48,917 --> 01:15:50,917
what's so valuable about
our lives together,
989
01:15:51,000 --> 01:15:54,125
is something that we
do not want automated.
990
01:15:55,792 --> 01:15:57,583
What most of us value,
991
01:15:57,667 --> 01:15:59,375
probably more than
anything else,
992
01:15:59,458 --> 01:16:03,417
is the idea of authentic
connection with another person.
993
01:16:04,250 --> 01:16:05,834
(beeping)
994
01:16:08,709 --> 01:16:11,250
And then, we say, "No,
but we can automate this."
995
01:16:12,125 --> 01:16:14,250
"We have a robot that...
996
01:16:15,250 --> 01:16:17,000
"it will listen
sympathetically to you.
997
01:16:17,083 --> 01:16:18,375
"It will make eye contact.
998
01:16:19,125 --> 01:16:23,291
"It remembers everything you
ever told it, cross-indexes."
999
01:16:24,500 --> 01:16:26,625
-When you see a robot that
-(beep)
1000
01:16:26,709 --> 01:16:28,959
its ingenious creator
1001
01:16:29,041 --> 01:16:31,709
has carefully designed
1002
01:16:31,792 --> 01:16:33,917
to pull out your
empathetic responses,
1003
01:16:34,000 --> 01:16:36,000
it's acting as if it's in pain.
1004
01:16:37,417 --> 01:16:39,375
The biggest danger there
is the discrediting
1005
01:16:39,458 --> 01:16:41,333
of our empathetic responses.
1006
01:16:41,417 --> 01:16:45,041
where the empathizing-with-pain
reflex is discredited.
1007
01:16:45,125 --> 01:16:46,750
If I'm going to override it,
1008
01:16:46,834 --> 01:16:48,250
if I'm not going to
take it seriously
1009
01:16:48,333 --> 01:16:49,500
in the case
of the robot,
1010
01:16:49,583 --> 01:16:51,458
then I have to go back
and think again
1011
01:16:51,542 --> 01:16:55,291
as to why I take it
seriously in the case of
1012
01:16:56,041 --> 01:16:58,125
helping you when
you are badly hurt.
1013
01:16:58,208 --> 01:17:01,834
It undermines the only
thing that matters to us.
1014
01:17:03,959 --> 01:17:07,000
♪ ♪
1015
01:17:54,834 --> 01:17:58,166
("Hikkoshi"
by Maki Asakawa playing)
1016
01:17:58,250 --> 01:18:01,667
Kodomoroid:
And so, we lived among you.
1017
01:18:01,750 --> 01:18:03,709
Our numbers grew.
1018
01:18:05,375 --> 01:18:07,458
Automation continued.
1019
01:18:08,709 --> 01:18:10,875
(woman singing in Japanese)
1020
01:18:21,750 --> 01:18:23,375
♪ ♪
1021
01:18:43,125 --> 01:18:44,667
♪ ♪
1022
01:18:57,125 --> 01:18:59,291
(speaking Japanese)
1023
01:19:14,792 --> 01:19:16,875
♪ ♪
1024
01:19:30,458 --> 01:19:32,500
♪ ♪