1
00:00:09,080 --> 00:00:09,920
[Narrator] As far as we know,
we're the only beings on the planet
2
00:00:09,921 --> 00:00:12,559
conscious of our own existence.
3
00:00:12,560 --> 00:00:15,319
Is Homo sapiens also to be
the final species of mankind?
4
00:00:15,320 --> 00:00:19,719
[Sjoblad] There have been a number
of human species on this planet.
5
00:00:19,720 --> 00:00:23,599
All of these have lived
and they have passed away.
6
00:00:23,600 --> 00:00:28,199
Who says that Homo sapiens
are any different?
7
00:00:28,200 --> 00:00:30,119
Of course, we too are
just a passing stage
8
00:00:30,120 --> 00:00:33,639
in the evolution of this planet.
9
00:00:33,640 --> 00:00:36,519
[Narrator] But are we
rendering ourselves obsolete?
10
00:00:36,520 --> 00:00:40,959
It is not obvious to me
11
00:00:40,960 --> 00:00:42,279
that a replacement
of our species
12
00:00:42,280 --> 00:00:45,599
by our own
technological creation
13
00:00:45,600 --> 00:00:47,360
would necessarily
be a bad thing.
14
00:00:52,320 --> 00:00:53,599
[Sandberg] I'm not very
afraid of artificial intelligence.
15
00:00:53,600 --> 00:00:57,359
I'm rather optimistic
16
00:00:57,360 --> 00:00:57,960
that this is going to be
one of the best things
17
00:00:57,961 --> 00:00:59,959
that have ever happened.
18
00:00:59,960 --> 00:01:02,519
[Narrator] Other
experts are skeptical.
19
00:01:02,520 --> 00:01:05,159
Ask yourself: Are you
personally prepared
20
00:01:05,160 --> 00:01:06,920
to become a member of
the number-two species?
21
00:01:08,600 --> 00:01:14,559
Are you prepared
22
00:01:14,560 --> 00:01:18,200
to take the risk that machines could
become far more intelligent than we are?
23
00:01:20,960 --> 00:01:22,639
[Narrator] This is
not science fiction,
24
00:01:22,640 --> 00:01:24,256
but the logical consequence
of technological progress.
25
00:01:24,280 --> 00:01:27,640
[Indistinct radio chatter]
26
00:01:34,880 --> 00:01:39,679
[Narrator] The digital age
27
00:01:39,680 --> 00:01:40,999
has delivered boundless
possibilities for humankind.
28
00:01:41,000 --> 00:01:44,559
After all, technology
has always been intended
29
00:01:44,560 --> 00:01:46,080
to help optimize
our natural abilities.
30
00:01:52,120 --> 00:01:53,799
But now computers and robots
31
00:01:53,800 --> 00:01:55,759
are being designed to
become more human.
32
00:01:55,760 --> 00:01:58,879
We're evolving together,
merging into one.
33
00:01:58,880 --> 00:02:02,000
The final consequence
is a controversial vision.
34
00:02:02,800 --> 00:02:06,200
Super sapiens.
35
00:02:15,560 --> 00:02:18,920
[Narrator] Our planet
originated 4.6 billion years ago.
36
00:02:24,880 --> 00:02:25,880
Homo sapiens started
to walk the earth
37
00:02:25,881 --> 00:02:28,599
around the 200,000-year mark.
38
00:02:28,600 --> 00:02:30,999
He developed
culture, art, writing,
39
00:02:31,000 --> 00:02:33,159
and he built monuments.
40
00:02:33,160 --> 00:02:38,639
Humans also
41
00:02:38,640 --> 00:02:42,479
started to alter their environment.
Intelligent man began to shape the world
42
00:02:42,480 --> 00:02:44,600
according to his wishes.
43
00:02:49,280 --> 00:02:50,280
With the onset of the
industrial revolution,
44
00:02:50,281 --> 00:02:53,239
the pace of human
progress exploded.
45
00:02:53,240 --> 00:02:55,999
New technologies helped
us to explore our origins
46
00:02:56,000 --> 00:02:58,240
and look into the future.
47
00:03:02,400 --> 00:03:06,399
Today, the sum of our knowledge
48
00:03:06,400 --> 00:03:08,599
is virtually doubling
on a monthly basis,
49
00:03:08,600 --> 00:03:11,159
multiplying digitally at
such an extreme speed
50
00:03:11,160 --> 00:03:15,079
that we can hardly keep pace.
51
00:03:15,080 --> 00:03:19,639
But what does this
mean for humanity?
52
00:03:19,640 --> 00:03:22,199
[Dawkins] In
biological evolution,
53
00:03:22,200 --> 00:03:25,399
nothing ever plans ahead.
54
00:03:25,400 --> 00:03:29,919
It's always the survival of the immediate short-term
advantage. Nowadays, we can look into the distant future
55
00:03:29,920 --> 00:03:32,399
and plan for
long-term advantage,
56
00:03:32,400 --> 00:03:35,319
even at the expense
of short-term advantage.
57
00:03:35,320 --> 00:03:38,079
That's unique, that's new,
that's never existed before.
58
00:03:38,080 --> 00:03:41,440
[Indistinct news
reports overlapping]
59
00:03:45,000 --> 00:03:45,960
[Narrator] Inside an
unassuming home
60
00:03:45,961 --> 00:03:48,559
just north of San Francisco,
61
00:03:48,560 --> 00:03:50,160
the future of humanity
is being imagined.
62
00:03:50,240 --> 00:03:54,959
My name is Randal Koene.
63
00:03:54,960 --> 00:03:56,639
And I guess my life's
mission is to create
64
00:03:56,640 --> 00:03:58,719
what I call
substrate-independent minds,
65
00:03:58,720 --> 00:04:01,159
or what's popularly
known as mind uploading,
66
00:04:01,160 --> 00:04:02,959
the ability to move
a person's mind
67
00:04:02,960 --> 00:04:05,559
into a digital substrate.
68
00:04:05,560 --> 00:04:08,199
My background is in
computational neuroscience
69
00:04:08,200 --> 00:04:10,519
and electrical
engineering and physics.
70
00:04:10,520 --> 00:04:12,559
And I've been working
on this for, well,
71
00:04:12,560 --> 00:04:16,079
almost two decades now.
72
00:04:16,080 --> 00:04:21,479
[Narrator] Randal knows that he
can't realize his vision alone.
73
00:04:21,480 --> 00:04:25,880
The challenge is too great,
the questions too complex.
74
00:04:28,800 --> 00:04:33,359
But if Randal should succeed,
75
00:04:33,360 --> 00:04:34,920
he would forever
change human existence.
76
00:04:38,400 --> 00:04:42,199
A mind that exists in a machine
77
00:04:42,200 --> 00:04:44,639
lives differently than
one in a mortal body.
78
00:04:44,640 --> 00:04:47,799
Randal wants to bring the
most capable scientists together
79
00:04:47,800 --> 00:04:50,081
to create a kind of handbook
for his controversial vision.
80
00:04:50,840 --> 00:04:55,919
[Koene] The big question
81
00:04:55,920 --> 00:04:58,719
is: How do we do it? How do
you upload a mind into a machine
82
00:04:58,720 --> 00:05:00,679
since, after all, no
one's done this before
83
00:05:00,680 --> 00:05:02,119
and no one really
understands what the mind is
84
00:05:02,120 --> 00:05:03,560
or how it works.
85
00:05:06,560 --> 00:05:08,359
[Narrator] Randal still
has a long way to go.
86
00:05:08,360 --> 00:05:11,079
But he's doing
everything possible
87
00:05:11,080 --> 00:05:11,920
to ensure that his
ambitious project
88
00:05:11,921 --> 00:05:13,200
does not hit a dead end.
89
00:05:17,040 --> 00:05:20,199
In the process, Randal relies
on cutting-edge technology
90
00:05:20,200 --> 00:05:23,479
and the very latest research.
91
00:05:23,480 --> 00:05:28,039
We do know that the human brain
92
00:05:28,040 --> 00:05:29,559
is made up of around
100 billion neurons
93
00:05:29,560 --> 00:05:33,359
arranged along a pathway
94
00:05:33,360 --> 00:05:34,999
that stretches nearly
six million kilometres.
95
00:05:35,000 --> 00:05:38,119
Should we mess
with that complexity?
96
00:05:38,120 --> 00:05:42,239
On the road to the human
ghost in the machine,
97
00:05:42,240 --> 00:05:45,679
fundamental questions arise.
98
00:05:45,680 --> 00:05:47,879
[Koene] Are you still yourself
if you have perfect memory?
99
00:05:47,880 --> 00:05:51,039
Is that still human?
100
00:05:51,040 --> 00:05:52,559
Is it still you if you
have new senses,
101
00:05:52,560 --> 00:05:54,960
and you can see in the X-ray
spectrum or something like that?
102
00:05:58,960 --> 00:06:04,279
You have to ask yourself: What
does this all mean? And to me,
103
00:06:04,280 --> 00:06:07,439
many of these things are unanswerables at this
moment. It's an exploration rather than a certainty.
104
00:06:07,440 --> 00:06:09,480
[Narrator] Machines are
already becoming more human.
105
00:06:17,080 --> 00:06:20,480
[Playing rock music]
106
00:06:39,600 --> 00:06:43,000
[Man] Whoo, yeah!
107
00:06:46,840 --> 00:06:48,759
[Narrator] At Oregon
State University,
108
00:06:48,760 --> 00:06:50,799
a research team
is working on Atrias,
109
00:06:50,800 --> 00:06:52,799
a robot that can walk and
run like a human being,
110
00:06:52,800 --> 00:06:56,879
and much more.
111
00:06:56,880 --> 00:06:58,576
This robot can also detect
unexpected obstacles.
112
00:06:58,600 --> 00:07:04,199
We humans don't have
much of a problem
113
00:07:04,200 --> 00:07:08,119
handling uneven terrain.
But robots have a harder time.
114
00:07:08,120 --> 00:07:10,999
Here we go.
115
00:07:11,000 --> 00:07:11,840
[Narrator] Teaching
machines to walk on two legs
116
00:07:11,841 --> 00:07:14,039
has always been a challenge.
117
00:07:14,040 --> 00:07:15,760
[Woman] Ooh, nice!
118
00:07:19,520 --> 00:07:22,999
[Narrator] Now for
the ultimate test.
119
00:07:23,000 --> 00:07:24,520
Atrias will compete
with a human being.
120
00:07:28,800 --> 00:07:31,120
The developers are confident
their robot is up to the task.
121
00:07:33,920 --> 00:07:34,520
I'm ready.
122
00:07:34,521 --> 00:07:38,559
You need these.
123
00:07:38,560 --> 00:07:41,840
Oh, man. [Narrator] Its opponent gets a handicap
to enable a more balanced comparison.
124
00:07:43,000 --> 00:07:44,159
Take your marks.
125
00:07:44,160 --> 00:07:46,279
Get set.
126
00:07:46,280 --> 00:07:47,560
[Starter pistol banging]
127
00:07:58,960 --> 00:08:03,439
[Whooping]
128
00:08:03,440 --> 00:08:05,439
Okay, okay, well,
congratulations.
129
00:08:05,440 --> 00:08:07,839
[Woman] Thank you.
130
00:08:07,840 --> 00:08:10,799
I have to admit you
won the race. Yes.
131
00:08:10,800 --> 00:08:12,479
You did have a few advantages,
though. I saw you using your arms.
132
00:08:12,480 --> 00:08:15,159
Mmm-hmm.
133
00:08:15,160 --> 00:08:18,319
And you were looking everywhere
with your eyes. But that's okay.
134
00:08:18,320 --> 00:08:21,439
[Woman] Yeah. Oh, man.
[Narrator] The researchers
135
00:08:21,440 --> 00:08:23,416
launch a second attempt, this time
limiting the runner's freedom of movement.
136
00:08:23,440 --> 00:08:27,759
Are you ready?
137
00:08:27,760 --> 00:08:29,919
Yes, I'm ready.
138
00:08:29,920 --> 00:08:32,679
[Narrator] And taking away
her sight. This is so hard.
139
00:08:32,680 --> 00:08:34,479
Oh, my gosh.
140
00:08:34,480 --> 00:08:36,959
Ooh.
141
00:08:36,960 --> 00:08:39,239
[Man] Okay, you're
almost to the end, actually,
142
00:08:39,240 --> 00:08:39,840
if you just want to keep going. [Woman]
Seriously? I've almost made it to the end?
143
00:08:39,841 --> 00:08:40,959
[Man] Uh-huh.
144
00:08:40,960 --> 00:08:41,840
Oh, this is...
145
00:08:41,841 --> 00:08:43,479
It's such a workout, man.
146
00:08:43,480 --> 00:08:45,639
You always have to keep
moving and balancing.
147
00:08:45,640 --> 00:08:46,560
No wonder Atrias's
motors got so hot.
148
00:08:46,561 --> 00:08:49,119
Yeah, and there you are.
149
00:08:49,120 --> 00:08:51,759
Okay.
150
00:08:51,760 --> 00:08:53,479
Yay!
151
00:08:53,480 --> 00:08:55,279
Okay, wait, let me bring
you a chair. Stay there.
152
00:08:55,280 --> 00:08:57,519
Can I undo my arms?
153
00:08:57,520 --> 00:08:59,959
Wait... okay. Move back.
154
00:08:59,960 --> 00:09:01,079
There you go.
155
00:09:01,080 --> 00:09:01,640
There you are.
156
00:09:01,641 --> 00:09:02,919
Ah!
157
00:09:02,920 --> 00:09:05,879
[Man] Now you can see.
158
00:09:05,880 --> 00:09:09,200
That was so much harder than I
expected it would be. Oh, my gosh.
159
00:09:12,560 --> 00:09:16,679
[Narrator] Humans
are still one step ahead.
160
00:09:16,680 --> 00:09:18,040
But Atrias is
constantly learning.
161
00:09:22,200 --> 00:09:24,559
The project is a small step
162
00:09:24,560 --> 00:09:26,800
towards the ambitious goal of
making machines more human.
163
00:09:30,800 --> 00:09:32,679
Brain researcher Randal Koene
spends a lot of time on the road
164
00:09:32,680 --> 00:09:38,799
searching for answers.
165
00:09:38,800 --> 00:09:42,959
His ultimate goal is to upload the human brain into a
machine. In Silicon Valley, he's found like-minded people,
166
00:09:42,960 --> 00:09:44,920
and extraordinary inventions.
167
00:09:49,240 --> 00:09:53,839
In a small town in
northern California,
168
00:09:53,840 --> 00:09:55,999
a developer is working on one
of the food industry's dreams:
169
00:09:56,000 --> 00:09:59,439
A machine that can
automatically determine
170
00:09:59,440 --> 00:10:01,839
whether a potato is good enough
171
00:10:01,840 --> 00:10:03,240
to be accepted into
the marketplace.
172
00:10:07,640 --> 00:10:10,120
Randal is here to find out more
about this intelligent machine.
173
00:10:11,080 --> 00:10:15,199
How's the potato sorting?
174
00:10:15,200 --> 00:10:16,279
Good, they're flagged...
175
00:10:16,280 --> 00:10:20,319
[Koene] Well...
176
00:10:20,320 --> 00:10:22,759
And we're using actually
this camera temporarily,
177
00:10:22,760 --> 00:10:26,239
but what we're going
to use is this one
178
00:10:26,240 --> 00:10:29,719
which has 1,000 neurons.
179
00:10:29,720 --> 00:10:32,479
So the brain that's going to do the sorting
is going to be inside the camera itself?
180
00:10:32,480 --> 00:10:34,599
Yeah, we'll have to
put three, three of them
181
00:10:34,600 --> 00:10:35,999
in order to be able to
inspect the whole potatoes,
182
00:10:36,000 --> 00:10:40,159
while it's flying.
183
00:10:40,160 --> 00:10:41,879
You've got potatoes
flying out of the end,
184
00:10:41,880 --> 00:10:43,199
and you're inspecting
them as they're in the air.
185
00:10:43,200 --> 00:10:45,599
Yeah, and basically,
what we're going to do
186
00:10:45,600 --> 00:10:46,959
is to inspect the texture and
the colour, and also the shape,
187
00:10:46,960 --> 00:10:50,599
to make sure that
the shape is regular.
188
00:10:50,600 --> 00:10:53,439
Because sometimes when we
get the potatoes out of the ground,
189
00:10:53,440 --> 00:10:57,079
it's pretty nasty shape,
190
00:10:57,080 --> 00:10:58,880
but this one, this one
is actually quite nice.
191
00:11:04,760 --> 00:11:08,799
[Narrator] One way to
build intelligent machines
192
00:11:08,800 --> 00:11:10,559
is to train them to observe,
understand, remember,
193
00:11:10,560 --> 00:11:14,359
and to learn from the process.
194
00:11:14,360 --> 00:11:18,719
This little computer
is learning to read.
195
00:11:18,720 --> 00:11:21,719
It's shown letters
and registers their shapes.
196
00:11:21,720 --> 00:11:26,159
Once the system has
learned to recognize the letters,
197
00:11:26,160 --> 00:11:28,360
the ability to read accurately
is the next logical step.
198
00:11:30,000 --> 00:11:33,320
[Automated male voice]
Initiating learning sequence.
199
00:11:37,880 --> 00:11:41,359
[Automated female voice] "M".
200
00:11:41,360 --> 00:11:43,000
"N".
201
00:11:45,160 --> 00:11:46,880
"A".
202
00:11:49,840 --> 00:11:51,440
"S".
203
00:11:52,680 --> 00:11:55,680
"B".
204
00:11:57,920 --> 00:11:59,520
"R".
205
00:12:01,960 --> 00:12:04,279
[Automated male voice] "This...
206
00:12:04,280 --> 00:12:06,119
Is...
207
00:12:06,120 --> 00:12:08,439
My...
208
00:12:08,440 --> 00:12:11,679
Brain..."
209
00:12:11,680 --> 00:12:13,400
"This is my brain."
210
00:12:16,760 --> 00:12:17,959
[Narrator] Guy
Paillet's invention
211
00:12:17,960 --> 00:12:19,439
performs even
more complex skills.
212
00:12:19,440 --> 00:12:22,959
I just wanted to
show you very quickly,
213
00:12:22,960 --> 00:12:25,016
the way we learn, and we
solve the confusion problem,
214
00:12:25,040 --> 00:12:29,079
like learning the duck.
215
00:12:29,080 --> 00:12:32,119
[Koene] Yeah.
216
00:12:32,120 --> 00:12:34,320
And it will overgeneralize,
because if I go now to bird,
217
00:12:34,440 --> 00:12:41,039
you will see, if it's a bird,
218
00:12:41,040 --> 00:12:44,759
it will say what's a bird is a duck. [Koene] Oh, I see it. It
thinks it's a duck, too, yeah. So I'm going to say, "okay, this
219
00:12:44,760 --> 00:12:46,359
is actually exactly what
makes human learning,
220
00:12:46,360 --> 00:12:48,679
neural learning, different
from the machine learning,
221
00:12:48,680 --> 00:12:51,519
because if you've never seen
222
00:12:51,520 --> 00:12:53,039
either a duck or a
regular bird before,
223
00:12:53,040 --> 00:12:55,599
if you see one
today and you see,
224
00:12:55,600 --> 00:12:57,719
say, a duck tomorrow,
225
00:12:57,720 --> 00:12:59,839
you would think, "oh,
there is another bird."
226
00:12:59,840 --> 00:13:01,599
Another one of those
new things I just saw.
227
00:13:01,600 --> 00:13:02,719
[Paillet] Yeah, it's
a subcategorization.
228
00:13:02,720 --> 00:13:04,399
So here we are just showing,
229
00:13:04,400 --> 00:13:07,759
we have only two neurons
230
00:13:07,760 --> 00:13:10,839
and it can already quite make a
distinction between the duck and the bird.
231
00:13:10,840 --> 00:13:14,199
[Koene] Mmm-hmm,
that's really quick,
232
00:13:14,200 --> 00:13:15,999
that it just learned that
right there. [Paillet] Yeah.
233
00:13:16,000 --> 00:13:18,839
[Koene] That is
something you wouldn't see
234
00:13:18,840 --> 00:13:21,039
from traditional machine learning. [Narrator] Guy
Paillet's special little chip can not only record,
235
00:13:21,040 --> 00:13:23,999
recognize, and
differentiate things.
236
00:13:24,000 --> 00:13:26,879
[Paillet] Now it's forgetting.
237
00:13:26,880 --> 00:13:28,479
[Narrator] It can also
forget its own knowledge.
238
00:13:28,480 --> 00:13:32,679
And forgetting is
a very human trait.
239
00:13:32,680 --> 00:13:37,279
So this is basically the forget.
240
00:13:37,280 --> 00:13:41,199
And one way of actually
making it more constant
241
00:13:41,200 --> 00:13:43,879
is to force it to make
a second neuron.
242
00:13:43,880 --> 00:13:46,319
So the chip can
recognize by itself
243
00:13:46,320 --> 00:13:48,039
when it sees something new
244
00:13:48,040 --> 00:13:49,040
and needs a new neuron for that,
245
00:13:49,041 --> 00:13:50,999
and can do forgetting,
246
00:13:51,000 --> 00:13:51,920
and it does rapid
real-time learning
247
00:13:51,921 --> 00:13:54,519
and real-time recognition,
248
00:13:54,520 --> 00:13:56,480
which is the same sort of
thing you see in humans.
249
00:14:01,040 --> 00:14:02,639
[Paillet] This technology
has more potential
250
00:14:02,640 --> 00:14:06,359
to become instrumental
for what Randal wants to do.
251
00:14:06,360 --> 00:14:09,279
Because it does not
need to be programmed,
252
00:14:09,280 --> 00:14:12,239
it just has to be trained.
253
00:14:12,240 --> 00:14:14,359
We are at the very
beginning right now.
254
00:14:14,360 --> 00:14:16,119
So this is one of the
things that we are doing with
255
00:14:16,120 --> 00:14:19,479
this tower that we
call the neurostack,
256
00:14:19,480 --> 00:14:21,919
in order to try to see if we
can start a chain reaction,
257
00:14:21,920 --> 00:14:24,280
like when we are
thinking in our brain.
258
00:14:32,120 --> 00:14:36,079
[Narrator] On the
other side of the world,
259
00:14:36,080 --> 00:14:38,639
in Hong Kong,
260
00:14:38,640 --> 00:14:40,039
a pioneer of
artificial intelligence
261
00:14:40,040 --> 00:14:41,759
is on his way to see some
of the latest developments
262
00:14:41,760 --> 00:14:45,879
in his field.
263
00:14:45,880 --> 00:14:46,240
He has grave concerns about
where the science is heading.
264
00:14:46,760 --> 00:14:51,999
My name is Hugo de Garis.
265
00:14:52,000 --> 00:14:54,639
I was a professor
of artificial brains,
266
00:14:54,640 --> 00:14:57,879
which is a field I've
largely pioneered.
267
00:14:57,880 --> 00:15:02,239
[Goertzel] I first became
aware of Hugo's work
268
00:15:02,240 --> 00:15:04,519
in the late 1980s
and early 1990s.
269
00:15:04,520 --> 00:15:06,520
I started reading articles
about some mad scientist
270
00:15:07,080 --> 00:15:12,839
working in Japan
271
00:15:12,840 --> 00:15:14,919
using an artificial evolutionary process to learn
neural nets simulating small brain circuits,
272
00:15:14,920 --> 00:15:18,719
and the aim was to do
that to control robots.
273
00:15:18,720 --> 00:15:22,039
And I thought, "well,
that's pretty cool."
274
00:15:22,040 --> 00:15:24,839
[De Garis] But I'm
also rather political,
275
00:15:24,840 --> 00:15:26,199
so I was thinking about the
longer-term consequences
276
00:15:26,200 --> 00:15:28,079
and came to the very
pessimistic, negative conclusion
277
00:15:28,080 --> 00:15:31,719
that there may eventually
be a species-dominance war
278
00:15:31,720 --> 00:15:35,839
over this issue of whether
humanity should build
279
00:15:35,840 --> 00:15:37,680
these god-like, massively
intelligent machines.
280
00:15:44,920 --> 00:15:48,280
Maybe they will look upon
human beings as negligible pests.
281
00:15:52,240 --> 00:15:56,439
"They're like a cancer
on the surface of the earth!
282
00:15:56,440 --> 00:15:58,639
They're polluting the earth!"
283
00:15:58,640 --> 00:16:01,079
Or maybe these
machines will say,
284
00:16:01,080 --> 00:16:02,839
"well, let's get
rid of the oxygen,
285
00:16:02,840 --> 00:16:04,240
because it's
rusting our circuitry."
286
00:16:07,680 --> 00:16:11,799
[Harris] There is no
reason in principle
287
00:16:11,800 --> 00:16:13,159
that information processing
has to take place in meat,
288
00:16:13,160 --> 00:16:17,319
you know, in the wetware
of our biological brains.
289
00:16:17,320 --> 00:16:20,959
That gets you halfway to
expecting that at some point
290
00:16:20,960 --> 00:16:24,199
we will build machines that are
more intelligent than ourselves.
291
00:16:24,200 --> 00:16:28,239
It's a fundamental game-changer.
292
00:16:28,240 --> 00:16:29,080
Superintelligence will
be the last invention
293
00:16:29,081 --> 00:16:31,639
that humans ever need to make.
294
00:16:31,640 --> 00:16:33,696
After you have superintelligent
scientists and inventors,
295
00:16:33,720 --> 00:16:39,799
programmers,
296
00:16:39,800 --> 00:16:43,559
then further advances will be made by this machine
superintelligence, and at digital timescales
297
00:16:43,560 --> 00:16:44,880
rather than
biological timescales.
298
00:16:47,360 --> 00:16:48,679
[Mandeep] Hi, Robosapiens.
299
00:16:48,680 --> 00:16:51,839
What are you doing?
300
00:16:51,840 --> 00:16:56,279
[Robosapiens] Why, Mandeep,
301
00:16:56,280 --> 00:16:59,999
I am just looking around.
[Mandeep] I am trying to talk to you.
302
00:17:00,000 --> 00:17:02,519
Where are you now?
303
00:17:02,520 --> 00:17:04,839
[Robosapiens] I think
I am in Hong Kong,
304
00:17:04,840 --> 00:17:06,839
but sometimes I wonder what...
305
00:17:06,840 --> 00:17:09,079
They're usually
hard to interrupt.
306
00:17:09,080 --> 00:17:11,679
I don't understand you.
307
00:17:11,680 --> 00:17:16,039
[Robosapiens] I
am listening to you,
308
00:17:16,040 --> 00:17:17,919
but I can't understand
everything a human can... Yet.
309
00:17:17,920 --> 00:17:20,559
For me, the question isn't:
Do we create A.I.s or not?
310
00:17:20,560 --> 00:17:25,239
They're coming.
311
00:17:25,240 --> 00:17:26,999
The question is: What can I
do to affect how it happens?
312
00:17:27,000 --> 00:17:31,359
[Mandeep] Track!
313
00:17:31,360 --> 00:17:32,160
[Robosapiens] Tracking friend.
314
00:17:32,161 --> 00:17:35,879
[Mandeep] Identify.
315
00:17:35,880 --> 00:17:37,839
[Robosapiens] I
think that's a book.
316
00:17:37,840 --> 00:17:41,159
I would rather see A.I.
317
00:17:41,160 --> 00:17:42,759
develop sort of by the
people, for the people,
318
00:17:42,760 --> 00:17:46,159
than develop within
a large corporation
319
00:17:46,160 --> 00:17:48,039
or within a government's
military organization.
320
00:17:48,040 --> 00:17:51,399
A.I. is under
rapid-fire development.
321
00:17:51,400 --> 00:17:56,039
[Robosapiens] I am a robot and I
am only beginning to understand.
322
00:17:56,040 --> 00:17:59,319
[Schneider] This is the
beginning of a new era
323
00:17:59,320 --> 00:18:00,679
in which we train machines
324
00:18:00,680 --> 00:18:02,959
in ways that are roughly similar
325
00:18:02,960 --> 00:18:05,159
to the way that
perhaps children learn,
326
00:18:05,160 --> 00:18:09,119
through brain-like techniques,
327
00:18:09,120 --> 00:18:11,480
multi-layered neural networks
that are capable of intuition.
328
00:18:12,840 --> 00:18:16,719
[Robosapiens] ...Molecules
is a chemical element
329
00:18:16,720 --> 00:18:20,239
which can act...
330
00:18:20,240 --> 00:18:22,799
So it is going to be difficult very soon
to even know what the machine is doing,
331
00:18:22,800 --> 00:18:26,319
to make sense.
332
00:18:26,320 --> 00:18:27,919
A.I. is going to be
very sophisticated
333
00:18:27,920 --> 00:18:31,319
far sooner than we think.
334
00:18:31,320 --> 00:18:35,919
I mean, we have neurons
335
00:18:35,920 --> 00:18:38,159
that can fire maybe 100 times per second
or so. Whereas even current-day transistors
336
00:18:38,160 --> 00:18:40,599
operate at gigahertz levels,
337
00:18:40,600 --> 00:18:41,999
billions of times per second.
338
00:18:42,000 --> 00:18:44,719
If you look at how fast
339
00:18:44,720 --> 00:18:46,439
signals can travel
in biological neurons,
340
00:18:46,440 --> 00:18:48,799
100 metres per second
really is maximum speed,
341
00:18:48,800 --> 00:18:54,719
whereas in computers,
342
00:18:54,720 --> 00:18:57,799
signals can travel at the speed of light. Ultimately, the limits
of computation in the machine substrate are just far outside
343
00:18:57,800 --> 00:18:59,039
what is possible with
a biological construction.
344
00:18:59,040 --> 00:19:04,519
[Harris] A week's worth of progress
345
00:19:04,520 --> 00:19:05,799
in a machine running
a million times faster
346
00:19:05,800 --> 00:19:08,719
than a human brain.
347
00:19:08,720 --> 00:19:09,959
That's 20,000 years
of human progress.
348
00:19:09,960 --> 00:19:11,520
So it's the idea that
the minds involved
349
00:19:14,320 --> 00:19:18,759
will happily take direction
from us at that point,
350
00:19:18,760 --> 00:19:21,080
that may be a
little far-fetched.
351
00:19:22,680 --> 00:19:27,359
[Narrator] Skeptics
warn of the consequences
352
00:19:27,360 --> 00:19:29,160
of continued unchecked
technological progress.
353
00:19:30,800 --> 00:19:34,160
Do we build gods or do we
build our potential exterminators?
354
00:19:37,560 --> 00:19:42,239
That I see
355
00:19:42,240 --> 00:19:48,359
as the big question
of the 21st century.
356
00:19:48,360 --> 00:19:50,239
[Narrator] But development
is not a one-way street.
357
00:19:50,240 --> 00:19:52,519
Randal Koene wants to
upload human consciousness
358
00:19:52,520 --> 00:19:56,079
into a computer.
359
00:19:56,080 --> 00:19:57,439
This would likely endow
the ghost in the machine
360
00:19:57,440 --> 00:20:00,799
with human qualities.
361
00:20:00,800 --> 00:20:03,599
[Koene] People often ask
me what it would be like
362
00:20:03,600 --> 00:20:05,239
if I uploaded my mind or
if they uploaded their mind.
363
00:20:05,240 --> 00:20:08,319
What is this going
to be like for me,
364
00:20:08,320 --> 00:20:09,599
to be living in a
different kind of body,
365
00:20:09,600 --> 00:20:12,839
having an emulated brain?
366
00:20:12,840 --> 00:20:15,999
To be honest, we
can't really know
367
00:20:16,000 --> 00:20:18,199
because there are so
many things involved there.
368
00:20:18,200 --> 00:20:22,799
You can't predict
it all in advance.
369
00:20:22,800 --> 00:20:24,479
All we can do is make
small iterative steps,
370
00:20:24,480 --> 00:20:26,480
and at every iteration,
try to learn more about it.
371
00:20:34,240 --> 00:20:38,759
I think it's really important
372
00:20:38,760 --> 00:20:40,479
for the long-term
future of humanity.
373
00:20:40,480 --> 00:20:42,679
It's important for
not just our survival,
374
00:20:42,680 --> 00:20:45,639
our survival in the
sense that we survive
375
00:20:45,640 --> 00:20:47,479
the next big meteorite impact.
376
00:20:47,480 --> 00:20:50,679
But it's also important
for our meaningful survival.
377
00:20:50,680 --> 00:20:54,799
So the ability to continue
to develop ourselves
378
00:20:54,800 --> 00:20:57,119
and to be able to reach beyond what
we can even understand and grasp
379
00:20:57,120 --> 00:20:59,640
within this little niche
that we evolved into.
380
00:21:03,920 --> 00:21:06,800
[Narrator] Randal is on his way
to a meeting in San Francisco.
381
00:21:12,120 --> 00:21:15,359
He has an appointment
with a researcher
382
00:21:15,360 --> 00:21:16,799
he has wanted to
meet for a long time.
383
00:21:16,800 --> 00:21:21,759
On a small scale,
384
00:21:21,760 --> 00:21:25,440
Stephen Larson has managed to achieve what Randal
might be working on for the rest of his life.
385
00:21:28,760 --> 00:21:30,719
In the midst of day-to-day life, a
new life form will soon be revealed.
386
00:21:30,720 --> 00:21:34,519
Threadworms are nasty parasites.
387
00:21:34,520 --> 00:21:36,240
But a digital one is
something to celebrate.
388
00:21:41,360 --> 00:21:43,799
[Koene] Stephen?
389
00:21:43,800 --> 00:21:44,959
How you doing? Can I get you
something? Good to meet you.
390
00:21:44,960 --> 00:21:45,400
No, I'm good.
391
00:21:45,401 --> 00:21:47,439
Yeah?
392
00:21:47,440 --> 00:21:50,479
So I can't believe that we haven't
met before. I mean, given OpenWorm
393
00:21:50,480 --> 00:21:52,479
and emulating the
worm, basically.
394
00:21:52,480 --> 00:21:54,999
Yes, OpenWorm is a project
395
00:21:55,000 --> 00:21:57,119
where we're building the first
digital organism in a computer.
396
00:21:57,120 --> 00:22:00,479
It's an open science project.
397
00:22:00,480 --> 00:22:01,759
So that means
that we collaborate
398
00:22:01,760 --> 00:22:03,519
with people on the Internet.
399
00:22:03,520 --> 00:22:05,439
Okay, so let me show you
actually the simulated part,
400
00:22:05,440 --> 00:22:07,079
since we were
talking about that.
401
00:22:07,080 --> 00:22:10,319
So this is how it looks
402
00:22:10,320 --> 00:22:13,839
when it's not the beautiful rendering.
So this is actually the output of...
403
00:22:13,840 --> 00:22:16,319
[Koene] This is
beautiful, actually.
404
00:22:16,320 --> 00:22:17,479
[Larson] Well, thank you, yes.
405
00:22:17,480 --> 00:22:18,360
So...
406
00:22:18,361 --> 00:22:19,679
[Narrator] The artificial worm
407
00:22:19,680 --> 00:22:20,680
seems to move in a
completely natural way,
408
00:22:20,681 --> 00:22:24,879
just like a real threadworm.
409
00:22:24,880 --> 00:22:25,800
[Koene] ...To make it
as realistic as possible.
410
00:22:25,801 --> 00:22:27,839
And it responds, right?
411
00:22:27,840 --> 00:22:28,999
It responds to things.
412
00:22:29,000 --> 00:22:31,159
If somebody looked at that,
413
00:22:31,160 --> 00:22:32,399
they would see something that
looks a bit like a real creature,
414
00:22:32,400 --> 00:22:34,679
or at least it seems
like a real creature
415
00:22:34,680 --> 00:22:37,039
in some ways, right?
416
00:22:37,040 --> 00:22:38,479
So if they asked
you a question like,
417
00:22:38,480 --> 00:22:39,200
"oh, is this thing
in your computer,
418
00:22:39,201 --> 00:22:41,159
is this alive?
419
00:22:41,160 --> 00:22:42,679
And can it feel pain?"
420
00:22:42,680 --> 00:22:44,879
What would you say to that?
421
00:22:44,880 --> 00:22:46,599
Yeah, I think right
now, obviously,
422
00:22:46,600 --> 00:22:48,256
our definition of life is
purely in physical matter.
423
00:22:48,280 --> 00:22:52,959
So we've...
424
00:22:52,960 --> 00:22:54,999
We would have a hard
time defining anything,
425
00:22:55,000 --> 00:22:57,599
I think, that's in a computer
426
00:22:57,600 --> 00:22:59,279
as anything more than sort
of a shadow of life, right?
427
00:22:59,280 --> 00:23:04,159
What happens if you can do
this for a lot of different parts,
428
00:23:04,160 --> 00:23:05,279
and you end up with basically an
entire organism inside of a machine,
429
00:23:05,280 --> 00:23:09,039
life in a machine?
430
00:23:09,040 --> 00:23:11,039
Yeah.
431
00:23:11,040 --> 00:23:13,599
You know, what do you think?
432
00:23:13,600 --> 00:23:15,079
Yeah, okay, imagine we open up your skull, and
we go and we find one neuron in your brain,
433
00:23:15,080 --> 00:23:17,959
and then we
replace it, we plug in,
434
00:23:17,960 --> 00:23:19,719
in the place where that
neuron is connected,
435
00:23:19,720 --> 00:23:20,799
we plug our computer, and
now that computer is doing
436
00:23:20,800 --> 00:23:23,359
what that one neuron can do.
437
00:23:23,360 --> 00:23:24,879
And you know, are
you a robot at that point?
438
00:23:24,880 --> 00:23:27,879
You'd say, "no, it's
just one neuron," right?
439
00:23:27,880 --> 00:23:29,839
But now let's say I go and I
take a second neuron, right?
440
00:23:29,840 --> 00:23:32,479
And I replace it.
441
00:23:32,480 --> 00:23:32,840
And now am I a machine yet?
442
00:23:32,841 --> 00:23:35,679
No.
443
00:23:35,680 --> 00:23:38,479
Okay, third, fourth, fifth. So the
whole idea of this thought experiment
444
00:23:38,480 --> 00:23:40,479
is that now when you go to like, an iteration, and you
say, well, "so how many neurons do I need to replace
445
00:23:40,480 --> 00:23:41,680
before you're no longer you?"
446
00:23:47,160 --> 00:23:51,159
[Narrator] Stephen's project
shakes the very foundations
447
00:23:51,160 --> 00:23:54,559
of our self-conception,
448
00:23:54,560 --> 00:23:56,159
who we are, and
what makes us human.
449
00:23:56,160 --> 00:24:01,639
Now you've got this entire
human brain in your computer.
450
00:24:01,640 --> 00:24:03,039
So you have to wonder:
Is that a person?
451
00:24:03,040 --> 00:24:09,599
Right.
452
00:24:09,600 --> 00:24:10,639
[Koene] And how do you treat this brain in your computer?
That leads you then to the whole question of mind uploading,
453
00:24:10,640 --> 00:24:14,039
you know?
454
00:24:14,040 --> 00:24:15,839
Yeah.
455
00:24:15,840 --> 00:24:16,720
This is very close.
This is this is basically it.
456
00:24:16,721 --> 00:24:19,999
Right.
457
00:24:20,000 --> 00:24:22,519
And would someone volunteer
for that? Would you volunteer for it?
458
00:24:22,520 --> 00:24:23,360
Would you want
to do it yourself?
459
00:24:23,361 --> 00:24:25,639
Right, so I guess...
460
00:24:25,640 --> 00:24:27,359
I mean, there's also
specific questions like:
461
00:24:27,360 --> 00:24:30,039
Do you have to chop up my
brain in order to upload me?
462
00:24:30,040 --> 00:24:32,479
Because then I
don't have it anymore
463
00:24:32,480 --> 00:24:33,759
and I can't really go back.
464
00:24:33,760 --> 00:24:35,519
So I'm not so eager.
465
00:24:35,520 --> 00:24:36,559
There's no undo button
then, that's the thing.
466
00:24:36,560 --> 00:24:39,239
Right!
467
00:24:39,240 --> 00:24:43,639
So I'm not yet so eager to,
you know, jump in and sign up.
468
00:24:43,640 --> 00:24:45,639
But there's all sorts of other, you know, interesting
questions that I'm sure you're exploring,
469
00:24:45,640 --> 00:24:47,000
like, "okay, so how secure
is that computer?"
470
00:24:51,040 --> 00:24:52,559
[Narrator] An important
problem to address
471
00:24:52,560 --> 00:24:55,119
as we're confronted
by an alarming increase
472
00:24:55,120 --> 00:24:57,719
in the threat of computer
hacking and cyber crime.
473
00:24:57,720 --> 00:25:02,599
Randal, however,
is not deterred.
474
00:25:02,600 --> 00:25:06,719
[Koene] If I want to be
able to upload my mind
475
00:25:06,720 --> 00:25:10,519
or upload your mind,
476
00:25:10,520 --> 00:25:12,079
if we want to be able to do
whole brain emulation as such,
477
00:25:12,080 --> 00:25:15,159
it means that we need to
be able to get really close
478
00:25:15,160 --> 00:25:17,359
to the activity that's
going on inside the brain.
479
00:25:17,360 --> 00:25:20,119
We need to be
able to register it,
480
00:25:20,120 --> 00:25:21,919
characterize what's
going on in a circuit.
481
00:25:21,920 --> 00:25:24,079
That's the way that people
make neural prostheses,
482
00:25:24,080 --> 00:25:26,559
that's the way that
they can create a replica
483
00:25:26,560 --> 00:25:30,359
of what's going on.
484
00:25:30,360 --> 00:25:32,439
But right now, we still have a long way to go, because
most of the interface technology that's available
485
00:25:32,440 --> 00:25:35,559
is very primitive,
486
00:25:35,560 --> 00:25:36,759
and can only barely
record some information
487
00:25:36,760 --> 00:25:39,479
or send some
signals to the brain.
488
00:25:39,480 --> 00:25:42,119
[Narrator] The Blue
Brain Project in Geneva
489
00:25:42,120 --> 00:25:44,039
is decoding the
complexities of the brain.
490
00:25:44,040 --> 00:25:47,199
[Walker] We understand
a lot about the cosmos.
491
00:25:47,200 --> 00:25:48,519
We understand huge
amounts of physics,
492
00:25:48,520 --> 00:25:51,759
we understand chemistry.
493
00:25:51,760 --> 00:25:53,399
But why I am here,
capable of talking to you,
494
00:25:53,400 --> 00:25:56,239
why I can understand
your questions,
495
00:25:56,240 --> 00:25:59,159
why I can answer them,
496
00:25:59,160 --> 00:26:00,439
why I may feel an
emotion while I'm doing so,
497
00:26:00,440 --> 00:26:02,200
this, today, we seriously
do not understand.
498
00:26:04,000 --> 00:26:08,559
And I think this is a
great human endeavour.
499
00:26:08,560 --> 00:26:10,519
And it's what
motivates most of us.
500
00:26:10,520 --> 00:26:15,479
[Narrator] Richard Walker
heads a team of researchers
501
00:26:15,480 --> 00:26:17,199
who are taking
the first serious steps
502
00:26:17,200 --> 00:26:18,720
required to answer
those questions.
503
00:26:25,200 --> 00:26:28,799
They're using
the brains of rats,
504
00:26:28,800 --> 00:26:30,919
segmenting them
into ultra-thin slices
505
00:26:30,920 --> 00:26:33,679
and recording their
measurements.
506
00:26:33,680 --> 00:26:37,439
From these, a complex
model of the rat brain
507
00:26:37,440 --> 00:26:40,679
is produced in the computer,
508
00:26:40,680 --> 00:26:42,839
a kind of three-dimensional map
509
00:26:42,840 --> 00:26:44,920
with all the natural
biological links and structures.
510
00:26:49,680 --> 00:26:52,439
The important point is
511
00:26:52,440 --> 00:26:54,799
it doesn't really matter what brain you're
talking about, as long as it's a mammal brain.
512
00:26:54,800 --> 00:26:58,119
So you can do it for rats,
513
00:26:58,120 --> 00:26:59,399
you can do it for
different bits of the brain,
514
00:26:59,400 --> 00:27:01,959
you can do it for
rats of different ages.
515
00:27:01,960 --> 00:27:03,919
We can do it for other
species of animal, too,
516
00:27:03,920 --> 00:27:05,280
and we want to
end up with humans.
517
00:27:10,800 --> 00:27:11,839
[Narrator] Richard
Walker and the researchers
518
00:27:11,840 --> 00:27:13,479
have already moved forward...
519
00:27:13,480 --> 00:27:16,639
[Electric lock beeping]
520
00:27:16,640 --> 00:27:18,359
not least thanks to their
highly sophisticated laboratory.
521
00:27:18,360 --> 00:27:23,039
They analyze scans of a
mouse's brain on a gigantic screen.
522
00:27:23,040 --> 00:27:27,879
Each line represents
an individual neuron,
523
00:27:27,880 --> 00:27:31,119
a brain cell connected
to hundreds of other cells.
524
00:27:31,120 --> 00:27:34,479
Countless signals are sent
ceaselessly back and forth
525
00:27:34,480 --> 00:27:37,919
across this organic network,
526
00:27:37,920 --> 00:27:40,039
each of them stimulating
a specific reaction
527
00:27:40,040 --> 00:27:42,439
at their intended receiver.
528
00:27:42,440 --> 00:27:47,959
[Walker] This one here...
529
00:27:47,960 --> 00:27:50,599
[Narrator] Our thoughts are a result of
the complex interplay between these cells,
530
00:27:50,600 --> 00:27:52,000
as is, therefore,
our consciousness.
531
00:27:52,920 --> 00:27:57,759
The goal the
scientists are striving for
532
00:27:57,760 --> 00:28:00,439
is to crack the code behind
this evolutionary mechanism.
533
00:28:00,440 --> 00:28:03,839
[Man] To provide us some
solutions in terms of intelligence...
534
00:28:03,840 --> 00:28:07,119
When this is turned
into technology,
535
00:28:07,120 --> 00:28:08,679
I think it can become
a very powerful tool.
536
00:28:08,680 --> 00:28:11,759
And tools are not
all used for good.
537
00:28:11,760 --> 00:28:15,319
When we have our machine tools,
538
00:28:15,320 --> 00:28:18,759
we can use them to make wonderful surgical
instruments. We can use them to make vehicles.
539
00:28:18,760 --> 00:28:20,479
We can use them to make weapons.
540
00:28:20,480 --> 00:28:22,639
Some of those
applications are good,
541
00:28:22,640 --> 00:28:24,839
and some are bad.
542
00:28:24,840 --> 00:28:26,399
I think the ethics is not
the ethics of the basic work.
543
00:28:26,400 --> 00:28:30,599
I think we should
carry on doing that.
544
00:28:30,600 --> 00:28:33,279
Humans have always built tools.
545
00:28:33,280 --> 00:28:35,119
Humans have always
searched for basic knowledge.
546
00:28:35,120 --> 00:28:37,401
And I think it'd be catastrophic
if we stopped doing that.
547
00:28:40,880 --> 00:28:42,799
[Narrator] Our most
advanced technologies
548
00:28:42,800 --> 00:28:45,040
are being used to solve the
mysteries of the human brain.
549
00:28:49,560 --> 00:28:54,759
On the other hand,
550
00:28:54,760 --> 00:29:00,199
artificial intelligence is helping
to make machines more human.
551
00:29:00,200 --> 00:29:02,799
[Man] Can I ask you a question?
552
00:29:02,800 --> 00:29:06,199
Why is the sky blue?
553
00:29:06,200 --> 00:29:09,279
[Automated male voice]
The sky's blue colour
554
00:29:09,280 --> 00:29:11,479
is a result of the
earth's atmosphere
555
00:29:11,480 --> 00:29:14,199
scattering all the short
wavelength blue light
556
00:29:14,200 --> 00:29:17,999
shined on us by the sun.
557
00:29:18,000 --> 00:29:21,279
The human eye
perceives the colour blue
558
00:29:21,280 --> 00:29:23,360
when looking at
the sky as a result.
559
00:29:24,240 --> 00:29:29,399
[Narrator] Meet IBM's state-of-the-art
560
00:29:29,400 --> 00:29:32,519
A.I. system Watson.
561
00:29:32,520 --> 00:29:36,959
Watson is a
self-learning computer
562
00:29:36,960 --> 00:29:38,399
that can understand
human language.
563
00:29:38,400 --> 00:29:41,919
In 2011, Watson became famous
564
00:29:41,920 --> 00:29:44,479
when it defeated two human
opponents in a quiz show.
565
00:29:44,480 --> 00:29:47,120
It can even diagnose
medical problems.
566
00:29:52,960 --> 00:29:54,079
Doctors were able to use
Watson's incredible computing power
567
00:29:54,080 --> 00:29:57,999
to help identify the
genetic cause of disease
568
00:29:58,000 --> 00:30:01,399
in a cancer patient
and suggest a treatment.
569
00:30:01,400 --> 00:30:05,799
To do this, Watson scanned
3,500 medical studies
570
00:30:05,800 --> 00:30:08,240
and 400,000 additional
texts, all in a mere 17 seconds.
571
00:30:16,480 --> 00:30:19,719
Watson is also used
inside children's toys.
572
00:30:19,720 --> 00:30:23,359
Dolls and dinosaurs are no
longer limited in their speech.
573
00:30:23,360 --> 00:30:27,199
They're now capable of simple
yet complete conversations.
574
00:30:27,200 --> 00:30:31,759
[Meyerson] Would you
like to eat the centrepiece?
575
00:30:31,760 --> 00:30:34,119
[Toy's automated voice]
Well, it does look tasty,
576
00:30:34,120 --> 00:30:36,359
but I'm not really that hungry.
577
00:30:36,360 --> 00:30:39,519
[Meyerson] Oh, good lord.
578
00:30:39,520 --> 00:30:41,239
A picky dinosaur.
579
00:30:41,240 --> 00:30:43,879
It's actually not a toy.
580
00:30:43,880 --> 00:30:45,999
It's just disguised as a toy.
581
00:30:46,000 --> 00:30:48,999
Very cleverly wrapped
in green plastic
582
00:30:49,000 --> 00:30:51,239
and disguised as an
educational device.
583
00:30:51,240 --> 00:30:54,399
But really, it's a personality.
584
00:30:54,400 --> 00:30:57,679
It's a companion.
585
00:30:57,680 --> 00:31:01,999
It's friends and intellect all wrapped
up in one. The fact of the matter is,
586
00:31:02,000 --> 00:31:03,759
if you think about what Watson has accomplished,
people talk about it sometimes as A.I.,
587
00:31:03,760 --> 00:31:08,399
you know, artificial
intelligence.
588
00:31:08,400 --> 00:31:11,279
I actually prefer to think of it as accessible intelligence.
What I mean by that is really a revolutionary way
589
00:31:11,280 --> 00:31:13,119
of interacting with these
tremendous capabilities.
590
00:31:13,120 --> 00:31:15,919
You literally have access
591
00:31:15,920 --> 00:31:17,839
by simple direct
human-like interaction.
592
00:31:17,840 --> 00:31:19,799
You can say, "I'd like to
make something for dinner
593
00:31:19,800 --> 00:31:22,759
that's based on...
594
00:31:22,760 --> 00:31:26,759
I've got a particular fish in
mind, basa. What can I make?"
595
00:31:26,760 --> 00:31:29,959
And Chef Watson will simply come back
with an answer. That is at the human level.
596
00:31:29,960 --> 00:31:31,599
You're not a chief
information officer,
597
00:31:31,600 --> 00:31:35,039
you're somebody
standing there in the house,
598
00:31:35,040 --> 00:31:37,119
sitting there and getting really tired of bringing in
pizza. You want to access it, it's like accessing this.
599
00:31:37,120 --> 00:31:39,119
Ask a question.
600
00:31:39,120 --> 00:31:41,159
Not that hard.
601
00:31:41,160 --> 00:31:43,719
[Meyerson] Right,
go for it. Not that hard.
602
00:31:43,720 --> 00:31:46,679
[Narrator] The smart
toy is really something.
603
00:31:46,680 --> 00:31:49,639
[Sandberg] It's a very fun
game, I guess for the whole family,
604
00:31:49,640 --> 00:31:53,279
except that it also shows us
605
00:31:53,280 --> 00:31:54,759
we're inviting artificial
intelligence into our living room.
606
00:31:54,760 --> 00:31:57,359
And we're probably not
thinking very much about it,
607
00:31:57,360 --> 00:32:00,519
because it's just a toy.
608
00:32:00,520 --> 00:32:02,359
When something is a toy, we
tend to find that simple and benign,
609
00:32:02,360 --> 00:32:06,799
unless it's too
expensive or too violent.
610
00:32:06,800 --> 00:32:09,479
But what is
happening is, of course,
611
00:32:09,480 --> 00:32:11,599
with artificial intelligence,
612
00:32:11,600 --> 00:32:15,040
it's rapidly becoming a
part of our everyday life.
613
00:32:18,440 --> 00:32:22,479
[De Garis] Ask yourself:
614
00:32:22,480 --> 00:32:24,319
How much money
615
00:32:24,320 --> 00:32:25,359
would you be prepared to
spend to buy a home robot
616
00:32:25,360 --> 00:32:28,319
that was genuinely useful?
617
00:32:28,320 --> 00:32:30,199
Imagine it could babysit
the kids and wash the dishes
618
00:32:30,200 --> 00:32:32,679
and wash the clothes
and clean the house
619
00:32:32,680 --> 00:32:35,199
and entertain you
and [inaudible] you
620
00:32:35,200 --> 00:32:36,559
and educate you
and all those things?
621
00:32:36,560 --> 00:32:39,599
Obviously, a major industry
622
00:32:39,600 --> 00:32:41,536
worth trillions of dollars a
year worldwide in the 2020s.
623
00:32:41,560 --> 00:32:47,159
So you see that coming.
624
00:32:47,160 --> 00:32:51,359
And then, well, you can see the writing on
the wall pretty well. So if my left hand here
625
00:32:51,360 --> 00:32:53,159
represents human
intelligence level,
626
00:32:53,160 --> 00:32:56,399
and my right hand here,
627
00:32:56,400 --> 00:32:59,599
that represents
current, more or less,
628
00:32:59,600 --> 00:33:02,239
machine intelligence level,
629
00:33:02,240 --> 00:33:04,639
well, you can imagine
what's going to happen, right?
630
00:33:04,640 --> 00:33:06,519
Millions, if not
billions of people
631
00:33:06,520 --> 00:33:08,399
will see with their own eyes,
632
00:33:08,400 --> 00:33:10,519
they'll see that I.Q.
gap close, right?
633
00:33:10,520 --> 00:33:13,559
And then lots of people start
asking the obvious questions.
634
00:33:13,560 --> 00:33:16,839
Well, is humanity going
to allow these machines
635
00:33:16,840 --> 00:33:18,440
to become as intelligent
as human beings?
636
00:33:21,800 --> 00:33:23,359
[Narrator] At the Hong
Kong Polytechnic University,
637
00:33:23,360 --> 00:33:25,199
a team of researchers
is working full-steam
638
00:33:25,200 --> 00:33:28,199
to develop precisely
the machines
639
00:33:28,200 --> 00:33:30,120
that Hugo de Garis warns
against so emphatically.
640
00:33:35,880 --> 00:33:37,039
The head of the team here
is Hugo's former colleague,
641
00:33:37,040 --> 00:33:39,680
Ben Goertzel.
642
00:33:42,360 --> 00:33:43,639
And Hugo and I
ended up collaborating
643
00:33:43,640 --> 00:33:46,279
on his China brain project.
644
00:33:46,280 --> 00:33:48,399
He and his students were
building a sort of neural net
645
00:33:48,400 --> 00:33:53,559
A.I. supercomputer
646
00:33:53,560 --> 00:33:57,119
on a bunch of networked-together GPUs. And
I went there and taught some of his students
647
00:33:57,120 --> 00:34:00,559
how to use the early versions
of my OpenCog system.
648
00:34:00,560 --> 00:34:04,279
Ended up hiring a few
people out of Hugo's lab
649
00:34:04,280 --> 00:34:07,039
to come to Hong Kong
and work with me here
650
00:34:07,040 --> 00:34:08,520
on OpenCog-related things.
651
00:34:12,520 --> 00:34:16,559
[De Garis] How can I put this?
652
00:34:16,560 --> 00:34:18,839
He's my best friend, so I don't want to hurt his
feelings, but... Hey, Hugo, how's it going, man?
653
00:34:18,840 --> 00:34:21,759
It's hot.
654
00:34:21,760 --> 00:34:22,760
Yeah, it is hot.
655
00:34:22,761 --> 00:34:24,959
Yeah, it's been a while, huh?
656
00:34:24,960 --> 00:34:26,159
He's not as political,
put it that way, as I am.
657
00:34:26,160 --> 00:34:28,919
He's much more optimistic.
658
00:34:28,920 --> 00:34:31,239
I'm more pessimistic.
659
00:34:31,240 --> 00:34:33,759
I would claim more realistic.
660
00:34:33,760 --> 00:34:36,439
What's up with you?
661
00:34:36,440 --> 00:34:39,119
You're the one with the news.
662
00:34:39,120 --> 00:34:43,199
Yeah, we got a lot... A lot of stuff going on
here, actually. So can your robot talk to me yet?
663
00:34:43,200 --> 00:34:45,319
Well, we've been doing
some stuff with the robots...
664
00:34:45,320 --> 00:34:47,719
When I was 18, I got
conscripted to fight in Vietnam.
665
00:34:47,720 --> 00:34:50,920
And you know, believe me,
that experience radicalizes you!
666
00:34:52,720 --> 00:34:56,479
Let's see a bit of
the funny little robots.
667
00:34:56,480 --> 00:34:58,999
Okay.
668
00:34:59,000 --> 00:35:00,359
[De Garis] It forces you
to look on the dark side.
669
00:35:00,360 --> 00:35:02,879
[Goertzel] You can
grab a chair, I guess.
670
00:35:02,880 --> 00:35:05,999
[De Garis] These machines
may become so advanced,
671
00:35:06,000 --> 00:35:07,239
they may look on us the
way we look on bacteria.
672
00:35:07,240 --> 00:35:13,039
Like...
673
00:35:13,040 --> 00:35:15,279
[Stamping feet] ...Every time I do this, stamping my
feet on the floor... [Stamping feet] Probably I'm killing...
674
00:35:15,280 --> 00:35:17,879
it's a hard floor.
675
00:35:17,880 --> 00:35:20,879
But I'm human.
676
00:35:20,880 --> 00:35:24,159
I don't give a damn about these bacteria that I'm
killing, because I feel I'm so superior to them.
677
00:35:24,160 --> 00:35:27,559
They're one-celled creatures.
678
00:35:27,560 --> 00:35:28,679
I'm a hundred-trillion-cell
creature.
679
00:35:28,680 --> 00:35:31,959
Their death means nothing to me.
680
00:35:31,960 --> 00:35:36,199
[Mandeep] So hi there.
681
00:35:36,200 --> 00:35:38,239
[Goertzel] Do you remember Hugo?
682
00:35:38,240 --> 00:35:40,799
Hugo de Garis?
683
00:35:40,800 --> 00:35:42,160
[Robotics whirring]
684
00:35:46,760 --> 00:35:48,719
[Robosapiens] Hugo
de Garis is a brain builder.
685
00:35:48,720 --> 00:35:51,559
And he believes that
topological quantum computing
686
00:35:51,560 --> 00:35:54,559
can help revolutionize
computer science.
687
00:35:54,560 --> 00:35:58,799
We both see very clearly the
potentials inherent in physics
688
00:35:58,800 --> 00:36:02,799
to just process information
and manifest intelligence
689
00:36:02,800 --> 00:36:06,639
many, many, many,
many orders of magnitude
690
00:36:06,640 --> 00:36:09,719
beyond what the
human brain can do.
691
00:36:09,720 --> 00:36:11,359
[De Garis] So the
circles are now nodes.
692
00:36:11,360 --> 00:36:13,119
So how do you represent a link?
693
00:36:13,120 --> 00:36:17,319
Again lines?
694
00:36:17,320 --> 00:36:22,279
[Goertzel] Probably the one area where we have had the
most arguments, which have been, you know, friendly,
695
00:36:22,280 --> 00:36:23,839
and by and large
stimulating and entertaining,
696
00:36:23,840 --> 00:36:27,159
is in terms of the
human consequences
697
00:36:27,160 --> 00:36:30,799
of the advancement of
A.I. over the next decades.
698
00:36:30,800 --> 00:36:35,399
Hugo has seemed to
consider it more likely
699
00:36:35,400 --> 00:36:38,319
that super-advanced A.I.s
will just squash all humanity.
700
00:36:38,320 --> 00:36:41,439
Whereas my best
guess, for what it's worth,
701
00:36:41,440 --> 00:36:44,519
is that enhanced humans
and improved human life
702
00:36:44,520 --> 00:36:46,840
will coexist with dramatically
superior superintelligences.
703
00:36:48,000 --> 00:36:52,319
We have one advantage, too.
704
00:36:52,320 --> 00:36:55,319
We get to make the first move.
705
00:36:55,320 --> 00:36:59,119
We will create, design the first
superintelligence. And so if we do our job right,
706
00:36:59,120 --> 00:37:01,519
then this superintelligence
should be an extension of us,
707
00:37:01,520 --> 00:37:05,839
it should be on our side.
708
00:37:05,840 --> 00:37:08,079
I'm not so sure that will work.
709
00:37:08,080 --> 00:37:10,799
We can't even control
our own teenagers,
710
00:37:10,800 --> 00:37:13,120
so how are we expected to
create a new kind of intelligence
711
00:37:13,200 --> 00:37:17,879
that we barely will understand,
712
00:37:17,880 --> 00:37:19,759
because it can
rewrite its own code,
713
00:37:19,760 --> 00:37:21,799
and will think very
differently from us,
714
00:37:21,800 --> 00:37:23,399
and still maintain
control of it?
715
00:37:23,400 --> 00:37:27,719
I'm rather optimistic
716
00:37:27,720 --> 00:37:29,759
that this is going to be one of the
best things that have ever happened.
717
00:37:29,760 --> 00:37:31,479
However, there is
a small probability
718
00:37:31,480 --> 00:37:32,839
that it turns out to be the
last thing humanity ever does,
719
00:37:32,840 --> 00:37:35,799
and it's the end of our species
720
00:37:35,800 --> 00:37:37,959
and all the value we
can ever have achieved.
721
00:37:37,960 --> 00:37:40,879
Inside the control
module, then...
722
00:37:40,880 --> 00:37:43,280
I think Ben, by nature, he's
more of a techie than I am.
723
00:37:44,440 --> 00:37:49,519
I mean, I was.
724
00:37:49,520 --> 00:37:51,319
I've gone back to my first and
original love of math physics.
725
00:37:51,320 --> 00:37:56,839
All the positive things that could come out
726
00:37:56,840 --> 00:37:59,239
of artificial intelligence,
you know, I agree with Ben.
727
00:37:59,240 --> 00:38:01,359
I just don't think
he goes far enough.
728
00:38:01,360 --> 00:38:04,359
Bit of a Frankenstein
monster from various theses...
729
00:38:04,360 --> 00:38:07,559
[Harris] If we built machines
more intelligent than ourselves,
730
00:38:07,560 --> 00:38:10,999
that were capable of making
731
00:38:11,000 --> 00:38:13,759
further changes to
their own source code,
732
00:38:13,760 --> 00:38:15,880
so that they became
the drivers of progress
733
00:38:16,960 --> 00:38:22,119
in building future A.I.,
734
00:38:22,120 --> 00:38:24,401
then we would have built
something that in very short order
735
00:38:25,920 --> 00:38:30,639
would become
functionally god-like,
736
00:38:30,640 --> 00:38:33,319
but we essentially would
have built a god in a box.
737
00:38:33,320 --> 00:38:35,399
And we will want
to be very careful
738
00:38:35,400 --> 00:38:37,640
not to build an angry god.
739
00:38:44,560 --> 00:38:49,399
[Narrator] The prototypes
in Ben Goertzel's labs
740
00:38:49,400 --> 00:38:50,919
seem neither angry nor
particularly highly developed.
741
00:38:50,920 --> 00:38:55,919
But looks are deceiving.
742
00:38:55,920 --> 00:38:57,799
The team is working on a way
to network intelligent machines,
743
00:38:57,800 --> 00:38:59,880
by uploading their artificial
minds into cyberspace.
744
00:39:01,280 --> 00:39:05,879
The mind will be in the cloud.
745
00:39:05,880 --> 00:39:08,319
The sensors are
on the robot's body.
746
00:39:08,320 --> 00:39:10,719
What we learn here from
playing with these prototypes
747
00:39:10,720 --> 00:39:12,959
could then be commercialized.
748
00:39:12,960 --> 00:39:14,999
If you can get intelligent
robots out there
749
00:39:15,000 --> 00:39:17,439
with millions of users,
750
00:39:17,440 --> 00:39:19,159
I mean, imagine the
amount of learning
751
00:39:19,160 --> 00:39:21,359
that the A.I. can do from
all these users, right?
752
00:39:21,360 --> 00:39:24,359
Yeah, that's been your vision.
753
00:39:24,360 --> 00:39:25,799
I mean, over at Hanson's
office, we've got robots that...
754
00:39:25,800 --> 00:39:29,399
His latest robot heads
really look totally realistic.
755
00:39:29,400 --> 00:39:33,639
Look.
756
00:39:33,640 --> 00:39:35,479
Like... like human beings.
757
00:39:35,480 --> 00:39:37,720
[Goertzel in video] Then can
I see reality the way you do?
758
00:39:38,000 --> 00:39:41,400
[Automated female voice]
Reality cannot be detected.
759
00:39:43,480 --> 00:39:46,840
That's an interesting
metaphysics.
760
00:39:49,600 --> 00:39:54,839
[Automated female voice]
761
00:39:54,840 --> 00:39:56,999
Make sense to me.
Oh, makes sense to me, too.
762
00:39:57,000 --> 00:39:59,479
And he's coming over
to Beijing now, as well.
763
00:39:59,480 --> 00:40:01,879
Yeah.
764
00:40:01,880 --> 00:40:02,800
[De Garis] So what's
in the suitcase?
765
00:40:02,801 --> 00:40:04,119
[Hanson] Oh, this is Sophia's body.
766
00:40:04,120 --> 00:40:08,079
Here we go.
767
00:40:08,080 --> 00:40:11,639
Aye-aye-aye!
768
00:40:11,640 --> 00:40:15,639
There's a lot of requests for
companionship robots of various kinds.
769
00:40:15,640 --> 00:40:17,199
Elder care companionship.
770
00:40:17,200 --> 00:40:19,679
You know...
771
00:40:19,680 --> 00:40:21,919
[De Garis] 'Cause
it's so realistic!
772
00:40:21,920 --> 00:40:23,799
I mean, it looks just like a mould.
773
00:40:23,800 --> 00:40:27,119
[Hanson] Yeah,
so I like to sculpt...
774
00:40:27,120 --> 00:40:30,559
I have that animation background. Now, wait a
minute, did it just move...? Oh, that was you.
775
00:40:30,560 --> 00:40:32,959
[Hanson] Yeah, so what you might
notice is her lips can actually purse.
776
00:40:32,960 --> 00:40:37,159
[De Garis] Yeah.
777
00:40:37,160 --> 00:40:40,839
It is kind of freaky.
778
00:40:40,840 --> 00:40:44,479
[Hanson] Her mouth can make the full range of speaking motions.
So she's got cameras in her eyes and a camera in her chest.
779
00:40:44,480 --> 00:40:49,679
She tracks faces.
780
00:40:49,680 --> 00:40:52,559
She'll actually see your face and
make eye contact. [De Garis] Brilliant.
781
00:40:52,560 --> 00:40:53,919
[Hanson] Yeah.
782
00:40:53,920 --> 00:40:54,520
[Narrator] Hugo
seems fascinated,
783
00:40:54,521 --> 00:40:57,239
yet skeptical.
784
00:40:57,240 --> 00:40:59,559
So in essence, the
collaboration between you two...
785
00:40:59,560 --> 00:41:04,879
And a bunch of other people.
786
00:41:04,880 --> 00:41:06,879
Keeping it simple, you're the body and you're the mind.
Yeah, so can you imagine yourself uploaded
787
00:41:06,880 --> 00:41:08,879
into one of these robots?
788
00:41:08,880 --> 00:41:11,039
I might be the millionth.
789
00:41:11,040 --> 00:41:13,679
Millionth?
790
00:41:13,680 --> 00:41:15,920
I wouldn't be the first.
[Goertzel] How about the second?
791
00:41:16,840 --> 00:41:20,479
[Narrator] Like all
technological breakthroughs,
792
00:41:20,480 --> 00:41:22,359
the development of human-like
machines may come at a cost.
793
00:41:22,360 --> 00:41:26,119
But that will not stop the
human spirit of exploration
794
00:41:26,120 --> 00:41:29,839
and thirst for knowledge.
795
00:41:29,840 --> 00:41:33,319
[Goertzel] Well,
once I'm in there,
796
00:41:33,320 --> 00:41:35,719
and then I tell you
how great it is...
797
00:41:35,720 --> 00:41:38,559
Yeah, like that.
798
00:41:38,560 --> 00:41:41,159
Then you may be convinced.
You know Randal, right?
799
00:41:41,160 --> 00:41:43,239
From minduploading.org?
800
00:41:43,240 --> 00:41:46,359
The neurophysicist?
801
00:41:46,360 --> 00:41:47,999
Yeah, he's working on trying to figure
out how to get all the information...
802
00:41:48,000 --> 00:41:52,839
Oh, from the brain?
803
00:41:52,840 --> 00:41:56,279
[Goertzel] ...out of a person's brain. How to read a
brain, upload a brain. Yeah, how to read the brain.
804
00:41:56,280 --> 00:41:57,240
So then you could scan all the
information out of a human brain...
805
00:41:57,241 --> 00:42:01,079
put it into a computer,
806
00:42:01,080 --> 00:42:03,279
which you could use to drive a body
like this. I mean, that's a huge task.
807
00:42:03,280 --> 00:42:04,399
[Goertzel] Well,
they can't do it...
808
00:42:04,400 --> 00:42:06,719
It's a quadrillion connections!
809
00:42:06,720 --> 00:42:07,720
[Hanson] Yeah, so
Randal's work is really neat.
810
00:42:07,721 --> 00:42:12,759
[Narrator]
811
00:42:12,760 --> 00:42:14,639
Randal Koene's project
entails enormous challenges.
812
00:42:14,640 --> 00:42:18,279
I would like to transcend
the limits of biology,
813
00:42:18,280 --> 00:42:21,079
mostly because those limits
814
00:42:21,080 --> 00:42:23,799
are limits that are imposed
by natural selection.
815
00:42:23,800 --> 00:42:27,559
So they're imposed by the
needs of the environment,
816
00:42:27,560 --> 00:42:30,359
not necessarily the desires
of what we want to accomplish,
817
00:42:30,360 --> 00:42:34,199
or needs that we
haven't encountered yet.
818
00:42:34,200 --> 00:42:37,159
We've been messing around
with biological evolution
819
00:42:37,160 --> 00:42:39,479
for centuries, even millennia.
820
00:42:39,480 --> 00:42:42,679
Now we're coming to the point
821
00:42:42,680 --> 00:42:45,119
where we can mess around with genetics
directly. And we can mess around with computers,
822
00:42:45,120 --> 00:42:47,359
with robotic creations.
823
00:42:47,360 --> 00:42:49,159
We are acting as
a kind of catalyst,
824
00:42:49,160 --> 00:42:51,319
perhaps even
more than a catalyst,
825
00:42:51,320 --> 00:42:53,399
for dramatic changes
826
00:42:53,400 --> 00:42:55,200
in the course of evolution
broadly understood.
827
00:42:59,880 --> 00:43:00,880
[Narrator] But the nature
of the consequences
828
00:43:00,881 --> 00:43:03,359
is by no means solely technical.
829
00:43:03,360 --> 00:43:06,959
When you talk about copying
information from your brain
830
00:43:06,960 --> 00:43:10,279
into another system,
831
00:43:10,280 --> 00:43:12,199
that poses a variety of
philosophical problems.
832
00:43:12,200 --> 00:43:16,759
You seem to have
been duplicated at best,
833
00:43:16,760 --> 00:43:18,759
but the fact that
your identical twin
834
00:43:18,760 --> 00:43:22,719
is now able to live
out its adventures
835
00:43:22,720 --> 00:43:26,359
and is immortal presumably,
836
00:43:26,360 --> 00:43:29,119
that will be a cold comfort
to you having been copied.
837
00:43:29,120 --> 00:43:32,279
So there are questions
of identity here
838
00:43:32,280 --> 00:43:34,920
that would arguably
confound any notion
839
00:43:37,040 --> 00:43:41,759
of uploading a human
mind into another condition.
840
00:43:41,760 --> 00:43:45,039
Some people think that
the self is a program,
841
00:43:45,040 --> 00:43:46,999
and you can download
and upload it at will.
842
00:43:47,000 --> 00:43:50,679
I think that's a
category mistake.
843
00:43:50,680 --> 00:43:53,079
I don't think a
person is a program
844
00:43:53,080 --> 00:43:56,159
because the program is
like a mathematical equation,
845
00:43:56,160 --> 00:43:57,799
whereas individuals who
function causally in the world
846
00:43:57,800 --> 00:44:02,199
occupy space.
847
00:44:02,200 --> 00:44:04,479
Equations aren't
located anywhere.
848
00:44:04,480 --> 00:44:05,919
People often ask me: What is
it that you have to take along?
849
00:44:05,920 --> 00:44:10,079
And what can you leave behind,
850
00:44:10,080 --> 00:44:11,759
if you're going
to emulate a brain
851
00:44:11,760 --> 00:44:13,359
so that you have a mind upload?
852
00:44:13,360 --> 00:44:14,559
Which parts are essential?
853
00:44:14,560 --> 00:44:16,519
What makes up me?
854
00:44:16,520 --> 00:44:18,359
[Goertzel] The best way they
have to scan your brain now
855
00:44:18,360 --> 00:44:22,119
is to remove your head...
856
00:44:22,120 --> 00:44:23,239
freeze it, slice it like
in the deli counter,
857
00:44:23,240 --> 00:44:26,319
and then scan each slice.
858
00:44:26,320 --> 00:44:30,199
[De Garis] Incredibly clumsy.
859
00:44:30,200 --> 00:44:31,479
Then you try to reconstruct the connectome inside the
computer, and they're working on the three-dimensional
860
00:44:31,480 --> 00:44:33,239
image processing...
861
00:44:33,240 --> 00:44:34,679
There must be a much better way.
862
00:44:34,680 --> 00:44:36,559
Well, I think if we just,
like, drill a hole here
863
00:44:36,560 --> 00:44:38,999
and put some carbon
nanotubes in your head...
864
00:44:39,000 --> 00:44:43,279
Yeah, yeah...
865
00:44:43,280 --> 00:44:46,439
then maybe those nanotubes can snake around and read...
They could have tiny little radio transmitters and they...
866
00:44:46,440 --> 00:44:49,279
It would also serve as a
brain computer interface
867
00:44:49,280 --> 00:44:51,759
while you're still alive, right?
868
00:44:51,760 --> 00:44:52,520
I think with my own personality,
if I upload myself anywhere,
869
00:44:52,521 --> 00:44:57,039
I'm not that conservative.
870
00:44:57,040 --> 00:44:58,519
So I would just... I would
want to enhance my mind
871
00:44:58,520 --> 00:45:03,239
however I could.
872
00:45:03,240 --> 00:45:05,479
If I could see through a billion sensors at
once and not just one body, then why not, right?
873
00:45:05,480 --> 00:45:07,839
And if I could solve a
differential equation in my mind
874
00:45:07,840 --> 00:45:10,879
in a fraction of a second?
875
00:45:10,880 --> 00:45:14,479
Sure.
876
00:45:14,480 --> 00:45:18,159
So once we've gotten the mind out of this substrate
that it evolved in, but is now trapped in,
877
00:45:18,160 --> 00:45:19,759
then that's just the beginning
878
00:45:19,760 --> 00:45:21,559
of an endless
variety of options.
879
00:45:21,560 --> 00:45:22,999
But there's still that deep
philosophical problem, right?
880
00:45:23,000 --> 00:45:26,279
The sense of self.
881
00:45:26,280 --> 00:45:28,199
So would this have
its own sense of self?
882
00:45:28,200 --> 00:45:31,919
So I think that
would be resolved,
883
00:45:31,920 --> 00:45:34,039
as we start getting maybe
more of a scientific understanding
884
00:45:34,040 --> 00:45:38,879
of what self is...
885
00:45:38,880 --> 00:45:40,599
[Harris] There's a horrible possibility
here, which is that we could build machines
886
00:45:40,600 --> 00:45:42,519
that are far more
intelligent than us
887
00:45:42,520 --> 00:45:45,559
in terms of their competence,
888
00:45:45,560 --> 00:45:46,999
in terms of the kinds of goals
they can achieve in the world.
889
00:45:47,000 --> 00:45:49,839
And they may, in
fact, not be conscious.
890
00:45:49,840 --> 00:45:54,079
The lights may not be on,
891
00:45:54,080 --> 00:45:56,959
and there will be nothing that it's like to be those
machines. And that is quite an alarming prospect from my view,
892
00:45:56,960 --> 00:46:00,999
ethically speaking, because
then you could just have,
893
00:46:01,000 --> 00:46:05,159
you know, the galaxy
being populated by,
894
00:46:05,160 --> 00:46:07,479
you know, blind robots.
895
00:46:07,480 --> 00:46:10,999
And who would want that?
896
00:46:11,000 --> 00:46:12,839
[Goertzel] What if
we wire me into her?
897
00:46:12,840 --> 00:46:16,279
Now, my hypothesis is
898
00:46:16,280 --> 00:46:18,000
wiring my brain into
the robot's cloud mind,
899
00:46:18,760 --> 00:46:22,639
I think I would feel something.
900
00:46:22,640 --> 00:46:25,479
Yeah.
901
00:46:25,480 --> 00:46:26,879
Including consciousness.
902
00:46:26,880 --> 00:46:28,079
I mean, it's uncanny, isn't it?
903
00:46:28,080 --> 00:46:29,400
[Hanson] It is.
904
00:46:33,160 --> 00:46:34,999
Now she looks
like your daughter.
905
00:46:35,000 --> 00:46:36,159
[Goertzel] She looks
much, much smarter.
906
00:46:36,160 --> 00:46:37,439
[Narrator] What looks
like a harmless game
907
00:46:37,440 --> 00:46:38,719
is fundamentally one of
the greatest challenges
908
00:46:38,720 --> 00:46:40,080
mankind has ever faced.
909
00:46:44,320 --> 00:46:45,200
[Distorted overlapping voices]
910
00:46:45,201 --> 00:46:49,519
It deals with the question
911
00:46:49,520 --> 00:46:50,320
of which role technology
should play for us
912
00:46:50,321 --> 00:46:52,919
in the future.
913
00:46:52,920 --> 00:46:54,399
Randal Koene has
already answered
914
00:46:54,400 --> 00:46:56,919
this question for himself.
915
00:46:56,920 --> 00:46:59,479
[Koene] So to be honest,
of course I don't really know
916
00:46:59,480 --> 00:47:00,919
how long it's going to take to
create whole brain emulation,
917
00:47:00,920 --> 00:47:03,959
because it's a very
complicated task,
918
00:47:03,960 --> 00:47:05,159
and it's very hard to
tell with tasks like that
919
00:47:05,160 --> 00:47:07,119
just how long they take.
920
00:47:07,120 --> 00:47:08,599
It depends on so many things.
921
00:47:08,600 --> 00:47:11,759
And we need a
lot of different tools,
922
00:47:11,760 --> 00:47:13,759
many of which are just basically in development. So it's
entirely possible that I may not live to see the outcome,
923
00:47:13,760 --> 00:47:17,679
or may not live to
help get us there.
924
00:47:17,680 --> 00:47:19,679
But what I'm trying to do
is to at least pave a path,
925
00:47:19,680 --> 00:47:22,919
to create a road map
to help move it along,
926
00:47:22,920 --> 00:47:24,919
and to make sure that
there are projects in existence
927
00:47:24,920 --> 00:47:27,039
and people who
understand the meaning
928
00:47:27,040 --> 00:47:29,159
and the importance
of these ideas,
929
00:47:29,160 --> 00:47:30,639
so that it can carry
forward anyhow.
930
00:47:30,640 --> 00:47:33,759
And I think that's
why I do this.
931
00:47:33,760 --> 00:47:35,720
And perhaps this is how I
make meaning in my life.
932
00:47:39,920 --> 00:47:42,639
[Narrator] It has been a
long journey from the big bang
933
00:47:42,640 --> 00:47:45,159
to the emergence of an
intelligent, self-aware being,
934
00:47:45,160 --> 00:47:48,399
one that never
ceases to question.
935
00:47:48,400 --> 00:47:52,759
[Bostrom] And
this will be a watershed moment,
936
00:47:52,760 --> 00:47:54,799
more important than anything
937
00:47:54,800 --> 00:47:56,279
that has happened
in human history.
938
00:47:56,280 --> 00:47:59,639
The creation of
superintelligence
939
00:47:59,640 --> 00:48:02,519
will be the last invention that we will ever need to make.
And if one is looking for some kind of comparable event,
940
00:48:02,520 --> 00:48:04,720
I don't know, maybe the
rise of life in the first place.
941
00:48:09,920 --> 00:48:10,200
[Sandberg] Human nature is
all about self-transformation,
942
00:48:10,201 --> 00:48:14,839
I think.
943
00:48:14,840 --> 00:48:17,879
We all try to become better
people. And as a society, of course,
944
00:48:17,880 --> 00:48:19,599
we will not really
stop learning things.
945
00:48:19,600 --> 00:48:24,119
Innovation has always
been with humans.
946
00:48:24,120 --> 00:48:26,719
So it's not really possible
to stop technology.
947
00:48:26,720 --> 00:48:29,919
However, you might aim at
getting beneficial technologies
948
00:48:29,920 --> 00:48:31,920
before dangerous technologies.
949
00:48:37,400 --> 00:48:39,799
Maybe these machines
will be friendly towards us,
950
00:48:39,800 --> 00:48:42,239
but on the other
hand, maybe not.
951
00:48:42,240 --> 00:48:44,879
So we're taking a
risk in building them.
952
00:48:44,880 --> 00:48:46,599
And at some stage, humanity
is going to have to choose.
953
00:48:46,600 --> 00:48:49,319
Your children or
your grandchildren
954
00:48:49,320 --> 00:48:50,919
will have to make a decision.
955
00:48:50,920 --> 00:48:54,759
It'll be very real.
956
00:48:54,760 --> 00:48:57,440
It won't be just a piece of science
fiction, like it seems today.