1
00:00:01,034 --> 00:00:03,105
(buzzing)
2
00:00:18,251 --> 00:00:20,788
Even if we can
change human beings,
3
00:00:21,021 --> 00:00:23,169
in what direction
do we change them?
4
00:00:23,189 --> 00:00:26,339
Do we want to change them
this way or that way?
5
00:00:26,359 --> 00:00:30,210
This is an example
of the way in which
6
00:00:30,230 --> 00:00:35,281
technological advance impinges
on sociological necessity.
7
00:00:35,301 --> 00:00:43,301
♪
8
00:00:56,356 --> 00:00:58,204
What we need to
ask ourselves is,
9
00:00:58,224 --> 00:01:02,442
"Where does my individual
ability to control my life"
10
00:01:02,462 --> 00:01:05,278
or to influence the
political process
11
00:01:05,298 --> 00:01:08,782
"lie in relation to these
new forms of technology?"
12
00:01:08,802 --> 00:01:11,651
♪
13
00:01:11,671 --> 00:01:13,219
Government and politicians
14
00:01:13,239 --> 00:01:15,088
don't understand
what's happening.
15
00:01:15,108 --> 00:01:17,490
See, they don't even realize
this change is happening.
16
00:01:17,510 --> 00:01:18,758
♪
17
00:01:18,778 --> 00:01:21,828
It is very difficult,
not impossible,
18
00:01:21,848 --> 00:01:25,765
to predict what the
precise effects will be,
19
00:01:25,785 --> 00:01:30,703
and in... in many cases, like
with other technologies,
20
00:01:30,723 --> 00:01:33,273
we have to suck it and see.
21
00:01:33,293 --> 00:01:35,608
Who would have predicted
the internet?
22
00:01:35,628 --> 00:01:38,411
And I talk about this
matter as humanity 2.0
23
00:01:38,431 --> 00:01:40,280
'cause in a sense, this is
kind of where we're heading,
24
00:01:40,300 --> 00:01:42,749
to some kind of new...
new normal, as it were,
25
00:01:42,769 --> 00:01:45,351
of what it is to
be a human being.
26
00:01:45,371 --> 00:01:47,554
It's not a problem
27
00:01:47,574 --> 00:01:49,689
that we should dismiss
or underestimate.
28
00:01:49,709 --> 00:01:51,882
It's staggering in
its proportions.
29
00:01:52,112 --> 00:01:54,888
Ignorance and disbelief
at the same time.
30
00:01:55,115 --> 00:01:59,499
People don't believe that
change is happening this fast.
31
00:01:59,519 --> 00:02:00,867
That's the problem.
32
00:02:00,887 --> 00:02:06,372
♪
33
00:02:06,392 --> 00:02:08,675
This is a stone
34
00:02:08,695 --> 00:02:10,902
formed naturally in
the Earth's crust
35
00:02:11,131 --> 00:02:14,347
over millions of years
through pressure and heat.
36
00:02:14,367 --> 00:02:18,618
It was discovered in the
Olduvai Gorge in Tanzania.
37
00:02:18,638 --> 00:02:22,322
Dated around 2.5
million years BC,
38
00:02:22,342 --> 00:02:26,459
it is arguably one of the
first examples of technology.
39
00:02:26,479 --> 00:02:29,496
Stone tools were first adapted
for the use of cutting,
40
00:02:29,516 --> 00:02:34,234
scraping, or pounding
materials by Homo habilis,
41
00:02:34,254 --> 00:02:37,637
one of our earliest ancestors.
42
00:02:37,657 --> 00:02:39,672
Over one million years later,
43
00:02:39,692 --> 00:02:42,208
mankind made one of
the most significant
44
00:02:42,228 --> 00:02:45,278
of all technological
discoveries...
45
00:02:45,298 --> 00:02:46,579
Fire.
46
00:02:46,599 --> 00:02:48,481
The ability to control fire
47
00:02:48,501 --> 00:02:51,184
was a turning point
for human evolution.
48
00:02:51,204 --> 00:02:54,320
It kept us warm, allowed
us to see in the dark,
49
00:02:54,340 --> 00:02:56,289
and allowed us to cook food,
50
00:02:56,309 --> 00:02:59,559
which many scientists believe
was a huge contributor
51
00:02:59,579 --> 00:03:01,361
to the ascent of mind.
52
00:03:01,381 --> 00:03:02,929
♪
53
00:03:02,949 --> 00:03:07,267
Each age, each empire,
has brought with it
54
00:03:07,287 --> 00:03:11,304
the discovery and invention
of numerous technologies
55
00:03:11,324 --> 00:03:15,341
each in their own way
redesigning human life...
56
00:03:15,361 --> 00:03:17,610
♪
57
00:03:17,630 --> 00:03:20,313
Leading us to now...
58
00:03:20,333 --> 00:03:21,937
modern-day society.
59
00:03:23,803 --> 00:03:27,353
We're now more advanced,
connected, knowledgeable,
60
00:03:27,373 --> 00:03:31,391
and resistant to disease
than ever before,
61
00:03:31,411 --> 00:03:33,326
and it is all due to our ability
62
00:03:33,346 --> 00:03:36,930
to apply scientific knowledge
for practical purposes
63
00:03:36,950 --> 00:03:40,366
in a bid to maximize efficiency.
64
00:03:40,386 --> 00:03:44,304
Just as the stone set us on
a path of transformation,
65
00:03:44,324 --> 00:03:46,472
the technologies of the future
66
00:03:46,492 --> 00:03:49,475
may bring with them
a paradigm shift,
67
00:03:49,495 --> 00:03:54,280
changing two major features
of the human experience.
68
00:03:54,300 --> 00:03:56,549
Two things that have
defined our lives
69
00:03:56,569 --> 00:03:58,952
for as long as we can remember.
70
00:03:58,972 --> 00:04:04,012
Two things that have always
been involuntary constants:
71
00:04:04,244 --> 00:04:07,293
Trading our time for sustenance
72
00:04:07,313 --> 00:04:09,896
and losing that time
through senescence.
73
00:04:09,916 --> 00:04:14,661
♪
74
00:04:19,392 --> 00:04:23,543
♪
75
00:04:23,563 --> 00:04:25,945
(indistinct chattering)
76
00:04:25,965 --> 00:04:29,282
♪
77
00:04:29,302 --> 00:04:30,750
(telephone ringing)
78
00:04:30,770 --> 00:04:36,022
♪
79
00:04:36,042 --> 00:04:38,558
(zapping)
80
00:04:38,578 --> 00:04:46,532
♪
81
00:04:46,552 --> 00:04:48,868
(laughing)
82
00:04:48,888 --> 00:04:56,888
♪
83
00:04:57,463 --> 00:04:59,712
The Industrial Revolution
effectively freed man
84
00:04:59,732 --> 00:05:01,414
from being a beast of burden.
85
00:05:01,434 --> 00:05:03,583
The computer revolution
will similarly free him
86
00:05:03,603 --> 00:05:06,019
from dull, repetitive routine.
87
00:05:06,039 --> 00:05:07,720
The computer revolution
is, however,
88
00:05:07,740 --> 00:05:09,989
perhaps better compared
with the Copernican
89
00:05:10,009 --> 00:05:12,058
or the Darwinian Revolution,
90
00:05:12,078 --> 00:05:15,528
both of which greatly changed
man's idea of himself
91
00:05:15,548 --> 00:05:17,597
and the world in which he lives.
92
00:05:17,617 --> 00:05:21,367
In the space of 60 years,
we have landed on the moon,
93
00:05:21,387 --> 00:05:23,903
seen the rise of
computing power,
94
00:05:23,923 --> 00:05:25,605
mobile phones,
95
00:05:25,625 --> 00:05:27,740
the explosion of the internet,
96
00:05:27,760 --> 00:05:31,344
and we have sequenced
the human genome.
97
00:05:31,364 --> 00:05:33,646
We took man to the moon and back
98
00:05:33,666 --> 00:05:35,915
with four kilobytes of memory.
99
00:05:35,935 --> 00:05:40,787
The phone in your pocket
is at least 250,000 times
100
00:05:40,807 --> 00:05:43,022
more powerful than that.
101
00:05:43,042 --> 00:05:46,592
We are ever-increasingly
doing more with less.
102
00:05:46,612 --> 00:05:48,394
One of the things
that has been born
103
00:05:48,414 --> 00:05:50,963
out of this technological
revolution
104
00:05:50,983 --> 00:05:54,033
is the ability to
replace human workers
105
00:05:54,053 --> 00:05:56,969
with more efficient machines.
106
00:05:56,989 --> 00:05:58,871
This is largely due to the speed
107
00:05:58,891 --> 00:06:02,442
at which we are advancing our
technological capabilities.
108
00:06:02,462 --> 00:06:04,510
(applause)
109
00:06:04,530 --> 00:06:08,748
Information technology grows
in an exponential manner.
110
00:06:08,768 --> 00:06:10,717
It's not linear.
111
00:06:10,737 --> 00:06:12,685
And our intuition is linear.
112
00:06:12,705 --> 00:06:14,150
When we walked
through the savanna
113
00:06:14,374 --> 00:06:16,122
a thousand years ago, we
made linear predictions
114
00:06:16,142 --> 00:06:18,024
where that animal would
be and that worked fine.
115
00:06:18,044 --> 00:06:20,426
It's hardwired in our brains,
116
00:06:20,446 --> 00:06:22,962
but the pace of
exponential growth
117
00:06:22,982 --> 00:06:26,632
is really what describes
information technologies,
118
00:06:26,652 --> 00:06:28,735
and it's not just computation.
119
00:06:28,755 --> 00:06:30,103
There's a big difference
between linear
120
00:06:30,123 --> 00:06:31,137
and exponential growth.
121
00:06:31,157 --> 00:06:33,840
If I take 30 steps linearly,
122
00:06:33,860 --> 00:06:36,909
one, two, three, four,
five, I get to 30.
123
00:06:36,929 --> 00:06:39,011
If I take 30 steps
exponentially,
124
00:06:39,031 --> 00:06:42,582
two, four, eight, sixteen,
I get to a billion.
125
00:06:42,602 --> 00:06:44,450
It makes a huge difference.
126
00:06:44,470 --> 00:06:46,786
And that really describes
information technology.
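[A rough check of the figures above, sketched here in Python; an editor's illustration, not part of the film's narration.]
    linear, exponential = 0, 1
    for _ in range(30):
        linear += 1       # one unit per step: 1, 2, 3, ...
        exponential *= 2  # doubling per step: 2, 4, 8, 16, ...
    print(linear)         # 30
    print(exponential)    # 1073741824, roughly a billion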
127
00:06:46,806 --> 00:06:48,855
When I was a student at MIT,
128
00:06:48,875 --> 00:06:51,057
we all shared one computer,
it took up a whole building.
129
00:06:51,077 --> 00:06:52,792
The computer in your
cell phone today
130
00:06:52,812 --> 00:06:55,795
is a million times cheaper,
a million times smaller,
131
00:06:55,815 --> 00:06:58,030
a thousand times more powerful.
132
00:06:58,050 --> 00:07:00,900
That's a billionfold increase
in capability per dollar
133
00:07:00,920 --> 00:07:02,535
that we've actually experienced
134
00:07:02,555 --> 00:07:04,103
since I was a student,
135
00:07:04,123 --> 00:07:07,607
and we're gonna do it again
in the next 25 years.
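[The "billionfold" figure follows from multiplying the two numbers just quoted; a one-line Python check, not part of the narration.]
    cheaper, more_powerful = 1_000_000, 1_000
    print(cheaper * more_powerful)  # 1,000,000,000: a billionfold gain in capability per dollar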
136
00:07:07,627 --> 00:07:10,543
Currently, on an
almost daily basis,
137
00:07:10,563 --> 00:07:13,179
new algorithms, programs,
138
00:07:13,199 --> 00:07:15,748
and feats in
mechanical engineering
139
00:07:15,768 --> 00:07:19,185
are getting closer and
closer to being a reliable
140
00:07:19,205 --> 00:07:24,211
and more cost-effective
alternative to a human worker.
141
00:07:24,444 --> 00:07:28,127
This process is known
as automation.
142
00:07:28,147 --> 00:07:30,930
This is not just about,
143
00:07:30,950 --> 00:07:32,832
you know, automation
where we expect it,
144
00:07:32,852 --> 00:07:34,229
which is in factories
145
00:07:34,454 --> 00:07:36,569
and among blue-collar
workers and so forth.
146
00:07:36,589 --> 00:07:39,071
It is coming quite aggressively
147
00:07:39,091 --> 00:07:40,973
for people at much
higher skill levels,
148
00:07:40,993 --> 00:07:42,842
and that will only
grow in the future
149
00:07:42,862 --> 00:07:45,711
as we continue on
this exponential arc.
150
00:07:45,731 --> 00:07:48,681
This business that, you know,
not having to work very hard
151
00:07:48,701 --> 00:07:51,117
because machines are taking
care of things for you,
152
00:07:51,137 --> 00:07:53,019
I mean, you see this
also in the 19th century
153
00:07:53,039 --> 00:07:54,620
with the Industrial Revolution.
154
00:07:54,640 --> 00:07:57,623
And, in fact, I think
one of the problems
155
00:07:57,643 --> 00:07:59,158
with the Industrial Revolution,
156
00:07:59,178 --> 00:08:02,094
and this is where Marxism
got so much traction,
157
00:08:02,114 --> 00:08:05,254
was that machines
actually did render
158
00:08:05,485 --> 00:08:07,633
a lot of people
unemployed, okay?
159
00:08:07,653 --> 00:08:10,970
That already happened in the
19th and 20th centuries.
160
00:08:10,990 --> 00:08:14,073
And it was only by labor
organizing itself
161
00:08:14,093 --> 00:08:15,373
that it was able to kind of deal
162
00:08:15,495 --> 00:08:17,610
with the situation intelligently
163
00:08:17,630 --> 00:08:19,512
because there was no
automatic, you might say,
164
00:08:19,532 --> 00:08:21,180
transition to something else.
165
00:08:21,200 --> 00:08:23,015
It was just, you know, "We
don't need you anymore."
166
00:08:23,035 --> 00:08:24,517
We have these more
efficient machines,
167
00:08:24,537 --> 00:08:25,985
and so now we don't... you know,
168
00:08:26,005 --> 00:08:27,787
"now you just have to find
work somewhere else."
169
00:08:27,807 --> 00:08:30,256
Automation clearly has been
happening for a long time,
170
00:08:30,276 --> 00:08:33,059
and it has, you know, automated
171
00:08:33,079 --> 00:08:35,795
a lot of very laborious work
that we don't want to do,
172
00:08:35,815 --> 00:08:38,130
and that's gonna continue to
be the case in the future,
173
00:08:38,150 --> 00:08:41,534
but I do think that this
time is genuinely different.
174
00:08:41,554 --> 00:08:43,269
If we look at what's
happened historically,
175
00:08:43,289 --> 00:08:45,071
what we've seen is
that automation
176
00:08:45,091 --> 00:08:47,298
has primarily been a
mechanical phenomenon,
177
00:08:47,527 --> 00:08:49,575
and the classic example of that
178
00:08:49,595 --> 00:08:50,877
is, of course, agriculture.
179
00:08:50,897 --> 00:08:52,545
I'm a farmer.
180
00:08:52,565 --> 00:08:54,614
Here's what mechanical
engineering has done
181
00:08:54,634 --> 00:08:56,582
for all of us who
work on the farms
182
00:08:56,602 --> 00:08:57,917
and for you, too.
183
00:08:57,937 --> 00:08:59,685
It used to be, in
the United States
184
00:08:59,705 --> 00:09:00,853
and in most advanced countries,
185
00:09:00,873 --> 00:09:02,989
that most people
worked on farms.
186
00:09:03,009 --> 00:09:04,824
Now, almost no one
works on a farm.
187
00:09:04,844 --> 00:09:06,659
It's less than two percent.
188
00:09:06,679 --> 00:09:09,061
And, of course, as a result
of that, we're better off.
189
00:09:09,081 --> 00:09:12,231
We have, you know,
more comfortable jobs,
190
00:09:12,251 --> 00:09:13,799
food is cheaper,
191
00:09:13,819 --> 00:09:15,701
we have a much more
advanced society.
192
00:09:15,721 --> 00:09:17,837
The question is, "Can that
continue indefinitely?"
193
00:09:17,857 --> 00:09:19,071
And what we're seeing this time
194
00:09:19,091 --> 00:09:21,207
is that things are
really quite different.
195
00:09:21,227 --> 00:09:23,009
If this keeps up,
196
00:09:23,029 --> 00:09:26,212
it won't be long before
machines will do everything.
197
00:09:26,232 --> 00:09:28,180
Nobody will have work.
198
00:09:28,200 --> 00:09:32,251
So as we move deeper into
the automated future,
199
00:09:32,271 --> 00:09:35,688
we will see different
stages take form.
200
00:09:35,708 --> 00:09:37,957
The first stage
that we're entering
201
00:09:37,977 --> 00:09:40,726
is the stage where automated
robots are working
202
00:09:40,746 --> 00:09:43,029
side by side with
people in factories.
203
00:09:43,049 --> 00:09:45,164
Some of those jobs are
slowly going away,
204
00:09:45,184 --> 00:09:48,768
but in the near future,
within two to three years,
205
00:09:48,788 --> 00:09:50,870
you're going to see
a huge percentage
206
00:09:50,890 --> 00:09:52,905
of those factory
jobs be replaced
207
00:09:52,925 --> 00:09:55,942
with automated systems
and automated robots.
208
00:09:55,962 --> 00:10:00,846
The next stage following
that is we could see
209
00:10:00,866 --> 00:10:04,116
up to a third of jobs in America
210
00:10:04,136 --> 00:10:07,887
be replaced by 2025
211
00:10:07,907 --> 00:10:10,256
by robots or automated systems.
212
00:10:10,276 --> 00:10:12,024
That's a huge
percentage of people
213
00:10:12,044 --> 00:10:13,693
that could be unemployed
214
00:10:13,713 --> 00:10:15,895
because of this
automated tsunami
215
00:10:15,915 --> 00:10:17,263
that's coming, basically.
216
00:10:17,283 --> 00:10:20,766
We have a colleague
here called Carl Frey
217
00:10:20,786 --> 00:10:24,870
who has put together
a list of jobs
218
00:10:24,890 --> 00:10:26,138
by their vulnerability
219
00:10:26,158 --> 00:10:29,175
to getting replaced
by automation,
220
00:10:29,195 --> 00:10:32,404
and the least vulnerable are
things like choreographers,
221
00:10:32,632 --> 00:10:35,915
managers, social workers.
222
00:10:35,935 --> 00:10:38,384
People who have people skills
223
00:10:38,404 --> 00:10:42,054
and who have creativity.
224
00:10:42,074 --> 00:10:50,074
♪
225
00:10:53,119 --> 00:10:55,201
One area that I look a
lot at is fast food.
226
00:10:55,221 --> 00:10:57,336
I mean, the fast food industry
is tremendously important
227
00:10:57,356 --> 00:10:58,871
in the American economy.
228
00:10:58,891 --> 00:11:01,874
If you look at the
years since recovery
229
00:11:01,894 --> 00:11:03,676
from the Great Recession,
230
00:11:03,696 --> 00:11:06,746
the majority of jobs,
somewhere around 60 percent,
231
00:11:06,766 --> 00:11:09,215
have been low-wage
service sector jobs.
232
00:11:09,235 --> 00:11:11,751
A lot of those have been
in areas like fast food,
233
00:11:11,771 --> 00:11:14,820
and yet, to me, it seems
almost inevitable
234
00:11:14,840 --> 00:11:17,446
that, ultimately, fast
food is gonna automate.
235
00:11:17,677 --> 00:11:20,292
There's a company right
here in San Francisco
236
00:11:20,312 --> 00:11:24,830
called "Momentum Machines"
which is actually working on
237
00:11:24,850 --> 00:11:26,932
a machine to automate
hamburger production,
238
00:11:26,952 --> 00:11:28,234
and it can crank out
239
00:11:28,254 --> 00:11:32,938
about 400 gourmet
hamburgers per hour,
240
00:11:32,958 --> 00:11:37,443
and they ultimately expect
to sort of roll that out
241
00:11:37,463 --> 00:11:40,846
not just in fast
food establishments,
242
00:11:40,866 --> 00:11:42,715
perhaps in convenience stores
243
00:11:42,735 --> 00:11:43,949
and maybe even vending machines.
244
00:11:43,969 --> 00:11:45,317
It could be all over the place.
245
00:11:45,337 --> 00:11:50,856
♪
246
00:11:50,876 --> 00:11:54,193
I can see manufacturing now
becoming completely automated.
247
00:11:54,213 --> 00:11:56,195
I can see, you know,
hundreds of millions
248
00:11:56,215 --> 00:11:57,897
of workers being
put out of jobs.
249
00:11:57,917 --> 00:11:59,765
That's almost certain
it's gonna happen.
250
00:11:59,785 --> 00:12:01,967
So you have lots of companies
right now that are automating
251
00:12:01,987 --> 00:12:04,036
their factories and
their warehouses.
252
00:12:04,056 --> 00:12:05,771
Amazon is a great example.
253
00:12:05,791 --> 00:12:08,307
They're using robots to
automate their systems.
254
00:12:08,327 --> 00:12:10,376
The robots actually
grab the products
255
00:12:10,396 --> 00:12:11,844
and bring the products
to the people
256
00:12:11,864 --> 00:12:14,447
who put those products
into the box.
257
00:12:14,467 --> 00:12:16,315
So there are still people
258
00:12:16,335 --> 00:12:18,451
within the factories at Amazon,
259
00:12:18,471 --> 00:12:21,821
but in the near future, those
jobs may go away as well.
260
00:12:21,841 --> 00:12:24,423
There is a company
here in Silicon Valley
261
00:12:24,443 --> 00:12:26,125
called "Industrial Perception,"
262
00:12:26,145 --> 00:12:29,428
and they built a robot
that can approach
263
00:12:29,448 --> 00:12:32,832
a stack of boxes that are
sort of stacked haphazardly
264
00:12:32,852 --> 00:12:35,201
in some nonstandard way
265
00:12:35,221 --> 00:12:38,437
and visually, by looking
at that stack of boxes,
266
00:12:38,457 --> 00:12:40,139
figure out how to
move those boxes.
267
00:12:40,159 --> 00:12:42,174
And they built a machine
that ultimately
268
00:12:42,194 --> 00:12:45,945
will be able to move perhaps
one box every second,
269
00:12:45,965 --> 00:12:48,536
and that compares with
about three seconds
270
00:12:48,768 --> 00:12:51,851
for a human worker who's
very industrious.
271
00:12:51,871 --> 00:12:53,953
Okay, and this machine,
you can imagine,
272
00:12:53,973 --> 00:12:55,087
will work continuously.
273
00:12:55,107 --> 00:12:56,522
It's never gonna get injured,
274
00:12:56,542 --> 00:13:00,326
never file a workers'
compensation claim,
275
00:13:00,346 --> 00:13:02,428
and, yet, it's
moving into an area
276
00:13:02,448 --> 00:13:04,964
that, up until now, at
least, we would have said
277
00:13:04,984 --> 00:13:08,400
is really something that is
exclusively the provid...
278
00:13:08,420 --> 00:13:10,102
Province of the human worker.
279
00:13:10,122 --> 00:13:14,540
I mean, it's this ability
to look at something
280
00:13:14,560 --> 00:13:17,276
and then based on what you see,
281
00:13:17,296 --> 00:13:18,477
manipulate your environment.
282
00:13:18,497 --> 00:13:20,446
It's sort of the confluence
283
00:13:20,466 --> 00:13:23,382
of visual perception
and dexterity.
284
00:13:23,402 --> 00:13:27,119
♪
285
00:13:27,139 --> 00:13:29,355
We'll see self-driving
cars on the road
286
00:13:29,375 --> 00:13:31,123
within 10 or 15 years.
287
00:13:31,143 --> 00:13:32,858
Fifteen years from
now, we'll be debating
288
00:13:32,878 --> 00:13:35,561
whether we should even allow
human beings to be on...
289
00:13:35,581 --> 00:13:37,296
Be on the road at all.
290
00:13:37,316 --> 00:13:39,899
Tesla says that by next year,
291
00:13:39,919 --> 00:13:44,470
that, you know, their cars
will be 90 percent automated.
292
00:13:44,490 --> 00:13:46,372
Which means that the
jobs of taxi drivers,
293
00:13:46,392 --> 00:13:48,541
truck drivers, goes away.
294
00:13:48,561 --> 00:13:51,911
Suddenly, we don't need
to own cars anymore.
295
00:13:51,931 --> 00:13:55,181
Humanity isn't ready for such
a basic change such as that.
296
00:13:55,201 --> 00:13:57,883
♪
297
00:13:57,903 --> 00:13:59,552
Call center jobs,
voice recognition
298
00:13:59,572 --> 00:14:01,609
is pretty sophisticated
these days.
299
00:14:01,841 --> 00:14:06,091
And you can imagine
replacing many kinds of,
300
00:14:06,111 --> 00:14:08,027
you know, helplines and things.
301
00:14:08,047 --> 00:14:10,563
There's a company called
"IPsoft" that has created
302
00:14:10,583 --> 00:14:13,532
an intelligent software system,
303
00:14:13,552 --> 00:14:16,402
an automated system,
called "Amelia."
304
00:14:16,422 --> 00:14:19,405
She can not only understand
what you're saying to her,
305
00:14:19,425 --> 00:14:22,074
she understands the context
of what you're saying,
306
00:14:22,094 --> 00:14:24,376
and she can learn
from her mistakes.
307
00:14:24,396 --> 00:14:26,579
This is a huge deal because
what we're going to see
308
00:14:26,599 --> 00:14:29,148
is all of the customer
service agent jobs,
309
00:14:29,168 --> 00:14:31,417
if she is successful, if
this software program,
310
00:14:31,437 --> 00:14:34,486
this automated software
program is successful,
311
00:14:34,506 --> 00:14:37,590
we could see all of
those jobs go away.
312
00:14:37,610 --> 00:14:39,647
These things tend to
go to marginal cost,
313
00:14:39,879 --> 00:14:43,495
and marginal cost is copying
software, which is nothing,
314
00:14:43,515 --> 00:14:45,497
and running it on a computer
315
00:14:45,517 --> 00:14:47,166
which will probably
be very cheap.
316
00:14:47,186 --> 00:14:51,136
♪
317
00:14:51,156 --> 00:14:55,274
Human doctors will be, in
some respect, pushed aside
318
00:14:55,294 --> 00:14:56,976
because machines can
do a better job
319
00:14:56,996 --> 00:14:58,644
of diagnosis than they can.
320
00:14:58,664 --> 00:15:00,613
Now will they have the empathy
that current doctors do?
321
00:15:00,633 --> 00:15:02,314
I don't know, but at least
they'll have the knowledge
322
00:15:02,334 --> 00:15:04,216
that our doctors do,
they'll be more advanced,
323
00:15:04,236 --> 00:15:06,051
so I can see
disruption in healthcare.
324
00:15:06,071 --> 00:15:08,587
The one that is likely to
be the biggest growth area,
325
00:15:08,607 --> 00:15:12,091
from an economic standpoint,
is the android companions
326
00:15:12,111 --> 00:15:15,327
to help elderly people,
okay, because there's...
327
00:15:15,347 --> 00:15:18,030
You know, given the
rise in elderly people
328
00:15:18,050 --> 00:15:21,233
over the next 20, 30
years, that it's...
329
00:15:21,253 --> 00:15:23,202
And it's unlikely there are
gonna be enough people
330
00:15:23,222 --> 00:15:25,404
going into the
nursing profession
331
00:15:25,424 --> 00:15:27,940
to actually serve them,
especially if we're thinking
332
00:15:27,960 --> 00:15:31,143
in terms of home-based care.
333
00:15:31,163 --> 00:15:34,013
The robot surgeon, I think,
is something that will happen
334
00:15:34,033 --> 00:15:36,707
in the not-too-distant future
335
00:15:36,936 --> 00:15:41,253
because a lot of that is to
do with manual dexterity
336
00:15:41,273 --> 00:15:46,292
and having the expertise
to recognize...
337
00:15:46,312 --> 00:15:50,296
To understand what you're
manipulating as a surgeon.
338
00:15:50,316 --> 00:15:53,565
Terrific amount of expertise
for a human to accumulate,
339
00:15:53,585 --> 00:15:55,534
but I can imagine that we
would be able to build
340
00:15:55,554 --> 00:15:58,370
something that is a
specialized robot surgeon
341
00:15:58,390 --> 00:16:00,706
that can carry out
particular operations
342
00:16:00,726 --> 00:16:03,108
such as a prostate operation.
343
00:16:03,128 --> 00:16:04,710
That's one that people are
working on right now,
344
00:16:04,730 --> 00:16:06,412
and I think they're nearly there
345
00:16:06,432 --> 00:16:07,646
of being able to produce
346
00:16:07,666 --> 00:16:10,015
a reliable robot surgeon
that can do that.
347
00:16:10,035 --> 00:16:11,951
You might not want to submit
yourself to this thing,
348
00:16:11,971 --> 00:16:14,053
you might think, but in fact,
I think we'll be able to make
349
00:16:14,073 --> 00:16:17,323
a very reliable robot surgeon
to do that sort of thing.
350
00:16:17,343 --> 00:16:24,697
♪
351
00:16:24,717 --> 00:16:26,432
I can see disruption in finance
352
00:16:26,452 --> 00:16:28,500
because they're moving
to digital currencies.
353
00:16:28,520 --> 00:16:31,637
And... and we're now
moving to crowdfunding
354
00:16:31,657 --> 00:16:35,240
and crowdbanking and all
these other advances.
355
00:16:35,260 --> 00:16:37,476
One of my favorites is
investment bankers.
356
00:16:37,496 --> 00:16:41,513
Artificial intelligence
already does more
357
00:16:41,533 --> 00:16:44,516
stock market trades today
than any human being.
358
00:16:44,536 --> 00:16:48,220
Lots of decisions like
decisions about mortgages
359
00:16:48,240 --> 00:16:50,656
and insurance, already
those things have been,
360
00:16:50,676 --> 00:16:53,592
you know, largely taken
over by programs,
361
00:16:53,612 --> 00:16:56,395
and I think that kind of trend
is only gonna continue.
362
00:16:56,415 --> 00:17:04,415
♪
363
00:17:08,560 --> 00:17:11,276
Every time there's a
technological change,
364
00:17:11,296 --> 00:17:13,379
it will, unfortunately, put a
lot of people out of work.
365
00:17:13,399 --> 00:17:15,214
It happened with the cotton gin.
366
00:17:15,234 --> 00:17:18,751
It's happened with every
single technological change.
367
00:17:18,771 --> 00:17:22,454
So, sure, technology
destroys jobs,
368
00:17:22,474 --> 00:17:24,156
but it creates new ones.
369
00:17:24,176 --> 00:17:27,159
Moving from the age of
work that we're in now
370
00:17:27,179 --> 00:17:31,430
into the abundant,
ubiquitous automation age,
371
00:17:31,450 --> 00:17:33,828
that bridge that
we have to cross
372
00:17:34,053 --> 00:17:36,668
is gonna be a very
interesting time period.
373
00:17:36,688 --> 00:17:39,238
I think in the very beginning
of that time period,
374
00:17:39,258 --> 00:17:43,275
you're going to see automation
start to replace jobs,
375
00:17:43,295 --> 00:17:46,712
but those jobs will transfer
into other forms of work.
376
00:17:46,732 --> 00:17:50,082
So, for example, instead
of working in a factory,
377
00:17:50,102 --> 00:17:53,218
you will learn to code and
you will code the robots
378
00:17:53,238 --> 00:17:54,753
that are working in the factory.
379
00:17:54,773 --> 00:17:58,090
When I was a young man and
I went for careers advice,
380
00:17:58,110 --> 00:18:00,125
I don't know what they
would have made of me
381
00:18:00,145 --> 00:18:04,363
asking for a job as a webmaster.
382
00:18:04,383 --> 00:18:07,232
It didn't exist, there
wasn't a web at that time.
383
00:18:07,252 --> 00:18:10,335
And, right now, we have
over 200,000 vacancies
384
00:18:10,355 --> 00:18:12,838
for people who can
analyze big data.
385
00:18:12,858 --> 00:18:15,174
And we really do need people
386
00:18:15,194 --> 00:18:17,109
and mechanisms for analyzing it
387
00:18:17,129 --> 00:18:20,846
and getting the most
information from that data,
388
00:18:20,866 --> 00:18:23,782
and that problem is only gonna
increase in the future.
389
00:18:23,802 --> 00:18:26,452
And I do think that there's
gonna be a lot of employment
390
00:18:26,472 --> 00:18:28,120
moving in that direction.
391
00:18:28,140 --> 00:18:33,192
♪
392
00:18:33,212 --> 00:18:36,328
The history of our country
proves that new inventions
393
00:18:36,348 --> 00:18:40,432
create thousands of jobs for
every one they displace.
394
00:18:40,452 --> 00:18:44,470
So it wasn't long before your
grandfather had a better job
395
00:18:44,490 --> 00:18:46,672
at more pay for less work.
396
00:18:46,692 --> 00:18:48,407
We're always offered
this solution
397
00:18:48,427 --> 00:18:50,709
of still more education,
still more training.
398
00:18:50,729 --> 00:18:52,411
If people lose
their routine job,
399
00:18:52,431 --> 00:18:53,812
then let's send them
back to school.
400
00:18:53,832 --> 00:18:55,714
They'll pick up some new skills,
401
00:18:55,734 --> 00:18:57,249
they'll learn something new,
and then they'll be able
402
00:18:57,269 --> 00:18:59,885
to move into some more
rewarding career.
403
00:18:59,905 --> 00:19:02,254
That's not gonna operate
so well in the future
404
00:19:02,274 --> 00:19:03,622
where the machines are coming
405
00:19:03,642 --> 00:19:05,557
for those skilled jobs as well.
406
00:19:05,577 --> 00:19:07,326
The fact is that machines
are really good at
407
00:19:07,346 --> 00:19:08,861
picking up skills
and doing all kinds
408
00:19:08,881 --> 00:19:11,196
of extraordinarily
complex things,
409
00:19:11,216 --> 00:19:12,564
so those jobs aren't necessarily
410
00:19:12,584 --> 00:19:14,366
gonna be there either.
411
00:19:14,386 --> 00:19:17,269
And a second insight, I
think, is that historically,
412
00:19:17,289 --> 00:19:20,205
it's always been the case that
the vast majority of people
413
00:19:20,225 --> 00:19:22,174
have always done routine work.
414
00:19:22,194 --> 00:19:24,743
So even if people can
make that transition,
415
00:19:24,763 --> 00:19:26,678
if they can succeed in
going back to school
416
00:19:26,698 --> 00:19:29,214
and learning something
new, in percentage terms,
417
00:19:29,234 --> 00:19:31,283
those jobs don't constitute
418
00:19:31,303 --> 00:19:32,885
that much of the total
employment out there.
419
00:19:32,905 --> 00:19:34,753
I mean, most people are doing
these more routine things.
420
00:19:34,773 --> 00:19:37,189
So, you know, we're up
against a real problem
421
00:19:37,209 --> 00:19:39,892
that's probably gonna require
a political solution.
422
00:19:39,912 --> 00:19:44,730
It's probably going to require
direct redistribution.
423
00:19:44,750 --> 00:19:46,431
That's my take on it,
424
00:19:46,451 --> 00:19:49,801
and that's a staggering
political challenge,
425
00:19:49,821 --> 00:19:51,570
especially in the United States.
426
00:19:51,590 --> 00:19:53,872
This would be fine if
we had generations
427
00:19:53,892 --> 00:19:57,743
to adapt to the change so
that the next generation
428
00:19:57,763 --> 00:19:59,211
could develop a
different lifestyle,
429
00:19:59,231 --> 00:20:00,679
different value system.
430
00:20:00,699 --> 00:20:02,681
The problem is that all
of this is happening
431
00:20:02,701 --> 00:20:03,682
within the same generation.
432
00:20:03,702 --> 00:20:05,884
Within a period of 15 years,
433
00:20:05,904 --> 00:20:09,454
we're gonna start wiping out
most of the jobs that we know.
434
00:20:09,474 --> 00:20:12,291
That's really what worries me.
435
00:20:12,311 --> 00:20:15,761
A term commonly used when
describing the trajectory
436
00:20:15,781 --> 00:20:18,990
of technological progress
and where it's leading us
437
00:20:19,218 --> 00:20:21,900
is the "technological
singularity."
438
00:20:21,920 --> 00:20:23,902
The term is borrowed
from physics
439
00:20:23,922 --> 00:20:27,973
to describe an event horizon
or a moment in space time
440
00:20:27,993 --> 00:20:30,275
that you cannot see beyond.
441
00:20:30,295 --> 00:20:32,511
We are currently in
the transistor era
442
00:20:32,531 --> 00:20:34,613
of information technology.
443
00:20:34,633 --> 00:20:39,484
In 1965, co-founder of
Intel, Gordon Moore,
444
00:20:39,504 --> 00:20:41,954
made the observation that
the processing power
445
00:20:41,974 --> 00:20:45,591
of computers doubles
every 18 months.
446
00:20:45,611 --> 00:20:48,293
The prediction that this
trend will continue
447
00:20:48,313 --> 00:20:50,729
is known as Moore's Law.
448
00:20:50,749 --> 00:20:51,997
When Intel created
449
00:20:52,017 --> 00:20:55,934
their first central
processing unit in 1971,
450
00:20:55,954 --> 00:20:59,271
it had 2,300 transistors
451
00:20:59,291 --> 00:21:04,009
and had a processing
speed of 740 kilohertz.
452
00:21:04,029 --> 00:21:08,814
Today, a typical CPU has
over a billion transistors
453
00:21:08,834 --> 00:21:11,750
with an average speed
of two gigahertz.
454
00:21:11,770 --> 00:21:15,320
However, many predict
that by 2020,
455
00:21:15,340 --> 00:21:18,357
the miniaturization of
transistors and silicon chips
456
00:21:18,377 --> 00:21:20,325
will reach its limit,
457
00:21:20,345 --> 00:21:25,297
and Moore's Law will fizzle
out into a post-silicon era.
458
00:21:25,317 --> 00:21:27,432
Another way of
describing the term
459
00:21:27,452 --> 00:21:29,935
"technological singularity"
460
00:21:29,955 --> 00:21:32,471
is a time when
artificial intelligence
461
00:21:32,491 --> 00:21:36,341
surpasses human
intellectual capacity.
462
00:21:36,361 --> 00:21:37,909
But does this mean
that a computer
463
00:21:37,929 --> 00:21:39,878
can produce a new idea
464
00:21:39,898 --> 00:21:42,981
or make an original
contribution to knowledge?
465
00:21:43,001 --> 00:21:46,585
Artificial intelligence, AI,
is a longstanding project
466
00:21:46,605 --> 00:21:49,454
which has to do with
basically trying
467
00:21:49,474 --> 00:21:51,790
to use machines as a way
of trying to understand
468
00:21:51,810 --> 00:21:53,558
the nature of intelligence
469
00:21:53,578 --> 00:21:55,827
and, originally, the
idea was, in some sense,
470
00:21:55,847 --> 00:21:58,063
to manufacture within machines
471
00:21:58,083 --> 00:22:00,732
something that could
simulate human intelligence.
472
00:22:00,752 --> 00:22:02,801
But I think now, as the
years have gone on,
473
00:22:02,821 --> 00:22:04,770
we now think in terms
of intelligence
474
00:22:04,790 --> 00:22:06,098
in a much more abstract way,
475
00:22:06,325 --> 00:22:10,042
so the ability to engage
in massive computations,
476
00:22:10,062 --> 00:22:11,810
right, where you
can end up making
477
00:22:11,830 --> 00:22:14,379
quite intelligent decisions
much more quickly
478
00:22:14,399 --> 00:22:15,814
than a human being can.
479
00:22:15,834 --> 00:22:17,749
So in this respect,
artificial intelligence
480
00:22:17,769 --> 00:22:19,751
in a sense is a, you might say,
481
00:22:19,771 --> 00:22:21,887
as trying to go to the
next level of intelligence
482
00:22:21,907 --> 00:22:23,055
beyond the human.
483
00:22:23,075 --> 00:22:24,890
♪
484
00:22:24,910 --> 00:22:27,492
A proper AI could substitute
485
00:22:27,512 --> 00:22:32,030
for practically any human
job at some level of skill,
486
00:22:32,050 --> 00:22:34,900
so it's... it would be
487
00:22:34,920 --> 00:22:36,902
a completely
different situation.
488
00:22:36,922 --> 00:22:40,372
You can imagine any kind
of job could, in theory,
489
00:22:40,392 --> 00:22:45,077
be replaced by technology,
if you build human-level AI.
490
00:22:45,097 --> 00:22:49,448
Now that, of course, may or
may not be a good thing.
491
00:22:49,468 --> 00:22:51,883
You'd be able to, for
example, make robots
492
00:22:51,903 --> 00:22:54,052
that could do all kinds of jobs
493
00:22:54,072 --> 00:22:56,121
that humans don't
necessarily want to do.
494
00:22:56,141 --> 00:22:59,145
There are the so-called
three D's jobs
495
00:22:59,378 --> 00:23:01,960
that are dirty,
dangerous, or dull,
496
00:23:01,980 --> 00:23:03,995
which humans might
not want to do,
497
00:23:04,015 --> 00:23:06,932
and yet, where you
might actually want
498
00:23:06,952 --> 00:23:08,533
a human level of intelligence
499
00:23:08,553 --> 00:23:10,502
to do the job well or
do the job properly.
500
00:23:10,522 --> 00:23:12,437
These are things which
are achievable.
501
00:23:12,457 --> 00:23:13,905
- Yeah.
- This isn't something...
502
00:23:13,925 --> 00:23:15,540
I don't think it's
science fiction.
503
00:23:15,560 --> 00:23:17,164
I think this is
entirely feasible
504
00:23:17,396 --> 00:23:19,878
that we could build a computer
505
00:23:19,898 --> 00:23:22,447
which is vastly superhuman
506
00:23:22,467 --> 00:23:24,916
which is conscious,
which has emotions,
507
00:23:24,936 --> 00:23:27,486
which is, essentially,
a new species
508
00:23:27,506 --> 00:23:31,123
of self-aware intelligence
and conscious in every way
509
00:23:31,143 --> 00:23:34,059
and has got emotions the
same as you and I do.
510
00:23:34,079 --> 00:23:35,761
I don't see any
fundamental limits
511
00:23:35,781 --> 00:23:38,130
on what we can do, and
we already know enough
512
00:23:38,150 --> 00:23:41,633
about basic science to
start doing that now.
513
00:23:41,653 --> 00:23:45,003
So some people are
concerned about,
514
00:23:45,023 --> 00:23:48,507
you know, possible
risks of building AI
515
00:23:48,527 --> 00:23:51,810
and building something that
is very, very powerful
516
00:23:51,830 --> 00:23:54,112
where there are
unintended consequences
517
00:23:54,132 --> 00:23:57,449
of the thing that you've
built and where it might
518
00:23:57,469 --> 00:23:58,683
do things that you can't predict
519
00:23:58,703 --> 00:24:00,819
that might be
extremely dangerous.
520
00:24:00,839 --> 00:24:02,654
So a so-called
"existential risk,"
521
00:24:02,674 --> 00:24:03,955
as some people call it.
522
00:24:03,975 --> 00:24:06,925
We are going to hand
off to our machines
523
00:24:06,945 --> 00:24:09,094
all the multidimensional
problems
524
00:24:09,114 --> 00:24:11,496
that we are incapable
of coping with.
525
00:24:11,516 --> 00:24:14,833
You and I can take a
problem with two or three
526
00:24:14,853 --> 00:24:17,702
or four or even seven inputs.
527
00:24:17,722 --> 00:24:18,837
But 300?
528
00:24:18,857 --> 00:24:21,235
A thousand, a million inputs?
529
00:24:21,460 --> 00:24:22,841
We're dead in the water.
530
00:24:22,861 --> 00:24:24,843
The machines can cope with that.
531
00:24:24,863 --> 00:24:26,678
The advantage that
computers have
532
00:24:26,698 --> 00:24:29,581
is that they communicate
at gigabit speeds.
533
00:24:29,601 --> 00:24:31,149
They all network together.
534
00:24:31,169 --> 00:24:32,951
We talk in slow motion.
535
00:24:32,971 --> 00:24:37,088
So computers will achieve
this level of awareness
536
00:24:37,108 --> 00:24:39,124
probably in the next
20, 30, 40 years.
537
00:24:39,144 --> 00:24:42,694
It's not that if it's
good or that it's evil.
538
00:24:42,714 --> 00:24:44,129
It's we're probably good enough
539
00:24:44,149 --> 00:24:46,865
to not program an evil AI.
540
00:24:46,885 --> 00:24:50,535
It's that if it's
lethally indifferent.
541
00:24:50,555 --> 00:24:52,170
If it has certain things
542
00:24:52,190 --> 00:24:55,874
that it's tasked
with accomplishing
543
00:24:55,894 --> 00:24:57,843
and humans are in the way.
544
00:24:57,863 --> 00:25:00,946
So there's this concern that
once we reach that moment
545
00:25:00,966 --> 00:25:03,248
where the computers
outperform us
546
00:25:03,268 --> 00:25:05,951
in ways that are
quite meaningful,
547
00:25:05,971 --> 00:25:09,282
that then they will somehow
be motivated to dispose of us
548
00:25:09,508 --> 00:25:12,123
or take over us or
something of this kind.
549
00:25:12,143 --> 00:25:13,792
I don't really believe that
550
00:25:13,812 --> 00:25:16,094
because these kinds
of developments,
551
00:25:16,114 --> 00:25:18,263
which probably are a little
farther off in the future
552
00:25:18,283 --> 00:25:20,699
than some of their
enthusiasts think,
553
00:25:20,719 --> 00:25:22,868
there will be time
for us to adapt,
554
00:25:22,888 --> 00:25:26,037
to come to terms with it,
to organize social systems
555
00:25:26,057 --> 00:25:28,006
that will enable us
to deal adequately
556
00:25:28,026 --> 00:25:30,108
with these new forms
of intelligence.
557
00:25:30,128 --> 00:25:31,977
So I don't thi... this is
not just gonna be something
558
00:25:31,997 --> 00:25:34,145
that's gonna happen
as a miracle tomorrow
559
00:25:34,165 --> 00:25:35,981
and then we'll be
taken by surprise.
560
00:25:36,001 --> 00:25:38,049
But I do think the
key thing here is
561
00:25:38,069 --> 00:25:41,720
that we need to treat
these futuristic things
562
00:25:41,740 --> 00:25:44,256
as not as far away as
people say they are.
563
00:25:44,276 --> 00:25:46,157
Just because they're
not likely to happen
564
00:25:46,177 --> 00:25:47,959
in 15 years, let's say,
565
00:25:47,979 --> 00:25:49,995
it doesn't mean they won't
happen in 50 years.
566
00:25:50,015 --> 00:25:54,032
It's gonna be of kind of
historical dimensions,
567
00:25:54,052 --> 00:25:56,201
and it's very hard
to predict, I think,
568
00:25:56,221 --> 00:26:00,605
whether it's gonna take
us in a utopian direction
569
00:26:00,625 --> 00:26:02,173
or in a dystopian direction
570
00:26:02,193 --> 00:26:04,943
or more likely
something in between,
571
00:26:04,963 --> 00:26:06,244
but just very different.
572
00:26:06,264 --> 00:26:08,179
Very hard to predict.
573
00:26:08,199 --> 00:26:10,849
You see, it's our job
to take raw materials,
574
00:26:10,869 --> 00:26:12,751
adapt them to useful forms,
575
00:26:12,771 --> 00:26:16,121
take natural forces, harness
them to do man's work.
576
00:26:16,141 --> 00:26:18,290
The automated systems
of the future
577
00:26:18,310 --> 00:26:21,960
are a natural process
of human innovation.
578
00:26:21,980 --> 00:26:26,331
It all comes back to the idea
of doing more with less.
579
00:26:26,351 --> 00:26:28,066
This process of innovation
580
00:26:28,086 --> 00:26:31,770
is driven not by
necessity, but desire,
581
00:26:31,790 --> 00:26:35,363
or to simply fill a
gap in the market.
582
00:26:35,594 --> 00:26:37,976
Farm owners didn't
really need to replace
583
00:26:37,996 --> 00:26:40,845
their workers with
machines, but they did so
584
00:26:40,865 --> 00:26:43,715
because they could
foresee the benefits.
585
00:26:43,735 --> 00:26:46,051
It's a natural
cycle of business.
586
00:26:46,071 --> 00:26:49,754
Doing more with less leads
to greater prosperity.
587
00:26:49,774 --> 00:26:51,323
♪
588
00:26:51,343 --> 00:26:53,124
The hope is that we
can adapt to this
589
00:26:53,144 --> 00:26:54,292
politically and socially.
590
00:26:54,312 --> 00:26:55,760
In order to do that,
591
00:26:55,780 --> 00:26:57,262
we have to begin a
conversation now.
592
00:26:57,282 --> 00:26:59,698
Remember that we're up against
593
00:26:59,718 --> 00:27:02,167
an exponential arc of progress.
594
00:27:02,187 --> 00:27:04,336
Things are gonna keep
moving faster and faster,
595
00:27:04,356 --> 00:27:06,805
so we need to start
talking about this now
596
00:27:06,825 --> 00:27:08,373
and we need to sort of
get the word out there
597
00:27:08,393 --> 00:27:09,841
so that people will realize
598
00:27:09,861 --> 00:27:12,210
that this problem
is coming at us,
599
00:27:12,230 --> 00:27:14,179
so that we can begin to discuss
600
00:27:14,199 --> 00:27:16,081
viable political
solutions to this
601
00:27:16,101 --> 00:27:18,850
because, again, I think
it will require,
602
00:27:18,870 --> 00:27:20,385
ultimately, a political choice.
603
00:27:20,405 --> 00:27:24,410
It's not something that
is gonna sort itself out
604
00:27:24,643 --> 00:27:27,659
by itself as a result of
the normal functioning
605
00:27:27,679 --> 00:27:29,260
of the market economy.
606
00:27:29,280 --> 00:27:31,296
It's something that will require
607
00:27:31,316 --> 00:27:33,064
some sort of an intervention
608
00:27:33,084 --> 00:27:34,366
and, you know, part
of the problem is
609
00:27:34,386 --> 00:27:36,067
that in the United States,
610
00:27:36,087 --> 00:27:39,904
roughly half of the population
is very conservative
611
00:27:39,924 --> 00:27:42,140
and they really don't
believe in this idea
612
00:27:42,160 --> 00:27:44,809
of intervention in the market.
613
00:27:44,829 --> 00:27:46,945
It's gonna be a
tough transition,
614
00:27:46,965 --> 00:27:49,014
and those that find
themselves out of jobs
615
00:27:49,034 --> 00:27:51,282
because a robot has taken it
616
00:27:51,302 --> 00:27:53,218
are gonna be pretty pissed off.
617
00:27:53,238 --> 00:27:57,188
The effect of automation
on jobs and livelihood
618
00:27:57,208 --> 00:28:00,725
is going to be behind this
like the original Luddites.
619
00:28:00,745 --> 00:28:02,861
It wasn't... it wasn't
that they were against
620
00:28:02,881 --> 00:28:05,797
technological developments
in some ethereal sense.
621
00:28:05,817 --> 00:28:08,967
It was that this was
taking their damn jobs.
622
00:28:08,987 --> 00:28:11,836
I absolutely think there could
be a Neo-Luddite movement
623
00:28:11,856 --> 00:28:14,339
against the future,
against technology
624
00:28:14,359 --> 00:28:15,740
because they're gonna say,
625
00:28:15,760 --> 00:28:17,075
"Well, hey, you're
taking our jobs,"
626
00:28:17,095 --> 00:28:18,472
you're taking our
livelihoods away.
627
00:28:18,697 --> 00:28:20,412
"You're taking everything
away from us."
628
00:28:20,432 --> 00:28:22,814
But I think that's when
it's gonna be important
629
00:28:22,834 --> 00:28:25,984
that leaders and government
step in and say,
630
00:28:26,004 --> 00:28:27,285
"It may seem that way,
631
00:28:27,305 --> 00:28:29,220
but life is going to get
better for everyone."
632
00:28:29,240 --> 00:28:31,890
We're gonna have more time
to do things that we want,
633
00:28:31,910 --> 00:28:33,391
more vacations, more passions.
634
00:28:33,411 --> 00:28:35,193
This is the modern world.
635
00:28:35,213 --> 00:28:38,396
We can create the utopia
that we've always dreamt of.
636
00:28:38,416 --> 00:28:41,266
Why are we saying,
"My job's not safe,"
637
00:28:41,286 --> 00:28:44,836
or, "Automation's going
to steal my jobs"?
638
00:28:44,856 --> 00:28:46,471
These are the... these
are the phrases
639
00:28:46,491 --> 00:28:48,373
that keep getting
pushed out there.
640
00:28:48,393 --> 00:28:50,075
They're negative phrases,
641
00:28:50,095 --> 00:28:53,311
and instead, it seems that
we would look at this,
642
00:28:53,331 --> 00:28:55,013
especially if someone has
been working in a factory
643
00:28:55,033 --> 00:28:56,281
their whole life,
644
00:28:56,301 --> 00:28:58,416
that they would look at
that system and say,
645
00:28:58,436 --> 00:29:02,053
"Thank goodness that this is
starting to be automated."
646
00:29:02,073 --> 00:29:04,055
I don't want anyone
to have to crawl
647
00:29:04,075 --> 00:29:07,225
into a hole in the
ground and pull up coal.
648
00:29:07,245 --> 00:29:09,794
No human being should
have to go do that.
649
00:29:09,814 --> 00:29:12,330
If you make an awful
lot of computers
650
00:29:12,350 --> 00:29:14,432
and a lot of robots, and
the computers can make
651
00:29:14,452 --> 00:29:16,201
those robots very sophisticated
652
00:29:16,221 --> 00:29:18,103
and do lots of
sophisticated jobs,
653
00:29:18,123 --> 00:29:21,506
you could eliminate most of
the high-value physical jobs
654
00:29:21,526 --> 00:29:24,375
and also most of the
high-value intellectual jobs.
655
00:29:24,395 --> 00:29:26,478
What you're left with,
then, are those jobs
656
00:29:26,498 --> 00:29:28,213
where you have to
be a human being,
657
00:29:28,233 --> 00:29:30,882
so I find it quite
paradoxical in some ways
658
00:29:30,902 --> 00:29:32,917
that the more advanced
the technology becomes,
659
00:29:32,937 --> 00:29:35,453
the more it forces
us to become humans.
660
00:29:35,473 --> 00:29:36,988
So in some ways, it's very good.
661
00:29:37,008 --> 00:29:40,125
It forces us to explore
what is humanity about?
662
00:29:40,145 --> 00:29:41,292
What are the
fundamentally important
663
00:29:41,312 --> 00:29:42,894
things about being a human?
664
00:29:42,914 --> 00:29:45,096
It's not being able to,
you know, flip a burger
665
00:29:45,116 --> 00:29:48,333
or, you know, carve
something intricately.
666
00:29:48,353 --> 00:29:49,834
A computer or a robot
667
00:29:49,854 --> 00:29:52,036
could do that far better
than a human being.
668
00:29:52,056 --> 00:29:54,339
One thing I've noticed, if
you talk to techno-optimists
669
00:29:54,359 --> 00:29:57,842
about the future of work
and how it's gonna unfold,
670
00:29:57,862 --> 00:30:00,278
very often they will
focus on this issue
671
00:30:00,298 --> 00:30:03,181
of how will we all be
fulfilled in the future?
672
00:30:03,201 --> 00:30:05,016
What will be our purpose in life
673
00:30:05,036 --> 00:30:06,518
when we don't want to work?
674
00:30:06,538 --> 00:30:09,554
And, you know, you can
sort of posit this
675
00:30:09,574 --> 00:30:12,023
in terms of... there was
a guy named Maslow
676
00:30:12,043 --> 00:30:14,259
who came up with a
hierarchy of human needs,
677
00:30:14,279 --> 00:30:15,560
Maslow's pyramid.
678
00:30:15,580 --> 00:30:17,395
And at the base of that pyramid
679
00:30:17,415 --> 00:30:20,565
are the foundational things
like food and shelter,
680
00:30:20,585 --> 00:30:22,333
and at the top of that
pyramid, of course,
681
00:30:22,353 --> 00:30:24,335
are all these intangible
things like, you know,
682
00:30:24,355 --> 00:30:25,937
a sense of purpose in your life
683
00:30:25,957 --> 00:30:27,472
and fulfillment and so forth.
684
00:30:27,492 --> 00:30:31,075
What you'll find among the
most techno-optimistic people
685
00:30:31,095 --> 00:30:32,577
is that they will want to skip
686
00:30:32,597 --> 00:30:34,279
right over the base
of that pyramid
687
00:30:34,299 --> 00:30:36,609
and jump right to the top
and start talking about,
688
00:30:36,835 --> 00:30:38,516
"Oh, gosh, how are
we gonna," you know,
689
00:30:38,536 --> 00:30:39,951
"what's the meaning
of our life gonna be
690
00:30:39,971 --> 00:30:41,986
when we don't have to work?"
691
00:30:42,006 --> 00:30:45,323
But the reality is that
the base of that pyramid,
692
00:30:45,343 --> 00:30:47,592
food, shelter, all the
things that we need
693
00:30:47,612 --> 00:30:49,327
to have, you know,
a decent life,
694
00:30:49,347 --> 00:30:50,562
that's the elephant in the room.
695
00:30:50,582 --> 00:30:54,032
That stuff costs real money.
696
00:30:54,052 --> 00:30:55,433
That stuff is gonna involve
697
00:30:55,453 --> 00:30:58,036
perhaps raising taxes
on a lot of the people
698
00:30:58,056 --> 00:30:59,470
that are doing really
well right now,
699
00:30:59,490 --> 00:31:00,972
and that's probably
part of the reason
700
00:31:00,992 --> 00:31:02,540
that they prefer not
to talk about it.
701
00:31:02,560 --> 00:31:05,243
So what do we do
with the 99 percent
702
00:31:05,263 --> 00:31:06,411
of the population on this planet
703
00:31:06,431 --> 00:31:08,213
if they don't have jobs?
704
00:31:08,233 --> 00:31:11,115
The suggestion is
and the goal is
705
00:31:11,135 --> 00:31:12,951
to make this an
efficient system.
706
00:31:12,971 --> 00:31:15,386
You put automation in
the hands of everyone.
707
00:31:15,406 --> 00:31:17,488
In the near future, we're
going to see systems
708
00:31:17,508 --> 00:31:19,624
where we can 3D
print our clothing,
709
00:31:19,644 --> 00:31:21,326
we can 3D print food.
710
00:31:21,346 --> 00:31:23,962
If you automate these
self-replicating
711
00:31:23,982 --> 00:31:27,165
industrial machines to
pull the raw materials
712
00:31:27,185 --> 00:31:29,067
and distribute those
raw materials
713
00:31:29,087 --> 00:31:31,436
to everyone who has the capacity
714
00:31:31,456 --> 00:31:33,271
to 3D print their own house
715
00:31:33,291 --> 00:31:36,107
or 3D print their own farm bot,
716
00:31:36,127 --> 00:31:40,411
you have literally
solved the equation
717
00:31:40,431 --> 00:31:43,114
of how do I automate my life
718
00:31:43,134 --> 00:31:45,450
and how do I automate
my basic necessities?
719
00:31:45,470 --> 00:31:47,352
If we had the political will
720
00:31:47,372 --> 00:31:49,287
to take all these
new technologies
721
00:31:49,307 --> 00:31:51,685
and the wealth and
abundance they create,
722
00:31:51,910 --> 00:31:54,359
and distribute these
across our society
723
00:31:54,379 --> 00:31:55,960
in the First World countries
724
00:31:55,980 --> 00:31:57,629
and also across the whole world,
725
00:31:57,649 --> 00:31:59,631
then, of course, I mean,
the sky's the limit.
726
00:31:59,651 --> 00:32:01,566
We can solve all
kinds of problems.
727
00:32:01,586 --> 00:32:04,402
But we will have to have the
political will to do that,
728
00:32:04,422 --> 00:32:07,038
and I don't see a whole lot
of evidence for it right now.
729
00:32:07,058 --> 00:32:10,275
There really is enough
already for everybody,
730
00:32:10,295 --> 00:32:12,343
certainly to have
an adequate life,
731
00:32:12,363 --> 00:32:16,281
if not a life of superabundance,
732
00:32:16,301 --> 00:32:19,317
so, you know, I don't think
that the introduction
733
00:32:19,337 --> 00:32:21,252
of more labor-saving devices
734
00:32:21,272 --> 00:32:25,056
or more is really gonna make
any difference in that.
735
00:32:25,076 --> 00:32:26,291
The reason there are poor people
736
00:32:26,311 --> 00:32:27,722
is 'cause there's rich people.
737
00:32:27,946 --> 00:32:34,465
♪
738
00:32:34,485 --> 00:32:37,035
You're simultaneously
making a lot of people
739
00:32:37,055 --> 00:32:38,703
almost completely useless
740
00:32:38,723 --> 00:32:41,639
while generating a
lot more wealth
741
00:32:41,659 --> 00:32:43,708
and value than ever before.
742
00:32:43,728 --> 00:32:44,976
So I worry about this.
743
00:32:44,996 --> 00:32:46,244
I worry about the schism
744
00:32:46,264 --> 00:32:49,280
between the super
rich and the poor.
745
00:32:49,300 --> 00:32:53,017
The ultra rich, if
they're representative
746
00:32:53,037 --> 00:32:54,986
of some of the people we've
seen in Silicon Valley,
747
00:32:55,006 --> 00:32:56,254
I really, really worry
748
00:32:56,274 --> 00:32:58,256
because I wonder if they
really have a soul.
749
00:32:58,276 --> 00:33:01,726
I really... I wonder if they
really have an awareness
750
00:33:01,746 --> 00:33:04,195
of how regular people feel
751
00:33:04,215 --> 00:33:06,497
and if they share the
values of humanity.
752
00:33:06,517 --> 00:33:09,200
It really bothers me that
you have this ultra rich
753
00:33:09,220 --> 00:33:12,170
that is out of touch with
regular people, with humanity.
754
00:33:12,190 --> 00:33:14,372
This is being filmed right
now in San Francisco,
755
00:33:14,392 --> 00:33:17,608
which is by all accounts one
of the wealthiest cities
756
00:33:17,628 --> 00:33:19,744
and most advanced
cities in the world,
757
00:33:19,764 --> 00:33:21,512
and it's pretty much ground zero
758
00:33:21,532 --> 00:33:24,182
for this technological
revolution,
759
00:33:24,202 --> 00:33:25,483
and, yet, as I came here,
760
00:33:25,503 --> 00:33:28,419
I almost tripped
over homeless people
761
00:33:28,439 --> 00:33:30,054
sleeping on the sidewalk.
762
00:33:30,074 --> 00:33:32,724
That is the reality
of today's economy
763
00:33:32,744 --> 00:33:33,779
and today's society.
764
00:33:34,012 --> 00:33:35,560
In a very real sense,
765
00:33:35,580 --> 00:33:38,763
we already live in the
economy of abundance,
766
00:33:38,783 --> 00:33:41,466
and yet we have not
solved this problem.
767
00:33:41,486 --> 00:33:44,102
I think the future
for the four billion
768
00:33:44,122 --> 00:33:46,796
poor people in the world is
actually a very good one.
769
00:33:47,025 --> 00:33:49,574
We've seen the amount of food
in the world, for example,
770
00:33:49,594 --> 00:33:51,801
has more than doubled
in the last 25 years.
771
00:33:52,030 --> 00:33:54,145
That's likely to continue.
772
00:33:54,165 --> 00:33:58,282
Worldwide, we're seeing
massive economic growth.
773
00:33:58,302 --> 00:34:01,619
That really means that people
in poor countries today
774
00:34:01,639 --> 00:34:04,055
will be much better
off in the future,
775
00:34:04,075 --> 00:34:06,124
so there will still
be some poor people,
776
00:34:06,144 --> 00:34:07,458
relatively speaking,
777
00:34:07,478 --> 00:34:09,627
but compared to
today's poor people,
778
00:34:09,647 --> 00:34:11,295
they'll be actually
quite well-off.
779
00:34:11,315 --> 00:34:14,365
I think this is an
amplifier for inequality.
780
00:34:14,385 --> 00:34:18,236
It's gonna make what we see
now much more amplified.
781
00:34:18,256 --> 00:34:19,604
The number of people
that are doing
782
00:34:19,624 --> 00:34:21,239
really well in the economy
783
00:34:21,259 --> 00:34:23,307
I think is likely to
continue to shrink.
784
00:34:23,327 --> 00:34:25,176
Those people that are doing well
785
00:34:25,196 --> 00:34:29,147
will do extraordinarily well,
but for more and more people,
786
00:34:29,167 --> 00:34:30,715
they're simply gonna
find themselves
787
00:34:30,735 --> 00:34:32,840
in a position where they
don't have a lot to offer.
788
00:34:33,071 --> 00:34:34,685
They don't have a
marketable skill.
789
00:34:34,705 --> 00:34:38,122
They don't have a viable way
to really earn an income
790
00:34:38,142 --> 00:34:39,757
or, in particular,
middle-class income.
791
00:34:39,777 --> 00:34:41,359
We should value the fact
792
00:34:41,379 --> 00:34:43,594
that we can spend more
time doing human work
793
00:34:43,614 --> 00:34:47,365
and the robots will get on
with increasing the economy.
794
00:34:47,385 --> 00:34:49,600
They'll be still taking
all the resources
795
00:34:49,620 --> 00:34:51,736
and converting them
into material goods
796
00:34:51,756 --> 00:34:54,806
at very low cost, so the
economy will expand,
797
00:34:54,826 --> 00:34:56,307
we'll be better off,
798
00:34:56,327 --> 00:34:57,863
and we can concentrate
on what matters.
799
00:34:58,096 --> 00:35:00,278
There's nothing to
worry about in there.
800
00:35:00,298 --> 00:35:02,346
A constant stream
of savings dollars
801
00:35:02,366 --> 00:35:06,284
must flow into big and
small business each year.
802
00:35:06,304 --> 00:35:08,875
These dollars help
to buy the land,
803
00:35:09,107 --> 00:35:13,558
the buildings, the
tools and equipment,
804
00:35:13,578 --> 00:35:15,626
and create new job opportunities
805
00:35:15,646 --> 00:35:17,562
for our expanding population.
806
00:35:17,582 --> 00:35:21,299
♪
807
00:35:21,319 --> 00:35:22,767
We need consumers out there.
808
00:35:22,787 --> 00:35:25,570
We need people who
can actually buy
809
00:35:25,590 --> 00:35:28,306
the things that are
produced by the economy.
810
00:35:28,326 --> 00:35:30,208
If you look at the way
our economy works,
811
00:35:30,228 --> 00:35:32,777
ultimately, it's driven
by end consumption,
812
00:35:32,797 --> 00:35:35,713
and by that, I mean people
and to a limited extent,
813
00:35:35,733 --> 00:35:37,815
governments, who buy things
because they want them
814
00:35:37,835 --> 00:35:39,484
or they need them.
815
00:35:39,504 --> 00:35:41,152
You know, businesses
in our economy,
816
00:35:41,172 --> 00:35:43,221
they also buy things, of course,
817
00:35:43,241 --> 00:35:45,490
but they do that in order
to produce something else,
818
00:35:45,510 --> 00:35:47,492
and one business may sell
to another business,
819
00:35:47,512 --> 00:35:49,861
but at the end of that,
at the end of that chain,
820
00:35:49,881 --> 00:35:52,919
there has to stand a consumer
or perhaps a government
821
00:35:53,151 --> 00:35:55,533
who buys that product or service
822
00:35:55,553 --> 00:35:57,535
just because they want
it or they need it.
823
00:35:57,555 --> 00:35:58,932
So this is not the case
824
00:35:59,157 --> 00:36:00,872
that things can just
keep going like this
825
00:36:00,892 --> 00:36:03,674
and get more and more
unequal over time
826
00:36:03,694 --> 00:36:05,409
and everything will
still be fine.
827
00:36:05,429 --> 00:36:06,744
I think that it won't be fine.
828
00:36:06,764 --> 00:36:09,881
It will actually have an
impact on our economy
829
00:36:09,901 --> 00:36:12,250
and on our economic growth.
830
00:36:12,270 --> 00:36:15,786
We need intelligent
planning because, you know,
831
00:36:15,806 --> 00:36:18,789
being unemployed is not a
positive thing in itself.
832
00:36:18,809 --> 00:36:20,491
There has to be some
kind of transition point
833
00:36:20,511 --> 00:36:22,426
to some other form
of life after that.
834
00:36:22,446 --> 00:36:24,295
And, again, at the moment,
835
00:36:24,315 --> 00:36:26,697
I really don't see enough
attention being paid to this,
836
00:36:26,717 --> 00:36:29,960
so we need to take this future
prospect seriously now.
837
00:36:30,188 --> 00:36:32,203
♪
838
00:36:32,223 --> 00:36:35,239
If we manage to adapt
to this expected wave
839
00:36:35,259 --> 00:36:39,577
of technological unemployment
both politically and socially,
840
00:36:39,597 --> 00:36:41,846
it's likely to facilitate a time
841
00:36:41,866 --> 00:36:44,482
when work takes on a
different meaning
842
00:36:44,502 --> 00:36:46,551
and a new role in our lives.
843
00:36:46,571 --> 00:36:48,219
♪
844
00:36:48,239 --> 00:36:50,922
Ideas of how we should
approach our relationship
845
00:36:50,942 --> 00:36:54,458
with work have changed
throughout history.
846
00:36:54,478 --> 00:36:57,328
In ancient Greece,
Aristotle said,
847
00:36:57,348 --> 00:37:02,466
"A working paid job absorbs
and degrades the mind."
848
00:37:02,486 --> 00:37:05,236
I.E., if a person would
not willingly adopt
849
00:37:05,256 --> 00:37:06,771
their job for free,
850
00:37:06,791 --> 00:37:08,906
the argument can be made
that they have become
851
00:37:08,926 --> 00:37:11,475
absorbed and degraded by it,
852
00:37:11,495 --> 00:37:15,346
working purely out of
financial obligation.
853
00:37:15,366 --> 00:37:18,916
In 1844, Karl Marx
famously described
854
00:37:18,936 --> 00:37:22,887
the workers of society as
"alienated from their work
855
00:37:22,907 --> 00:37:26,691
and wholly saturated by it."
856
00:37:26,711 --> 00:37:28,793
He felt that most
work didn't allow
857
00:37:28,813 --> 00:37:31,462
an individual's
character to grow.
858
00:37:31,482 --> 00:37:33,864
He encouraged people
to find fulfillment
859
00:37:33,884 --> 00:37:36,400
and freedom in their work.
860
00:37:36,420 --> 00:37:39,604
During World War Two,
the ethos towards work
861
00:37:39,624 --> 00:37:41,672
was that it was a patriotic duty
862
00:37:41,692 --> 00:37:44,942
in order to support
the war effort.
863
00:37:44,962 --> 00:37:48,546
To best understand our current
relationship with work
864
00:37:48,566 --> 00:37:51,682
and perhaps by extension
modern life itself,
865
00:37:51,702 --> 00:37:55,953
we can look to the writings
of a man called Guy Debord.
866
00:37:55,973 --> 00:38:01,292
Debord was a French Marxist
theorist and in 1967,
867
00:38:01,312 --> 00:38:04,428
he published a powerful
and influential critique
868
00:38:04,448 --> 00:38:07,865
of Western society entitled
869
00:38:07,885 --> 00:38:11,502
The Society of the Spectacle.
870
00:38:11,522 --> 00:38:14,472
He describes how workers
are ruled by commodities
871
00:38:14,492 --> 00:38:15,840
and that production
872
00:38:15,860 --> 00:38:19,310
is an inescapable
duty of the masses,
873
00:38:19,330 --> 00:38:21,579
such is the economic system
874
00:38:21,599 --> 00:38:24,682
that to work is to survive.
875
00:38:24,702 --> 00:38:27,051
"The capitalist
economy," he says,
876
00:38:27,071 --> 00:38:31,389
"requires the vast majority
to take part as wage workers"
877
00:38:31,409 --> 00:38:34,425
in the unending
pursuit of its ends.
878
00:38:34,445 --> 00:38:37,461
A requirement to which,
as everyone knows,
879
00:38:37,481 --> 00:38:40,498
"one must either submit or die."
880
00:38:40,518 --> 00:38:42,833
The assumption has crept
into our rhetoric
881
00:38:42,853 --> 00:38:44,435
and our understanding
that we live in
882
00:38:44,455 --> 00:38:46,470
a leisure society
to some extent.
883
00:38:46,490 --> 00:38:49,407
We have flexible working time.
884
00:38:49,427 --> 00:38:52,476
You hear the term a lot
"relative poverty"
885
00:38:52,496 --> 00:38:53,978
as against
absolute poverty,
886
00:38:53,998 --> 00:38:57,048
and all these kinds of ideas
suggest that, in fact,
887
00:38:57,068 --> 00:38:59,105
we should feel pretty
pleased with ourselves.
888
00:38:59,337 --> 00:39:01,385
We should both feel
quite leisured
889
00:39:01,405 --> 00:39:03,721
and we should feel less
in bondage to work
890
00:39:03,741 --> 00:39:05,890
than perhaps somebody
in the 19th century
891
00:39:05,910 --> 00:39:08,959
who was kind of shackled
to a machine in a factory.
892
00:39:08,979 --> 00:39:12,430
But, in fact, we're
very unhappy.
893
00:39:12,450 --> 00:39:14,832
It's irrelevant
how much you work
894
00:39:14,852 --> 00:39:16,834
in actual terms anymore.
895
00:39:16,854 --> 00:39:18,869
The way in which the
spectacle operates
896
00:39:18,889 --> 00:39:23,133
is to make of leisure
itself an adjunct to work.
897
00:39:23,361 --> 00:39:26,444
In other words, the idea
of not working and working
898
00:39:26,464 --> 00:39:29,680
are in some sense
locked into an unholy
899
00:39:29,700 --> 00:39:32,383
and reciprocal relationship
with each other.
900
00:39:32,403 --> 00:39:34,418
You know, the fact that
you're not working
901
00:39:34,438 --> 00:39:35,953
is only because
you've been working,
902
00:39:35,973 --> 00:39:37,421
and the fact that you're working
903
00:39:37,441 --> 00:39:39,523
is only so that you can not work.
904
00:39:39,543 --> 00:39:42,827
In other words, so
engrafted is that rubric
905
00:39:42,847 --> 00:39:44,155
in the way that we approach life
906
00:39:44,382 --> 00:39:46,897
that... that we can
never be rid of it.
907
00:39:46,917 --> 00:39:50,634
Debord also observed that
as technology advances,
908
00:39:50,654 --> 00:39:52,970
production becomes
more efficient.
909
00:39:52,990 --> 00:39:56,674
Accordingly, the workers'
tasks invariably become
910
00:39:56,694 --> 00:39:59,477
more trivial and menial.
911
00:39:59,497 --> 00:40:02,880
It would seem that the more
irrelevant human labor becomes,
912
00:40:02,900 --> 00:40:06,517
the harder it is to
find fulfilling work.
913
00:40:06,537 --> 00:40:09,587
The truth of the matter is
that most people already,
914
00:40:09,607 --> 00:40:12,823
in Britain, are
doing useless jobs
915
00:40:12,843 --> 00:40:16,026
and have been for
generations, actually.
916
00:40:16,046 --> 00:40:18,796
Most jobs in management
are completely useless.
917
00:40:18,816 --> 00:40:21,098
They basically consist
in the rearrangement
918
00:40:21,118 --> 00:40:23,189
of information into
different patterns
919
00:40:23,421 --> 00:40:25,669
that are meant to take on
the semblance of meaning
920
00:40:25,689 --> 00:40:27,872
in the bureaucratic context.
921
00:40:27,892 --> 00:40:31,942
So most work is, in fact,
a waste of time already,
922
00:40:31,962 --> 00:40:34,612
and I think people
understand that intuitively.
923
00:40:34,632 --> 00:40:37,581
When I go into companies,
I often ask the question,
924
00:40:37,601 --> 00:40:38,883
"Why are you employing people?"
925
00:40:38,903 --> 00:40:40,484
You could get monkeys
926
00:40:40,504 --> 00:40:42,853
"or you could get
robots to do this job."
927
00:40:42,873 --> 00:40:44,088
The people are not
allowed to think.
928
00:40:44,108 --> 00:40:45,923
They are processing.
929
00:40:45,943 --> 00:40:48,092
They're just like a machine.
930
00:40:48,112 --> 00:40:51,562
They're being so hemmed in,
931
00:40:51,582 --> 00:40:55,533
they operate with an algorithm
and they just do it.
932
00:40:55,553 --> 00:40:58,903
We all have the need to
find meaning in our lives,
933
00:40:58,923 --> 00:41:02,234
and it is our professions
that define us.
934
00:41:02,460 --> 00:41:04,742
To work is to provide a service
935
00:41:04,762 --> 00:41:06,777
either to yourself
or for others,
936
00:41:06,797 --> 00:41:09,847
but most of us would like
our work to be purposeful
937
00:41:09,867 --> 00:41:13,584
and contributory to
society in some way.
938
00:41:13,604 --> 00:41:15,820
It is an uncomfortable truth
939
00:41:15,840 --> 00:41:18,722
that with our present
model of economics
940
00:41:18,742 --> 00:41:22,927
not everyone is able to
monetize their passions.
941
00:41:22,947 --> 00:41:25,963
If any of this were
to come to fruition,
942
00:41:25,983 --> 00:41:29,200
if we learned to make
automation work for us,
943
00:41:29,220 --> 00:41:34,238
the question remains, "What
do we do with our days?"
944
00:41:34,258 --> 00:41:36,006
There's a good and a bad.
945
00:41:36,026 --> 00:41:38,709
The good is that the cost
of everything drops.
946
00:41:38,729 --> 00:41:41,078
We can solve some of the
basic problems of humanity
947
00:41:41,098 --> 00:41:45,115
like disease, hunger, lodging.
948
00:41:45,135 --> 00:41:49,277
We can look after all the
basic needs of human beings.
949
00:41:49,507 --> 00:41:53,891
The dark side is that
automation takes jobs away,
950
00:41:53,911 --> 00:41:56,894
and the question is, "What
do we do for a living?"
951
00:41:56,914 --> 00:41:59,797
Some of us will
seek enlightenment
952
00:41:59,817 --> 00:42:02,166
and rise and will keep
learning and growing,
953
00:42:02,186 --> 00:42:04,268
but the majority of people
don't care about those things.
954
00:42:04,288 --> 00:42:08,005
Majority of people just want
to do, you know, grunt work.
955
00:42:08,025 --> 00:42:11,575
They want to socialize with
people as they do at work.
956
00:42:11,595 --> 00:42:15,079
Sennett wrote in his book,
The Corrosion of Character,
957
00:42:15,099 --> 00:42:18,048
that in late capitalism,
958
00:42:18,068 --> 00:42:21,218
one of the great
kind of supports
959
00:42:21,238 --> 00:42:24,855
for human interaction
and for human meaning
960
00:42:24,875 --> 00:42:27,758
is the longevity of
social relations
961
00:42:27,778 --> 00:42:30,315
and the interactions in
working environments
962
00:42:30,548 --> 00:42:32,830
and that if that's taken away,
963
00:42:32,850 --> 00:42:36,600
if what's required is to
be continually responsive
964
00:42:36,620 --> 00:42:40,237
and changing in a
precarious world,
965
00:42:40,257 --> 00:42:43,941
then people no longer
find the fulfillment
966
00:42:43,961 --> 00:42:46,577
or the substance in
what they're doing.
967
00:42:46,597 --> 00:42:50,214
There is an underlying desire
for people to do things,
968
00:42:50,234 --> 00:42:51,782
you know, you spoke
about the idea
969
00:42:51,802 --> 00:42:54,184
that people want to be
engaged creatively.
970
00:42:54,204 --> 00:42:55,653
They want to be
engaged, you know,
971
00:42:55,673 --> 00:42:58,989
go back to basic Marxist
ideas of praxis
972
00:42:59,009 --> 00:43:00,658
and right back to John Locke.
973
00:43:00,678 --> 00:43:02,693
They want to be engaged
in what Locke thought of
974
00:43:02,713 --> 00:43:04,295
as primary acquisition,
975
00:43:04,315 --> 00:43:07,164
mixing their labor, either
their creative thinking
976
00:43:07,184 --> 00:43:08,699
or their physical labor even,
977
00:43:08,719 --> 00:43:11,201
with the world in order
to transform it.
978
00:43:11,221 --> 00:43:13,671
They want to do that,
and that's a very basic
979
00:43:13,691 --> 00:43:15,973
human instinct to do that.
980
00:43:15,993 --> 00:43:19,109
And the idea of a leisured
class, as it were,
981
00:43:19,129 --> 00:43:23,280
a class who is not involved
in a praxis with the world,
982
00:43:23,300 --> 00:43:25,616
but is simply involved
in a passive way
983
00:43:25,636 --> 00:43:29,320
as a recipient of things is
actually repugnant to people.
984
00:43:29,340 --> 00:43:34,191
They would sooner work for
the man in a meaningless job
985
00:43:34,211 --> 00:43:36,794
and construct a false ideology
986
00:43:36,814 --> 00:43:38,996
of involvement and engagement
987
00:43:39,016 --> 00:43:41,165
than they would actually
sit on their ass.
988
00:43:41,185 --> 00:43:42,900
We can't get away
from the fact that...
989
00:43:42,920 --> 00:43:45,736
That people work
because they have to.
990
00:43:45,756 --> 00:43:48,772
That's, you know, the primary
motivation for most people,
991
00:43:48,792 --> 00:43:50,307
that if you don't work,
992
00:43:50,327 --> 00:43:52,743
you're gonna be living
on the street, okay?
993
00:43:52,763 --> 00:43:55,079
Once we... if we ever
move into a future
994
00:43:55,099 --> 00:43:56,714
where that's not the case
995
00:43:56,734 --> 00:43:57,982
and people don't have
to worry about that,
996
00:43:58,002 --> 00:43:59,216
then we can begin to take on
997
00:43:59,236 --> 00:44:01,318
these more philosophical
questions
998
00:44:01,338 --> 00:44:05,411
of-of, you know, but we're
not at that point yet.
999
00:44:05,643 --> 00:44:09,360
We can't pretend that
we are living in an age
1000
00:44:09,380 --> 00:44:14,398
where that necessity for
an income doesn't exist.
1001
00:44:14,418 --> 00:44:18,869
Douglas Rushkoff stated in 2009,
1002
00:44:18,889 --> 00:44:22,427
"We all want paychecks
or at least money."
1003
00:44:22,660 --> 00:44:25,376
We want food, shelter, clothing,
1004
00:44:25,396 --> 00:44:28,312
and all the things
that money buys us.
1005
00:44:28,332 --> 00:44:32,049
"But do we all
really want jobs?"
1006
00:44:32,069 --> 00:44:35,953
According to the UN Food and
Agriculture Organization,
1007
00:44:35,973 --> 00:44:37,388
there is enough food produced
1008
00:44:37,408 --> 00:44:39,056
to provide everyone in the world
1009
00:44:39,076 --> 00:44:44,194
with 2,720 kilocalories
per person per day.
1010
00:44:44,214 --> 00:44:46,251
♪
1011
00:45:03,300 --> 00:45:05,816
At this stage, it's
difficult to think of
1012
00:45:05,836 --> 00:45:08,218
other possible ways of life.
1013
00:45:08,238 --> 00:45:10,354
The need to earn a
living has been a part
1014
00:45:10,374 --> 00:45:12,990
of every cultural
narrative in history.
1015
00:45:13,010 --> 00:45:16,193
It's a precondition
of human life.
1016
00:45:16,213 --> 00:45:18,862
The challenge facing
the future of work
1017
00:45:18,882 --> 00:45:21,999
is politically unclear.
1018
00:45:22,019 --> 00:45:23,934
It is likely to require
1019
00:45:23,954 --> 00:45:26,870
not only a redistribution
of wealth,
1020
00:45:26,890 --> 00:45:30,307
but a redistribution
of the workload.
1021
00:45:30,327 --> 00:45:34,011
But will working less
mean living more?
1022
00:45:34,031 --> 00:45:35,779
♪
1023
00:45:35,799 --> 00:45:39,083
And is our fear of
becoming irrelevant
1024
00:45:39,103 --> 00:45:42,414
greater than our fear of death?
1025
00:45:45,909 --> 00:45:53,909
♪
1026
00:46:17,274 --> 00:46:21,225
The process of physical aging
is known as senescence,
1027
00:46:21,245 --> 00:46:23,555
and none of us are
spared from it.
1028
00:46:23,781 --> 00:46:28,065
It remains to this day
an evolutionary enigma.
1029
00:46:28,085 --> 00:46:30,234
Our cells are programmed to wane
1030
00:46:30,254 --> 00:46:33,971
and our entire bodies are
fated to become frail.
1031
00:46:33,991 --> 00:46:36,473
It would seem that the laws
of nature would prefer it
1032
00:46:36,493 --> 00:46:38,442
if we dwindle and die.
1033
00:46:38,462 --> 00:46:42,246
♪
1034
00:46:42,266 --> 00:46:44,576
Negligible senescence, however,
1035
00:46:44,802 --> 00:46:47,951
is the lack of
symptoms of aging.
1036
00:46:47,971 --> 00:46:52,256
Negligibly senescent organisms
include certain species
1037
00:46:52,276 --> 00:46:55,058
of sturgeon, giant tortoise,
1038
00:46:55,078 --> 00:46:58,457
flatworm, clam, and tardigrade.
1039
00:47:00,083 --> 00:47:04,134
One species of jellyfish
called Turritopsis dohrnii
1040
00:47:04,154 --> 00:47:08,071
has even been observed to
be biologically immortal.
1041
00:47:08,091 --> 00:47:11,909
It has the capability to
reverse its biotic cycle
1042
00:47:11,929 --> 00:47:13,911
and revert back to
the polyp stage
1043
00:47:13,931 --> 00:47:16,380
at any point in its development.
1044
00:47:16,400 --> 00:47:19,316
There is only one thing
wrong with dying,
1045
00:47:19,336 --> 00:47:21,247
and that's doing it
when you don't want to.
1046
00:47:22,272 --> 00:47:24,254
Doing it when you do want
to is not a problem.
1047
00:47:24,274 --> 00:47:26,089
Now if you put that
bargain to anybody,
1048
00:47:26,109 --> 00:47:28,358
"Look, this is the deal:
1049
00:47:28,378 --> 00:47:30,494
You will die, but only
when you want to."
1050
00:47:30,514 --> 00:47:32,329
Who would not take that bargain?
1051
00:47:32,349 --> 00:47:36,867
In 2014, a team of
scientists at Harvard
1052
00:47:36,887 --> 00:47:39,336
were able to effectively
reverse the age
1053
00:47:39,356 --> 00:47:41,538
of an older mouse by treating it
1054
00:47:41,558 --> 00:47:43,574
with the blood of
a younger mouse
1055
00:47:43,594 --> 00:47:46,905
through a process
called parabiosis.
1056
00:47:47,931 --> 00:47:49,313
For the first time in history
1057
00:47:49,333 --> 00:47:51,643
it is deemed
scientifically possible
1058
00:47:51,869 --> 00:47:54,585
to gain control over
the aging process.
1059
00:47:54,605 --> 00:48:02,605
♪
1060
00:48:03,113 --> 00:48:05,596
Ultimately, when people
get the hang of the idea
1061
00:48:05,616 --> 00:48:07,898
that aging is a medical problem
1062
00:48:07,918 --> 00:48:10,234
and that everybody's got it,
1063
00:48:10,254 --> 00:48:13,929
then it's not going to
be the way it is today.
1064
00:48:19,363 --> 00:48:23,180
He thinks it's possible
that people will extend...
1065
00:48:23,200 --> 00:48:26,670
Be able to extend their lifespan
by considerable amounts.
1066
00:48:26,904 --> 00:48:28,252
I think he's on record as saying
1067
00:48:28,272 --> 00:48:31,221
the first 1,000-year-old
person is already alive.
1068
00:48:31,241 --> 00:48:33,490
It's highly likely, in my view,
1069
00:48:33,510 --> 00:48:36,426
that people born today
or born 10 years ago
1070
00:48:36,446 --> 00:48:39,196
will actually be able to live
1071
00:48:39,216 --> 00:48:41,265
as long as they
like, so to speak,
1072
00:48:41,285 --> 00:48:46,403
without any risk of death from
the ill health of old age.
1073
00:48:46,423 --> 00:48:52,109
The way to apply comprehensive
maintenance to aging
1074
00:48:52,129 --> 00:48:53,699
is a divide and
conquer approach.
1075
00:48:53,931 --> 00:48:55,479
It is not a magic bullet.
1076
00:48:55,499 --> 00:48:57,447
It is not some single
thing that we can do,
1077
00:48:57,467 --> 00:49:00,384
let alone a single thing
that we could do just once.
1078
00:49:00,404 --> 00:49:03,587
Aging is the lifelong
accumulation
1079
00:49:03,607 --> 00:49:05,622
of damage to the body,
1080
00:49:05,642 --> 00:49:08,692
and that damage occurs
as an intrinsic,
1081
00:49:08,712 --> 00:49:10,714
unavoidable side effect
1082
00:49:10,948 --> 00:49:13,363
of the way the body
normally works.
1083
00:49:13,383 --> 00:49:15,032
Even though there
are many, many,
1084
00:49:15,052 --> 00:49:17,567
many different types of
damage at the molecular level
1085
00:49:17,587 --> 00:49:21,471
and the cellular level,
they can all be classified
1086
00:49:21,491 --> 00:49:23,974
into a very manageable
number of categories,
1087
00:49:23,994 --> 00:49:25,575
just seven major categories.
1088
00:49:25,595 --> 00:49:28,445
So now the bottom line,
what do we do about it?
1089
00:49:28,465 --> 00:49:31,742
How do we actually implement
the maintenance approach?
1090
00:49:31,969 --> 00:49:33,744
There are four
fundamental paradigms,
1091
00:49:33,971 --> 00:49:35,485
they all begin with 'R'.
1092
00:49:35,505 --> 00:49:37,120
They are called
replacement, removal,
1093
00:49:37,140 --> 00:49:38,689
repair, and reinforcement.
1094
00:49:38,709 --> 00:49:41,591
We've got particular ways
to do all these things.
1095
00:49:41,611 --> 00:49:44,695
Sometimes replacement,
sometimes simply elimination
1096
00:49:44,715 --> 00:49:48,754
of the superfluous material, the
garbage that's accumulated.
1097
00:49:48,986 --> 00:49:52,302
Sometimes repair
of the material.
1098
00:49:52,322 --> 00:49:54,571
Occasionally, in a couple
of cases, reinforcement...
1099
00:49:54,591 --> 00:49:57,140
That means making
the cell robust
1100
00:49:57,160 --> 00:49:59,509
so that the damage, which
would normally have caused
1101
00:49:59,529 --> 00:50:02,045
the pathology no
longer does that.
1102
00:50:02,065 --> 00:50:05,148
I wanna talk about one thing
that we're doing in our lab,
1103
00:50:05,168 --> 00:50:07,351
which involved the number
one cause of death
1104
00:50:07,371 --> 00:50:09,519
in the western world,
cardiovascular disease
1105
00:50:09,539 --> 00:50:11,254
causes heart attacks
and strokes.
1106
00:50:11,274 --> 00:50:14,091
It all begins with these
things called foam cells,
1107
00:50:14,111 --> 00:50:16,785
which are originally
white blood cells.
1108
00:50:17,014 --> 00:50:20,630
They become poisoned by
toxins in the bloodstream.
1109
00:50:20,650 --> 00:50:23,500
The main toxin that's
responsible is known...
1110
00:50:23,520 --> 00:50:26,436
It's called 7-ketocholesterol,
that's this thing,
1111
00:50:26,456 --> 00:50:29,473
and we found some bacteria
that could eat it.
1112
00:50:29,493 --> 00:50:31,708
We then found out
how they eat it,
1113
00:50:31,728 --> 00:50:34,678
we found out the enzymes that
they use to break it down,
1114
00:50:34,698 --> 00:50:37,214
and we found out how to
modify those enzymes
1115
00:50:37,234 --> 00:50:39,516
so that they can go
into human cells,
1116
00:50:39,536 --> 00:50:42,619
go to the right place in the
cell that they're needed,
1117
00:50:42,639 --> 00:50:44,388
which is called the lysosome,
1118
00:50:44,408 --> 00:50:48,558
and actually do their job
there, and it actually works.
1119
00:50:48,578 --> 00:50:51,821
Cells are protected from
this toxic substance...
1120
00:50:52,049 --> 00:50:53,697
That's what these
graphs are showing.
1121
00:50:53,717 --> 00:50:56,133
So this is pretty good news.
1122
00:50:56,153 --> 00:50:58,735
The damage that accumulates
that eventually causes
1123
00:50:58,755 --> 00:51:01,772
the diseases and
disabilities of old age
1124
00:51:01,792 --> 00:51:04,374
is initially harmless.
1125
00:51:04,394 --> 00:51:07,611
The body is set up to tolerate
a certain amount of it.
1126
00:51:07,631 --> 00:51:10,280
That's critical, because
while we damage it
1127
00:51:10,300 --> 00:51:13,216
at that sub-pathological level,
1128
00:51:13,236 --> 00:51:15,352
it means that it's
not participating
1129
00:51:15,372 --> 00:51:17,254
in metabolism, so to speak.
1130
00:51:17,274 --> 00:51:20,390
It's not actually interacting
with the way the body works.
1131
00:51:20,410 --> 00:51:22,726
So medicines that
target that damage
1132
00:51:22,746 --> 00:51:25,295
are much, much less likely
1133
00:51:25,315 --> 00:51:28,231
to have unacceptable
side effects
1134
00:51:28,251 --> 00:51:31,201
than medicines that try
to manipulate the body
1135
00:51:31,221 --> 00:51:33,103
so as to stop the damage
from being created
1136
00:51:33,123 --> 00:51:34,295
in the first place.
1137
00:51:42,399 --> 00:51:48,351
It's unlikely, in fact, by
working on longevity per se,
1138
00:51:48,371 --> 00:51:50,253
that we will crack it.
1139
00:51:50,273 --> 00:51:53,390
It's going to be... It
seems to me more probable
1140
00:51:53,410 --> 00:51:57,761
that we will crack longevity
simply by getting rid of,
1141
00:51:57,781 --> 00:52:01,364
sequentially, the prime
causes of death.
1142
00:52:01,384 --> 00:52:06,403
I hear people talk about
living hundreds of years.
1143
00:52:06,423 --> 00:52:07,604
Inside, I'm like, yeah, right,
1144
00:52:07,624 --> 00:52:10,474
I mean, because if
you study the brain,
1145
00:52:10,494 --> 00:52:12,776
the dead end is the brain.
1146
00:52:12,796 --> 00:52:14,344
We all start developing
1147
00:52:14,364 --> 00:52:16,513
Alzheimer's pathology
at 40 years old.
1148
00:52:16,533 --> 00:52:18,715
It's not a matter of whether
you get Alzheimer's,
1149
00:52:18,735 --> 00:52:19,805
it's when.
1150
00:52:20,871 --> 00:52:25,722
It's when, and genetically,
1151
00:52:25,742 --> 00:52:27,824
we all have some predisposition
1152
00:52:27,844 --> 00:52:30,360
to when we're gonna
get this disease.
1153
00:52:30,380 --> 00:52:32,162
It's part of the program.
1154
00:52:32,182 --> 00:52:34,397
Let's fix that part
of the program
1155
00:52:34,417 --> 00:52:36,600
so we can live past 90 years old
1156
00:52:36,620 --> 00:52:39,169
with an intact, working brain
1157
00:52:39,189 --> 00:52:41,430
to continue the
evolution of our mind.
1158
00:52:42,592 --> 00:52:47,878
That is number one in my
book, because here's a fact:
1159
00:52:47,898 --> 00:52:50,413
Life span's almost 80
right now on average.
1160
00:52:50,433 --> 00:52:54,284
By 85, half of people
will have Alzheimer's.
1161
00:52:54,304 --> 00:52:56,186
Do the math.
1162
00:52:56,206 --> 00:52:57,454
Seventy-four million
Baby Boomers
1163
00:52:57,474 --> 00:52:58,822
headed toward risk age.
1164
00:52:58,842 --> 00:53:01,491
Eighty-five, 50 percent
have Alzheimer's,
1165
00:53:01,511 --> 00:53:03,193
current life span's 80.
1166
00:53:03,213 --> 00:53:04,794
They're gonna be 85 pretty soon.
1167
00:53:04,814 --> 00:53:07,898
Half our population at 85's
gonna have this disease.
1168
00:53:07,918 --> 00:53:09,699
And then keep going
up to 90 and 100,
1169
00:53:09,719 --> 00:53:11,201
and it gets even worse.
1170
00:53:11,221 --> 00:53:13,336
This is enemy number one.
1171
00:53:13,356 --> 00:53:16,462
It's interesting, just this week
1172
00:53:17,594 --> 00:53:20,234
it was discovered that
an Egyptian mummy
1173
00:53:21,264 --> 00:53:23,947
had died of cancer, so even
way back in those times,
1174
00:53:23,967 --> 00:53:25,615
cancer was around.
1175
00:53:25,635 --> 00:53:28,718
What seems to have happened
is, as we have lived longer,
1176
00:53:28,738 --> 00:53:31,888
the number of diseases
that pop up to kill us
1177
00:53:31,908 --> 00:53:34,424
starts to increase,
and the reality is
1178
00:53:34,444 --> 00:53:37,494
I think this is a sort of
whack-a-mole situation
1179
00:53:37,514 --> 00:53:41,364
as we beat cancer to
death and it disappears,
1180
00:53:41,384 --> 00:53:42,632
something else will pop up.
1181
00:53:42,652 --> 00:53:44,601
Cancer is a specific disease,
1182
00:53:44,621 --> 00:53:46,736
and every cancer has
specific genes involved
1183
00:53:46,756 --> 00:53:48,471
together with lifestyle.
1184
00:53:48,491 --> 00:53:52,242
Alzheimer's, specific
disease, specific genetics.
1185
00:53:52,262 --> 00:53:55,278
I can go on and on...
Diabetes, heart disease.
1186
00:53:55,298 --> 00:53:59,416
These are diseases,
and as you get older,
1187
00:53:59,436 --> 00:54:02,686
your susceptibility to
these diseases increases,
1188
00:54:02,706 --> 00:54:04,688
and your genetics will determine
1189
00:54:04,708 --> 00:54:07,457
whether you get them
and when you get them
1190
00:54:07,477 --> 00:54:08,725
given your lifespan.
1191
00:54:08,745 --> 00:54:10,493
That's not aging,
1192
00:54:10,513 --> 00:54:12,796
that's just living long
enough to be susceptible,
1193
00:54:12,816 --> 00:54:17,000
so we may very well eradicate,
in our fantasy world,
1194
00:54:17,020 --> 00:54:20,270
all the cancers and strokes
and heart disease
1195
00:54:20,290 --> 00:54:23,306
and diabetes and Alzheimer's
we get right now
1196
00:54:23,326 --> 00:54:26,543
by 80 or 90 years old, and
then what's gonna happen?
1197
00:54:26,563 --> 00:54:29,613
You live out to 110, and
guess what's gonna happen,
1198
00:54:29,633 --> 00:54:31,548
new, other genetic variants
1199
00:54:31,568 --> 00:54:33,617
suddenly rear their
ugly heads and say,
1200
00:54:33,637 --> 00:54:36,319
"Now we're gonna affect
whether you live to 110"
1201
00:54:36,339 --> 00:54:38,021
without Alzheimer's
and heart disease
1202
00:54:38,041 --> 00:54:39,623
"and cancer and diabetes."
1203
00:54:39,643 --> 00:54:41,418
And it'll go on and
go on and go on.
1204
00:54:43,380 --> 00:54:45,795
There will undoubtedly
be enormous challenges
1205
00:54:45,815 --> 00:54:48,796
concerning the biological
approach to longevity.
1206
00:54:49,819 --> 00:54:52,936
There could, however,
be an alternative route
1207
00:54:52,956 --> 00:54:54,936
to extreme longevity.
1208
00:54:56,426 --> 00:54:58,041
When people are
worried about death,
1209
00:54:58,061 --> 00:54:59,743
I guess the issue is what is it
1210
00:54:59,763 --> 00:55:03,480
that they would like to
have stay alive, okay?
1211
00:55:03,500 --> 00:55:07,317
And I think that's often very
unclear what the answer is,
1212
00:55:07,337 --> 00:55:10,053
because if you look at somebody
like Ray Kurzweil, for example,
1213
00:55:10,073 --> 00:55:11,921
with his promises
of the singularity
1214
00:55:11,941 --> 00:55:13,790
and our merging with
machine intelligence
1215
00:55:13,810 --> 00:55:15,692
and then being able
to kind of have
1216
00:55:15,712 --> 00:55:17,327
this kind of infinite
consciousness
1217
00:55:17,347 --> 00:55:20,397
projected outward
into the Cosmos.
1218
00:55:20,417 --> 00:55:21,865
I don't think he's
imagining a human body
1219
00:55:21,885 --> 00:55:24,834
living forever, okay?
1220
00:55:24,854 --> 00:55:28,338
And if that's what we're
talking about is immortality,
1221
00:55:28,358 --> 00:55:30,607
what I kind of think
Kurzweil is talking about,
1222
00:55:30,627 --> 00:55:32,409
then I can see it.
1223
00:55:32,429 --> 00:55:35,979
I mean, I could see at least
as something to work toward.
1224
00:55:35,999 --> 00:55:40,050
In 2005, Google's
Director of Engineering,
1225
00:55:40,070 --> 00:55:42,652
Ray Kurzweil, published a book
1226
00:55:42,672 --> 00:55:45,555
entitled The
Singularity is Near:
1227
00:55:45,575 --> 00:55:48,758
When Humans Transcend Biology.
1228
00:55:48,778 --> 00:55:52,028
He predicts that by 2045,
1229
00:55:52,048 --> 00:55:54,564
it'll be possible
to upload our minds
1230
00:55:54,584 --> 00:55:57,634
into a computer,
effectively allowing us
1231
00:55:57,654 --> 00:55:59,725
to live indefinitely.
1232
00:56:14,637 --> 00:56:17,087
When people think of
death at this point,
1233
00:56:17,107 --> 00:56:19,422
they think of a body
going into a coffin,
1234
00:56:19,442 --> 00:56:21,725
and the coffin going
into the ground.
1235
00:56:21,745 --> 00:56:25,862
When in fact that age
of death is dying.
1236
00:56:25,882 --> 00:56:28,698
We are ending that stage.
1237
00:56:28,718 --> 00:56:30,433
We're entering a new stage
1238
00:56:30,453 --> 00:56:32,669
where we could possibly
upload consciousness
1239
00:56:32,689 --> 00:56:34,571
into a silicon substrate.
1240
00:56:34,591 --> 00:56:36,773
You know, a lot of these
science fiction ideas
1241
00:56:36,793 --> 00:56:39,642
from decades ago
are becoming real
1242
00:56:39,662 --> 00:56:42,011
and already people are
spending millions of pounds
1243
00:56:42,031 --> 00:56:44,848
on research today
to make this happen.
1244
00:56:44,868 --> 00:56:47,917
Nobody expects to do it
before 2040 at the earliest.
1245
00:56:47,937 --> 00:56:51,554
My guess is 2050, it'll
be a few rich people
1246
00:56:51,574 --> 00:56:54,624
and a few kings and queens here
and there and politicians.
1247
00:56:54,644 --> 00:56:58,094
By 2060-2070, it's
reasonably well-off people.
1248
00:56:58,114 --> 00:57:01,865
By 2075, pretty much
anybody could be immortal.
1249
00:57:01,885 --> 00:57:04,167
I'm not convinced that
uploading my consciousness
1250
00:57:04,187 --> 00:57:07,437
onto a computer is a
form of immortality.
1251
00:57:07,457 --> 00:57:09,773
I would like to live forever,
1252
00:57:09,793 --> 00:57:12,205
but I'm not sure that I
would like to live forever
1253
00:57:12,429 --> 00:57:17,981
as some digital bytes of
memory in a computer.
1254
00:57:18,001 --> 00:57:19,682
I wouldn't call that
living forever.
1255
00:57:19,702 --> 00:57:21,518
There are things I wanna
do with my body
1256
00:57:21,538 --> 00:57:24,212
that I won't be able to
do in that computer.
1257
00:57:24,441 --> 00:57:26,623
Immortality is a question
that keeps arising
1258
00:57:26,643 --> 00:57:29,759
in the technology community,
and it's one which I think
1259
00:57:29,779 --> 00:57:33,196
is entirely feasible
in principle.
1260
00:57:33,216 --> 00:57:34,864
We won't actually
become immortal,
1261
00:57:34,884 --> 00:57:38,501
but what we will do is we
will get the technology
1262
00:57:38,521 --> 00:57:43,072
by around about 2050 to
connect a human brain
1263
00:57:43,092 --> 00:57:46,976
to the machine world so well
that most of your thinking
1264
00:57:46,996 --> 00:57:49,045
is happening inside
the computer world,
1265
00:57:49,065 --> 00:57:50,880
inside the I.T.,
1266
00:57:50,900 --> 00:57:52,615
so your brain is
still being used,
1267
00:57:52,635 --> 00:57:55,585
but 99 percent of your thoughts,
99 percent of your memories
1268
00:57:55,605 --> 00:57:57,243
are actually out
there in the cloud
1269
00:57:57,474 --> 00:57:59,055
or whatever you wanna call it,
1270
00:57:59,075 --> 00:58:00,957
and only one percent
is inside your head.
1271
00:58:00,977 --> 00:58:02,725
So walk into work this morning,
1272
00:58:02,745 --> 00:58:05,562
you get hit by a bus,
it doesn't matter.
1273
00:58:05,582 --> 00:58:09,132
You just upload your
mind into an android
1274
00:58:09,152 --> 00:58:11,668
on Monday morning and carry on
as if nothing had happened.
1275
00:58:11,688 --> 00:58:13,570
The question with that
kind of technology
1276
00:58:13,590 --> 00:58:17,807
and the extension
of human capacity
1277
00:58:17,827 --> 00:58:23,012
and human life
through technology
1278
00:58:23,032 --> 00:58:26,082
is where does the
human, um, end,
1279
00:58:26,102 --> 00:58:27,884
and the technology begin?
1280
00:58:27,904 --> 00:58:31,554
If we upload our
consciousness into a robot,
1281
00:58:31,574 --> 00:58:35,959
a humanoid robot that has
touch, the ability to feel,
1282
00:58:35,979 --> 00:58:39,729
all of the sensorial inputs,
if they're the same,
1283
00:58:39,749 --> 00:58:43,066
there is a potential
of continuity, right?
1284
00:58:43,086 --> 00:58:47,237
So you can have the same
types of experiences
1285
00:58:47,257 --> 00:58:50,206
in that new substrate that
you have as a human being.
1286
00:58:50,226 --> 00:58:52,297
It's beyond a continuity issue,
1287
00:58:52,529 --> 00:58:55,879
it's an issue that you
have the ability to record
1288
00:58:55,899 --> 00:58:58,948
and recall without
the content...
1289
00:58:58,968 --> 00:59:00,884
The content being the
sensations, images
1290
00:59:00,904 --> 00:59:03,553
and feelings and thoughts you
experienced your whole life
1291
00:59:03,573 --> 00:59:04,854
that have associated
with each other
1292
00:59:04,874 --> 00:59:06,256
through your neural network.
1293
00:59:06,276 --> 00:59:07,891
Where are they stored?
1294
00:59:07,911 --> 00:59:11,094
I don't think consciousness
and the brain
1295
00:59:11,114 --> 00:59:12,896
is anything to do
with any particular
1296
00:59:12,916 --> 00:59:14,864
individual region in the brain,
1297
00:59:14,884 --> 00:59:17,600
but it's something
that's all about...
1298
00:59:17,620 --> 00:59:19,903
Its distributed organization.
1299
00:59:19,923 --> 00:59:21,604
I don't think there
are any mysteries.
1300
00:59:21,624 --> 00:59:24,307
There are no causal
mysteries in the brain,
1301
00:59:24,327 --> 00:59:29,612
and I think that
there's a perfectly...
1302
00:59:29,632 --> 00:59:32,048
A comprehensible, physical chain
1303
00:59:32,068 --> 00:59:34,751
of cause and effect that
goes from the things
1304
00:59:34,771 --> 00:59:37,220
that I see and hear around me
1305
00:59:37,240 --> 00:59:39,022
to the words that
come out of my mouth,
1306
00:59:39,042 --> 00:59:43,593
which would encompass
consciousness, I suppose.
1307
00:59:43,613 --> 00:59:44,894
But the things that...
1308
00:59:44,914 --> 00:59:46,763
The moment you say
something like that,
1309
00:59:46,783 --> 00:59:50,800
you're on the edge of the
philosophical precipice.
1310
00:59:50,820 --> 00:59:52,135
When you think about a machine,
1311
00:59:52,155 --> 00:59:55,038
the question is are you
simulating consciousness
1312
00:59:55,058 --> 00:59:57,231
or are you simulating cognition?
1313
00:59:58,828 --> 01:00:03,346
Cognition requires
inputs and reactions
1314
01:00:03,366 --> 01:00:04,814
that are associated
with each other
1315
01:00:04,834 --> 01:00:07,317
to create an output
and an outcome.
1316
01:00:07,337 --> 01:00:08,918
And you can program that all day
1317
01:00:08,938 --> 01:00:11,054
and you can make that
as sophisticated
1318
01:00:11,074 --> 01:00:12,789
and as information-dense
as you want,
1319
01:00:12,809 --> 01:00:15,756
almost to the point that
it mimics a real person.
1320
01:00:17,213 --> 01:00:19,729
But the question is will it
ever have the consciousness
1321
01:00:19,749 --> 01:00:21,698
that our species
with our genetics,
1322
01:00:21,718 --> 01:00:24,000
with our brain has.
1323
01:00:24,020 --> 01:00:27,370
No, a machine has its
own consciousness.
1324
01:00:27,390 --> 01:00:29,706
All you're doing
is programming it
1325
01:00:29,726 --> 01:00:33,176
to be cognitively
responsive the way you are.
1326
01:00:33,196 --> 01:00:35,311
I remember well,
when my father died,
1327
01:00:35,331 --> 01:00:39,649
asking my mother, "if
I could've captured"
1328
01:00:39,669 --> 01:00:42,952
the very being of my
father in a machine,
1329
01:00:42,972 --> 01:00:44,754
and I could put
him in an android
1330
01:00:44,774 --> 01:00:47,414
that looked exactly like
him, had all the mannerisms,
1331
01:00:47,644 --> 01:00:50,960
and it was warm and it smelled
and it felt like him,
1332
01:00:50,980 --> 01:00:53,196
would you do it?" and she
said, "Absolutely not.
1333
01:00:53,216 --> 01:00:55,832
It wouldn't be your father,
it wouldn't be him."
1334
01:00:55,852 --> 01:00:57,800
I think that someday
you can upload
1335
01:00:57,820 --> 01:00:59,731
your current neural network...
1336
01:01:00,923 --> 01:01:03,006
but that's not you.
1337
01:01:03,026 --> 01:01:05,808
That's just your current
neural map, right?
1338
01:01:05,828 --> 01:01:12,281
♪
1339
01:01:12,301 --> 01:01:14,384
As with any concept
that proposes
1340
01:01:14,404 --> 01:01:17,687
to change the natural
order of things,
1341
01:01:17,707 --> 01:01:22,213
the idea of extreme longevity
can be met with disbelief.
1342
01:01:23,746 --> 01:01:26,262
But there is currently an
international movement
1343
01:01:26,282 --> 01:01:29,298
called transhumanism
that is concerned
1344
01:01:29,318 --> 01:01:31,367
with fundamentally transforming
1345
01:01:31,387 --> 01:01:34,837
the human condition by
developing technologies
1346
01:01:34,857 --> 01:01:37,373
to greatly enhance human beings
1347
01:01:37,393 --> 01:01:42,467
in an intellectual, physical,
and psychological capacity.
1348
01:01:42,699 --> 01:01:45,314
I really want to just
simply live indefinitely,
1349
01:01:45,334 --> 01:01:48,217
and not have the Spectre
of Death hanging over me,
1350
01:01:48,237 --> 01:01:50,478
potentially at any
moment taking away
1351
01:01:50,707 --> 01:01:53,222
this thing that we
call existence.
1352
01:01:53,242 --> 01:01:55,458
So for me, that's
the primary goal
1353
01:01:55,478 --> 01:01:57,794
of the transhumanist movement.
1354
01:01:57,814 --> 01:02:00,863
Transhumanists believe
we should use technology
1355
01:02:00,883 --> 01:02:05,101
to overcome our
biological limitations.
1356
01:02:05,121 --> 01:02:06,436
What does that mean?
1357
01:02:06,456 --> 01:02:10,973
Well, very
simplistically, perhaps,
1358
01:02:10,993 --> 01:02:13,443
I think we should be aiming
for what one might call
1359
01:02:13,463 --> 01:02:16,012
a triple S civilization
1360
01:02:16,032 --> 01:02:20,149
of super intelligence,
super longevity,
1361
01:02:20,169 --> 01:02:21,884
and super happiness.
1362
01:02:21,904 --> 01:02:24,320
We have been evolving through
hundreds and hundreds
1363
01:02:24,340 --> 01:02:26,889
of thousands of
years, human beings,
1364
01:02:26,909 --> 01:02:29,392
and transhumanism is
the climax of that.
1365
01:02:29,412 --> 01:02:32,128
It's the result of how
we're going to get
1366
01:02:32,148 --> 01:02:37,033
to some kind of great future
where we are way beyond
1367
01:02:37,053 --> 01:02:38,801
what it means to
be a human being.
1368
01:02:38,821 --> 01:02:44,107
Unfortunately, organic
robots grow old and die,
1369
01:02:44,127 --> 01:02:48,878
and this isn't a choice,
it's completely involuntary.
1370
01:02:48,898 --> 01:02:51,180
120 years from now,
in the absence
1371
01:02:51,200 --> 01:02:54,484
of radical biological
interventions,
1372
01:02:54,504 --> 01:02:58,287
everyone listening to
this video will be dead,
1373
01:02:58,307 --> 01:03:00,548
and not beautifully so,
1374
01:03:00,777 --> 01:03:04,494
but one's last years tend
to be those of decrepitude,
1375
01:03:04,514 --> 01:03:08,064
frequently senility, infirmity,
1376
01:03:08,084 --> 01:03:13,202
and transhumanists don't
accept aging as inevitable.
1377
01:03:13,222 --> 01:03:15,838
There's no immutable
law of nature
1378
01:03:15,858 --> 01:03:18,941
that says that organic
robots must grow old.
1379
01:03:18,961 --> 01:03:21,567
After all, silicon robots,
they don't need to grow old.
1380
01:03:21,798 --> 01:03:25,281
Their parts can be
replaced and upgraded.
1381
01:03:25,301 --> 01:03:26,983
Our bodies are
capable of adjusting
1382
01:03:27,003 --> 01:03:28,914
in ways we've hardly dreamt of.
1383
01:03:30,173 --> 01:03:31,948
If we can only find the key.
1384
01:03:33,109 --> 01:03:35,953
I'm so close now, so very close.
1385
01:03:37,880 --> 01:03:40,163
- The key to what?
- To be able to replace
1386
01:03:40,183 --> 01:03:41,964
diseased and damaged
parts of the body
1387
01:03:41,984 --> 01:03:44,828
as easily as we replace
eye corneas now.
1388
01:03:45,588 --> 01:03:46,896
Can't be done.
1389
01:03:48,291 --> 01:03:49,463
It can be done!
1390
01:03:50,960 --> 01:03:53,910
The relationship
between life and death
1391
01:03:53,930 --> 01:03:58,981
and the role of technology
in forestalling death
1392
01:03:59,001 --> 01:04:02,919
creates death, in a way,
as a new kind of problem.
1393
01:04:02,939 --> 01:04:06,222
Death becomes something
that needs to be solved.
1394
01:04:06,242 --> 01:04:07,356
Why would it be good
to live forever?
1395
01:04:07,376 --> 01:04:08,591
'Cause if you have a shit day,
1396
01:04:08,611 --> 01:04:11,527
it dilutes the depression
1397
01:04:11,547 --> 01:04:14,330
within countless other
days, you know?
1398
01:04:14,350 --> 01:04:19,502
And all of these metrics
are about failing to exist
1399
01:04:19,522 --> 01:04:22,972
in the full light of
your own autonomy.
1400
01:04:22,992 --> 01:04:24,907
That's all they're about,
1401
01:04:24,927 --> 01:04:27,043
and the paradox of
your own autonomy,
1402
01:04:27,063 --> 01:04:29,634
which is you're simultaneously
completely free
1403
01:04:29,866 --> 01:04:32,548
and completely unfree
at the same time.
1404
01:04:32,568 --> 01:04:34,150
I have been to many conferences
1405
01:04:34,170 --> 01:04:36,452
where you got the
anti-transhumanist person
1406
01:04:36,472 --> 01:04:39,088
saying, "This is just
denial of death."
1407
01:04:39,108 --> 01:04:42,225
At the end of the day, that's
all it's about, right?
1408
01:04:42,245 --> 01:04:45,328
And it's a kind of...
The last hangover
1409
01:04:45,348 --> 01:04:46,896
of the Abrahamic religions,
1410
01:04:46,916 --> 01:04:48,197
this idea that we're
gonna, you know,
1411
01:04:48,217 --> 01:04:49,932
come back to God and all this,
1412
01:04:49,952 --> 01:04:52,401
and we're gonna realize
our God-like nature,
1413
01:04:52,421 --> 01:04:57,461
and this is really the last
kind of point for that.
1414
01:04:58,928 --> 01:05:00,376
I think there's a lot
of truth to that,
1415
01:05:00,396 --> 01:05:02,178
especially in terms
of the issues
1416
01:05:02,198 --> 01:05:03,946
we've been talking about
where everybody just seems
1417
01:05:03,966 --> 01:05:06,015
to just take for granted
that if you're given
1418
01:05:06,035 --> 01:05:08,651
the chance to live forever,
you'd live forever.
1419
01:05:08,671 --> 01:05:11,554
I think yes, I think
that that's true.
1420
01:05:11,574 --> 01:05:14,680
I don't think it's...
I think it's true,
1421
01:05:14,911 --> 01:05:16,225
I don't know if
it's as problematic
1422
01:05:16,245 --> 01:05:18,194
as people kind of claim it is.
1423
01:05:18,214 --> 01:05:19,595
In other words, that
there's something wrong
1424
01:05:19,615 --> 01:05:21,364
with having this fear of death
1425
01:05:21,384 --> 01:05:23,099
and wanting to live forever.
1426
01:05:23,119 --> 01:05:25,301
No, I think living forever is...
1427
01:05:25,321 --> 01:05:28,070
I think the question is what
are you doing with your time?
1428
01:05:28,090 --> 01:05:30,306
In what capacity do you
wanna live forever?
1429
01:05:30,326 --> 01:05:32,608
So I do think it makes all
the difference in the world
1430
01:05:32,628 --> 01:05:34,510
whether we're talking
about Kurzweil's way
1431
01:05:34,530 --> 01:05:36,312
or we're talking about
Aubrey de Grey's way.
1432
01:05:36,332 --> 01:05:38,147
The way the human
species operates
1433
01:05:38,167 --> 01:05:41,217
is that we're really never
fully ready for anything.
1434
01:05:41,237 --> 01:05:44,186
However, the prospect
of living indefinitely
1435
01:05:44,206 --> 01:05:48,257
is too promising to turn
down or to slow down
1436
01:05:48,277 --> 01:05:51,360
or to just not go
after at full speed.
1437
01:05:51,380 --> 01:05:53,462
By enabling us to
find technologies
1438
01:05:53,482 --> 01:05:55,531
to live indefinitely,
we're not making it
1439
01:05:55,551 --> 01:05:57,266
so that we're going
to live forever,
1440
01:05:57,286 --> 01:05:59,468
we're just making it so
we have that choice.
1441
01:05:59,488 --> 01:06:01,404
If people wanna pull out of life
1442
01:06:01,424 --> 01:06:03,072
at some point down the future,
1443
01:06:03,092 --> 01:06:04,730
they're certainly
welcome to do that.
1444
01:06:04,961 --> 01:06:08,377
However, it's gonna be great
to eliminate death if we want,
1445
01:06:08,397 --> 01:06:10,468
because everyone
wants that choice.
1446
01:06:12,068 --> 01:06:14,383
There are other
socioeconomic repercussions
1447
01:06:14,403 --> 01:06:17,748
of living longer that
need to be considered.
1448
01:06:17,974 --> 01:06:20,489
The combination of
an aging population
1449
01:06:20,509 --> 01:06:23,392
and the escalating
expenses of healthcare,
1450
01:06:23,412 --> 01:06:26,295
social care, and
retirement is a problem
1451
01:06:26,315 --> 01:06:29,365
that already exists
the world over.
1452
01:06:29,385 --> 01:06:31,367
In the last century alone,
1453
01:06:31,387 --> 01:06:33,602
medicine has massively
contributed
1454
01:06:33,622 --> 01:06:36,068
to increased life expectancy.
1455
01:06:37,460 --> 01:06:40,409
According to the World
Health Organization,
1456
01:06:40,429 --> 01:06:43,512
the number of people
aged 60 years and over
1457
01:06:43,532 --> 01:06:48,050
is expected to increase
from the 605 million today
1458
01:06:48,070 --> 01:06:52,388
to 2 billion by the year 2050.
1459
01:06:52,408 --> 01:06:55,257
As people live longer, they
become more susceptible
1460
01:06:55,277 --> 01:06:58,461
to noncommunicable diseases.
1461
01:06:58,481 --> 01:07:02,331
This becomes
enormously expensive.
1462
01:07:02,351 --> 01:07:08,204
Dementia alone costs the
NHS 23 billion a year.
1463
01:07:08,224 --> 01:07:10,795
Currently, elderly non-workers
1464
01:07:11,027 --> 01:07:14,804
account for a vast portion
of our population
1465
01:07:15,031 --> 01:07:18,501
and a vast portion of our
work force care for them.
1466
01:07:19,668 --> 01:07:23,548
It is economically
beneficial to end aging.
1467
01:07:25,541 --> 01:07:28,557
Social life is organized
around people having...
1468
01:07:28,577 --> 01:07:32,161
Occupying certain roles
at certain ages, right?
1469
01:07:32,181 --> 01:07:34,296
And you can already see
the kinds of problems
1470
01:07:34,316 --> 01:07:36,599
that are caused to
the welfare system
1471
01:07:36,619 --> 01:07:39,668
when people live substantially
beyond the age of 65,
1472
01:07:39,688 --> 01:07:45,074
because when the whole number
65 was selected by Bismarck
1473
01:07:45,094 --> 01:07:47,176
when he started the first
social security system,
1474
01:07:47,196 --> 01:07:50,079
in Germany, the expectation was
that people would be living
1475
01:07:50,099 --> 01:07:52,681
two years beyond
the retirement age
1476
01:07:52,701 --> 01:07:54,450
to be able to get the
social security.
1477
01:07:54,470 --> 01:07:56,619
So it wasn't gonna
break the bank, okay?
1478
01:07:56,639 --> 01:07:59,455
Problem now is you've got
people who are living 20 years
1479
01:07:59,475 --> 01:08:02,091
or more beyond the
retirement age,
1480
01:08:02,111 --> 01:08:03,692
and that's unaffordable.
1481
01:08:03,712 --> 01:08:06,695
There's no question that,
within society as a whole,
1482
01:08:06,715 --> 01:08:10,857
there is an enormous tendency
to knee-jerk reactions
1483
01:08:11,087 --> 01:08:14,170
with regard to the problems
that might be created
1484
01:08:14,190 --> 01:08:17,173
if we were to eliminate aging.
1485
01:08:17,193 --> 01:08:19,308
There have been people
that said, you know,
1486
01:08:19,328 --> 01:08:21,811
"You'll be bored, you won't
have anything to do."
1487
01:08:21,831 --> 01:08:25,147
Speaking from a
place of a lifespan
1488
01:08:25,167 --> 01:08:27,183
that's 80 or 90 years old,
1489
01:08:27,203 --> 01:08:28,784
saying that we're
going to be bored
1490
01:08:28,804 --> 01:08:32,221
if we live to 150 years old
really is just invalid.
1491
01:08:32,241 --> 01:08:34,657
We have no idea what
we'll do with that time.
1492
01:08:34,677 --> 01:08:36,358
Part of this transhumanism stuff,
1493
01:08:36,378 --> 01:08:39,462
where it gets some kind
of real policy traction,
1494
01:08:39,482 --> 01:08:43,399
is people who want us not
to live to be 1,000,
1495
01:08:43,419 --> 01:08:46,235
but maybe if we can
take that 20 years
1496
01:08:46,255 --> 01:08:49,538
that we're living longer now
than we did 100 years ago
1497
01:08:49,558 --> 01:08:51,707
and keep that productive.
1498
01:08:51,727 --> 01:08:53,476
So in other words, if you
could still be strong
1499
01:08:53,496 --> 01:08:56,879
and still be sharp
into your 70s and 80s,
1500
01:08:56,899 --> 01:08:59,715
and so not have to pull
any social security
1501
01:08:59,735 --> 01:09:02,785
until quite late in life
and then you'll be...
1502
01:09:02,805 --> 01:09:04,553
You'll have 20 extra years
1503
01:09:04,573 --> 01:09:07,156
where you're actually
contributing to the economy.
1504
01:09:07,176 --> 01:09:08,624
So one of the areas that we're
gonna have to think about
1505
01:09:08,644 --> 01:09:11,160
in the near future
if we do achieve
1506
01:09:11,180 --> 01:09:14,230
extreme longevity physically
1507
01:09:14,250 --> 01:09:17,366
is the idea of overpopulation.
1508
01:09:17,386 --> 01:09:20,536
This is a controversial
idea, of course,
1509
01:09:20,556 --> 01:09:24,773
and we may face a time period
where we have to say to people,
1510
01:09:24,793 --> 01:09:28,878
"You have to be licensed to
have more than one child."
1511
01:09:28,898 --> 01:09:31,714
The ideas around children, I
hope, will probably change
1512
01:09:31,734 --> 01:09:37,319
when people start to realize
that the values of children
1513
01:09:37,339 --> 01:09:39,622
need to be defined first
before we have them.
1514
01:09:39,642 --> 01:09:41,357
And that's not
something that we do.
1515
01:09:41,377 --> 01:09:44,360
We just have them, and
we don't define why
1516
01:09:44,380 --> 01:09:45,794
or for what purpose.
1517
01:09:45,814 --> 01:09:47,563
I'm not saying there has
to be a defined purpose,
1518
01:09:47,583 --> 01:09:50,232
but I'm saying that just
to continue our gene line
1519
01:09:50,252 --> 01:09:51,700
isn't the biggest reason.
1520
01:09:51,720 --> 01:09:53,435
At the moment, ultimately,
1521
01:09:53,455 --> 01:09:56,872
we see in any society where
fertility rate goes down
1522
01:09:56,892 --> 01:09:59,842
because of female
prosperity and emancipation
1523
01:09:59,862 --> 01:10:02,711
and education, we
also see the age
1524
01:10:02,731 --> 01:10:05,948
of the average
childbirth go up, right?
1525
01:10:05,968 --> 01:10:07,850
We see women having
their children later.
1526
01:10:07,870 --> 01:10:10,886
Now of course, at the moment,
there's a deadline for that,
1527
01:10:10,906 --> 01:10:12,755
but that's not going
to exist anymore,
1528
01:10:12,775 --> 01:10:14,690
because menopause
is part of aging.
1529
01:10:14,710 --> 01:10:17,359
So women who are choosing
to have their children
1530
01:10:17,379 --> 01:10:19,361
a bit later now,
1531
01:10:19,381 --> 01:10:21,263
it stands to reason
that a lot of them
1532
01:10:21,283 --> 01:10:22,831
are probably gonna choose
to have their children
1533
01:10:22,851 --> 01:10:24,934
a lot later and a lot later
and that, of course,
1534
01:10:24,954 --> 01:10:28,270
also has an enormous
depressive impact
1535
01:10:28,290 --> 01:10:31,407
on the trajectory of
global population.
1536
01:10:31,427 --> 01:10:32,875
If we actually said
to everybody, "Okay",
1537
01:10:32,895 --> 01:10:34,944
you're all now gonna
live for 1,000 years,
1538
01:10:34,964 --> 01:10:36,879
we could restructure society
1539
01:10:36,899 --> 01:10:38,914
so it's on these
1,000-year cycles.
1540
01:10:38,934 --> 01:10:42,484
That's possible, but
the problem becomes
1541
01:10:42,504 --> 01:10:46,522
when you still allow people
to live the normal length
1542
01:10:46,542 --> 01:10:49,391
and you're also allowing some
people to live 1,000 years
1543
01:10:49,411 --> 01:10:52,017
then how do you compare
the value of the lives,
1544
01:10:52,248 --> 01:10:53,395
the amount of experience?
1545
01:10:53,415 --> 01:10:56,398
Supposing a 585-year-old guy
1546
01:10:56,418 --> 01:10:58,567
goes up for a job
against a 23-year-old.
1547
01:10:58,587 --> 01:11:00,336
How do you measure
the experience?
1548
01:11:00,356 --> 01:11:01,937
What, the old guy
always gets the job?
1549
01:11:01,957 --> 01:11:05,908
I mean, really, these kinds
of problems would arise
1550
01:11:05,928 --> 01:11:08,477
unless there was some
kind of legislation
1551
01:11:08,497 --> 01:11:11,034
about permissible
variation in age.
1552
01:11:11,267 --> 01:11:12,815
This is a bit of a conundrum
1553
01:11:12,835 --> 01:11:17,045
because we're all
expanding our lifespan,
1554
01:11:17,273 --> 01:11:20,389
and the question is,
would you like to live
1555
01:11:20,409 --> 01:11:22,758
for not 100 years but 200?
1556
01:11:22,778 --> 01:11:24,893
Would you choose
to if you could?
1557
01:11:24,913 --> 01:11:27,296
It would be very, very
difficult to say no.
1558
01:11:27,316 --> 01:11:30,799
The reality is the replacement
of human piece parts
1559
01:11:30,819 --> 01:11:33,569
is probably gonna take
us in that direction,
1560
01:11:33,589 --> 01:11:35,068
but it will be market driven,
1561
01:11:35,291 --> 01:11:37,740
and those people with the
money will be able to afford
1562
01:11:37,760 --> 01:11:39,875
to live a lot longer than
those people without.
1563
01:11:39,895 --> 01:11:41,810
Pretty much most of the
discovery these days
1564
01:11:41,830 --> 01:11:44,413
takes place in Western
Europe or the United States
1565
01:11:44,433 --> 01:11:49,318
or one or two other countries,
China, Singapore, and so on,
1566
01:11:49,338 --> 01:11:52,688
but if they're valuable enough...
And I don't mean monetarily...
1567
01:11:52,708 --> 01:11:55,991
If they're worth having,
then people extend them.
1568
01:11:56,011 --> 01:11:58,327
We have to start somewhere,
and I don't believe
1569
01:11:58,347 --> 01:12:00,429
in the dog-in-the-manger
attitude
1570
01:12:00,449 --> 01:12:01,697
that you don't
give it to anybody
1571
01:12:01,717 --> 01:12:03,766
until you can provide
it for everybody.
1572
01:12:03,786 --> 01:12:06,535
All technologies
are discontinuous.
1573
01:12:06,555 --> 01:12:09,092
There are people at
this very moment
1574
01:12:09,325 --> 01:12:11,040
who are walking four kilometers
1575
01:12:11,060 --> 01:12:14,076
to get a bucket of
water from a well.
1576
01:12:14,096 --> 01:12:17,046
You know, there are people who
are having cornea operations
1577
01:12:17,066 --> 01:12:20,816
that are done with a needle
where it's stuck in their eye
1578
01:12:20,836 --> 01:12:23,085
and their cornea is scraped out.
1579
01:12:23,105 --> 01:12:25,754
You know, so these
ideas of totalizing
1580
01:12:25,774 --> 01:12:28,624
utopian technological
intervention
1581
01:12:28,644 --> 01:12:31,460
are part of a discontinuous
technological world
1582
01:12:31,480 --> 01:12:34,830
and the world will always be
discontinuous technologically.
1583
01:12:34,850 --> 01:12:39,368
When a child has to drink
dirty water, cannot get food,
1584
01:12:39,388 --> 01:12:41,970
and is dying of
starvation and disease,
1585
01:12:41,990 --> 01:12:44,840
and the solution is
just a few dollars,
1586
01:12:44,860 --> 01:12:46,442
there's something badly wrong.
1587
01:12:46,462 --> 01:12:49,478
We need to fix those things.
1588
01:12:49,498 --> 01:12:53,515
The only way that we could
fix them in the past
1589
01:12:53,535 --> 01:12:55,551
would've been at
unbelievable cost
1590
01:12:55,571 --> 01:12:56,919
because of the limitation
1591
01:12:56,939 --> 01:12:59,121
of our industrial
capacity and capability.
1592
01:12:59,141 --> 01:13:00,643
Not anymore.
1593
01:13:05,981 --> 01:13:10,399
Not the one which is...
Which stops us from aging.
1594
01:13:10,419 --> 01:13:15,164
In the last 20 years, healthcare
in Sub-Saharan Africa
1595
01:13:15,391 --> 01:13:16,839
has greatly improved.
1596
01:13:16,859 --> 01:13:17,940
♪
1597
01:13:17,960 --> 01:13:20,909
HIV prevalence has gone down.
1598
01:13:20,929 --> 01:13:23,879
Infant mortality
rate has gone down.
1599
01:13:23,899 --> 01:13:26,882
Immunization rates have gone up,
1600
01:13:26,902 --> 01:13:29,885
and the drug supply in
many areas has risen.
1601
01:13:29,905 --> 01:13:30,986
♪
1602
01:13:31,006 --> 01:13:33,822
However, healthcare
and medication
1603
01:13:33,842 --> 01:13:35,691
in developing countries
1604
01:13:35,711 --> 01:13:39,895
is not always affordable
or even readily available.
1605
01:13:39,915 --> 01:13:41,764
It is not just medicine.
1606
01:13:41,784 --> 01:13:44,196
According to the World
Health Organization,
1607
01:13:44,420 --> 01:13:47,703
over 700 million
people worldwide
1608
01:13:47,723 --> 01:13:51,607
do not have access to
clean drinking water.
1609
01:13:51,627 --> 01:13:54,510
We still live in an age where
over one billion people
1610
01:13:54,530 --> 01:13:56,779
live off less than
a dollar a day
1611
01:13:56,799 --> 01:13:59,481
and live in extreme poverty.
1612
01:13:59,501 --> 01:14:02,484
It is ultimately not
a scientific issue,
1613
01:14:02,504 --> 01:14:04,882
it is a geopolitical issue.
1614
01:15:09,238 --> 01:15:12,020
Now a lot of people, a
lot of philanthropists
1615
01:15:12,040 --> 01:15:16,859
are of the view that the
most important thing to do
1616
01:15:16,879 --> 01:15:20,696
is to address the trailing
edge of quality of life.
1617
01:15:20,716 --> 01:15:23,765
In other words, to help
the disadvantaged.
1618
01:15:23,785 --> 01:15:26,535
But some visionary
philanthropists
1619
01:15:26,555 --> 01:15:29,571
such as the ones that fund
SENS Research Foundation...
1620
01:15:29,591 --> 01:15:31,206
And the fact is I
agree with this,
1621
01:15:31,226 --> 01:15:33,208
and that's why I've put
most of my inheritance
1622
01:15:33,228 --> 01:15:35,844
into SENS Research
Foundation, too...
1623
01:15:35,864 --> 01:15:39,915
We feel that, actually,
in the long run,
1624
01:15:39,935 --> 01:15:43,552
you lose out if you
focus too exclusively
1625
01:15:43,572 --> 01:15:45,120
on the trailing edge.
1626
01:15:45,140 --> 01:15:48,290
You've got also to push
forward the leading edge,
1627
01:15:48,310 --> 01:15:51,757
so that in the long term,
everybody moves forward.
1628
01:16:43,999 --> 01:16:46,848
Due to a lack of funding
from governments,
1629
01:16:46,868 --> 01:16:51,186
anti-aging research is often
pushed into the private sector.
1630
01:16:51,206 --> 01:16:55,090
If we look at funding
for disease...
1631
01:16:55,110 --> 01:16:56,925
Cancer, heart disease...
1632
01:16:56,945 --> 01:16:59,161
They get six billion, eight
billion, ten billion.
1633
01:16:59,181 --> 01:17:02,364
AIDS still gets two
to four billion.
1634
01:17:02,384 --> 01:17:04,700
Let's look at
Alzheimer's disease.
1635
01:17:04,720 --> 01:17:06,969
It's probably the most
important disease of aging,
1636
01:17:06,989 --> 01:17:10,672
the brain... it gets
under a half a billion
1637
01:17:10,692 --> 01:17:11,907
from the federal government,
1638
01:17:11,927 --> 01:17:14,076
and a lot of that
goes to programs
1639
01:17:14,096 --> 01:17:15,944
that are tied up
with pharma trials
1640
01:17:15,964 --> 01:17:18,113
where we don't really
see it in the labs,
1641
01:17:18,133 --> 01:17:20,849
so you're maybe down to two
or three hundred million.
1642
01:17:20,869 --> 01:17:24,219
It's not nearly enough
to really make a dent.
1643
01:17:24,239 --> 01:17:29,958
The question is, why when it
comes to Alzheimer's disease,
1644
01:17:29,978 --> 01:17:32,127
which is a problem
in the elderly
1645
01:17:32,147 --> 01:17:34,229
do we just see it as...
The federal government
1646
01:17:34,249 --> 01:17:36,398
seems to see it as a
red-haired stepchild.
1647
01:17:36,418 --> 01:17:39,334
They don't take care of it.
1648
01:17:39,354 --> 01:17:41,003
Some people say it's because...
1649
01:17:41,023 --> 01:17:42,204
People say, "Well, it
affects old people",
1650
01:17:42,224 --> 01:17:44,431
they lived their
lives, let them go."
1651
01:17:44,660 --> 01:17:47,142
No one wants to admit that,
but maybe subconsciously,
1652
01:17:47,162 --> 01:17:49,077
when Congress is
thinking about this,
1653
01:17:49,097 --> 01:17:50,979
that's at play, who knows?
1654
01:17:50,999 --> 01:17:53,915
Um, maybe it's much
more compelling
1655
01:17:53,935 --> 01:17:57,986
to wanna put money into diseases
that affect young people
1656
01:17:58,006 --> 01:18:00,122
who still have their
whole life to live
1657
01:18:00,142 --> 01:18:02,090
when they have AIDS
or breast cancer
1658
01:18:02,110 --> 01:18:04,793
or cancer that can strike
somebody at 30 or 40 years old.
1659
01:18:04,813 --> 01:18:07,162
Age might be part of it, and
even if it's something
1660
01:18:07,182 --> 01:18:09,164
where you'd say, "No,
it can't be that!"
1661
01:18:09,184 --> 01:18:10,966
you never know what's
happening subconsciously
1662
01:18:10,986 --> 01:18:12,801
in those who are
making the decisions.
1663
01:18:12,821 --> 01:18:14,403
Otherwise, it just
makes no sense at all.
1664
01:18:14,423 --> 01:18:16,104
I don't know how to explain it.
1665
01:18:16,124 --> 01:18:17,939
When you talk to
people about aging
1666
01:18:17,959 --> 01:18:20,976
and rejuvenation medicine,
you're talking about things
1667
01:18:20,996 --> 01:18:23,412
that they haven't put
in the same category
1668
01:18:23,432 --> 01:18:25,147
as things that they can fight.
1669
01:18:25,167 --> 01:18:26,314
They are willing to put money
1670
01:18:26,334 --> 01:18:28,283
towards solving cancer
and curing cancer.
1671
01:18:28,303 --> 01:18:30,852
It's something they might have
the potential of experiencing.
1672
01:18:30,872 --> 01:18:33,889
But the thing that's 100 percent
in terms of probability,
1673
01:18:33,909 --> 01:18:36,091
they haven't classified that
as in the same category
1674
01:18:36,111 --> 01:18:38,293
when actually it is and
actually it's more dramatic
1675
01:18:38,313 --> 01:18:40,796
because 100 percent of
people experience it.
1676
01:18:40,816 --> 01:18:43,932
You need to have the
will to be cured.
1677
01:18:43,952 --> 01:18:48,136
Beyond that, medical
science will play its part.
1678
01:18:48,156 --> 01:18:51,106
I think it's essentially a crime
1679
01:18:51,126 --> 01:18:53,742
to not support life
extension science,
1680
01:18:53,762 --> 01:18:55,477
because if you support
the other side,
1681
01:18:55,497 --> 01:18:58,146
you're an advocate
for killing someone.
1682
01:18:58,166 --> 01:19:02,818
When you actually support
a culture of death,
1683
01:19:02,838 --> 01:19:06,154
when you support
embracing death,
1684
01:19:06,174 --> 01:19:09,991
what you're really doing is not
supporting embracing life.
1685
01:19:10,011 --> 01:19:11,927
Everyone ought to be healthy,
1686
01:19:11,947 --> 01:19:14,496
however long ago they were born.
1687
01:19:14,516 --> 01:19:17,399
When someone says, "Oh, dear,
we shouldn't defeat aging",
1688
01:19:17,419 --> 01:19:19,301
we shouldn't work to
eliminate aging,"
1689
01:19:19,321 --> 01:19:22,104
what they're actually saying
is they're not in favor
1690
01:19:22,124 --> 01:19:24,940
of healthcare for the elderly,
or to be more precise,
1691
01:19:24,960 --> 01:19:27,142
what they're saying is
they're only in favor
1692
01:19:27,162 --> 01:19:28,977
of healthcare for the elderly
1693
01:19:28,997 --> 01:19:31,980
so long as it doesn't
work very well.
1694
01:19:32,000 --> 01:19:34,071
And I think that's fucked up.
1695
01:19:35,103 --> 01:19:37,786
In September 2013,
1696
01:19:37,806 --> 01:19:41,123
Google announced the
conception of Calico,
1697
01:19:41,143 --> 01:19:43,225
an independent biotech company
1698
01:19:43,245 --> 01:19:46,556
that remains to this day
a little mysterious.
1699
01:19:46,782 --> 01:19:51,099
Its aim is to tackle aging
and devise interventions
1700
01:19:51,119 --> 01:19:55,437
that enable people to lead
longer and healthier lives.
1701
01:19:55,457 --> 01:19:58,106
In September 2014,
1702
01:19:58,126 --> 01:20:01,076
the life extension company
announced it was partnering
1703
01:20:01,096 --> 01:20:04,913
with biopharmaceutical
giant AbbVie,
1704
01:20:04,933 --> 01:20:10,418
and made a $1.5 billion
investment into research.
1705
01:20:10,438 --> 01:20:12,254
I think one of the
biggest obstacles
1706
01:20:12,274 --> 01:20:14,823
that we have at the
moment to come to terms
1707
01:20:14,843 --> 01:20:17,192
with this future world
we're talking about
1708
01:20:17,212 --> 01:20:19,327
is a lot of people who basically
1709
01:20:19,347 --> 01:20:21,963
don't want it to happen at all
1710
01:20:21,983 --> 01:20:25,300
and so are placing
all kinds of ethical
1711
01:20:25,320 --> 01:20:27,369
and institutional restrictions
1712
01:20:27,389 --> 01:20:29,271
on the development of this stuff
1713
01:20:29,291 --> 01:20:31,306
so that it becomes difficult,
let's say, in universities
1714
01:20:31,326 --> 01:20:33,942
to experiment with certain
kinds of drugs, right?
1715
01:20:33,962 --> 01:20:36,144
To develop certain kinds
of machines maybe even,
1716
01:20:36,164 --> 01:20:38,446
and as a result, all of
that kind of research
1717
01:20:38,466 --> 01:20:41,116
ends up going into either
the private sector
1718
01:20:41,136 --> 01:20:43,985
or maybe underground, right?
1719
01:20:44,005 --> 01:20:46,354
Or going into some country
that's an ethics-free zone
1720
01:20:46,374 --> 01:20:48,490
like China or
someplace like that,
1721
01:20:48,510 --> 01:20:50,325
and I think that's where
the real problems
1722
01:20:50,345 --> 01:20:53,428
potentially lie, because we
really need to be doing,
1723
01:20:53,448 --> 01:20:55,230
you know, we need to be
developing this stuff,
1724
01:20:55,250 --> 01:20:57,332
but in the public eye, right?
1725
01:20:57,352 --> 01:21:00,435
So it should be done by the
mainstream authorities
1726
01:21:00,455 --> 01:21:02,204
so we can monitor
the consequences
1727
01:21:02,224 --> 01:21:03,471
as they're happening
and then be able
1728
01:21:03,491 --> 01:21:05,173
to take appropriate action.
1729
01:21:05,193 --> 01:21:06,541
But I'm afraid that
a lot of this stuff
1730
01:21:06,561 --> 01:21:10,011
is perhaps being driven outside
1731
01:21:10,031 --> 01:21:12,147
because of all the restrictions
that are placed on it.
1732
01:21:12,167 --> 01:21:13,915
That I think is very worrisome,
1733
01:21:13,935 --> 01:21:15,517
because then you can't
keep track of the results,
1734
01:21:15,537 --> 01:21:18,186
and you don't know exactly
what's happening.
1735
01:21:18,206 --> 01:21:20,288
And I think that's a
real problem already
1736
01:21:20,308 --> 01:21:22,219
with a lot of this
more futuristic stuff.
1737
01:21:24,412 --> 01:21:26,328
Arguably, the human condition
1738
01:21:26,348 --> 01:21:29,564
is defined by our
anxiety of death.
1739
01:21:29,584 --> 01:21:32,167
It's of little wonder
that throughout history,
1740
01:21:32,187 --> 01:21:35,437
mankind has built
countless belief systems
1741
01:21:35,457 --> 01:21:38,340
in a bid to pacify
the fear of death
1742
01:21:38,360 --> 01:21:41,643
through the promise
of endless paradise.
1743
01:21:41,663 --> 01:21:45,480
Ultimately, death always wins.
1744
01:21:45,500 --> 01:21:50,285
It's not so much death
we fear, it's dying.
1745
01:21:50,305 --> 01:21:52,387
The relationship
between life and death
1746
01:21:52,407 --> 01:21:57,225
is often figured in
terms of immortality
1747
01:21:57,245 --> 01:21:59,194
and the quest for immortality.
1748
01:21:59,214 --> 01:22:01,496
There's a philosopher, a
brilliant philosopher
1749
01:22:01,516 --> 01:22:05,300
called Stephen Cave, who,
in his book Immortality,
1750
01:22:05,320 --> 01:22:09,371
argues that our fear of
death is the great driver
1751
01:22:09,391 --> 01:22:13,541
of all civilization,
of all human endeavor,
1752
01:22:13,561 --> 01:22:16,978
and he identifies
four different ways
1753
01:22:16,998 --> 01:22:20,415
in which people
seek immortality,
1754
01:22:20,435 --> 01:22:24,552
so firstly the idea
of extending life,
1755
01:22:24,572 --> 01:22:26,554
of living forever.
1756
01:22:26,574 --> 01:22:28,990
Secondly, the idea
of resurrection
1757
01:22:29,010 --> 01:22:33,328
so that we might come back
after death in some form.
1758
01:22:33,348 --> 01:22:36,965
Thirdly, the idea
of the immortality
1759
01:22:36,985 --> 01:22:41,169
of some part of ourselves
beyond the physical body.
1760
01:22:41,189 --> 01:22:44,372
So perhaps the immortality
of the soul, for example,
1761
01:22:44,392 --> 01:22:47,208
or living on in Heaven.
1762
01:22:47,228 --> 01:22:51,613
And finally the idea
of leaving a legacy.
1763
01:22:51,633 --> 01:22:54,482
I think that one of
life's challenges
1764
01:22:54,502 --> 01:22:56,584
really actually is
to come to terms
1765
01:22:56,604 --> 01:22:59,621
with our own finitude
and mortality
1766
01:22:59,641 --> 01:23:02,023
and human limitations,
1767
01:23:02,043 --> 01:23:05,460
and this is an
enormous challenge.
1768
01:23:05,480 --> 01:23:13,480
♪
1769
01:23:21,529 --> 01:23:26,047
Technology, a reflection
of our times.
1770
01:23:26,067 --> 01:23:31,286
Efficient, computerized, with
a sleek beauty all its own.
1771
01:23:31,306 --> 01:23:33,786
Technology is the
human imagination
1772
01:23:34,009 --> 01:23:36,191
converted into reality.
1773
01:23:36,211 --> 01:23:38,226
We are all interested
in the future,
1774
01:23:38,246 --> 01:23:40,462
for that is where you and
I are going to spend
1775
01:23:40,482 --> 01:23:42,530
the rest of our lives.
1776
01:23:42,550 --> 01:23:45,133
(high-pitched tone)
1777
01:23:45,153 --> 01:23:52,507
♪
1778
01:23:52,527 --> 01:23:54,576
It's impossible to say for sure
1779
01:23:54,596 --> 01:23:57,512
where these new
technologies will take us
1780
01:23:57,532 --> 01:24:01,583
and how we will prepare to
implement them into society.
1781
01:24:01,603 --> 01:24:04,452
It is likely that they will
affect the sensibilities
1782
01:24:04,472 --> 01:24:06,509
of global infrastructure.
1783
01:24:08,076 --> 01:24:12,227
There are always anxieties
surrounding new technologies,
1784
01:24:12,247 --> 01:24:14,596
and this time is no exception.
1785
01:24:14,616 --> 01:24:15,730
♪
1786
01:24:15,750 --> 01:24:18,133
I think people fear change,
1787
01:24:18,153 --> 01:24:21,102
and so the future represents
this enormous amount
1788
01:24:21,122 --> 01:24:23,505
of change that's coming at us.
1789
01:24:23,525 --> 01:24:25,373
I do think it's
overwhelming for people,
1790
01:24:25,393 --> 01:24:28,676
you know, they are
afraid to change
1791
01:24:28,696 --> 01:24:30,412
the paradigm that they live in,
1792
01:24:30,432 --> 01:24:33,214
and when we talk about the
future of work and death,
1793
01:24:33,234 --> 01:24:35,483
what we're really talking
about is changing
1794
01:24:35,503 --> 01:24:37,585
a paradigm that has
existed for us
1795
01:24:37,605 --> 01:24:38,820
as long as we can remember.
1796
01:24:38,840 --> 01:24:42,090
All of this kind
of scaremongering
1797
01:24:42,110 --> 01:24:46,161
about harm and risk
and stuff like that
1798
01:24:46,181 --> 01:24:50,432
really is... it's based on a
kind of psychological illusion,
1799
01:24:50,452 --> 01:24:54,636
namely that you imagine that
you kind of see the bad state
1800
01:24:54,656 --> 01:24:57,472
as a bad state when it happens,
1801
01:24:57,492 --> 01:25:00,141
whereas in fact, what
more likely happens
1802
01:25:00,161 --> 01:25:02,777
is that you kinda get adjusted
to the various changes
1803
01:25:02,797 --> 01:25:04,412
that are happening
in your environment
1804
01:25:04,432 --> 01:25:06,147
so that when you actually
do reach that state
1805
01:25:06,167 --> 01:25:09,818
we're talking about,
it'll seem normal.
1806
01:25:09,838 --> 01:25:12,120
And because, look,
when the automobile
1807
01:25:12,140 --> 01:25:14,489
was introduced in the
early 20th century,
1808
01:25:14,509 --> 01:25:16,524
people were saying this
is just gonna pump
1809
01:25:16,544 --> 01:25:18,393
a lot of smoke into
the atmosphere,
1810
01:25:18,413 --> 01:25:20,295
it's going to ruin our
contact with nature
1811
01:25:20,315 --> 01:25:21,995
'cause we'll be in these
enclosed vehicles,
1812
01:25:22,117 --> 01:25:23,498
we'll be going so fast,
1813
01:25:23,518 --> 01:25:25,300
we won't be able to
appreciate things,
1814
01:25:25,320 --> 01:25:27,402
there'll be congestion,
blah, blah, blah,
1815
01:25:27,422 --> 01:25:29,370
they were right!
1816
01:25:29,390 --> 01:25:30,638
They were right, but of course,
1817
01:25:30,658 --> 01:25:32,841
by the time you
get to that state
1818
01:25:32,861 --> 01:25:35,210
where the automobile
has had that impact,
1819
01:25:35,230 --> 01:25:36,744
it's also had all
this benefit as well,
1820
01:25:36,764 --> 01:25:39,214
and your whole life has been
kind of restructured around it.
1821
01:25:39,234 --> 01:25:42,150
Arguably, people who are using
1822
01:25:42,170 --> 01:25:47,522
or have been conceived using
in vitro fertilization
1823
01:25:47,542 --> 01:25:53,328
are cyborgs way before they
were ever even people.
1824
01:25:53,348 --> 01:25:56,397
Now that doesn't mean that
we understand kinship
1825
01:25:56,417 --> 01:25:58,233
in a radically different way.
1826
01:25:58,253 --> 01:25:59,868
Just look at the
Industrial Revolution!
1827
01:25:59,888 --> 01:26:03,438
Is anyone actually... Does
anyone actually regret
1828
01:26:03,458 --> 01:26:05,306
that the Industrial
Revolution occurred?
1829
01:26:05,326 --> 01:26:07,609
No, it was fairly
turbulent, right?
1830
01:26:07,629 --> 01:26:10,578
You know, we did actually
have a little bit of strife
1831
01:26:10,598 --> 01:26:13,281
in the translation from
a pre-industrial world
1832
01:26:13,301 --> 01:26:14,582
to the world we know today.
1833
01:26:14,602 --> 01:26:16,618
But the fact is, we adapted.
1834
01:26:16,638 --> 01:26:18,920
The most important thing
here is to try to compare it
1835
01:26:18,940 --> 01:26:20,788
to something in the past.
1836
01:26:20,808 --> 01:26:24,526
Imagine we were... It was
1914, 100 years back,
1837
01:26:24,546 --> 01:26:26,928
and I told you that most
people on the planet
1838
01:26:26,948 --> 01:26:28,530
would have the ability to have
1839
01:26:28,550 --> 01:26:30,465
this tiny cell phone
screen in front of them
1840
01:26:30,485 --> 01:26:33,434
and video conference with ten
of their friends all at once.
1841
01:26:33,454 --> 01:26:36,271
If it was 1914, you would
look at me and say,
1842
01:26:36,291 --> 01:26:38,273
"That's absurd,
this guy's insane."
1843
01:26:38,293 --> 01:26:40,608
However, it's the
sort of same concept
1844
01:26:40,628 --> 01:26:42,343
when I tell you now in 50 years
1845
01:26:42,363 --> 01:26:44,612
we're going to be digital
beings of ourselves,
1846
01:26:44,632 --> 01:26:46,281
it's not so far-fetched.
1847
01:26:46,301 --> 01:26:48,349
You have to look at it in
the historical context.
1848
01:26:48,369 --> 01:26:51,419
All concepts of
technological progress
1849
01:26:51,439 --> 01:26:55,590
in that way are linked to
post-enlightenment ideas
1850
01:26:55,610 --> 01:26:58,393
or non... they're
linked to the idea
1851
01:26:58,413 --> 01:27:01,563
of the arrow of time being
in free flight forward,
1852
01:27:01,583 --> 01:27:05,600
but they're also chiliastic,
they propose an end state.
1853
01:27:05,620 --> 01:27:06,901
They propose the end state,
1854
01:27:06,921 --> 01:27:08,670
and the end state
is the singularity,
1855
01:27:08,690 --> 01:27:11,339
but they propose it as
something desirable.
1856
01:27:11,359 --> 01:27:14,776
Now any kind of philosophy
like that, it's, you know,
1857
01:27:14,796 --> 01:27:18,513
jam tomorrow, jam yesterday,
but never jam today.
1858
01:27:18,533 --> 01:27:20,915
They're all philosophies
that are about
1859
01:27:20,935 --> 01:27:24,419
accept the shit you're in...
Work, consume, die...
1860
01:27:24,439 --> 01:27:26,654
Because there is something
better in the future,
1861
01:27:26,674 --> 01:27:28,590
or there's something more
innovative in the future.
1862
01:27:28,610 --> 01:27:30,925
There are good scenarios and
there are bad scenarios.
1863
01:27:30,945 --> 01:27:32,560
I don't know where we're headed.
1864
01:27:32,580 --> 01:27:35,897
I mean, I don't think
anyone really knows.
1865
01:27:35,917 --> 01:27:37,899
You know, if anyone claims
to know the future,
1866
01:27:37,919 --> 01:27:39,634
they're guessing, they're
extrapolating forward,
1867
01:27:39,654 --> 01:27:41,302
and we draw some
lines and curves
1868
01:27:41,322 --> 01:27:42,804
and see where
technology's gonna be.
1869
01:27:42,824 --> 01:27:44,639
What that means,
1870
01:27:44,659 --> 01:27:46,307
I don't think any of
us really understand.
1871
01:27:46,327 --> 01:27:48,977
Everyone assumes that the
future is going to be
1872
01:27:48,997 --> 01:27:50,678
dramatically
different from today
1873
01:27:50,698 --> 01:27:52,041
and that's absolutely true,
1874
01:27:52,267 --> 01:27:53,915
but it's also true that
the future will be
1875
01:27:53,935 --> 01:27:55,617
an extension of today's world.
1876
01:27:55,637 --> 01:27:58,486
The problems that exist
in today's world
1877
01:27:58,506 --> 01:28:01,389
are still gonna be
with us in the future.
1878
01:28:01,409 --> 01:28:03,391
Human nature is
not gonna change.
1879
01:28:03,411 --> 01:28:06,027
The end point in
all of this game
1880
01:28:06,047 --> 01:28:08,696
will become a bit of a moral
1881
01:28:08,716 --> 01:28:12,367
and an ethical
question for society
1882
01:28:12,387 --> 01:28:14,526
where decisions will
have to be made.
1883
01:28:15,657 --> 01:28:19,674
Like life itself, work and
death, for better or worse,
1884
01:28:19,694 --> 01:28:21,843
are two features of
the human experience
1885
01:28:21,863 --> 01:28:23,934
that are thrust upon us.
1886
01:28:25,400 --> 01:28:27,715
Whether or not we
define work and death
1887
01:28:27,735 --> 01:28:29,984
as problems in need of remedy,
1888
01:28:30,004 --> 01:28:33,855
human ingenuity is a progressive
and natural extension
1889
01:28:33,875 --> 01:28:35,857
of our own evolution.
1890
01:28:35,877 --> 01:28:43,698
♪
1891
01:28:43,718 --> 01:28:46,834
Advancing our technological
capabilities is a way
1892
01:28:46,854 --> 01:28:50,438
of dealing with our
limitations as human beings.
1893
01:28:50,458 --> 01:28:51,973
♪
1894
01:28:51,993 --> 01:28:53,675
Must we do something
1895
01:28:53,695 --> 01:28:56,411
just because we're capable
of doing something?
1896
01:28:56,431 --> 01:28:58,546
Or can we withhold
our hands and say,
1897
01:28:58,566 --> 01:29:00,982
"No, this is not a
good thing to do"?
1898
01:29:01,002 --> 01:29:04,018
This is something that
the human species
1899
01:29:04,038 --> 01:29:05,820
must decide for itself.
1900
01:29:05,840 --> 01:29:08,623
You and I, we can't just
leave it to the scientists.
1901
01:29:08,643 --> 01:29:10,850
We have to know what's
going on and why!
1902
01:29:14,649 --> 01:29:17,649
♪