1
00:00:01,070 --> 00:00:03,244
Viewers like you make
this program possible.
2
00:00:03,279 --> 00:00:05,350
Support your local PBS station.
3
00:00:11,425 --> 00:00:15,325
♪
4
00:00:15,360 --> 00:00:17,603
MILES O'BRIEN:
Machines that think like humans.
5
00:00:17,638 --> 00:00:21,124
Our dream to create machines
in our own image
6
00:00:21,159 --> 00:00:23,816
that are smart and intelligent
7
00:00:23,851 --> 00:00:26,371
goes back to antiquity.
8
00:00:26,405 --> 00:00:28,476
Well, can it bring it to me?
9
00:00:28,511 --> 00:00:30,685
O'BRIEN:
Is it possible that the dream
of artificial intelligence
10
00:00:30,720 --> 00:00:33,067
has become reality?
11
00:00:34,172 --> 00:00:35,759
They're able to do things
12
00:00:35,794 --> 00:00:38,107
that we didn't
think they could do.
13
00:00:39,729 --> 00:00:43,353
MANOLIS KELLIS:
Go was thought to be a game
where machines would never win.
14
00:00:43,388 --> 00:00:46,770
The number of choices
for every move is enormous.
15
00:00:46,805 --> 00:00:50,878
O'BRIEN:
And now, the possibilities
seem endless.
16
00:00:50,912 --> 00:00:52,673
MUSTAFA SULEYMAN:
And this is going to be
17
00:00:52,707 --> 00:00:54,709
one of the greatest boosts
18
00:00:54,744 --> 00:00:57,160
to productivity in the history
of our species.
19
00:00:59,300 --> 00:01:02,786
That looks like just a hint
of some type of smoke.
20
00:01:02,821 --> 00:01:05,134
O'BRIEN:
Identifying problems
before a human can...
21
00:01:05,168 --> 00:01:07,377
LECIA SEQUIST:
We taught the model to recognize
22
00:01:07,412 --> 00:01:09,655
developing lung cancer.
23
00:01:09,690 --> 00:01:12,831
O'BRIEN:
...and inventing new drugs.
24
00:01:12,865 --> 00:01:14,936
PETRINA KAMYA:
I never thought
that we would be able
25
00:01:14,971 --> 00:01:16,869
to be doing the things
we're doing with A.I.
26
00:01:16,904 --> 00:01:19,217
O'BRIEN:
But along with the hope...
27
00:01:19,251 --> 00:01:21,046
[imitating Obama]:
This is a dangerous time.
28
00:01:21,081 --> 00:01:23,945
O'BRIEN:
...comes deep concern.
29
00:01:23,980 --> 00:01:25,740
One of the first drops
in the feared flood
30
00:01:25,775 --> 00:01:27,328
of A.I.-created disinformation.
31
00:01:27,363 --> 00:01:31,367
We have lowered barriers
to entry to manipulate reality.
32
00:01:31,401 --> 00:01:34,094
We're going to live in a world
where we don't know what's real.
33
00:01:34,128 --> 00:01:37,994
The risks are uncertain
and potentially enormous.
34
00:01:38,028 --> 00:01:42,171
O'BRIEN:
How powerful is A.I.?
How does it work?
35
00:01:42,205 --> 00:01:44,759
And how can we reap
its extraordinary benefits...
36
00:01:44,794 --> 00:01:46,209
Sybil looked here,
37
00:01:46,244 --> 00:01:48,936
and anticipated
that there would be a problem.
38
00:01:48,970 --> 00:01:51,145
O'BRIEN:
...without jeopardizing
our future?
39
00:01:51,180 --> 00:01:52,457
"A.I. Revolution"
40
00:01:52,491 --> 00:01:55,218
right now, on "NOVA!"
41
00:01:55,253 --> 00:01:58,359
[whirring]
42
00:01:58,394 --> 00:02:07,955
♪
43
00:02:07,989 --> 00:02:11,061
Tell me the backstory
on Inflection A.I.
44
00:02:11,096 --> 00:02:15,238
[voiceover]:
Our story begins
with the making of this story.
45
00:02:15,273 --> 00:02:20,209
PI [on computer]:
The story of Inflection A.I.
is an exciting one.
46
00:02:20,243 --> 00:02:21,279
O'BRIEN [voiceover]:
I was researching
47
00:02:21,313 --> 00:02:22,452
an interview subject.
48
00:02:22,487 --> 00:02:25,006
Who is Mustafa Suleyman?
49
00:02:25,041 --> 00:02:26,594
[voiceover]:
Something I've done
50
00:02:26,629 --> 00:02:29,045
a thousand times
in my 40-year career.
51
00:02:29,079 --> 00:02:30,874
PI [on computer]:
Mustafa Suleyman is a
true pioneer
52
00:02:30,909 --> 00:02:33,325
in the field
of artificial intelligence.
53
00:02:33,360 --> 00:02:35,534
[voiceover]:
But this time, it was different:
54
00:02:35,569 --> 00:02:38,675
I wasn't typing out
search terms.
55
00:02:38,710 --> 00:02:41,471
What is machine learning?
56
00:02:41,506 --> 00:02:44,888
O'BRIEN [voiceover]:
I was having a conversation
with a computer.
57
00:02:44,923 --> 00:02:47,028
PI:
Sounds like an
exciting project, Miles.
58
00:02:47,063 --> 00:02:50,446
[voiceover]:
It felt like something
big had changed.
59
00:02:50,480 --> 00:02:54,139
PI:
Machine learning, ML, is a type
of artificial intelligence.
60
00:02:54,174 --> 00:02:55,658
O'BRIEN [voiceover]:
And as it happened,
61
00:02:55,692 --> 00:03:00,456
I was focused on one of the
innovators of this revolution.
62
00:03:00,490 --> 00:03:02,009
Okay, so if I do this...
63
00:03:02,043 --> 00:03:05,012
[voiceover]:
Mustafa Suleyman is co-founder
64
00:03:05,046 --> 00:03:07,911
of a startup called Inflection.
65
00:03:07,946 --> 00:03:10,086
It makes an artificial
intelligence assistant
66
00:03:10,120 --> 00:03:11,743
called Pi.
67
00:03:11,777 --> 00:03:13,158
So now you can speak...
68
00:03:13,193 --> 00:03:16,920
O'BRIEN [voiceover]:
I met them both in London.
69
00:03:16,955 --> 00:03:18,405
It's fundamentally
different, isn't it?
70
00:03:18,439 --> 00:03:19,923
Yeah,
it's a conversational style.
71
00:03:19,958 --> 00:03:22,374
All of us humans
learn through stories,
72
00:03:22,409 --> 00:03:25,239
and through narrative, and
through interactive dialogue.
73
00:03:25,274 --> 00:03:27,897
And now, the machine can
kind of come alive,
74
00:03:27,931 --> 00:03:29,657
and talk to you
about whatever it is
75
00:03:29,692 --> 00:03:31,072
that's on top of your mind.
76
00:03:31,107 --> 00:03:35,215
O'BRIEN:
Tell me about the PBS program
"NOVA."
77
00:03:35,249 --> 00:03:38,563
[voiceover]:
Chatbots can offer up
quick answers, write poems,
78
00:03:38,597 --> 00:03:41,151
finish essays,
and translate languages
79
00:03:41,186 --> 00:03:43,292
among many other things.
80
00:03:43,326 --> 00:03:46,018
PI [over phone]:
"NOVA" is a science
documentary series...
81
00:03:46,053 --> 00:03:47,503
O'BRIEN [voiceover]:
They aren't perfect,
82
00:03:47,537 --> 00:03:49,781
but they have put artificial
intelligence in our hands,
83
00:03:49,815 --> 00:03:52,232
and into
the public consciousness.
84
00:03:52,266 --> 00:03:55,787
And it seems
we're equal parts leery
85
00:03:55,821 --> 00:03:57,409
and intrigued.
86
00:03:57,444 --> 00:03:59,135
SULEYMAN:
A.I. is a tool
87
00:03:59,169 --> 00:04:02,725
for helping us to understand
the world around us,
88
00:04:02,759 --> 00:04:06,936
predict what's likely to happen,
and then invent
89
00:04:06,970 --> 00:04:09,973
solutions that help improve
the world around us.
90
00:04:10,008 --> 00:04:13,494
My motivation was to try
to use A.I. tools
91
00:04:13,529 --> 00:04:15,772
to, uh, you know,
invent the future.
92
00:04:15,807 --> 00:04:18,465
The rise
in artificial intelligence...
93
00:04:18,499 --> 00:04:20,294
REPORTER:
A.I. technology is developing...
94
00:04:20,329 --> 00:04:23,780
O'BRIEN [voiceover]:
Lately, it seems a dark future
is already here...
95
00:04:23,815 --> 00:04:27,957
The technology could replace
millions of jobs...
96
00:04:27,991 --> 00:04:29,717
O'BRIEN [voiceover]:
...if you listen
to the news reporting.
97
00:04:29,752 --> 00:04:32,927
The moment civilization
was transformed.
98
00:04:32,962 --> 00:04:35,378
O'BRIEN [voiceover]:
So how can
artificial intelligence help us,
99
00:04:35,413 --> 00:04:37,760
and how might it hurt us?
100
00:04:37,794 --> 00:04:40,935
At the center of
the public handwringing:
101
00:04:40,970 --> 00:04:44,939
how should we put
guardrails around it?
102
00:04:44,974 --> 00:04:47,873
We definitely need
more regulations in place...
103
00:04:47,908 --> 00:04:49,289
O'BRIEN [voiceover]:
Artificial intelligence
is moving fast
104
00:04:49,323 --> 00:04:51,187
and changing the world.
105
00:04:51,221 --> 00:04:52,740
Can we keep up?
106
00:04:52,775 --> 00:04:54,673
Non-human minds
smarter than our own.
107
00:04:54,708 --> 00:04:57,055
O'BRIEN [voiceover]:
The news coverage may make it
seem like
108
00:04:57,089 --> 00:04:59,782
artificial intelligence
is something new.
109
00:04:59,816 --> 00:05:01,784
At a moment of revolution...
110
00:05:01,818 --> 00:05:04,234
O'BRIEN [voiceover]:
But human beings have been
thinking about this
111
00:05:04,269 --> 00:05:07,410
for a very long time.
112
00:05:07,445 --> 00:05:11,276
I have a very fine brain.
113
00:05:11,311 --> 00:05:15,591
Our dream to create machines
in our own image
114
00:05:15,625 --> 00:05:19,664
that are smart and intelligent
goes back to antiquity.
115
00:05:19,698 --> 00:05:22,149
Uh, it's,
it's something that has,
116
00:05:22,183 --> 00:05:26,947
has permeated the evolution
of society and of science.
117
00:05:26,981 --> 00:05:29,363
[mortars firing]
118
00:05:29,398 --> 00:05:31,779
O'BRIEN [voiceover]:
The modern origins
of artificial intelligence
119
00:05:31,814 --> 00:05:33,954
can be traced
back to World War II,
120
00:05:33,988 --> 00:05:38,372
and the prodigious
human brain of Alan Turing.
121
00:05:38,407 --> 00:05:41,099
The legendary
British mathematician
122
00:05:41,133 --> 00:05:43,101
developed a machine
123
00:05:43,135 --> 00:05:47,208
capable of deciphering
coded messages from the Nazis.
124
00:05:47,243 --> 00:05:51,281
After the war, he was among
the first to predict computers
125
00:05:51,316 --> 00:05:54,526
might one day match
the human brain.
126
00:05:54,561 --> 00:05:57,529
There are no surviving
recordings of Turing's voice,
127
00:05:57,564 --> 00:06:03,155
but in 1951, he gave
a short lecture on BBC radio.
128
00:06:03,190 --> 00:06:07,746
We asked an A.I.-generated voice
to read a passage.
129
00:06:07,781 --> 00:06:10,059
TURING A.I. VOICE:
I think it is probable,
for instance,
130
00:06:10,093 --> 00:06:12,061
that at the end of the century,
131
00:06:12,095 --> 00:06:14,132
it will be possible
to program a machine
132
00:06:14,166 --> 00:06:16,099
to answer questions
in such a way
133
00:06:16,134 --> 00:06:18,205
that it will be extremely
difficult to guess
134
00:06:18,239 --> 00:06:20,276
whether the answers are being
given by a man
135
00:06:20,310 --> 00:06:22,382
or by the machine.
136
00:06:22,416 --> 00:06:25,281
O'BRIEN [voiceover]:
And so,
the Turing test was born.
137
00:06:25,315 --> 00:06:27,421
Could anyone build a machine
138
00:06:27,456 --> 00:06:29,768
that could converse
with a human in a way
139
00:06:29,803 --> 00:06:32,840
that is indistinguishable
from another person?
140
00:06:32,875 --> 00:06:36,154
In 1956,
141
00:06:36,188 --> 00:06:38,432
a group of pioneering scientists
spent the summer
142
00:06:38,467 --> 00:06:41,262
brainstorming
at Dartmouth College.
143
00:06:42,298 --> 00:06:44,404
And they told the world that
they have coined
144
00:06:44,438 --> 00:06:46,164
a new academic field of study.
145
00:06:46,198 --> 00:06:48,304
They called it
artificial intelligence
146
00:06:48,338 --> 00:06:51,825
O'BRIEN [voiceover]:
For decades,
their aspirations remained
147
00:06:51,859 --> 00:06:54,621
far ahead of
the capabilities of computers.
148
00:06:56,277 --> 00:06:57,969
In 1978,
149
00:06:58,003 --> 00:07:02,836
"NOVA" released its first film
on artificial intelligence.
150
00:07:02,870 --> 00:07:04,700
We have seen the first
crude beginnings
151
00:07:04,734 --> 00:07:06,426
of artificial intelligence...
152
00:07:06,460 --> 00:07:08,289
O'BRIEN [voiceover]:
And the legendary science
fiction writer,
153
00:07:08,324 --> 00:07:12,811
Arthur C. Clarke was,
as always, prescient.
154
00:07:12,846 --> 00:07:14,434
It doesn't really exist yet at
any level,
155
00:07:14,468 --> 00:07:18,472
because our most complex
computers are still morons,
156
00:07:18,507 --> 00:07:21,406
high-speed morons,
but still morons.
157
00:07:21,441 --> 00:07:24,202
Nevertheless, we have
the possibility of machines
158
00:07:24,236 --> 00:07:26,480
which can outpace their
creators,
159
00:07:26,515 --> 00:07:31,002
and therefore,
become more intelligent than us.
160
00:07:32,244 --> 00:07:36,317
At the time, researchers were
developing "expert systems,"
161
00:07:36,352 --> 00:07:41,495
purpose-built
to perform specific tasks.
162
00:07:41,530 --> 00:07:43,083
So the thing that we need to do
163
00:07:43,117 --> 00:07:47,639
to make machine understand, um,
you know, our world,
164
00:07:47,674 --> 00:07:50,573
is to put all our knowledge
into a machine
165
00:07:50,608 --> 00:07:53,438
and then provide it
with some rules.
166
00:07:53,473 --> 00:07:55,405
♪
167
00:07:55,440 --> 00:07:58,512
O'BRIEN [voiceover]:
Classic A.I. reached a pivotal
moment in 1997
168
00:07:58,547 --> 00:08:02,689
when an artificial intelligence
program devised by IBM,
169
00:08:02,723 --> 00:08:05,761
called "Deep Blue" defeated
world chess champion
170
00:08:05,795 --> 00:08:09,109
and grandmaster Garry Kasparov.
171
00:08:09,143 --> 00:08:13,147
It searched about 200 million
positions a second,
172
00:08:13,182 --> 00:08:15,564
navigating through
a tree of possibilities
173
00:08:15,598 --> 00:08:18,083
to determine the best move.
174
00:08:18,118 --> 00:08:20,534
RUS:
The program analyzed
the board configuration,
175
00:08:20,569 --> 00:08:23,537
could project forward
millions of moves
176
00:08:23,572 --> 00:08:26,126
to examine millions of
possibilities,
177
00:08:26,160 --> 00:08:28,646
and then picked the best path.
178
00:08:28,680 --> 00:08:31,372
O'BRIEN [voiceover]:
Effective, but brittle,
179
00:08:31,407 --> 00:08:35,411
Deep Blue wasn't
strategizing as a human does.
180
00:08:35,445 --> 00:08:38,310
From the outset, artificial
intelligence researchers
181
00:08:38,345 --> 00:08:41,037
imagined making machines
182
00:08:41,072 --> 00:08:42,798
that think like us.
183
00:08:42,832 --> 00:08:46,077
The human brain, with
more than 80 billion neurons,
184
00:08:46,111 --> 00:08:48,976
learns not by following rules,
185
00:08:49,011 --> 00:08:52,255
but rather by taking
in a steady stream of data,
186
00:08:52,290 --> 00:08:54,637
and looking for patterns.
187
00:08:56,018 --> 00:08:58,330
KELLIS:
The way that learning
actually works
188
00:08:58,365 --> 00:09:01,195
in the human brain is by
updating the weights
189
00:09:01,230 --> 00:09:02,852
of the synaptic connections
190
00:09:02,887 --> 00:09:04,716
that are underlying this
neural network.
191
00:09:04,751 --> 00:09:08,617
O'BRIEN [voiceover]:
Manolis Kellis is a
Professor of Computer Science
192
00:09:08,651 --> 00:09:12,966
at the Massachusetts Institute
of Technology.
193
00:09:13,000 --> 00:09:14,968
So we have trillions
of parameters in our brain
194
00:09:15,002 --> 00:09:17,108
that we can adjust
based on experience.
195
00:09:17,142 --> 00:09:19,006
I'm getting a reward.
196
00:09:19,041 --> 00:09:20,801
I will update
the strength of the connections
197
00:09:20,836 --> 00:09:22,872
that led to this reward--
I'm getting punished,
198
00:09:22,907 --> 00:09:24,633
I will diminish the strength
of the connections
199
00:09:24,667 --> 00:09:26,324
that led to the punishment.
200
00:09:26,358 --> 00:09:28,498
So this is
the original neural network.
201
00:09:28,533 --> 00:09:32,054
We did not invent it,
we, you know, we inherited it.
202
00:09:32,088 --> 00:09:36,161
O'BRIEN [voiceover]:
But could an artificial
neural network
203
00:09:36,196 --> 00:09:39,199
be made in our own image?
Turing imagined it.
204
00:09:39,233 --> 00:09:41,511
But computers were nowhere near
205
00:09:41,546 --> 00:09:45,067
powerful enough to do it
until recently.
206
00:09:46,689 --> 00:09:48,484
It's only with the advent
of extraordinary data sets
207
00:09:48,518 --> 00:09:51,142
that we have, uh,
since the early 2000s,
208
00:09:51,176 --> 00:09:54,145
that we were able to build up
enough images,
209
00:09:54,179 --> 00:09:55,733
enough annotations,
210
00:09:55,767 --> 00:09:58,839
enough text to be able
to finally train
211
00:09:58,874 --> 00:10:01,842
these sufficiently powerful
models.
212
00:10:03,361 --> 00:10:05,708
O'BRIEN [voiceover]:
An artificial neural network is,
in fact,
213
00:10:05,743 --> 00:10:08,262
modeled on the human brain.
214
00:10:08,297 --> 00:10:11,611
It uses interconnected nodes,
or neurons,
215
00:10:11,645 --> 00:10:13,889
that communicate with
each other.
216
00:10:13,923 --> 00:10:16,650
Each node receives
inputs from other nodes
217
00:10:16,685 --> 00:10:20,585
and processes those inputs
to produce outputs,
218
00:10:20,620 --> 00:10:24,140
which are then passed on to
still other nodes.
219
00:10:24,175 --> 00:10:27,523
It learns by adjusting
the strength of the connections
220
00:10:27,557 --> 00:10:32,010
between the nodes based on
the data it is exposed to.
221
00:10:32,045 --> 00:10:34,495
This process
of adjusting the connections
222
00:10:34,530 --> 00:10:36,394
is called training,
223
00:10:36,428 --> 00:10:38,845
and it allows an
artificial neural network
224
00:10:38,879 --> 00:10:42,227
to recognize patterns
and learn from its experiences
225
00:10:42,262 --> 00:10:44,505
like humans do.
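To make that concrete, here is a minimal sketch in Python of the mechanism just described (an illustration of the general idea, not code from any system in this program): nodes combine weighted inputs, and training nudges the connection strengths to shrink the error on example data.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 8)) * 0.5   # 2 inputs + bias -> 8 hidden nodes
W2 = rng.normal(size=(8, 1)) * 0.5   # 8 hidden nodes -> 1 output node

def forward(X):
    h = np.tanh(X @ W1)              # hidden nodes combine their inputs
    return h, h @ W2                 # output node combines hidden nodes

# Toy task: XOR, a pattern no single straight-line rule captures.
# The third input column is a constant bias of 1.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], float)
y = np.array([[0.], [1.], [1.], [0.]])

for step in range(4000):
    h, out = forward(X)
    err = out - y                    # how wrong each guess was
    W2 -= 0.1 * h.T @ err            # adjust output connections...
    W1 -= 0.1 * X.T @ ((err @ W2.T) * (1 - h ** 2))  # ...and hidden ones

print(np.round(forward(X)[1].ravel(), 2))   # approaches [0, 1, 1, 0]
```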
226
00:10:46,197 --> 00:10:47,681
A child,
how is it learning so fast?
227
00:10:47,716 --> 00:10:49,372
It is learning so fast
228
00:10:49,407 --> 00:10:51,685
because it's constantly
predicting the future
229
00:10:51,720 --> 00:10:54,101
and then seeing what happens
230
00:10:54,136 --> 00:10:57,208
and updating their weights in
their neural network
231
00:10:57,242 --> 00:10:59,210
based on what just happened.
232
00:10:59,244 --> 00:11:00,521
Now you can take this
233
00:11:00,556 --> 00:11:01,937
self-supervised learning
paradigm
234
00:11:01,971 --> 00:11:04,146
and apply it to machines.
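Kellis's predict-compare-update loop can be reduced to a toy example with a single adjustable connection (entirely made-up data, for illustration only):

```python
# A sketch of self-supervised learning: predict the next moment,
# see what actually happens, and update the weight by the surprise.
import numpy as np

w = 0.0                                    # one adjustable "synapse"
stream = np.sin(np.linspace(0, 20, 200))   # a stream of experience

for t in range(len(stream) - 1):
    prediction = w * stream[t]             # guess what comes next
    outcome = stream[t + 1]                # observe what happens
    surprise = outcome - prediction        # prediction error
    w += 0.1 * surprise * stream[t]        # adjust the connection

print(round(w, 3))   # settles near the true step-to-step relationship
```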
235
00:11:05,699 --> 00:11:08,909
O'BRIEN [voiceover]:
At first, some of these
artificial neural networks
236
00:11:08,944 --> 00:11:11,705
were trained on vintage
Atari video games
237
00:11:11,740 --> 00:11:13,707
like "Space Invaders"
238
00:11:13,742 --> 00:11:16,710
and "Breakout."
239
00:11:16,745 --> 00:11:19,920
Games reduce the complexity
of the real world
240
00:11:19,955 --> 00:11:23,613
to a very narrow set
of actions that can be taken.
241
00:11:23,648 --> 00:11:26,168
O'BRIEN [voiceover]:
Before he started Inflection,
242
00:11:26,202 --> 00:11:29,274
Mustafa Suleyman co-founded
a company called
243
00:11:29,309 --> 00:11:31,760
DeepMind in 2010.
244
00:11:31,794 --> 00:11:35,764
It was acquired by Google
four years later.
245
00:11:35,798 --> 00:11:37,179
When an A.I. plays a game,
246
00:11:37,213 --> 00:11:40,769
we show it frame-by-frame,
every pixel
247
00:11:40,803 --> 00:11:42,943
in the moving image.
248
00:11:42,978 --> 00:11:44,876
And so the A.I. learns
to associate pixels
249
00:11:44,911 --> 00:11:46,982
with actions that it can take:
250
00:11:47,016 --> 00:11:50,813
moving left or right
or pressing the fire button.
251
00:11:52,194 --> 00:11:55,369
O'BRIEN [voiceover]:
When it obliterates blocks
or shoots aliens,
252
00:11:55,404 --> 00:11:58,752
the connections between the
nodes that enabled that success
253
00:11:58,787 --> 00:12:00,512
are strengthened.
254
00:12:00,547 --> 00:12:02,860
In other words, it is rewarded.
255
00:12:02,894 --> 00:12:05,897
When it fails, no reward.
256
00:12:05,932 --> 00:12:08,589
Eventually,
all those reinforced connections
257
00:12:08,624 --> 00:12:10,833
overrule the weaker ones.
258
00:12:10,868 --> 00:12:13,733
The program has learned
how to win.
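That reward loop can be sketched in a few lines (a toy stand-in; DeepMind's Atari agents learned from raw pixels with deep networks, which this deliberately omits):

```python
# Toy reinforcement learning: actions that keep earning reward are
# strengthened until they overrule the weaker alternatives.
import random

q = {"left": 0.0, "right": 0.0, "fire": 0.0}    # value of each action

def play(action):
    # Hypothetical game: "fire" usually destroys an alien (reward 1).
    return 1.0 if action == "fire" and random.random() < 0.8 else 0.0

for episode in range(2000):
    if random.random() < 0.1:                   # sometimes explore...
        action = random.choice(list(q))
    else:                                       # ...mostly exploit
        action = max(q, key=q.get)
    reward = play(action)
    q[action] += 0.05 * (reward - q[action])    # reinforce toward reward

print(max(q, key=q.get))                        # after training: "fire"
```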
259
00:12:15,458 --> 00:12:17,840
This sort of repeated allocation
of reward
260
00:12:17,875 --> 00:12:22,362
for repetitive behavior
is a great way to train a dog.
261
00:12:22,396 --> 00:12:24,260
It's a great way to teach a kid.
262
00:12:24,295 --> 00:12:27,022
It's a great way for us
as adults to adapt our behavior.
263
00:12:27,056 --> 00:12:29,438
And in fact,
it's actually a good way
264
00:12:29,472 --> 00:12:32,130
to train machine learning
algorithms to get better.
265
00:12:34,926 --> 00:12:38,136
O'BRIEN [voiceover]:
In 2014, DeepMind began work
on an artificial neural network
266
00:12:38,171 --> 00:12:40,760
called "AlphaGo"
267
00:12:40,794 --> 00:12:42,209
that could play the ancient,
268
00:12:42,244 --> 00:12:45,178
and deceptively complex,
board game of Go.
269
00:12:47,007 --> 00:12:50,424
KELLIS:
Go was thought to be a game
where machines would never win.
270
00:12:50,459 --> 00:12:53,807
The number of choices
for every move is enormous.
271
00:12:53,842 --> 00:12:55,913
O'BRIEN [voiceover]:
But at DeepMind,
272
00:12:55,947 --> 00:12:57,707
they were counting on
273
00:12:57,742 --> 00:13:02,022
the astounding growth
of compute power.
274
00:13:02,057 --> 00:13:04,611
And I think that's the key
concept to try to grasp,
275
00:13:04,645 --> 00:13:09,133
is that we are massively,
exponentially growing
276
00:13:09,167 --> 00:13:11,963
the amount of computation used,
and in some sense,
277
00:13:11,998 --> 00:13:14,690
that computation is a proxy
278
00:13:14,724 --> 00:13:17,727
for how intelligent
the model is.
279
00:13:18,867 --> 00:13:22,663
O'BRIEN [voiceover]:
AlphaGo was trained two ways.
280
00:13:22,698 --> 00:13:25,459
First, it was fed a large
data set of expert Go games
281
00:13:25,494 --> 00:13:28,842
so that it could
learn how to play the game.
282
00:13:28,877 --> 00:13:31,120
This is known
as supervised learning.
283
00:13:31,155 --> 00:13:36,643
Then the software played against
itself many millions of times,
284
00:13:36,677 --> 00:13:39,370
so-called
reinforcement learning.
285
00:13:39,404 --> 00:13:42,822
This gradually improved
its skills and strategies.
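Schematically, the two phases look like this (a toy stand-in game under my own assumptions, not DeepMind's code): imitate expert moves first, then reinforce whatever self-play shows to be a winning choice.

```python
# Phase 1: supervised learning from "expert" moves.
# Phase 2: reinforcement learning from self-play results.
import random

prefs = {}                                 # (state, move) -> preference

def choose(state, explore=0.0):
    if random.random() < explore:
        return random.choice([0, 1])       # stand-in for Go's move set
    return max([0, 1], key=lambda m: prefs.get((state, m), 0.0))

for state in range(50):                    # phase 1: imitate experts
    prefs[(state, state % 2)] = 1.0        # hypothetical expert dataset

for _ in range(5000):                      # phase 2: self-play
    state = random.randrange(50)
    move = choose(state, explore=0.1)
    reward = 1.0 if move == state % 2 else -1.0   # stand-in win/loss
    prefs[(state, move)] = prefs.get((state, move), 0.0) + 0.1 * reward
```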
286
00:13:42,856 --> 00:13:45,617
In March 2016,
287
00:13:45,652 --> 00:13:47,378
AlphaGo faced Lee Sedol,
288
00:13:47,412 --> 00:13:49,345
one of the world's
top-ranking players,
289
00:13:49,380 --> 00:13:52,866
in a five-game match in
Seoul, South Korea.
290
00:13:52,901 --> 00:13:54,937
AlphaGo not only won,
291
00:13:54,972 --> 00:13:59,217
but also made a move so novel,
the Go cognoscenti
292
00:13:59,252 --> 00:14:02,151
thought it was a huge blunder.
That's a very surprising move.
293
00:14:03,912 --> 00:14:06,017
There's no question to me
that these A.I. models
294
00:14:06,052 --> 00:14:07,777
are creative.
295
00:14:07,812 --> 00:14:10,608
They're incredibly creative.
296
00:14:10,642 --> 00:14:14,715
O'BRIEN [voiceover]:
It turns out the move
was a stroke of brilliance.
297
00:14:14,750 --> 00:14:16,717
And this
emergent creative behavior
298
00:14:16,752 --> 00:14:18,754
was a hint of what was to come:
299
00:14:18,788 --> 00:14:21,895
generative A.I.
300
00:14:21,930 --> 00:14:23,655
Meanwhile,
301
00:14:23,690 --> 00:14:26,072
a company called OpenAI
was creating
302
00:14:26,106 --> 00:14:28,074
a generative A.I. model
303
00:14:28,108 --> 00:14:31,215
that would become ChatGPT.
304
00:14:31,249 --> 00:14:33,355
It allows users
to engage in a dialogue
305
00:14:33,389 --> 00:14:37,221
with a machine
that seems uncannily human.
306
00:14:37,255 --> 00:14:39,913
It was first released in 2018,
307
00:14:39,948 --> 00:14:43,917
but it was a subsequent version
that became a global sensation
308
00:14:43,952 --> 00:14:46,230
in late 2022.
309
00:14:46,264 --> 00:14:48,957
This promises to be
the viral sensation
310
00:14:48,991 --> 00:14:51,649
that could completely reset
how we do things.
311
00:14:51,683 --> 00:14:53,651
Cranking out entire essays
312
00:14:53,685 --> 00:14:55,446
in a matter of seconds.
313
00:14:55,480 --> 00:14:58,345
O'BRIEN [voiceover]:
Not only did it wow the public,
it also caught
314
00:14:58,380 --> 00:15:01,659
artificial intelligence
innovators off guard.
315
00:15:03,005 --> 00:15:05,076
YOSHUA BENGIO:
It surprised me a lot
316
00:15:05,111 --> 00:15:07,216
that they're able
to do things that
317
00:15:07,251 --> 00:15:10,737
we didn't think they could do
simply by
318
00:15:10,771 --> 00:15:14,948
learning to imitate
how humans respond.
319
00:15:14,983 --> 00:15:18,503
And I thought this
kind of abilities would take
320
00:15:18,538 --> 00:15:21,299
many more years or decades.
321
00:15:21,334 --> 00:15:25,027
O'BRIEN [voiceover]:
ChatGPT is
a large language model.
322
00:15:25,062 --> 00:15:29,273
LLMs start by consuming massive
amounts of text:
323
00:15:29,307 --> 00:15:31,309
books, articles and websites,
324
00:15:31,344 --> 00:15:34,209
which are publicly available on
the internet.
325
00:15:34,243 --> 00:15:37,867
By recognizing patterns
in billions of words,
326
00:15:37,902 --> 00:15:41,078
they can make guesses
at the next word in a sentence.
327
00:15:41,112 --> 00:15:44,357
That's how ChatGPT
generates unique answers
328
00:15:44,391 --> 00:15:46,497
to your questions.
329
00:15:46,531 --> 00:15:49,051
If I ask for a haiku
about the blue sky,
330
00:15:49,086 --> 00:15:53,883
it writes something
that seems completely original.
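Underneath, the training signal is guess-the-next-word. A tiny bigram model shows the mechanics (my own toy example; real LLMs use transformer networks trained on billions of words):

```python
# Learn word-to-word patterns, then generate by repeatedly sampling
# a likely next word.
from collections import Counter, defaultdict
import random

text = "the blue sky meets the blue sea under the blue sky".split()

counts = defaultdict(Counter)
for word, nxt in zip(text, text[1:]):
    counts[word][nxt] += 1                 # count observed patterns

def next_word(word):
    words, freqs = zip(*counts[word].items())
    return random.choices(words, weights=freqs)[0]

out = ["the"]
for _ in range(6):
    out.append(next_word(out[-1]))         # guess the next word
print(" ".join(out))                       # e.g. "the blue sky meets..."
```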
331
00:15:53,918 --> 00:15:55,782
KELLIS:
If you're good at predicting
332
00:15:55,816 --> 00:15:57,715
this next word,
333
00:15:57,749 --> 00:15:59,751
it means you're understanding
something about the sentence.
334
00:15:59,786 --> 00:16:02,444
What the style
of the sentence is,
335
00:16:02,478 --> 00:16:05,171
what the feeling
of the sentence is.
336
00:16:05,205 --> 00:16:08,795
And you can't tell whether
this was a human or a machine.
337
00:16:08,829 --> 00:16:10,831
That's basically the definition
of the Turing test.
338
00:16:10,866 --> 00:16:14,663
O'BRIEN [voiceover]:
So, how is this changing
our world?
339
00:16:14,697 --> 00:16:18,356
Well, it might change my world--
as an arm amputee.
340
00:16:18,391 --> 00:16:20,393
Ready for my casting call, right?
341
00:16:20,427 --> 00:16:21,704
MONROE [chuckling]:
Yes.
342
00:16:21,739 --> 00:16:23,499
Let's do it.
All right.
343
00:16:23,534 --> 00:16:25,294
O'BRIEN [voiceover]:
That's Brian Monroe of
the Hanger Clinic.
344
00:16:25,329 --> 00:16:26,606
He's been my prosthetist
345
00:16:26,640 --> 00:16:29,540
since an injury
took my arm above the elbow
346
00:16:29,574 --> 00:16:31,266
ten years ago.
347
00:16:31,300 --> 00:16:33,820
So what we're going to do today
is take a mold of your arm.
Uh-huh.
348
00:16:33,854 --> 00:16:36,202
Kind of is like
a cast for a broken bone.
349
00:16:36,236 --> 00:16:40,275
O'BRIEN [voiceover]:
Up until now, I have used
a body-powered prosthetic.
350
00:16:40,309 --> 00:16:43,416
A harness and a cable allow me
to move it
351
00:16:43,450 --> 00:16:45,211
by shrugging my shoulders.
352
00:16:45,245 --> 00:16:49,353
The technology is
more than a century old.
353
00:16:49,387 --> 00:16:51,148
But artificial intelligence,
354
00:16:51,182 --> 00:16:53,978
coupled with small
electric motors,
355
00:16:54,013 --> 00:16:58,569
is finally pushing prosthetics
into the 21st century.
356
00:17:00,088 --> 00:17:02,262
Which brings me to Chicago
357
00:17:02,297 --> 00:17:05,576
and the offices of
a small company called Coapt.
358
00:17:05,610 --> 00:17:08,613
I met the C.E.O., Blair Lock,
359
00:17:08,648 --> 00:17:12,479
a pioneer in the push
to apply artificial intelligence
360
00:17:12,514 --> 00:17:16,518
to artificial limbs.
361
00:17:16,552 --> 00:17:18,968
So, what do we have here?
What are we going to do?
362
00:17:19,003 --> 00:17:22,041
This allows us to very easily
test how your control would be
363
00:17:22,075 --> 00:17:25,113
using a pretty simple cuff;
this has electrodes in it,
364
00:17:25,147 --> 00:17:27,080
and we'll let the power
of the electronics
365
00:17:27,115 --> 00:17:28,633
that are doing
the machine learning
366
00:17:28,668 --> 00:17:30,773
see what you're capable of.
All right, let's give it a try.
367
00:17:30,808 --> 00:17:33,086
[voiceover]:
Like most amputees,
368
00:17:33,121 --> 00:17:37,159
I feel my missing hand almost
as if it was still there--
369
00:17:37,194 --> 00:17:38,643
a phantom.
370
00:17:38,678 --> 00:17:40,335
Everything will touch.
Is that okay?
371
00:17:40,369 --> 00:17:41,405
Yeah.
Not too tight?
372
00:17:41,439 --> 00:17:43,200
No. All good.
Okay.
373
00:17:43,234 --> 00:17:45,409
O'BRIEN [voiceover]:
It's almost entirely immobile,
stuck in molasses.
374
00:17:45,443 --> 00:17:47,963
Make a fist, not too hard.
375
00:17:47,997 --> 00:17:52,105
O'BRIEN [voiceover]:
But I am able to imagine
moving it ever so slightly.
376
00:17:52,140 --> 00:17:53,796
And I'm gonna have you squeeze
into that a little bit harder.
377
00:17:53,831 --> 00:17:56,523
Very good, and I see the
pattern on the screen
378
00:17:56,558 --> 00:17:57,800
change a little bit.
379
00:17:57,835 --> 00:17:59,354
O'BRIEN [voiceover]:
And when I do,
380
00:17:59,388 --> 00:18:02,288
I generate an array of faint
electrical signals in my stump.
381
00:18:02,322 --> 00:18:04,117
That's your muscle information.
382
00:18:04,152 --> 00:18:06,015
It feels,
it feels like I'm overcoming
383
00:18:06,050 --> 00:18:07,914
something that's really stuck.
384
00:18:07,948 --> 00:18:09,398
I don't know,
is that enough signal?
385
00:18:09,433 --> 00:18:11,331
Should be.
Oh, okay.
386
00:18:11,366 --> 00:18:12,574
We don't need a lot of signal,
387
00:18:12,608 --> 00:18:13,920
we're going for information
388
00:18:13,954 --> 00:18:15,784
in the signal,
not how loud it is.
389
00:18:15,818 --> 00:18:18,511
O'BRIEN [voiceover]:
And this is where artificial
intelligence comes in.
390
00:18:20,306 --> 00:18:23,585
Using a virtual
prosthetic depicted on a screen,
391
00:18:23,619 --> 00:18:27,313
I trained a machine learning
algorithm to become fluent
392
00:18:27,347 --> 00:18:31,834
in the language
of my nerves and muscles.
393
00:18:31,869 --> 00:18:33,457
We see eight different signals
on the screen.
394
00:18:33,491 --> 00:18:35,907
All eight of those
sensor sites
395
00:18:35,942 --> 00:18:37,426
are going to
feed in together
396
00:18:37,461 --> 00:18:39,152
and let the algorithm
sort out the data.
397
00:18:39,187 --> 00:18:41,189
What you are experiencing
398
00:18:41,223 --> 00:18:43,846
is your ability
to teach the system
399
00:18:43,881 --> 00:18:45,676
what is hand-closed to you.
400
00:18:45,710 --> 00:18:47,678
And that's different
than what it would be to me.
401
00:18:47,712 --> 00:18:52,096
O'BRIEN [voiceover]:
I told the software
what motion I desired,
402
00:18:52,131 --> 00:18:54,443
open, close, or rotate,
403
00:18:54,478 --> 00:18:58,585
then imagined moving
my phantom limb accordingly.
404
00:18:58,620 --> 00:19:00,794
This generates an array
of electromyographic,
405
00:19:00,829 --> 00:19:03,659
or EMG, signals in
my remaining muscles.
406
00:19:03,694 --> 00:19:07,353
I was training the A.I.
to connect the pattern
407
00:19:07,387 --> 00:19:10,183
of these electrical signals
with a specific movement.
408
00:19:12,427 --> 00:19:13,738
LOCK:
The system adapts,
409
00:19:13,773 --> 00:19:16,327
and as you add more data
and use it over time,
410
00:19:16,362 --> 00:19:18,329
it becomes more robust,
411
00:19:18,364 --> 00:19:22,161
and it learns
to improve upon use.
412
00:19:22,195 --> 00:19:24,542
O'BRIEN:
Is it me that's learning, or
the algorithm that's learning?
413
00:19:24,577 --> 00:19:26,648
Or are we learning together?
LOCK:
You're learning together.
414
00:19:26,682 --> 00:19:27,718
Okay.
415
00:19:28,753 --> 00:19:32,309
O'BRIEN [voiceover]:
So, how does the Coapt pattern
recognition system work?
416
00:19:32,343 --> 00:19:37,210
It's called a Bayesian
classification model.
417
00:19:37,245 --> 00:19:39,005
As I train the software,
418
00:19:39,039 --> 00:19:41,628
it labels my
various EMG patterns
419
00:19:41,663 --> 00:19:44,321
into corresponding
classes of movement--
420
00:19:44,355 --> 00:19:48,566
hand open, hand closed,
wrist rotation, for example.
421
00:19:48,601 --> 00:19:51,224
As I use the arm,
422
00:19:51,259 --> 00:19:53,882
it compares the electrical
signals I'm transmitting
423
00:19:53,916 --> 00:19:57,713
to the existing library
of classifications I taught it.
424
00:19:57,748 --> 00:20:00,682
It relies on
statistical probability
425
00:20:00,716 --> 00:20:03,512
to choose the best match.
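In Python, the idea might be sketched like this (an illustration of Bayesian classification in general, on synthetic data; not Coapt's actual implementation):

```python
# Label eight-channel EMG patterns into movement classes, then pick
# the statistically most probable class for each new pattern.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
classes = ["hand_open", "hand_closed", "wrist_rotate"]

# Training data: each imagined movement yields a distinctive
# (synthetic) pattern across eight sensor sites.
centers = rng.normal(size=(3, 8))
X = np.vstack([c + 0.3 * rng.normal(size=(40, 8)) for c in centers])
y = np.repeat(classes, 40)

model = GaussianNB().fit(X, y)             # build the pattern library

# Use: a new signal pattern is matched to its most probable class.
new_signal = centers[1] + 0.3 * rng.normal(size=8)
print(model.predict([new_signal])[0])      # -> "hand_closed"
```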
426
00:20:03,547 --> 00:20:05,480
And this is just one way
machine learning
427
00:20:05,514 --> 00:20:08,345
is quietly
revolutionizing medicine.
428
00:20:11,175 --> 00:20:13,626
Computer scientist
Regina Barzilay
429
00:20:13,660 --> 00:20:16,801
first started working on
artificial intelligence
430
00:20:16,836 --> 00:20:21,185
in the 1990s, just as
rule-based A.I. like Deep Blue
431
00:20:21,220 --> 00:20:23,670
was giving way
to neural networks.
432
00:20:23,705 --> 00:20:25,810
She used the techniques
433
00:20:25,845 --> 00:20:27,778
to decipher dead languages.
434
00:20:27,812 --> 00:20:30,919
You might call it
a small language model.
435
00:20:30,953 --> 00:20:33,473
Something that is fun and
intellectually very challenging,
436
00:20:33,508 --> 00:20:35,544
but it's not like
it's going to change our life.
437
00:20:37,063 --> 00:20:39,824
O'BRIEN [voiceover]:
And then her life changed
in an instant.
438
00:20:39,859 --> 00:20:42,482
CONSTANCE LEHMAN:
We see a spot there.
439
00:20:42,517 --> 00:20:46,037
O'BRIEN [voiceover]:
In 2014, she was diagnosed
with breast cancer.
440
00:20:46,072 --> 00:20:47,867
BARZILAY [voiceover]:
When you go through the
treatment,
441
00:20:47,901 --> 00:20:49,109
there are a lot of
people who are suffering.
442
00:20:49,144 --> 00:20:50,628
I was interested in
443
00:20:50,663 --> 00:20:53,873
what I can do about it, and
clearly it was not continuing
444
00:20:53,907 --> 00:20:55,599
deciphering dead languages,
445
00:20:55,633 --> 00:20:57,946
and it was quite a journey.
446
00:20:57,980 --> 00:21:02,468
O'BRIEN [voiceover]:
Not surprisingly, she began that
journey with mammograms.
447
00:21:02,502 --> 00:21:04,159
LEHMAN:
It's a little bit
more prominent.
448
00:21:04,193 --> 00:21:05,954
O'BRIEN [voiceover]:
She and Constance Lehman,
449
00:21:05,988 --> 00:21:10,027
a radiologist at
Massachusetts General Hospital,
450
00:21:10,061 --> 00:21:12,650
realized the Achilles heel
in the diagnostic system
451
00:21:12,685 --> 00:21:15,446
is the human eye.
452
00:21:15,481 --> 00:21:17,621
BARZILAY [voiceover]:
So the question that we ask is,
453
00:21:17,655 --> 00:21:19,416
what is the likelihood
of these patients
454
00:21:19,450 --> 00:21:22,695
to develop cancer
within the next five years?
455
00:21:22,729 --> 00:21:24,490
We, with our human eyes,
456
00:21:24,524 --> 00:21:26,388
cannot really make these
assertions
457
00:21:26,423 --> 00:21:29,011
because the patterns
are so subtle.
458
00:21:29,046 --> 00:21:32,601
LEHMAN:
Now, is that different
from the surrounding tissue?
459
00:21:32,636 --> 00:21:35,121
O'BRIEN [voiceover]:
It's a perfect use case
for pattern recognition
460
00:21:35,155 --> 00:21:38,780
using what is known as
a convolutional neural network.
461
00:21:38,814 --> 00:21:40,609
♪
462
00:21:40,644 --> 00:21:43,957
Here's an example
of how CNNs get smart:
463
00:21:43,992 --> 00:21:48,652
they comb through a picture with
many virtual magnifying glasses.
464
00:21:48,686 --> 00:21:52,069
Each one is looking for
a specific kind of puzzle piece,
465
00:21:52,103 --> 00:21:54,623
like an edge,
a shape, or a texture.
466
00:21:54,658 --> 00:21:56,867
Then it makes
simplified versions,
467
00:21:56,901 --> 00:22:00,491
repeating the process
on larger and larger sections.
468
00:22:00,526 --> 00:22:03,218
Eventually
the puzzle can be assembled.
469
00:22:03,252 --> 00:22:05,358
And it's time to make a guess.
470
00:22:05,393 --> 00:22:08,603
Is it a cat? A dog? A tree?
471
00:22:08,637 --> 00:22:12,676
Sometimes the guess is right,
but sometimes it's wrong.
472
00:22:12,710 --> 00:22:14,954
And here's the learning part:
473
00:22:14,988 --> 00:22:17,405
with a process
called backpropagation,
474
00:22:17,439 --> 00:22:22,306
the error on labeled images is sent
back to correct the previous operations.
475
00:22:22,341 --> 00:22:24,964
So the next time
it plays the guessing game,
476
00:22:24,998 --> 00:22:27,000
it will be even better.
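That whole loop fits in a short PyTorch sketch (illustrative throughout; this is not Mirai): small filters scan for local patterns, pooling simplifies the map, and backpropagation corrects the filters after each wrong guess.

```python
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),   # "magnifying glasses"
    nn.ReLU(),
    nn.MaxPool2d(2),                             # simplified version
    nn.Conv2d(8, 16, kernel_size=3, padding=1),  # larger-scale pieces
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 8 * 8, 3),                    # guess: cat, dog, tree
)

images = torch.randn(4, 1, 32, 32)               # stand-in image batch
labels = torch.tensor([0, 1, 2, 1])              # the correct answers
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)

for step in range(10):
    loss = nn.functional.cross_entropy(net(images), labels)
    optimizer.zero_grad()
    loss.backward()                              # backpropagation
    optimizer.step()                             # the filters improve
```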
477
00:22:27,035 --> 00:22:30,279
To validate the model,
Regina and her team gathered up
478
00:22:30,314 --> 00:22:33,282
more than 128,000 mammograms
479
00:22:33,317 --> 00:22:36,493
collected at seven sites
in four countries.
480
00:22:36,527 --> 00:22:40,220
More than 3,800 of them
led to a cancer diagnosis
481
00:22:40,255 --> 00:22:43,983
within five years.
482
00:22:44,017 --> 00:22:45,812
You just give to it the image,
483
00:22:45,847 --> 00:22:48,159
and then
the five years of outcomes,
484
00:22:48,194 --> 00:22:52,440
and it can learn the likelihood
of getting a cancer diagnosis.
485
00:22:52,474 --> 00:22:56,305
O'BRIEN [voiceover]:
The software, called Mirai,
was a success.
486
00:22:56,340 --> 00:23:00,551
In fact, it is between
75% and 84% accurate
487
00:23:00,586 --> 00:23:04,003
in predicting
future cancer diagnoses.
488
00:23:06,454 --> 00:23:11,182
Then, a friend of
Regina's developed lung cancer.
489
00:23:11,217 --> 00:23:12,874
SEQUIST:
In lung cancer, it's actually
490
00:23:12,908 --> 00:23:15,324
sort of mind boggling
how much has changed.
491
00:23:15,359 --> 00:23:18,638
O'BRIEN [voiceover]:
Her friend saw oncologist
Lecia Sequist.
492
00:23:19,881 --> 00:23:21,054
She and Regina wondered
493
00:23:21,089 --> 00:23:24,472
if artificial intelligence
could be applied
494
00:23:24,506 --> 00:23:26,819
to CAT scans of patients' lungs.
495
00:23:26,853 --> 00:23:28,441
SEQUIST:
We taught the model
496
00:23:28,476 --> 00:23:32,790
to recognize the patterns
of developing lung cancer
497
00:23:32,825 --> 00:23:35,483
by using thousands of CAT scans
498
00:23:35,517 --> 00:23:36,622
from patients who were
participating
499
00:23:36,656 --> 00:23:37,933
in a clinical trial.
500
00:23:37,968 --> 00:23:40,108
From the new study?
Oh, interesting.
Correct.
501
00:23:40,142 --> 00:23:42,179
SEQUIST [voiceover]:
We had a lot of information
about them.
502
00:23:42,213 --> 00:23:43,836
We had demographic information,
503
00:23:43,870 --> 00:23:45,665
we had health information,
504
00:23:45,700 --> 00:23:47,322
and we had outcomes information.
505
00:23:47,356 --> 00:23:50,429
O'BRIEN [voiceover]:
They call the model Sybil.
506
00:23:50,463 --> 00:23:51,809
In the retrospective study,
right,
507
00:23:51,844 --> 00:23:53,432
so the retrospective data...
508
00:23:53,466 --> 00:23:54,950
O'BRIEN [voiceover]:
Radiologist Florian Fintelmann
509
00:23:54,985 --> 00:23:56,918
showed me what it can do.
510
00:23:56,952 --> 00:24:00,059
FINTELMANN:
This is earlier,
and this is later.
511
00:24:00,093 --> 00:24:01,750
There is nothing
512
00:24:01,785 --> 00:24:05,098
that I can perceive, pick up,
or describe.
513
00:24:05,133 --> 00:24:07,722
There's no, what we call,
a precursor lesion
514
00:24:07,756 --> 00:24:08,861
on this CT scan.
515
00:24:08,895 --> 00:24:10,449
Sybil looked here
516
00:24:10,483 --> 00:24:12,623
and then anticipated that
there would be a problem
517
00:24:12,658 --> 00:24:15,315
based on the baseline scan.
What is it seeing?
518
00:24:15,350 --> 00:24:16,903
That's the million dollar
question.
519
00:24:16,938 --> 00:24:19,112
And, and maybe not
the million dollar question.
520
00:24:19,147 --> 00:24:21,529
Does it really matter? Does it?
521
00:24:21,563 --> 00:24:23,910
O'BRIEN [voiceover]:
When they compared
the predictions
522
00:24:23,945 --> 00:24:28,294
to actual outcomes from previous
cases, Sybil fared well.
523
00:24:28,328 --> 00:24:30,572
It correctly forecast cancer
524
00:24:30,607 --> 00:24:33,541
between 80% and 95% of the time,
525
00:24:33,575 --> 00:24:36,475
depending on the population
it studied.
526
00:24:36,509 --> 00:24:39,167
The technique is
still in the trial phase.
527
00:24:39,201 --> 00:24:41,203
But once it is deployed,
528
00:24:41,238 --> 00:24:44,862
it could provide
a potent tool for prevention.
529
00:24:47,624 --> 00:24:50,005
The hope is that if you
can predict very early on
530
00:24:50,040 --> 00:24:52,456
that the patient
is in the wrong way,
531
00:24:52,491 --> 00:24:55,286
you can do clinical trials,
you can develop the drugs
532
00:24:55,321 --> 00:25:00,084
that are doing the prevention,
rather than treatment
533
00:25:00,119 --> 00:25:02,673
of very advanced disease
that we are doing today.
534
00:25:04,054 --> 00:25:07,367
O'BRIEN [voiceover]:
Which takes us back to DeepMind
and AlphaGo.
535
00:25:07,402 --> 00:25:09,784
The fun and games
were just the beginning,
536
00:25:09,818 --> 00:25:12,372
a means to an end.
537
00:25:12,407 --> 00:25:15,962
We have always set out
at DeepMind
538
00:25:15,997 --> 00:25:19,552
to, um, use our technologies to
make the world a better place.
539
00:25:19,587 --> 00:25:22,486
O'BRIEN [voiceover]:
In 2021,
540
00:25:22,521 --> 00:25:24,488
the company released AlphaFold.
541
00:25:24,523 --> 00:25:26,801
It is pattern
recognition software
542
00:25:26,835 --> 00:25:29,113
designed to make
it easier for researchers
543
00:25:29,148 --> 00:25:30,839
to understand proteins,
544
00:25:30,874 --> 00:25:34,153
long chains of amino acids
545
00:25:34,187 --> 00:25:36,224
involved in nearly
every function in our bodies.
546
00:25:36,258 --> 00:25:38,122
How a protein folds
547
00:25:38,157 --> 00:25:40,539
into a specific,
three-dimensional shape
548
00:25:40,573 --> 00:25:45,198
determines how it interacts
with other molecules.
549
00:25:45,233 --> 00:25:47,062
SULEYMAN:
There's this correlation between
550
00:25:47,097 --> 00:25:50,307
what the protein does
and how it's structured.
551
00:25:50,341 --> 00:25:53,793
So if we can predict
how the protein folds,
552
00:25:53,828 --> 00:25:56,382
then say something
about their function.
553
00:25:56,416 --> 00:25:59,765
O'BRIEN:
If we know how a disease's
protein is shaped, or folded,
554
00:25:59,799 --> 00:26:03,665
we can sometimes create
a drug to disable it.
555
00:26:03,700 --> 00:26:07,842
But the shape of millions
of proteins remained a mystery.
556
00:26:07,876 --> 00:26:10,396
DeepMind trained AlphaFold
557
00:26:10,430 --> 00:26:13,399
on thousands of
known protein structures.
558
00:26:13,433 --> 00:26:15,643
It leveraged this knowledge
to predict
559
00:26:15,677 --> 00:26:18,335
200 million
protein structures,
560
00:26:18,369 --> 00:26:22,822
nearly all the proteins
known to science.
561
00:26:22,857 --> 00:26:25,549
SULEYMAN:
You take some high-quality
known data,
562
00:26:25,584 --> 00:26:28,794
and you use that to,
you know,
563
00:26:28,828 --> 00:26:32,867
make a prediction about how
a similar piece of information
564
00:26:32,901 --> 00:26:35,179
is likely to unfold
over some time series,
565
00:26:35,214 --> 00:26:37,250
and the structure
of proteins is,
566
00:26:37,285 --> 00:26:39,425
you know, in that sense,
no different to
567
00:26:39,459 --> 00:26:42,601
making a prediction in
the game of Go or in Atari
568
00:26:42,635 --> 00:26:44,326
or in a mammography scan,
569
00:26:44,361 --> 00:26:46,846
or indeed,
in a large language model.
570
00:26:46,881 --> 00:26:48,399
KAMYA:
These thin sticks here?
571
00:26:48,434 --> 00:26:50,747
Yeah?They represent
the amino acids
572
00:26:50,781 --> 00:26:52,300
that make up a protein.
573
00:26:52,334 --> 00:26:53,473
O'BRIEN [voiceover]:
Theoretical chemist
574
00:26:53,508 --> 00:26:56,269
Petrina Kamya works for
a company called
575
00:26:56,304 --> 00:26:58,340
Insilico Medicine.
576
00:26:58,375 --> 00:27:00,135
It uses AlphaFold
577
00:27:00,170 --> 00:27:02,206
and its own
deep-learning models
578
00:27:02,241 --> 00:27:07,626
to make accurate predictions
about protein structures.
579
00:27:07,660 --> 00:27:09,835
What we're doing in drug design
is we're designing a molecule
580
00:27:09,869 --> 00:27:13,045
that is analogous
to the natural molecule
581
00:27:13,079 --> 00:27:14,253
that binds to the protein,
582
00:27:14,287 --> 00:27:16,013
but instead it will lock it,
if this molecule
583
00:27:16,048 --> 00:27:18,637
is involved in a disease
where it's hyperactive.
584
00:27:19,672 --> 00:27:21,398
O'BRIEN [voiceover]:
If the molecule fits well,
585
00:27:21,432 --> 00:27:24,332
it can inhibit the
disease-causing proteins.
586
00:27:24,366 --> 00:27:25,851
So you're
filtering it down
587
00:27:25,885 --> 00:27:28,405
like you're choosing
an Airbnb or something to,
588
00:27:28,439 --> 00:27:30,372
you know, number of bedrooms,
whatever.
To suit your needs.
589
00:27:30,407 --> 00:27:31,546
[laughs]
Exactly, right.
590
00:27:31,580 --> 00:27:33,168
Right, yeah.
That's a very good analogy.
591
00:27:33,203 --> 00:27:35,032
It's sort of like Airbnb.
592
00:27:35,067 --> 00:27:37,103
So you are putting in
your criteria,
593
00:27:37,138 --> 00:27:38,726
and then Airbnb will
filter out
594
00:27:38,760 --> 00:27:39,968
all the different
properties
595
00:27:40,003 --> 00:27:41,176
based on your criteria.
596
00:27:41,211 --> 00:27:42,522
So you can be very, very
restrictive
597
00:27:42,557 --> 00:27:43,903
or you can be very,
very free...
Right.
598
00:27:43,938 --> 00:27:46,216
In terms of guiding the
generative algorithms
599
00:27:46,250 --> 00:27:47,631
and telling them
what types of molecules
600
00:27:47,666 --> 00:27:49,357
you want them to generate.
601
00:27:49,391 --> 00:27:54,327
O'BRIEN [voiceover]:
It will take 48 to 72 hours
of computing time
602
00:27:54,362 --> 00:27:57,676
to identify the best
candidates ranked in order.
603
00:27:57,710 --> 00:27:59,160
How long would it
have taken you
604
00:27:59,194 --> 00:28:02,094
to figure that out
as a computational chemist?
605
00:28:02,128 --> 00:28:03,716
I would have thought of
some of these,
606
00:28:03,751 --> 00:28:04,752
but not all of them.
Okay.
607
00:28:05,787 --> 00:28:08,687
O'BRIEN [voiceover]:
While there are no shortcuts
for human trials,
608
00:28:08,721 --> 00:28:10,758
nor should we hope for that,
609
00:28:10,792 --> 00:28:15,210
this could greatly speed up
the drug development pipeline.
610
00:28:16,625 --> 00:28:18,386
There will not be the need
to invest so heavily
611
00:28:18,420 --> 00:28:20,595
in preclinical discovery,
612
00:28:20,629 --> 00:28:24,185
and so,
drugs can therefore be cheaper.
613
00:28:24,219 --> 00:28:25,876
And you can go
after those diseases
614
00:28:25,911 --> 00:28:28,637
that are
otherwise neglected,
615
00:28:28,672 --> 00:28:30,329
because you don't have
to invest so heavily
616
00:28:30,363 --> 00:28:31,502
in order for you
to come up with a drug,
617
00:28:31,537 --> 00:28:33,746
a viable drug.
618
00:28:33,781 --> 00:28:35,817
O'BRIEN [voiceover]:
But medicine isn't
the only place
619
00:28:35,852 --> 00:28:38,095
where A.I. is breaking
new frontiers.
620
00:28:38,130 --> 00:28:41,478
It's conducting
financial analysis,
621
00:28:41,512 --> 00:28:44,274
and helping with fraud detection.
622
00:28:44,308 --> 00:28:45,758
[mechanical whirring]
623
00:28:45,793 --> 00:28:49,037
It's now being deployed
to discover novel materials
624
00:28:49,072 --> 00:28:53,455
and could help us build
clean energy technology.
625
00:28:53,490 --> 00:28:57,908
And it is even helping
to save lives
626
00:28:57,943 --> 00:28:59,841
as the climate crisis
boils over.
627
00:29:01,084 --> 00:29:02,637
[indistinct radio chatter]
628
00:29:02,671 --> 00:29:03,949
In St. Helena, California,
629
00:29:03,983 --> 00:29:05,295
dispatchers at the
630
00:29:05,329 --> 00:29:09,023
CAL FIRE Sonoma-Lake-Napa
Command Center
631
00:29:09,057 --> 00:29:11,508
caught a break in 2023.
632
00:29:11,542 --> 00:29:17,169
Wildfires blackened nearly
700 acres of their territory.
633
00:29:17,203 --> 00:29:19,343
We were at 400,000 acres
in 2020.
634
00:29:20,655 --> 00:29:22,381
Something like that would
generate a response from us...
635
00:29:22,415 --> 00:29:25,591
O'BRIEN [voiceover]:
Chief Mike Marcucci has
been fighting fires
636
00:29:25,625 --> 00:29:27,627
for more than 30 years.
637
00:29:27,662 --> 00:29:29,768
MARCUCCI [voiceover]:
Once we started having
these devastating fires,
638
00:29:29,802 --> 00:29:30,838
we needed more intel.
639
00:29:30,872 --> 00:29:32,667
The need for intelligence
640
00:29:32,701 --> 00:29:35,325
is just overwhelming
in today's fire service.
641
00:29:36,567 --> 00:29:38,293
O'BRIEN [voiceover]:
Over the past 20 years,
642
00:29:38,328 --> 00:29:40,226
California
has installed a network
643
00:29:40,261 --> 00:29:42,435
of more than
1,000 remotely operated
644
00:29:42,470 --> 00:29:46,750
pan-tilt-zoom surveillance
cameras on mountaintops.
645
00:29:48,096 --> 00:29:50,098
PETE AVANSINO:
Vegetation fire,
Highway 29 at Doton Road.
646
00:29:51,928 --> 00:29:55,034
O'BRIEN [voiceover]:
All those cameras generate
petabytes of video.
647
00:29:56,104 --> 00:29:58,900
CAL FIRE partnered with
scientists at U.C. San Diego
648
00:29:58,935 --> 00:30:00,833
to train a neural network
649
00:30:00,868 --> 00:30:03,284
to spot the early signs
of trouble.
650
00:30:03,318 --> 00:30:06,597
It's called
ALERT California.
651
00:30:06,632 --> 00:30:08,220
SELEGUE:
So here's one
that just popped up.
652
00:30:08,254 --> 00:30:10,256
Here's an anomaly.
653
00:30:10,291 --> 00:30:14,260
O'BRIEN [voiceover]:
CAL FIRE Staff Chief of Fire
and Intelligence Philip SeLegue
654
00:30:14,295 --> 00:30:17,608
showed me how it works
while it was in action,
655
00:30:17,643 --> 00:30:19,507
detecting nascent fires,
656
00:30:19,541 --> 00:30:21,854
micro fires.
657
00:30:21,889 --> 00:30:23,338
That looks like
just a little hint
658
00:30:23,373 --> 00:30:25,824
of some type of smoke
that was there...
659
00:30:25,858 --> 00:30:27,515
O'BRIEN [voiceover]:
Based on this,
dispatchers can orchestrate
660
00:30:27,549 --> 00:30:29,172
a fast response.
661
00:30:30,863 --> 00:30:35,592
A.I. has given us the ability
to detect and to see
662
00:30:35,626 --> 00:30:37,421
where those fires
are starting.
663
00:30:37,456 --> 00:30:40,183
AVANSINO:
Transport 1447
responding via MDC.
664
00:30:40,217 --> 00:30:41,632
O'BRIEN [voiceover]:
For all they know,
665
00:30:41,667 --> 00:30:45,050
they have nipped
some megafires in the bud.
666
00:30:45,084 --> 00:30:46,292
The successes are the fires
667
00:30:46,327 --> 00:30:47,915
that you don't hear about
in the news.
668
00:30:47,949 --> 00:30:50,296
O'BRIEN [voiceover]:
Artificial intelligence
669
00:30:50,331 --> 00:30:52,885
can't put out
wildfires just yet.
670
00:30:52,920 --> 00:30:56,855
Human firefighters
still need to do that job.
671
00:30:58,408 --> 00:31:00,617
But researchers are pushing hard
672
00:31:00,651 --> 00:31:02,722
to combine neural networks
673
00:31:02,757 --> 00:31:05,346
with mobility and dexterity.
674
00:31:06,899 --> 00:31:08,487
This is where people
get nervous.
675
00:31:08,521 --> 00:31:10,213
Will they take our jobs?
676
00:31:10,247 --> 00:31:12,249
Or could they turn against us?
677
00:31:13,216 --> 00:31:14,873
But at M.I.T.,
678
00:31:14,907 --> 00:31:17,530
they're exploring ideas
to make robots
679
00:31:17,565 --> 00:31:19,222
good human partners.
680
00:31:21,017 --> 00:31:22,984
We are interested in
making machines
681
00:31:23,019 --> 00:31:25,745
that help people with
physical and cognitive tasks.
682
00:31:25,780 --> 00:31:27,333
So this is
really great,
683
00:31:27,368 --> 00:31:30,336
it has the stiffness
that we wanted...
684
00:31:30,371 --> 00:31:33,132
O'BRIEN [voiceover]:
Daniela Rus is director of
M.I.T.'s Computer Science
685
00:31:33,167 --> 00:31:36,204
and
Artificial Intelligence Lab.
686
00:31:36,239 --> 00:31:37,205
Oh, can you
bring it to me?
687
00:31:37,240 --> 00:31:39,069
O'BRIEN [voiceover]:
CSAIL.
688
00:31:39,104 --> 00:31:41,140
They are different, like,
kind of like muscles
689
00:31:41,175 --> 00:31:42,555
or actuators.
690
00:31:42,590 --> 00:31:44,143
RUS [voiceover]:
We can do so much more
691
00:31:44,178 --> 00:31:47,181
when we get people and machines
working together.
692
00:31:48,354 --> 00:31:49,631
We can get better reach.
693
00:31:49,666 --> 00:31:50,632
We can get lift,
694
00:31:50,667 --> 00:31:53,808
precision, strength, vision.
695
00:31:53,842 --> 00:31:55,499
All of these are
physical superpowers
696
00:31:55,534 --> 00:31:56,707
we can get
through machines.
697
00:31:58,088 --> 00:31:59,089
O'BRIEN [voiceover]:
So, they're focusing
698
00:31:59,124 --> 00:32:00,677
on making it safe for humans
699
00:32:00,711 --> 00:32:03,887
to work in close proximity
to machines.
700
00:32:03,922 --> 00:32:06,821
They're using some of
the technology that's inside
701
00:32:06,855 --> 00:32:08,236
my prosthetic arm.
702
00:32:08,271 --> 00:32:10,307
Electrodes
that can read
703
00:32:10,342 --> 00:32:12,758
the faint
EMG signals generated
704
00:32:12,792 --> 00:32:14,070
as our nerves command
705
00:32:14,104 --> 00:32:15,588
our muscles to move.
706
00:32:18,005 --> 00:32:20,662
They have the capability to
interact with a human,
707
00:32:20,697 --> 00:32:22,009
to understand the human,
708
00:32:22,043 --> 00:32:24,701
to step in and help the human
as needed.
709
00:32:24,735 --> 00:32:28,567
I am at your disposal with
187 other languages,
710
00:32:28,601 --> 00:32:30,224
along with their various
711
00:32:30,258 --> 00:32:32,053
dialects and sub tongues.
712
00:32:32,088 --> 00:32:34,228
O'BRIEN [voiceover]:
But making robots as useful
713
00:32:34,262 --> 00:32:37,024
as they are in the movies
is a big challenge.
714
00:32:37,058 --> 00:32:38,818
♪
715
00:32:38,853 --> 00:32:42,512
Most neural networks run on
powerful supercomputers--
716
00:32:42,546 --> 00:32:46,861
thousands of processors
occupying entire buildings.
717
00:32:48,380 --> 00:32:50,037
RUS:
We have brains that require
718
00:32:50,071 --> 00:32:53,902
massive computation,
which you cannot include
719
00:32:53,937 --> 00:32:56,457
on a self-contained body.
720
00:32:56,491 --> 00:32:59,805
We address
the size challenge by
721
00:32:59,839 --> 00:33:01,565
making liquid networks.
722
00:33:01,600 --> 00:33:03,567
O'BRIEN [voiceover]:
Liquid networks.
723
00:33:03,602 --> 00:33:05,052
So it looks like
an autonomous vehicle
724
00:33:05,086 --> 00:33:06,294
like I've seen before,
725
00:33:06,329 --> 00:33:07,709
but it is a little
different, right?
726
00:33:07,744 --> 00:33:09,056
ALEXANDER AMINI:
Very different.
727
00:33:09,090 --> 00:33:10,402
This is an
autonomous vehicle
728
00:33:10,436 --> 00:33:11,955
that can drive in
brand-new environments
729
00:33:11,990 --> 00:33:14,578
that it has never seen
before for the first time.
730
00:33:15,752 --> 00:33:17,788
O'BRIEN [voiceover]:
Most self-driving cars
today rely,
731
00:33:17,823 --> 00:33:20,688
to some extent,
on detailed databases
732
00:33:20,722 --> 00:33:23,587
that help them recognize
their immediate environment.
733
00:33:23,622 --> 00:33:28,420
Those robot cars get lost
in unfamiliar terrain.
734
00:33:29,869 --> 00:33:32,113
O'BRIEN:
In this case,
you're not relying on
735
00:33:32,148 --> 00:33:34,978
a huge, expansive
neural network.
736
00:33:35,013 --> 00:33:36,462
You're running on
19 neurons, right?
737
00:33:36,497 --> 00:33:38,464
Correct.
738
00:33:38,499 --> 00:33:40,432
O'BRIEN [voiceover]:
Computer scientist
Alexander Amini
739
00:33:40,466 --> 00:33:43,745
took me on a ride
in an autonomous vehicle
740
00:33:43,780 --> 00:33:47,335
with a liquid neural
network brain.
741
00:33:47,370 --> 00:33:49,475
AMINI:
We've become very accustomed
to relying on
742
00:33:49,510 --> 00:33:52,099
big, giant data centers
and cloud compute.
743
00:33:52,133 --> 00:33:53,721
But in an autonomous vehicle,
744
00:33:53,755 --> 00:33:55,274
you cannot make
such assumptions, right?
745
00:33:55,309 --> 00:33:56,862
You need to be able to operate,
746
00:33:56,896 --> 00:33:58,726
even if you lose
internet connectivity
747
00:33:58,760 --> 00:34:01,211
and you cannot
talk to the cloud anymore,
748
00:34:01,246 --> 00:34:02,799
your entire neural network,
749
00:34:02,833 --> 00:34:05,077
the brain of the car,
needs to live on the car,
750
00:34:05,112 --> 00:34:07,907
and that imposes a lot
of interesting constraints.
751
00:34:09,116 --> 00:34:10,462
O'BRIEN [voiceover]:
To build a brain smart enough
752
00:34:10,496 --> 00:34:12,429
and small enough to
do this job,
753
00:34:12,464 --> 00:34:15,225
they took some inspiration
from nature,
754
00:34:15,260 --> 00:34:19,436
a lowly worm
called C. elegans.
755
00:34:19,471 --> 00:34:22,750
Its brain contains all of
300 neurons,
756
00:34:22,784 --> 00:34:25,235
but it's a very
different kind of neuron.
757
00:34:27,237 --> 00:34:28,721
It can capture
more complex behaviors
758
00:34:28,756 --> 00:34:30,240
in every single piece
of that puzzle.
759
00:34:30,275 --> 00:34:31,448
And also the wiring,
760
00:34:31,483 --> 00:34:33,899
how a neuron talks to
another neuron
761
00:34:33,933 --> 00:34:35,763
is completely different
than what we see
762
00:34:35,797 --> 00:34:37,489
in today's
neural networks.
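A single step of one such neuron might be sketched like this (my own simplification of the published liquid time-constant idea, not CSAIL's code): the neuron's state follows a differential equation whose effective time constant itself depends on the input, so even 19 neurons can express rich, continuously adapting dynamics.

```python
import numpy as np

def liquid_step(state, x, w_in, w_rec, tau=1.0, dt=0.05):
    # An input-dependent gate modulates how fast each neuron responds.
    gate = np.tanh(w_in @ x + w_rec @ state)
    return state + dt * (-(1.0 / tau + gate) * state + gate)

rng = np.random.default_rng(0)
n = 19                                     # as in the car's tiny brain
w_in = rng.normal(size=(n, 4))             # 4 stand-in sensor inputs
w_rec = 0.1 * rng.normal(size=(n, n))      # neuron-to-neuron wiring

state = np.zeros(n)
for t in range(100):                       # a stream of sensor data
    x = np.array([np.sin(t * 0.1), 0.0, 1.0, 0.5])
    state = liquid_step(state, x, w_in, w_rec)
print(np.round(state[:5], 3))              # the evolving neural state
```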
763
00:34:38,973 --> 00:34:42,321
O'BRIEN [voiceover]:
Autonomous cars that tap
into today's neural networks
764
00:34:42,356 --> 00:34:46,084
require huge amounts of
compute power in the cloud.
765
00:34:47,637 --> 00:34:50,433
But this car is using
just 19 liquid neurons.
766
00:34:51,572 --> 00:34:54,644
A worm at the wheel...
sort of.
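For readers who want to see the idea in code, here is a minimal Python sketch of a liquid time-constant neuron layer in the spirit of the MIT work. The layer sizes, the tanh nonlinearity, and the forward-Euler integration step are illustrative assumptions, not the network that actually drives this car.

import numpy as np

# A toy liquid time-constant (LTC) layer, loosely following the
# published MIT research. All sizes and constants are illustrative.
class LiquidLayer:
    def __init__(self, n_inputs, n_neurons=19, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(size=(n_neurons, n_inputs)) * 0.1    # input synapses
        self.W_rec = rng.normal(size=(n_neurons, n_neurons)) * 0.1  # recurrent wiring
        self.bias = np.zeros(n_neurons)
        self.tau = np.ones(n_neurons)   # base time constants
        self.A = np.ones(n_neurons)     # synaptic reversal targets
        self.x = np.zeros(n_neurons)    # continuous neuron state

    def step(self, inputs, dt=0.02):
        # Synaptic drive depends on the input AND the current state.
        f = np.tanh(self.W_in @ inputs + self.W_rec @ self.x + self.bias)
        # The "liquid" part: the effective time constant (1/tau + f)
        # shifts with the data, so the dynamics adapt on the fly.
        dxdt = -(1.0 / self.tau + f) * self.x + f * self.A
        self.x = self.x + dt * dxdt     # one forward-Euler step
        return self.x

Fed one feature vector per camera frame, the state keeps evolving between frames, which is part of how so few neurons can capture rich driving behavior.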
767
00:34:54,678 --> 00:34:56,059
AMINI [voiceover]:
Today's A.I. models
768
00:34:56,094 --> 00:34:57,750
are really
pushing the boundaries
769
00:34:57,785 --> 00:35:00,339
of the scale of compute
that we have.
770
00:35:00,374 --> 00:35:02,134
They're also pushing
the boundaries
771
00:35:02,169 --> 00:35:03,446
of the data sets
that we have.
772
00:35:03,480 --> 00:35:04,999
And that's not sustainable,
773
00:35:05,033 --> 00:35:07,001
because ultimately,
we need to deploy A.I.
774
00:35:07,035 --> 00:35:08,589
onto the device itself,
right?
775
00:35:08,623 --> 00:35:10,867
Onto the cars,
onto the surgical robots.
776
00:35:10,901 --> 00:35:12,455
All of these edge devices
777
00:35:12,489 --> 00:35:15,837
that actually make
the decisions.
778
00:35:15,872 --> 00:35:18,461
O'BRIEN [voiceover]:
The A.I. worm may, in fact,
779
00:35:18,495 --> 00:35:19,807
turn.
780
00:35:22,637 --> 00:35:23,949
The portability of
artificial intelligence
781
00:35:23,983 --> 00:35:27,194
was on my mind
when it came time
782
00:35:27,228 --> 00:35:30,956
to pick up
my new myoelectric arm...
783
00:35:30,990 --> 00:35:33,959
equipped with
Coapt A.I. pattern recognition.
784
00:35:33,993 --> 00:35:35,788
All right,
let's just check this
785
00:35:35,823 --> 00:35:37,238
real quick...
786
00:35:37,273 --> 00:35:38,584
O'BRIEN [voiceover]:
A few weeks after
787
00:35:38,619 --> 00:35:39,930
my trip to Chicago,
788
00:35:39,965 --> 00:35:41,346
I met Brian Monroe
789
00:35:41,380 --> 00:35:45,108
at his home office
outside Washington, D.C.
790
00:35:45,143 --> 00:35:47,248
Are you happy with
the way it came out?
Yeah.
791
00:35:47,283 --> 00:35:49,216
Would you
tell me otherwise?
792
00:35:49,250 --> 00:35:52,080
[laughing]:
Yeah, I would, yeah...
793
00:35:53,220 --> 00:35:54,462
O'BRIEN [voiceover]:
As usual,
794
00:35:54,497 --> 00:35:57,258
he did a great job
making a tight socket.
795
00:35:58,639 --> 00:36:00,261
How's the socket feel?
Does it feel like
796
00:36:00,296 --> 00:36:01,814
it's sliding down or
797
00:36:01,849 --> 00:36:04,265
falling out...
No, it fits like a glove.
798
00:36:05,439 --> 00:36:07,026
O'BRIEN [voiceover]:
It's really important in
this case,
799
00:36:07,061 --> 00:36:10,340
because the electrodes designed
to read the signals
800
00:36:10,375 --> 00:36:12,722
from my muscles...
801
00:36:12,756 --> 00:36:14,344
...have to stay in place snugly
802
00:36:14,379 --> 00:36:18,348
in order to generate
accurate, reliable commands
803
00:36:18,383 --> 00:36:20,039
to the actuators
in my new hand.
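Myoelectric pattern recognition boils down to a classification problem: turn short windows of multi-channel muscle signals into discrete commands like open or close. Here is a minimal Python sketch of that pipeline; the window features and the nearest-centroid classifier are common in the EMG research literature but are illustrative assumptions, not Coapt's actual algorithm.

import numpy as np

def features(window):
    # window: (n_samples, n_channels) array of raw EMG voltages
    mav = np.mean(np.abs(window), axis=0)                          # mean absolute value
    zc = np.mean(np.diff(np.sign(window), axis=0) != 0, axis=0)    # zero-crossing rate
    return np.concatenate([mav, zc])

def train(windows, labels):
    # windows: list of EMG windows; labels: np.array of motion names
    # ("open", "close", "rest", ...). One centroid per intended motion.
    feats = np.stack([features(w) for w in windows])
    return {c: feats[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(window, centroids):
    # Pick the trained motion whose centroid is nearest in feature space.
    f = features(window)
    return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))

This is also why snug electrode contact matters: shift the electrodes and the feature statistics shift with them, so new windows stop landing near the centroids learned during training.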
804
00:36:21,662 --> 00:36:23,319
Wait, is that you?
That's me.
805
00:36:25,217 --> 00:36:27,461
[voiceover]:
He also provided me with
806
00:36:27,495 --> 00:36:29,497
a human-like bionic hand.
807
00:36:30,774 --> 00:36:32,500
But getting it
to work just right
808
00:36:32,535 --> 00:36:34,433
took some time.
809
00:36:34,468 --> 00:36:36,677
That's open and it's closing.
810
00:36:36,711 --> 00:36:37,885
It's backwards?
811
00:36:37,919 --> 00:36:39,266
Yeah.
Now try.
812
00:36:39,300 --> 00:36:40,646
If it's reversed,
813
00:36:40,681 --> 00:36:42,096
I can swap the electrodes.
There we go.
814
00:36:42,130 --> 00:36:44,098
That's got it.
Is it the right direction?
815
00:36:44,132 --> 00:36:45,444
Yeah.
Uh-huh. Okay.
816
00:36:45,479 --> 00:36:48,896
O'BRIEN [voiceover]:
It's a long way from the movies,
817
00:36:48,930 --> 00:36:50,380
and I'm no Luke Skywalker.
818
00:36:50,415 --> 00:36:54,350
But my new arm and I
are now together.
819
00:36:54,384 --> 00:36:56,075
And I'm heartened
to know
820
00:36:56,110 --> 00:36:57,698
that I have the freedom
and independence
821
00:36:57,732 --> 00:36:59,182
to teach and tweak it
822
00:36:59,217 --> 00:37:00,218
on my own.
823
00:37:00,252 --> 00:37:01,771
That's kind of cool.
Yeah.
824
00:37:01,805 --> 00:37:04,291
[voiceover]:
Hopefully we will listen to
each other.
825
00:37:04,325 --> 00:37:05,809
It's pretty awesome.
826
00:37:05,844 --> 00:37:07,570
O'BRIEN [voiceover]:
But we might want to listen
827
00:37:07,604 --> 00:37:09,675
with a skeptical ear.
828
00:37:10,952 --> 00:37:14,404
JORDAN PEELE [imitating Obama]:
You see, I would never
say these things,
829
00:37:14,439 --> 00:37:16,958
at least not in
a public address,
830
00:37:16,993 --> 00:37:18,891
but someone else would.
831
00:37:18,926 --> 00:37:21,239
Someone like Jordan Peele.
832
00:37:22,930 --> 00:37:25,001
This is a dangerous time.
833
00:37:25,035 --> 00:37:28,280
O'BRIEN [voiceover]:
It's even more dangerous now
than it was in 2018
834
00:37:28,315 --> 00:37:30,420
when comedian Jordan Peele
835
00:37:30,455 --> 00:37:33,492
combined his pitch-perfect
Obama impression
836
00:37:33,527 --> 00:37:39,326
with A.I. software to make
this convincing fake video.
837
00:37:39,360 --> 00:37:42,294
...or whether we become some
kind of [bleep] up dystopia.
838
00:37:42,329 --> 00:37:44,296
♪
839
00:37:44,331 --> 00:37:46,333
O'BRIEN [voiceover]:
Fakes are about as old as
840
00:37:46,367 --> 00:37:48,300
photography itself.
841
00:37:48,335 --> 00:37:51,683
Mussolini, Hitler,
and Stalin
842
00:37:51,717 --> 00:37:54,755
all ordered that pictures be
doctored or redacted,
843
00:37:54,789 --> 00:37:58,172
erasing those
who fell out of favor,
844
00:37:58,206 --> 00:38:00,312
consolidating power,
845
00:38:00,347 --> 00:38:03,350
manipulating their followers
through images.
846
00:38:03,384 --> 00:38:04,627
HANY FARID:
They've always been manipulated,
847
00:38:04,661 --> 00:38:07,112
throughout history, but--
848
00:38:07,146 --> 00:38:09,356
there was literally,
you can count on one hand,
849
00:38:09,390 --> 00:38:10,943
the number of people
in the world who could do this.
850
00:38:10,978 --> 00:38:13,291
But now,
you need almost no skill.
851
00:38:13,325 --> 00:38:15,016
And we said,
"Give us an image
852
00:38:15,051 --> 00:38:16,224
"of a middle-aged woman,
newscaster,
853
00:38:16,259 --> 00:38:17,950
sitting at her desk,
reading the news."
854
00:38:17,985 --> 00:38:20,228
O'BRIEN [voiceover]:
Hany Farid is a professor
of computer science
855
00:38:20,263 --> 00:38:22,161
at U.C. Berkeley.
856
00:38:22,196 --> 00:38:24,440
[on computer]:
And this is your daily dose
of future flash.
857
00:38:24,474 --> 00:38:25,717
O'BRIEN [voiceover]:
He and his team
858
00:38:25,751 --> 00:38:28,064
are trying to navigate
the house of mirrors
859
00:38:28,098 --> 00:38:31,412
that is the world of
A.I.-enabled deepfake imagery.
860
00:38:32,413 --> 00:38:33,690
Not perfect.
861
00:38:33,725 --> 00:38:36,037
She's not blinking,
but it's pretty good.
862
00:38:36,072 --> 00:38:38,730
And by the way, he did this
in a day and a half.
863
00:38:38,764 --> 00:38:40,352
FARID [voiceover]:
It's the
classic automation story.
864
00:38:40,387 --> 00:38:42,354
We have lowered
barriers to entry
865
00:38:42,389 --> 00:38:44,252
to manipulate reality.
866
00:38:44,287 --> 00:38:45,771
And when you do that,
867
00:38:45,806 --> 00:38:47,186
more and more people
will do it.
868
00:38:47,221 --> 00:38:48,326
Some good people
will do it,
869
00:38:48,360 --> 00:38:49,534
but lots of bad people
will do it.
870
00:38:49,568 --> 00:38:50,914
There'll be some
interesting use cases,
871
00:38:50,949 --> 00:38:52,606
and there'll be a lot of
nefarious use cases.
872
00:38:52,640 --> 00:38:55,816
Okay, so, um...
873
00:38:55,850 --> 00:38:57,921
Glasses off.
How's the framing?
874
00:38:57,956 --> 00:38:59,026
Everything okay?
875
00:38:59,060 --> 00:39:00,372
[voiceover]:
About a week before
876
00:39:00,407 --> 00:39:02,029
I got on a plane to see him...
Hold on.
877
00:39:02,063 --> 00:39:03,962
O'BRIEN [voiceover]:
He asked me to meet him on Zoom
878
00:39:03,996 --> 00:39:05,584
so he could
get a good recording
879
00:39:05,619 --> 00:39:06,999
of my voice and mannerisms.
880
00:39:07,034 --> 00:39:09,485
And I assume
you're recording, Miles.
881
00:39:09,519 --> 00:39:11,590
O'BRIEN [voiceover]:
And he turned the table on me
a little bit,
882
00:39:11,625 --> 00:39:13,489
asking me a lot of questions
883
00:39:13,523 --> 00:39:15,214
to get a good sampling.
884
00:39:15,249 --> 00:39:16,802
FARID [on computer]:
How are you feeling about
885
00:39:16,837 --> 00:39:19,909
the role of A.I.
as it enters into our world
886
00:39:19,943 --> 00:39:21,393
on a daily basis?
887
00:39:21,428 --> 00:39:23,361
I think it's very important,
first of all,
888
00:39:23,395 --> 00:39:26,122
to calibrate the concern level.
889
00:39:26,156 --> 00:39:28,504
Let's take it away from
the "Terminator" scenario...
890
00:39:29,712 --> 00:39:31,748
[voiceover]:
The "Terminator" scenario.
891
00:39:31,783 --> 00:39:33,232
Come with me
if you want to live.
892
00:39:34,544 --> 00:39:37,513
O'BRIEN [voiceover]:
You know, a malevolent
neural network
893
00:39:37,547 --> 00:39:39,273
hellbent on exterminating
humanity.
894
00:39:39,307 --> 00:39:40,723
You're really real.
895
00:39:40,757 --> 00:39:42,656
O'BRIEN [voiceover]:
In the film series,
896
00:39:42,690 --> 00:39:44,002
the cyborg assassin
897
00:39:44,036 --> 00:39:47,108
is memorably played
by Arnold Schwarzenegger.
898
00:39:47,143 --> 00:39:49,145
Hany thought it would be fun
899
00:39:49,179 --> 00:39:52,390
to use A.I.
to turn Arnold into me.
900
00:39:52,424 --> 00:39:53,391
Okay.
901
00:39:54,633 --> 00:39:56,117
O'BRIEN [voiceover]:
A week later, I showed up at
902
00:39:56,152 --> 00:39:58,050
Berkeley's
School of Information,
903
00:39:58,085 --> 00:40:01,951
ironically located in
the oldest building on campus.
904
00:40:03,504 --> 00:40:05,437
So you had me do
this strange thing on Zoom.
905
00:40:05,472 --> 00:40:07,784
Here I am.
What did you do with me?
906
00:40:07,819 --> 00:40:09,441
Yeah, well,
it's gonna teach you
907
00:40:09,476 --> 00:40:10,925
to let me record
your Zoom call, isn't it?
908
00:40:10,960 --> 00:40:12,962
I did this
with some trepidation.
909
00:40:12,996 --> 00:40:15,136
[voiceover]:
I was excited to see what tricks
910
00:40:15,171 --> 00:40:16,517
were up his sleeve.
911
00:40:16,552 --> 00:40:18,174
FARID [voiceover]:
I uploaded 90 seconds of audio,
912
00:40:18,208 --> 00:40:20,279
and I clicked a box saying
913
00:40:20,314 --> 00:40:22,627
"Miles has given me
permission to use his voice,"
914
00:40:22,661 --> 00:40:23,731
which I don't actually
915
00:40:23,766 --> 00:40:25,802
think you did.
[chuckles]
916
00:40:25,837 --> 00:40:27,908
Um, and, I waited about,
eh, maybe 20 seconds,
917
00:40:27,942 --> 00:40:30,704
and it said, "Okay, what would
you like for Miles to say?"
918
00:40:30,738 --> 00:40:32,430
And I started typing,
919
00:40:32,464 --> 00:40:34,673
and I generated an audio
of you saying
920
00:40:34,708 --> 00:40:36,054
whatever I wanted you to say.
921
00:40:36,088 --> 00:40:38,263
We are synthesizing,
922
00:40:38,297 --> 00:40:40,610
at much, much lower
resolution.
923
00:40:40,645 --> 00:40:41,956
O'BRIEN [voiceover]:
You could have knocked me over
924
00:40:41,991 --> 00:40:44,683
with a feather
when I watched this.
925
00:40:44,718 --> 00:40:46,098
A.I. O'BRIEN:
Terminators were
science fiction back then,
926
00:40:46,133 --> 00:40:49,412
but if you follow the
recent A.I. media coverage,
927
00:40:49,447 --> 00:40:52,346
you might think that Terminators
are just around the corner.
928
00:40:52,380 --> 00:40:54,106
The reality is...
929
00:40:54,141 --> 00:40:56,005
O'BRIEN [voiceover]:
The eyes and the mouth
need some work,
930
00:40:56,039 --> 00:40:58,456
but it sure does
sound like me.
931
00:40:59,491 --> 00:41:02,770
And consider what happened
in May of 2023.
932
00:41:02,805 --> 00:41:05,877
Someone posted
this A.I.-generated image
933
00:41:05,911 --> 00:41:08,431
of what appeared to be
a terrorist bombing
934
00:41:08,466 --> 00:41:09,881
at the Pentagon.
935
00:41:09,915 --> 00:41:11,158
NEWS ANCHOR:
Today we may have witnessed
936
00:41:11,192 --> 00:41:13,160
one of the first drops
in the feared flood
937
00:41:13,194 --> 00:41:15,265
of A.I.-created
disinformation.
938
00:41:15,300 --> 00:41:16,853
O'BRIEN [voiceover]:
It was shared on Twitter
939
00:41:16,888 --> 00:41:18,337
via what seemed to be
940
00:41:18,372 --> 00:41:21,617
a verified account
from Bloomberg News.
941
00:41:21,651 --> 00:41:23,584
NEWS ANCHOR:
It only took seconds
to spread fast.
942
00:41:23,619 --> 00:41:26,898
The Dow now down about
200 points...
943
00:41:26,932 --> 00:41:28,762
Two minutes later,
the stock market dropped
944
00:41:28,796 --> 00:41:31,143
a half a trillion dollars
945
00:41:31,178 --> 00:41:33,525
from a single fake image.
946
00:41:33,560 --> 00:41:35,078
Anybody could've made
that image,
947
00:41:35,113 --> 00:41:36,942
whether it was intentionally
manipulating the market
948
00:41:36,977 --> 00:41:38,116
or unintentionally,
949
00:41:38,150 --> 00:41:39,358
in some ways,
it doesn't really matter.
950
00:41:40,498 --> 00:41:41,913
O'BRIEN [voiceover]:
So what are the technological
951
00:41:41,947 --> 00:41:45,330
innovations that make this tool
widely available?
952
00:41:47,021 --> 00:41:48,920
One technique is called
953
00:41:48,954 --> 00:41:51,163
the generative
adversarial network,
954
00:41:51,198 --> 00:41:52,406
or GAN.
955
00:41:52,440 --> 00:41:53,718
Two algorithms
956
00:41:53,752 --> 00:41:57,307
in a dizzying
student-teacher back and forth.
957
00:41:57,342 --> 00:42:00,379
Let's say it's learning how to
generate a cat.
958
00:42:00,414 --> 00:42:02,796
FARID:
And it starts by
just splatting down
959
00:42:02,830 --> 00:42:04,314
a bunch of pixels onto a canvas.
960
00:42:04,349 --> 00:42:07,421
And it sends it over to
a discriminator.
961
00:42:07,455 --> 00:42:09,423
And the discriminator
has access
962
00:42:09,457 --> 00:42:11,390
to millions and millions
of images
963
00:42:11,425 --> 00:42:12,564
of the category that
you want.
964
00:42:12,599 --> 00:42:14,048
And it says,
965
00:42:14,083 --> 00:42:15,947
"Nope, that doesn't look
like all these other things."
966
00:42:15,981 --> 00:42:19,157
So it goes back to the generator
and says, "Try again."
967
00:42:19,191 --> 00:42:20,365
Modifies some pixels,
968
00:42:20,399 --> 00:42:21,642
sends it back
to the discriminator,
969
00:42:21,677 --> 00:42:23,126
and they do this in
what's called
970
00:42:23,161 --> 00:42:24,369
an adversarial loop.
971
00:42:24,403 --> 00:42:25,750
O'BRIEN [voiceover]:
And eventually,
972
00:42:25,784 --> 00:42:28,200
after many
thousands of volleys,
973
00:42:28,235 --> 00:42:31,134
the generator
finally serves up a cat.
974
00:42:31,169 --> 00:42:33,136
And the discriminator says,
975
00:42:33,171 --> 00:42:35,380
"Do more like that."
976
00:42:35,414 --> 00:42:37,589
Today, we have a whole new way
of doing these things.
977
00:42:37,624 --> 00:42:39,108
They're called diffusion-based.
978
00:42:40,143 --> 00:42:41,489
What diffusion does
979
00:42:41,524 --> 00:42:44,078
is it has vacuumed up
billions of images
980
00:42:44,113 --> 00:42:46,598
with captions
that are descriptive.
981
00:42:46,633 --> 00:42:49,256
O'BRIEN [voiceover]:
It starts by making those
labeled images
982
00:42:49,290 --> 00:42:51,154
visually noisy on purpose.
983
00:42:52,984 --> 00:42:55,055
FARID:
And then it corrupts it more,
and it goes backwards
984
00:42:55,089 --> 00:42:56,539
and corrupts it more,
and goes backwards
985
00:42:56,574 --> 00:42:57,609
and corrupts it more
and goes backwards--
986
00:42:57,644 --> 00:42:59,853
and it does that
six billion times.
987
00:43:00,923 --> 00:43:02,269
O'BRIEN [voiceover]:
Eventually it corrupts it
988
00:43:02,303 --> 00:43:07,067
so it's unrecognizable
as the original image.
989
00:43:07,101 --> 00:43:09,690
Now that it knows how
to turn an image into nothing,
990
00:43:09,725 --> 00:43:11,623
it can reverse the process,
991
00:43:11,658 --> 00:43:15,351
turning seemingly nothing
into a beautiful image.
992
00:43:16,421 --> 00:43:18,078
FARID:
What it's learned is how to take
993
00:43:18,112 --> 00:43:21,668
a completely nondescript image,
just pure noise,
994
00:43:21,702 --> 00:43:25,326
and go back to a coherent image,
conditioned on a text prompt.
995
00:43:25,361 --> 00:43:28,606
You're basically
reverse engineering an image
996
00:43:28,640 --> 00:43:29,986
down to the pixel.
997
00:43:30,021 --> 00:43:31,401
Yeah, exactly, yeah.
998
00:43:31,436 --> 00:43:33,472
And it's-- and by the way--
if you had asked me,
999
00:43:33,507 --> 00:43:34,991
"Will this work?"
I would have said,
1000
00:43:35,026 --> 00:43:36,475
"No, there's no way
this system works."
1001
00:43:36,510 --> 00:43:38,823
It just, it just doesn't
seem like it should work.
1002
00:43:38,857 --> 00:43:40,652
And that's sort of the magic
1003
00:43:40,687 --> 00:43:42,274
of when you get this much data
1004
00:43:42,309 --> 00:43:44,622
and very powerful algorithms
and very powerful computing
1005
00:43:44,656 --> 00:43:47,728
to be able to crunch
these massive data sets.
1006
00:43:47,763 --> 00:43:49,627
I mean, we're not
going to contain it.
1007
00:43:49,661 --> 00:43:50,697
That's done.
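The corrupt-and-reverse recipe behind diffusion also fits in a few lines. Here is a toy PyTorch sketch of the forward noising process and the objective that teaches a model to run it backwards; the noise schedule, the tiny stand-in denoiser, and the omission of text conditioning are all illustrative assumptions.

import torch
import torch.nn as nn

T = 1000
betas = torch.linspace(1e-4, 0.02, T)          # how much noise each step adds
alpha_bar = torch.cumprod(1.0 - betas, dim=0)

def corrupt(x0, t):
    # Forward process: blend the image with Gaussian noise; by t = T-1
    # it is unrecognizable as the original image.
    noise = torch.randn_like(x0)
    xt = alpha_bar[t].sqrt() * x0 + (1 - alpha_bar[t]).sqrt() * noise
    return xt, noise

denoiser = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                         nn.Linear(256, 784))  # stand-in for the real model

def training_step(x0, t):
    # The model learns to predict the noise that was added, which is
    # equivalent to learning how to undo one step of corruption.
    xt, noise = corrupt(x0, t)
    return nn.functional.mse_loss(denoiser(xt), noise)

Generation then starts from pure noise and applies the learned denoiser step by step; in the real systems, each step is also conditioned on the text prompt.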
1008
00:43:50,731 --> 00:43:51,732
[voiceover]:
I sat down with Hany
1009
00:43:51,767 --> 00:43:52,906
and two of his grad students:
1010
00:43:52,940 --> 00:43:56,668
Justin Norman
and Sarah Barrington.
1011
00:43:56,703 --> 00:43:59,153
We looked at some of
the A.I. trickery
1012
00:43:59,188 --> 00:44:00,741
they have seen and made.
1013
00:44:00,776 --> 00:44:03,157
Somebody else
wrote some base code
1014
00:44:03,192 --> 00:44:04,607
and they got grew on to
1015
00:44:04,642 --> 00:44:06,367
and grow on to and
grow on to and eventually...
1016
00:44:06,402 --> 00:44:07,852
O'BRIEN [voiceover]:
In a world where anything
1017
00:44:07,886 --> 00:44:09,923
can be manipulated
with such ease
1018
00:44:09,957 --> 00:44:11,165
and seeming authenticity,
1019
00:44:11,200 --> 00:44:14,755
how are we to know
what's real anymore?
1020
00:44:14,790 --> 00:44:15,929
How you look at the world,
1021
00:44:15,963 --> 00:44:17,344
how you interact with
people in it,
1022
00:44:17,378 --> 00:44:19,173
and where you look for
your threats, all of that changes.
1023
00:44:19,208 --> 00:44:23,522
O'BRIEN [voiceover]:
Generative A.I. is now
part of a larger ecosystem
1024
00:44:23,557 --> 00:44:26,802
that is built on mistrust.
1025
00:44:26,836 --> 00:44:28,044
We're going to live
in a world where
1026
00:44:28,079 --> 00:44:29,597
we don't know what's real.
1027
00:44:29,632 --> 00:44:30,771
FARID [voiceover]:
There is distrust of
governments,
1028
00:44:30,806 --> 00:44:32,428
there is distrust of media,
1029
00:44:32,462 --> 00:44:33,774
there is distrust
of academics.
1030
00:44:33,809 --> 00:44:36,604
And now throw on top of that
video evidence.
1031
00:44:36,639 --> 00:44:38,261
So-called video evidence.
1032
00:44:38,296 --> 00:44:40,022
I think this is
the very definition
1033
00:44:40,056 --> 00:44:42,231
of throwing jet fuel onto
a dumpster fire.
1034
00:44:42,265 --> 00:44:43,991
And it's already happening,
1035
00:44:44,026 --> 00:44:45,544
and I imagine
we will see more of it.
1036
00:44:45,579 --> 00:44:47,167
[Arnold's voice]:
Come with me
if you want to live.
1037
00:44:47,201 --> 00:44:48,824
O'BRIEN [voiceover]:
But it also can be
1038
00:44:48,858 --> 00:44:49,825
kind of fun.
1039
00:44:49,859 --> 00:44:50,998
As Hany promised,
1040
00:44:51,033 --> 00:44:53,138
here's my face
1041
00:44:53,173 --> 00:44:55,071
on the Terminator's body.
1042
00:44:55,106 --> 00:44:56,452
[gunfire blasting]
1043
00:44:56,486 --> 00:44:58,903
Long before A.I. might take
1044
00:44:58,937 --> 00:45:01,250
an existential turn
against humanity,
1045
00:45:01,284 --> 00:45:04,046
we will need to
reckon with the likes...
1046
00:45:04,080 --> 00:45:06,634
Go! Now!
O'BRIEN [voiceover]:
Of the Milesinator.
1047
00:45:06,669 --> 00:45:08,740
TRAILER NARRATOR:
This time, he's back.
1048
00:45:08,775 --> 00:45:10,362
[booming]
1049
00:45:10,397 --> 00:45:11,812
O'BRIEN [voiceover]:
Who will, no doubt, be back.
1050
00:45:11,847 --> 00:45:13,434
Trust me.
1051
00:45:14,781 --> 00:45:15,920
O'BRIEN [voiceover]:
Trust,
1052
00:45:15,954 --> 00:45:18,232
but always verify.
1053
00:45:18,267 --> 00:45:21,511
So, what kind of A.I. magic
1054
00:45:21,546 --> 00:45:23,859
is readily available online?
1055
00:45:23,893 --> 00:45:25,757
It's pretty simple
to make it look
1056
00:45:25,792 --> 00:45:28,208
like you're fluent
in another language.
1057
00:45:28,242 --> 00:45:30,658
[speaking Mandarin]:
1058
00:45:31,970 --> 00:45:33,109
It was pretty easy to do;
1059
00:45:33,144 --> 00:45:35,560
I just had to upload
a video and wait.
1060
00:45:35,594 --> 00:45:38,528
[speaking German]:
1061
00:45:39,944 --> 00:45:42,705
And, suddenly,
I look pretty darn smart.
1062
00:45:42,740 --> 00:45:45,950
[speaking Greek]:
1063
00:45:46,916 --> 00:45:49,022
Sure, it's fun,
but I think you can see
1064
00:45:49,056 --> 00:45:50,748
where it leads to mischief
1065
00:45:50,782 --> 00:45:53,129
and possibly even mayhem.
1066
00:45:54,165 --> 00:45:58,548
[voiceover]:
Yoshua Bengio is an
artificial intelligence pioneer.
1067
00:45:58,583 --> 00:46:00,240
He says he didn't spend
much time
1068
00:46:00,274 --> 00:46:02,725
thinking about
science fiction dystopia
1069
00:46:02,760 --> 00:46:05,659
as he was creating
the technology.
1070
00:46:05,693 --> 00:46:08,662
But as his brilliant ideas
became reality,
1071
00:46:08,696 --> 00:46:10,871
reality set in.
1072
00:46:10,906 --> 00:46:12,252
BENGIO:
And the more I read,
1073
00:46:12,286 --> 00:46:14,150
the more
I thought about it...
1074
00:46:14,185 --> 00:46:16,359
the more concerned I got.
1075
00:46:17,395 --> 00:46:20,398
If we are not honest
with ourselves,
1076
00:46:20,432 --> 00:46:21,399
we're gonna fool ourselves.
1077
00:46:21,433 --> 00:46:23,401
We're gonna...
lose.
1078
00:46:24,643 --> 00:46:25,990
O'BRIEN [voiceover]:
Avoiding that outcome
1079
00:46:26,024 --> 00:46:28,509
is now his main priority.
1080
00:46:28,544 --> 00:46:30,580
He has signed
several public warnings
1081
00:46:30,615 --> 00:46:32,755
issued by A.I. thought leaders,
1082
00:46:32,790 --> 00:46:36,103
including this stark
single-sentence statement
1083
00:46:36,138 --> 00:46:38,209
in May of 2023.
1084
00:46:38,243 --> 00:46:41,074
"Mitigating the risk of
extinction from A.I.
1085
00:46:41,108 --> 00:46:43,041
"should be a global priority
1086
00:46:43,076 --> 00:46:45,803
"alongside other
societal scale risks,
1087
00:46:45,837 --> 00:46:47,321
"such as pandemics
1088
00:46:47,356 --> 00:46:48,806
and nuclear war."
1089
00:46:51,878 --> 00:46:55,605
As we approach more and more
capable A.I. systems
1090
00:46:55,640 --> 00:46:59,886
that might even become stronger
than humans in many areas,
1091
00:46:59,920 --> 00:47:01,439
they become
more and more dangerous.
1092
00:47:01,473 --> 00:47:02,854
Can't we just pull
the plug on the thing?
1093
00:47:02,889 --> 00:47:04,442
Oh, that's
the safest thing to do,
1094
00:47:04,476 --> 00:47:05,719
pull the plug.
1095
00:47:05,753 --> 00:47:08,204
Before it gets
so powerful that
1096
00:47:08,239 --> 00:47:09,861
it prevents us from
pulling the plug.
1097
00:47:09,896 --> 00:47:11,794
DAVE:
Open the pod bay doors, Hal.
1098
00:47:11,829 --> 00:47:13,554
HAL:
I'm sorry, Dave,
1099
00:47:13,589 --> 00:47:15,591
I'm afraid I can't do that.
1100
00:47:16,764 --> 00:47:18,525
O'BRIEN [voiceover]:
It may be some time
1101
00:47:18,559 --> 00:47:20,078
before computers are able
1102
00:47:20,113 --> 00:47:22,460
to act like
movie supervillains...
1103
00:47:22,494 --> 00:47:23,530
HAL:
Goodbye.
1104
00:47:24,565 --> 00:47:28,293
O'BRIEN [voiceover]:
But there are near-term dangers
already emerging.
1105
00:47:28,328 --> 00:47:31,538
Besides deepfakes and
misinformation,
1106
00:47:31,572 --> 00:47:35,438
A.I. can also supercharge bias
and hate content,
1107
00:47:35,473 --> 00:47:38,269
replace human jobs...
1108
00:47:38,303 --> 00:47:39,926
This is why
we're striking, everybody.
[crowd exclaiming]
1109
00:47:41,065 --> 00:47:42,238
O'BRIEN [voiceover]:
And make it easier
1110
00:47:42,273 --> 00:47:45,655
for terrorists
to create bioweapons.
1111
00:47:45,690 --> 00:47:48,451
And A.I. systems are so complex
1112
00:47:48,486 --> 00:47:51,144
that they are difficult
to comprehend,
1113
00:47:51,178 --> 00:47:53,767
all but impossible to audit.
1114
00:47:53,801 --> 00:47:55,562
RUS [voiceover]:
Nobody really understands
1115
00:47:55,596 --> 00:47:58,220
how those systems
reach their decisions.
1116
00:47:58,254 --> 00:48:00,394
So we have to be
much more thoughtful
1117
00:48:00,429 --> 00:48:02,810
about how we
test and evaluate them
1118
00:48:02,845 --> 00:48:04,191
before releasing them.
1119
00:48:04,226 --> 00:48:07,367
They're concerned
whether the machine will be able
1120
00:48:07,401 --> 00:48:09,576
to begin to
think for itself.
1121
00:48:09,610 --> 00:48:12,268
O'BRIEN [voiceover]:
The U.S. and Europe have begun
charting a strategy
1122
00:48:12,303 --> 00:48:13,994
to try to ensure safe, secure,
1123
00:48:14,029 --> 00:48:17,446
and trustworthy
artificial intelligence.
1124
00:48:17,480 --> 00:48:19,931
RISHI SUNAK:
...in a way that will
be safe for our communities...
1125
00:48:19,966 --> 00:48:21,381
O'BRIEN [voiceover]:
But how to do that
1126
00:48:21,415 --> 00:48:23,590
in the midst of a frenetic race
1127
00:48:23,624 --> 00:48:24,591
to dominate a technology
1128
00:48:24,625 --> 00:48:28,629
with a predicted economic impact
1129
00:48:28,664 --> 00:48:32,357
of 13 trillion dollars by 2030?
1130
00:48:32,392 --> 00:48:35,740
There is such a strong
commercial incentive
1131
00:48:35,774 --> 00:48:38,225
to develop this
and win the competition
1132
00:48:38,260 --> 00:48:39,502
against the other companies,
1133
00:48:39,537 --> 00:48:41,884
not to mention
the other countries,
1134
00:48:41,919 --> 00:48:44,714
that it's hard
to stop that train.
1135
00:48:45,819 --> 00:48:49,133
But that's what
governments should be doing.
1136
00:48:49,167 --> 00:48:51,756
NEWS ANCHOR:
The titans of social media
1137
00:48:51,790 --> 00:48:54,517
didn't want to come to
Capitol Hill.
1138
00:48:54,552 --> 00:48:56,002
O'BRIEN [voiceover]:
Historically, the tech industry
1139
00:48:56,036 --> 00:48:58,936
has bridled against regulation.
1140
00:48:58,970 --> 00:49:01,973
You have an army of lawyers
and lobbyists
1141
00:49:02,008 --> 00:49:03,250
that have fought us on this...
1142
00:49:03,285 --> 00:49:04,355
SULEYMAN [voiceover]:
There's no question that
1143
00:49:04,389 --> 00:49:05,666
guardrails
will slow things down,
1144
00:49:05,701 --> 00:49:06,978
But the risks are uncertain
1145
00:49:07,013 --> 00:49:10,361
and potentially enormous.
1146
00:49:10,395 --> 00:49:11,776
So, it makes sense for us
1147
00:49:11,810 --> 00:49:13,605
to start having
the conversation right now.
1148
00:49:14,952 --> 00:49:16,470
O'BRIEN [voiceover]:
For me, the conversation
1149
00:49:16,505 --> 00:49:19,128
about A.I. is personal.
1150
00:49:19,163 --> 00:49:21,959
Okay, no network detected.
1151
00:49:21,993 --> 00:49:23,167
Okay, um...
1152
00:49:23,201 --> 00:49:25,514
Oh, here we go.
Okay.
1153
00:49:25,548 --> 00:49:27,343
And now I'm going to open,
open, open, open, open...
1154
00:49:28,827 --> 00:49:30,864
[voiceover]:
I used the Coapt app
1155
00:49:30,898 --> 00:49:34,040
to train the A.I.
inside my new prosthetic.
1156
00:49:34,074 --> 00:49:37,043
♪
1157
00:49:37,077 --> 00:49:38,734
It says all of my
training data is good,
1158
00:49:38,768 --> 00:49:40,011
it's four out of five stars.
1159
00:49:40,046 --> 00:49:41,426
And now let's try to close.
1160
00:49:41,461 --> 00:49:43,014
[whirring]
1161
00:49:43,049 --> 00:49:44,188
All right.
1162
00:49:44,222 --> 00:49:49,055
Seems to be doing
what it was told.
1163
00:49:49,089 --> 00:49:50,573
[voiceover]:
Was my new arm listening?
1164
00:49:50,608 --> 00:49:51,954
Maybe.
1165
00:49:51,989 --> 00:49:53,887
I decided to make things
simpler.
1166
00:49:54,992 --> 00:49:58,892
I took off the hand and
attached a myoelectric hook.
1167
00:49:58,926 --> 00:50:00,790
[quietly]:
All right.
1168
00:50:00,825 --> 00:50:03,414
[voiceover]:
Function over form.
1169
00:50:03,448 --> 00:50:06,106
Not a conversation piece
necessarily at a cocktail party
1170
00:50:06,141 --> 00:50:08,074
like this thing is.
1171
00:50:08,108 --> 00:50:10,835
This looks more like
Luke Skywalker, I suppose.
1172
00:50:10,869 --> 00:50:14,080
But this thing has a tremendous
amount of function to it.
1173
00:50:14,114 --> 00:50:16,806
Although, right now,
it wants to stay open.
1174
00:50:16,841 --> 00:50:18,774
[voiceover]:
And that problem persisted.
1175
00:50:18,808 --> 00:50:20,672
Find a tripod plate...
1176
00:50:20,707 --> 00:50:22,053
[voiceover]:
When I tried using it
1177
00:50:22,088 --> 00:50:23,986
to set up my basement studio
1178
00:50:24,021 --> 00:50:25,298
for a live broadcast.
1179
00:50:25,332 --> 00:50:27,990
Come on, close.
1180
00:50:28,025 --> 00:50:30,061
[voiceover]:
I was quickly frustrated.
1181
00:50:30,096 --> 00:50:32,443
[item drops, audio beep]
1182
00:50:32,477 --> 00:50:33,754
Really annoying.
1183
00:50:33,789 --> 00:50:36,240
Not useful.
1184
00:50:36,274 --> 00:50:39,588
[voiceover]:
The hook continuously
opened on its own.
1185
00:50:39,622 --> 00:50:41,279
[clattering]
Damn it!
1186
00:50:41,314 --> 00:50:43,764
[voiceover]:
So I completely reset
1187
00:50:43,799 --> 00:50:46,043
and retrained the arm.
1188
00:50:47,009 --> 00:50:48,976
And... reset,
there we go.
1189
00:50:49,011 --> 00:50:51,772
Add data...
1190
00:50:51,807 --> 00:50:54,465
[voiceover]:
But the software was
1191
00:50:54,499 --> 00:50:55,914
artificially unhappy.
1192
00:50:57,847 --> 00:50:59,918
"Electrodes are not
making good skin contact."
1193
00:50:59,953 --> 00:51:02,369
Maybe that is my problem,
ultimately.
1194
00:51:03,681 --> 00:51:05,303
[voiceover]:
My problem really is
1195
00:51:05,338 --> 00:51:07,650
I haven't given this
enough time.
1196
00:51:07,685 --> 00:51:09,928
Amputees tell me it can take
1197
00:51:09,963 --> 00:51:11,585
many months to really learn
1198
00:51:11,620 --> 00:51:13,587
how to use an arm like
this one.
1199
00:51:14,588 --> 00:51:17,281
The choke point isn't
artificial intelligence.
1200
00:51:17,315 --> 00:51:19,938
Dead as a doornail.
1201
00:51:19,973 --> 00:51:21,664
[voiceover]:
But rather, what is the best way
1202
00:51:21,699 --> 00:51:23,597
to communicate
my intentions to it?
1203
00:51:24,978 --> 00:51:26,428
Little reboot there,
I guess.
1204
00:51:26,462 --> 00:51:28,188
All right.
1205
00:51:28,223 --> 00:51:29,327
Close.
1206
00:51:29,362 --> 00:51:31,881
Open, close.
1207
00:51:31,916 --> 00:51:34,229
[voiceover]:
It turns out machine learning
1208
00:51:34,263 --> 00:51:37,266
isn't smart enough to
give me a replacement arm
1209
00:51:37,301 --> 00:51:39,234
like Luke Skywalker got.
1210
00:51:39,268 --> 00:51:43,100
Nor is it capable
of creating the Terminator.
1211
00:51:43,134 --> 00:51:47,000
Right now, it seems many
hopes and fears
1212
00:51:47,034 --> 00:51:48,070
for artificial intelligence...
1213
00:51:48,105 --> 00:51:49,520
Oh!
1214
00:51:49,554 --> 00:51:52,281
[voiceover]:
...are rooted
in science fiction.
1215
00:51:54,249 --> 00:51:58,253
But we are walking down a road
to the unknown.
1216
00:51:58,287 --> 00:52:01,187
The door is opening to
a revolution.
1217
00:52:02,705 --> 00:52:03,706
[door closes]
1218
00:52:03,741 --> 00:52:07,741
♪