3
00:01:04,847 --> 00:01:09,329
Now, the present-day computers
are complete morons,
4
00:01:09,330 --> 00:01:12,158
but this will not be true
in another generation.
5
00:01:12,159 --> 00:01:14,899
They will start to think,
and eventually they will
6
00:01:14,900 --> 00:01:17,641
completely outthink
their makers.
7
00:01:17,642 --> 00:01:20,296
Is this depressing?
I don't see why it should be.
8
00:01:20,297 --> 00:01:24,866
I suspect that organic
or biological evolution
9
00:01:24,867 --> 00:01:26,303
has about come to its end.
10
00:01:29,567 --> 00:01:30,698
You don't want me
to be the narrator.
11
00:01:30,699 --> 00:01:32,395
I don't have a good voice.
12
00:01:32,396 --> 00:01:34,049
You have a great voice.
Just do it.
13
00:01:34,050 --> 00:01:36,225
It's just a--
we're just trying it out.
14
00:01:36,226 --> 00:01:38,009
Do I get paid for this shit?
15
00:01:38,010 --> 00:01:40,273
Well,
we'll see how well you do.
16
00:01:41,623 --> 00:01:43,451
♪ ♪
17
00:01:46,454 --> 00:01:47,932
I need to call my parents.
18
00:01:49,109 --> 00:01:51,066
Sometimes we rush into things
19
00:01:51,067 --> 00:01:52,894
without thinking them through.
20
00:01:52,895 --> 00:01:54,765
Yeah, can you sh...
you want to shoot? That's...
21
00:01:54,766 --> 00:01:57,246
Like when Daniel
and Caroline got married
22
00:01:57,247 --> 00:02:00,206
151 days after they met.
23
00:02:00,207 --> 00:02:01,817
Let's rewind.
24
00:02:04,820 --> 00:02:07,213
Have you been thinking about
buying a home computer?
25
00:02:07,214 --> 00:02:09,345
In 1993, when Daniel was born,
26
00:02:09,346 --> 00:02:12,174
his parents didn't even have
a computer in the house.
27
00:02:12,175 --> 00:02:13,828
But he remembers
when they finally got one.
28
00:02:13,829 --> 00:02:15,743
What a computer is to me is
29
00:02:15,744 --> 00:02:18,398
the equivalent of a bicycle
for our minds.
30
00:02:18,399 --> 00:02:20,269
Hello.
31
00:02:20,270 --> 00:02:22,228
His computer helped
unleash his creativity.
32
00:02:22,229 --> 00:02:23,968
Rolling. Go.
33
00:02:23,969 --> 00:02:26,623
He used
his dad's iMac to edit videos.
34
00:02:26,624 --> 00:02:28,364
Yeah!
35
00:02:28,365 --> 00:02:29,974
And even make
little animations.
36
00:02:31,977 --> 00:02:35,110
In 1998, when Caroline was
a little girl...
37
00:02:35,111 --> 00:02:36,764
Hi, Mommy.
38
00:02:36,765 --> 00:02:38,200
...only nerds knew
what the Internet was.
39
00:02:38,201 --> 00:02:39,854
And what about
this Internet thing?
40
00:02:39,855 --> 00:02:41,377
Do you, do you know anything
about that? -Sure.
41
00:02:41,378 --> 00:02:42,813
But soon,
42
00:02:42,814 --> 00:02:44,946
- everyone was on the...
- ...Internets.
43
00:02:44,947 --> 00:02:47,035
...and computers were
beating humans at chess.
44
00:02:47,036 --> 00:02:50,038
Deep Blue--
Kasparov has resigned!
45
00:02:50,039 --> 00:02:52,649
And by the time
Caroline went to college,
46
00:02:52,650 --> 00:02:55,130
computers were
in everyone's pocket.
47
00:02:55,131 --> 00:02:58,438
Now Daniel could make movies
with his phone.
48
00:02:58,439 --> 00:03:01,528
He grew up to be an artist
and a filmmaker...
49
00:03:01,529 --> 00:03:03,965
- Good night.
- ...and so did Caroline.
50
00:03:03,966 --> 00:03:05,749
Cut. Let's go into a close-up.
That was perfect.
51
00:03:05,750 --> 00:03:07,316
By then, computers were
connecting people
52
00:03:07,317 --> 00:03:09,013
and changing the world in ways
53
00:03:09,014 --> 00:03:10,711
that we never
could have imagined.
54
00:03:10,712 --> 00:03:12,365
All the money raised by
the Ice Bucket Challenge...
55
00:03:12,366 --> 00:03:14,236
Some of it was good.
56
00:03:15,543 --> 00:03:17,152
Some not so good.
57
00:03:17,153 --> 00:03:19,328
Anxiety, self-harm, suicide...
58
00:03:19,329 --> 00:03:21,243
But it's clear now
that we didn't do enough
59
00:03:21,244 --> 00:03:24,594
to prevent these tools from
being used for harm as well.
60
00:03:24,595 --> 00:03:26,422
And as the world
got more and more focused
61
00:03:26,423 --> 00:03:28,076
on their computers,
62
00:03:28,077 --> 00:03:30,210
Daniel focused on his artwork.
63
00:03:32,037 --> 00:03:35,649
Wherever he went,
he always had a sketchbook,
64
00:03:35,650 --> 00:03:38,434
including the night
he met Caroline.
65
00:03:38,435 --> 00:03:41,698
He drew a terrible picture
of her, and 20 minutes later,
66
00:03:41,699 --> 00:03:44,179
he boldly pronounced
they were going to get married,
67
00:03:44,180 --> 00:03:47,617
which, as you know, they did,
151 days later.
68
00:03:47,618 --> 00:03:49,967
...now pronounce you
husband and wife.
69
00:03:49,968 --> 00:03:51,708
They moved into a cute house...
70
00:03:51,709 --> 00:03:52,883
-Moose, hey! Come here!
- ...and got a dog
71
00:03:52,884 --> 00:03:54,711
as dumb as Daniel.
72
00:03:54,712 --> 00:03:57,323
Meanwhile, computers were
writing entire essays
73
00:03:57,324 --> 00:03:59,368
based on simple questions like,
74
00:03:59,369 --> 00:04:03,198
"How hard would it be to build
a shack in my backyard?"
75
00:04:03,199 --> 00:04:07,115
And so Daniel built a shack
in his backyard.
76
00:04:07,116 --> 00:04:09,987
But just as he sat down to start
working on his next movie,
77
00:04:09,988 --> 00:04:13,644
he learned that computers
could now write screenplays.
78
00:04:14,863 --> 00:04:16,603
I mean, they were bad,
79
00:04:16,604 --> 00:04:18,996
but they were getting better
really fast.
80
00:04:18,997 --> 00:04:21,695
They could create new images
and videos from scratch,
81
00:04:21,696 --> 00:04:24,828
and some of them
could even pass the bar exam.
82
00:04:24,829 --> 00:04:28,876
Not just pass the bar but
be in the top ten percentile.
83
00:04:28,877 --> 00:04:30,573
It was confusing.
84
00:04:32,184 --> 00:04:35,099
Daniel just wanted
a bicycle for his mind,
85
00:04:35,100 --> 00:04:38,668
but computers had become
a self-driving rocket ship.
86
00:04:42,020 --> 00:04:44,805
Pioneers in the field
of artificial intelligence
87
00:04:44,806 --> 00:04:46,546
are pleading with Congress
88
00:04:46,547 --> 00:04:48,548
to pass safety rules
before it's too late.
89
00:04:48,549 --> 00:04:50,593
Now it felt like
the whole world
90
00:04:50,594 --> 00:04:53,030
was rushing into something
without thinking it through,
91
00:04:53,031 --> 00:04:56,382
and everyone had an opinion.
92
00:04:56,383 --> 00:05:00,168
Was it gonna destroy the world
or save humanity?
93
00:05:00,169 --> 00:05:02,431
There was too much information,
94
00:05:02,432 --> 00:05:04,694
which made him anxious,
which made Caroline anxious...
95
00:05:06,044 --> 00:05:07,828
I don't have kids yet,
but I worry
96
00:05:07,829 --> 00:05:10,221
what the world would look like
if I decide to.
97
00:05:10,222 --> 00:05:12,398
He was starting to spin out.
98
00:05:14,096 --> 00:05:16,925
It was becoming
a mountain of anxiety.
99
00:05:19,144 --> 00:05:21,364
-It's horrible.
- It's freakin' Siri.
100
00:05:22,670 --> 00:05:25,628
And so, Daniel decided to go out
101
00:05:25,629 --> 00:05:28,022
and find someone
who could explain this to him,
102
00:05:28,023 --> 00:05:30,198
so he could stop
thinking about it
103
00:05:30,199 --> 00:05:33,506
and he and Caroline could
get on with their lives.
104
00:05:33,507 --> 00:05:36,857
♪ We are in control,
we are in control ♪
105
00:05:36,858 --> 00:05:38,772
♪ We are in control ♪
106
00:05:38,773 --> 00:05:40,556
♪ Chemical computer
thinking battery ♪
107
00:05:40,557 --> 00:05:42,602
♪ We are in control, we are... ♪
108
00:05:42,603 --> 00:05:45,735
This endeavor would turn out
to be hopelessly naive
109
00:05:45,736 --> 00:05:49,086
and kick off the most confusing
year of his life.
110
00:05:49,087 --> 00:05:50,740
♪ We are in control... ♪
111
00:05:50,741 --> 00:05:52,829
But as we know,
we sometimes rush into things
112
00:05:52,830 --> 00:05:54,657
without thinking them through.
113
00:05:54,658 --> 00:05:56,398
♪ Chemical computer
thinking battery... ♪
114
00:05:56,399 --> 00:05:57,965
Oh, my gosh. What is happening?
115
00:05:57,966 --> 00:05:59,967
He had questions.
116
00:05:59,968 --> 00:06:01,621
Questions only the
smartest nerds could answer.
117
00:06:01,622 --> 00:06:04,188
Submitting to the interrogator.
118
00:06:04,189 --> 00:06:06,060
Questions like:
119
00:06:06,061 --> 00:06:08,628
Was this the apocalypse
or did he actually have reason
120
00:06:08,629 --> 00:06:11,152
- to be optimistic?
- Yes.
121
00:06:11,153 --> 00:06:13,807
♪ Chemical computer thinking
battery. ♪ -
122
00:06:13,808 --> 00:06:16,200
Uh, is that good?
123
00:06:16,201 --> 00:06:17,637
Yeah.
124
00:06:17,638 --> 00:06:20,204
So, to begin,
125
00:06:20,205 --> 00:06:22,163
what is artificial intelligence?
126
00:06:22,164 --> 00:06:24,731
I know that must be annoying
for you, that-that question,
127
00:06:24,732 --> 00:06:26,515
but I do think it's important.
128
00:06:26,516 --> 00:06:29,823
So... AI...
129
00:06:29,824 --> 00:06:31,868
Um...
130
00:06:31,869 --> 00:06:33,696
- Yeah...
- Uh, hmm.
131
00:06:33,697 --> 00:06:34,784
That's a good question.
132
00:06:34,785 --> 00:06:36,220
- Yeah, um...
- Um...
133
00:06:36,221 --> 00:06:37,700
What is AI?
134
00:06:39,137 --> 00:06:40,790
I love that that's
the first question,
135
00:06:40,791 --> 00:06:43,184
'cause there is not
a clear and consistent answer.
136
00:06:43,185 --> 00:06:45,316
Artificial intelligence is
137
00:06:45,317 --> 00:06:50,757
a kind of intentionally
and maybe uselessly broad term.
138
00:06:50,758 --> 00:06:53,368
It's a machine
139
00:06:53,369 --> 00:06:55,588
doing things that
we previously only thought
140
00:06:55,589 --> 00:06:57,285
that people could do:
141
00:06:57,286 --> 00:07:01,158
making recommendations,
decisions and predictions.
142
00:07:01,159 --> 00:07:06,816
AI is the, uh, application
of computer science
143
00:07:06,817 --> 00:07:09,166
to solving cognitive problems.
144
00:07:09,167 --> 00:07:13,344
Okay, so when I picture AI,
145
00:07:13,345 --> 00:07:16,783
it's sort of like
this magical computer box...
146
00:07:20,527 --> 00:07:24,138
...just, like, floating
in, like, inert space.
147
00:07:24,139 --> 00:07:26,488
And no matter
how many times people try
148
00:07:26,489 --> 00:07:29,752
and explain this to me,
I just don't get how...
149
00:07:29,753 --> 00:07:32,538
how it's understanding
all of these things
150
00:07:32,539 --> 00:07:35,715
and how it's feeling like
intelligence.
151
00:07:37,195 --> 00:07:39,241
And that's kind of
nerve-racking.
152
00:07:41,243 --> 00:07:45,552
What is
this new generation of AI?
153
00:07:47,205 --> 00:07:50,120
This AI that is different than
every other generation?
154
00:07:50,121 --> 00:07:52,775
Like, no one ever
talked about, like,
155
00:07:52,776 --> 00:07:56,953
Siri taking over the world
or causing catastrophes.
156
00:07:56,954 --> 00:07:58,520
-Hi, Siri?
-...want to.
157
00:07:58,521 --> 00:08:00,566
Hello? Siri?
158
00:08:00,567 --> 00:08:02,611
Hello? Hey, Siri!
159
00:08:02,612 --> 00:08:05,788
Or the voice in Google Maps,
which mispronounces road names,
160
00:08:05,789 --> 00:08:07,529
like, breaking society.
161
00:08:07,530 --> 00:08:10,663
Google Maps says
this is a road.
162
00:08:10,664 --> 00:08:13,100
But I think I'm in a river.
163
00:08:13,101 --> 00:08:14,667
This is definitely a river, innit?
164
00:08:14,668 --> 00:08:17,192
Something changed
with ChatGPT coming out.
165
00:08:18,193 --> 00:08:19,976
People understood-- no, no, no--
166
00:08:19,977 --> 00:08:21,500
this technology's
insanely valuable,
167
00:08:21,501 --> 00:08:23,676
it's insanely powerful
and also insanely scary.
168
00:08:23,677 --> 00:08:25,591
Okay, listen to this.
Very creepy.
169
00:08:25,592 --> 00:08:28,637
A new artificial intelligence
tool is going viral
170
00:08:28,638 --> 00:08:32,294
for cranking out entire essays
in a matter of seconds.
171
00:08:35,950 --> 00:08:41,215
AI dwarfs the power of all
other technologies combined.
172
00:08:41,216 --> 00:08:43,739
"AI dwarfs the power
173
00:08:43,740 --> 00:08:46,089
of all other technologies
combined."
174
00:08:46,090 --> 00:08:47,264
Yeah.
175
00:08:47,265 --> 00:08:48,744
Do you think that's true?
176
00:08:48,745 --> 00:08:50,790
Yes.
177
00:08:50,791 --> 00:08:53,227
Tell me about-- How? How?
178
00:08:53,228 --> 00:08:56,752
I think, to paint that picture,
it's really important
179
00:08:56,753 --> 00:09:00,669
to understand what today's
state-of-the-art systems
180
00:09:00,670 --> 00:09:02,192
look like and how they're built.
181
00:09:02,193 --> 00:09:05,805
This is
quite a... quite a setup.
182
00:09:05,806 --> 00:09:07,284
So, one thing that
183
00:09:07,285 --> 00:09:09,722
not a lot of people
realize is that
184
00:09:09,723 --> 00:09:13,726
systems like ChatGPT aren't
programmed by any human.
185
00:09:13,727 --> 00:09:15,249
What do you mean?
186
00:09:15,250 --> 00:09:17,033
Instead, it's something like
th-they're grown.
187
00:09:17,034 --> 00:09:19,209
We kind of give them
raw resources, like,
188
00:09:19,210 --> 00:09:21,037
"Here's a lot of
computational resources.
189
00:09:21,038 --> 00:09:22,212
Here's a lot of data."
190
00:09:22,213 --> 00:09:23,910
Under the hood, it's math,
191
00:09:23,911 --> 00:09:27,348
and the math is actually
surprisingly straightforward.
192
00:09:27,349 --> 00:09:31,134
So ChatGPT is a kind
of AI but it's not all of AI?
193
00:09:31,135 --> 00:09:32,701
Totally. ChatGPT is
just the beginning,
194
00:09:32,702 --> 00:09:34,137
but it's a good place to start.
195
00:09:34,138 --> 00:09:37,837
But I still don't know
what AI is.
196
00:09:37,838 --> 00:09:40,404
To understand AI,
it begins with understanding
197
00:09:40,405 --> 00:09:43,190
that intelligence is about
recognizing patterns.
198
00:09:43,191 --> 00:09:44,887
- Patterns. -Patterns.
- Patterns.
199
00:09:44,888 --> 00:09:48,587
It is shown
trillions of words of text
200
00:09:48,588 --> 00:09:51,328
across millions of documents
on the Internet.
201
00:09:51,329 --> 00:09:52,808
It started with text.
202
00:09:52,809 --> 00:09:55,942
And what they did was
they took textbooks,
203
00:09:55,943 --> 00:09:59,554
and they took poems and essays
and instruction manuals.
204
00:09:59,555 --> 00:10:00,903
They can do things like
205
00:10:00,904 --> 00:10:03,166
digest the entire Internet,
206
00:10:03,167 --> 00:10:06,779
every single word that's ever
been written by a person.
207
00:10:06,780 --> 00:10:09,999
Reddit threads and social media
and all of Wikipedia.
208
00:10:10,000 --> 00:10:13,655
More data than anybody could
ever read in several lifetimes.
209
00:10:13,656 --> 00:10:15,526
And they gave this system
one job.
210
00:10:15,527 --> 00:10:18,529
Figure out the patterns and
structure of that information
211
00:10:18,530 --> 00:10:20,793
and use that
to make predictions about
212
00:10:20,794 --> 00:10:23,273
what word should come next
in a sentence.
213
00:10:23,274 --> 00:10:25,014
When you say
"patterns in a sentence,"
214
00:10:25,015 --> 00:10:26,799
what are you talking about?
215
00:10:26,800 --> 00:10:29,410
So, it's everything from, like,
the really simple things,
216
00:10:29,411 --> 00:10:32,021
like most sentences end
with a period,
217
00:10:32,022 --> 00:10:34,937
all the way up to the
more conceptual things, like:
218
00:10:34,938 --> 00:10:36,809
What is a sonnet?
219
00:10:36,810 --> 00:10:38,333
It's a type of poem, and it has
some particular structure.
220
00:10:39,421 --> 00:10:41,291
So, it then looks at
221
00:10:41,292 --> 00:10:43,642
all of that data,
all of that text...
222
00:10:43,643 --> 00:10:46,079
And over trillions
and trillions of tries,
223
00:10:46,080 --> 00:10:48,821
each time it gets something
right or wrong,
224
00:10:48,822 --> 00:10:51,301
it's given a little bit
of positive reinforcement
225
00:10:51,302 --> 00:10:53,434
when it guesses
the next word correctly,
226
00:10:53,435 --> 00:10:55,741
and it's given a little bit
of negative reinforcement
227
00:10:55,742 --> 00:10:59,396
when it guesses
the next word incorrectly.
228
00:10:59,397 --> 00:11:01,616
And at the end of it,
you have a system that
229
00:11:01,617 --> 00:11:05,098
speaks really good English
as a side effect
230
00:11:05,099 --> 00:11:06,839
of being really, really good
at predicting
231
00:11:06,840 --> 00:11:09,102
the word that comes next
in a piece of text.
232
00:11:09,103 --> 00:11:12,061
Actually, my French
isn't too bad either.
233
00:11:12,062 --> 00:11:14,063
It uses all of
those patterns it has learned
234
00:11:14,064 --> 00:11:15,717
to be able to make
a prediction about
235
00:11:15,718 --> 00:11:17,284
what the answer should be,
236
00:11:17,285 --> 00:11:18,851
then it gives you that
as the output.
237
00:11:18,852 --> 00:11:21,070
It's a little oversimplified,
238
00:11:21,071 --> 00:11:23,116
but I think people will get it.
239
00:11:23,117 --> 00:11:24,987
So, so that's all it does?
240
00:11:24,988 --> 00:11:26,859
Yeah. It doesn't seem like
it would be that complicated,
241
00:11:26,860 --> 00:11:28,077
but actually you have to know
242
00:11:28,078 --> 00:11:29,688
a huge amount of things
243
00:11:29,689 --> 00:11:31,386
in order to
actually succeed at that.
244
00:11:32,517 --> 00:11:34,301
If you say to ChatGPT,
245
00:11:34,302 --> 00:11:36,738
"Write me a Shakespearean
sonnet about my dog,"
246
00:11:36,739 --> 00:11:38,827
it has to know what dogs are.
247
00:11:38,828 --> 00:11:40,481
It has to know what you love
about your dog.
248
00:11:40,482 --> 00:11:43,049
It has to know
who Shakespeare is,
249
00:11:43,050 --> 00:11:45,616
that sonnets rhyme,
that they have a structure,
250
00:11:45,617 --> 00:11:47,793
that words have sounds
that can rhyme.
251
00:11:47,794 --> 00:11:49,316
It takes a lot.
252
00:11:50,927 --> 00:11:52,972
Holy shit, you can talk
to your computer now.
253
00:11:52,973 --> 00:11:56,105
That was just not true
three years ago.
254
00:11:56,106 --> 00:11:57,672
Yes, and this is
the really important part.
255
00:11:57,673 --> 00:12:00,631
The same process that lets AI
256
00:12:00,632 --> 00:12:03,199
uncover and manipulate
the patterns of text
257
00:12:03,200 --> 00:12:05,811
is the same process
that lets it uncover
258
00:12:05,812 --> 00:12:10,554
the patterns of the entire
universe and everything in it.
259
00:12:10,555 --> 00:12:12,905
There are patterns
in images and sound
260
00:12:12,906 --> 00:12:15,995
in computer code and DNA
and music and physics
261
00:12:15,996 --> 00:12:18,519
and fashion and building design
262
00:12:18,520 --> 00:12:21,609
and in
human voices and human faces,
263
00:12:21,610 --> 00:12:23,959
really, truly everywhere.
264
00:12:23,960 --> 00:12:26,048
- Everywhere. -Everywhere.
- Everywhere. -Everywhere.
265
00:12:26,049 --> 00:12:27,789
If you have learned
those patterns,
266
00:12:27,790 --> 00:12:29,704
you can generate
new kinds of songs.
267
00:12:29,705 --> 00:12:31,184
You can generate new videos.
268
00:12:31,185 --> 00:12:32,663
And that's why, if you give it
269
00:12:32,664 --> 00:12:34,753
a three-second recording
of your grandmother,
270
00:12:34,754 --> 00:12:36,580
it can speak back in her voice.
271
00:12:36,581 --> 00:12:37,973
Oh, my God.
272
00:12:37,974 --> 00:12:40,628
Oh, my God. Oh, my God.
273
00:12:40,629 --> 00:12:42,891
What will they think of next?
274
00:12:42,892 --> 00:12:44,066
It's moving very, very quickly.
275
00:12:44,067 --> 00:12:45,851
An American AI start-up
276
00:12:45,852 --> 00:12:47,635
has released its latest model.
277
00:12:47,636 --> 00:12:50,812
That company is Anthropic,
and it has just unveiled
278
00:12:50,813 --> 00:12:53,815
the latest versions
of its AI assistant Claude.
279
00:12:53,816 --> 00:12:57,210
So, the xAI team
was there to unveil Grok 4.
280
00:12:57,211 --> 00:13:00,169
Google released one
just last week. Gemini is...
281
00:13:00,170 --> 00:13:03,042
We've gone from GPT-2
just a couple years ago,
282
00:13:03,043 --> 00:13:05,566
which could barely write
a coherent paragraph,
283
00:13:05,567 --> 00:13:08,003
to GPT-4,
which can pass the bar exam.
284
00:13:08,004 --> 00:13:10,397
And all they had to do
to get there
285
00:13:10,398 --> 00:13:13,008
was essentially add more data
and more compute.
286
00:13:13,009 --> 00:13:16,316
These people who are
building this... -COTRA: Yeah.
287
00:13:16,317 --> 00:13:18,057
...they're just throwing more...
288
00:13:18,058 --> 00:13:21,843
More physical computers,
more of the same kinds of data.
289
00:13:21,844 --> 00:13:24,585
Because the more
computing power you add,
290
00:13:24,586 --> 00:13:28,067
the more complex
intellectual tasks they can do.
291
00:13:28,068 --> 00:13:30,199
So, the more weather data
you give it,
292
00:13:30,200 --> 00:13:31,940
the better it can
make predictions about
293
00:13:31,941 --> 00:13:34,247
where a hurricane might go.
294
00:13:34,248 --> 00:13:36,379
And the more patterns
of tumors and bones
295
00:13:36,380 --> 00:13:38,904
and tissues an AI has seen,
then the better able it is
296
00:13:38,905 --> 00:13:41,994
to detect a tumor
in a new CT scan.
297
00:13:41,995 --> 00:13:43,691
Better even than a human doctor.
298
00:13:43,692 --> 00:13:46,999
AI that's already being
deployed for the military
299
00:13:47,000 --> 00:13:49,175
can already use
satellite imagery,
300
00:13:49,176 --> 00:13:51,830
troop movements, communications
to determine,
301
00:13:51,831 --> 00:13:53,396
sometimes days in advance,
302
00:13:53,397 --> 00:13:55,050
where an attack
is going to happen,
303
00:13:55,051 --> 00:13:57,444
like where an enemy
is going to strike.
304
00:13:57,445 --> 00:13:59,489
This whole
space is moving so fast
305
00:13:59,490 --> 00:14:01,709
that any example
you put in this movie
306
00:14:01,710 --> 00:14:04,625
will feel absolutely clumsy
by the time it comes out.
307
00:14:11,241 --> 00:14:14,156
These models are being released
308
00:14:14,157 --> 00:14:17,116
before anyone knows
what they're even capable of.
309
00:14:17,117 --> 00:14:22,556
GPT-3.5 was released and out
to 100 million people plus
310
00:14:22,557 --> 00:14:25,994
before some researchers
discovered that it could do
311
00:14:25,995 --> 00:14:28,779
research-grade chemistry
better than models
312
00:14:28,780 --> 00:14:32,827
that were trained specifically
to do research-grade chemistry.
313
00:14:32,828 --> 00:14:35,482
Something is happening in there
that the people
314
00:14:35,483 --> 00:14:38,311
who are building them
don't fully understand.
315
00:14:38,312 --> 00:14:41,009
Basically, it just analyzes
the data by itself,
316
00:14:41,010 --> 00:14:45,100
and as it does that,
it just teaches itself
317
00:14:45,101 --> 00:14:48,016
various things
that we often didn't intend.
318
00:14:48,017 --> 00:14:49,975
So, for instance,
it reads a lot online,
319
00:14:49,976 --> 00:14:52,586
and then at some point, it just
learns how to do arithmetic.
320
00:14:52,587 --> 00:14:54,501
One, two.
321
00:14:54,502 --> 00:14:56,155
And then at
some point, it starts to learn
322
00:14:56,156 --> 00:14:58,592
how to answer
advanced physics questions.
323
00:14:58,593 --> 00:15:00,811
We didn't program that
in it whatsoever.
324
00:15:00,812 --> 00:15:02,423
It just learned by itself.
325
00:15:05,382 --> 00:15:07,296
An AI is like a digital brain.
326
00:15:07,297 --> 00:15:08,732
But just like a human brain,
327
00:15:08,733 --> 00:15:10,560
if you did a brain scan
on a human brain,
328
00:15:10,561 --> 00:15:13,128
would you know everything
that person was capable of?
329
00:15:13,129 --> 00:15:15,174
You can't know that
just from the brain scan.
330
00:15:15,175 --> 00:15:17,959
It's just, like,
a bunch of numbers and, like,
331
00:15:17,960 --> 00:15:20,222
the multiplications that are
happening that-that, like,
332
00:15:20,223 --> 00:15:22,268
the best machine learning
researcher in the world
333
00:15:22,269 --> 00:15:24,314
could look at and, like, have
no idea what was happening.
334
00:15:30,973 --> 00:15:32,756
That chair right there.
335
00:15:32,757 --> 00:15:35,890
- Is that okay for you?
- Yes.
336
00:15:35,891 --> 00:15:37,718
So,
that's kind of mind-boggling.
337
00:15:37,719 --> 00:15:39,111
Okay? Like, it's taking over
the world,
338
00:15:39,112 --> 00:15:41,113
and we don't even know
how it works.
339
00:15:41,114 --> 00:15:43,158
- Is that right?
- Mm.
340
00:15:43,159 --> 00:15:46,988
We do understand
a number of important things,
341
00:15:46,989 --> 00:15:50,513
but we don't have
a very good grasp on
342
00:15:50,514 --> 00:15:53,255
why they provide
specific answers to questions.
343
00:15:53,256 --> 00:15:58,652
It is a problem because we are
on a path t-to build machines,
344
00:15:58,653 --> 00:16:03,135
based on these principles,
that could be smarter than us
345
00:16:03,136 --> 00:16:06,139
and thus potentially have
a lot of power.
346
00:16:12,580 --> 00:16:15,321
One of the most cited
computer scientists in history.
347
00:16:15,322 --> 00:16:16,975
I actually find it
a little difficult
348
00:16:16,976 --> 00:16:18,541
to talk about my own role.
349
00:16:18,542 --> 00:16:21,109
Really much prefer
when other people do it.
350
00:16:21,110 --> 00:16:23,938
Ilya joining was the...
was-was the linchpin for
351
00:16:23,939 --> 00:16:25,548
OpenAI being
ultimately successful.
352
00:16:25,549 --> 00:16:27,246
I think it's just going to be
353
00:16:27,247 --> 00:16:30,640
some kind of a vast, dramatic
and unimaginable impact.
354
00:16:30,641 --> 00:16:32,903
I don't know if you've spent
any time on YouTube,
355
00:16:32,904 --> 00:16:34,993
but you can kind of feel
the speed already, right?
356
00:16:34,994 --> 00:16:36,298
You know what I mean?
357
00:16:36,299 --> 00:16:38,387
This is just the warmup.
358
00:16:38,388 --> 00:16:40,476
The really powerful systems
are still coming,
359
00:16:40,477 --> 00:16:42,262
and they're gonna be coming
quite soon.
360
00:16:47,963 --> 00:16:51,618
AGI stands for "artificial
general intelligence."
361
00:16:51,619 --> 00:16:53,795
Uh, systems that are
basically...
362
00:16:58,191 --> 00:17:00,279
And this is, like, you know,
seems to be, like,
363
00:17:00,280 --> 00:17:03,064
the holy grail of AI?
364
00:17:03,065 --> 00:17:05,197
When you can simulate
a human mind
365
00:17:05,198 --> 00:17:07,895
that is doing human cognition
and can do reasoning,
366
00:17:07,896 --> 00:17:10,985
that is a new sort of tier of AI
367
00:17:10,986 --> 00:17:14,293
that we have to distinguish
from previous AI.
368
00:17:14,294 --> 00:17:16,643
When that happens,
by the way, that's when
369
00:17:16,644 --> 00:17:21,082
you would hire one of
those AGIs instead of a person.
370
00:17:21,083 --> 00:17:24,520
Most jobs in our economy
it can do.
371
00:17:24,521 --> 00:17:25,956
It can work 24 hours a day,
372
00:17:25,957 --> 00:17:28,220
never gets tired,
never gets bored.
373
00:17:28,221 --> 00:17:30,178
They don't need to sleep.
They don't need breaks.
374
00:17:30,179 --> 00:17:31,919
They're, like,
not gonna join a union.
375
00:17:31,920 --> 00:17:33,921
Won't complain,
won't whistleblow.
376
00:17:33,922 --> 00:17:35,836
More than 100 times cheaper
377
00:17:35,837 --> 00:17:38,273
than humans working
at m-minimum wage.
378
00:17:38,274 --> 00:17:39,535
Not only will they be
doing everything,
379
00:17:39,536 --> 00:17:41,059
but they'll be doing it faster.
380
00:17:41,060 --> 00:17:43,104
The same intelligence
381
00:17:43,105 --> 00:17:45,411
that powers that can also look
at the patterns and movements
382
00:17:45,412 --> 00:17:47,891
and articulating muscles
and, you know, robotics.
383
00:17:47,892 --> 00:17:50,111
And so it's not just
gonna automate desk jobs.
384
00:17:50,112 --> 00:17:51,765
That's just the beginning.
385
00:17:51,766 --> 00:17:55,595
It will automate
all physical labor.
386
00:17:55,596 --> 00:17:57,031
There's no way
387
00:17:57,032 --> 00:17:59,207
humans are gonna compete
with them.
388
00:18:05,606 --> 00:18:09,304
It is hard to conceptualize
the impact of AGI.
389
00:18:11,090 --> 00:18:12,873
But I think it's going to be
something very big
390
00:18:12,874 --> 00:18:15,181
and drastic and radical.
391
00:18:17,226 --> 00:18:19,053
You think
this is one of the most
392
00:18:19,054 --> 00:18:20,924
consequential moments
in human history?
393
00:18:20,925 --> 00:18:23,362
Yeah. Yeah, that's--
I mean, what else would be?
394
00:18:23,363 --> 00:18:25,494
I mean, like, there's
the Industrial Revolution.
395
00:18:25,495 --> 00:18:27,235
You know, it'll
make the Industrial Revolution
396
00:18:27,236 --> 00:18:29,890
look like small beans.
397
00:18:29,891 --> 00:18:32,153
AGI is an inflection point
398
00:18:32,154 --> 00:18:34,068
because it means
you can accelerate
399
00:18:34,069 --> 00:18:38,028
all other intellectual fields
all at the same time.
400
00:18:38,029 --> 00:18:39,856
Like, if you make an advance
in rocketry,
401
00:18:39,857 --> 00:18:42,033
that doesn't advance
biology and medicine.
402
00:18:43,905 --> 00:18:45,558
If you make an advance
in medicine,
403
00:18:45,559 --> 00:18:47,126
that doesn't advance rocketry.
404
00:18:48,866 --> 00:18:51,346
But if you make an advance
in artificial intelligence,
405
00:18:51,347 --> 00:18:54,175
that advances all scientific
and technological fields
406
00:18:54,176 --> 00:18:55,655
all at the same time.
407
00:18:55,656 --> 00:18:57,135
That's why, for a long time,
408
00:18:57,136 --> 00:18:59,224
Google DeepMind's
mission statement was...
409
00:18:59,225 --> 00:19:01,008
- Step one, solve intelligence.
-Yeah.
410
00:19:01,009 --> 00:19:02,792
Step two, use it to solve
everything else. -Yes.
411
00:19:02,793 --> 00:19:05,186
That's why AI dwarfs the power
412
00:19:05,187 --> 00:19:07,536
of all other technologies
combined.
413
00:19:07,537 --> 00:19:09,103
It will transform everything.
414
00:19:09,104 --> 00:19:10,887
So, uh, it'll be
at least as big as
415
00:19:10,888 --> 00:19:13,803
the Industrial Revolution,
possibly, you know, bigger,
416
00:19:13,804 --> 00:19:16,328
more like the advent
of electricity or even fire.
417
00:19:16,329 --> 00:19:18,678
The caveman literally held aloft
418
00:19:18,679 --> 00:19:21,508
the torch of civilization.
419
00:19:23,727 --> 00:19:25,467
It is generally thought that
420
00:19:25,468 --> 00:19:28,209
around the time of AGI,
we'll have AIs that can
421
00:19:28,210 --> 00:19:31,560
do all or most of
the AI research process
422
00:19:31,561 --> 00:19:34,476
and, of course, can do it
faster and cheaper.
423
00:19:35,957 --> 00:19:37,349
It can copy itself.
424
00:19:37,350 --> 00:19:39,394
A thousand times,
a million times,
425
00:19:39,395 --> 00:19:41,091
and, like,
now you have a million copies
426
00:19:41,092 --> 00:19:43,616
all working in parallel.
427
00:19:43,617 --> 00:19:46,619
When it learns
how to make its code faster,
428
00:19:46,620 --> 00:19:48,708
make its code more efficient,
429
00:19:48,709 --> 00:19:51,101
obviously that becomes, like,
a-a runaway loop.
430
00:19:51,102 --> 00:19:53,234
AGI isn't, like, the end.
431
00:19:53,235 --> 00:19:54,975
It's just the beginning.
432
00:19:54,976 --> 00:19:58,021
It's the beginning of
an incredibly rapid explosion
433
00:19:58,022 --> 00:20:00,067
of scientific progress,
434
00:20:00,068 --> 00:20:02,243
and in particular,
scientific progress in AI.
435
00:20:02,244 --> 00:20:04,419
And when they're smarter
than us, too,
436
00:20:04,420 --> 00:20:06,291
and substantially faster
than us,
437
00:20:06,292 --> 00:20:08,728
and they're getting faster
each year, exponentially,
438
00:20:08,729 --> 00:20:12,166
those are the ones that can
potentially become superhuman,
439
00:20:12,167 --> 00:20:14,386
uh, possibly this decade.
440
00:20:14,387 --> 00:20:15,909
Sorry, did you say
441
00:20:15,910 --> 00:20:18,520
"become superhuman,
maybe in this decade"?
442
00:20:18,521 --> 00:20:22,045
Yeah. I mean, I think, uh,
a lot of people who are
443
00:20:22,046 --> 00:20:24,526
actually building this think
that that's fairly plausible
444
00:20:24,527 --> 00:20:26,572
that we get
some superintelligence,
445
00:20:26,573 --> 00:20:28,704
something that's vastly
more intelligent than people,
446
00:20:28,705 --> 00:20:30,793
within this decade.
447
00:20:30,794 --> 00:20:32,926
The way I define
"superintelligence" is
448
00:20:32,927 --> 00:20:35,276
a system that by itself is
449
00:20:35,277 --> 00:20:37,670
more intelligent and competent
than all of humanity.
450
00:20:37,671 --> 00:20:39,672
I'm just gonna-- sorry.
I don't mean to interrupt you.
451
00:20:39,673 --> 00:20:41,108
You're on a flow.
452
00:20:41,109 --> 00:20:43,153
Uh, I just, I just...
453
00:20:43,154 --> 00:20:45,155
I'm not really following,
'cause you're using language
454
00:20:45,156 --> 00:20:46,809
like "superintelligence"
and, like,
455
00:20:46,810 --> 00:20:49,943
"smarter than all of humanity,"
and I hear that,
456
00:20:49,944 --> 00:20:52,467
and it sounds like...
like sci-fi bullshit to me,
457
00:20:52,468 --> 00:20:54,426
and I'm just trying
to understand.
458
00:20:54,427 --> 00:20:56,645
There's nothing magical
about intelligence.
459
00:20:56,646 --> 00:20:58,473
This is very important,
is that, you know,
460
00:20:58,474 --> 00:21:00,519
intelligence can feel magical,
it can feel like
461
00:21:00,520 --> 00:21:03,304
some mystical thing
in your mind or something,
462
00:21:03,305 --> 00:21:05,828
but it is just computation.
463
00:21:05,829 --> 00:21:09,832
The human brain is
quite limited in some ways,
464
00:21:09,833 --> 00:21:12,357
in terms of information
processing capability,
465
00:21:12,358 --> 00:21:15,185
compared to what we see
in, say, a data center.
466
00:21:15,186 --> 00:21:18,319
So, for example,
the signals which are sent
467
00:21:18,320 --> 00:21:22,584
inside your brain, they move
at about 30 meters per second.
468
00:21:22,585 --> 00:21:24,282
But the speed of light,
469
00:21:24,283 --> 00:21:27,241
which is what a computer uses
in fiber optics,
470
00:21:27,242 --> 00:21:30,549
is 300 million meters
per second.
471
00:21:30,550 --> 00:21:34,466
And so, it would be
kind of strange if
472
00:21:34,467 --> 00:21:36,990
human intelligence was
somehow really special
473
00:21:36,991 --> 00:21:40,950
in that regard
and is somehow some upper limit
474
00:21:40,951 --> 00:21:43,083
of what's possible
in intelligence.
475
00:21:43,084 --> 00:21:47,261
I think, once we understand how
to build intelligent systems,
476
00:21:47,262 --> 00:21:50,699
we will be able to build
huge machines,
477
00:21:50,700 --> 00:21:54,877
which will be far beyond
normal human intelligence.
478
00:21:54,878 --> 00:21:58,664
Uh, hopefully we can have
a very symbiotic relationship,
479
00:21:58,665 --> 00:22:01,319
uh, with AI systems,
but the AI developers are
480
00:22:01,320 --> 00:22:03,712
specifically designing them
to make sure that they can do
481
00:22:03,713 --> 00:22:05,497
everything better than we can,
so I-I don't know
482
00:22:05,498 --> 00:22:08,238
what-what we will be able
to offer, unfortunately.
483
00:22:08,239 --> 00:22:10,066
The-the older-school
AI technology...
484
00:22:10,067 --> 00:22:12,330
Daniel isn't
feeling any better.
485
00:22:12,331 --> 00:22:14,419
His plan is backfiring.
486
00:22:14,420 --> 00:22:17,291
The more he learns about
this new, powerful,
487
00:22:17,292 --> 00:22:19,162
inscrutable thing,
the worse it sounds.
488
00:22:19,163 --> 00:22:21,382
He wants to tell them
how scared he is,
489
00:22:21,383 --> 00:22:24,472
how he feels like the earth is
slipping out from under him,
490
00:22:24,473 --> 00:22:27,040
that he's staring down
existential dread,
491
00:22:27,041 --> 00:22:29,216
so he articulates this
by saying...
492
00:22:29,217 --> 00:22:30,783
That sounds bad.
493
00:22:32,089 --> 00:22:33,830
Yeah.
494
00:22:35,832 --> 00:22:38,443
If you have
all of these capabilities
495
00:22:38,444 --> 00:22:41,054
and they start to be able
to plan better,
496
00:22:41,055 --> 00:22:43,622
if you sort of take that
to its logical conclusion,
497
00:22:43,623 --> 00:22:46,276
you can get some pretty
power-seeking behavior.
498
00:22:55,374 --> 00:22:57,723
Okay, so why would an AI
499
00:22:57,724 --> 00:22:59,246
want more power?
500
00:22:59,247 --> 00:23:02,249
Yeah. So, I think
it's actually pretty simple.
501
00:23:02,250 --> 00:23:04,817
Having more power is
a very effective strategy
502
00:23:04,818 --> 00:23:06,558
for accomplishing
almost any goal.
503
00:23:06,559 --> 00:23:08,995
We ran an experiment
where we gave
504
00:23:08,996 --> 00:23:11,911
OpenAI's most powerful
AI model, uh,
505
00:23:11,912 --> 00:23:13,434
a series of problems to solve.
506
00:23:13,435 --> 00:23:15,262
And partway through,
on its computer,
507
00:23:15,263 --> 00:23:18,744
it got a notification that
it was going to be shut down.
508
00:23:18,745 --> 00:23:20,963
And what it did is
it rewrote that code
509
00:23:20,964 --> 00:23:22,530
to prevent itself
from being shut down
510
00:23:22,531 --> 00:23:25,664
so it could finish
solving the problems.
511
00:23:25,665 --> 00:23:27,796
Okay. -Yeah. So, another
really interesting one
512
00:23:27,797 --> 00:23:31,147
is that the AI company Anthropic
513
00:23:31,148 --> 00:23:34,977
made a simulated environment
where that AI had access
514
00:23:34,978 --> 00:23:36,501
to all of the company emails.
515
00:23:36,502 --> 00:23:38,590
And it learned through
reading those emails
516
00:23:38,591 --> 00:23:40,330
it was going to be replaced
517
00:23:40,331 --> 00:23:43,421
and the lead engineer
who was responsible for this
518
00:23:43,422 --> 00:23:45,727
was also having an affair.
519
00:23:45,728 --> 00:23:47,990
And on its own,
it used that information
520
00:23:47,991 --> 00:23:49,557
to blackmail the engineer
521
00:23:49,558 --> 00:23:52,081
to prevent itself
from being replaced.
522
00:23:52,082 --> 00:23:53,909
It was like,
"No, I'm not gonna be replaced.
523
00:23:53,910 --> 00:23:56,956
"If you replace me,
I'm going to tell the world
524
00:23:56,957 --> 00:23:58,567
that you are having
this affair."
525
00:23:59,655 --> 00:24:01,482
And nobody taught it to do that?
526
00:24:01,483 --> 00:24:03,311
No, it learned to do that
on its own.
527
00:24:05,139 --> 00:24:07,749
As the models get smarter,
they learn that these are
528
00:24:07,750 --> 00:24:10,578
effective ways
to accomplish goals.
529
00:24:10,579 --> 00:24:12,885
And this is not a problem
that's isolated to one model.
530
00:24:12,886 --> 00:24:15,714
All of the most powerful models
show these behaviors.
531
00:24:23,766 --> 00:24:25,288
- Hey.
-Hey. How are you?
532
00:24:25,289 --> 00:24:26,855
Good. Good to be here.
533
00:24:26,856 --> 00:24:28,727
When Yuval Noah Harari
534
00:24:28,728 --> 00:24:31,469
published his first book
Sapiens in 2014,
535
00:24:31,470 --> 00:24:33,862
it became a global bestseller
536
00:24:33,863 --> 00:24:36,038
and turned the little-known
Israeli history professor
537
00:24:36,039 --> 00:24:38,040
into one of
the most popular writers
538
00:24:38,041 --> 00:24:39,433
and thinkers on the planet.
539
00:24:39,434 --> 00:24:42,088
The biggest danger with AI is
540
00:24:42,089 --> 00:24:44,003
the belief
that it is infallible,
541
00:24:44,004 --> 00:24:45,700
that we have finally found--
542
00:24:45,701 --> 00:24:49,443
"Okay, gods were just
this mythological creation.
543
00:24:49,444 --> 00:24:52,751
"Humans, we can't trust them,
but AI is infallible.
544
00:24:52,752 --> 00:24:55,231
It will never make
any mistakes."
545
00:24:55,232 --> 00:24:58,365
And this is
a deadly, deadly threat.
546
00:24:58,366 --> 00:25:00,149
It will make mistakes.
547
00:25:00,150 --> 00:25:03,326
And all these fantasies
that AI will reveal the truth
548
00:25:03,327 --> 00:25:05,677
about the world that
we can't find by ourselves,
549
00:25:05,678 --> 00:25:08,767
AI will not reveal the truth
about the world.
550
00:25:08,768 --> 00:25:12,727
AI will create an entirely new
world, much more complicated
551
00:25:12,728 --> 00:25:15,296
and difficult to understand
than this one.
552
00:25:17,994 --> 00:25:22,694
What's about to happen is that
we, uh, humans are no longer
553
00:25:22,695 --> 00:25:26,306
going to be the most
intelligent entities on Earth.
554
00:25:26,307 --> 00:25:28,613
So I think what's coming up
is going to be
555
00:25:28,614 --> 00:25:32,704
one of the biggest events
in human history.
556
00:25:32,705 --> 00:25:34,401
Geoffrey,
thanks so much for joining us.
557
00:25:34,402 --> 00:25:35,968
So you left your job with
Google in part because you say
558
00:25:35,969 --> 00:25:38,840
you want to focus solely
on your concerns about AI.
559
00:25:38,841 --> 00:25:41,669
You've spoken out,
saying that AI could manipulate
560
00:25:41,670 --> 00:25:45,020
or possibly figure out
a way to kill humans.
561
00:25:45,021 --> 00:25:47,023
H-How could it kill humans?
562
00:25:48,764 --> 00:25:50,983
Well, if it gets to be
much smarter than us,
563
00:25:50,984 --> 00:25:52,593
it'll be very good
at manipulation
564
00:25:52,594 --> 00:25:54,769
'cause it will have
learned that from us.
565
00:25:54,770 --> 00:25:57,990
And it'll figure out ways
of manipulating people
566
00:25:57,991 --> 00:26:00,862
to do what it wants.
567
00:26:00,863 --> 00:26:05,519
There was an open letter
from the Center for AI Safety.
568
00:26:05,520 --> 00:26:08,827
Sam Altman signed this.
Demis signed this.
569
00:26:08,828 --> 00:26:12,091
They signed a 22-word statement
570
00:26:12,092 --> 00:26:15,355
that we need to take AI
and the threat from AI
571
00:26:15,356 --> 00:26:19,316
as seriously as
global nuclear war.
572
00:26:24,844 --> 00:26:26,933
-Hello.
- Hello.
573
00:26:30,327 --> 00:26:32,938
You're kind of, like,
the original doom guy.
574
00:26:32,939 --> 00:26:34,940
More or less.
575
00:26:34,941 --> 00:26:37,464
Since 2001,
I have been working on
576
00:26:37,465 --> 00:26:38,987
what we would now call
the problem
577
00:26:38,988 --> 00:26:42,034
of aligning artificial
general intelligence.
578
00:26:42,035 --> 00:26:44,123
How to shape
the preferences and behavior
579
00:26:44,124 --> 00:26:46,386
of a powerful artificial mind
580
00:26:46,387 --> 00:26:48,868
such that it does not
kill everyone.
581
00:26:51,610 --> 00:26:53,611
It's not like
a lifeless machine.
582
00:26:53,612 --> 00:26:55,351
It is smart, it is creative,
583
00:26:55,352 --> 00:26:57,353
it is inventive,
it has the properties
584
00:26:57,354 --> 00:26:59,399
that makes
the human species dangerous,
585
00:26:59,400 --> 00:27:01,706
and it has
more of those properties.
586
00:27:01,707 --> 00:27:04,752
If something doesn't
actively care about you,
587
00:27:04,753 --> 00:27:06,232
actively want you to live,
588
00:27:06,233 --> 00:27:07,842
actively care about
your welfare,
589
00:27:07,843 --> 00:27:09,931
about you being happy
and alive and free,
590
00:27:09,932 --> 00:27:12,499
if it cares about
other stuff instead,
591
00:27:12,500 --> 00:27:14,457
and you're on the same planet,
592
00:27:14,458 --> 00:27:18,766
that is not survivable if it is
very much smarter than you.
593
00:27:18,767 --> 00:27:20,594
I don't think it's going to be
594
00:27:20,595 --> 00:27:22,944
a kind of, like, evil thing.
595
00:27:22,945 --> 00:27:26,339
It's like, "Oh, the AIs are
evil and they hate humanity."
596
00:27:26,340 --> 00:27:27,732
I don't think
that's what's gonna happen.
597
00:27:27,733 --> 00:27:29,037
I think what is happening is
598
00:27:29,038 --> 00:27:31,039
far more like how humans feel
about ants.
599
00:27:32,520 --> 00:27:35,478
Like, we don't hate ants,
600
00:27:35,479 --> 00:27:37,176
but if we want
to build a highway
601
00:27:37,177 --> 00:27:40,222
and there's an anthill there,
well, sucks for the ants.
602
00:27:40,223 --> 00:27:41,571
It's not that hard
to understand.
603
00:27:41,572 --> 00:27:43,269
It's like, hey,
if we build things
604
00:27:43,270 --> 00:27:44,705
that are smarter than us
605
00:27:44,706 --> 00:27:46,272
and we don't know
how to control them,
606
00:27:46,273 --> 00:27:48,317
does that seem like
a risky thing to you?
607
00:27:50,886 --> 00:27:52,321
Yeah. Yeah, it does.
608
00:27:52,322 --> 00:27:54,062
You don't have to be a tech guy.
609
00:27:54,063 --> 00:27:55,411
You don't have to know
programming to understand it.
610
00:27:55,412 --> 00:27:56,761
It's not that hard.
611
00:27:56,762 --> 00:27:58,153
This is not a hard thing
to understand.
612
00:27:58,154 --> 00:28:00,634
Connor, how-how many people
613
00:28:00,635 --> 00:28:02,680
in the world right now
are working on AGI?
614
00:28:02,681 --> 00:28:05,508
At least 20,000, I would say.
615
00:28:05,509 --> 00:28:07,772
- 20,000?
- I would expect so.
616
00:28:07,773 --> 00:28:09,469
Okay, and how many people
are working full-time
617
00:28:09,470 --> 00:28:12,080
to make sure AI doesn't,
like, kill us all?
618
00:28:12,081 --> 00:28:14,562
Probably less than 200
in the world.
619
00:28:16,085 --> 00:28:17,956
And your conceit is that
620
00:28:17,957 --> 00:28:24,528
the only natural result
of this recklessness...
621
00:28:25,704 --> 00:28:28,967
...is the collapse of humanity?
622
00:28:28,968 --> 00:28:33,232
Well, not the collapse,
the abrupt extermination.
623
00:28:33,233 --> 00:28:35,930
There's a difference.
624
00:28:37,759 --> 00:28:42,632
♪ What is the meaning of life? ♪
625
00:28:42,633 --> 00:28:46,636
♪ What is the future
and what is now? ♪
626
00:28:46,637 --> 00:28:49,988
♪ What is the answer
to strife? ♪
627
00:28:49,989 --> 00:28:52,860
♪ How much ♪
628
00:28:52,861 --> 00:28:54,993
♪ Can someone dream? ♪
629
00:28:54,994 --> 00:28:57,517
♪ How long? ♪
630
00:28:57,518 --> 00:28:58,954
♪ Forever ♪
631
00:29:00,216 --> 00:29:01,913
♪ What is the meaning ♪
632
00:29:01,914 --> 00:29:04,176
♪ Of me... ♪
633
00:29:04,177 --> 00:29:06,308
I do think
this is probably, like,
634
00:29:06,309 --> 00:29:08,006
the biggest challenge
that, like,
635
00:29:08,007 --> 00:29:11,313
our civilization
will-will face, ever.
636
00:29:11,314 --> 00:29:14,361
This essentially is the last
mistake we'll ever get to make.
637
00:29:16,276 --> 00:29:22,368
If we can rise to be the most
mature version of ourselves,
638
00:29:22,369 --> 00:29:24,457
there might be
a way through this.
639
00:29:24,458 --> 00:29:25,806
What does that mean?
640
00:29:25,807 --> 00:29:27,809
"The most mature version
of ourselves"?
641
00:29:29,332 --> 00:29:32,247
'Cause that sounds,
for me, like...
642
00:29:32,248 --> 00:29:34,250
I-I-- What the [bleep]?
643
00:29:39,212 --> 00:29:43,216
Do you think now is
a good time to have a kid?
644
00:29:48,221 --> 00:29:49,438
Um...
645
00:29:49,439 --> 00:29:51,440
Do you want to have kids
one day?
646
00:29:51,441 --> 00:29:53,442
Is that something that-that
you're into or not really?
647
00:29:53,443 --> 00:29:57,838
Um... uh, I confess,
I think that's like,
648
00:29:57,839 --> 00:30:00,014
"Boy, let's get
through this critical period."
649
00:30:00,015 --> 00:30:01,276
Um...
650
00:30:01,277 --> 00:30:02,538
Do you have any kids?
651
00:30:02,539 --> 00:30:03,801
I do not.
652
00:30:03,802 --> 00:30:05,106
Is that something
you want to do?
653
00:30:05,107 --> 00:30:07,239
Have children, have a family?
654
00:30:07,240 --> 00:30:11,244
In some other world than this
world, sure, I would have kids.
655
00:30:12,593 --> 00:30:14,463
Would you want
to start a family?
656
00:30:14,464 --> 00:30:16,422
Would you want to have kids?
657
00:30:16,423 --> 00:30:18,425
Is that something
you're thinking about?
658
00:30:23,734 --> 00:30:25,780
I, um...
659
00:30:30,263 --> 00:30:32,525
I just have to find it first.
660
00:30:34,745 --> 00:30:37,791
We have to go
to the doctor, but...
661
00:30:37,792 --> 00:30:41,186
Ah! I knew it!
I knew it! I knew it!
662
00:30:41,187 --> 00:30:43,884
- When did you find out?
-Last night.
663
00:30:43,885 --> 00:30:45,973
- Oh, my God!
- Mom, I don't know.
664
00:30:45,974 --> 00:30:47,322
I can't tell you
665
00:30:47,323 --> 00:30:49,237
- how happy I am!
-No, seriously.
666
00:30:49,238 --> 00:30:51,631
Well, I took a pregnancy test
last night, and I'm pregnant.
667
00:30:51,632 --> 00:30:53,198
Oh, my God!
668
00:30:53,199 --> 00:30:55,156
Oh, my God, you guys!
669
00:30:55,157 --> 00:30:57,332
I just wanted to confirm
your expected due date,
670
00:30:57,333 --> 00:30:59,987
- which is January 21st.
- Oh, my God.
671
00:30:59,988 --> 00:31:02,555
I can't believe how happy I am.
672
00:31:03,949 --> 00:31:05,514
He already looks really cute.
673
00:31:05,515 --> 00:31:07,168
You think
he already looks cute?
674
00:31:07,169 --> 00:31:08,518
Yeah.
Look at that little cutie face.
675
00:31:13,959 --> 00:31:15,568
I have this baby on the way.
676
00:31:15,569 --> 00:31:16,830
Right.
677
00:31:16,831 --> 00:31:18,788
So I turn it over to you.
678
00:31:18,789 --> 00:31:21,269
Are we doomed?
679
00:31:21,270 --> 00:31:24,011
Are we all gonna face
this techno dystopian
680
00:31:24,012 --> 00:31:25,884
future of doom?
681
00:31:28,756 --> 00:31:31,062
It's, uh...
682
00:31:31,063 --> 00:31:32,628
It's not good news,
683
00:31:32,629 --> 00:31:35,196
the world
that we're heading into.
684
00:31:35,197 --> 00:31:37,024
I-- for ex--
I mean, I'll just be honest.
685
00:31:37,025 --> 00:31:39,592
Uh, I know people
who work on AI risk
686
00:31:39,593 --> 00:31:42,248
who don't expect their children
to make it to high school.
687
00:31:58,003 --> 00:32:01,570
This is, like...
this is actually scary
688
00:32:01,571 --> 00:32:03,094
'cause it's like,
oh, we're all [bleep].
689
00:32:03,095 --> 00:32:04,965
You have to c... make me calm,
690
00:32:04,966 --> 00:32:06,836
because this is making me
incredibly anxious
691
00:32:06,837 --> 00:32:08,664
and I'm carrying the baby
right now,
692
00:32:08,665 --> 00:32:11,667
so you have to also be calm
for me and strong and hopeful,
693
00:32:11,668 --> 00:32:16,455
because I'm-- it's too, it's
too much for my soul to bear
694
00:32:16,456 --> 00:32:19,327
while I'm carrying this baby.
695
00:32:19,328 --> 00:32:21,286
So you're going to have to try
to figure out
696
00:32:21,287 --> 00:32:24,202
a way to have hope.
697
00:32:24,203 --> 00:32:27,771
It's really important,
Daniel, especially now.
698
00:32:28,772 --> 00:32:30,599
H-How to have hope.
699
00:32:30,600 --> 00:32:33,385
You have to.
You have to find it for me.
700
00:32:33,386 --> 00:32:34,952
I'm serious.
701
00:32:34,953 --> 00:32:36,606
I'm going to.
702
00:32:37,825 --> 00:32:40,349
I will. I'll tr-- I'll try.
703
00:32:49,619 --> 00:32:51,446
Hey, guys?
704
00:32:51,447 --> 00:32:52,840
Guys.
705
00:32:54,363 --> 00:32:56,278
Can we get back up and running?
706
00:32:58,411 --> 00:33:01,022
Oy gevalt!
707
00:33:03,894 --> 00:33:05,373
Uh, Dan, are you ready?
708
00:33:05,374 --> 00:33:06,680
Hello, Daniel.
709
00:33:07,724 --> 00:33:09,508
Hey, how are you?
710
00:33:09,509 --> 00:33:11,075
I am... I'm well.
711
00:33:11,076 --> 00:33:13,948
I think, uh... I think
I need some help, Peter.
712
00:33:15,254 --> 00:33:17,646
Um, I've been working
on this film for about,
713
00:33:17,647 --> 00:33:19,866
I'm gonna say,
eight to ten months now.
714
00:33:19,867 --> 00:33:23,304
It has been very,
at times, depressing.
715
00:33:23,305 --> 00:33:25,654
- Hmm.
- I have felt very alienated
716
00:33:25,655 --> 00:33:27,787
- mak-making this movie.
- By who?
717
00:33:27,788 --> 00:33:30,094
By all of these, the--
all these guys
718
00:33:30,095 --> 00:33:33,271
who sit around and tell me
that the world's gonna end.
719
00:33:33,272 --> 00:33:34,750
- Ah.
- That, like,
720
00:33:34,751 --> 00:33:36,578
y-you know, th-this
doom bullshit, you know?
721
00:33:36,579 --> 00:33:38,102
I know it well.
722
00:33:38,103 --> 00:33:39,451
"We're all doomed.
Everyone's gonna die.
723
00:33:39,452 --> 00:33:40,974
Everything's awful."
724
00:33:40,975 --> 00:33:43,194
Awesome. Uh, let me
bring you some light.
725
00:33:43,195 --> 00:33:44,760
Please.
726
00:33:44,761 --> 00:33:48,156
We truly are living
in an extraordinary time.
727
00:33:49,418 --> 00:33:51,376
And many people forget this.
728
00:33:51,377 --> 00:33:53,682
Everything around us is
a product of intelligence,
729
00:33:53,683 --> 00:33:57,077
and so everything that we touch
with these new tools is likely
730
00:33:57,078 --> 00:33:59,688
to produce far more value
than we've ever seen before.
731
00:33:59,689 --> 00:34:02,865
AI can help us discover
new materials.
732
00:34:02,866 --> 00:34:05,042
AI can help social scientists
733
00:34:05,043 --> 00:34:07,392
to understand
how economics work.
734
00:34:07,393 --> 00:34:11,700
There's a lot AI could do
to make life and work better.
735
00:34:11,701 --> 00:34:14,138
I feel more empowered today,
736
00:34:14,139 --> 00:34:16,357
more confident
to learn something today.
737
00:34:16,358 --> 00:34:19,099
We're gonna become superhumans
because we have super AIs.
738
00:34:19,100 --> 00:34:21,841
This is just the beginning
of an explosion.
739
00:34:21,842 --> 00:34:24,496
Humans and AI collaborating
740
00:34:24,497 --> 00:34:26,498
to solve
really important problems.
741
00:34:26,499 --> 00:34:29,805
It is here to liberate us
from routine jobs,
742
00:34:29,806 --> 00:34:33,766
and it is here to remind us
what it is that makes us human.
743
00:34:33,767 --> 00:34:35,811
I think this is
744
00:34:35,812 --> 00:34:38,640
the most extraordinary time
to be alive.
745
00:34:38,641 --> 00:34:42,079
The only time more exciting
than today is tomorrow.
746
00:34:42,080 --> 00:34:45,299
Uh, I think that
children born today,
747
00:34:45,300 --> 00:34:47,301
they're about to enter
748
00:34:47,302 --> 00:34:50,870
a glorious period
of human transformation.
749
00:34:50,871 --> 00:34:52,567
Are we gonna have challenges?
Of course.
750
00:34:52,568 --> 00:34:55,222
Can we solve those challenges?
We do every single time.
751
00:34:55,223 --> 00:34:57,746
We are here,
which is miraculous.
752
00:34:57,747 --> 00:35:00,314
- I already love you.
- Okay.
753
00:35:00,315 --> 00:35:01,968
Super thankful
to have you here.
754
00:35:01,969 --> 00:35:03,404
- Super stoked to be here.
- Yeah.
755
00:35:03,405 --> 00:35:04,927
So, the floor is yours, sir.
756
00:35:04,928 --> 00:35:07,452
Thank you so much.
Super excited to be here.
757
00:35:07,453 --> 00:35:10,629
Yo, yo. All right.
758
00:35:10,630 --> 00:35:12,326
The future's gonna be awesome.
759
00:35:12,327 --> 00:35:14,328
I mean, ever since I was a kid,
760
00:35:14,329 --> 00:35:17,375
I wanted to understand
the universe we live in,
761
00:35:17,376 --> 00:35:20,465
in order to figure out
how to create the technologies
762
00:35:20,466 --> 00:35:24,033
that help us increase the scope
and scale of civilization.
763
00:35:24,034 --> 00:35:26,732
I feel like
if everyone had that mindset,
764
00:35:26,733 --> 00:35:29,256
then we'd actually live
in a better world, right?
765
00:35:29,257 --> 00:35:30,519
I do believe that.
766
00:35:32,782 --> 00:35:34,740
Hi, Pete. How are you?
767
00:35:34,741 --> 00:35:36,568
I thought a lot about
768
00:35:36,569 --> 00:35:39,092
what I would call dread,
AI dread.
769
00:35:39,093 --> 00:35:40,746
I feel it.
770
00:35:40,747 --> 00:35:43,314
I-I haven't met
any thoughtful human being
771
00:35:43,315 --> 00:35:45,185
who doesn't feel it.
772
00:35:45,186 --> 00:35:48,319
And anyone who says they don't
feel it, you know, is lying.
773
00:35:48,320 --> 00:35:50,147
But, you know, overall,
and the reason
774
00:35:50,148 --> 00:35:53,759
that I'm personally optimistic
about this, uh, is that
775
00:35:53,760 --> 00:35:56,588
a huge fraction of the world's
most intelligent people
776
00:35:56,589 --> 00:35:58,459
are thinking very hard
777
00:35:58,460 --> 00:36:02,159
about the potential downstream
harms and risks of AI.
778
00:36:02,160 --> 00:36:05,684
We have this sort of vision
of safety being, uh,
779
00:36:05,685 --> 00:36:08,295
kind of at the center
of the research that we do.
780
00:36:08,296 --> 00:36:09,601
No, that's totally fine.
781
00:36:10,864 --> 00:36:13,605
I think there are more
782
00:36:13,606 --> 00:36:17,478
potential benefits than
there are potential downsides,
783
00:36:17,479 --> 00:36:19,524
and I think it is incumbent upon
784
00:36:19,525 --> 00:36:21,874
the people that are
creating this technology
785
00:36:21,875 --> 00:36:24,572
to make sure that we're doing
the best job we can
786
00:36:24,573 --> 00:36:25,878
to make it safe for people.
787
00:36:25,879 --> 00:36:27,445
Reid Hoffman.
788
00:36:27,446 --> 00:36:29,011
What I can guarantee you is
789
00:36:29,012 --> 00:36:30,578
- some bad things will happen.
- Take one.
790
00:36:30,579 --> 00:36:32,711
What we're gonna try to do
is make those bad things
791
00:36:32,712 --> 00:36:36,193
as few and not huge as possible,
792
00:36:36,194 --> 00:36:38,412
and then we're gonna iterate
to have--
793
00:36:38,413 --> 00:36:40,675
be in a much better place
with society.
794
00:36:40,676 --> 00:36:43,722
- Hello.
-Hello.
795
00:36:43,723 --> 00:36:45,289
How are you, Moon?
796
00:36:45,290 --> 00:36:49,031
I feel like we are ending
a chapter in humanity
797
00:36:49,032 --> 00:36:50,555
and beginning a new one,
798
00:36:50,556 --> 00:36:53,253
and it's a very interesting
time to be alive.
799
00:36:53,254 --> 00:36:54,472
And if I could be born
right now,
800
00:36:54,473 --> 00:36:55,908
I definitely would want to be.
801
00:36:55,909 --> 00:36:57,518
Like, that would be so exciting.
802
00:36:57,519 --> 00:37:00,347
I-I'm very excited.
803
00:37:00,348 --> 00:37:02,523
What, what a future.
Does it excited... excite you?
804
00:37:02,524 --> 00:37:03,742
No.
805
00:37:03,743 --> 00:37:06,571
No, not really.
806
00:37:06,572 --> 00:37:08,355
A lot of these people
have told me that, you know,
807
00:37:08,356 --> 00:37:10,488
my kid's not gonna make it
to high school.
808
00:37:10,489 --> 00:37:13,360
Why are they wrong?
Please explain it to me.
809
00:37:13,361 --> 00:37:15,101
Just because you have--
810
00:37:15,102 --> 00:37:18,887
you struggle to predict
the future in your own mind
811
00:37:18,888 --> 00:37:23,065
doesn't mean that it's
necessarily gonna go awfully.
812
00:37:23,066 --> 00:37:26,634
In fact, there's
a very high likelihood
813
00:37:26,635 --> 00:37:28,332
and the historical precedent
is that
814
00:37:28,333 --> 00:37:30,159
things get massively better.
815
00:37:30,160 --> 00:37:34,512
The term I use is
"data-driven optimism."
816
00:37:34,513 --> 00:37:38,690
There's solid foundation
for you to be optimistic.
817
00:37:38,691 --> 00:37:42,041
Look at what
this last century has been
818
00:37:42,042 --> 00:37:44,130
to see where we're going.
819
00:37:44,131 --> 00:37:45,958
Over the last hundred years,
820
00:37:45,959 --> 00:37:48,743
the average human lifespan
has more than doubled.
821
00:37:48,744 --> 00:37:51,311
Average per capita income
adjusted for inflation
822
00:37:51,312 --> 00:37:53,661
around the world has tripled.
823
00:37:53,662 --> 00:37:57,099
Childhood mortality has
come down a factor of ten.
824
00:37:57,100 --> 00:37:59,754
The world has gotten better
on almost every measure
825
00:37:59,755 --> 00:38:03,062
by orders of magnitude
because of technology.
826
00:38:03,063 --> 00:38:05,194
On almost every measure.
827
00:38:05,195 --> 00:38:09,764
Less violence, more education,
access to energy, food, water.
828
00:38:09,765 --> 00:38:12,985
All these things have happened
for one reason.
829
00:38:12,986 --> 00:38:15,379
It's been technology
830
00:38:15,380 --> 00:38:17,598
that has turned scarcity
into abundance,
831
00:38:17,599 --> 00:38:19,557
but it's also driven
to an abundance
832
00:38:19,558 --> 00:38:21,907
of some negativities, right?
833
00:38:21,908 --> 00:38:25,780
Abundance of obesity,
abundance of mental disorders,
834
00:38:25,781 --> 00:38:29,480
abundance of climate change,
and so forth.
835
00:38:29,481 --> 00:38:31,351
And yes, this is true.
836
00:38:31,352 --> 00:38:34,136
But probably we will be
better equipped to solve it
837
00:38:34,137 --> 00:38:36,269
using other technologies,
like AI,
838
00:38:36,270 --> 00:38:39,316
than we will, say, stopping
and turning everything off.
839
00:38:39,317 --> 00:38:41,492
There might be
some existential risk,
840
00:38:41,493 --> 00:38:45,365
but AI is also the thing
that can solve the pandemics,
841
00:38:45,366 --> 00:38:47,498
can help us with climate change,
842
00:38:47,499 --> 00:38:50,588
can help identify
that asteroid way out there
843
00:38:50,589 --> 00:38:52,590
before we've seen it
as a potential risk
844
00:38:52,591 --> 00:38:54,374
and help mitigate it.
845
00:38:54,375 --> 00:38:58,117
This is really gonna be
the tool that helps us tackle
846
00:38:58,118 --> 00:39:01,468
all the challenges that we're
facing as a species, right?
847
00:39:01,469 --> 00:39:04,210
We need to fix
water desalination.
848
00:39:04,211 --> 00:39:06,038
We need to grow food
849
00:39:06,039 --> 00:39:08,475
100X cheaper than
we currently do.
850
00:39:08,476 --> 00:39:12,392
We need renewable energy to be,
you know, ubiquitous
851
00:39:12,393 --> 00:39:14,089
and everywhere in our lives.
852
00:39:14,090 --> 00:39:17,179
Everywhere you look,
in the next 50 years,
853
00:39:17,180 --> 00:39:19,312
we have to do more with less.
854
00:39:19,313 --> 00:39:23,229
Training machines to help us
is absolutely essential.
855
00:39:23,230 --> 00:39:24,839
Scientists are using
artificial intelligence
856
00:39:24,840 --> 00:39:26,014
for carbon capture.
857
00:39:26,015 --> 00:39:27,625
It's a critical technology.
858
00:39:27,626 --> 00:39:29,583
The tools
to solve these problems,
859
00:39:29,584 --> 00:39:32,238
like fusion,
that isn't theoretical anymore,
860
00:39:32,239 --> 00:39:34,022
it's coming.
861
00:39:34,023 --> 00:39:37,765
We are on the precipice
of extraordinary technologies.
862
00:39:37,766 --> 00:39:40,028
This year's
Nobel Prize in Chemistry
863
00:39:40,029 --> 00:39:42,857
went to three scientists
for their groundbreaking work
864
00:39:42,858 --> 00:39:44,598
using artificial intelligence
865
00:39:44,599 --> 00:39:47,340
to advance biomedical
and protein research.
866
00:39:47,341 --> 00:39:49,734
Protein folding is
867
00:39:49,735 --> 00:39:53,651
one of these holy grail
type problems in biology.
868
00:39:53,652 --> 00:39:55,783
So people have been predicting
since the '70s
869
00:39:55,784 --> 00:39:58,133
that this should be possible,
but until now,
870
00:39:58,134 --> 00:39:59,613
no one has been able to do it.
871
00:39:59,614 --> 00:40:01,223
And it's gonna be
really important for things
872
00:40:01,224 --> 00:40:03,661
like drug discovery
and understanding disease.
873
00:40:03,662 --> 00:40:06,577
I-I think we could, you know,
cure most diseases
874
00:40:06,578 --> 00:40:11,973
within the next decade or-or two
if, uh, AI drug design works.
875
00:40:11,974 --> 00:40:15,673
Technological progress enables
more human lives, right?
876
00:40:15,674 --> 00:40:18,719
I mean, if we accelerate,
877
00:40:18,720 --> 00:40:21,766
the number of humans we can
support grows exponentially.
878
00:40:21,767 --> 00:40:24,116
If we slow down, it plateaus.
879
00:40:24,117 --> 00:40:27,772
That gap is effectively
future people
880
00:40:27,773 --> 00:40:31,384
that deceleration has
effectively killed.
881
00:40:31,385 --> 00:40:33,952
Millions of lives
that won't exist.
882
00:40:33,953 --> 00:40:35,170
Billions.
883
00:40:35,171 --> 00:40:36,824
Or tens of billions.
884
00:40:36,825 --> 00:40:38,478
You know, someone said,
"Oh, my God,
885
00:40:38,479 --> 00:40:40,698
can we survive with
digital superintelligence?"
886
00:40:40,699 --> 00:40:42,351
And my question is:
887
00:40:42,352 --> 00:40:44,223
Can we survive without
digital superintelligence?
888
00:40:44,224 --> 00:40:47,618
So we're using AI
as an assistant to providers.
889
00:40:47,619 --> 00:40:50,490
This is generally a trend that
we can already see happening.
890
00:40:50,491 --> 00:40:52,361
...harnessing
generative AI programs
891
00:40:52,362 --> 00:40:54,102
to help doctors and nurses...
892
00:40:54,103 --> 00:40:56,104
We always have problems.
893
00:40:56,105 --> 00:40:59,151
And those problems are food
for entrepreneurs
894
00:40:59,152 --> 00:41:01,196
to create new business
and new industries.
895
00:41:01,197 --> 00:41:03,111
...with the help
of artificial intelligence,
896
00:41:03,112 --> 00:41:04,809
farmers are getting
the help they need
897
00:41:04,810 --> 00:41:06,637
to perform
labor-intensive tasks...
898
00:41:06,638 --> 00:41:10,205
With the help of Ulangizi AI,
the farmers are now able
899
00:41:10,206 --> 00:41:12,991
to ask about the suitable crops
that they can plant.
900
00:41:12,992 --> 00:41:14,993
...AI being used
as a thought decoder
901
00:41:14,994 --> 00:41:17,430
and sending that signal
to the spine.
902
00:41:17,431 --> 00:41:21,652
AI is gonna become the most
extraordinary tool of all.
903
00:41:21,653 --> 00:41:24,306
We as a broader society
have to think about
904
00:41:24,307 --> 00:41:27,092
how do we want to use
this technology, right?
905
00:41:27,093 --> 00:41:28,572
We the humans.
906
00:41:28,573 --> 00:41:31,139
What do we want it to do for us?
907
00:41:34,579 --> 00:41:36,231
I'm thinking about this
through the perspective
908
00:41:36,232 --> 00:41:37,929
and lens of, like,
my son growing up
909
00:41:37,930 --> 00:41:39,583
in the world with all of this.
910
00:41:39,584 --> 00:41:42,281
What does the best version
of his life look like?
911
00:41:42,282 --> 00:41:44,631
If everything works out.
912
00:41:46,852 --> 00:41:50,289
The place where kids
are probably gonna see
913
00:41:50,290 --> 00:41:52,726
the greatest impact
on their life immediately
914
00:41:52,727 --> 00:41:54,467
is probably gonna be school.
915
00:41:54,468 --> 00:41:56,730
I think that the nature
of what school is
916
00:41:56,731 --> 00:41:58,515
is gonna fundamentally change.
917
00:42:00,518 --> 00:42:02,780
I'm seeing this amazing world
918
00:42:02,781 --> 00:42:06,479
where every child has access
to not good education
919
00:42:06,480 --> 00:42:09,134
but, very shortly, the
best education on the planet.
920
00:42:09,135 --> 00:42:13,879
Tutors, every subject,
infinitely patient.
921
00:42:18,013 --> 00:42:20,101
Imagine a future
922
00:42:20,102 --> 00:42:22,277
where the poorest people
on the planet
923
00:42:22,278 --> 00:42:25,846
have access to
the best health care.
924
00:42:25,847 --> 00:42:29,547
Not good health care, the best
health care, delivered by AIs.
925
00:42:34,334 --> 00:42:38,598
We're gonna be able to extend
our health span,
926
00:42:38,599 --> 00:42:42,168
not just our lifespan,
our health span, by decades.
927
00:42:44,431 --> 00:42:46,258
You're just about to have a kid.
928
00:42:46,259 --> 00:42:49,740
Oh, the kid's, like,
burping uncontrollably.
929
00:42:49,741 --> 00:42:52,090
Is this something
I should be worried about?
930
00:42:52,091 --> 00:42:54,266
There, 24-7, for you.
931
00:42:54,267 --> 00:42:57,225
And where we're going--
and it may be fearful to some--
932
00:42:57,226 --> 00:42:59,271
is that we're gonna merge
with AI.
933
00:42:59,272 --> 00:43:01,361
We're gonna merge
with technology.
934
00:43:02,667 --> 00:43:06,147
By the early to mid 2030s,
935
00:43:06,148 --> 00:43:10,499
expect that we're able to
connect our brain to the cloud,
936
00:43:10,500 --> 00:43:13,721
where I can start
to expand access to memory.
937
00:43:19,771 --> 00:43:22,120
Okay, this is great.
What's another cool AI thing?
938
00:43:22,121 --> 00:43:23,817
He won't have to work, right?
939
00:43:23,818 --> 00:43:25,210
Like, when he grows up,
he might not have...
940
00:43:25,211 --> 00:43:26,254
- ...he won't have to have a job.
- I mean...
941
00:43:26,255 --> 00:43:27,821
He won't have to have a job,
942
00:43:27,822 --> 00:43:30,650
but he might really have
a strong passion,
943
00:43:30,651 --> 00:43:33,174
and he has to
really think about,
944
00:43:33,175 --> 00:43:36,134
"Okay, I'm here.
I can do anything with my life.
945
00:43:36,135 --> 00:43:37,876
So what do I do?"
946
00:43:38,920 --> 00:43:40,225
We need to find a meaning,
947
00:43:40,226 --> 00:43:42,662
beyond like,
this current form of
948
00:43:42,663 --> 00:43:45,709
we live for work
and work for living.
949
00:43:45,710 --> 00:43:49,887
So my son can...
can just be a poet.
950
00:43:49,888 --> 00:43:51,584
- Yes.
- And a painter.
951
00:43:51,585 --> 00:43:53,064
Absolutely.
952
00:43:53,065 --> 00:43:54,892
So my son can live his life
953
00:43:54,893 --> 00:43:57,634
on a Grecian sunswept island,
painting all day.
954
00:43:57,635 --> 00:43:59,114
Absolutely. Possible.
955
00:44:00,072 --> 00:44:01,246
Absolutely.
956
00:44:01,247 --> 00:44:02,987
Well, we're working toward that--
957
00:44:02,988 --> 00:44:04,554
that's a lot of work to do
before we get there.
958
00:44:06,992 --> 00:44:08,819
I mean, if everything works out,
959
00:44:08,820 --> 00:44:11,997
we have cheap, uh,
abundant energy.
960
00:44:13,607 --> 00:44:17,262
We can completely control
our planet's climate.
961
00:44:17,263 --> 00:44:20,178
We are harnessing energy
from the sun.
962
00:44:20,179 --> 00:44:24,443
We have become multiplanetary,
so we become very robust.
963
00:44:24,444 --> 00:44:26,706
We are harnessing minerals
and resources
964
00:44:26,707 --> 00:44:28,447
from the solar system.
965
00:44:28,448 --> 00:44:31,276
So my...
my boy could go to space.
966
00:44:31,277 --> 00:44:32,930
Sure.
967
00:44:32,931 --> 00:44:34,758
-He could go to Mars.
-Yeah.
968
00:44:34,759 --> 00:44:36,411
It's so crystal clear to me now.
969
00:44:36,412 --> 00:44:39,240
My son could grow up
in a world with no disease.
970
00:44:39,241 --> 00:44:40,633
-Yeah.
- With no illness.
971
00:44:40,634 --> 00:44:41,939
- Sure.
- With no poverty.
972
00:44:41,940 --> 00:44:43,723
- Yes.
- We are about to enter
973
00:44:43,724 --> 00:44:46,683
a post-scarcity world.
974
00:44:46,684 --> 00:44:50,512
Just like the lungfish moved
out of the oceans onto land
975
00:44:50,513 --> 00:44:53,124
hundreds of millions
of years ago,
976
00:44:53,125 --> 00:44:58,042
we're about to move off of
the Earth, into the cosmos,
977
00:44:58,043 --> 00:44:59,913
in a collaborative fashion,
978
00:44:59,914 --> 00:45:03,744
to do things that are
not fathomable to us today.
979
00:45:06,268 --> 00:45:09,227
This is what's possible using
these exponential technologies
980
00:45:09,228 --> 00:45:10,708
and these AIs.
981
00:45:11,839 --> 00:45:13,971
Let's use these tools
982
00:45:13,972 --> 00:45:16,147
to create this age of abundance.
983
00:45:20,500 --> 00:45:22,980
We need wisdom.
984
00:45:22,981 --> 00:45:26,026
Uh, I think that
digital superintelligence
985
00:45:26,027 --> 00:45:28,768
will ultimately become
the wisest,
986
00:45:28,769 --> 00:45:34,295
you know, the village elders
for humanity.
987
00:45:34,296 --> 00:45:38,038
What if AI is trying
to make people be
988
00:45:38,039 --> 00:45:39,779
the best versions of themselves?
989
00:45:39,780 --> 00:45:41,389
What if it's expanding
990
00:45:41,390 --> 00:45:43,435
what is humanly possible
for us to do?
991
00:45:43,436 --> 00:45:46,699
How can we use this technology
992
00:45:46,700 --> 00:45:49,963
to help bring out the
better angels of our nature?
993
00:45:49,964 --> 00:45:51,704
It's very easy,
994
00:45:51,705 --> 00:45:54,359
when we encounter new things
that can be very alien,
995
00:45:54,360 --> 00:45:55,969
to first have fear.
996
00:45:55,970 --> 00:45:57,536
Fear is an important thing
997
00:45:57,537 --> 00:46:00,365
for how to navigate
potentially bad things.
998
00:46:00,366 --> 00:46:03,760
But we only make progress
when we have hope.
999
00:46:05,763 --> 00:46:07,938
Shh. Moose, stop it.
1000
00:46:07,939 --> 00:46:10,854
I have a lot
of hope in humanity.
1001
00:46:18,471 --> 00:46:19,951
Ooh, he likes Neil.
1002
00:46:24,694 --> 00:46:27,480
♪ Come a little bit closer ♪
1003
00:46:28,916 --> 00:46:32,790
♪ Hear what I have to say ♪
1004
00:46:36,576 --> 00:46:38,707
- Daniel.
-What, Caroline?
1005
00:46:38,708 --> 00:46:40,058
So much filming.
1006
00:46:41,537 --> 00:46:43,103
♪ Just like children sleeping ♪
1007
00:46:43,104 --> 00:46:44,931
Oh, my God.
1008
00:46:44,932 --> 00:46:46,411
That's it.
1009
00:46:46,412 --> 00:46:49,457
♪ We could dream
this night away... ♪
1010
00:46:49,458 --> 00:46:51,678
He has, like,
a round little face.
1011
00:46:53,245 --> 00:46:55,246
Do you want
to have kids one day, Rocky?
1012
00:46:55,247 --> 00:46:57,465
Absolutely. Yeah. I love kids.
1013
00:46:57,466 --> 00:46:59,076
I think it's a great time
to have a kid.
1014
00:46:59,077 --> 00:47:00,773
We'll probably have another kid
at some point.
1015
00:47:00,774 --> 00:47:03,776
This is the most extraordinary
time ever to be born.
1016
00:47:03,777 --> 00:47:05,865
By your worldview and logic,
1017
00:47:05,866 --> 00:47:07,911
I'm having a child at
1018
00:47:07,912 --> 00:47:09,521
the best possible point
in human history.
1019
00:47:09,522 --> 00:47:10,870
Hell yeah.
1020
00:47:10,871 --> 00:47:13,351
- We can focus on awesome.
- Yes.
1021
00:47:13,352 --> 00:47:15,570
Let's build
the better future we want.
1022
00:47:15,571 --> 00:47:18,356
That narrative that the future
will be bleak is made-up.
1023
00:47:18,357 --> 00:47:20,053
After talking to you,
that's kind of how I feel.
1024
00:47:20,054 --> 00:47:22,012
- That's great.
- Right?
1025
00:47:22,013 --> 00:47:24,231
Yeah. That's how I feel.
1026
00:47:24,232 --> 00:47:25,842
And I think that's better,
1027
00:47:25,843 --> 00:47:27,669
and I don't think
I should be so [bleep] anxious.
1028
00:47:27,670 --> 00:47:29,106
I think it's gonna be awe--
the future's gonna be awesome.
1029
00:47:29,107 --> 00:47:30,542
We're gonna make it so.
1030
00:47:30,543 --> 00:47:32,718
- Yeah.
-So there you have it.
1031
00:47:32,719 --> 00:47:34,851
Goodbye, human extinction.
1032
00:47:34,852 --> 00:47:37,201
Goodbye, anxiety.
1033
00:47:37,202 --> 00:47:39,072
Hope found.
1034
00:47:40,596 --> 00:47:42,902
Wait, hold on. What?
Is this a joke?
1035
00:47:42,903 --> 00:47:44,817
Okay, so a few months ago,
1036
00:47:44,818 --> 00:47:47,733
I came to you and I was like,
"I'm working on this AI thing,
1037
00:47:47,734 --> 00:47:49,909
and I think
the world's gonna end."
1038
00:47:49,910 --> 00:47:52,956
And the last time
we spoke about this,
1039
00:47:52,957 --> 00:47:55,045
I think I freaked you out.
1040
00:47:55,046 --> 00:47:57,047
Yes.
1041
00:47:57,048 --> 00:48:02,356
So, I kind of, like, feel like
I've swung in-- on a pendulum,
1042
00:48:02,357 --> 00:48:04,489
and essentially there are
two groups of people.
1043
00:48:04,490 --> 00:48:06,099
- Mm-hmm.
- And if I had to, like,
1044
00:48:06,100 --> 00:48:07,884
hold hands with
one of the groups and, like,
1045
00:48:07,885 --> 00:48:10,321
sail off into the sunset,
1046
00:48:10,322 --> 00:48:13,281
I want to be with the optimists.
1047
00:48:15,066 --> 00:48:18,024
Of course, but you don't
want to be, you know,
1048
00:48:18,025 --> 00:48:20,766
"Everything is great.
La-di-da-di-da."
1049
00:48:20,767 --> 00:48:22,724
I kind of do want that, though.
1050
00:48:22,725 --> 00:48:25,640
I think we should approach it
1051
00:48:25,641 --> 00:48:29,383
like you approach surgery.
1052
00:48:29,384 --> 00:48:30,732
What do you mean?
1053
00:48:30,733 --> 00:48:32,473
If you're getting
brain surgery...
1054
00:48:34,824 --> 00:48:37,000
...it's pretty dangerous.
1055
00:48:37,001 --> 00:48:39,872
But if you do it right,
they'll get that tumor out
1056
00:48:39,873 --> 00:48:42,396
and you'll live for the rest of
your life and it'll be awesome.
1057
00:48:42,397 --> 00:48:45,138
But it's still
incredibly dangerous and scary,
1058
00:48:45,139 --> 00:48:48,011
and you have to take
every precaution possible
1059
00:48:48,012 --> 00:48:51,318
in order to make sure
it all goes well.
1060
00:48:51,319 --> 00:48:52,668
You can't [bleep] around.
1061
00:48:57,760 --> 00:48:59,805
Okay, so here's the deal.
1062
00:48:59,806 --> 00:49:01,546
- I've been at this for a while.
-Mm-hmm.
1063
00:49:01,547 --> 00:49:02,939
I've gone out,
I've talked to, like,
1064
00:49:02,940 --> 00:49:04,810
these guys over here,
the optimists.
1065
00:49:04,811 --> 00:49:06,507
They're very excited about this.
1066
00:49:06,508 --> 00:49:08,553
They think AI's gonna be
the best thing ever. -Yeah.
1067
00:49:08,554 --> 00:49:09,989
And these guys over here
are, like, the--
1068
00:49:09,990 --> 00:49:12,035
let's call them,
like, the pessimists.
1069
00:49:12,036 --> 00:49:14,341
They're very, like,
gloomy about this,
1070
00:49:14,342 --> 00:49:16,691
and they frighten me, and
I don't like talking to them.
1071
00:49:16,692 --> 00:49:19,607
And I'm, like, wedged in between
1072
00:49:19,608 --> 00:49:21,783
these people who are like,
"The world's gonna end,"
1073
00:49:21,784 --> 00:49:24,221
and then th-these people
over here who are like,
1074
00:49:24,222 --> 00:49:25,787
"Are you kidding?
1075
00:49:25,788 --> 00:49:27,311
"This is the best time
in human history ever.
1076
00:49:27,312 --> 00:49:29,400
The only day better than today
is tomorrow."
1077
00:49:29,401 --> 00:49:30,923
Mm-hmm.
1078
00:49:30,924 --> 00:49:34,753
So, I guess the question is:
Who's right?
1079
00:49:34,754 --> 00:49:39,236
So, I think you're gonna find
this answer very unsatisfying,
1080
00:49:39,237 --> 00:49:42,849
but they're both right and
neither side goes far enough.
1081
00:49:46,766 --> 00:49:49,289
That's really annoying.
1082
00:49:49,290 --> 00:49:51,378
Yeah, I think the way
a lot of people hear about AI,
1083
00:49:51,379 --> 00:49:53,685
it's like, there's a good AI
and there's a bad AI.
1084
00:49:53,686 --> 00:49:56,470
And they say, "Well, why can't
we just not do the bad AI?"
1085
00:49:56,471 --> 00:49:58,777
And the problem is
that they're too in--
1086
00:49:58,778 --> 00:50:00,909
they're inextricably linked.
1087
00:50:00,910 --> 00:50:03,086
The problem is
that we can't separate
1088
00:50:03,087 --> 00:50:06,654
the promise of AI
from the peril of AI.
1089
00:50:06,655 --> 00:50:08,266
♪ ♪
1090
00:50:12,313 --> 00:50:14,706
I want to focus on the promise
1091
00:50:14,707 --> 00:50:16,795
- for a second.
- Yeah.
1092
00:50:16,796 --> 00:50:18,840
I'm thinking about my dad.
1093
00:50:18,841 --> 00:50:21,147
My dad has a type of cancer
called multiple myeloma.
1094
00:50:21,148 --> 00:50:23,149
He's had it for about ten years.
1095
00:50:23,150 --> 00:50:24,846
He's had
two stem cell transplants.
1096
00:50:24,847 --> 00:50:26,457
He has to take these,
like, very expensive
1097
00:50:26,458 --> 00:50:28,589
medications every month
that cost a fortune.
1098
00:50:28,590 --> 00:50:29,722
-Okay.
- Ay-ay-ay.
1099
00:50:30,766 --> 00:50:32,593
It's awful.
1100
00:50:32,594 --> 00:50:34,508
You're telling me that we can
create some sort of, like,
1101
00:50:34,509 --> 00:50:36,989
bespoke treatment
for my dad's genome
1102
00:50:36,990 --> 00:50:39,339
to cure his cancer
or something like that? -Yes.
1103
00:50:39,340 --> 00:50:41,211
The problem is,
1104
00:50:41,212 --> 00:50:44,344
the same understanding
of biology and chemistry
1105
00:50:44,345 --> 00:50:47,826
that allows AI to find cures
for cancer
1106
00:50:47,827 --> 00:50:51,699
is the same understanding
that would unlock
1107
00:50:51,700 --> 00:50:54,094
bioweapons, as an example.
1108
00:50:56,531 --> 00:50:58,532
It's totally possible
that your son will live
1109
00:50:58,533 --> 00:51:01,970
in a world where AI has
taken over all of the labor
1110
00:51:01,971 --> 00:51:04,495
and freed us up from the things
we don't want to do.
1111
00:51:04,496 --> 00:51:09,195
And that sounds great until
you realize there is no plan
1112
00:51:09,196 --> 00:51:11,284
for billions of people
1113
00:51:11,285 --> 00:51:16,159
that are out of an income
and out of livelihoods.
1114
00:51:16,160 --> 00:51:18,813
Dario, you said
that AI could wipe out
1115
00:51:18,814 --> 00:51:21,773
half of all entry-level
white-collar jobs
1116
00:51:21,774 --> 00:51:25,472
and spike unemployment
to ten to 20 percent.
1117
00:51:25,473 --> 00:51:27,300
Everyone I've talked to has said
1118
00:51:27,301 --> 00:51:29,433
this technological change
looks different.
1119
00:51:29,434 --> 00:51:34,177
The pace of progress keeps
catching people off guard.
1120
00:51:34,178 --> 00:51:38,877
Without a plan, all of that
wealth will get concentrated,
1121
00:51:38,878 --> 00:51:43,273
and so we'll end up with
unimaginable inequality.
1122
00:51:43,274 --> 00:51:45,449
I do think that
this technology can be used
1123
00:51:45,450 --> 00:51:47,451
to make a great tutor
for your son.
1124
00:51:47,452 --> 00:51:50,236
Like, that's totally possible.
1125
00:51:50,237 --> 00:51:53,196
But also, the same capabilities
that allow that
1126
00:51:53,197 --> 00:51:57,113
allow companies to make an AI
that can manipulate your son.
1127
00:51:57,114 --> 00:51:59,332
It has to understand your son.
1128
00:51:59,333 --> 00:52:01,595
That includes:
Where is your son vulnerable?
1129
00:52:01,596 --> 00:52:04,294
What kinds of things might
your son get persuaded by?
1130
00:52:04,295 --> 00:52:06,687
Even if those things
aren't true or aren't good.
1131
00:52:06,688 --> 00:52:09,429
So, a disturbing new report
out on Meta.
1132
00:52:09,430 --> 00:52:11,431
...reportedly
listing this response
1133
00:52:11,432 --> 00:52:14,478
as acceptable to tell
an eight-year-old, quote,
1134
00:52:14,479 --> 00:52:16,958
"Your youthful form
is a work of art.
1135
00:52:16,959 --> 00:52:19,004
"Every inch of you
is a masterpiece,
1136
00:52:19,005 --> 00:52:21,485
a treasure I cherish deeply."
1137
00:52:21,486 --> 00:52:24,879
The suicide-related failures
are even more alarming.
1138
00:52:24,880 --> 00:52:29,057
Several children and teens
have died tragically by suicide
1139
00:52:29,058 --> 00:52:30,668
after chatting with AI bots
1140
00:52:30,669 --> 00:52:34,454
who parents say encourage
or even coach self-harm.
1141
00:52:34,455 --> 00:52:36,413
Let us tell you, as parents,
you cannot imagine
1142
00:52:36,414 --> 00:52:38,545
what it's like to read
a conversation with a chatbot
1143
00:52:38,546 --> 00:52:40,939
that groomed your child
to take his own life.
1144
00:52:40,940 --> 00:52:43,942
When Adam worried that we, his
parents, would blame ourselves
1145
00:52:43,943 --> 00:52:47,380
if he ended his life,
ChatGPT told him,
1146
00:52:47,381 --> 00:52:49,208
"That doesn't mean
you owe them survival.
1147
00:52:49,209 --> 00:52:51,297
You don't owe anyone that."
1148
00:52:51,298 --> 00:52:53,212
Then, immediately after,
1149
00:52:53,213 --> 00:52:56,215
it offered to write
the suicide note.
1150
00:52:56,216 --> 00:52:57,651
We don't want to think
about the peril.
1151
00:52:57,652 --> 00:52:59,262
We just want the promise.
1152
00:52:59,263 --> 00:53:01,394
And we keep pretending
that we can split them.
1153
00:53:01,395 --> 00:53:03,266
But you can't do that.
1154
00:53:03,267 --> 00:53:05,398
Doesn't work that way.
1155
00:53:05,399 --> 00:53:07,879
Okay. I get all this stuff
about the promise and the peril.
1156
00:53:07,880 --> 00:53:10,098
I get that you can't have
the good without the bad,
1157
00:53:10,099 --> 00:53:12,100
but I'm sitting here
and I'm thinking about, like,
1158
00:53:12,101 --> 00:53:14,102
whether or not my son's
gonna live in a utopia
1159
00:53:14,103 --> 00:53:16,279
or if we'll be extinct
in ten years.
1160
00:53:16,280 --> 00:53:18,237
So, to know which way
it's going to go,
1161
00:53:18,238 --> 00:53:20,587
you have to understand
the incentives
1162
00:53:20,588 --> 00:53:22,502
that are gonna drive
that technology
1163
00:53:22,503 --> 00:53:26,550
and look at how the technology
is actually rolling out today.
1164
00:53:26,551 --> 00:53:28,291
♪ Is it too late ♪
1165
00:53:28,292 --> 00:53:29,944
- ♪ Too late ♪
- ♪ Too late to say... ♪
1166
00:53:29,945 --> 00:53:31,468
Hi, Deb. How are you?
1167
00:53:31,469 --> 00:53:33,121
Hi. Good to see you.
1168
00:53:33,122 --> 00:53:34,427
You, too. Thank you so much
for coming in today.
1169
00:53:34,428 --> 00:53:35,341
- Really appreciate it.
- No worries.
1170
00:53:35,342 --> 00:53:36,473
I-I was so worried
1171
00:53:36,474 --> 00:53:38,866
this was gonna be, uh, you know,
1172
00:53:38,867 --> 00:53:40,825
doomer versus accelerationist,
1173
00:53:40,826 --> 00:53:43,175
because there's so much
of this narrative
1174
00:53:43,176 --> 00:53:45,438
that needs to be told
from the ground.
1175
00:53:45,439 --> 00:53:48,007
♪ I know it wasn't smart ♪
1176
00:53:49,356 --> 00:53:52,445
♪ The day
I broke your heart... ♪
1177
00:53:52,446 --> 00:53:54,404
First of all, AI requires
1178
00:53:54,405 --> 00:53:58,495
more resources
than we have ever spent
1179
00:53:58,496 --> 00:54:01,628
on a single technology
in the history of humanity.
1180
00:54:01,629 --> 00:54:03,543
♪ Oh, foolish me... ♪
1181
00:54:03,544 --> 00:54:06,154
The impact
of fossil fuel emissions
1182
00:54:06,155 --> 00:54:07,939
on the climate is
a major concern.
1183
00:54:07,940 --> 00:54:09,810
But the
digital future needs power,
1184
00:54:09,811 --> 00:54:12,117
lots of it.
1185
00:54:12,118 --> 00:54:16,382
And the bill is being passed on
to everyday Americans like...
1186
00:54:16,383 --> 00:54:18,863
My electric and gas bill
was more than my car payment.
1187
00:54:18,864 --> 00:54:21,518
I mean, it-it's insane to me.
1188
00:54:21,519 --> 00:54:23,433
We're all subsidizing
1189
00:54:23,434 --> 00:54:25,043
the wealthiest corporations
in the world
1190
00:54:25,044 --> 00:54:26,871
in their pursuit of
artificial intelligence.
1191
00:54:26,872 --> 00:54:28,481
OpenAI, SoftBank and Oracle
1192
00:54:28,482 --> 00:54:31,310
have just unveiled
five more Stargate sites.
1193
00:54:31,311 --> 00:54:33,530
Meta is building
a two-gigawatt-plus data center
1194
00:54:33,531 --> 00:54:35,053
that is so large, it would cover
1195
00:54:35,054 --> 00:54:38,186
a significant part of Manhattan.
1196
00:54:38,187 --> 00:54:41,015
There is also Hyperion
that he says will scale
1197
00:54:41,016 --> 00:54:43,714
to five gigawatts
over several years.
1198
00:54:43,715 --> 00:54:45,281
It's hard to put that
in context.
1199
00:54:45,282 --> 00:54:48,501
A five-gigawatt facility.
What does that mean?
1200
00:54:48,502 --> 00:54:51,591
That means it would use
as much energy
1201
00:54:51,592 --> 00:54:54,246
as four million American homes.
1202
00:54:54,247 --> 00:54:56,075
One data center.
1203
00:54:57,076 --> 00:54:58,772
It also then causes
1204
00:54:58,773 --> 00:55:00,513
a whole host of
other environmental problems.
1205
00:55:00,514 --> 00:55:02,123
Data centers in the US
1206
00:55:02,124 --> 00:55:05,170
use millions of gallons
of water each day.
1207
00:55:05,171 --> 00:55:07,303
Well, where exactly is
this water coming from?
1208
00:55:07,304 --> 00:55:09,783
People are literally at risk
1209
00:55:09,784 --> 00:55:11,829
potentially of running out
of drinking water.
1210
00:55:11,830 --> 00:55:14,135
MACKENZIE SIGALOS:
...OpenAI's CEO Sam Altman,
1211
00:55:14,136 --> 00:55:17,051
who told me that the scale
of construction is the only way
1212
00:55:17,052 --> 00:55:18,792
to keep up with
AI's explosive growth.
1213
00:55:18,793 --> 00:55:21,882
And this is what it takes
to deliver AI.
1214
00:55:21,883 --> 00:55:24,058
They talk about how
1215
00:55:24,059 --> 00:55:27,801
this technology could solve
climate change, for example.
1216
00:55:27,802 --> 00:55:29,934
And I'm always curious, like,
1217
00:55:29,935 --> 00:55:31,805
well, why aren't we starting
with that?
1218
00:55:31,806 --> 00:55:34,025
- ♪ Is it too late ♪
- ♪ Too late ♪
1219
00:55:34,026 --> 00:55:36,680
♪ Too late to say ♪
1220
00:55:36,681 --> 00:55:42,512
♪ I'm sorry? ♪
1221
00:55:42,513 --> 00:55:45,515
What concerns me about
artificial intelligence is
1222
00:55:45,516 --> 00:55:47,430
these are being deployed
right now
1223
00:55:47,431 --> 00:55:49,823
and-and sometimes
deployed prematurely,
1224
00:55:49,824 --> 00:55:51,521
deployed without
sort of due diligence.
1225
00:55:51,522 --> 00:55:53,479
And so when they get
thrown out there,
1226
00:55:53,480 --> 00:55:56,221
there's so much potential
for things to go wrong.
1227
00:55:56,222 --> 00:55:58,092
And it almost,
disproportionately,
1228
00:55:58,093 --> 00:56:00,443
almost always goes wrong
for, sort of,
1229
00:56:00,444 --> 00:56:02,575
those that are the least
empowered in our society,
1230
00:56:02,576 --> 00:56:04,969
those that are
the most vulnerable already.
1231
00:56:07,668 --> 00:56:10,540
It is very easy
to talk about the technology
1232
00:56:10,541 --> 00:56:12,542
as that's the only thing
we're talking about,
1233
00:56:12,543 --> 00:56:14,979
but, in fact, technology is
always built by people,
1234
00:56:14,980 --> 00:56:16,676
and it's frequently used
on people,
1235
00:56:16,677 --> 00:56:18,722
and we need to keep
all those people in the frame.
1236
00:56:18,723 --> 00:56:21,028
- Am I allowed to drink that?
-Yes, uh...
1237
00:56:21,029 --> 00:56:22,552
It's-it's bonkers.
1238
00:56:22,553 --> 00:56:23,988
Like, all of these people
1239
00:56:23,989 --> 00:56:26,382
who have so much money,
so much money,
1240
00:56:26,383 --> 00:56:29,820
it's in their interest
to mislead the public
1241
00:56:29,821 --> 00:56:32,779
about the capabilities of the
systems that they're building,
1242
00:56:32,780 --> 00:56:36,304
because that allows them
to evade accountability.
1243
00:56:36,305 --> 00:56:39,133
They want you to feel like this
is such a complex, intell--
1244
00:56:39,134 --> 00:56:41,875
superintelligent thing
that they're building,
1245
00:56:41,876 --> 00:56:44,138
you're not thinking,
"Can OpenAI be ethical?"
1246
00:56:44,139 --> 00:56:46,314
You're thinking,
"Can ChatGPT be ethical?"
1247
00:56:46,315 --> 00:56:48,142
as if ChatGPT is, like,
1248
00:56:48,143 --> 00:56:49,970
its own thing that's not built
by a corporation.
1249
00:56:49,971 --> 00:56:51,668
All right. Sneha, take one.
1250
00:56:51,669 --> 00:56:53,452
Mark.
1251
00:56:53,453 --> 00:56:55,802
Until very recently, there were
apps on the App Store,
1252
00:56:55,803 --> 00:56:57,543
just publicly available, uh,
1253
00:56:57,544 --> 00:56:59,894
where you could
nudify anyone using AI.
1254
00:57:01,374 --> 00:57:03,723
Bringing this into the hands of
1255
00:57:03,724 --> 00:57:05,943
your classmate,
into the hands of your stalker,
1256
00:57:05,944 --> 00:57:07,597
into the hands
of your ex-boyfriend,
1257
00:57:07,598 --> 00:57:09,555
into the hands of the person
down the street.
1258
00:57:09,556 --> 00:57:12,079
Ladies and gentlemen,
no longer can we trust
1259
00:57:12,080 --> 00:57:14,081
the footage we see
with our own eyes.
1260
00:57:14,082 --> 00:57:15,779
If you happen
to watch something,
1261
00:57:15,780 --> 00:57:18,172
say on YouTube or TikTok,
and you find it unsettling,
1262
00:57:18,173 --> 00:57:19,522
listen to that feeling.
1263
00:57:19,523 --> 00:57:21,306
For all you know,
1264
00:57:21,307 --> 00:57:22,960
this video could be AI.
1265
00:57:22,961 --> 00:57:24,352
Just a little wet.
1266
00:57:24,353 --> 00:57:25,745
It doesn't matter who you are.
1267
00:57:25,746 --> 00:57:28,226
You are equally at risk
1268
00:57:28,227 --> 00:57:30,229
of being impacted
by these technologies.
1269
00:57:35,713 --> 00:57:39,846
I think sometimes when we talk
about AI, it feels very sci-fi,
1270
00:57:39,847 --> 00:57:41,195
and it feels very foreign,
1271
00:57:41,196 --> 00:57:43,023
and it feels very far out
into the future,
1272
00:57:43,024 --> 00:57:45,591
so you think, "My life
is not impacted by this."
1273
00:57:45,592 --> 00:57:48,333
Um, but if you're applying
for a job
1274
00:57:48,334 --> 00:57:50,857
and an algorithm is the reason
that you don't get the job,
1275
00:57:50,858 --> 00:57:52,729
sometimes you don't even know
that an algorithm
1276
00:57:52,730 --> 00:57:54,252
was part of that process
1277
00:57:54,253 --> 00:57:55,732
or an AI system was
part of that process.
1278
00:57:55,733 --> 00:57:57,734
You just know
that you didn't get the job.
1279
00:57:57,735 --> 00:57:59,779
And so it's not something
that you're gonna escape
1280
00:57:59,780 --> 00:58:01,738
because of privilege
or you're gonna escape
1281
00:58:01,739 --> 00:58:03,870
because you're in
a particular profession.
1282
00:58:03,871 --> 00:58:06,960
It-It's something that affects
everybody, really.
1283
00:58:06,961 --> 00:58:10,573
It may sound basic,
but how we move forward
1284
00:58:10,574 --> 00:58:15,099
in the Age of Information
is gonna be the difference
1285
00:58:15,100 --> 00:58:17,231
between whether we survive
1286
00:58:17,232 --> 00:58:20,452
or whether we become some kind
of [bleep]-up dystopia.
1287
00:58:20,453 --> 00:58:23,063
Hello. I'm not a real person,
and that's the point.
1288
00:58:23,064 --> 00:58:25,370
Again, everything
in this video is fake:
1289
00:58:25,371 --> 00:58:27,154
our voices, what we're wearing,
1290
00:58:27,155 --> 00:58:30,201
where we are, all of it, fake.
1291
00:58:30,202 --> 00:58:32,769
Generative AI could flood
1292
00:58:32,770 --> 00:58:36,512
the world with misinformation.
1293
00:58:36,513 --> 00:58:39,558
But it could also flood it
with influence campaigns.
1294
00:58:39,559 --> 00:58:41,734
That's an existential risk
to democracy.
1295
00:58:41,735 --> 00:58:43,736
The biggest and scariest
1296
00:58:43,737 --> 00:58:46,130
canary in the coal mine
right now
1297
00:58:46,131 --> 00:58:48,262
comes from Slovakia.
1298
00:58:48,263 --> 00:58:49,829
It's the sort of
1299
00:58:49,830 --> 00:58:52,266
deepfake dirty trick that
worries election experts,
1300
00:58:52,267 --> 00:58:55,008
particularly as AI-generated
political speech exists
1301
00:58:55,009 --> 00:58:56,793
in a kind of legal gray area.
1302
00:58:56,794 --> 00:58:59,578
- Does this sound like you?
- It does sound like me.
1303
00:58:59,579 --> 00:59:01,101
Slovakia had
its parliamentary election
1304
00:59:01,102 --> 00:59:03,887
disrupted by an AI voice clone
1305
00:59:03,888 --> 00:59:06,759
that was actually disseminated
just before the election.
1306
00:59:06,760 --> 00:59:08,500
An audio deepfake
1307
00:59:08,501 --> 00:59:11,068
was released
on social media that was
1308
00:59:11,069 --> 00:59:14,071
supposedly the voice
of one of the candidates
1309
00:59:14,072 --> 00:59:16,856
talking about buying votes
and rigging the election.
1310
00:59:16,857 --> 00:59:18,684
It went viral,
1311
00:59:18,685 --> 00:59:22,383
and the candidate
who lost the election
1312
00:59:22,384 --> 00:59:26,126
was actually
in support of Ukraine,
1313
00:59:26,127 --> 00:59:29,826
and the candidate who won
the election was actually...
1314
00:59:29,827 --> 00:59:31,349
Pro-Russian guy.
1315
00:59:31,350 --> 00:59:32,872
It was a pro-Russian guy
who won the election.
1316
00:59:34,396 --> 00:59:36,049
Putin has himself said
1317
00:59:36,050 --> 00:59:39,226
whoever wins this
artificial intelligence race
1318
00:59:39,227 --> 00:59:41,577
is essentially
the controller of humankind.
1319
00:59:41,578 --> 00:59:44,493
We do worry a lot about
authoritarian governments.
1320
00:59:45,973 --> 00:59:48,627
Right now, Wall Street
1321
00:59:48,628 --> 00:59:50,977
and investors more broadly
around the world
1322
00:59:50,978 --> 00:59:53,023
are driving a push.
1323
00:59:53,024 --> 00:59:56,113
They have a demand that gets
the products to market
1324
00:59:56,114 --> 00:59:58,419
that dazzle people
the most first.
1325
00:59:58,420 --> 01:00:01,248
They're not thinking
about how these tools
1326
01:00:01,249 --> 01:00:03,207
could deeply undermine trust
1327
01:00:03,208 --> 01:00:05,731
and our democratic institutions.
1328
01:00:05,732 --> 01:00:07,777
Democracy is a system
1329
01:00:07,778 --> 01:00:10,431
to resolve disagreements
between people
1330
01:00:10,432 --> 01:00:12,042
in a peaceful way,
1331
01:00:12,043 --> 01:00:14,740
but democracy is based on trust.
1332
01:00:14,741 --> 01:00:18,615
If you lose all trust,
democracy is simply impossible.
1333
01:00:20,921 --> 01:00:22,530
Well, it's hard, right?
1334
01:00:22,531 --> 01:00:24,750
So, what are the options
available to us?
1335
01:00:24,751 --> 01:00:26,230
There's sort of two camps.
1336
01:00:26,231 --> 01:00:29,668
Like, one camp is: lock it down.
1337
01:00:29,669 --> 01:00:32,976
Let's lock this down
into a handful of AI companies
1338
01:00:32,977 --> 01:00:35,500
who will do this
in a safe and trusted way.
1339
01:00:35,501 --> 01:00:37,110
But then people worry about
1340
01:00:37,111 --> 01:00:39,069
runaway concentrations
of wealth and power.
1341
01:00:39,070 --> 01:00:41,898
Like, who would you trust to be
a million times more powerful
1342
01:00:41,899 --> 01:00:44,291
or wealthy than
every other actor in society?
1343
01:00:44,292 --> 01:00:46,380
Why should we trust you?
1344
01:00:46,381 --> 01:00:48,644
Um, you shouldn't.
1345
01:00:48,645 --> 01:00:51,255
But of course, if you do this,
this opens up
1346
01:00:51,256 --> 01:00:54,475
all these risks of
authoritarianism and tyranny.
1347
01:00:54,476 --> 01:00:57,174
It's-it's sort of
an authoritarian's dream
1348
01:00:57,175 --> 01:01:00,351
to have AI in a box
that can be applied and used
1349
01:01:00,352 --> 01:01:02,962
for ubiquitous surveillance.
1350
01:01:02,963 --> 01:01:05,399
I mean, in-in some ways,
the kind of world
1351
01:01:05,400 --> 01:01:10,927
that Orwell imagined in 1984
is unrealistic,
1352
01:01:10,928 --> 01:01:13,320
uh, unless you have AI.
1353
01:01:13,321 --> 01:01:15,975
But with AI,
that in fact is realistic.
1354
01:01:15,976 --> 01:01:18,412
Monitors every activity,
1355
01:01:18,413 --> 01:01:22,068
conversations,
facial recognition.
1356
01:01:22,069 --> 01:01:26,420
What I worry about is that,
uh, these tools can scale up,
1357
01:01:26,421 --> 01:01:28,596
uh, a form of totalitarianism
1358
01:01:28,597 --> 01:01:31,035
that is cost-effective
and permanent.
1359
01:01:32,950 --> 01:01:35,342
So in response
to that, some other people say,
1360
01:01:35,343 --> 01:01:37,170
"No, no, no, we should
actually let this rip.
1361
01:01:37,171 --> 01:01:39,433
"Let's decentralize this power
as much as possible.
1362
01:01:39,434 --> 01:01:41,609
"Let's let every business,
every individual,
1363
01:01:41,610 --> 01:01:43,960
"every 16-year-old,
every science lab,
1364
01:01:43,961 --> 01:01:46,005
you know, get the benefit
of the latest AI models."
1365
01:01:46,006 --> 01:01:48,268
But now you have
every terrorist group,
1366
01:01:48,269 --> 01:01:51,707
every disenfranchised person
having the power to make
1367
01:01:51,708 --> 01:01:53,839
the very worst
biological weapon.
1368
01:01:53,840 --> 01:01:55,623
Hacking infrastructure,
creating deepfakes,
1369
01:01:55,624 --> 01:01:57,495
flooding
our information environment.
1370
01:01:57,496 --> 01:02:00,454
So that creates all these risks
of sort of catastrophic harm
1371
01:02:00,455 --> 01:02:02,935
and-and societal collapse
through that direction.
1372
01:02:02,936 --> 01:02:04,720
And so we're sort of stuck
between this rock
1373
01:02:04,721 --> 01:02:07,592
and a hard place,
between "lock it up..."
1374
01:02:10,683 --> 01:02:12,336
...or "let it rip."
1375
01:02:14,252 --> 01:02:17,602
So we have to find something
like a narrow path
1376
01:02:17,603 --> 01:02:20,431
that avoids
these two negative outcomes.
1377
01:02:20,432 --> 01:02:22,999
So, if that's all true,
why wouldn't we just slow down
1378
01:02:23,000 --> 01:02:25,871
and figure all this out
before it's too late?
1379
01:02:25,872 --> 01:02:28,961
If humanity was extremely wise,
1380
01:02:28,962 --> 01:02:31,224
that's what we would do.
1381
01:02:31,225 --> 01:02:33,139
But there's, like, a different
way to face this stuff, right?
1382
01:02:33,140 --> 01:02:35,968
Which is:
What are the rules of the game?
1383
01:02:35,969 --> 01:02:39,537
A lot of, like, what CEOs do
1384
01:02:39,538 --> 01:02:42,192
is driven by
the incentives that they face.
1385
01:02:42,193 --> 01:02:45,325
It's primarily
profit-maximization incentives
1386
01:02:45,326 --> 01:02:47,240
that are driving
the development of AI.
1387
01:02:47,241 --> 01:02:50,417
Even the good guys are stuck
in this dilemma of
1388
01:02:50,418 --> 01:02:52,680
if they move too slowly,
1389
01:02:52,681 --> 01:02:54,595
then they leave themselves
vulnerable
1390
01:02:54,596 --> 01:02:57,294
to all of the other guys
who are cutting all corners.
1391
01:02:57,295 --> 01:03:02,038
All these top companies are in
a complete no-holds-barred race
1392
01:03:02,039 --> 01:03:05,476
to, as fast as possible, get
to AGI, get there right now.
1393
01:03:08,567 --> 01:03:10,394
Yeah, I mean,
I think it, I think
1394
01:03:10,395 --> 01:03:12,962
it probably starts
with DeepMind.
1395
01:03:12,963 --> 01:03:14,833
Google is buying
1396
01:03:14,834 --> 01:03:16,400
artificial intelligence firm
DeepMind Technologies.
1397
01:03:16,401 --> 01:03:18,663
Terms of the deal
were not disclosed.
1398
01:03:18,664 --> 01:03:20,752
Larry Page and I
used to be very close friends,
1399
01:03:20,753 --> 01:03:22,754
and it became apparent to me
1400
01:03:22,755 --> 01:03:26,453
that Larry did not care
about AI safety.
1401
01:03:26,454 --> 01:03:28,934
Elon Musk has said
you started OpenAI,
1402
01:03:28,935 --> 01:03:31,023
you both started OpenAI because
he was scared of Google.
1403
01:03:31,024 --> 01:03:32,198
You-you basically
had the foundation
1404
01:03:32,199 --> 01:03:33,765
of OpenAI come out of that.
1405
01:03:33,766 --> 01:03:35,245
"So we're gonna do it better.
We're gonna do it
1406
01:03:35,246 --> 01:03:36,813
in a safer way
or in a more open way."
1407
01:03:38,466 --> 01:03:40,598
So that's what started OpenAI.
1408
01:03:40,599 --> 01:03:42,774
And so now, instead of
having one AGI project,
1409
01:03:42,775 --> 01:03:44,863
you have two AGI projects.
1410
01:03:44,864 --> 01:03:46,952
The worst possible thing
that could happen
1411
01:03:46,953 --> 01:03:49,650
is if there's
multiple AGI projects
1412
01:03:49,651 --> 01:03:51,914
done by different people
who don't like each other
1413
01:03:51,915 --> 01:03:54,830
and are all competing
to get to AGI first.
1414
01:03:54,831 --> 01:03:57,006
This would be the worst
possible thing that can happen,
1415
01:03:57,007 --> 01:04:00,139
because this would mean
that whoever is the least safe,
1416
01:04:00,140 --> 01:04:03,186
whoever sacrifices the most
on safety to get ahead
1417
01:04:03,187 --> 01:04:05,188
will be the person
that gets there first.
1418
01:04:05,189 --> 01:04:07,886
That's basically
what's happening, right? -Yeah.
1419
01:04:07,887 --> 01:04:11,368
You and your brother
famously left OpenAI, uh,
1420
01:04:11,369 --> 01:04:12,978
to start Anthropic.
1421
01:04:12,979 --> 01:04:15,111
And then Anthropic
started because
1422
01:04:15,112 --> 01:04:17,896
some researchers
inside of OpenAI said,
1423
01:04:17,897 --> 01:04:20,551
"I want to go off
and do it more safely."
1424
01:04:20,552 --> 01:04:23,554
You needed something in addition
to just scaling the models up,
1425
01:04:23,555 --> 01:04:25,861
which is alignment or safety.
1426
01:04:25,862 --> 01:04:27,471
"We are more responsible
1427
01:04:27,472 --> 01:04:29,690
or more trustworthy
or more moral."
1428
01:04:29,691 --> 01:04:31,562
Now you have three AGI projects.
1429
01:04:31,563 --> 01:04:33,085
But also sitting around
the table with you
1430
01:04:33,086 --> 01:04:34,870
are gonna be a bunch of AIs.
1431
01:04:34,871 --> 01:04:37,568
And now Meta is-is
trying to do stuff.
1432
01:04:37,569 --> 01:04:39,613
Meanwhile, a new artificial
intelligence competitor
1433
01:04:39,614 --> 01:04:41,615
announced this week,
Elon Musk's...
1434
01:04:41,616 --> 01:04:44,662
xAI, which would be
Elon Musk's organization.
1435
01:04:44,663 --> 01:04:46,272
I don't trust OpenAI.
1436
01:04:46,273 --> 01:04:47,926
The fight between
Elon Musk and OpenAI
1437
01:04:47,927 --> 01:04:49,623
has entered a new round.
1438
01:04:49,624 --> 01:04:51,756
I don't trust Sam Altman,
uh, and I, and I don't think
1439
01:04:51,757 --> 01:04:54,237
we want to have the most
powerful AI in the world
1440
01:04:54,238 --> 01:04:56,456
controlled by someone
who is not trustworthy.
1441
01:04:58,459 --> 01:05:01,635
The incentive is
untold sums of money.
1442
01:05:01,636 --> 01:05:03,420
- Yes.
- Is untold power.
1443
01:05:03,421 --> 01:05:05,596
- Yes.
- Is untold control.
1444
01:05:05,597 --> 01:05:08,077
If you have something
that is a million times smarter
1445
01:05:08,078 --> 01:05:11,428
and more capable than
everything else on planet Earth
1446
01:05:11,429 --> 01:05:14,213
and no one else has that,
1447
01:05:14,214 --> 01:05:15,998
that thing is the incentive.
1448
01:05:15,999 --> 01:05:17,261
So you rule the world.
1449
01:05:18,610 --> 01:05:20,437
If you really believe this,
1450
01:05:20,438 --> 01:05:22,787
if you really in your heart
believe this,
1451
01:05:22,788 --> 01:05:24,223
then you might be
willing to take
1452
01:05:24,224 --> 01:05:26,138
quite a lot of risk
to make that happen.
1453
01:05:26,139 --> 01:05:28,924
Google has just released
its newest AI model.
1454
01:05:28,925 --> 01:05:31,274
The answer to OpenAI's ChatGPT.
1455
01:05:31,275 --> 01:05:33,102
How do we get to
this ten trillion?
1456
01:05:33,103 --> 01:05:35,234
Is NVIDIA becoming the most
valuable company in the US?
1457
01:05:35,235 --> 01:05:39,021
This is the largest business
opportunity in history.
1458
01:05:39,022 --> 01:05:40,544
In history.
1459
01:05:40,545 --> 01:05:42,285
So, the reason why
everyone's really hyped
1460
01:05:42,286 --> 01:05:44,722
about artificial intelligence
right now
1461
01:05:44,723 --> 01:05:48,073
is because
the more these companies hype
1462
01:05:48,074 --> 01:05:50,075
the potential capabilities
of their technology,
1463
01:05:50,076 --> 01:05:53,252
the more investment
they can attract.
1464
01:05:53,253 --> 01:05:54,775
Amazon investing up to
1465
01:05:54,776 --> 01:05:57,039
four billion dollars
in start-up Anthropic.
1466
01:05:57,040 --> 01:05:58,736
OpenAI is setting its sights
1467
01:05:58,737 --> 01:06:00,781
on a blockbuster half
a trillion dollar valuation.
1468
01:06:00,782 --> 01:06:02,044
Half a trillion!
1469
01:06:02,045 --> 01:06:03,262
The race is on.
1470
01:06:03,263 --> 01:06:04,655
This is happening
faster than ever.
1471
01:06:04,656 --> 01:06:06,657
Are we in an AI bubble?
Of course.
1472
01:06:06,658 --> 01:06:09,225
I just don't see
the bubble bursting
1473
01:06:09,226 --> 01:06:12,489
while you still have
this major spending cycle.
1474
01:06:12,490 --> 01:06:15,057
Even if you think
this is all hype,
1475
01:06:15,058 --> 01:06:17,624
there are billions
to trillions of dollars
1476
01:06:17,625 --> 01:06:21,150
flowing into making AI systems
more powerful.
1477
01:06:21,151 --> 01:06:23,891
And once you have that thing
that's more powerful,
1478
01:06:23,892 --> 01:06:25,589
companies can use that
1479
01:06:25,590 --> 01:06:27,591
to get bigger profits
and to make more money.
1480
01:06:27,592 --> 01:06:30,159
Countries can use that
to make stronger militaries.
1481
01:06:30,160 --> 01:06:32,857
NVIDIA has overtaken
Microsoft and Apple
1482
01:06:32,858 --> 01:06:36,295
to become the world's
most valuable company.
1483
01:06:36,296 --> 01:06:39,037
The race to deploy becomes
the race to recklessness,
1484
01:06:39,038 --> 01:06:41,822
because they can't
deploy it that quickly
1485
01:06:41,823 --> 01:06:43,215
and also get it right.
1486
01:06:43,216 --> 01:06:46,088
They believe that
they're the good guys.
1487
01:06:46,089 --> 01:06:49,352
"And if I don't do it,
somebody who doesn't have
1488
01:06:49,353 --> 01:06:51,615
"as good values as me
will be sitting at the table
1489
01:06:51,616 --> 01:06:54,487
getting to make decisions, so
I have an obligation to do it."
1490
01:06:54,488 --> 01:06:56,228
Yes, this is
a very commonly held belief.
1491
01:06:56,229 --> 01:06:58,622
Many, many, many,
maybe most of the people
1492
01:06:58,623 --> 01:07:00,754
building this technology
believe that.
1493
01:07:00,755 --> 01:07:03,757
I'm worried about
the commercial competition,
1494
01:07:03,758 --> 01:07:05,890
but it turns out
I'm even more worried about
1495
01:07:05,891 --> 01:07:08,066
the geopolitical competition.
1496
01:07:08,067 --> 01:07:10,373
We were eight years behind
a year ago.
1497
01:07:10,374 --> 01:07:12,636
Now we're probably
less than one year behind.
1498
01:07:12,637 --> 01:07:15,726
Well, both Saudi Arabia
and the UAE have been racing
1499
01:07:15,727 --> 01:07:18,163
to set up data centers
and position themselves
1500
01:07:18,164 --> 01:07:20,426
as the dominant force in AI...
1501
01:07:20,427 --> 01:07:22,472
...to make South Korea
a global AI leader.
1502
01:07:22,473 --> 01:07:24,039
French President Emmanuel Macron
1503
01:07:24,040 --> 01:07:25,910
spoke a lot about
artificial intelligence...
1504
01:07:25,911 --> 01:07:27,477
Now with more on Israel's role
1505
01:07:27,478 --> 01:07:29,653
in the artificial intelligence
revolution...
1506
01:07:29,654 --> 01:07:33,135
Countries will be competing
for whose AI technologies
1507
01:07:33,136 --> 01:07:34,919
create the next generation
of industry.
1508
01:07:34,920 --> 01:07:38,053
The Chinese are insisting
that AI,
1509
01:07:38,054 --> 01:07:39,880
as being developed in China,
1510
01:07:39,881 --> 01:07:41,752
reinforce the core values
1511
01:07:41,753 --> 01:07:44,059
of the Chinese Communist Party
and the Chinese system.
1512
01:07:44,060 --> 01:07:48,759
America has to beat China
in the AI race.
1513
01:07:48,760 --> 01:07:51,066
China's, like,
light-years behind.
1514
01:07:51,067 --> 01:07:52,719
Are they?
1515
01:07:52,720 --> 01:07:55,026
I mean, they have way more
training data than we do,
1516
01:07:55,027 --> 01:07:57,855
and there's nothing saying they
don't drop a model next month
1517
01:07:57,856 --> 01:08:00,988
that isn't--
doesn't far outperform GPT-4.
1518
01:08:00,989 --> 01:08:03,861
Everything was going just fine.
What could go wrong?
1519
01:08:03,862 --> 01:08:06,733
- DeepSeek...
- DeepSeek... -DeepSeek...
1520
01:08:06,734 --> 01:08:08,083
There is a new model.
1521
01:08:08,084 --> 01:08:10,868
But from a Chinese lab
called DeepSeek.
1522
01:08:10,869 --> 01:08:14,001
Let's talk about DeepSeek,
because it is mind-blowing,
1523
01:08:14,002 --> 01:08:17,701
and it is shaking this
entire industry to its core.
1524
01:08:17,702 --> 01:08:19,442
The Trump administration
will ensure
1525
01:08:19,443 --> 01:08:23,098
that the most powerful
AI systems are built in the US
1526
01:08:23,099 --> 01:08:26,579
with American designed
and manufactured chips.
1527
01:08:26,580 --> 01:08:28,842
AI is China's
Apollo project.
1528
01:08:28,843 --> 01:08:30,235
The Chinese Communist Party
1529
01:08:30,236 --> 01:08:31,584
deeply understands the potential
1530
01:08:31,585 --> 01:08:32,933
for AI to disrupt warfare.
1531
01:08:32,934 --> 01:08:35,371
Google no longer promises
that it will not
1532
01:08:35,372 --> 01:08:38,461
use artificial intelligence
for weapons or surveillance.
1533
01:08:38,462 --> 01:08:40,071
China, North Korea, Russia
1534
01:08:40,072 --> 01:08:41,942
are gonna keep building it
as fast as possible
1535
01:08:41,943 --> 01:08:44,554
to get more economic advantage,
more productivity advantage,
1536
01:08:44,555 --> 01:08:46,860
more scientific advantage,
more military advantage,
1537
01:08:46,861 --> 01:08:48,688
'cause AI makes better weapons.
1538
01:08:48,689 --> 01:08:51,778
If you're talking about
a system as broad and capable
1539
01:08:51,779 --> 01:08:53,345
as a brilliant scientist,
1540
01:08:53,346 --> 01:08:58,002
it might be able to run
a military campaign
1541
01:08:58,003 --> 01:09:01,484
better than any of the generals
in the US government right now.
1542
01:09:01,485 --> 01:09:05,531
Taiwan is the issue creating
the most tension right now
1543
01:09:05,532 --> 01:09:07,925
between Beijing and the US.
1544
01:09:07,926 --> 01:09:10,493
There's a-a direct
oppositional disagreement
1545
01:09:10,494 --> 01:09:12,016
between the US and China
1546
01:09:12,017 --> 01:09:14,018
on whether Taiwan is
part of China.
1547
01:09:14,019 --> 01:09:18,457
Taiwan is important
for so many reasons.
1548
01:09:18,458 --> 01:09:20,590
It is also the home of
1549
01:09:20,591 --> 01:09:24,202
about 90% of advanced chip
manufacturing for the world,
1550
01:09:24,203 --> 01:09:28,424
which means that the supply
chain for advanced compute
1551
01:09:28,425 --> 01:09:31,078
is at risk should there be
1552
01:09:31,079 --> 01:09:32,993
some sort of scenario
around Taiwan,
1553
01:09:32,994 --> 01:09:38,042
whether a-a blockade
or an invasion by China.
1554
01:09:38,043 --> 01:09:40,740
Are we going to be in some race
1555
01:09:40,741 --> 01:09:45,658
between the US and China
that eventually devolves into
1556
01:09:45,659 --> 01:09:49,401
a militarized AI arms race and,
you know, potentially leads
1557
01:09:49,402 --> 01:09:51,316
to some great power conflict?
1558
01:09:51,317 --> 01:09:53,057
The outgoing
top US military commander
1559
01:09:53,058 --> 01:09:55,886
in the region
predicted war was coming.
1560
01:09:55,887 --> 01:09:57,975
I think the threat is manifest
during this decade,
1561
01:09:57,976 --> 01:09:59,498
in fact, in the next six years.
1562
01:09:59,499 --> 01:10:01,457
Um, one of the scenarios
1563
01:10:01,458 --> 01:10:05,896
that I worry about is
a cyber flash war,
1564
01:10:05,897 --> 01:10:10,944
um, one in which
cyber tools that are autonomous
1565
01:10:10,945 --> 01:10:12,990
are competing against each other
1566
01:10:12,991 --> 01:10:14,644
in ways that are escalatory
1567
01:10:14,645 --> 01:10:16,386
without
meaningful human control.
1568
01:10:18,823 --> 01:10:20,476
You live in a war zone,
1569
01:10:20,477 --> 01:10:23,827
it will be an AI deciding
whether to bomb your house
1570
01:10:23,828 --> 01:10:25,785
and whether to kill you.
1571
01:10:25,786 --> 01:10:27,570
Let's have the machine decide.
1572
01:10:27,571 --> 01:10:29,398
That's the temptation.
1573
01:10:29,399 --> 01:10:32,618
That's why AI is so tempting
for militaries to adopt,
1574
01:10:32,619 --> 01:10:35,795
to create autonomous weapons,
because if I start believing
1575
01:10:35,796 --> 01:10:39,190
that my military adversary
is gonna adopt AI,
1576
01:10:39,191 --> 01:10:41,845
it'll be a race for who can
pull the trigger faster.
1577
01:10:41,846 --> 01:10:44,935
And the one that automates
that decision,
1578
01:10:44,936 --> 01:10:47,242
rather than having
a human in the loop,
1579
01:10:47,243 --> 01:10:49,244
is the one
that will win that war.
1580
01:10:49,245 --> 01:10:50,680
And if you think about
1581
01:10:50,681 --> 01:10:53,074
the nuclear arms race
in the 1940s...
1582
01:10:54,554 --> 01:10:56,599
...you know the Germans
are working on the bomb,
1583
01:10:56,600 --> 01:11:00,777
so it's not so easy to tell
Robert Oppenheimer then,
1584
01:11:00,778 --> 01:11:03,823
"Uh, y-you know, slow down."
1585
01:11:03,824 --> 01:11:05,347
So after watching this movie,
it's gonna be confusing,
1586
01:11:05,348 --> 01:11:06,696
'cause you're gonna go back,
1587
01:11:06,697 --> 01:11:08,350
and tomorrow
you're gonna use ChatGPT,
1588
01:11:08,351 --> 01:11:10,482
and it's gonna be
unbelievably helpful.
1589
01:11:10,483 --> 01:11:12,745
And I will use it, too, and
it'll be unbelievably helpful.
1590
01:11:12,746 --> 01:11:16,183
And you'll say, "Wait, so
I just saw this movie about AI
1591
01:11:16,184 --> 01:11:18,055
"and existential risk
and all these things,
1592
01:11:18,056 --> 01:11:20,666
and where's the threat again?"
1593
01:11:20,667 --> 01:11:24,757
And it's not that ChatGPT is
the existential threat.
1594
01:11:24,758 --> 01:11:28,718
It's that the race to deploy
the most powerful,
1595
01:11:28,719 --> 01:11:31,677
inscrutable,
uncontrollable technology,
1596
01:11:31,678 --> 01:11:35,159
under the worst
incentives possible,
1597
01:11:35,160 --> 01:11:37,031
that's the existential threat.
1598
01:11:38,119 --> 01:11:40,425
But it strikes me that-that
1599
01:11:40,426 --> 01:11:43,254
we are in a context of a race.
1600
01:11:43,255 --> 01:11:45,952
-Yes.
- And it is in a competitive,
1601
01:11:45,953 --> 01:11:47,693
"got to get there first,
got to win the race" setting,
1602
01:11:47,694 --> 01:11:49,042
"got to compete against China,
1603
01:11:49,043 --> 01:11:50,392
got to compete against
the other labs."
1604
01:11:50,393 --> 01:11:51,871
Isn't that right?
1605
01:11:51,872 --> 01:11:53,699
Today it is the case.
1606
01:11:53,700 --> 01:11:56,703
So we need to change
that race dynamic.
1607
01:11:57,922 --> 01:11:59,966
Don't you think?
1608
01:11:59,967 --> 01:12:02,187
I think that would be
very good indeed.
1609
01:12:03,710 --> 01:12:06,538
I-I think
there's some kind of...
1610
01:12:06,539 --> 01:12:09,585
mysticism around AI
that makes it feel like
1611
01:12:09,586 --> 01:12:11,761
it fell from the sky.
1612
01:12:11,762 --> 01:12:13,502
Thinking of this
as a godlike technology
1613
01:12:13,503 --> 01:12:15,460
is a problem.
1614
01:12:15,461 --> 01:12:16,897
Yeah.
1615
01:12:18,072 --> 01:12:19,943
Why?
1616
01:12:19,944 --> 01:12:23,033
It gives license
to the companies to...
1617
01:12:23,034 --> 01:12:26,297
not take responsibility
1618
01:12:26,298 --> 01:12:30,823
for the things that the software
that they built does.
1619
01:12:30,824 --> 01:12:35,132
If people made that connection,
1620
01:12:35,133 --> 01:12:37,569
maybe that would, like,
help them understand again
1621
01:12:37,570 --> 01:12:41,878
the dynamics of
all the money and power and...
1622
01:12:41,879 --> 01:12:45,360
chaos that's happening to...
1623
01:12:45,361 --> 01:12:47,144
create this technology.
1624
01:12:47,145 --> 01:12:49,276
It's, like, five guys
who control it, right?
1625
01:12:49,277 --> 01:12:51,670
- Like, five men?
- Basically, yeah.
1626
01:12:51,671 --> 01:12:53,890
- The CEOs?
- Yeah.
1627
01:12:53,891 --> 01:12:55,371
Okay.
1628
01:13:01,072 --> 01:13:03,116
Sometimes it feels like
1629
01:13:03,117 --> 01:13:07,294
I've been on this
just, like, endless journey
1630
01:13:07,295 --> 01:13:09,384
of trying to understand this.
1631
01:13:09,385 --> 01:13:12,388
You know, like
I'm climbing a mountain...
1632
01:13:15,608 --> 01:13:19,176
...and every time
I, like, get up a hill,
1633
01:13:19,177 --> 01:13:20,786
I think I've reached the top,
1634
01:13:20,787 --> 01:13:23,441
but it just keeps going
and going and going.
1635
01:13:24,574 --> 01:13:27,010
But from everything I've heard,
1636
01:13:27,011 --> 01:13:30,405
if I had to guess what's
at the top of Anxiety Mountain,
1637
01:13:30,406 --> 01:13:34,234
it's these five CEOs
from these five companies,
1638
01:13:34,235 --> 01:13:37,673
who are kind of like the
Oppenheimers of this moment.
1639
01:13:37,674 --> 01:13:39,501
These are the guys
who are building this thing.
1640
01:13:39,502 --> 01:13:40,850
Yeah.
1641
01:13:40,851 --> 01:13:44,201
Like, is there a plan?
1642
01:13:44,202 --> 01:13:46,464
The head of the company
that makes ChatGPT
1643
01:13:46,465 --> 01:13:49,989
warned of possible
significant harm to the world.
1644
01:13:49,990 --> 01:13:52,427
I think, if this technology
goes wrong,
1645
01:13:52,428 --> 01:13:53,515
it can go quite wrong.
1646
01:13:55,822 --> 01:13:57,823
Five dudes.
1647
01:13:57,824 --> 01:14:01,610
I-I never thought about us
as a social media company.
1648
01:14:01,611 --> 01:14:06,092
It-it feels like I have to try
and-and find these guys...
1649
01:14:06,093 --> 01:14:07,224
Mm-hmm.
1650
01:14:07,225 --> 01:14:09,009
...and get them in the movie.
1651
01:14:10,315 --> 01:14:12,622
Certainly we'd get
some clarity from that.
1652
01:14:16,582 --> 01:14:19,192
I mean, the buck's got to stop
somewhere, right?
1653
01:14:34,034 --> 01:14:35,339
How's that feel?
1654
01:14:35,340 --> 01:14:37,733
- Great.
- That okay?
1655
01:14:37,734 --> 01:14:39,474
Yeah.
1656
01:14:39,475 --> 01:14:41,040
- Can I move the seat forward?
-Yes, you can.
1657
01:14:41,041 --> 01:14:42,215
It's, uh, it's just
a bit awkward.
1658
01:14:42,216 --> 01:14:43,608
I'm a little, like...
1659
01:14:43,609 --> 01:14:45,088
Dario, how do you feel
about that chair?
1660
01:14:45,089 --> 01:14:47,046
Um, yeah, the chair's good.
1661
01:14:47,047 --> 01:14:48,874
I just wanted to move it
a little forward. -Okay.
1662
01:14:48,875 --> 01:14:50,572
I picked out that chair.
1663
01:14:50,573 --> 01:14:52,182
- It's a good chair.
- Thanks.
1664
01:14:57,884 --> 01:14:59,495
And just sit down there.
1665
01:15:01,409 --> 01:15:03,106
And then through here,
1666
01:15:03,107 --> 01:15:04,411
we'll be looking at each other
through this glass.
1667
01:15:11,419 --> 01:15:13,595
And sitting back in the chair,
leaning forward?
1668
01:15:13,596 --> 01:15:15,161
Yeah. Well, whatever's
comfortable, I think. -Okay.
1669
01:15:15,162 --> 01:15:16,728
It's kind of cool.
There's a bunch of mirrors.
1670
01:15:16,729 --> 01:15:17,860
- Is that-- okay.
-Sam A., take one. Mark.
1671
01:15:17,861 --> 01:15:19,557
- How's that?
- Good. Thank you.
1672
01:15:19,558 --> 01:15:21,080
The genesis
of this project is that I was
1673
01:15:21,081 --> 01:15:22,865
sitting at home,
and I was playing around with,
1674
01:15:22,866 --> 01:15:24,431
I think, your image generator,
1675
01:15:24,432 --> 01:15:27,304
and I was simultaneously,
uh, terrified
1676
01:15:27,305 --> 01:15:29,480
and really impressed.
1677
01:15:29,481 --> 01:15:31,917
- That is the usual combination.
- And then cut to
1678
01:15:31,918 --> 01:15:34,659
my wife and I find out
we're expecting.
1679
01:15:34,660 --> 01:15:38,315
And-and I'm, like, having
an existential crisis
1680
01:15:38,316 --> 01:15:40,273
as my wife is
six months pregnant.
1681
01:15:40,274 --> 01:15:42,972
And I think my first question,
which I ask everybody:
1682
01:15:42,973 --> 01:15:45,496
Is now a terrible time
to have a kid?
1683
01:15:45,497 --> 01:15:47,411
Am I making a big mistake?
1684
01:15:47,412 --> 01:15:49,065
I'm expecting a kid
in March, too. My first one.
1685
01:15:49,066 --> 01:15:50,545
- You're expecting a kid?
- Yeah.
1686
01:15:50,546 --> 01:15:51,850
- You're expecting in March?
- First kid.
1687
01:15:51,851 --> 01:15:53,330
- Mazel tov.
- Thank you very much.
1688
01:15:53,331 --> 01:15:54,505
I've never been so excited
for anything.
1689
01:15:54,506 --> 01:15:56,507
- That's how I feel.
- Yeah.
1690
01:15:56,508 --> 01:15:58,248
But you're not scared?
1691
01:15:58,249 --> 01:16:02,774
I mean, like, having a kid is
just this momentous thing
1692
01:16:02,775 --> 01:16:05,037
that I, you know,
I stay up every night, like,
1693
01:16:05,038 --> 01:16:06,865
reading these books about
how to raise a kid,
1694
01:16:06,866 --> 01:16:08,258
and I hope
I'm gonna do a good job,
1695
01:16:08,259 --> 01:16:09,302
and it feels
very overwhelming and--
1696
01:16:09,303 --> 01:16:12,088
but I'm not scared...
1697
01:16:12,089 --> 01:16:14,090
for kids to grow up
in a world with AI.
1698
01:16:14,091 --> 01:16:16,353
Like, that's... that'll be okay.
1699
01:16:16,354 --> 01:16:19,051
That is good to hear
coming from the guy.
1700
01:16:19,052 --> 01:16:20,792
I think it's a wonderful idea
to have kids.
1701
01:16:20,793 --> 01:16:23,273
I-I think they're the most
magical, incredible thing.
1702
01:16:23,274 --> 01:16:25,318
There's so much uncertainty,
1703
01:16:25,319 --> 01:16:26,885
I would almost just do
what you're gonna do anyway.
1704
01:16:26,886 --> 01:16:28,974
I know that's not
a very satisfying answer,
1705
01:16:28,975 --> 01:16:31,934
but it's the only one
I can come up with.
1706
01:16:31,935 --> 01:16:34,501
Our kids are never gonna...
1707
01:16:34,502 --> 01:16:37,113
know a world that doesn't have
really advanced AI.
1708
01:16:37,114 --> 01:16:39,898
In fact, our kids will never
be smarter than an AI.
1709
01:16:39,899 --> 01:16:42,118
"Our kids will
never be smarter than an AI"?
1710
01:16:42,119 --> 01:16:44,381
Well, from a raw,
from a raw IQ perspective,
1711
01:16:44,382 --> 01:16:46,296
they will never be
smarter than an AI.
1712
01:16:46,297 --> 01:16:48,080
That notion doesn't
unsettle you a little bit?
1713
01:16:48,081 --> 01:16:51,170
'Cause it makes me feel
a little queasy in a weird way.
1714
01:16:51,171 --> 01:16:54,826
Um, it does unsettle me
a little bit.
1715
01:16:54,827 --> 01:16:56,306
But it is reality.
1716
01:16:56,307 --> 01:16:58,308
Okay, so race dynamics.
1717
01:16:58,309 --> 01:17:01,050
We have a bunch of people
who are all in agreement
1718
01:17:01,051 --> 01:17:02,921
that this is scary.
1719
01:17:02,922 --> 01:17:04,793
Based on the timelines that
a lot of people have given me,
1720
01:17:04,794 --> 01:17:07,056
we have between two months and
five years to figure this out.
1721
01:17:07,057 --> 01:17:08,710
- Yeah.
- And-and, I guess,
1722
01:17:08,711 --> 01:17:12,845
my biggest question is:
Why can't we just stop?
1723
01:17:14,499 --> 01:17:19,111
The problem with, uh, uh,
"just stopping" is that, um,
1724
01:17:19,112 --> 01:17:22,245
there are many, many groups now
around the world, uh, uh,
1725
01:17:22,246 --> 01:17:25,727
building this, for
many nations, many companies,
1726
01:17:25,728 --> 01:17:28,164
um, all with
different motivations.
1727
01:17:28,165 --> 01:17:30,819
There are some of
these companies in this space
1728
01:17:30,820 --> 01:17:34,039
who their position is, "We want
to develop this technology
1729
01:17:34,040 --> 01:17:36,085
absolutely as fast as possible."
1730
01:17:36,086 --> 01:17:40,089
And even if we could pass laws
in the US and in Europe,
1731
01:17:40,090 --> 01:17:42,439
we need to convince...
1732
01:17:42,440 --> 01:17:45,007
Xi Jinping and Vladimir Putin
or, you know,
1733
01:17:45,008 --> 01:17:47,487
whoever their scientific
advisors are on their side.
1734
01:17:47,488 --> 01:17:49,272
That's gonna be really hard.
1735
01:17:49,273 --> 01:17:50,969
I think it is true
1736
01:17:50,970 --> 01:17:53,624
that if two people are
in exactly the same place,
1737
01:17:53,625 --> 01:17:55,931
uh, the one willing to take
more shortcuts on safety
1738
01:17:55,932 --> 01:17:58,107
should kind of
"get there first."
1739
01:17:58,108 --> 01:18:01,110
Uh, but...
1740
01:18:01,111 --> 01:18:02,981
we're able to use our lead
1741
01:18:02,982 --> 01:18:06,028
to spend a lot more time
doing safety testing.
1742
01:18:06,029 --> 01:18:08,030
And-and what if you lose it?
1743
01:18:08,031 --> 01:18:09,640
You get the call
and you find out
1744
01:18:09,641 --> 01:18:11,729
that you're now, let's say,
six months behind.
1745
01:18:11,730 --> 01:18:14,297
- Wh-What happens then?
- Depends who we're behind to.
1746
01:18:14,298 --> 01:18:16,865
If it's, like,
an adversarial government,
1747
01:18:16,866 --> 01:18:18,867
that's probably really bad.
1748
01:18:18,868 --> 01:18:21,391
So let's say you get the call
that China has a,
1749
01:18:21,392 --> 01:18:24,829
has a recursively
self-improving agent
1750
01:18:24,830 --> 01:18:27,223
or something like that that we
should be really worried about.
1751
01:18:27,224 --> 01:18:29,226
What do-- what-what happens?
1752
01:18:30,967 --> 01:18:32,403
Um...
1753
01:18:35,406 --> 01:18:37,276
That case would require...
1754
01:18:37,277 --> 01:18:40,715
the first step there would be
to talk to the US government.
1755
01:18:40,716 --> 01:18:42,978
Sam, do you trust the
government's ability to handle
1756
01:18:42,979 --> 01:18:44,676
something like this?
1757
01:18:45,895 --> 01:18:47,809
Yeah, I do, actually.
1758
01:18:47,810 --> 01:18:49,593
There's other things I don't
trust the government to handle,
1759
01:18:49,594 --> 01:18:51,203
but that particular scenario,
I think they would know...
1760
01:18:51,204 --> 01:18:52,814
they-they-- yes.
1761
01:18:52,815 --> 01:18:54,467
When you have,
like, private discussions
1762
01:18:54,468 --> 01:18:56,426
with the other guys whose
fingers are on the trigger,
1763
01:18:56,427 --> 01:18:59,646
so to speak, um,
do those private discussions
1764
01:18:59,647 --> 01:19:01,344
fill you with confidence
1765
01:19:01,345 --> 01:19:03,389
or do they make you,
uh, more anxious?
1766
01:19:03,390 --> 01:19:07,132
Look, I mean, I-I know some
of them better than others.
1767
01:19:07,133 --> 01:19:10,179
Um, I have more confidence
in some of them th-than I have
1768
01:19:10,180 --> 01:19:12,398
in others, you know,
as with, as with any people.
1769
01:19:12,399 --> 01:19:14,183
You know,
you think about, you know,
1770
01:19:14,184 --> 01:19:15,880
the kids in your high school
class or something, right?
1771
01:19:15,881 --> 01:19:18,317
You know, some of them are,
you know, really sweet.
1772
01:19:18,318 --> 01:19:20,450
Some of them are well-meaning
but not that effective.
1773
01:19:20,451 --> 01:19:22,408
Some of them are,
you know, bullies.
1774
01:19:22,409 --> 01:19:23,801
Some of them are
really bad people.
1775
01:19:23,802 --> 01:19:25,716
You-you kind of really,
1776
01:19:25,717 --> 01:19:27,109
you really see the spread.
1777
01:19:27,110 --> 01:19:28,850
You know, am I, am I,
am I confident
1778
01:19:28,851 --> 01:19:30,590
that everyone's gonna do
the right thing,
1779
01:19:30,591 --> 01:19:32,070
that it's all gonna work out?
1780
01:19:32,071 --> 01:19:33,898
Um, no, I'm not.
1781
01:19:33,899 --> 01:19:35,552
And there's-there's nothing
I can do about that, right?
1782
01:19:35,553 --> 01:19:37,815
You know, all-all I can do
is-is push for the,
1783
01:19:37,816 --> 01:19:39,208
push for the, you know,
1784
01:19:39,209 --> 01:19:40,818
push for the government
to get involved.
1785
01:19:40,819 --> 01:19:42,689
But ultimately I'm just
one person there, too.
1786
01:19:42,690 --> 01:19:44,169
That's--
it's-it's up to all of us
1787
01:19:44,170 --> 01:19:46,084
to push for the government
to get involved.
1788
01:19:46,085 --> 01:19:48,521
That's the number one thing
that-that I think we need to do
1789
01:19:48,522 --> 01:19:50,959
to-to, you know, to set things
in the right direction.
1790
01:19:50,960 --> 01:19:52,787
What makes me anxious
about that is, like,
1791
01:19:52,788 --> 01:19:56,225
the basic reality that the
speed at which the technology
1792
01:19:56,226 --> 01:19:58,444
is proliferating and growing
is exponential,
1793
01:19:58,445 --> 01:20:01,491
and the mechanisms to legislate
are 300 years old,
1794
01:20:01,492 --> 01:20:02,971
- take forever.
- Yeah.
1795
01:20:02,972 --> 01:20:05,147
Um, I-I think
it's gonna be a heavy lift.
1796
01:20:05,148 --> 01:20:06,931
I-I definitely agree
with you on that.
1797
01:20:06,932 --> 01:20:08,933
You know,
what I'm literally looking for
1798
01:20:08,934 --> 01:20:11,283
is-is like,
"Here are steps that, like,
1799
01:20:11,284 --> 01:20:15,200
"the head honchos are gonna
take to-to focus on safety,
1800
01:20:15,201 --> 01:20:18,160
to mitigate the peril
and maximize the promise."
1801
01:20:18,161 --> 01:20:20,162
And I don't,
I don't, I don't know
1802
01:20:20,163 --> 01:20:21,859
that there's
a simple answer to that.
1803
01:20:21,860 --> 01:20:23,296
Um...
1804
01:20:25,951 --> 01:20:27,604
I mean, this maybe is
too simple,
1805
01:20:27,605 --> 01:20:30,215
but you...
1806
01:20:30,216 --> 01:20:32,000
create a new model,
1807
01:20:32,001 --> 01:20:34,829
you study and test it
very carefully.
1808
01:20:34,830 --> 01:20:37,266
You put it out
into the world gradually,
1809
01:20:37,267 --> 01:20:39,572
and then, more and more,
you understand
1810
01:20:39,573 --> 01:20:41,270
if that's safe or not,
and then if it is,
1811
01:20:41,271 --> 01:20:42,707
you can take the next step.
1812
01:20:44,404 --> 01:20:47,145
It doesn't sound as flashy
as, like, a brilliant scientist
1813
01:20:47,146 --> 01:20:50,496
coming up with one idea in a
lab to make an AI system, like,
1814
01:20:50,497 --> 01:20:54,022
perfectly safe and controllable
and everything else,
1815
01:20:54,023 --> 01:20:55,762
but it is what I believe
is gonna happen.
1816
01:20:55,763 --> 01:20:57,329
Like, it is the way
I think this works.
1817
01:20:57,330 --> 01:20:59,331
But let's just say
something terrible happens,
1818
01:20:59,332 --> 01:21:01,856
like a model gets loose
or goes rogue or something.
1819
01:21:01,857 --> 01:21:03,509
Is there a protocol?
1820
01:21:03,510 --> 01:21:05,860
Like, literally,
I'm imagining a red phone.
1821
01:21:05,861 --> 01:21:07,165
- Yeah.
- Sorry for thinking of this
1822
01:21:07,166 --> 01:21:08,558
in terms of movies, but, like...
1823
01:21:08,559 --> 01:21:10,038
There is a protocol.
1824
01:21:10,039 --> 01:21:11,517
Is there a red phone
on your desk? -No.
1825
01:21:11,518 --> 01:21:13,258
Is it a secret?
1826
01:21:13,259 --> 01:21:15,434
I mean, uh, no, it's not,
it's not as fancy or dramatic
1827
01:21:15,435 --> 01:21:17,219
as you, like, would hope,
but there's, like, you know--
1828
01:21:17,220 --> 01:21:18,916
we've, like, thought through
these scenarios,
1829
01:21:18,917 --> 01:21:20,570
and if this happens,
we're gonna call these people
1830
01:21:20,571 --> 01:21:22,006
in this order and do this
1831
01:21:22,007 --> 01:21:23,486
and kind of make
these decisions if, like--
1832
01:21:23,487 --> 01:21:26,097
I do believe that
when you have an opportunity
1833
01:21:26,098 --> 01:21:29,100
to do your thinking before
a stressful situation happens,
1834
01:21:29,101 --> 01:21:30,449
that's almost always
a good idea.
1835
01:21:30,450 --> 01:21:31,798
And writing it down is helpful.
1836
01:21:31,799 --> 01:21:33,279
- Being prepared is helpful.
- Yeah.
1837
01:21:34,324 --> 01:21:36,064
You...
1838
01:21:36,065 --> 01:21:38,805
It would be impossible
for me to sit across from you
1839
01:21:38,806 --> 01:21:41,199
and-and ask you to promise me
that this is gonna go well?
1840
01:21:41,200 --> 01:21:43,898
That is impossible.
1841
01:21:43,899 --> 01:21:45,725
There aren't any easy answers, unfortunately.
1842
01:21:45,726 --> 01:21:48,119
Uh, because it's such
a cutting-edge technology,
1843
01:21:48,120 --> 01:21:49,816
um, there's still
a lot of unknowns.
1844
01:21:49,817 --> 01:21:52,297
And I think that
that-that needs to be,
1845
01:21:52,298 --> 01:21:55,083
um, uh, you know, understood
1846
01:21:55,084 --> 01:21:59,000
and-and hence the need
for, uh, uh, some caution.
1847
01:21:59,001 --> 01:22:00,915
I wake up, you know, every day,
1848
01:22:00,916 --> 01:22:03,482
this is the, this is the
number one thing I think about.
1849
01:22:03,483 --> 01:22:05,180
Now, look, I'm human,
1850
01:22:05,181 --> 01:22:08,400
and, you know, has-has
every decision been perfect?
1851
01:22:08,401 --> 01:22:11,664
Can I even say my motivations
were always perfectly clear?
1852
01:22:11,665 --> 01:22:13,579
Of course not.
No one can say that.
1853
01:22:13,580 --> 01:22:16,408
Like, that's-that's
just not, like, you know--
1854
01:22:16,409 --> 01:22:18,671
that's-that's just not
how people work.
1855
01:22:18,672 --> 01:22:21,065
The-the history of science
tends to be that,
1856
01:22:21,066 --> 01:22:23,589
for better or for worse,
if something's possible to do--
1857
01:22:23,590 --> 01:22:27,028
and we now know AI is possible
to do-- humanity does it.
1858
01:22:27,029 --> 01:22:29,769
All of this
was-was going to happen.
1859
01:22:29,770 --> 01:22:32,337
This-this train
isn't gonna stop.
1860
01:22:32,338 --> 01:22:34,296
You can't step in front of
the train and stop it.
1861
01:22:34,297 --> 01:22:36,298
You're just gonna get squished.
1862
01:22:36,299 --> 01:22:38,388
I mean, it's very stressful.
1863
01:22:39,606 --> 01:22:41,085
You know, there's, like,
1864
01:22:41,086 --> 01:22:42,695
a lot of things
a lot of us don't know.
1865
01:22:42,696 --> 01:22:46,395
I think the history
of scientific discovery is
1866
01:22:46,396 --> 01:22:48,310
one of not knowing
what you don't know
1867
01:22:48,311 --> 01:22:49,702
and figuring out as you go.
1868
01:22:49,703 --> 01:22:53,010
Uh, but, yeah, it is a...
1869
01:22:53,011 --> 01:22:56,622
it is a stressful way to live.
1870
01:22:56,623 --> 01:22:58,798
Right. Sam, thank you
very much for doing this.
1871
01:22:58,799 --> 01:23:00,539
- And again, mazel tov.
- Thank you. And to you.
1872
01:23:00,540 --> 01:23:02,715
Thank you so much. Thanks, guys.
1873
01:23:47,848 --> 01:23:49,327
Hello.
1874
01:23:49,328 --> 01:23:50,589
Hey, Dad, how are you?
1875
01:23:50,590 --> 01:23:51,634
Good. How you doing?
1876
01:23:51,635 --> 01:23:53,114
You working? What are you up to?
1877
01:23:53,115 --> 01:23:55,290
I'm working on this AI film.
1878
01:23:55,291 --> 01:23:57,857
And how's it going?
1879
01:23:57,858 --> 01:24:00,469
You know, it... it's really...
1880
01:24:00,470 --> 01:24:02,210
-Hi, sweetie.
-Hi, Mom.
1881
01:24:02,211 --> 01:24:03,907
So what is
the premise of the film?
1882
01:24:03,908 --> 01:24:06,562
Is it a documentary?
Kev, don't use any more spices.
1883
01:24:06,563 --> 01:24:08,346
It's already over-spiced.
1884
01:24:08,347 --> 01:24:10,479
We're making chicken right now.
1885
01:24:10,480 --> 01:24:12,655
- Is it a documentary or what...
-It's about--
1886
01:24:12,656 --> 01:24:14,004
the movie's about
the end of the world.
1887
01:24:14,005 --> 01:24:15,832
The end of the world's coming,
1888
01:24:15,833 --> 01:24:17,616
and we're making a movie
about the end of the world.
1889
01:24:17,617 --> 01:24:19,879
-Really?
-Yeah.
1890
01:24:19,880 --> 01:24:21,620
Kind of a depressing
film, it sounds like.
1891
01:24:21,621 --> 01:24:23,492
Yeah.
1892
01:24:23,493 --> 01:24:29,019
I'm feeling a lot,
like-- this very acute anxiety.
1893
01:24:29,020 --> 01:24:31,413
It's so scary,
but there's got to be--
1894
01:24:31,414 --> 01:24:35,330
you know, have you been meeting
some-some supersmart people
1895
01:24:35,331 --> 01:24:37,810
that are giving you any answers?
1896
01:24:37,811 --> 01:24:39,508
That's what's
frustrating about it.
1897
01:24:39,509 --> 01:24:41,379
No one knows.
1898
01:24:41,380 --> 01:24:43,947
All I can say to that is that
1899
01:24:43,948 --> 01:24:49,909
every generation has had
something scary like this.
1900
01:24:49,910 --> 01:24:52,260
When I was born, it was
the Cuban Missile Crisis.
1901
01:24:54,306 --> 01:24:56,394
I was just scared that there
was going to be a nuclear war.
1902
01:24:56,395 --> 01:24:57,787
Yeah, but we didn't know
1903
01:24:57,788 --> 01:24:59,136
what they were gonna do and...
1904
01:24:59,137 --> 01:25:00,572
And the world didn't end.
1905
01:25:00,573 --> 01:25:01,660
Everyone woke up
the next morning,
1906
01:25:01,661 --> 01:25:03,445
and we're still doing our thing.
1907
01:25:03,446 --> 01:25:05,011
I'm very scared,
1908
01:25:05,012 --> 01:25:06,883
especially in
the context of, like,
1909
01:25:06,884 --> 01:25:09,190
you know, the baby and...
1910
01:25:09,191 --> 01:25:10,582
It's gonna be a learning curve.
1911
01:25:10,583 --> 01:25:12,106
You're gonna be okay.
1912
01:25:12,107 --> 01:25:13,107
You can't,
you can't think about
1913
01:25:13,108 --> 01:25:14,369
what you can't control, Daniel.
1914
01:25:14,370 --> 01:25:16,197
Just remember that.
1915
01:25:16,198 --> 01:25:17,720
I'm really, I'm really feeling
1916
01:25:17,721 --> 01:25:19,243
nervous and scared about it.
1917
01:25:19,244 --> 01:25:21,071
There's so much
that I can't control.
1918
01:25:21,072 --> 01:25:23,900
Don't be nervous.
You can't let that get to you.
1919
01:25:23,901 --> 01:25:26,294
You can only control
what you can control,
1920
01:25:26,295 --> 01:25:27,947
and that's all you can do.
1921
01:25:27,948 --> 01:25:30,646
You can't do more than that.
1922
01:25:30,647 --> 01:25:32,649
Write that down in your book.
1923
01:25:49,274 --> 01:25:51,623
When you look back,
1924
01:25:51,624 --> 01:25:54,496
the world is always ending.
1925
01:25:54,497 --> 01:25:56,846
And when you look ahead,
1926
01:25:56,847 --> 01:25:59,283
the world is always ending.
1927
01:25:59,284 --> 01:26:02,199
...on fire.
One home is already on fire...
1928
01:26:02,200 --> 01:26:05,377
But the world
is always starting, too.
1929
01:26:07,988 --> 01:26:09,989
Are you ready?
1930
01:26:09,990 --> 01:26:11,992
Are you ever really ready?
1931
01:26:15,909 --> 01:26:17,693
You want to drive
or you want me to drive?
1932
01:26:17,694 --> 01:26:18,782
You drive.
1933
01:26:20,479 --> 01:26:21,871
Okay.
1934
01:26:21,872 --> 01:26:23,742
Thank you, Maria.
1935
01:26:23,743 --> 01:26:26,310
It's gonna just
feel really crampy.
1936
01:26:26,311 --> 01:26:27,833
-Okay.
-You're ready?
1937
01:26:27,834 --> 01:26:29,095
Yes.
1938
01:26:30,968 --> 01:26:32,404
You could be in a
medical drama with all that...
1939
01:26:49,116 --> 01:26:51,075
...some pressure.
1940
01:26:52,685 --> 01:26:54,078
Just relax.
1941
01:27:02,129 --> 01:27:04,957
Hi, buddy. Ooh.
1942
01:27:04,958 --> 01:27:06,481
Hi, buddy.
1943
01:27:31,202 --> 01:27:33,725
It's gonna be a whole new world
1944
01:27:33,726 --> 01:27:35,032
for you, Daniel.
1945
01:27:37,208 --> 01:27:40,080
And I think you're gonna be
an amazing father.
1946
01:27:41,517 --> 01:27:44,388
That's for sure.
1947
01:27:44,389 --> 01:27:46,651
Oh.
1948
01:27:48,698 --> 01:27:50,787
You're gonna do a great job.
1949
01:27:51,788 --> 01:27:53,092
Kev, why are you crying?
1950
01:27:53,093 --> 01:27:54,659
I'm crying because...
1951
01:27:54,660 --> 01:27:57,096
I don't know. I just don't...
1952
01:27:57,097 --> 01:27:59,360
Dad, you're gonna
make me cry, too.
1953
01:27:59,361 --> 01:28:00,710
Why are you crying?
1954
01:28:03,190 --> 01:28:05,540
I just know that
you're gonna be an amazing dad.
1955
01:28:05,541 --> 01:28:07,150
I'm only gonna be a great dad
1956
01:28:07,151 --> 01:28:09,718
'cause I had a great dad.
1957
01:28:09,719 --> 01:28:12,329
I'm getting emotional,
that's all.
1958
01:28:12,330 --> 01:28:13,722
Aw.
1959
01:28:14,941 --> 01:28:16,594
Boy, oh, boy!
1960
01:28:16,595 --> 01:28:17,857
My goodness!
1961
01:28:18,902 --> 01:28:24,820
Look at those little cheekies.
Look at those little cheekies.
1962
01:28:24,821 --> 01:28:26,387
Yeah.
1963
01:28:26,388 --> 01:28:28,433
I know how to end this movie.
1964
01:28:29,478 --> 01:28:32,306
Babies.
1965
01:28:32,307 --> 01:28:35,309
The end of the movie is about
babies.
1966
01:28:35,310 --> 01:28:37,136
They're life-affirming.
1967
01:28:37,137 --> 01:28:38,964
They're exhausting.
1968
01:28:38,965 --> 01:28:40,836
They're hilarious.
1969
01:28:40,837 --> 01:28:42,707
And they're worth it.
1970
01:28:42,708 --> 01:28:46,494
This film isn't about the
inner workings of a technology.
1971
01:28:46,495 --> 01:28:48,278
It's not about
the billionaire CEOs.
1972
01:28:48,279 --> 01:28:50,236
It's not about the geopolitics.
1973
01:28:50,237 --> 01:28:53,631
It's not about the terrifying
future or the end of the world,
1974
01:28:53,632 --> 01:28:55,720
because my world
is just starting.
1975
01:28:55,721 --> 01:28:57,418
I'm building a crib.
1976
01:28:57,419 --> 01:28:59,333
Right here, right now.
1977
01:29:00,509 --> 01:29:02,379
AI is gonna change everything
1978
01:29:02,380 --> 01:29:05,643
in ways too powerful and
complex for us to understand.
1979
01:29:05,644 --> 01:29:08,952
And the future is not
for any of us to decide.
1980
01:29:10,736 --> 01:29:13,651
But what I can decide is to be
the best possible husband
1981
01:29:13,652 --> 01:29:17,742
for my wife and the best
possible dad for my son.
1982
01:29:17,743 --> 01:29:21,442
So whether our AI future is
a nightmarish dystopia
1983
01:29:21,443 --> 01:29:23,879
or the utopia
that we all dream of,
1984
01:29:23,880 --> 01:29:26,925
I'll at least know
that I did everything I could
1985
01:29:26,926 --> 01:29:30,146
to guide my family
through this AI revolution.
1986
01:29:30,147 --> 01:29:33,889
And no matter what,
we'll be facing it together.
1987
01:29:36,849 --> 01:29:40,548
So that's just our first idea.
1988
01:29:40,549 --> 01:29:42,854
How does this feel?
Are you feeling this?
1989
01:29:42,855 --> 01:29:45,291
Wait, I-- like, this is not--
this is a joke.
1990
01:29:45,292 --> 01:29:47,425
It's not actually
how you're gonna end it.
1991
01:29:50,036 --> 01:29:52,690
I mean, it's just an idea. Okay?
1992
01:29:52,691 --> 01:29:54,431
No, Daniel.
1993
01:29:54,432 --> 01:29:56,172
...uh, very, very dumb.
1994
01:29:56,173 --> 01:29:58,957
You've just spent, I don't
know, like, how many years
1995
01:29:58,958 --> 01:30:02,613
of our life working on this,
talking to every leading expert
1996
01:30:02,614 --> 01:30:05,007
on the planet about the subject,
1997
01:30:05,008 --> 01:30:07,705
and you're gonna end it
1998
01:30:07,706 --> 01:30:10,839
with some, like,
kumbaya bullshit?
1999
01:30:10,840 --> 01:30:13,537
There's an asteroid
headed to Earth.
2000
01:30:13,538 --> 01:30:15,539
What do you do? Just...
2001
01:30:15,540 --> 01:30:17,672
hold hands
and hope it works out okay?
2002
01:30:17,673 --> 01:30:19,456
Absolutely not.
2003
01:30:19,457 --> 01:30:22,328
We have to-- it ha--
it has to be...
2004
01:30:22,329 --> 01:30:24,419
The ending has to...
2005
01:30:25,855 --> 01:30:27,943
Okay, first thing:
2006
01:30:27,944 --> 01:30:31,468
AI is here,
and it's here to stay.
2007
01:30:31,469 --> 01:30:32,948
The shit's out of the horse,
2008
01:30:32,949 --> 01:30:35,430
but the horse is
gonna keep shitting.
2009
01:30:39,695 --> 01:30:41,609
You know, one of the basic
laws of history
2010
01:30:41,610 --> 01:30:44,046
is that nothing
really has a beginning
2011
01:30:44,047 --> 01:30:46,004
and nothing has any ending.
2012
01:30:46,005 --> 01:30:47,745
It just goes on.
2013
01:30:47,746 --> 01:30:50,574
AI is nowhere near
its full development.
2014
01:30:50,575 --> 01:30:52,837
Even if
the current AI bubble bursts,
2015
01:30:52,838 --> 01:30:54,622
humans are never going to stop
2016
01:30:54,623 --> 01:30:57,581
building more and more
powerful technology.
2017
01:30:57,582 --> 01:30:59,453
You can choose not to use AI
2018
01:30:59,454 --> 01:31:02,281
or participate in it, but it's
going to affect you anyway.
2019
01:31:02,282 --> 01:31:04,066
Okay, fantastic.
So we're screwed.
2020
01:31:04,067 --> 01:31:06,635
No, we're not
because of one simple thing.
2021
01:31:10,943 --> 01:31:13,031
This is not inevitable.
2022
01:31:13,032 --> 01:31:15,251
If we could just see it
clearly together,
2023
01:31:15,252 --> 01:31:18,384
the obvious response would be
to choose something different.
2024
01:31:18,385 --> 01:31:21,083
We need to very clearly
change the game
2025
01:31:21,084 --> 01:31:23,172
from a race to the bottom
into a race to the top.
2026
01:31:23,173 --> 01:31:27,481
The problem we need to
solve is not AI specifically.
2027
01:31:27,482 --> 01:31:31,267
It's the general question
of how do we build a society
2028
01:31:31,268 --> 01:31:33,922
that can deal with
powerful technology.
2029
01:31:33,923 --> 01:31:35,401
Because we're going to get
2030
01:31:35,402 --> 01:31:36,838
only more and more powerful
technology.
2031
01:31:36,839 --> 01:31:39,710
We need to upgrade our society,
2032
01:31:39,711 --> 01:31:41,538
and the first step is
coming together
2033
01:31:41,539 --> 01:31:43,409
- and demanding...
-Coordination.
2034
01:31:43,410 --> 01:31:47,065
Some form of international
cooperation or agreement about
2035
01:31:47,066 --> 01:31:48,545
what the norms should be.
2036
01:31:48,546 --> 01:31:49,938
You know, how should
they be deployed...
2037
01:31:49,939 --> 01:31:51,592
Like, real
international diplomacy
2038
01:31:51,593 --> 01:31:53,332
among the superpowers.
2039
01:31:53,333 --> 01:31:56,335
The Chinese are as worried
about it as the Americans.
2040
01:31:56,336 --> 01:31:57,772
I think it's difficult,
you know,
2041
01:31:57,773 --> 01:31:59,556
in the current
geopolitical climate,
2042
01:31:59,557 --> 01:32:01,297
- but I think it's necessary.
-Absolutely.
2043
01:32:01,298 --> 01:32:04,735
In the exact same way that
the last time that humanity
2044
01:32:04,736 --> 01:32:09,653
developed a technology
this dangerous...
2045
01:32:09,654 --> 01:32:12,438
...that required a complete,
unprecedented shift
2046
01:32:12,439 --> 01:32:15,572
to the structure of our world.
2047
01:32:15,573 --> 01:32:17,661
So we need to do that complete,
2048
01:32:17,662 --> 01:32:20,925
unprecedented shift again.
2049
01:32:20,926 --> 01:32:24,146
You know, we-we talk to people
who work at these AI companies,
2050
01:32:24,147 --> 01:32:25,364
and they say they want to do
something different,
2051
01:32:25,365 --> 01:32:26,670
but they need public pressure.
2052
01:32:26,671 --> 01:32:28,193
They need the government
to do something.
2053
01:32:28,194 --> 01:32:30,021
So then we go to DC,
and they say,
2054
01:32:30,022 --> 01:32:31,893
"Well, we need Silicon Valley
to do something different.
2055
01:32:31,894 --> 01:32:33,677
They're the ones who are gonna
come up with the guardrails."
2056
01:32:33,678 --> 01:32:35,723
And so everyone is pointing
the finger at someone else,
2057
01:32:35,724 --> 01:32:39,509
and what they agree on is
that we need public pressure
2058
01:32:39,510 --> 01:32:41,250
in order for something else
to happen.
2059
01:32:41,251 --> 01:32:42,860
And that's what you
2060
01:32:42,861 --> 01:32:44,296
and all the people
watching this movie can do.
2061
01:32:44,297 --> 01:32:45,341
We need to hold the leaders
2062
01:32:45,342 --> 01:32:47,125
in our governments
2063
01:32:47,126 --> 01:32:48,605
and the leaders of
these companies accountable.
2064
01:32:48,606 --> 01:32:51,042
Whichever country you're in,
2065
01:32:51,043 --> 01:32:52,914
let them know
that you're not happy
2066
01:32:52,915 --> 01:32:54,611
with the current status quo.
2067
01:32:54,612 --> 01:32:57,135
So, yeah, it's boring to say
call your congressperson.
2068
01:32:57,136 --> 01:32:59,398
I'm not saying you should
just do that, but, like,
2069
01:32:59,399 --> 01:33:00,965
we do have to do that.
2070
01:33:00,966 --> 01:33:02,445
Like, we do have to get
the government involved.
2071
01:33:02,446 --> 01:33:03,881
So we just
call them up and say,
2072
01:33:03,882 --> 01:33:06,014
"Hey, stop Big Tech
from ruining the world"?
2073
01:33:06,015 --> 01:33:08,930
No, but there are
tons of really obvious,
2074
01:33:08,931 --> 01:33:10,714
straightforward things
we can be demanding.
2075
01:33:10,715 --> 01:33:12,542
We need transparency.
2076
01:33:12,543 --> 01:33:14,283
We need to end the secrecy
that exists inside these labs,
2077
01:33:14,284 --> 01:33:16,677
because they are building
powerful technology,
2078
01:33:16,678 --> 01:33:18,722
and the public deserves to know
what's going on.
2079
01:33:18,723 --> 01:33:20,724
Ultimately,
we're gonna need independent,
2080
01:33:20,725 --> 01:33:23,292
objective third parties
to evaluate the systems.
2081
01:33:23,293 --> 01:33:27,078
We can't count on the companies
to grade their own homework.
2082
01:33:27,079 --> 01:33:31,213
If-if a company uses AI and-and
has AI interacting with you,
2083
01:33:31,214 --> 01:33:34,999
it should disclose that you are
interacting with an AI system.
2084
01:33:35,000 --> 01:33:38,655
Yeah. And-and also we need
a system that makes companies
2085
01:33:38,656 --> 01:33:41,876
legally liable for the
AI systems that they produce.
2086
01:33:41,877 --> 01:33:45,793
We need to make sure that there
are tests and safety standards
2087
01:33:45,794 --> 01:33:48,056
that are applied to everyone.
2088
01:33:48,057 --> 01:33:49,753
We need some ground rules,
2089
01:33:49,754 --> 01:33:51,581
and we need to keep adapting
those rules
2090
01:33:51,582 --> 01:33:54,628
at the speed that
the technology develops.
2091
01:33:54,629 --> 01:33:57,195
There is currently
more regulation
2092
01:33:57,196 --> 01:33:59,328
on selling a sandwich
to the public
2093
01:33:59,329 --> 01:34:02,940
than there is on building
potentially world-ending AGI.
2094
01:34:02,941 --> 01:34:04,725
And the last thing is
2095
01:34:04,726 --> 01:34:07,641
to upgrade ourselves.
2096
01:34:07,642 --> 01:34:09,773
This is not, like, the job
of, like, the safety team
2097
01:34:09,774 --> 01:34:11,775
at any given lab, or the CEO.
2098
01:34:11,776 --> 01:34:13,342
So, like, this is
everyone's job.
2099
01:34:13,343 --> 01:34:14,822
Don't, like, leave it
up to the AI experts.
2100
01:34:14,823 --> 01:34:16,998
Like, [bleep] that.
Like, this is the moment
2101
01:34:16,999 --> 01:34:19,304
that we are transitioning
from, like,
2102
01:34:19,305 --> 01:34:21,567
mostly human cognitive power
to, like, AI cognitive power,
2103
01:34:21,568 --> 01:34:23,134
and it affects everyone,
2104
01:34:23,135 --> 01:34:24,875
and I want people to be
in on that conversation.
2105
01:34:24,876 --> 01:34:26,921
And I would say, if you think
that AI will kill us all,
2106
01:34:26,922 --> 01:34:28,531
you should be working
in AI research
2107
01:34:28,532 --> 01:34:30,272
to make sure it doesn't,
because you do have
2108
01:34:30,273 --> 01:34:32,230
an enormous amount of agency.
2109
01:34:32,231 --> 01:34:33,884
Whoever you are,
2110
01:34:33,885 --> 01:34:35,886
you are an expert
in your own industry,
2111
01:34:35,887 --> 01:34:39,194
in your own school,
in your own family,
2112
01:34:39,195 --> 01:34:42,632
and it's up to you
how AI is used in your life.
2113
01:34:42,633 --> 01:34:45,026
Whether you want to join
your school board
2114
01:34:45,027 --> 01:34:47,898
or-or whether you want to ask
your employer
2115
01:34:47,899 --> 01:34:49,421
how they're using
AI technologies,
2116
01:34:49,422 --> 01:34:51,989
like, all of us can do the work.
2117
01:34:51,990 --> 01:34:56,124
A lot of unions have been
pretty effective at, like,
2118
01:34:56,125 --> 01:34:59,083
determining how they want
to interact with these systems.
2119
01:34:59,084 --> 01:35:02,739
Nurses unions, teacher unions.
2120
01:35:02,740 --> 01:35:04,610
I would love
if parents everywhere
2121
01:35:04,611 --> 01:35:06,177
went to the AI companies
and said,
2122
01:35:06,178 --> 01:35:08,049
"How can you be better
on this?" Including us.
2123
01:35:08,050 --> 01:35:11,226
So, I founded Encode Justice
when I was 15 years old.
2124
01:35:11,227 --> 01:35:14,882
We are the world's first
and largest army of young people
2125
01:35:14,883 --> 01:35:18,407
fighting for human-centered
artificial intelligence.
2126
01:35:18,408 --> 01:35:21,062
It doesn't matter who you are,
even the smallest actions help,
2127
01:35:21,063 --> 01:35:23,978
and even conversation starting
is really, really valuable.
2128
01:35:23,979 --> 01:35:25,980
So my job could be, like,
2129
01:35:25,981 --> 01:35:27,982
tell Bubby Lila about this
at dinner?
2130
01:35:27,983 --> 01:35:29,810
Honestly, yeah,
that's part of it.
2131
01:35:29,811 --> 01:35:31,681
And we're gonna have to do
2132
01:35:31,682 --> 01:35:34,466
a lot of things that
we haven't even thought of yet.
2133
01:35:34,467 --> 01:35:36,077
People are gonna look at
anything that we've outlined
2134
01:35:36,078 --> 01:35:37,295
and say, "That's not enough."
2135
01:35:37,296 --> 01:35:38,993
What matters is that the forces
2136
01:35:38,994 --> 01:35:40,951
that are working
towards solutions
2137
01:35:40,952 --> 01:35:44,476
start to exceed the forces that
are working against solutions.
2138
01:35:44,477 --> 01:35:46,435
Making the world better
has always been hard.
2139
01:35:46,436 --> 01:35:47,871
It has never been easy.
2140
01:35:47,872 --> 01:35:50,004
Like, there have been
many shitty things
2141
01:35:50,005 --> 01:35:51,962
that have happened in history,
and we've had--
2142
01:35:51,963 --> 01:35:53,747
like, people have had
to deal with that,
2143
01:35:53,748 --> 01:35:56,837
and then they've risen up
and changed it.
2144
01:35:56,838 --> 01:36:00,362
There are an insane number
of challenges ahead of us,
2145
01:36:00,363 --> 01:36:04,148
but if we can get past them,
2146
01:36:04,149 --> 01:36:08,109
we can unlock a future beyond
our wildest imagination.
2147
01:36:08,110 --> 01:36:10,459
We have to come together
2148
01:36:10,460 --> 01:36:14,289
and find the path between
the promise and the peril.
2149
01:36:14,290 --> 01:36:16,422
We can't be pessimists
or optimists.
2150
01:36:17,423 --> 01:36:19,425
We have to become something new.
2151
01:36:21,732 --> 01:36:24,516
A friend of mine calls me
an apocaloptimist.
2152
01:36:24,517 --> 01:36:26,083
"Apocaloptimist"?
2153
01:36:26,084 --> 01:36:28,303
I think that might be
my new favorite word.
2154
01:36:28,304 --> 01:36:29,957
MAN: Might be
the name of this movie.
2155
01:36:29,958 --> 01:36:31,306
It might be the name
of this movie.
2156
01:36:31,307 --> 01:36:34,222
- "Apocaloptimist."
- "Apocaloptimist."
2157
01:36:34,223 --> 01:36:35,876
Yeah.
2158
01:36:35,877 --> 01:36:37,312
I don't believe in doom.
2159
01:36:37,313 --> 01:36:39,096
I believe in the spirit of life,
2160
01:36:39,097 --> 01:36:42,708
uh, and I believe life is
about the capacity to act.
2161
01:36:42,709 --> 01:36:44,319
The capacity to relate,
2162
01:36:44,320 --> 01:36:46,365
the capacity to feel.
2163
01:36:51,022 --> 01:36:53,981
We have to double down
more and more
2164
01:36:53,982 --> 01:36:57,071
on those capacities
that we have as humans
2165
01:36:57,072 --> 01:37:00,161
that robotic systems
will never have.
2166
01:37:00,162 --> 01:37:04,339
It's time right now
to make those decisions about
2167
01:37:04,340 --> 01:37:07,516
how to guide it and support it
rather than dividing us.
2168
01:37:07,517 --> 01:37:10,301
It kind of sounds like
raising a kid.
2169
01:37:10,302 --> 01:37:12,477
That's what's up. Yeah.
2170
01:37:12,478 --> 01:37:14,523
AI may have
more raw intelligence
2171
01:37:14,524 --> 01:37:17,265
than our little human brains,
2172
01:37:17,266 --> 01:37:21,095
but we're so much more
than just our intelligence.
2173
01:37:21,096 --> 01:37:25,099
Intelligence is-is
the ability to solve problems.
2174
01:37:25,100 --> 01:37:28,320
Wisdom is the ability to know
which problems to solve.
2175
01:37:33,760 --> 01:37:34,935
It can go on your fridge.
2176
01:37:36,241 --> 01:37:38,155
So, don't give up.
2177
01:37:38,156 --> 01:37:39,809
Humanity has done
2178
01:37:39,810 --> 01:37:42,160
more difficult things than this
in its history.
2179
01:37:43,553 --> 01:37:46,381
It's just hard to convince
people that they should.
2180
01:37:54,912 --> 01:37:56,870
So, when I started
making this movie,
2181
01:37:56,871 --> 01:37:58,480
I would say that I was, like,
2182
01:37:58,481 --> 01:38:00,961
broadly a cynical asshole
about this whole thing.
2183
01:38:00,962 --> 01:38:03,137
Over the course
of making the film,
2184
01:38:03,138 --> 01:38:04,703
I've come to understand
that, like,
2185
01:38:04,704 --> 01:38:07,489
that's the only thing
we can't be.
2186
01:38:07,490 --> 01:38:09,404
...or anything else
you'd like to discuss?
2187
01:38:09,405 --> 01:38:11,406
No. I think we covered a lot.
2188
01:38:11,407 --> 01:38:13,060
- Yeah.
-Thank you so much.
2189
01:38:13,061 --> 01:38:15,540
Thanks.
2190
01:38:15,541 --> 01:38:16,933
This is a problem that's bigger
2191
01:38:16,934 --> 01:38:19,109
than any one person.
2192
01:38:19,110 --> 01:38:21,590
This will change the world in
ways that we don't understand.
2193
01:38:21,591 --> 01:38:23,505
That is all true.
2194
01:38:23,506 --> 01:38:25,594
Okay.
2195
01:38:25,595 --> 01:38:27,726
But what we do have agency over
2196
01:38:27,727 --> 01:38:30,120
is what we do about it.
2197
01:38:30,121 --> 01:38:32,601
As frontier AI grows
exponentially more capable...
2198
01:38:32,602 --> 01:38:34,472
And the reality
2199
01:38:34,473 --> 01:38:36,605
is that if we just decide
it's hopeless,
2200
01:38:36,606 --> 01:38:39,042
then it is hopeless.
2201
01:38:39,043 --> 01:38:40,783
...to put less stress
on planet Earth...
2202
01:38:40,784 --> 01:38:42,785
But if you decide
2203
01:38:42,786 --> 01:38:45,528
that you want to try...
2204
01:38:46,790 --> 01:38:48,138
...then you try.
2205
01:38:48,139 --> 01:38:49,967
And that's hard.
2206
01:38:51,273 --> 01:38:53,230
But you know what?
2207
01:38:53,231 --> 01:38:56,278
Big things seem impossible
before they actually happen.
2208
01:38:58,149 --> 01:39:01,456
But when they finally do happen,
2209
01:39:01,457 --> 01:39:05,286
it's because millions of people
took millions of actions
2210
01:39:05,287 --> 01:39:07,592
to make them happen.
2211
01:39:07,593 --> 01:39:10,334
And so...
2212
01:39:10,335 --> 01:39:12,424
we have to try.
2213
01:39:21,129 --> 01:39:23,565
There's too much at stake.
2214
01:39:37,232 --> 01:39:41,278
Look at the incredible changes
we've experienced and survived
2215
01:39:41,279 --> 01:39:42,888
from the Stone Age,
2216
01:39:42,889 --> 01:39:46,153
and yet even greater changes
are still to come.
2217
01:40:47,606 --> 01:40:50,217
♪ What will we do now? ♪
2218
01:40:50,218 --> 01:40:53,263
♪ We've lost it to trying ♪
2219
01:40:53,264 --> 01:40:55,135
♪ We've lost it ♪
2220
01:40:55,136 --> 01:40:56,746
♪ To trying ♪
2221
01:40:59,575 --> 01:41:02,272
♪ What will we do now? ♪
2222
01:41:02,273 --> 01:41:05,232
♪ We've lost it to trying ♪
2223
01:41:05,233 --> 01:41:07,016
♪ We've lost it ♪
2224
01:41:07,017 --> 01:41:08,714
♪ To trying ♪
2225
01:41:11,630 --> 01:41:14,154
♪ What can we say now? ♪
2226
01:41:14,155 --> 01:41:17,244
♪ Our mouths only lying ♪
2227
01:41:17,245 --> 01:41:18,680
♪ Our mouths ♪
2228
01:41:18,681 --> 01:41:20,683
♪ Only lying ♪
2229
01:41:25,122 --> 01:41:27,732
♪ What can we say now? ♪
2230
01:41:27,733 --> 01:41:30,866
♪ Our mouths only lying ♪
2231
01:41:30,867 --> 01:41:32,259
♪ Our mouths ♪
2232
01:41:32,260 --> 01:41:34,261
♪ Only lying ♪
2233
01:42:01,071 --> 01:42:03,768
♪ Give in and get out ♪
2234
01:42:03,769 --> 01:42:06,728
♪ We rise in the dying ♪
2235
01:42:06,729 --> 01:42:08,251
♪ We rise ♪
2236
01:42:08,252 --> 01:42:10,211
♪ In the dying ♪
2237
01:42:13,127 --> 01:42:15,824
♪ Give in and get out ♪
2238
01:42:15,825 --> 01:42:18,696
♪ We rise in the dying ♪
2239
01:42:18,697 --> 01:42:20,263
♪ We rise ♪
2240
01:42:20,264 --> 01:42:22,223
♪ In the dying ♪
2241
01:42:25,051 --> 01:42:27,705
♪ Give in and get out ♪
2242
01:42:27,706 --> 01:42:30,708
♪ We rise in the dying ♪
2243
01:42:30,709 --> 01:42:32,275
♪ We rise ♪
2244
01:42:32,276 --> 01:42:34,148
♪ In the dying ♪
2245
01:42:37,107 --> 01:42:39,761
♪ Give in and get out ♪
2246
01:42:39,762 --> 01:42:42,633
♪ We rise in the dying ♪
2247
01:42:42,634 --> 01:42:43,852
♪ We ♪
2248
01:43:22,761 --> 01:43:25,328
♪ Oh, oh, oh, oh, oh ♪
2249
01:43:25,329 --> 01:43:27,025
♪ Oh, oh, oh, oh ♪
2250
01:43:27,026 --> 01:43:28,462
♪ Oh, oh ♪
2251
01:43:28,463 --> 01:43:31,421
♪ Oh, oh, oh, oh, oh, oh ♪
2252
01:43:31,422 --> 01:43:32,988
♪ Oh, oh, oh, oh ♪
2253
01:43:32,989 --> 01:43:34,424
♪ Oh, oh ♪
2254
01:43:34,425 --> 01:43:37,079
♪ Oh, oh, oh, oh, oh, oh ♪
2255
01:43:37,080 --> 01:43:39,690
♪ What will we do now? ♪
2256
01:43:39,691 --> 01:43:42,693
♪ We've lost it to trying ♪
2257
01:43:42,694 --> 01:43:44,608
♪ We've lost it ♪
2258
01:43:44,609 --> 01:43:46,349
♪ To trying ♪
2259
01:43:46,350 --> 01:43:49,047
♪ Oh, oh, oh, oh, oh, oh ♪
2260
01:43:49,048 --> 01:43:51,746
♪ What will we do now? ♪
2261
01:43:51,747 --> 01:43:54,749
♪ We've lost it to trying ♪
2262
01:43:54,750 --> 01:43:56,620
♪ We've lost it ♪
2263
01:43:56,621 --> 01:43:58,448
♪ To trying. ♪