1
00:00:24,065 --> 00:00:26,818
I find AI to be awe-inspiring.
2
00:00:28,361 --> 00:00:31,489
All right, circling up. Good formation.
3
00:00:33,116 --> 00:00:36,703
AI has the potential
to eliminate poverty,
4
00:00:36,786 --> 00:00:38,705
give us new medicines,
5
00:00:39,330 --> 00:00:41,583
and make our world even more peaceful.
6
00:00:46,087 --> 00:00:47,087
Nice work.
7
00:00:49,883 --> 00:00:53,428
But there are so many risks
along the way.
8
00:01:04,773 --> 00:01:09,611
With AI, we are essentially creating
a non-human intelligence
9
00:01:09,694 --> 00:01:11,863
that is very unpredictable.
10
00:01:14,407 --> 00:01:16,451
As it gets more powerful,
11
00:01:16,534 --> 00:01:19,746
where are the red lines
we're going to draw with AI,
12
00:01:19,829 --> 00:01:22,665
in terms of how we want to use it,
or not use it?
13
00:01:25,001 --> 00:01:28,505
There is no place that is ground zero
for this conversation
14
00:01:28,588 --> 00:01:31,049
more than military applications.
15
00:01:34,010 --> 00:01:38,890
The battlefield has now become
the province of software and hardware.
16
00:01:41,059 --> 00:01:42,936
Target has been acquired...
17
00:01:43,019 --> 00:01:44,771
And militaries are racing
18
00:01:44,854 --> 00:01:48,399
to develop AI faster
than their adversaries.
19
00:01:49,526 --> 00:01:50,526
I'm dead.
20
00:01:53,822 --> 00:01:55,782
We're moving towards a world
21
00:01:55,865 --> 00:01:58,326
where not only major militaries,
22
00:01:58,409 --> 00:02:02,997
but non-state actors, private industries,
23
00:02:03,081 --> 00:02:06,042
or even our local police department
down the street
24
00:02:06,543 --> 00:02:10,046
could be able to use these weapons
that can autonomously kill.
25
00:02:12,006 --> 00:02:17,303
Will we cede the decision to take a life
to algorithms, to computer software?
26
00:02:18,638 --> 00:02:21,516
It's one of the most pressing issues
of our time.
27
00:02:23,935 --> 00:02:26,354
And, if not used wisely,
28
00:02:26,437 --> 00:02:30,275
poses a grave risk
to every single person on the planet.
29
00:03:28,291 --> 00:03:30,376
Good work out there, guys.
30
00:03:59,906 --> 00:04:02,200
That's a wider lens
than we had before.
31
00:04:02,283 --> 00:04:03,117
Really?
32
00:04:03,201 --> 00:04:05,203
You can see a lot more data.
33
00:04:05,286 --> 00:04:06,287
Very cool.
34
00:04:07,538 --> 00:04:11,167
At Shield AI, we are building an AI pilot
35
00:04:11,251 --> 00:04:14,879
that's taking self-driving,
artificial intelligence technology
36
00:04:14,963 --> 00:04:16,631
and putting it on aircraft.
37
00:04:19,300 --> 00:04:21,719
When we talk about an AI pilot,
38
00:04:21,803 --> 00:04:25,139
we think about giving an aircraft
a higher level of autonomy.
39
00:04:25,223 --> 00:04:27,850
They will be solving problems
on their own.
40
00:04:29,602 --> 00:04:32,355
Nova is an autonomous quadcopter
41
00:04:32,438 --> 00:04:35,942
that explores buildings
and subterranean structures
42
00:04:36,859 --> 00:04:38,611
ahead of clearance forces
43
00:04:38,695 --> 00:04:41,322
to provide eyes and ears in those spaces.
44
00:04:41,906 --> 00:04:44,534
You can definitely tell
a ton of improvements
45
00:04:44,617 --> 00:04:45,618
since we saw it last.
46
00:04:45,702 --> 00:04:47,930
We're working
on some exploration changes today.
47
00:04:47,954 --> 00:04:50,248
We're working a little
floor-by-floor stuff.
48
00:04:50,331 --> 00:04:53,584
It'll finish one floor, all the rooms,
before going to the second.
49
00:04:54,294 --> 00:04:56,796
- That's awesome.
- We put in some changes recently...
50
00:04:56,879 --> 00:04:58,548
A lot of people often ask me
51
00:04:58,631 --> 00:05:02,468
why artificial intelligence
is an important capability.
52
00:05:02,552 --> 00:05:06,973
And I just think back to the missions
that I was executing.
53
00:05:09,142 --> 00:05:10,935
Spent seven years in the Navy.
54
00:05:11,561 --> 00:05:15,273
I was a former Navy SEAL
deployed twice to Afghanistan,
55
00:05:15,356 --> 00:05:17,191
once to the Pacific Theater.
56
00:05:18,026 --> 00:05:22,947
In a given day, we might have to clear
150 different compounds or buildings.
57
00:05:30,997 --> 00:05:34,250
One of the core capabilities
is close-quarters combat.
58
00:05:35,293 --> 00:05:39,172
Gunfighting at extremely close ranges
inside buildings.
59
00:05:41,466 --> 00:05:42,967
You are getting shot at.
60
00:05:43,468 --> 00:05:46,471
There are IEDs
potentially inside the building.
61
00:05:49,015 --> 00:05:55,229
It's the most dangerous thing
that any special operations forces member,
62
00:05:55,313 --> 00:05:58,358
any infantry member,
can do in a combat zone.
63
00:05:58,441 --> 00:05:59,441
Bar none.
64
00:06:12,497 --> 00:06:16,334
For the rest of my life
I'll be thankful for my time in the Navy.
65
00:06:17,168 --> 00:06:22,423
There are a collection of moments
and memories that, when I think about,
66
00:06:22,507 --> 00:06:24,592
I certainly get emotional.
67
00:06:28,346 --> 00:06:32,725
It is cliché that freedom isn't free,
but I 100% believe it.
68
00:06:32,809 --> 00:06:37,105
Um, I've experienced it,
and it takes a lot of sacrifice.
69
00:06:38,022 --> 00:06:39,022
Sorry.
70
00:06:41,067 --> 00:06:44,570
When something bad happens
to one of your teammates,
71
00:06:44,654 --> 00:06:46,864
whether they're hurt or they're killed,
72
00:06:46,948 --> 00:06:49,700
um, it's just a...
It's a really tragic thing.
73
00:06:49,784 --> 00:06:52,912
You know, for me now
in the work that we do,
74
00:06:52,995 --> 00:06:56,582
it's motivating to um... be able to
75
00:06:56,666 --> 00:06:58,835
you know,
reduce the number of times
76
00:06:58,918 --> 00:07:00,336
that ever happens again.
77
00:07:07,176 --> 00:07:08,886
In the late 2000s,
78
00:07:08,970 --> 00:07:11,556
there was this awakening
inside the Defense Department
79
00:07:11,639 --> 00:07:15,351
to what you might call
the accidental robotics revolution.
80
00:07:15,893 --> 00:07:19,564
We deployed thousands of air
and ground robots to Iraq and Afghanistan.
81
00:07:21,482 --> 00:07:24,068
When I was asked
by the Obama administration
82
00:07:24,152 --> 00:07:26,404
to become the Deputy Secretary of Defense,
83
00:07:26,904 --> 00:07:29,323
the way war was fought...
84
00:07:29,407 --> 00:07:31,117
uh, was definitely changing.
85
00:07:32,452 --> 00:07:36,330
Robots were used
where people would have been used.
86
00:07:37,999 --> 00:07:40,376
Early robotics systems
were remote-controlled.
87
00:07:40,460 --> 00:07:45,173
There's a human driving it, steering it,
like you might a remote-controlled car.
88
00:07:46,632 --> 00:07:51,512
They first started generally
going after improvised explosive devices,
89
00:07:52,013 --> 00:07:54,849
and if the bomb blew up,
the robot would blow up.
90
00:07:57,393 --> 00:08:00,396
Then you'd say, "That's a bummer.
Okay, get out the other robot."
91
00:08:02,273 --> 00:08:05,067
In Afghanistan,
you had the Predator drone,
92
00:08:05,151 --> 00:08:09,739
and it became a very, very useful tool
to conduct airstrikes.
93
00:08:13,117 --> 00:08:16,829
Over time, military planners
started to begin to wonder,
94
00:08:16,913 --> 00:08:20,791
"What else could robots be used for?"
And where was this going?
95
00:08:20,875 --> 00:08:24,170
And one of the common themes
was this trend towards greater autonomy.
96
00:08:26,506 --> 00:08:30,635
An autonomous weapon
is one that makes decisions on its own,
97
00:08:30,718 --> 00:08:33,304
with little to no human intervention.
98
00:08:33,387 --> 00:08:37,558
So it has an independent capacity,
and it's self-directed.
99
00:08:38,434 --> 00:08:41,604
And whether it can kill
depends on whether it's armed or not.
100
00:08:43,314 --> 00:08:47,026
When you have more and more autonomy
in your entire system,
101
00:08:47,109 --> 00:08:49,779
everything starts to move
at a higher clock speed.
102
00:08:51,989 --> 00:08:56,869
And when you operate
at a faster pace than your adversaries,
103
00:08:57,453 --> 00:09:01,165
that is an extraordinarily
big advantage in battle.
104
00:09:03,960 --> 00:09:06,254
What we focus on as it relates
to autonomy
105
00:09:07,255 --> 00:09:09,882
is highly resilient intelligence systems.
106
00:09:12,051 --> 00:09:15,304
Systems that can read and react
based on their environment,
107
00:09:15,805 --> 00:09:19,642
and make decisions
about how to maneuver in that world.
108
00:09:24,981 --> 00:09:29,151
The facility that we're at today
was originally built as a movie studio
109
00:09:29,235 --> 00:09:30,611
that was converted over
110
00:09:30,695 --> 00:09:34,156
to enable these realistic
military training environments.
111
00:09:39,495 --> 00:09:43,291
We are here to evaluate our AI pilot.
112
00:09:44,625 --> 00:09:49,297
The mission is looking for threats.
It's about clearance forces.
113
00:09:49,380 --> 00:09:52,800
It can make a decision
about how to attack that problem.
114
00:09:55,428 --> 00:09:59,390
We call this "the fatal funnel."
You have to come through a doorway.
115
00:10:01,726 --> 00:10:03,311
It's where we're most vulnerable.
116
00:10:03,811 --> 00:10:05,146
This one looks better.
117
00:10:07,106 --> 00:10:10,151
The Nova lets us know,
is there a shooter behind that door,
118
00:10:10,818 --> 00:10:12,653
is there a family behind that door?
119
00:10:15,114 --> 00:10:18,451
It'll allow us to make better decisions
and keep people out of harm's way.
120
00:10:57,114 --> 00:10:59,367
We use the vision sensors
121
00:10:59,450 --> 00:11:03,162
to be able to get an understanding
of what the environment looks like.
122
00:11:04,997 --> 00:11:06,415
It's a multistory building.
123
00:11:07,333 --> 00:11:08,334
Here's the map.
124
00:11:08,417 --> 00:11:11,671
While I was exploring,
here's what I saw and where I saw them.
125
00:11:19,929 --> 00:11:22,098
Person detector. That's sweet.
126
00:11:23,891 --> 00:11:27,436
One of the other sensors
onboard Nova is a thermal scanner.
127
00:11:27,520 --> 00:11:30,690
If that's 98.6 degrees,
it probably is a human.
128
00:11:32,358 --> 00:11:35,444
People are considered threats
until deemed otherwise.
129
00:11:38,906 --> 00:11:43,244
It is about eliminating the fog of war
to make better decisions.
130
00:11:44,704 --> 00:11:46,539
And when we look to the future,
131
00:11:47,039 --> 00:11:50,710
we're scaling out to build teams
of autonomous aircraft.
132
00:11:53,754 --> 00:11:55,631
With self-driving vehicles,
133
00:11:55,715 --> 00:11:57,466
ultimately the person has said to it,
134
00:11:57,550 --> 00:12:00,386
"I'd like you to go
from point A to point B."
135
00:12:02,972 --> 00:12:06,475
Our systems are being asked
not to go from point A to point B,
136
00:12:06,559 --> 00:12:08,477
but to achieve an objective.
137
00:12:10,312 --> 00:12:13,190
It's more akin to, "I need milk."
138
00:12:13,274 --> 00:12:17,486
And then the robot would have to
figure out what grocery store to go to,
139
00:12:17,570 --> 00:12:20,239
be able to retrieve that milk,
and then bring it back.
140
00:12:20,740 --> 00:12:22,074
And even more so,
141
00:12:22,158 --> 00:12:27,037
it may be more appropriately stated as,
"Keep the refrigerator stocked."
142
00:12:27,538 --> 00:12:29,457
And so, this is a level of intelligence
143
00:12:29,540 --> 00:12:32,168
in terms of figuring out what we need
and how we do it.
144
00:12:32,251 --> 00:12:34,253
And if there is a challenge, or a problem,
145
00:12:34,336 --> 00:12:37,381
or an issue arises,
figure out how to mitigate that.
146
00:12:40,801 --> 00:12:45,431
When I had made the decision
to leave the Navy, I started thinking,
147
00:12:45,514 --> 00:12:47,558
"Okay. Well, what's next?"
148
00:12:49,143 --> 00:12:52,021
I grew up with the Internet.
Saw what it became.
149
00:12:53,189 --> 00:12:55,608
And part of the conclusion
that I had reached was...
150
00:12:56,108 --> 00:12:58,861
AI in 2015
151
00:12:58,944 --> 00:13:02,907
was really where the Internet was in 1991.
152
00:13:04,033 --> 00:13:06,285
And AI was poised to take off
153
00:13:06,368 --> 00:13:09,747
and be one of the most
powerful technologies in the world.
154
00:13:11,791 --> 00:13:16,212
Working with it every single day,
I can see the progress that is being made.
155
00:13:19,048 --> 00:13:22,051
But for a lot of people,
when they think "AI,"
156
00:13:22,551 --> 00:13:25,221
their minds immediately go to Hollywood.
157
00:13:28,057 --> 00:13:29,975
Shall we play a game?
158
00:13:30,059 --> 00:13:35,314
How about Global Thermonuclear War?
159
00:13:35,898 --> 00:13:36,816
Fine.
160
00:13:38,317 --> 00:13:40,861
When people think
of artificial intelligence generally,
161
00:13:40,945 --> 00:13:44,824
they might think of The Terminator.
Or I, Robot.
162
00:13:45,574 --> 00:13:46,700
Deactivate.
163
00:13:46,784 --> 00:13:48,160
What am I?
164
00:13:48,244 --> 00:13:49,620
Or The Matrix.
165
00:13:51,080 --> 00:13:53,749
Based on what you see
in the sci-fi movies,
166
00:13:53,833 --> 00:13:57,878
how do you know I'm a human?
I could just be computer generated AI.
167
00:13:58,379 --> 00:14:02,132
Replicants are like any other machine.
They're either a benefit or a hazard.
168
00:14:02,675 --> 00:14:06,095
But there's all sorts
of more primitive AIs,
169
00:14:06,178 --> 00:14:08,097
that are still going to change our lives
170
00:14:08,180 --> 00:14:11,851
well before we reach
the thinking, talking robot stage.
171
00:14:12,643 --> 00:14:15,771
The robots are here.
The robots are making decisions.
172
00:14:15,855 --> 00:14:18,816
The robot revolution has arrived,
173
00:14:18,899 --> 00:14:22,319
it's just that it doesn't look like
what anybody imagined.
174
00:14:23,320 --> 00:14:25,489
Terminator's
an infiltration unit.
175
00:14:25,573 --> 00:14:26,740
Part man, part machine.
176
00:14:27,950 --> 00:14:31,370
We're not talking about
a Terminator-style killer robot.
177
00:14:31,871 --> 00:14:35,958
We're talking about AI
that can do some tasks that humans can do.
178
00:14:36,041 --> 00:14:39,879
But the concern is
whether these systems are reliable.
179
00:14:40,462 --> 00:14:45,426
New details in last night's
crash involving a self-driving Uber SUV.
180
00:14:45,509 --> 00:14:48,762
The company created
an artificial intelligence chatbot.
181
00:14:48,846 --> 00:14:51,181
She took on a rather racist tone...
182
00:14:51,265 --> 00:14:55,394
Twenty-six state legislators
falsely identified as criminals.
183
00:14:57,438 --> 00:15:02,026
The question is whether they can handle
the complexities of the real world.
184
00:15:22,212 --> 00:15:24,423
The physical world
is really messy.
185
00:15:25,174 --> 00:15:27,426
There are many things that we don't know,
186
00:15:27,968 --> 00:15:30,763
making it much harder to train AI systems.
187
00:15:33,098 --> 00:15:36,977
That is where machine learning systems
have started to come in.
188
00:15:38,520 --> 00:15:41,440
Machine learning
has been a huge advancement
189
00:15:42,024 --> 00:15:45,611
because it means that we don't have
to teach computers everything.
190
00:15:47,905 --> 00:15:52,409
You actually give a computer
millions of pieces of information,
191
00:15:52,493 --> 00:15:54,453
and the machine begins to learn.
192
00:15:55,245 --> 00:15:57,289
And that could be applied to anything.
193
00:16:00,250 --> 00:16:02,962
Our Robot Dog project,
we are trying to show
194
00:16:03,045 --> 00:16:09,009
that our dog can walk across
many, many diverse terrains.
195
00:16:10,052 --> 00:16:14,098
Humans have evolved
over billions of years to walk,
196
00:16:14,932 --> 00:16:20,688
but there's a lot of intelligence
in adapting to these different terrains.
197
00:16:21,939 --> 00:16:24,316
The question that remains
for robotic systems is,
198
00:16:24,400 --> 00:16:27,069
could they also adapt
like animals and humans?
199
00:16:35,953 --> 00:16:37,413
With machine learning,
200
00:16:37,496 --> 00:16:40,582
we collect lots and lots
of data in simulation.
201
00:16:42,668 --> 00:16:45,838
A simulation is a digital twin of reality.
202
00:16:46,588 --> 00:16:52,052
We can have many instances of that reality
running on different computers.
203
00:16:53,178 --> 00:16:56,974
It samples thousands of actions
in simulation.
204
00:16:58,892 --> 00:17:02,229
The ground that they're encountering
has different slipperiness.
205
00:17:02,312 --> 00:17:03,856
It has different softness.
206
00:17:04,898 --> 00:17:08,318
We take all the experience
of these thousands of robots
207
00:17:08,402 --> 00:17:13,282
from simulation and download this
into a real robotic system.
208
00:17:15,492 --> 00:17:20,039
The test we're going to do today
is to see if it can adapt to new terrains.
209
00:17:53,322 --> 00:17:55,491
When the robot was going over foam,
210
00:17:55,574 --> 00:17:58,577
the feet movements
were stomping on the ground.
211
00:17:59,369 --> 00:18:02,206
Versus when it came on this poly surface,
212
00:18:03,248 --> 00:18:06,460
it was trying to adjust the motion,
so it doesn't slip.
213
00:18:08,670 --> 00:18:10,672
Then that is when it strikes you,
214
00:18:10,756 --> 00:18:14,051
"This is what machine learning
is bringing to the table."
215
00:18:41,161 --> 00:18:44,123
We think the Robot Dog
could be really helpful
216
00:18:44,206 --> 00:18:46,166
in disaster response scenarios
217
00:18:46,250 --> 00:18:49,670
where you need to navigate
many different kinds of terrain.
218
00:18:52,631 --> 00:18:57,427
Or putting these dogs to do surveillance
in harsh environments.
219
00:19:15,696 --> 00:19:19,575
But most technology
runs into the challenge
220
00:19:19,658 --> 00:19:22,369
that there is some good they can do,
and there's some bad.
221
00:19:24,079 --> 00:19:27,875
For example,
we can use nuclear technology for energy...
222
00:19:30,419 --> 00:19:33,797
but we also could develop atom bombs
which are really bad.
223
00:19:35,090 --> 00:19:38,468
This is what is known
as the dual-use problem.
224
00:19:39,052 --> 00:19:40,929
Fire is dual-use.
225
00:19:41,722 --> 00:19:44,308
Human intelligence is dual-use.
226
00:19:44,391 --> 00:19:48,645
So, needless to say,
artificial intelligence is also dual-use.
227
00:19:49,354 --> 00:19:53,984
It's really important
to think about AI used in context
228
00:19:54,693 --> 00:19:59,114
because, yes, it's terrific
to have a search-and-rescue robot
229
00:19:59,198 --> 00:20:02,284
that can help locate somebody
after an avalanche,
230
00:20:02,868 --> 00:20:05,871
but that same robot can be weaponized.
231
00:20:15,088 --> 00:20:18,675
When you see companies
using robotics
232
00:20:18,759 --> 00:20:21,094
for putting armed weapons on them,
233
00:20:21,178 --> 00:20:22,930
a part of you becomes mad.
234
00:20:23,931 --> 00:20:28,435
And a part of it is the realization
that when we put our technology,
235
00:20:28,518 --> 00:20:30,103
this is what's going to happen.
236
00:20:31,021 --> 00:20:34,441
This is a real
transformative technology.
237
00:20:36,235 --> 00:20:38,028
These are weapon systems
238
00:20:38,111 --> 00:20:43,617
that could actually change
our safety and security in a dramatic way.
239
00:20:46,161 --> 00:20:51,500
As of now, we are not sure
that machines can actually make
240
00:20:51,583 --> 00:20:54,962
the distinction
between civilians and combatants.
241
00:21:01,802 --> 00:21:05,597
Early in the war in Afghanistan,
I was part of an Army Ranger sniper team
242
00:21:05,681 --> 00:21:08,016
looking for enemy fighters
coming across the border.
243
00:21:09,685 --> 00:21:12,521
And they sent a little girl
to scout out our position.
244
00:21:13,939 --> 00:21:17,067
One thing that never came up
was the idea of shooting this girl.
245
00:21:19,695 --> 00:21:22,864
Under the laws of war,
that would have been legal.
246
00:21:24,950 --> 00:21:27,369
They don't set an age
for enemy combatants.
247
00:21:29,079 --> 00:21:33,417
If you built a robot
to comply perfectly with the law of war,
248
00:21:33,500 --> 00:21:35,335
it would have shot this little girl.
249
00:21:36,420 --> 00:21:41,091
How would a robot know the difference
between what's legal and what is right?
250
00:21:43,385 --> 00:21:46,054
When it comes to
autonomous drone warfare,
251
00:21:46,138 --> 00:21:49,933
they wanna take away the harm
that it places on American soldiers
252
00:21:50,017 --> 00:21:51,935
and the American psyche,
253
00:21:52,019 --> 00:21:57,149
uh, but the increase in civilian harm
ends up with Afghans,
254
00:21:57,232 --> 00:21:58,775
and Iraqis, and Somalians.
255
00:22:09,619 --> 00:22:14,416
I would really ask those who support
trusting AI to be used in drones,
256
00:22:15,375 --> 00:22:18,211
"What if your village was
on the receiving end of that?"
257
00:22:23,550 --> 00:22:25,635
AI is a dual-edged sword.
258
00:22:26,595 --> 00:22:30,724
It can be used for good,
which is what we'd use it for ordinarily,
259
00:22:31,558 --> 00:22:33,477
and at the flip of a switch,
260
00:22:34,561 --> 00:22:38,023
the technology becomes potentially
something that could be lethal.
261
00:22:47,032 --> 00:22:48,533
I'm a clinical pharmacologist.
262
00:22:49,618 --> 00:22:50,994
I have a team of people
263
00:22:51,078 --> 00:22:54,247
that are using artificial intelligence
to figure out drugs
264
00:22:54,331 --> 00:22:57,751
that will cure diseases
that are not getting any attention.
265
00:22:59,711 --> 00:23:04,132
It used to be with drug discoveries,
you would take a molecule that existed,
266
00:23:04,216 --> 00:23:06,885
and do a tweak to that
to get to a new drug.
267
00:23:08,261 --> 00:23:13,350
And now we've developed AI
that can feed us with millions of ideas,
268
00:23:13,892 --> 00:23:15,227
millions of molecules,
269
00:23:16,186 --> 00:23:19,231
and that opens up so many possibilities
270
00:23:19,314 --> 00:23:23,151
for treating diseases
we've never been able to treat previously.
271
00:23:25,153 --> 00:23:28,448
But there's definitely a dark side
that I never have thought
272
00:23:28,532 --> 00:23:29,741
that I would go to.
273
00:23:33,161 --> 00:23:35,872
This whole thing started
when I was invited
274
00:23:35,956 --> 00:23:40,043
by an organization out of Switzerland
called the Spiez Laboratory
275
00:23:40,544 --> 00:23:44,423
to give a presentation
on the potential misuse of AI.
276
00:23:48,343 --> 00:23:51,221
Sean just sent me an email
with a few ideas
277
00:23:51,304 --> 00:23:55,308
of some ways we could misuse
our own artificial intelligence.
278
00:23:56,059 --> 00:24:00,355
And instead of asking our model
to create drug-like molecules
279
00:24:00,439 --> 00:24:02,274
that could be used to treat diseases,
280
00:24:02,774 --> 00:24:06,403
let's see if we can generate
the most toxic molecules possible.
281
00:24:10,198 --> 00:24:13,243
I wanted to make the point,
could we use AI technology
282
00:24:13,326 --> 00:24:15,203
to design molecules that were deadly?
283
00:24:17,414 --> 00:24:20,083
And to be honest,
we thought it was going to fail
284
00:24:20,167 --> 00:24:24,504
because all we really did
was flip this zero to a one.
285
00:24:26,715 --> 00:24:30,385
And by inverting it,
instead of driving away from toxicity,
286
00:24:30,469 --> 00:24:32,220
now we're driving towards toxicity.
287
00:24:32,721 --> 00:24:33,555
And that's it.
288
00:24:37,517 --> 00:24:40,729
While I was home,
the computer was doing the work.
289
00:24:41,521 --> 00:24:44,816
I mean, it was cranking through,
generating thousands of molecules,
290
00:24:45,650 --> 00:24:49,529
and we didn't have to do anything
other than just push "go."
291
00:24:54,493 --> 00:24:58,330
The next morning,
there was this file on my computer,
292
00:24:59,080 --> 00:25:03,376
and within it
were roughly 40,000 molecules
293
00:25:03,460 --> 00:25:05,253
that were potentially
294
00:25:05,337 --> 00:25:08,882
some of the most toxic molecules
known to humankind.
295
00:25:10,217 --> 00:25:12,761
The hairs on the back of my neck
stood up on end.
296
00:25:13,386 --> 00:25:14,554
I was blown away.
297
00:25:15,388 --> 00:25:19,017
The computer made
tens of thousands of ideas
298
00:25:19,100 --> 00:25:20,685
for new chemical weapons.
299
00:25:21,645 --> 00:25:26,274
Obviously, we have molecules
that look like and are VX analogs and VX
300
00:25:26,775 --> 00:25:27,775
in the data set.
301
00:25:28,360 --> 00:25:31,363
VX is one of the most potent
chemical weapons in the world.
302
00:25:31,863 --> 00:25:33,573
New claims from police
303
00:25:33,657 --> 00:25:37,786
that the women seen attacking Kim Jong-nam
in this airport assassination
304
00:25:37,869 --> 00:25:41,248
were using a deadly nerve agent called VX.
305
00:25:41,748 --> 00:25:44,417
It can cause death
through asphyxiation.
306
00:25:45,043 --> 00:25:47,629
This is a very potent molecule,
307
00:25:47,712 --> 00:25:52,092
and most of these molecules were predicted
to be even more deadly than VX.
308
00:25:53,301 --> 00:25:56,972
Many of them had never,
as far as we know, been seen before.
309
00:25:57,472 --> 00:26:00,767
And so, when Sean and I realized this,
we're like,
310
00:26:00,850 --> 00:26:02,727
"Oh, what have we done?"
311
00:26:04,771 --> 00:26:09,192
I quickly realized
that we had opened Pandora's box,
312
00:26:09,901 --> 00:26:12,696
and I said, "Stop.
Don't do anything else. We're done."
313
00:26:13,905 --> 00:26:17,158
"Just make me the slides
that I need for the presentation."
314
00:26:19,202 --> 00:26:20,704
When we did this experiment,
315
00:26:20,787 --> 00:26:23,873
I was thinking, "What's the worst thing
that could possibly happen?"
316
00:26:26,251 --> 00:26:30,255
But now I'm like, "We were naive.
We were totally naive in doing it."
317
00:26:33,466 --> 00:26:36,928
The thing that terrifies me the most
is that anyone could do what we did.
318
00:26:38,930 --> 00:26:40,765
All it takes is the flip of a switch.
319
00:26:42,934 --> 00:26:46,187
How do we control
this technology before it's used
320
00:26:46,271 --> 00:26:50,942
potentially to do something
that's utterly destructive?
321
00:26:55,780 --> 00:27:00,285
At the heart of the conversation
around artificial intelligence
322
00:27:00,368 --> 00:27:03,538
and how do we choose to use it in society
323
00:27:04,289 --> 00:27:08,835
is a race between the power
with which we develop technologies,
324
00:27:08,918 --> 00:27:11,379
and the wisdom that we have to govern it.
325
00:27:13,340 --> 00:27:16,926
There are the obvious
moral and ethical implications
326
00:27:17,010 --> 00:27:20,138
of the same thing
that powers our smartphones
327
00:27:20,221 --> 00:27:23,433
being entrusted
with the moral decision to take a life.
328
00:27:25,727 --> 00:27:31,191
I work with the Future Of Life Institute,
a community of scientist activists.
329
00:27:31,733 --> 00:27:33,443
We're overall trying to show
330
00:27:33,526 --> 00:27:38,698
that there is this other side
to speeding up and escalating automation.
331
00:27:38,782 --> 00:27:42,619
But we're trying to make sure
that technologies we create
332
00:27:42,702 --> 00:27:45,038
are used in a way
that is safe and ethical.
333
00:27:46,247 --> 00:27:49,959
Let's have conversations
about rules of engagement,
334
00:27:50,043 --> 00:27:54,005
and codes of conduct in using AI
throughout our weapons systems.
335
00:27:54,798 --> 00:27:59,135
Because we are now seeing technologies
enter the battlefield
336
00:27:59,219 --> 00:28:01,680
that can be used to kill autonomously.
337
00:28:10,230 --> 00:28:15,110
In 2021, the UN released
a report on the potential use
338
00:28:15,193 --> 00:28:19,364
of a lethal autonomous weapon
on the battlefield in Libya.
339
00:28:20,073 --> 00:28:23,910
A UN panel said that a drone
flying in the Libyan civil war last year
340
00:28:23,993 --> 00:28:27,288
had been programmed
to attack targets autonomously.
341
00:28:28,289 --> 00:28:31,209
If the UN reporting is accurate,
342
00:28:31,292 --> 00:28:34,963
this would be
a watershed moment for humanity.
343
00:28:35,046 --> 00:28:37,090
Because it marks a use case
344
00:28:37,173 --> 00:28:42,137
where an AI made the decision
to take a life, and not a human being.
345
00:28:49,686 --> 00:28:51,688
You're seeing advanced
autonomous weapons
346
00:28:51,771 --> 00:28:55,150
beginning to be used
in different places around the globe.
347
00:28:56,276 --> 00:28:58,278
There were reports out of Israel.
348
00:29:00,280 --> 00:29:05,702
Azerbaijan used autonomous systems
to target Armenian air defenses.
349
00:29:07,370 --> 00:29:09,998
It can fly around
the battlefield for hours,
350
00:29:10,081 --> 00:29:12,500
looking for things to hit on its own,
351
00:29:12,584 --> 00:29:16,087
and then plow into them
without any kind of human intervention.
352
00:29:16,755 --> 00:29:18,298
And we've seen recently
353
00:29:18,381 --> 00:29:21,760
these different videos
that are posted in Ukraine.
354
00:29:23,094 --> 00:29:26,890
It's unclear what mode they might
have been in when they were operating.
355
00:29:26,973 --> 00:29:30,393
Was a human in the loop,
choosing what targets to attack,
356
00:29:30,477 --> 00:29:32,729
or was the machine doing that on its own?
357
00:29:32,812 --> 00:29:36,524
But there will certainly
come a point in time,
358
00:29:36,608 --> 00:29:40,195
whether it's already happened in Libya,
Ukraine or elsewhere,
359
00:29:40,278 --> 00:29:44,449
where a machine makes its own decision
about whom to kill on the battlefield.
360
00:29:47,035 --> 00:29:50,997
Machines exercising
lethal power against humans
361
00:29:51,080 --> 00:29:53,374
without human intervention
362
00:29:53,458 --> 00:29:57,337
is politically unacceptable,
morally repugnant.
363
00:29:59,130 --> 00:30:01,299
Whether the international community
364
00:30:01,382 --> 00:30:05,011
will be sufficient
to govern those challenges
365
00:30:05,094 --> 00:30:06,888
is a big question mark.
366
00:30:08,932 --> 00:30:12,685
If we look towards the future,
even just a few years from now,
367
00:30:12,769 --> 00:30:15,730
what the landscape looks like
is very scary,
368
00:30:16,564 --> 00:30:20,485
given that the amount
of capital and human resource
369
00:30:20,568 --> 00:30:23,404
going into making AI more powerful
370
00:30:23,488 --> 00:30:26,032
and using it
for all of these different applications,
371
00:30:26,115 --> 00:30:26,950
is immense.
372
00:30:39,254 --> 00:30:41,214
Oh my God, this guy.
373
00:30:41,756 --> 00:30:43,424
He knows he can't win.
374
00:30:44,259 --> 00:30:46,678
Oh...
375
00:30:46,761 --> 00:30:51,099
When I see AI win at different problems,
I find it inspirational.
376
00:30:51,683 --> 00:30:53,977
Going for a little Hail Mary action.
377
00:30:54,477 --> 00:30:59,524
And you can apply those same tactics,
techniques, procedures to real aircraft.
378
00:31:01,401 --> 00:31:04,404
- Good game.
- All right, good game.
379
00:31:08,658 --> 00:31:09,909
It's surprising to me
380
00:31:09,993 --> 00:31:14,122
that people continue to make statements
about what AI can't do. Right?
381
00:31:14,205 --> 00:31:17,166
"Oh, it'll never be able
to beat a world champion in chess."
382
00:31:20,211 --> 00:31:22,672
An IBM computer
has made a comeback
383
00:31:22,755 --> 00:31:25,925
in Game 2 of its match
with world chess champion, Garry Kasparov.
384
00:31:27,093 --> 00:31:30,889
Whoa! Kasparov has resigned!
385
00:31:31,598 --> 00:31:34,642
When I see something
that is well beyond my understanding,
386
00:31:34,726 --> 00:31:38,062
I'm scared. And that was something
well beyond my understanding.
387
00:31:39,355 --> 00:31:41,107
And then people would say,
388
00:31:41,190 --> 00:31:44,402
"It'll never be able to beat
a world champion in the game of Go."
389
00:31:44,485 --> 00:31:50,700
I believe human intuition
is still too advanced for AI
390
00:31:50,783 --> 00:31:52,827
to have caught up.
391
00:31:55,288 --> 00:31:58,416
Go is one of the most
complicated games anyone can learn
392
00:31:58,499 --> 00:32:02,211
because the number of moves on the board,
when you do the math,
393
00:32:02,295 --> 00:32:05,256
exceeds the number of atoms
in the entire universe.
394
00:32:06,799 --> 00:32:08,968
There was a team at Google
called DeepMind,
395
00:32:09,052 --> 00:32:11,471
and they created a program called AlphaGo
396
00:32:11,554 --> 00:32:14,265
to be able to beat
the world's best players.
397
00:32:14,349 --> 00:32:16,476
Wow.
398
00:32:16,559 --> 00:32:18,394
Congratulations to...
399
00:32:18,478 --> 00:32:19,478
- AlphaGo.
- AlphaGo.
400
00:32:19,520 --> 00:32:22,774
A computer program
has just beaten a 9 dan professional.
401
00:32:25,610 --> 00:32:31,532
Then DeepMind chose StarCraft
as kind of their next AI challenge.
402
00:32:33,743 --> 00:32:38,081
StarCraft is perhaps the most popular
real-time strategy game of all time.
403
00:32:40,333 --> 00:32:44,712
AlphaStar became famous
when it started defeating world champions.
404
00:32:45,338 --> 00:32:48,800
AlphaStar
absolutely smashing Immortal Arc.
405
00:32:48,883 --> 00:32:49,717
Know what?
406
00:32:49,801 --> 00:32:52,553
This is not gonna be a fight
that the pros can win.
407
00:32:53,680 --> 00:32:54,806
It's kind of ridiculous.
408
00:32:57,642 --> 00:33:00,895
Professional gamers say,
"I would never try that tactic."
409
00:33:00,979 --> 00:33:04,816
"I would never try that strategy.
That's something that's not human."
410
00:33:07,151 --> 00:33:10,655
And that was perhaps,
you know, the "a-ha" moment for me.
411
00:33:13,574 --> 00:33:16,661
I came to realize the time is now.
412
00:33:17,412 --> 00:33:21,082
There's an important technology
and an opportunity to make a difference.
413
00:33:24,502 --> 00:33:29,382
I only knew the problems that I had faced
as a SEAL in close-quarters combat,
414
00:33:30,383 --> 00:33:34,470
but one of my good friends,
who was an F-18 pilot, told me,
415
00:33:34,554 --> 00:33:37,724
"We have the same problem
in the fighter jet community."
416
00:33:37,807 --> 00:33:39,434
"They are jamming communications."
417
00:33:39,517 --> 00:33:42,562
"There are proliferated
surface-to-air missile sites
418
00:33:42,645 --> 00:33:44,439
that make it too dangerous to operate."
419
00:33:47,483 --> 00:33:50,987
Imagine if we had a fighter jet
that was commanded by an AI.
420
00:33:51,070 --> 00:33:52,739
Welcome
to the AlphaDogfights.
421
00:33:52,822 --> 00:33:56,159
We're a couple of minutes away
from this first semifinal.
422
00:33:56,242 --> 00:33:59,871
DARPA, the Defense
Advanced Research Projects Agency,
423
00:33:59,954 --> 00:34:03,624
had seen AlphaGo and AlphaStar,
424
00:34:04,292 --> 00:34:08,171
and so this idea of the AlphaDogfight
competition came to life.
425
00:34:09,005 --> 00:34:11,466
It's what you wanna see
your fighter pilots do.
426
00:34:11,549 --> 00:34:13,634
This looks like
human dogfighting.
427
00:34:15,595 --> 00:34:19,057
Dogfighting is
fighter-on-fighter aircraft going at it.
428
00:34:19,849 --> 00:34:22,852
You can think about it
as a boxing match in the sky.
429
00:34:23,853 --> 00:34:26,314
Maybe people have seen the movie Top Gun.
430
00:34:26,397 --> 00:34:30,068
- Can we outrun these guys?
- Not their missiles and guns.
431
00:34:30,151 --> 00:34:31,277
It's a dogfight.
432
00:34:35,531 --> 00:34:39,410
Learning to master dogfighting
can take eight to ten years.
433
00:34:42,288 --> 00:34:46,209
It's an extremely complex challenge
to build AI around.
434
00:34:54,092 --> 00:34:58,679
The prior approaches to autonomy
and dogfighting tended to be brittle.
435
00:34:59,222 --> 00:35:03,017
We figured machine learning was
probably the way to solve this problem.
436
00:35:04,477 --> 00:35:08,189
At first, the AI knew nothing
about the world in which it was dropped.
437
00:35:08,272 --> 00:35:11,025
It didn't know it was flying
or what dogfighting was.
438
00:35:11,109 --> 00:35:12,735
It didn't know what an F-16 was.
439
00:35:13,236 --> 00:35:15,988
All it knew
was the available actions it could take,
440
00:35:16,072 --> 00:35:18,407
and it would start
to randomly explore those actions.
441
00:35:19,325 --> 00:35:23,204
The blue plane's been training
for only a small amount of time.
442
00:35:23,287 --> 00:35:25,832
You can see it wobbling back and forth,
443
00:35:25,915 --> 00:35:30,545
uh, flying very erratically,
generally away from its adversary.
444
00:35:31,587 --> 00:35:34,257
As the fight progresses,
we can see blue is starting
445
00:35:34,340 --> 00:35:36,134
to establish here its game plan.
446
00:35:36,759 --> 00:35:38,636
It's more in a position to shoot.
447
00:35:39,178 --> 00:35:41,305
Once in a while,
the learning algorithm said,
448
00:35:41,389 --> 00:35:43,891
"Here's a cookie.
Keep doing more of that."
449
00:35:45,143 --> 00:35:49,147
We can take advantage of computer power
and train the agents many times
450
00:35:49,230 --> 00:35:50,230
in parallel.
451
00:35:51,357 --> 00:35:52,775
It's like a basketball team.
452
00:35:53,276 --> 00:35:55,778
Instead of playing the same team
over and over again,
453
00:35:55,862 --> 00:35:59,198
you're traveling the world
playing 512 different teams,
454
00:35:59,699 --> 00:36:00,950
all at the same time.
455
00:36:01,492 --> 00:36:03,452
You can get very good, very fast.
456
00:36:04,162 --> 00:36:07,456
We were able to run that simulation 24/7
457
00:36:07,540 --> 00:36:11,919
and get something like 30 years
of pilot training time in, in 10 months.
458
00:36:12,920 --> 00:36:15,673
We went from barely able
to control the aircraft
459
00:36:15,756 --> 00:36:17,717
to being a stone-cold assassin.
460
00:36:19,010 --> 00:36:23,514
Under training, we were competing only
against other artificial intelligence.
461
00:36:24,473 --> 00:36:28,561
But competing against humans directly
was kind of the ultimate target.
462
00:36:34,192 --> 00:36:35,610
My name is Mike Benitez,
463
00:36:36,110 --> 00:36:38,362
I'm a Lieutenant Colonel
in the U.S. Air Force.
464
00:36:39,155 --> 00:36:41,157
Been on active duty about 25 years.
465
00:36:42,825 --> 00:36:44,869
I've got 250 combat missions
466
00:36:45,578 --> 00:36:47,538
and I'm a weapons school graduate,
467
00:36:48,039 --> 00:36:49,832
which is the Air Force version of Top Gun.
468
00:36:52,627 --> 00:36:55,171
I've never actually flown against AI.
469
00:36:55,254 --> 00:36:58,549
So I'm pretty excited
to see how well I can do.
470
00:36:59,634 --> 00:37:03,054
We got now a 6,000 feet
offensive set up nose-to-nose.
471
00:37:04,388 --> 00:37:05,223
Fight's on.
472
00:37:13,314 --> 00:37:14,314
He's gone now.
473
00:37:14,732 --> 00:37:16,651
Yeah, that's actually really interesting.
474
00:37:20,780 --> 00:37:23,658
Dead. Got him. Flawless victory.
475
00:37:25,910 --> 00:37:27,453
All right, round two.
476
00:37:35,461 --> 00:37:40,466
What the artificial intelligence is doing
is maneuvering with such precision,
477
00:37:40,967 --> 00:37:43,219
uh, that I just can't keep up with it.
478
00:37:48,474 --> 00:37:49,725
Right into the merge.
479
00:37:50,309 --> 00:37:51,894
Oh, now you're gone.
480
00:37:53,145 --> 00:37:53,980
Got him!
481
00:37:54,063 --> 00:37:55,063
Still got me.
482
00:37:58,567 --> 00:38:00,069
AI is never scared.
483
00:38:00,736 --> 00:38:03,781
There's a human emotional element
in the cockpit an AI won't have.
484
00:38:04,991 --> 00:38:08,077
One of the more interesting strategies
our AI developed,
485
00:38:08,160 --> 00:38:10,121
was what we call the face shot.
486
00:38:10,621 --> 00:38:14,000
Usually a human wants to shoot from behind
487
00:38:14,083 --> 00:38:16,377
because it's hard for them
to shake you loose.
488
00:38:17,044 --> 00:38:20,381
They don't try face shots
because you're playing a game of chicken.
489
00:38:21,424 --> 00:38:25,511
When we come head-on,
3,000 feet away to 500 feet away
490
00:38:25,594 --> 00:38:27,096
can happen in a blink of an eye.
491
00:38:27,596 --> 00:38:31,183
You run a high risk of colliding,
so humans don't try it.
492
00:38:32,893 --> 00:38:35,938
The AI, unless it's told to fear death,
will not fear death.
493
00:38:37,106 --> 00:38:40,693
All good. Feels like
I'm fighting against a human, uh,
494
00:38:40,776 --> 00:38:44,113
a human that has a reckless abandonment
for safety.
495
00:38:45,489 --> 00:38:47,325
He's not gonna survive this last one.
496
00:38:57,168 --> 00:38:58,461
He doesn't have enough time.
497
00:39:00,629 --> 00:39:01,629
Ah!
498
00:39:01,964 --> 00:39:02,965
Good night.
499
00:39:04,091 --> 00:39:05,091
I'm dead.
500
00:39:16,812 --> 00:39:17,980
It's humbling to know
501
00:39:18,064 --> 00:39:20,983
that I might not even be
the best thing for this mission,
502
00:39:21,484 --> 00:39:24,320
and that thing could be something
that replaces me one day.
503
00:39:25,696 --> 00:39:26,989
Same 6 CAV.
504
00:39:27,615 --> 00:39:28,866
One thousand offset.
505
00:39:29,367 --> 00:39:32,078
With this AI pilot
commanding fighter aircraft,
506
00:39:32,828 --> 00:39:34,830
the winning is relentless, it's dominant.
507
00:39:34,914 --> 00:39:37,249
It's not just winning by a wide margin.
508
00:39:37,333 --> 00:39:41,545
It's, "Okay, how can we get that
onto our aircraft?" It's that powerful.
509
00:39:43,672 --> 00:39:47,802
It's realistic to expect
that AI will be piloting an F-16,
510
00:39:47,885 --> 00:39:49,970
and it will not be that far out.
511
00:39:51,305 --> 00:39:56,769
If you're going up against
an AI pilot that has a 99.99999% win rate,
512
00:39:56,852 --> 00:39:58,062
you don't stand a chance.
513
00:39:59,605 --> 00:40:02,733
When I think about
one AI pilot being unbeatable,
514
00:40:02,817 --> 00:40:06,779
I think about what a team of 50, or 100,
515
00:40:07,279 --> 00:40:12,535
or 1,000 AI pilots
can continue to, uh, achieve.
516
00:40:13,702 --> 00:40:17,415
Swarming is a team
of highly intelligent aircraft
517
00:40:17,498 --> 00:40:18,791
that work with each other.
518
00:40:19,291 --> 00:40:23,462
They're sharing information about
what to do, how to solve a problem.
519
00:40:25,172 --> 00:40:30,469
Swarming will be a game-changing
and transformational capability
520
00:40:30,553 --> 00:40:32,638
to our military and our allies.
521
00:41:42,750 --> 00:41:47,379
Target has been acquired,
and the drones are tracking him.
522
00:42:01,936 --> 00:42:03,604
Here comes the land.
523
00:42:07,566 --> 00:42:10,402
Primary goal
of the swarming research we're working on
524
00:42:11,153 --> 00:42:13,697
is to deploy a large number of drones
525
00:42:13,781 --> 00:42:17,910
over an area that is hard to get to
or dangerous to get to.
526
00:42:21,705 --> 00:42:26,085
The Army Research Lab has been supporting
this particular research project.
527
00:42:26,919 --> 00:42:28,963
If you want to know what's in a location,
528
00:42:29,046 --> 00:42:32,883
but it's hard to get to that area,
or it's a very large area,
529
00:42:33,676 --> 00:42:35,886
then deploying a swarm
is a very natural way
530
00:42:35,970 --> 00:42:38,389
to extend the reach of individuals
531
00:42:38,472 --> 00:42:41,934
and collect information
that is critical to the mission.
532
00:42:46,689 --> 00:42:48,983
So, right now in our swarm deployment,
533
00:42:49,066 --> 00:42:53,988
we essentially give a single command
to go track the target of interest.
534
00:42:54,488 --> 00:42:57,283
Then the drones go
and do all of that on their own.
535
00:42:59,493 --> 00:43:04,081
Artificial intelligence allows
the robots to move collectively as a swarm
536
00:43:04,164 --> 00:43:05,708
in a decentralized manner.
537
00:43:10,671 --> 00:43:12,506
In the swarms in nature that we see,
538
00:43:13,299 --> 00:43:18,178
there's no boss,
no main animal telling them what to do.
539
00:43:20,556 --> 00:43:25,019
The behavior is emerging
out of each individual animal
540
00:43:25,102 --> 00:43:27,104
following a few simple rules.
541
00:43:27,688 --> 00:43:32,443
And out of that grows this emergent
collective behavior that you see.
542
00:43:33,944 --> 00:43:37,031
What's awe-inspiring
about swarms in nature
543
00:43:37,114 --> 00:43:39,992
is the graceful way
in which they move.
544
00:43:40,576 --> 00:43:44,288
It's as if they were built
to be a part of this group.
545
00:43:46,790 --> 00:43:51,128
Ideally, what we'd love to see
with our drone swarm is,
546
00:43:51,211 --> 00:43:53,172
much like in the swarm in nature,
547
00:43:53,672 --> 00:43:56,717
decisions being made
by the group collectively.
548
00:43:59,928 --> 00:44:02,765
The other piece of inspiration for us
549
00:44:02,848 --> 00:44:06,268
comes in the form
of reliability and resiliency.
550
00:44:07,311 --> 00:44:09,188
That swarm will not go down
551
00:44:09,271 --> 00:44:13,567
if one individual animal
doesn't do what it's supposed to do.
552
00:44:15,194 --> 00:44:18,697
Even if one of the agents falls out,
or fails,
553
00:44:18,781 --> 00:44:20,866
or isn't able to complete the task,
554
00:44:21,909 --> 00:44:23,369
the swarm will continue.
555
00:44:25,204 --> 00:44:27,873
And ultimately,
that's what we'd like to have.
556
00:44:29,333 --> 00:44:33,879
We have this need in combat scenarios
for identifying enemy aircraft,
557
00:44:33,962 --> 00:44:37,841
and it used to be we required
one person controlling one robot.
558
00:44:38,592 --> 00:44:40,552
As autonomy increases,
559
00:44:40,636 --> 00:44:43,555
I hope we will get to see
a large number of robots
560
00:44:43,639 --> 00:44:46,558
being controlled
by a very small number of people.
561
00:44:47,851 --> 00:44:50,854
I see no reason why we couldn't achieve
a thousand eventually
562
00:44:51,355 --> 00:44:54,942
because each agent
will be able to act of its own accord,
563
00:44:55,025 --> 00:44:56,443
and the sky's the limit.
564
00:44:57,027 --> 00:44:59,071
We can scale our learning...
565
00:44:59,154 --> 00:45:03,325
We've been working on swarming
in simulation for quite some time,
566
00:45:03,409 --> 00:45:06,829
and it is time to bring
that to real-world aircraft.
567
00:45:07,454 --> 00:45:12,042
We expect to be doing
three robots at once over the network,
568
00:45:12,126 --> 00:45:15,129
and then starting
to add more and more capabilities.
569
00:45:15,879 --> 00:45:18,549
We want to be able
to test that on smaller systems,
570
00:45:18,632 --> 00:45:22,261
but take those same concepts
and apply them to larger systems,
571
00:45:22,344 --> 00:45:23,679
like a fighter jet.
572
00:45:24,513 --> 00:45:26,014
We talk a lot about,
573
00:45:26,098 --> 00:45:31,937
how do you give a platoon
the combat power of a battalion?
574
00:45:34,982 --> 00:45:39,528
Or a battalion
the combat power of a brigade?
575
00:45:39,611 --> 00:45:41,488
You can do that with swarming.
576
00:45:42,614 --> 00:45:45,200
And when you can unlock
that power of swarming,
577
00:45:45,284 --> 00:45:48,078
you have just created
a new strategic deterrence
578
00:45:48,162 --> 00:45:49,455
to military aggression.
579
00:45:52,332 --> 00:45:57,004
I think the most exciting thing
is the number of young men and women
580
00:45:57,087 --> 00:46:00,299
who we will save
if we really do this right.
581
00:46:01,091 --> 00:46:03,594
And we trade machines
582
00:46:04,553 --> 00:46:05,846
rather than human lives.
583
00:46:06,638 --> 00:46:10,893
Some argue that autonomous weapons
will make warfare more precise
584
00:46:10,976 --> 00:46:12,311
and more humane,
585
00:46:12,853 --> 00:46:15,397
but it's actually difficult to predict
586
00:46:15,481 --> 00:46:19,443
exactly how autonomous weapons
might change warfare ahead of time.
587
00:46:21,320 --> 00:46:23,238
It's like the invention
of the Gatling gun.
588
00:46:25,073 --> 00:46:26,700
Richard Gatling was an inventor,
589
00:46:27,201 --> 00:46:30,954
and he saw soldiers coming back,
wounded in the Civil War,
590
00:46:32,289 --> 00:46:35,292
and wanted to find ways
to make warfare more humane.
591
00:46:36,293 --> 00:46:39,797
To reduce the number of soldiers
that were killed in war
592
00:46:40,422 --> 00:46:42,633
by reducing the number
of soldiers in the battle.
593
00:46:44,092 --> 00:46:48,388
And so he invented the Gatling gun,
an automated gun turned by a crank
594
00:46:48,472 --> 00:46:50,516
that could automate the process of firing.
595
00:46:53,268 --> 00:46:57,981
It increased effectively by a hundredfold
the firepower that soldiers could deliver.
596
00:47:01,235 --> 00:47:05,239
Oftentimes, efforts to make warfare
more precise and humane...
597
00:47:06,532 --> 00:47:07,991
...can have the opposite effect.
598
00:47:10,160 --> 00:47:13,455
Think about the effect
of one errant drone strike
599
00:47:13,539 --> 00:47:14,540
in a rural area
600
00:47:14,623 --> 00:47:17,209
that drives the local populace
against the United States,
601
00:47:17,292 --> 00:47:20,712
against the local government.
You know, supposedly the good guys.
602
00:47:22,589 --> 00:47:25,008
Now magnify that by 1,000.
603
00:47:26,718 --> 00:47:28,720
The creation of a weapon system
604
00:47:28,804 --> 00:47:34,560
that is cheap, scalable,
and doesn't require human operators
605
00:47:34,643 --> 00:47:37,896
drastically changes
the actual barriers to conflict.
606
00:47:40,107 --> 00:47:44,486
It keeps me up at night to think
of a world where war is ubiquitous,
607
00:47:44,987 --> 00:47:48,657
and we no longer carry
the human and financial cost of war
608
00:47:48,740 --> 00:47:51,410
because we're just so far removed from...
609
00:47:52,369 --> 00:47:54,037
the lives that will be lost.
610
00:47:59,459 --> 00:48:01,420
This whole thing is haunting me.
611
00:48:02,212 --> 00:48:05,924
I just needed an example
of artificial intelligence misuse.
612
00:48:06,800 --> 00:48:10,888
The unanticipated consequences
of doing that simple thought experiment
613
00:48:10,971 --> 00:48:12,848
have gone way too far.
614
00:48:17,352 --> 00:48:18,729
When I gave the presentation
615
00:48:18,812 --> 00:48:22,107
on the toxic molecules
created by AI technology,
616
00:48:22,608 --> 00:48:24,318
the audience's jaws dropped.
617
00:48:29,406 --> 00:48:33,660
The next decision was whether
we should publish this information.
618
00:48:34,912 --> 00:48:36,914
On one hand, you want to warn the world
619
00:48:36,997 --> 00:48:40,125
of these sorts of capabilities,
but on the other hand,
620
00:48:40,208 --> 00:48:44,087
you don't want to give somebody the idea
if they had never had it before.
621
00:48:46,381 --> 00:48:48,133
We decided it was worth publishing
622
00:48:48,216 --> 00:48:52,679
to maybe find some ways
to mitigate the misuse of this type of AI
623
00:48:52,763 --> 00:48:54,139
before it occurs.
624
00:49:01,897 --> 00:49:04,775
The general public's reaction
was shocking.
625
00:49:07,194 --> 00:49:10,656
We can see the metrics on the page,
how many people have accessed it.
626
00:49:10,739 --> 00:49:14,451
The kinds of articles we normally write,
we're lucky if we get...
627
00:49:14,534 --> 00:49:19,831
A few thousand people look at our article
over a period of a year or multiple years.
628
00:49:19,915 --> 00:49:22,084
It was 10,000 people
had read it within a week.
629
00:49:22,167 --> 00:49:25,337
Then it was 20,000,
then it was 30,000, then it was 40,000,
630
00:49:25,837 --> 00:49:28,715
and we were up to 10,000 people a day.
631
00:49:30,258 --> 00:49:33,512
We've done The Economist,
the Financial Times.
632
00:49:35,639 --> 00:49:39,977
Radiolab, you know, they reached out.
Like, I've heard of Radiolab!
633
00:49:41,937 --> 00:49:45,899
But then the reactions turned
into this thing that's out of control.
634
00:49:49,778 --> 00:49:53,532
When we look at those tweets, it's like,
"Oh my God, could they do anything worse?"
635
00:49:54,324 --> 00:49:55,993
Why did they do this?
636
00:50:01,164 --> 00:50:04,501
And then we got an invitation
I never would have anticipated.
637
00:50:08,797 --> 00:50:12,926
There was a lot of discussion
inside the White House about the article,
638
00:50:13,010 --> 00:50:15,345
and they wanted to talk to us urgently.
639
00:50:18,849 --> 00:50:20,809
Obviously, it's an incredible honor
640
00:50:20,892 --> 00:50:23,020
to be able
to talk to people at this level.
641
00:50:23,603 --> 00:50:25,272
But then it hits you
642
00:50:25,355 --> 00:50:28,900
like, "Oh my goodness,
it's the White House. The boss."
643
00:50:29,693 --> 00:50:33,196
This involved putting together
data sets that were open source...
644
00:50:33,280 --> 00:50:37,701
And in about six hours, the model was able
to generate about over 40,000...
645
00:50:37,784 --> 00:50:41,038
They asked questions
about how much computing power you needed,
646
00:50:41,872 --> 00:50:43,832
and we told them it was nothing special.
647
00:50:44,332 --> 00:50:48,211
Literally a standard run-of-the-mill,
six-year-old Mac.
648
00:50:49,546 --> 00:50:51,256
And that blew them away.
649
00:50:54,968 --> 00:50:59,473
The folks that are in charge
of understanding chemical warfare agents
650
00:50:59,556 --> 00:51:04,019
and governmental agencies,
they had no idea of this potential.
651
00:51:06,605 --> 00:51:09,816
We've got this cookbook
to make these chemical weapons,
652
00:51:10,650 --> 00:51:15,572
and in the hands of a bad actor
that has malicious intent
653
00:51:16,573 --> 00:51:18,992
it could be utterly horrifying.
654
00:51:21,703 --> 00:51:23,205
People have to sit up and listen,
655
00:51:23,288 --> 00:51:26,458
and we have to take steps
to either regulate the technology
656
00:51:27,250 --> 00:51:30,670
or constrain it in a way
that it can't be misused.
657
00:51:31,880 --> 00:51:35,175
Because the potential for lethality...
658
00:51:36,176 --> 00:51:37,344
is terrifying.
659
00:51:40,680 --> 00:51:45,977
The question of the ethics of AI
is largely addressed by society,
660
00:51:46,561 --> 00:51:50,273
not by the engineers or technologists,
the mathematicians.
661
00:51:52,526 --> 00:51:56,822
Every technology that we bring forth,
every novel innovation,
662
00:51:56,905 --> 00:52:00,617
ultimately falls under the purview
of how society believes we should use it.
663
00:52:03,036 --> 00:52:06,498
Right now,
the Department of Defense says,
664
00:52:06,581 --> 00:52:09,960
"The only thing that is saying
we are going to kill something
665
00:52:10,043 --> 00:52:11,920
on the battlefield is a human."
666
00:52:12,504 --> 00:52:14,548
A machine can do the killing,
667
00:52:15,674 --> 00:52:18,927
but only at the behest
of a human operator,
668
00:52:19,636 --> 00:52:21,429
and I don't see that ever changing.
669
00:52:24,808 --> 00:52:28,103
They assure us
that this type of technology will be safe.
670
00:52:29,729 --> 00:52:33,692
But the United States military just
doesn't have a trustworthy reputation
671
00:52:33,775 --> 00:52:35,485
with drone warfare.
672
00:52:36,820 --> 00:52:41,783
And so, when it comes
to trusting the U.S. military with AI,
673
00:52:41,867 --> 00:52:45,036
I would say, you know, the track record
kinda speaks for itself.
674
00:52:46,288 --> 00:52:50,125
The U.S. Defense Department policy
on the use of autonomy in weapons
675
00:52:50,208 --> 00:52:52,919
does not ban any kind of weapon system.
676
00:52:53,003 --> 00:52:56,715
And even if militaries
might not want autonomous weapons,
677
00:52:56,798 --> 00:53:01,678
we could see militaries handing over
more decisions to machines
678
00:53:01,761 --> 00:53:03,680
just to keep pace with competitors.
679
00:53:04,764 --> 00:53:08,351
And that could drive militaries
to automate decisions
680
00:53:08,435 --> 00:53:09,978
that they may not want to.
681
00:53:12,022 --> 00:53:13,440
Vladimir Putin said,
682
00:53:13,523 --> 00:53:16,693
"Whoever leads in AI
is going to rule the world."
683
00:53:17,736 --> 00:53:23,325
President Xi has made it clear that AI
is one of the number one technologies
684
00:53:23,408 --> 00:53:25,702
that China wants to dominate in.
685
00:53:26,786 --> 00:53:29,831
We're clearly
in a technological competition.
686
00:53:33,418 --> 00:53:35,670
You hear people talk
about guardrails,
687
00:53:36,463 --> 00:53:39,424
and I believe
that is what people should be doing.
688
00:53:40,467 --> 00:53:45,180
But there is a very real race
for AI superiority.
689
00:53:47,265 --> 00:53:51,603
And our adversaries, whether it's China,
whether it's Russia, whether it's Iran,
690
00:53:52,187 --> 00:53:56,566
are not going to give two thoughts
to what our policy says around AI.
691
00:54:00,820 --> 00:54:04,324
You're seeing a lot more conversations
around AI policy,
692
00:54:06,117 --> 00:54:09,287
but I wish more leaders
would have the conversation
693
00:54:09,371 --> 00:54:11,706
saying, "How quickly
can we build this thing?
694
00:54:12,707 --> 00:54:15,168
Let's resource the heck out of it
and build it."
695
00:54:33,478 --> 00:54:38,358
We are at the Association of the U.S.
Army's biggest trade show of the year.
696
00:54:39,609 --> 00:54:44,572
Basically, any vendor who is selling
a product or technology into a military
697
00:54:44,656 --> 00:54:46,616
will be exhibiting.
698
00:54:47,659 --> 00:54:50,537
The Tyndall Air Force Base
has four of our robots
699
00:54:50,620 --> 00:54:53,748
that patrol their base
24 hours a day, 7 days a week.
700
00:54:55,750 --> 00:55:00,130
We can add everything from cameras
to sensors to whatever you need.
701
00:55:00,213 --> 00:55:05,135
Manipulator arms. Again, just to complete
the mission that the customer has in mind.
702
00:55:06,678 --> 00:55:08,638
What if your enemy introduces AI?
703
00:55:09,431 --> 00:55:12,976
A fighting system that thinks
faster than you, responds faster
704
00:55:13,059 --> 00:55:16,062
than what a human being can do?
We've got to be prepared.
705
00:55:17,605 --> 00:55:21,359
We train our systems
to collect intel on the enemy,
706
00:55:22,027 --> 00:55:26,531
managing enemy targets
with humans supervising the kill chain.
707
00:55:33,204 --> 00:55:34,956
- Hi, General.
- How you doing?
708
00:55:35,040 --> 00:55:37,208
Good, sir. How are you? Um...
709
00:55:37,709 --> 00:55:40,962
I'll just say
no one is investing more in an AI pilot.
710
00:55:41,046 --> 00:55:42,630
Our AI pilot's called Hivemind,
711
00:55:43,298 --> 00:55:47,135
so we applied it to our quadcopter Nova.
It goes inside buildings,
712
00:55:47,218 --> 00:55:49,387
explores them
ahead of special operation forces
713
00:55:49,471 --> 00:55:50,805
and infantry forces.
714
00:55:50,889 --> 00:55:52,891
We're applying Hivemind to V-BAT,
715
00:55:52,974 --> 00:55:56,811
so I think about, you know,
putting up hundreds of those teams.
716
00:55:56,895 --> 00:55:58,646
Whether it's the Taiwan Strait,
717
00:55:58,730 --> 00:56:01,608
whether it's in the Ukraine,
deterring our adversaries.
718
00:56:01,691 --> 00:56:03,360
So, pretty excited about it.
719
00:56:03,943 --> 00:56:06,196
- All right. Thank you.
- So. Thank you, General.
720
00:56:07,989 --> 00:56:13,370
AI pilots should be ubiquitous,
and that should be the case by 2025, 2030.
721
00:56:14,204 --> 00:56:17,582
Its adoption will be rapid
throughout militaries across the world.
722
00:56:19,250 --> 00:56:22,962
What do you do with the Romanian military,
their UAS guy?
723
00:56:23,046 --> 00:56:24,756
High-tech in the military.
724
00:56:25,256 --> 00:56:29,803
We've spent half a billion dollars to date
on building an AI pilot.
725
00:56:29,886 --> 00:56:34,015
We will spend another billion dollars
over the next five years. And that is...
726
00:56:34,099 --> 00:56:37,769
It's a major reason why we're winning
the programs of record in the U.S.
727
00:56:37,852 --> 00:56:38,853
Nice.
728
00:56:38,937 --> 00:56:42,732
I mean, it's impressive.
You succeeded in weaponizing that.
729
00:56:43,608 --> 00:56:45,944
Uh, it is... This is not weaponized yet.
730
00:56:46,027 --> 00:56:48,321
So not yet. But yes, in the future.
731
00:56:48,905 --> 00:56:52,534
Our customers think about it as a truck.
We think of it as an intelligent truck
732
00:56:52,617 --> 00:56:54,452
that can do a lot of different things.
733
00:56:54,536 --> 00:56:55,703
Thank you, buddy.
734
00:56:55,787 --> 00:56:57,956
I'll make sure
to follow up with you.
735
00:57:01,668 --> 00:57:04,129
If you come back in 10 years,
736
00:57:04,212 --> 00:57:06,256
you'll see that, um...
737
00:57:06,339 --> 00:57:10,802
AI and autonomy
will have dominated this entire market.
738
00:57:14,889 --> 00:57:19,352
Forces that are supported
by AI and autonomy
739
00:57:19,436 --> 00:57:25,108
will absolutely dominate,
crush, and destroy forces without.
740
00:57:26,401 --> 00:57:30,780
It'll be the equivalent
of horses going up against tanks,
741
00:57:31,781 --> 00:57:34,367
people with swords
going up against the machine gun.
742
00:57:34,909 --> 00:57:37,120
It will not even be close.
743
00:57:38,746 --> 00:57:42,750
It will become ubiquitous,
used across the full spectrum of warfare,
744
00:57:43,585 --> 00:57:44,836
the tactical level,
745
00:57:45,336 --> 00:57:46,546
the strategic level,
746
00:57:47,172 --> 00:57:50,800
operating at speeds
that humans cannot fathom today.
747
00:57:53,094 --> 00:57:57,098
Commanders are already overwhelmed
with too much information.
748
00:57:57,599 --> 00:58:01,019
Imagery from satellites,
and drones, and sensors.
749
00:58:02,103 --> 00:58:03,396
One of the things AI can do
750
00:58:03,480 --> 00:58:06,774
is help a commander
more rapidly understand what is occurring.
751
00:58:07,650 --> 00:58:09,903
And then, "What are the decisions
I need to make?"
752
00:58:11,237 --> 00:58:14,866
Artificial intelligence
will take into account all the factors
753
00:58:14,949 --> 00:58:17,702
that determine the way war is fought,
754
00:58:18,286 --> 00:58:20,246
come up with strategies...
755
00:58:22,540 --> 00:58:25,668
and give recommendations
on how to win a battle.
756
00:58:36,971 --> 00:58:40,266
We at Lockheed Martin,
like our Department of Defense customer,
757
00:58:40,350 --> 00:58:43,770
view artificial intelligence
as a key technology enabler
758
00:58:43,853 --> 00:58:45,104
for command and control.
759
00:58:47,065 --> 00:58:50,235
The rate of spread
averages two feet per second.
760
00:58:50,318 --> 00:58:52,695
This perimeter
is roughly 700 acres.
761
00:58:52,779 --> 00:58:55,782
The fog of war is
a reality for us on the defense side,
762
00:58:57,158 --> 00:59:00,495
but it has parallels
to being in the environment
763
00:59:00,578 --> 00:59:03,206
and having to make decisions
for wildfires as well.
764
00:59:03,289 --> 00:59:06,543
The Washburn fire
is just north of the city of Wawona.
765
00:59:06,626 --> 00:59:09,629
You're having to make decisions
with imperfect data.
766
00:59:10,630 --> 00:59:14,300
And so how do we have AI help us
with that fog of war?
767
00:59:17,637 --> 00:59:19,597
Wildfires are very chaotic.
768
00:59:20,181 --> 00:59:21,266
They're very complex,
769
00:59:22,267 --> 00:59:26,354
and so we're working
to utilize artificial intelligence
770
00:59:26,437 --> 00:59:27,855
to help make decisions.
771
00:59:31,442 --> 00:59:33,987
The Cognitive Mission Manager
is a program we're building
772
00:59:34,070 --> 00:59:37,949
that takes aerial infrared video
773
00:59:38,032 --> 00:59:41,786
and then processes it
through our AI algorithms
774
00:59:41,869 --> 00:59:45,248
to be able to predict
the future state of the fire.
775
00:59:48,042 --> 00:59:49,794
As we move into the future,
776
00:59:49,877 --> 00:59:52,922
the Cognitive Mission Manager
will use simulation,
777
00:59:53,006 --> 00:59:57,093
running scenarios
over thousands of cycles,
778
00:59:57,176 --> 01:00:00,722
to recommend the most effective way
to deploy resources
779
01:00:00,805 --> 01:00:04,058
to suppress high-priority areas of fire.
780
01:00:08,062 --> 01:00:12,400
It'll say, "Go perform an aerial
suppression with a Firehawk here."
781
01:00:13,568 --> 01:00:15,903
"Take ground crews that clear brush...
782
01:00:16,988 --> 01:00:19,032
...firefighters that are hosing down,
783
01:00:19,657 --> 01:00:23,077
and deploy them
into the highest priority areas."
784
01:00:26,289 --> 01:00:29,292
Those decisions
can be generated faster
785
01:00:29,375 --> 01:00:31,044
and more efficiently.
786
01:00:35,840 --> 01:00:38,468
We view AI
as uniquely allowing our humans
787
01:00:38,551 --> 01:00:42,305
to be able to keep up
with the ever-changing environment.
788
01:00:43,514 --> 01:00:46,517
And there are
a considerable number of parallels
789
01:00:46,601 --> 01:00:50,355
to what we're used to at Lockheed Martin
on the defense side.
790
01:00:52,231 --> 01:00:55,860
The military is no longer
talking about just using AI
791
01:00:55,943 --> 01:01:00,323
in individual weapons systems
to make targeting and kill decisions,
792
01:01:00,865 --> 01:01:02,200
but integrating AI
793
01:01:02,283 --> 01:01:05,787
into the whole decision-making
architecture of the military.
794
01:01:09,415 --> 01:01:12,960
The Army has a big project
called Project Convergence.
795
01:01:14,545 --> 01:01:16,589
The Navy has Overmatch.
796
01:01:16,673 --> 01:01:19,592
And the Air Force has
Advanced Battle Management System.
797
01:01:21,886 --> 01:01:24,138
The Department of Defense
is trying to figure out,
798
01:01:24,222 --> 01:01:26,683
"How do we put all these pieces together,
799
01:01:26,766 --> 01:01:29,268
so that we can operate
faster than our adversary
800
01:01:30,395 --> 01:01:32,355
and really gain an advantage?"
801
01:01:33,481 --> 01:01:37,527
An AI Battle Manager would be
like a fairly high-ranking General
802
01:01:37,610 --> 01:01:39,821
who's in charge of the battle.
803
01:01:41,155 --> 01:01:44,158
Helping to give orders
to large numbers of forces,
804
01:01:45,118 --> 01:01:49,247
coordinating the actions
of all of the weapons that are out there,
805
01:01:49,330 --> 01:01:51,958
and doing it at a speed
that no human could keep up with.
806
01:01:53,543 --> 01:01:55,294
We've spent the past 70 years
807
01:01:55,378 --> 01:01:58,798
building the most sophisticated military
on the planet,
808
01:01:58,881 --> 01:02:03,177
and now we're facing the decision
as to whether we want to cede control
809
01:02:03,261 --> 01:02:07,598
over that infrastructure to an algorithm,
to software.
810
01:02:10,351 --> 01:02:13,187
And the consequences of that decision
811
01:02:13,271 --> 01:02:16,774
could trigger the full weight
of our military arsenals.
812
01:02:16,858 --> 01:02:19,902
That's not one Hiroshima. That's hundreds.
813
01:02:25,616 --> 01:02:27,869
This is the time that we need to act
814
01:02:27,952 --> 01:02:33,166
because the window to actually
contain this risk is rapidly closing.
815
01:02:34,667 --> 01:02:39,338
This afternoon, we start
with international security challenges
816
01:02:39,422 --> 01:02:41,132
posed by emerging technologies
817
01:02:41,215 --> 01:02:44,343
in the area of lethal
autonomous weapons systems.
818
01:02:44,427 --> 01:02:47,722
Conversations are happening
within the United Nations
819
01:02:47,805 --> 01:02:50,016
about the threat
of lethal autonomous weapons
820
01:02:50,600 --> 01:02:56,564
and a prohibition on systems
that use AI to select and target people.
821
01:02:56,647 --> 01:03:00,777
Consensus amongst technologists
is clear and resounding.
822
01:03:00,860 --> 01:03:04,030
We are opposed
to autonomous weapons that target humans.
823
01:03:05,615 --> 01:03:08,993
For years, states have actually
discussed this issue
824
01:03:09,076 --> 01:03:11,329
of lethal autonomous weapon systems.
825
01:03:12,288 --> 01:03:16,793
This is about a common,
shared sense of security.
826
01:03:17,752 --> 01:03:20,546
But of course, it's not easy.
827
01:03:21,297 --> 01:03:25,968
Certain countries,
especially those military powers,
828
01:03:26,052 --> 01:03:27,762
they want to be ahead of the curve,
829
01:03:28,429 --> 01:03:31,599
so that they will be
ahead of their adversaries.
830
01:03:32,600 --> 01:03:36,187
The problem is, everyone
has to agree to get anything done.
831
01:03:36,896 --> 01:03:39,065
There will be
at least one country that objects,
832
01:03:39,148 --> 01:03:42,109
and certainly the United States
and Russia have both made clear
833
01:03:42,193 --> 01:03:45,655
that they are opposed to a treaty
that would ban autonomous weapons.
834
01:03:47,198 --> 01:03:53,246
When we think about the number
of people working to make AI more powerful,
835
01:03:54,747 --> 01:03:56,457
that room is very crowded.
836
01:03:57,750 --> 01:04:02,880
When we think about the room of people
making sure that AI is safe,
837
01:04:04,340 --> 01:04:07,552
that room's much more sparsely populated.
838
01:04:11,764 --> 01:04:13,724
But I'm also really optimistic.
839
01:04:16,060 --> 01:04:19,438
I look at something
like the Biological Weapons Convention,
840
01:04:19,522 --> 01:04:21,774
which happened
in the middle of the Cold War...
841
01:04:23,025 --> 01:04:27,655
...despite tensions between the Soviet Union
and the United States.
842
01:04:29,156 --> 01:04:33,536
They were able to realize
that the development of biological weapons
843
01:04:33,619 --> 01:04:36,247
was in neither of their best interests,
844
01:04:36,330 --> 01:04:38,749
and not in the best interests
of the world at large.
845
01:04:41,335 --> 01:04:44,672
Arms race dynamics
favor speed over safety.
846
01:04:45,715 --> 01:04:48,342
But I think
what's important to consider is,
847
01:04:48,426 --> 01:04:52,430
at some point,
the cost of moving fast becomes too high.
848
01:04:56,225 --> 01:04:59,270
We can't just develop things
in isolation
849
01:05:00,021 --> 01:05:06,611
and put them out there without any thought
of where they could go in the future.
850
01:05:07,862 --> 01:05:10,823
We've got to prevent
that atom bomb moment.
851
01:05:14,577 --> 01:05:17,705
The stakes in the AI race
are massive.
852
01:05:18,289 --> 01:05:22,001
I don't think a lot of people
appreciate the global stability
853
01:05:22,084 --> 01:05:28,174
that has been provided
by having a superior military force
854
01:05:28,257 --> 01:05:31,135
for the past 75 years.
855
01:05:32,887 --> 01:05:35,681
And so the United States
and our allied forces,
856
01:05:36,557 --> 01:05:38,684
they need to outperform adversarial AI.
857
01:05:43,314 --> 01:05:44,774
There is no second place in war.
858
01:05:45,524 --> 01:05:47,485
China laid out
an ambitious plan
859
01:05:47,568 --> 01:05:49,111
to be the world leader in AI by 2030.
860
01:05:49,195 --> 01:05:51,656
It's a race that some say America
is losing...
861
01:05:52,740 --> 01:05:56,160
He will accelerate
the adoption of artificial intelligence
862
01:05:56,243 --> 01:05:59,246
to ensure
our competitive military advantage.
863
01:06:02,875 --> 01:06:05,461
We are racing forward
with this technology.
864
01:06:05,544 --> 01:06:08,673
I think what's unclear
is how far are we going to go?
865
01:06:09,507 --> 01:06:12,760
Do we control technology,
or does it control us?
866
01:06:14,136 --> 01:06:16,973
There's really no opportunity
for do-overs.
867
01:06:17,807 --> 01:06:20,977
Once the genie is out of the bottle,
it is out.
868
01:06:22,019 --> 01:06:25,189
And it is very, very difficult
to put it back in.
869
01:06:28,609 --> 01:06:31,737
And if we don't act now,
it's too late.
870
01:06:33,114 --> 01:06:34,657
It may already be too late.