1
00:00:06,047 --> 00:00:08,800
[cryptic music plays]
2
00:00:15,598 --> 00:00:17,100
[music intensifies]
3
00:00:24,065 --> 00:00:26,818
[woman 1] I find AI to be awe-inspiring.
4
00:00:28,361 --> 00:00:31,489
All right, circling up. Good formation.
5
00:00:33,116 --> 00:00:36,703
[woman 1] AI has the potential
to eliminate poverty,
6
00:00:36,786 --> 00:00:38,705
give us new medicines,
7
00:00:39,330 --> 00:00:41,583
and make our world even more peaceful.
8
00:00:46,087 --> 00:00:47,087
[man 1] Nice work.
9
00:00:49,883 --> 00:00:53,428
[woman 1] But there are so many risks
along the way.
10
00:00:54,304 --> 00:00:55,847
[music crescendos]
11
00:01:04,773 --> 00:01:09,611
With AI, we are essentially creating
a non-human intelligence
12
00:01:09,694 --> 00:01:11,863
that is very unpredictable.
13
00:01:14,407 --> 00:01:16,451
As it gets more powerful,
14
00:01:16,534 --> 00:01:19,746
where are the red lines
we're going to draw with AI,
15
00:01:19,829 --> 00:01:22,665
in terms of how we want to use it,
or not use it?
16
00:01:25,001 --> 00:01:28,505
There is no place that is ground zero
for this conversation
17
00:01:28,588 --> 00:01:31,049
more than military applications.
18
00:01:34,010 --> 00:01:38,890
The battlefield has now become
the province of software and hardware.
19
00:01:41,059 --> 00:01:42,936
[man 1] Target has been acquired...
20
00:01:43,019 --> 00:01:44,771
[woman 1] And militaries are racing
21
00:01:44,854 --> 00:01:48,399
to develop AI faster
than their adversaries.
22
00:01:48,483 --> 00:01:49,483
[beeping]
23
00:01:49,526 --> 00:01:50,526
[man 1] I'm dead.
24
00:01:50,944 --> 00:01:52,028
[tense music plays]
25
00:01:52,612 --> 00:01:53,738
[radio chatter]
26
00:01:53,822 --> 00:01:55,782
[woman 1] We're moving towards a world
27
00:01:55,865 --> 00:01:58,326
where not only major militaries,
28
00:01:58,409 --> 00:02:02,997
but non-state actors, private industries,
29
00:02:03,081 --> 00:02:06,042
or even our local police department
down the street
30
00:02:06,543 --> 00:02:10,046
could be able to use these weapons
that can autonomously kill.
31
00:02:12,006 --> 00:02:17,303
Will we cede the decision to take a life
to algorithms, to computer software?
32
00:02:18,638 --> 00:02:21,516
It's one of the most pressing issues
of our time.
33
00:02:23,935 --> 00:02:26,354
And, if not used wisely,
34
00:02:26,437 --> 00:02:30,275
poses a grave risk
to every single person on the planet.
35
00:02:31,317 --> 00:02:32,694
[buzzing]
36
00:02:35,905 --> 00:02:37,407
[music crescendos]
37
00:02:40,118 --> 00:02:42,871
[ominous music plays]
38
00:02:46,749 --> 00:02:48,751
[music crescendos]
39
00:02:56,467 --> 00:02:58,469
[flies buzzing]
40
00:03:00,972 --> 00:03:02,974
[electronic buzzing]
41
00:03:17,697 --> 00:03:19,657
[tense music plays]
42
00:03:19,741 --> 00:03:21,367
[indistinct radio chatter]
43
00:03:28,291 --> 00:03:30,376
[male voice] Good work out there, guys.
44
00:03:30,460 --> 00:03:31,836
[muffled radio chatter]
45
00:03:34,839 --> 00:03:36,841
[music intensifies]
46
00:03:37,342 --> 00:03:39,886
[indistinct radio chatter]
47
00:03:42,722 --> 00:03:43,722
[gunshots]
48
00:03:46,226 --> 00:03:47,227
[gunshot]
49
00:03:47,310 --> 00:03:48,310
[radio chatter]
50
00:03:55,109 --> 00:03:56,569
[laughter]
51
00:03:59,906 --> 00:04:02,200
[tech 1] That's a wider lens
than we had before.
52
00:04:02,283 --> 00:04:03,117
[man 2] Really?
53
00:04:03,201 --> 00:04:05,203
[tech 1] You can see a lot more data.
54
00:04:05,286 --> 00:04:06,287
Very cool.
55
00:04:07,538 --> 00:04:11,167
At Shield AI, we are building an AI pilot
56
00:04:11,251 --> 00:04:14,879
that's taking self-driving,
artificial intelligence technology
57
00:04:14,963 --> 00:04:16,631
and putting it on aircraft.
58
00:04:16,714 --> 00:04:18,800
[cryptic music plays]
59
00:04:19,300 --> 00:04:21,719
When we talk about an AI pilot,
60
00:04:21,803 --> 00:04:25,139
we think about giving an aircraft
a higher level of autonomy.
61
00:04:25,223 --> 00:04:27,850
They will be solving problems
on their own.
62
00:04:29,602 --> 00:04:32,355
Nova is an autonomous quadcopter
63
00:04:32,438 --> 00:04:35,942
that explores buildings
and subterranean structures
64
00:04:36,859 --> 00:04:38,611
ahead of clearance forces
65
00:04:38,695 --> 00:04:41,322
to provide eyes and ears in those spaces.
66
00:04:41,906 --> 00:04:44,534
You can definitely tell
a ton of improvements
67
00:04:44,617 --> 00:04:45,618
since we saw it last.
68
00:04:45,702 --> 00:04:47,930
[tech 2] We're working
on some exploration changes today.
69
00:04:47,954 --> 00:04:50,248
We're working a little
floor-by-floor stuff.
70
00:04:50,331 --> 00:04:53,584
It'll finish one floor, all the rooms,
before going to the second.
71
00:04:54,294 --> 00:04:56,796
- That's awesome.
- We put in some changes recently...
72
00:04:56,879 --> 00:04:58,548
A lot of people often ask me
73
00:04:58,631 --> 00:05:02,468
why artificial intelligence
is an important capability.
74
00:05:02,552 --> 00:05:06,973
And I just think back to the missions
that I was executing.
75
00:05:07,056 --> 00:05:09,058
[dramatic music plays]
76
00:05:09,142 --> 00:05:10,935
Spent seven years in the Navy.
77
00:05:11,561 --> 00:05:15,273
I was a former Navy SEAL
deployed twice to Afghanistan,
78
00:05:15,356 --> 00:05:17,191
once to the Pacific Theater.
79
00:05:18,026 --> 00:05:22,947
In a given day, we might have to clear
150 different compounds or buildings.
80
00:05:23,031 --> 00:05:24,574
[tense music plays]
81
00:05:30,997 --> 00:05:34,250
One of the core capabilities
is close-quarters combat.
82
00:05:35,293 --> 00:05:39,172
Gunfighting at extremely close ranges
inside buildings.
83
00:05:39,672 --> 00:05:40,672
[gunshot]
84
00:05:41,466 --> 00:05:42,967
You are getting shot at.
85
00:05:43,468 --> 00:05:46,471
There are IEDs
potentially inside the building.
86
00:05:47,764 --> 00:05:48,931
[yelling]
87
00:05:49,015 --> 00:05:55,229
It's the most dangerous thing
that any special operations forces member,
88
00:05:55,313 --> 00:05:58,358
any infantry member,
can do in a combat zone.
89
00:05:58,441 --> 00:05:59,441
Bar none.
90
00:06:01,152 --> 00:06:03,154
[somber music plays]
91
00:06:12,497 --> 00:06:16,334
For the rest of my life
I'll be thankful for my time in the Navy.
92
00:06:17,168 --> 00:06:22,423
There are a collection of moments
and memories that, when I think about them,
93
00:06:22,507 --> 00:06:24,592
I certainly get emotional.
94
00:06:28,346 --> 00:06:32,725
It is cliché that freedom isn't free,
but I 100% believe it.
95
00:06:32,809 --> 00:06:37,105
[quavers] Um, I've experienced it,
and it takes a lot of sacrifice.
96
00:06:38,022 --> 00:06:39,022
Sorry.
97
00:06:41,067 --> 00:06:44,570
When something bad happens
to one of your teammates,
98
00:06:44,654 --> 00:06:46,864
whether they're hurt or they're killed,
99
00:06:46,948 --> 00:06:49,700
um, it's just a...
It's a really tragic thing.
100
00:06:49,784 --> 00:06:52,912
You know, for me now
in the work that we do,
101
00:06:52,995 --> 00:06:56,582
it's motivating to um... be able to
102
00:06:56,666 --> 00:06:58,835
you know,
reduce the number of times
103
00:06:58,918 --> 00:07:00,336
that ever happens again.
104
00:07:03,965 --> 00:07:06,426
[ominous music plays]
105
00:07:07,176 --> 00:07:08,886
[man 3] In the late 2000s,
106
00:07:08,970 --> 00:07:11,556
there was this awakening
inside the Defense Department
107
00:07:11,639 --> 00:07:15,351
to what you might call
the accidental robotics revolution.
108
00:07:15,893 --> 00:07:19,564
We deployed thousands of air
and ground robots to Iraq and Afghanistan.
109
00:07:21,482 --> 00:07:24,068
[man 4] When I was asked
by the Obama administration
110
00:07:24,152 --> 00:07:26,404
to become the Deputy Secretary of Defense,
111
00:07:26,904 --> 00:07:29,323
the way war was fought...
112
00:07:29,407 --> 00:07:31,117
uh, was definitely changing.
113
00:07:32,452 --> 00:07:36,330
Robots were used
where people would have been used.
114
00:07:37,999 --> 00:07:40,376
[Paul] Early robotics systems
were remote-controlled.
115
00:07:40,460 --> 00:07:45,173
There's a human driving it, steering it,
like you might a remote-controlled car.
116
00:07:46,632 --> 00:07:51,512
[Bob] They first started generally
going after improvised explosive devices,
117
00:07:52,013 --> 00:07:54,849
and if the bomb blew up,
the robot would blow up.
118
00:07:57,393 --> 00:08:00,396
Then you'd say, "That's a bummer.
Okay, get out the other robot."
119
00:08:00,480 --> 00:08:02,190
[tense music plays]
120
00:08:02,273 --> 00:08:05,067
[woman 2] In Afghanistan,
you had the Predator drone,
121
00:08:05,151 --> 00:08:09,739
and it became a very, very useful tool
to conduct airstrikes.
122
00:08:13,117 --> 00:08:16,829
[Paul] Over time, military planners
started to wonder,
123
00:08:16,913 --> 00:08:20,791
"What else could robots be used for?"
And where was this going?
124
00:08:20,875 --> 00:08:24,170
And one of the common themes
was this trend towards greater autonomy.
125
00:08:26,506 --> 00:08:30,635
[woman 2] An autonomous weapon
is one that makes decisions on its own,
126
00:08:30,718 --> 00:08:33,304
with little to no human intervention.
127
00:08:33,387 --> 00:08:37,558
So it has an independent capacity,
and it's self-directed.
128
00:08:38,434 --> 00:08:41,604
And whether it can kill
depends on whether it's armed or not.
129
00:08:42,104 --> 00:08:43,231
[beeping]
130
00:08:43,314 --> 00:08:47,026
[Bob] When you have more and more autonomy
in your entire system,
131
00:08:47,109 --> 00:08:49,779
everything starts to move
at a higher clock speed.
132
00:08:51,989 --> 00:08:56,869
And when you operate
at a faster pace than your adversaries,
133
00:08:57,453 --> 00:09:01,165
that is an extraordinarily
big advantage in battle.
134
00:09:03,960 --> 00:09:06,254
[man 5] What we focus on as it relates
to autonomy
135
00:09:07,255 --> 00:09:09,882
is highly resilient intelligence systems.
136
00:09:09,966 --> 00:09:11,968
[buzzing]
137
00:09:12,051 --> 00:09:15,304
Systems that can read and react
based on their environment,
138
00:09:15,805 --> 00:09:19,642
and make decisions
about how to maneuver in that world.
139
00:09:24,981 --> 00:09:29,151
The facility that we're at today
was originally built as a movie studio
140
00:09:29,235 --> 00:09:30,611
that was converted over
141
00:09:30,695 --> 00:09:34,156
to enable these realistic
military training environments.
142
00:09:35,491 --> 00:09:37,994
[ominous music plays]
143
00:09:39,495 --> 00:09:43,291
We are here to evaluate our AI pilot.
144
00:09:44,625 --> 00:09:49,297
The mission is looking for threats.
It's about clearance forces.
145
00:09:49,380 --> 00:09:52,800
It can make a decision
about how to attack that problem.
146
00:09:55,428 --> 00:09:59,390
[man 6] We call this "the fatal funnel."
You have to come through a doorway.
147
00:10:01,726 --> 00:10:03,311
It's where we're most vulnerable.
148
00:10:03,811 --> 00:10:05,146
This one looks better.
149
00:10:07,106 --> 00:10:10,151
[man 6] The Nova lets us know,
is there a shooter behind that door,
150
00:10:10,818 --> 00:10:12,653
is there a family behind that door?
151
00:10:15,114 --> 00:10:18,451
It'll allow us to make better decisions
and keep people out of harm's way.
152
00:10:27,293 --> 00:10:29,128
[tense music plays]
153
00:10:47,730 --> 00:10:49,482
[music intensifies]
154
00:10:57,114 --> 00:10:59,367
[man 7] We use the vision sensors
155
00:10:59,450 --> 00:11:03,162
to be able to get an understanding
of what the environment looks like.
156
00:11:04,997 --> 00:11:06,415
It's a multistory building.
157
00:11:07,333 --> 00:11:08,334
Here's the map.
158
00:11:08,417 --> 00:11:11,671
While I was exploring,
here's what I saw and where I saw them.
159
00:11:16,967 --> 00:11:18,552
[music crescendos]
160
00:11:19,929 --> 00:11:22,098
[Brandon] Person detector. That's sweet.
161
00:11:22,181 --> 00:11:23,808
[tense music continues]
162
00:11:23,891 --> 00:11:27,436
[man 7] One of the other sensors
onboard Nova is a thermal scanner.
163
00:11:27,520 --> 00:11:30,690
If that's 98.6 degrees,
it probably is a human.
164
00:11:32,358 --> 00:11:35,444
People are considered threats
until deemed otherwise.
165
00:11:38,906 --> 00:11:43,244
It is about eliminating the fog of war
to make better decisions.
166
00:11:43,327 --> 00:11:44,620
[music builds]
167
00:11:44,704 --> 00:11:46,539
And when we look to the future,
168
00:11:47,039 --> 00:11:50,710
we're scaling out to build teams
of autonomous aircraft.
169
00:11:50,793 --> 00:11:52,128
[music crescendos]
170
00:11:52,211 --> 00:11:53,671
[low buzzing]
171
00:11:53,754 --> 00:11:55,631
[man 7] With self-driving vehicles,
172
00:11:55,715 --> 00:11:57,466
ultimately the person has said to it,
173
00:11:57,550 --> 00:12:00,386
"I'd like you to go
from point A to point B."
174
00:12:00,469 --> 00:12:02,471
[low buzzing]
175
00:12:02,972 --> 00:12:06,475
Our systems are being asked
not to go from point A to point B,
176
00:12:06,559 --> 00:12:08,477
but to achieve an objective.
177
00:12:08,561 --> 00:12:10,229
[cryptic music plays]
178
00:12:10,312 --> 00:12:13,190
It's more akin to, "I need milk."
179
00:12:13,274 --> 00:12:17,486
And then the robot would have to
figure out what grocery store to go to,
180
00:12:17,570 --> 00:12:20,239
be able to retrieve that milk,
and then bring it back.
181
00:12:20,740 --> 00:12:22,074
And even more so,
182
00:12:22,158 --> 00:12:27,037
it may be more appropriately stated as,
"Keep the refrigerator stocked."
183
00:12:27,538 --> 00:12:29,457
And so, this is a level of intelligence
184
00:12:29,540 --> 00:12:32,168
in terms of figuring out what we need
and how we do it.
185
00:12:32,251 --> 00:12:34,253
And if there is a challenge, or a problem,
186
00:12:34,336 --> 00:12:37,381
or an issue arises,
figure out how to mitigate that.
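
A hypothetical sketch of the distinction being drawn here: tasking by standing objective ("keep the refrigerator stocked") rather than by waypoint, with the system decomposing the goal and re-planning when a problem arises. All names, tasks, and world state below are illustrative assumptions, not anything from the film.

```python
# Hypothetical sketch: objective-level tasking vs. point-to-point commands.
# The robot decomposes a standing objective into subtasks on its own.

def plan(objective, world):
    if objective == "keep_fridge_stocked" and world["milk"] < 1:
        return ["choose_store", "navigate_to_store", "buy_milk",
                "return_home", "stock_fridge"]
    return []  # objective already satisfied

def execute(task, world):
    if task == "buy_milk" and world["store_out_of_stock"]:
        # A problem arises: mitigate by re-planning, not by giving up.
        return ["choose_other_store", "navigate_to_store", "buy_milk"]
    print("doing:", task)
    return []

world = {"milk": 0, "store_out_of_stock": False}
queue = plan("keep_fridge_stocked", world)
while queue:
    queue = execute(queue.pop(0), world) + queue
```
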
187
00:12:37,465 --> 00:12:39,467
[cryptic music continues]
188
00:12:40,801 --> 00:12:45,431
[Brandon] When I had made the decision
to leave the Navy, I started thinking,
189
00:12:45,514 --> 00:12:47,558
"Okay. Well, what's next?"
190
00:12:47,641 --> 00:12:49,059
[bell ringing]
191
00:12:49,143 --> 00:12:52,021
I grew up with the Internet.
Saw what it became.
192
00:12:53,189 --> 00:12:55,608
And part of the conclusion
that I had reached was...
193
00:12:56,108 --> 00:12:58,861
AI in 2015
194
00:12:58,944 --> 00:13:02,907
was really where the Internet was in 1991.
195
00:13:04,033 --> 00:13:06,285
And AI was poised to take off
196
00:13:06,368 --> 00:13:09,747
and be one of the most
powerful technologies in the world.
197
00:13:11,791 --> 00:13:16,212
Working with it every single day,
I can see the progress that is being made.
198
00:13:19,048 --> 00:13:22,051
But for a lot of people,
when they think "AI,"
199
00:13:22,551 --> 00:13:25,221
their minds immediately go to Hollywood.
200
00:13:25,304 --> 00:13:27,556
[beeping]
201
00:13:28,057 --> 00:13:29,975
[computer voice] Shall we play a game?
202
00:13:30,059 --> 00:13:35,314
How about Global Thermonuclear War?
203
00:13:35,898 --> 00:13:36,816
Fine.
204
00:13:36,899 --> 00:13:38,234
[dramatic music plays]
205
00:13:38,317 --> 00:13:40,861
[man 8] When people think
of artificial intelligence generally,
206
00:13:40,945 --> 00:13:44,824
they might think of The Terminator.
Or I, Robot.
207
00:13:45,574 --> 00:13:46,700
Deactivate.
208
00:13:46,784 --> 00:13:48,160
What am I?
209
00:13:48,244 --> 00:13:49,620
[man 8] Or The Matrix.
210
00:13:51,080 --> 00:13:53,749
Based on what you see
in the sci-fi movies,
211
00:13:53,833 --> 00:13:57,878
how do you know I'm a human?
I could just be computer generated AI.
212
00:13:58,379 --> 00:14:02,132
Replicants are like any other machine.
They're either a benefit or a hazard.
213
00:14:02,675 --> 00:14:06,095
[Andrew] But there's all sorts
of more primitive AIs,
214
00:14:06,178 --> 00:14:08,097
that are still going to change our lives
215
00:14:08,180 --> 00:14:11,851
well before we reach
the thinking, talking robot stage.
216
00:14:12,643 --> 00:14:15,771
[woman 3] The robots are here.
The robots are making decisions.
217
00:14:15,855 --> 00:14:18,816
The robot revolution has arrived,
218
00:14:18,899 --> 00:14:22,319
it's just that it doesn't look like
what anybody imagined.
219
00:14:23,320 --> 00:14:25,489
[film character] Terminator's
an infiltration unit.
220
00:14:25,573 --> 00:14:26,740
Part man, part machine.
221
00:14:27,950 --> 00:14:31,370
[man 9] We're not talking about
a Terminator-style killer robot.
222
00:14:31,871 --> 00:14:35,958
We're talking about AI
that can do some tasks that humans can do.
223
00:14:36,041 --> 00:14:39,879
But the concern is
whether these systems are reliable.
224
00:14:40,462 --> 00:14:45,426
[reporter 1] New details in last night's
crash involving a self-driving Uber SUV.
225
00:14:45,509 --> 00:14:48,762
The company created
an artificial intelligence chatbot.
226
00:14:48,846 --> 00:14:51,181
She took on a rather racist tone...
227
00:14:51,265 --> 00:14:55,394
[reporter 2] Twenty-six state legislators
falsely identified as criminals.
228
00:14:57,438 --> 00:15:02,026
The question is whether they can handle
the complexities of the real world.
229
00:15:02,109 --> 00:15:03,110
[birds chirping]
230
00:15:03,944 --> 00:15:06,238
[somber music plays]
231
00:15:22,212 --> 00:15:24,423
[man 10] The physical world
is really messy.
232
00:15:25,174 --> 00:15:27,426
There are many things that we don't know,
233
00:15:27,968 --> 00:15:30,763
making it much harder to train AI systems.
234
00:15:30,846 --> 00:15:33,015
[upbeat music plays]
235
00:15:33,098 --> 00:15:36,977
That is where machine learning systems
have started to come in.
236
00:15:38,520 --> 00:15:41,440
[man 11] Machine learning
has been a huge advancement
237
00:15:42,024 --> 00:15:45,611
because it means that we don't have
to teach computers everything.
238
00:15:45,694 --> 00:15:47,821
[music intensifies]
239
00:15:47,905 --> 00:15:52,409
You actually give a computer
millions of pieces of information,
240
00:15:52,493 --> 00:15:54,453
and the machine begins to learn.
241
00:15:55,245 --> 00:15:57,289
And that could be applied to anything.
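
A minimal sketch of what "the machine begins to learn" can mean in code: instead of hand-writing rules, we show a tiny program labeled examples and let it adjust its own weights. Everything below (the data, the weights, the update rule) is an illustrative toy, not anything from the film.

```python
# Toy perceptron: "learning from examples" instead of hand-coded rules.
# Each example is (feature vector, label); labels are 1 or -1.
examples = [((2.0, 1.0), 1), ((1.5, 2.0), 1),
            ((-1.0, -0.5), -1), ((-2.0, -1.5), -1)]

w = [0.0, 0.0]  # weights the machine adjusts as it "learns"
b = 0.0

for _ in range(20):                      # several passes over the data
    for (x1, x2), label in examples:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
        if pred != label:                # wrong? nudge weights toward the label
            w[0] += label * x1
            w[1] += label * x2
            b += label

print(w, b)  # the rules were never written by hand; they emerged from data
```
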
242
00:16:00,250 --> 00:16:02,962
[Pulkit] Our Robot Dog project,
we are trying to show
243
00:16:03,045 --> 00:16:09,009
that our dog can walk across
many, many diverse terrains.
244
00:16:10,052 --> 00:16:14,098
Humans have evolved
over billions of years to walk,
245
00:16:14,932 --> 00:16:20,688
but there's a lot of intelligence
in adapting to these different terrains.
246
00:16:21,939 --> 00:16:24,316
The question that remains
for robotic systems is,
247
00:16:24,400 --> 00:16:27,069
could they also adapt
like animals and humans?
248
00:16:30,656 --> 00:16:32,157
[music crescendos]
249
00:16:33,826 --> 00:16:35,869
[cryptic music plays]
250
00:16:35,953 --> 00:16:37,413
With machine learning,
251
00:16:37,496 --> 00:16:40,582
we collect lots and lots
of data in simulation.
252
00:16:42,668 --> 00:16:45,838
A simulation is a digital twin of reality.
253
00:16:46,588 --> 00:16:52,052
We can have many instances of that reality
running on different computers.
254
00:16:53,178 --> 00:16:56,974
It samples thousands of actions
in simulation.
255
00:16:58,892 --> 00:17:02,229
The ground that they're encountering
has different slipperiness.
256
00:17:02,312 --> 00:17:03,856
It has different softness.
257
00:17:04,898 --> 00:17:08,318
We take all the experience
of these thousands of robots
258
00:17:08,402 --> 00:17:13,282
from simulation and download this
into a real robotic system.
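
One plausible reading of the workflow just described, as a toy sketch: run many simulated "instances of reality" with randomized slipperiness and softness, keep the policy that scores best across all of them, and transfer that to the real robot. The stand-in physics and parameter ranges below are assumptions for illustration only.

```python
import random

# Toy sketch of training across many randomized simulated terrains.

def run_episode(policy_gain, friction, softness):
    # Stand-in physics: a mismatched policy scores worse on this terrain.
    return -abs(policy_gain - (friction + softness))

def train(n_sims=1000):
    best_gain, best_score = None, float("-inf")
    for gain in [i / 10 for i in range(31)]:       # candidate policies
        score = 0.0
        for _ in range(n_sims):                    # randomized terrains
            friction = random.uniform(0.2, 1.0)    # slipperiness varies
            softness = random.uniform(0.0, 1.0)    # foam vs. hard floor
            score += run_episode(gain, friction, softness)
        if score > best_score:
            best_gain, best_score = gain, score
    return best_gain    # the policy to "download" into the real robot

print(train())
```
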
259
00:17:15,492 --> 00:17:20,039
The test we're going to do today
is to see if it can adapt to new terrains.
260
00:17:21,206 --> 00:17:23,208
[tense music plays]
261
00:17:38,015 --> 00:17:40,017
[tense music continues]
262
00:17:53,322 --> 00:17:55,491
When the robot was going over foam,
263
00:17:55,574 --> 00:17:58,577
the feet movements
were stomping on the ground.
264
00:17:59,369 --> 00:18:02,206
Versus when it came on this poly surface,
265
00:18:03,248 --> 00:18:06,460
it was trying to adjust the motion,
so it doesn't slip.
266
00:18:08,670 --> 00:18:10,672
Then that is when it strikes you,
267
00:18:10,756 --> 00:18:14,051
"This is what machine learning
is bringing to the table."
268
00:18:22,476 --> 00:18:24,478
[cryptic music plays]
269
00:18:34,488 --> 00:18:35,864
[whimpering]
270
00:18:41,161 --> 00:18:44,123
We think the Robot Dog
could be really helpful
271
00:18:44,206 --> 00:18:46,166
in disaster response scenarios
272
00:18:46,250 --> 00:18:49,670
where you need to navigate
many different kinds of terrain.
273
00:18:52,631 --> 00:18:57,427
Or putting these dogs to do surveillance
in harsh environments.
274
00:18:57,970 --> 00:19:00,264
[cryptic music continues]
275
00:19:12,317 --> 00:19:13,819
[music crescendos]
276
00:19:15,696 --> 00:19:19,575
But most technology
runs into the challenge
277
00:19:19,658 --> 00:19:22,369
that there is some good they can do,
and there's some bad.
278
00:19:22,452 --> 00:19:23,996
[grim music plays]
279
00:19:24,079 --> 00:19:27,875
For example,
we can use nuclear technology for energy...
280
00:19:30,419 --> 00:19:33,797
but we also could develop atom bombs
which are really bad.
281
00:19:35,090 --> 00:19:38,468
This is what is known
as the dual-use problem.
282
00:19:39,052 --> 00:19:40,929
Fire is dual-use.
283
00:19:41,722 --> 00:19:44,308
Human intelligence is dual-use.
284
00:19:44,391 --> 00:19:48,645
So, needless to say,
artificial intelligence is also dual-use.
285
00:19:49,354 --> 00:19:53,984
It's really important
to think about AI used in context
286
00:19:54,693 --> 00:19:59,114
because, yes, it's terrific
to have a search-and-rescue robot
287
00:19:59,198 --> 00:20:02,284
that can help locate somebody
after an avalanche,
288
00:20:02,868 --> 00:20:05,871
but that same robot can be weaponized.
289
00:20:05,954 --> 00:20:07,080
[music builds]
290
00:20:08,457 --> 00:20:09,457
[gunshots]
291
00:20:15,088 --> 00:20:18,675
[Pulkit] When you see companies
using robotics
292
00:20:18,759 --> 00:20:21,094
for putting armed weapons on them,
293
00:20:21,178 --> 00:20:22,930
a part of you becomes mad.
294
00:20:23,931 --> 00:20:28,435
And a part of it is the realization
that when we put our technology,
295
00:20:28,518 --> 00:20:30,103
this is what's going to happen.
296
00:20:31,021 --> 00:20:34,441
[woman 4] This is a real
transformative technology.
297
00:20:34,524 --> 00:20:36,151
[grim music continues]
298
00:20:36,235 --> 00:20:38,028
These are weapon systems
299
00:20:38,111 --> 00:20:43,617
that could actually change
our safety and security in a dramatic way.
300
00:20:43,700 --> 00:20:45,035
[music crescendos]
301
00:20:46,161 --> 00:20:51,500
As of now, we are not sure
that machines can actually make
302
00:20:51,583 --> 00:20:54,962
the distinction
between civilians and combatants.
303
00:20:55,045 --> 00:20:57,047
[somber music plays]
304
00:20:57,130 --> 00:20:59,591
[indistinct voices]
305
00:21:01,802 --> 00:21:05,597
[Paul] Early in the war in Afghanistan,
I was part of an Army Ranger sniper team
306
00:21:05,681 --> 00:21:08,016
looking for enemy fighters
coming across the border.
307
00:21:09,685 --> 00:21:12,521
And they sent a little girl
to scout out our position.
308
00:21:13,939 --> 00:21:17,067
One thing that never came up
was the idea of shooting this girl.
309
00:21:18,193 --> 00:21:19,611
[children squealing]
310
00:21:19,695 --> 00:21:22,864
Under the laws of war,
that would have been legal.
311
00:21:22,948 --> 00:21:24,866
[indistinct chatter]
312
00:21:24,950 --> 00:21:27,369
They don't set an age
for enemy combatants.
313
00:21:29,079 --> 00:21:33,417
If you built a robot
to comply perfectly with the law of war,
314
00:21:33,500 --> 00:21:35,335
it would have shot this little girl.
315
00:21:36,420 --> 00:21:41,091
How would a robot know the difference
between what's legal and what is right?
316
00:21:41,174 --> 00:21:42,342
[indistinct chatter]
317
00:21:42,426 --> 00:21:43,302
[beeping]
318
00:21:43,385 --> 00:21:46,054
[man 12] When it comes to
autonomous drone warfare,
319
00:21:46,138 --> 00:21:49,933
they wanna take away the harm
that it places on American soldiers
320
00:21:50,017 --> 00:21:51,935
and the American psyche,
321
00:21:52,019 --> 00:21:57,149
uh, but the increase on civilian harm
ends up with Afghans,
322
00:21:57,232 --> 00:21:58,775
and Iraqis, and Somalians.
323
00:21:58,859 --> 00:22:00,944
[somber music plays]
324
00:22:01,445 --> 00:22:03,196
[indistinct chatter]
325
00:22:09,619 --> 00:22:14,416
I would really ask those who support
trusting AI to be used in drones,
326
00:22:15,375 --> 00:22:18,211
"What if your village was
on the receiving end of that?"
327
00:22:19,338 --> 00:22:20,338
[beeping]
328
00:22:23,550 --> 00:22:25,635
[man 13] AI is a dual-edged sword.
329
00:22:26,595 --> 00:22:30,724
It can be used for good,
which is what we'd use it for ordinarily,
330
00:22:31,558 --> 00:22:33,477
and at the flip of a switch,
331
00:22:34,561 --> 00:22:38,023
the technology becomes potentially
something that could be lethal.
332
00:22:38,106 --> 00:22:40,108
[thunder rumbles]
333
00:22:41,943 --> 00:22:43,945
[cryptic music plays]
334
00:22:47,032 --> 00:22:48,533
I'm a clinical pharmacologist.
335
00:22:49,618 --> 00:22:50,994
I have a team of people
336
00:22:51,078 --> 00:22:54,247
that are using artificial intelligence
to figure out drugs
337
00:22:54,331 --> 00:22:57,751
that will cure diseases
that are not getting any attention.
338
00:22:59,711 --> 00:23:04,132
It used to be with drug discoveries,
you would take a molecule that existed,
339
00:23:04,216 --> 00:23:06,885
and do a tweak to that
to get to a new drug.
340
00:23:08,261 --> 00:23:13,350
And now we've developed AI
that can feed us with millions of ideas,
341
00:23:13,892 --> 00:23:15,227
millions of molecules,
342
00:23:16,186 --> 00:23:19,231
and that opens up so many possibilities
343
00:23:19,314 --> 00:23:23,151
for treating diseases
we've never been able to treat previously.
344
00:23:25,153 --> 00:23:28,448
But there's definitely a dark side
that I never thought
345
00:23:28,532 --> 00:23:29,741
that I would go to.
346
00:23:30,742 --> 00:23:33,078
[tense music plays]
347
00:23:33,161 --> 00:23:35,872
This whole thing started
when I was invited
348
00:23:35,956 --> 00:23:40,043
by an organization out of Switzerland
called the Spiez Laboratory
349
00:23:40,544 --> 00:23:44,423
to give a presentation
on the potential misuse of AI.
350
00:23:45,507 --> 00:23:47,509
[music intensifies]
351
00:23:48,343 --> 00:23:51,221
[man 14] Sean just sent me an email
with a few ideas
352
00:23:51,304 --> 00:23:55,308
of some ways we could misuse
our own artificial intelligence.
353
00:23:56,059 --> 00:24:00,355
And instead of asking our model
to create drug-like molecules
354
00:24:00,439 --> 00:24:02,274
that could be used to treat diseases,
355
00:24:02,774 --> 00:24:06,403
let's see if we can generate
the most toxic molecules possible.
356
00:24:06,486 --> 00:24:08,447
[grim music plays]
357
00:24:10,198 --> 00:24:13,243
[Sean] I wanted to make the point,
could we use AI technology
358
00:24:13,326 --> 00:24:15,203
to design molecules that were deadly?
359
00:24:17,414 --> 00:24:20,083
[Fabio] And to be honest,
we thought it was going to fail
360
00:24:20,167 --> 00:24:24,504
because all we really did
was flip this zero to a one.
361
00:24:26,715 --> 00:24:30,385
And by inverting it,
instead of driving away from toxicity,
362
00:24:30,469 --> 00:24:32,220
now we're driving towards toxicity.
363
00:24:32,721 --> 00:24:33,555
And that's it.
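
A deliberately abstract sketch of the "flip this zero to a one" idea: the identical generate-and-score loop, with only the sign of the objective inverted. The generator and property predictor below are toy stand-ins, not the real model and not any chemistry.

```python
import random

# Abstract sketch: the same search machinery, with the objective's sign flipped.

def generate_candidate():
    return random.uniform(-1.0, 1.0)       # stand-in for a generated molecule

def predicted_toxicity(candidate):
    return candidate ** 2                  # toy stand-in property predictor

def search(drive_toward_toxicity=False, n=10000):
    # Normally we *penalize* toxicity (sign = -1); inverting the sign
    # makes the identical loop optimize *for* it.
    sign = 1.0 if drive_toward_toxicity else -1.0
    return max((generate_candidate() for _ in range(n)),
               key=lambda c: sign * predicted_toxicity(c))

print(search(False), search(True))
```
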
364
00:24:33,638 --> 00:24:35,390
[music intensifies]
365
00:24:37,517 --> 00:24:40,729
While I was home,
the computer was doing the work.
366
00:24:41,521 --> 00:24:44,816
I mean, it was cranking through,
generating thousands of molecules,
367
00:24:45,650 --> 00:24:49,529
and we didn't have to do anything
other than just push "go."
368
00:24:50,071 --> 00:24:51,406
[music crescendos]
369
00:24:52,532 --> 00:24:53,992
[birds chirping]
370
00:24:54,493 --> 00:24:58,330
[Fabio] The next morning,
there was this file on my computer,
371
00:24:59,080 --> 00:25:03,376
and within it
were roughly 40,000 molecules
372
00:25:03,460 --> 00:25:05,253
that were potentially
373
00:25:05,337 --> 00:25:08,882
some of the most toxic molecules
known to humankind.
374
00:25:08,965 --> 00:25:10,133
[grim music plays]
375
00:25:10,217 --> 00:25:12,761
[Sean] The hairs on the back of my neck
stood up on end.
376
00:25:13,386 --> 00:25:14,554
I was blown away.
377
00:25:15,388 --> 00:25:19,017
The computer made
tens of thousands of ideas
378
00:25:19,100 --> 00:25:20,685
for new chemical weapons.
379
00:25:21,645 --> 00:25:26,274
Obviously, we have molecules
that look like and are VX analogs and VX
380
00:25:26,775 --> 00:25:27,775
in the data set.
381
00:25:28,360 --> 00:25:31,363
VX is one of the most potent
chemical weapons in the world.
382
00:25:31,863 --> 00:25:33,573
[reporter 3] New claims from police
383
00:25:33,657 --> 00:25:37,786
that the women seen attacking Kim Jong-nam
in this airport assassination
384
00:25:37,869 --> 00:25:41,248
were using a deadly nerve agent called VX.
385
00:25:41,748 --> 00:25:44,417
[Sean] It can cause death
through asphyxiation.
386
00:25:45,043 --> 00:25:47,629
This is a very potent molecule,
387
00:25:47,712 --> 00:25:52,092
and most of these molecules were predicted
to be even more deadly than VX.
388
00:25:53,301 --> 00:25:56,972
[Fabio] Many of them had never,
as far as we know, been seen before.
389
00:25:57,472 --> 00:26:00,767
And so, when Sean and I realized this,
we're like,
390
00:26:00,850 --> 00:26:02,727
"Oh, what have we done?" [chuckles]
391
00:26:02,811 --> 00:26:04,688
[grim music continues]
392
00:26:04,771 --> 00:26:09,192
[Sean] I quickly realized
that we had opened Pandora's box,
393
00:26:09,901 --> 00:26:12,696
and I said, "Stop.
Don't do anything else. We're done."
394
00:26:13,905 --> 00:26:17,158
"Just make me the slides
that I need for the presentation."
395
00:26:19,202 --> 00:26:20,704
When we did this experiment,
396
00:26:20,787 --> 00:26:23,873
I was thinking, "What's the worst thing
that could possibly happen?"
397
00:26:26,251 --> 00:26:30,255
But now I'm like, "We were naive.
We were totally naive in doing it."
398
00:26:31,339 --> 00:26:33,383
[music intensifies]
399
00:26:33,466 --> 00:26:36,928
The thing that terrifies me the most
is that anyone could do what we did.
400
00:26:38,930 --> 00:26:40,765
All it takes is the flip of a switch.
401
00:26:40,849 --> 00:26:42,851
[somber music plays]
402
00:26:42,934 --> 00:26:46,187
How do we control
this technology before it's used
403
00:26:46,271 --> 00:26:50,942
potentially to do something
that's utterly destructive?
404
00:26:55,780 --> 00:27:00,285
[woman 1] At the heart of the conversation
around artificial intelligence
405
00:27:00,368 --> 00:27:03,538
and how do we choose to use it in society
406
00:27:04,289 --> 00:27:08,835
is a race between the power
with which we develop technologies,
407
00:27:08,918 --> 00:27:11,379
and the wisdom that we have to govern it.
408
00:27:11,463 --> 00:27:13,256
[somber music continues]
409
00:27:13,340 --> 00:27:16,926
There are the obvious
moral and ethical implications
410
00:27:17,010 --> 00:27:20,138
of the same thing
that powers our smartphones
411
00:27:20,221 --> 00:27:23,433
being entrusted
with the moral decision to take a life.
412
00:27:25,727 --> 00:27:31,191
I work with the Future of Life Institute,
a community of scientist activists.
413
00:27:31,733 --> 00:27:33,443
We're overall trying to show
414
00:27:33,526 --> 00:27:38,698
that there is this other side
to speeding up and escalating automation.
415
00:27:38,782 --> 00:27:42,619
[Emilia] But we're trying to make sure
that technologies we create
416
00:27:42,702 --> 00:27:45,038
are used in a way
that is safe and ethical.
417
00:27:46,247 --> 00:27:49,959
Let's have conversations
about rules of engagement,
418
00:27:50,043 --> 00:27:54,005
and codes of conduct in using AI
throughout our weapons systems.
419
00:27:54,798 --> 00:27:59,135
Because we are now seeing
technologies enter the battlefield
420
00:27:59,219 --> 00:28:01,680
that can be used to kill autonomously.
421
00:28:02,263 --> 00:28:03,431
[beeping]
422
00:28:04,516 --> 00:28:05,684
[rocket whistles]
423
00:28:05,767 --> 00:28:07,268
[explosion rumbles]
424
00:28:10,230 --> 00:28:15,110
In 2021, the UN released
a report on the potential use
425
00:28:15,193 --> 00:28:19,364
of a lethal autonomous weapon
on the battlefield in Libya.
426
00:28:20,073 --> 00:28:23,910
[reporter 4] A UN panel said that a drone
flying in the Libyan civil war last year
427
00:28:23,993 --> 00:28:27,288
had been programmed
to attack targets autonomously.
428
00:28:28,289 --> 00:28:31,209
[Emilia] If the UN reporting is accurate,
429
00:28:31,292 --> 00:28:34,963
this would be
a watershed moment for humanity.
430
00:28:35,046 --> 00:28:37,090
Because it marks a use case
431
00:28:37,173 --> 00:28:42,137
where an AI made the decision
to take a life, and not a human being.
432
00:28:42,220 --> 00:28:44,180
[dramatic music plays]
433
00:28:47,517 --> 00:28:48,517
[beeping]
434
00:28:49,686 --> 00:28:51,688
[Stacie] You're seeing advanced
autonomous weapons
435
00:28:51,771 --> 00:28:55,150
beginning to be used
in different places around the globe.
436
00:28:56,276 --> 00:28:58,278
There were reports out of Israel.
437
00:28:58,361 --> 00:29:00,196
[dramatic music continues]
438
00:29:00,280 --> 00:29:05,702
Azerbaijan used autonomous systems
to target Armenian air defenses.
439
00:29:07,370 --> 00:29:09,998
[Sean] It can fly around
the battlefield for hours,
440
00:29:10,081 --> 00:29:12,500
looking for things to hit on its own,
441
00:29:12,584 --> 00:29:16,087
and then plow into them
without any kind of human intervention.
442
00:29:16,755 --> 00:29:18,298
[Stacie] And we've seen recently
443
00:29:18,381 --> 00:29:21,760
these different videos
that are posted in Ukraine.
444
00:29:23,094 --> 00:29:26,890
[Paul] It's unclear what mode they might
have been in when they were operating.
445
00:29:26,973 --> 00:29:30,393
Was a human in the loop,
choosing what targets to attack,
446
00:29:30,477 --> 00:29:32,729
or was the machine doing that on its own?
447
00:29:32,812 --> 00:29:36,524
But there will certainly
come a point in time,
448
00:29:36,608 --> 00:29:40,195
whether it's already happened in Libya,
Ukraine or elsewhere,
449
00:29:40,278 --> 00:29:44,449
where a machine makes its own decision
about whom to kill on the battlefield.
450
00:29:44,532 --> 00:29:46,534
[music crescendos]
451
00:29:47,035 --> 00:29:50,997
[Izumi] Machines exercising
lethal power against humans
452
00:29:51,080 --> 00:29:53,374
without human intervention
453
00:29:53,458 --> 00:29:57,337
is politically unacceptable,
morally repugnant.
454
00:29:59,130 --> 00:30:01,299
Whether the international community
455
00:30:01,382 --> 00:30:05,011
will be sufficient
to govern those challenges
456
00:30:05,094 --> 00:30:06,888
is a big question mark.
457
00:30:06,971 --> 00:30:08,848
[somber music plays]
458
00:30:08,932 --> 00:30:12,685
[Emilia] If we look towards the future,
even just a few years from now,
459
00:30:12,769 --> 00:30:15,730
what the landscape looks like
is very scary,
460
00:30:16,564 --> 00:30:20,485
given that the amount
of capital and human resource
461
00:30:20,568 --> 00:30:23,404
going into making AI more powerful
462
00:30:23,488 --> 00:30:26,032
and using it
for all of these different applications,
463
00:30:26,115 --> 00:30:26,950
is immense.
464
00:30:27,033 --> 00:30:29,327
[tense music plays]
465
00:30:33,373 --> 00:30:34,958
[indistinct chatter]
466
00:30:39,254 --> 00:30:41,214
Oh my God, this guy.
467
00:30:41,756 --> 00:30:43,424
[Brandon] He knows he can't win.
468
00:30:44,259 --> 00:30:46,678
Oh... [muttering]
469
00:30:46,761 --> 00:30:51,099
When I see AI win at different problems,
I find it inspirational.
470
00:30:51,683 --> 00:30:53,977
Going for a little Hail Mary action.
471
00:30:54,477 --> 00:30:59,524
And you can apply those same tactics,
techniques, procedures to real aircraft.
472
00:31:01,401 --> 00:31:04,404
- [friend] Good game.
- [Brandon] All right, good game. [laughs]
473
00:31:06,948 --> 00:31:08,157
[sighs]
474
00:31:08,658 --> 00:31:09,909
It's surprising to me
475
00:31:09,993 --> 00:31:14,122
that people continue to make statements
about what AI can't do. Right?
476
00:31:14,205 --> 00:31:17,166
"Oh, it'll never be able
to beat a world champion in chess."
477
00:31:17,250 --> 00:31:19,294
[tense classical music plays]
478
00:31:20,211 --> 00:31:22,672
[reporter 5] An IBM computer
has made a comeback
479
00:31:22,755 --> 00:31:25,925
in Game 2 of its match
with world chess champion, Garry Kasparov.
480
00:31:27,093 --> 00:31:30,889
[commentator] Whoa! Kasparov has resigned!
481
00:31:31,598 --> 00:31:34,642
[Kasparov] When I see something
that is well beyond my understanding,
482
00:31:34,726 --> 00:31:38,062
I'm scared. And that was something
well beyond my understanding.
483
00:31:39,355 --> 00:31:41,107
[Brandon] And then people would say,
484
00:31:41,190 --> 00:31:44,402
"It'll never be able to beat
a world champion in the game of Go."
485
00:31:44,485 --> 00:31:50,700
[Go champion] I believe human intuition
is still too advanced for AI
486
00:31:50,783 --> 00:31:52,827
to have caught up.
487
00:31:53,411 --> 00:31:55,204
[tense music continues]
488
00:31:55,288 --> 00:31:58,416
[man 15] Go is one of the most
complicated games anyone can learn
489
00:31:58,499 --> 00:32:02,211
because the number of moves on the board,
when you do the math,
490
00:32:02,295 --> 00:32:05,256
exceeds the number of atoms
in the entire universe.
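
As a rough sanity check on this comparison (standard back-of-envelope estimates, not figures from the film): each of Go's 361 points can be empty, black, or white.

```latex
% Upper bound on Go board configurations vs. atoms in the observable universe
3^{361} \approx 1.7 \times 10^{172}
\quad (\text{of which roughly } 2.1 \times 10^{170} \text{ are legal positions})
\qquad \text{vs.} \qquad \sim 10^{80} \text{ atoms}
```
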
491
00:32:06,799 --> 00:32:08,968
There was a team at Google
called DeepMind,
492
00:32:09,052 --> 00:32:11,471
and they created a program called AlphaGo
493
00:32:11,554 --> 00:32:14,265
to be able to beat
the world's best players.
494
00:32:14,349 --> 00:32:16,476
[officiator] Wow.
495
00:32:16,559 --> 00:32:18,394
Congratulations to...
496
00:32:18,478 --> 00:32:19,478
- AlphaGo.
- AlphaGo.
497
00:32:19,520 --> 00:32:22,774
A computer program
has just beaten a 9 dan professional.
498
00:32:25,610 --> 00:32:31,532
[Brandon] Then DeepMind chose StarCraft
as kind of their next AI challenge.
499
00:32:33,743 --> 00:32:38,081
StarCraft is perhaps the most popular
real-time strategy game of all time.
500
00:32:40,333 --> 00:32:44,712
AlphaStar became famous
when it started defeating world champions.
501
00:32:45,338 --> 00:32:48,800
[host 1] AlphaStar
absolutely smashing Immortal Arc.
502
00:32:48,883 --> 00:32:49,717
[host 2] Know what?
503
00:32:49,801 --> 00:32:52,553
This is not gonna be a fight
that the pros can win.
504
00:32:53,680 --> 00:32:54,806
It's kind of ridiculous.
505
00:32:54,889 --> 00:32:57,558
[tense classical music continues]
506
00:32:57,642 --> 00:33:00,895
[Brandon] Professional gamers say,
"I would never try that tactic."
507
00:33:00,979 --> 00:33:04,816
"I would never try that strategy.
That's something that's not human."
508
00:33:07,151 --> 00:33:10,655
And that was perhaps,
you know, the "a-ha" moment for me.
509
00:33:10,738 --> 00:33:12,657
[music crescendos]
510
00:33:13,574 --> 00:33:16,661
I came to realize the time is now.
511
00:33:17,412 --> 00:33:21,082
There's an important technology
and an opportunity to make a difference.
512
00:33:21,165 --> 00:33:23,167
[somber music plays]
513
00:33:24,502 --> 00:33:29,382
I only knew the problems that I had faced
as a SEAL in close-quarters combat,
514
00:33:30,383 --> 00:33:34,470
but one of my good friends,
who was an F-18 pilot, told me,
515
00:33:34,554 --> 00:33:37,724
"We have the same problem
in the fighter jet community."
516
00:33:37,807 --> 00:33:39,434
"They are jamming communications."
517
00:33:39,517 --> 00:33:42,562
"There are proliferated
surface-to-air missile sites
518
00:33:42,645 --> 00:33:44,439
that make it too dangerous to operate."
519
00:33:47,483 --> 00:33:50,987
Imagine if we had a fighter jet
that was commanded by an AI.
520
00:33:51,070 --> 00:33:52,739
[host 3] Welcome
to the AlphaDogfights.
521
00:33:52,822 --> 00:33:56,159
We're a couple of minutes away
from this first semifinal.
522
00:33:56,242 --> 00:33:59,871
[Brandon] DARPA, the Defense
Advanced Research Projects Agency,
523
00:33:59,954 --> 00:34:03,624
had seen AlphaGo and AlphaStar,
524
00:34:04,292 --> 00:34:08,171
and so this idea of the AlphaDogfight
competition came to life.
525
00:34:09,005 --> 00:34:11,466
[host 4] It's what you wanna see
your fighter pilots do.
526
00:34:11,549 --> 00:34:13,634
[host 3] This looks like
human dogfighting.
527
00:34:13,718 --> 00:34:15,511
[tense music plays]
528
00:34:15,595 --> 00:34:19,057
[Brandon] Dogfighting is
fighter-on-fighter aircraft going at it.
529
00:34:19,849 --> 00:34:22,852
You can think about it
as a boxing match in the sky.
530
00:34:23,853 --> 00:34:26,314
Maybe people have seen the movie Top Gun.
531
00:34:26,397 --> 00:34:30,068
- [character] Can we outrun these guys?
- [Maverick] Not their missiles and guns.
532
00:34:30,151 --> 00:34:31,277
[character] It's a dogfight.
533
00:34:35,531 --> 00:34:39,410
[Brandon] Learning to master dogfighting
can take eight to ten years.
534
00:34:42,288 --> 00:34:46,209
It's an extremely complex challenge
to build AI around.
535
00:34:47,043 --> 00:34:49,045
[dramatic music plays]
536
00:34:49,128 --> 00:34:51,130
[keyboard tapping]
537
00:34:54,092 --> 00:34:58,679
The prior approaches to autonomy
and dogfighting tended to be brittle.
538
00:34:59,222 --> 00:35:03,017
[man 16] We figured machine learning was
probably the way to solve this problem.
539
00:35:04,477 --> 00:35:08,189
At first, the AI knew nothing
about the world in which it was dropped.
540
00:35:08,272 --> 00:35:11,025
It didn't know it was flying
or what dogfighting was.
541
00:35:11,109 --> 00:35:12,735
It didn't know what an F-16 is.
542
00:35:13,236 --> 00:35:15,988
All it knew
was the available actions it could take,
543
00:35:16,072 --> 00:35:18,407
and it would start
to randomly explore those actions.
544
00:35:19,325 --> 00:35:23,204
[colleague] The blue plane's been training
for only a small amount of time.
545
00:35:23,287 --> 00:35:25,832
You can see it wobbling back and forth,
546
00:35:25,915 --> 00:35:30,545
uh, flying very erratically,
generally away from its adversary.
547
00:35:31,587 --> 00:35:34,257
As the fight progresses,
we can see blue is starting
548
00:35:34,340 --> 00:35:36,134
to establish here its game plan.
549
00:35:36,759 --> 00:35:38,636
It's more in a position to shoot.
550
00:35:39,178 --> 00:35:41,305
Once in a while,
the learning algorithm said,
551
00:35:41,389 --> 00:35:43,891
"Here's a cookie.
Keep doing more of that."
552
00:35:45,143 --> 00:35:49,147
We can take advantage of computer power
and train the agents many times
553
00:35:49,230 --> 00:35:50,230
in parallel.
554
00:35:51,357 --> 00:35:52,775
It's like a basketball team.
555
00:35:53,276 --> 00:35:55,778
Instead of playing the same team
over and over again,
556
00:35:55,862 --> 00:35:59,198
you're traveling the world
playing 512 different teams,
557
00:35:59,699 --> 00:36:00,950
all at the same time.
558
00:36:01,492 --> 00:36:03,452
You can get very good, very fast.
559
00:36:04,162 --> 00:36:07,456
We were able to run that simulation 24/7
560
00:36:07,540 --> 00:36:11,919
and get something like 30 years
of pilot training time in, in 10 months.
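
A toy sketch of the training scheme described: an agent that starts knowing nothing, explores actions at random, and reinforces whatever earns a "cookie" (a reward), with many episodes standing in for the parallel self-play fights. The environment below is an illustrative stand-in, not the DARPA simulator.

```python
import random

# Toy tabular Q-learning: explore randomly, reinforce rewarded actions.
N_STATES, GOAL = 6, 5
Q = [[0.0, 0.0] for _ in range(N_STATES)]   # value of (state, left/right)

def step(state, action):
    nxt = max(0, min(N_STATES - 1, state + (1 if action else -1)))
    return nxt, (1.0 if nxt == GOAL else 0.0)   # the reward: a "cookie"

for episode in range(500):        # stands in for many parallel fights
    s = 0
    while s != GOAL:
        a = (random.randrange(2) if random.random() < 0.1    # explore
             else (0 if Q[s][0] > Q[s][1] else 1))           # exploit
        s2, r = step(s, a)
        Q[s][a] += 0.5 * (r + 0.9 * max(Q[s2]) - Q[s][a])
        s = s2

print([round(max(q), 2) for q in Q])  # learned preferences per state
```
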
561
00:36:12,003 --> 00:36:12,837
[music builds]
562
00:36:12,920 --> 00:36:15,673
We went from barely able
to control the aircraft
563
00:36:15,756 --> 00:36:17,717
to being a stone-cold assassin.
564
00:36:19,010 --> 00:36:23,514
Under training, we were competing only
against other artificial intelligence.
565
00:36:24,473 --> 00:36:28,561
But competing against humans directly
was kind of the ultimate target.
566
00:36:34,192 --> 00:36:35,610
My name is Mike Benitez,
567
00:36:36,110 --> 00:36:38,362
I'm a Lieutenant Colonel
in the U.S. Air Force.
568
00:36:39,155 --> 00:36:41,157
Been on active duty about 25 years.
569
00:36:42,825 --> 00:36:44,869
I've got 250 combat missions
570
00:36:45,578 --> 00:36:47,538
and I'm a weapons school graduate,
571
00:36:48,039 --> 00:36:49,832
which is Air Force version of Top Gun.
572
00:36:52,627 --> 00:36:55,171
I've never actually flown against AI.
573
00:36:55,254 --> 00:36:58,549
So I'm pretty excited
to see how well I can do.
574
00:36:59,634 --> 00:37:03,054
[commander] We got now a 6,000 feet
offensive set up nose-to-nose.
575
00:37:04,388 --> 00:37:05,223
Fight's on.
576
00:37:05,306 --> 00:37:06,724
[tense music plays]
577
00:37:06,807 --> 00:37:08,309
[air rushing]
578
00:37:09,852 --> 00:37:10,978
[machine gun fire]
579
00:37:13,314 --> 00:37:14,314
He's gone now.
580
00:37:14,732 --> 00:37:16,651
Yeah, that's actually really interesting.
581
00:37:16,734 --> 00:37:18,778
[muttering indistinctly]
582
00:37:19,278 --> 00:37:20,696
[machine gun fire]
583
00:37:20,780 --> 00:37:23,658
Dead. Got him. Flawless victory.
584
00:37:23,741 --> 00:37:25,409
[chuckling]
585
00:37:25,910 --> 00:37:27,453
All right, round two.
586
00:37:27,536 --> 00:37:30,039
[tense music continues]
587
00:37:31,290 --> 00:37:33,209
[machine gun fire]
588
00:37:35,461 --> 00:37:40,466
What the artificial intelligence is doing
is maneuvering with such precision,
589
00:37:40,967 --> 00:37:43,219
uh, that I just can't keep up with it.
590
00:37:43,302 --> 00:37:44,679
[air rushing]
591
00:37:44,762 --> 00:37:46,013
[machine gun fire]
592
00:37:46,847 --> 00:37:48,391
[air rushing]
593
00:37:48,474 --> 00:37:49,725
Right into the merge.
594
00:37:50,309 --> 00:37:51,894
Oh, now you're gone.
595
00:37:51,978 --> 00:37:53,062
[machine gun fire]
596
00:37:53,145 --> 00:37:53,980
Got him!
597
00:37:54,063 --> 00:37:55,063
Still got me.
598
00:37:55,523 --> 00:37:56,857
[laughing]
599
00:37:58,567 --> 00:38:00,069
[Brett] AI is never scared.
600
00:38:00,736 --> 00:38:03,781
There's a human emotional element
in the cockpit an AI won't have.
601
00:38:03,864 --> 00:38:04,907
[music rises]
602
00:38:04,991 --> 00:38:08,077
One of the more interesting strategies
our AI developed,
603
00:38:08,160 --> 00:38:10,121
was what we call the face shot.
604
00:38:10,621 --> 00:38:14,000
Usually a human wants to shoot from behind
605
00:38:14,083 --> 00:38:16,377
because it's hard for them
to shake you loose.
606
00:38:17,044 --> 00:38:20,381
They don't try face shots
because you're playing a game of chicken.
607
00:38:20,464 --> 00:38:21,340
[beeping]
608
00:38:21,424 --> 00:38:25,511
When we come head-on,
3,000 feet away to 500 feet away
609
00:38:25,594 --> 00:38:27,096
can happen in the blink of an eye.
610
00:38:27,596 --> 00:38:31,183
You run a high risk of colliding,
so humans don't try it.
611
00:38:31,267 --> 00:38:32,393
[high-pitched tone]
612
00:38:32,893 --> 00:38:35,938
The AI, unless it's told to fear death,
will not fear death.
613
00:38:36,022 --> 00:38:37,023
[machine gun fire]
614
00:38:37,106 --> 00:38:40,693
[Mike] All good. Feels like
I'm fighting against a human, uh,
615
00:38:40,776 --> 00:38:44,113
a human that has a reckless abandonment
for safety. [chuckles]
616
00:38:45,489 --> 00:38:47,325
He's not gonna survive this last one.
617
00:38:48,284 --> 00:38:49,869
[air rushing]
618
00:38:50,369 --> 00:38:52,038
[tense music continues]
619
00:38:54,790 --> 00:38:56,292
[wind whistling]
620
00:38:57,168 --> 00:38:58,461
He doesn't have enough time.
621
00:39:00,629 --> 00:39:01,629
Ah!
622
00:39:01,964 --> 00:39:02,965
[Brett] Good night.
623
00:39:03,049 --> 00:39:04,049
[beeping]
624
00:39:04,091 --> 00:39:05,091
[Mike] I'm dead.
625
00:39:16,812 --> 00:39:17,980
It's humbling to know
626
00:39:18,064 --> 00:39:20,983
that I might not even be
the best thing for this mission,
627
00:39:21,484 --> 00:39:24,320
and that thing could be something
that replaces me one day.
628
00:39:25,696 --> 00:39:26,989
[colleague] Same 6 CAV.
629
00:39:27,615 --> 00:39:28,866
One thousand offset.
630
00:39:29,367 --> 00:39:32,078
[Brandon] With this AI pilot
commanding fighter aircraft,
631
00:39:32,828 --> 00:39:34,830
the winning is relentless, it's dominant.
632
00:39:34,914 --> 00:39:37,249
It's not just winning by a wide margin.
633
00:39:37,333 --> 00:39:41,545
It's, "Okay, how can we get that
onto our aircraft?" It's that powerful.
634
00:39:41,629 --> 00:39:43,589
[melodic music plays]
635
00:39:43,672 --> 00:39:47,802
[Nathan] It's realistic to expect
that AI will be piloting an F-16,
636
00:39:47,885 --> 00:39:49,970
and it will not be that far out.
637
00:39:51,305 --> 00:39:56,769
[Brandon] If you're going up against
an AI pilot that has a 99.99999% win rate,
638
00:39:56,852 --> 00:39:58,062
you don't stand a chance.
639
00:39:58,145 --> 00:39:59,522
[tense music plays]
640
00:39:59,605 --> 00:40:02,733
When I think about
one AI pilot being unbeatable,
641
00:40:02,817 --> 00:40:06,779
I think about what a team of 50, or 100,
642
00:40:07,279 --> 00:40:12,535
or 1,000 AI pilots
can continue to, uh, achieve.
643
00:40:13,702 --> 00:40:17,415
Swarming is a team
of highly intelligent aircraft
644
00:40:17,498 --> 00:40:18,791
that work with each other.
645
00:40:19,291 --> 00:40:23,462
They're sharing information about
what to do, how to solve a problem.
646
00:40:23,546 --> 00:40:24,588
[beeping]
647
00:40:25,172 --> 00:40:30,469
Swarming will be a game-changing
and transformational capability
648
00:40:30,553 --> 00:40:32,638
to our military and our allies.
649
00:40:33,139 --> 00:40:35,433
[crickets chirping]
650
00:40:35,516 --> 00:40:37,852
[dramatic music plays]
651
00:40:44,316 --> 00:40:45,818
[music intensifies]
652
00:41:04,920 --> 00:41:07,256
[muffled radio music]
653
00:41:12,553 --> 00:41:14,346
[tense music plays]
654
00:41:19,643 --> 00:41:21,061
[wind whistling]
655
00:41:21,145 --> 00:41:22,145
[buzzing]
656
00:41:25,608 --> 00:41:27,318
[truck accelerating]
657
00:41:27,401 --> 00:41:28,486
[music builds]
658
00:41:42,750 --> 00:41:47,379
[controller] Target has been acquired,
and the drones are tracking him.
659
00:41:56,722 --> 00:41:58,057
[music crescendos]
660
00:42:01,936 --> 00:42:03,604
[controller] Here comes the land.
661
00:42:07,566 --> 00:42:10,402
[man 17] Primary goal
of the swarming research we're working on
662
00:42:11,153 --> 00:42:13,697
is to deploy a large number of drones
663
00:42:13,781 --> 00:42:17,910
over an area that is hard to get to
or dangerous to get to.
664
00:42:17,993 --> 00:42:19,995
[cryptic music plays]
665
00:42:20,079 --> 00:42:21,622
[buzzing]
666
00:42:21,705 --> 00:42:26,085
The Army Research Lab has been supporting
this particular research project.
667
00:42:26,919 --> 00:42:28,963
If you want to know what's in a location,
668
00:42:29,046 --> 00:42:32,883
but it's hard to get to that area,
or it's a very large area,
669
00:42:33,676 --> 00:42:35,886
then deploying a swarm
is a very natural way
670
00:42:35,970 --> 00:42:38,389
to extend the reach of individuals
671
00:42:38,472 --> 00:42:41,934
and collect information
that is critical to the mission.
672
00:42:42,935 --> 00:42:44,770
[music intensifies]
673
00:42:46,689 --> 00:42:48,983
So, right now in our swarm deployment,
674
00:42:49,066 --> 00:42:53,988
we essentially give a single command
to go track the target of interest.
675
00:42:54,488 --> 00:42:57,283
Then the drones go
and do all of that on their own.
676
00:42:59,493 --> 00:43:04,081
Artificial intelligence allows
the robots to move collectively as a swarm
677
00:43:04,164 --> 00:43:05,708
in a decentralized manner.
678
00:43:06,792 --> 00:43:08,877
[melodic music plays]
679
00:43:10,671 --> 00:43:12,506
In the swarms in nature that we see,
680
00:43:13,299 --> 00:43:18,178
there's no boss,
no main animal telling them what to do.
681
00:43:20,556 --> 00:43:25,019
The behavior is emerging
out of each individual animal
682
00:43:25,102 --> 00:43:27,104
following a few simple rules.
683
00:43:27,688 --> 00:43:32,443
And out of that grows this emergent
collective behavior that you see.
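
A small sketch of "a few simple rules, no boss": the classic boids-style update (cohesion, alignment, separation), where each agent reacts only to the others and flock-level motion emerges. For brevity the "neighborhood" here is the whole flock, and all constants are illustrative assumptions.

```python
import random

# Boids-style rules: no leader, yet coherent group motion emerges.
N, STEPS = 20, 100
pos = [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(N)]
vel = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(N)]

for _ in range(STEPS):
    new_vel = []
    for i in range(N):
        # cohesion: steer toward the group's center of mass
        cx = sum(p[0] for p in pos) / N - pos[i][0]
        cy = sum(p[1] for p in pos) / N - pos[i][1]
        # alignment: match the group's average heading
        ax = sum(v[0] for v in vel) / N - vel[i][0]
        ay = sum(v[1] for v in vel) / N - vel[i][1]
        # separation: push away from agents that are too close
        sx = sy = 0.0
        for j in range(N):
            dx, dy = pos[i][0] - pos[j][0], pos[i][1] - pos[j][1]
            if i != j and dx * dx + dy * dy < 1.0:
                sx, sy = sx + dx, sy + dy
        new_vel.append([vel[i][0] + 0.01 * cx + 0.05 * ax + 0.1 * sx,
                        vel[i][1] + 0.01 * cy + 0.05 * ay + 0.1 * sy])
    vel = new_vel
    for i in range(N):
        pos[i][0] += vel[i][0]
        pos[i][1] += vel[i][1]

print(pos[0])  # no leader anywhere above; the flock still moves as one
```
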
684
00:43:32,526 --> 00:43:33,861
[buzzing]
685
00:43:33,944 --> 00:43:37,031
What's awe-inspiring
about swarms in nature
686
00:43:37,114 --> 00:43:39,992
is the graceful ability
in which they move.
687
00:43:40,576 --> 00:43:44,288
It's as if they were built
to be a part of this group.
688
00:43:46,790 --> 00:43:51,128
Ideally, what we'd love to see
with our drone swarm is,
689
00:43:51,211 --> 00:43:53,172
much like in the swarm in nature,
690
00:43:53,672 --> 00:43:56,717
decisions being made
by the group collectively.
691
00:43:56,800 --> 00:43:59,428
[melodic music continues]
692
00:43:59,928 --> 00:44:02,765
The other piece of inspiration for us
693
00:44:02,848 --> 00:44:06,268
comes in the form
of reliability and resiliency.
694
00:44:07,311 --> 00:44:09,188
That swarm will not go down
695
00:44:09,271 --> 00:44:13,567
if one individual animal
doesn't do what it's supposed to do.
696
00:44:13,651 --> 00:44:15,110
[buzzing]
697
00:44:15,194 --> 00:44:18,697
Even if one of the agents falls out,
or fails,
698
00:44:18,781 --> 00:44:20,866
or isn't able to complete the task,
699
00:44:21,909 --> 00:44:23,369
the swarm will continue.
700
00:44:25,204 --> 00:44:27,873
And ultimately,
that's what we'd like to have.
701
00:44:29,333 --> 00:44:33,879
We have this need in combat scenarios
for identifying enemy aircraft,
702
00:44:33,962 --> 00:44:37,841
and it used to be we required
one person controlling one robot.
703
00:44:38,592 --> 00:44:40,552
As autonomy increases,
704
00:44:40,636 --> 00:44:43,555
I hope we will get to see
a large number of robots
705
00:44:43,639 --> 00:44:46,558
being controlled
by a very small number of people.
706
00:44:47,851 --> 00:44:50,854
I see no reason why we couldn't achieve
a thousand eventually
707
00:44:51,355 --> 00:44:54,942
because each agent
will be able to act of its own accord,
708
00:44:55,025 --> 00:44:56,443
and the sky's the limit.
709
00:44:57,027 --> 00:44:59,071
We can scale our learning...
710
00:44:59,154 --> 00:45:03,325
[Justin B.] We've been working on swarming
in simulation for quite some time,
711
00:45:03,409 --> 00:45:06,829
and it is time to bring
that to real-world aircraft.
712
00:45:07,454 --> 00:45:12,042
We expect to be doing
three robots at once over the network,
713
00:45:12,126 --> 00:45:15,129
and then starting
to add more and more capabilities.
714
00:45:15,879 --> 00:45:18,549
We want to be able
to test that on smaller systems,
715
00:45:18,632 --> 00:45:22,261
but take those same concepts
and apply them to larger systems,
716
00:45:22,344 --> 00:45:23,679
like a fighter jet.
717
00:45:24,513 --> 00:45:26,014
[Brandon] We talk a lot about,
718
00:45:26,098 --> 00:45:31,937
how do you give a platoon
the combat power of a battalion?
719
00:45:32,020 --> 00:45:33,939
[dramatic music plays]
720
00:45:34,982 --> 00:45:39,528
Or a battalion
the combat power of a brigade?
721
00:45:39,611 --> 00:45:41,488
You can do that with swarming.
722
00:45:42,614 --> 00:45:45,200
And when you can unlock
that power of swarming,
723
00:45:45,284 --> 00:45:48,078
you have just created
a new strategic deterrence
724
00:45:48,162 --> 00:45:49,455
to military aggression.
725
00:45:49,538 --> 00:45:52,249
[dramatic music continues]
726
00:45:52,332 --> 00:45:57,004
[Bob] I think the most exciting thing
is the number of young men and women
727
00:45:57,087 --> 00:46:00,299
who we will save
if we really do this right.
728
00:46:01,091 --> 00:46:03,594
And we trade machines
729
00:46:04,553 --> 00:46:05,846
rather than human lives.
730
00:46:06,638 --> 00:46:10,893
Some argue that autonomous weapons
will make warfare more precise
731
00:46:10,976 --> 00:46:12,311
and more humane,
732
00:46:12,853 --> 00:46:15,397
but it's actually difficult to predict
733
00:46:15,481 --> 00:46:19,443
exactly how autonomous weapons
might change warfare ahead of time.
734
00:46:21,320 --> 00:46:23,238
It's like the invention
of the Gatling gun.
735
00:46:23,322 --> 00:46:24,990
[grim music plays]
736
00:46:25,073 --> 00:46:26,700
Richard Gatling was an inventor,
737
00:46:27,201 --> 00:46:30,954
and he saw soldiers coming back,
wounded in the Civil War,
738
00:46:32,289 --> 00:46:35,292
and wanted to find ways
to make warfare more humane.
739
00:46:36,293 --> 00:46:39,797
To reduce the number of soldiers
that were killed in war
740
00:46:40,422 --> 00:46:42,633
by reducing the number
of soldiers in the battle.
741
00:46:42,716 --> 00:46:44,009
[music builds]
742
00:46:44,092 --> 00:46:48,388
And so he invented the Gatling gun,
an automated gun turned by a crank
743
00:46:48,472 --> 00:46:50,516
that could automate the process of firing.
744
00:46:53,268 --> 00:46:57,981
It increased effectively by a hundredfold
the firepower that soldiers could deliver.
745
00:46:58,065 --> 00:47:00,734
[grim music continues]
746
00:47:01,235 --> 00:47:05,239
Oftentimes, efforts to make warfare
more precise and humane...
747
00:47:05,322 --> 00:47:06,448
[tense music plays]
748
00:47:06,532 --> 00:47:07,991
...can have the opposite effect.
749
00:47:10,160 --> 00:47:13,455
[Arash] Think about the effect
of one errant drone strike
750
00:47:13,539 --> 00:47:14,540
in a rural area
751
00:47:14,623 --> 00:47:17,209
that drives the local populace
against the United States,
752
00:47:17,292 --> 00:47:20,712
against the local government.
You know, supposedly the good guys.
753
00:47:20,796 --> 00:47:22,506
[tense music continues]
754
00:47:22,589 --> 00:47:25,008
Now magnify that by 1,000.
755
00:47:26,718 --> 00:47:28,720
[Emilia] The creation of a weapon system
756
00:47:28,804 --> 00:47:34,560
that is cheap, scalable,
and doesn't require human operators
757
00:47:34,643 --> 00:47:37,896
drastically changes
the actual barriers to conflict.
758
00:47:40,107 --> 00:47:44,486
It keeps me up at night to think
of a world where war is ubiquitous,
759
00:47:44,987 --> 00:47:48,657
and we no longer carry
the human and financial cost of war
760
00:47:48,740 --> 00:47:51,410
because we're just so far removed from...
761
00:47:52,369 --> 00:47:54,037
the lives that will be lost.
762
00:47:55,038 --> 00:47:57,541
[somber melodic music plays]
763
00:47:59,459 --> 00:48:01,420
[Sean] This whole thing is haunting me.
764
00:48:02,212 --> 00:48:05,924
I just needed an example
of artificial intelligence misuse.
765
00:48:06,800 --> 00:48:10,888
The unanticipated consequences
of doing that simple thought experiment
766
00:48:10,971 --> 00:48:12,848
have gone way too far.
767
00:48:17,352 --> 00:48:18,729
When I gave the presentation
768
00:48:18,812 --> 00:48:22,107
on the toxic molecules
created by AI technology,
769
00:48:22,608 --> 00:48:24,318
the audience's jaws dropped.
770
00:48:29,406 --> 00:48:33,660
[Fabio] The next decision was whether
we should publish this information.
771
00:48:34,912 --> 00:48:36,914
On one hand, you want to warn the world
772
00:48:36,997 --> 00:48:40,125
of these sorts of capabilities,
but on the other hand,
773
00:48:40,208 --> 00:48:44,087
you don't want to give somebody the idea
if they had never had it before.
774
00:48:46,381 --> 00:48:48,133
We decided it was worth publishing
775
00:48:48,216 --> 00:48:52,679
to maybe find some ways
to mitigate the misuse of this type of AI
776
00:48:52,763 --> 00:48:54,139
before it occurs.
777
00:48:54,222 --> 00:48:56,224
[grim music plays]
778
00:49:01,897 --> 00:49:04,775
The general public's reaction
was shocking.
779
00:49:04,858 --> 00:49:07,110
[tense music plays]
780
00:49:07,194 --> 00:49:10,656
We can see the metrics on the page,
how many people have accessed it.
781
00:49:10,739 --> 00:49:14,451
The kinds of articles we normally write,
we're lucky if we get...
782
00:49:14,534 --> 00:49:19,831
A few thousand people look at our article
over a period of a year or multiple years.
783
00:49:19,915 --> 00:49:22,084
It was 10,000 people
within a week.
784
00:49:22,167 --> 00:49:25,337
Then it was 20,000,
then it was 30,000, then it was 40,000,
785
00:49:25,837 --> 00:49:28,715
and we were up to 10,000 people a day.
786
00:49:30,258 --> 00:49:33,512
[Sean] We've done The Economist,
the Financial Times.
787
00:49:35,639 --> 00:49:39,977
Radiolab, you know, they reached out.
Like, I've heard of Radiolab!
788
00:49:40,060 --> 00:49:41,311
[music crescendos]
789
00:49:41,937 --> 00:49:45,899
But then the reactions turned
into this thing that's out of control.
790
00:49:45,983 --> 00:49:48,235
[tense music continues]
791
00:49:49,778 --> 00:49:53,532
When we look at those tweets, it's like,
"Oh my God, could they do anything worse?"
792
00:49:54,324 --> 00:49:55,993
Why did they do this?
793
00:49:57,828 --> 00:49:59,329
[music crescendos]
794
00:50:01,164 --> 00:50:04,501
And then we got an invitation
I never would have anticipated.
795
00:50:04,584 --> 00:50:07,087
[dramatic music plays]
796
00:50:08,797 --> 00:50:12,926
There was a lot of discussion
inside the White House about the article,
797
00:50:13,010 --> 00:50:15,345
and they wanted to talk to us urgently.
798
00:50:18,849 --> 00:50:20,809
Obviously, it's an incredible honor
799
00:50:20,892 --> 00:50:23,020
to be able
to talk to people at this level.
800
00:50:23,603 --> 00:50:25,272
But then it hits you
801
00:50:25,355 --> 00:50:28,900
like, "Oh my goodness,
it's the White House. The boss."
802
00:50:29,693 --> 00:50:33,196
This involved putting together
data sets that were open source...
803
00:50:33,280 --> 00:50:37,701
And in about six hours, the model was able
to generate over 40,000...
804
00:50:37,784 --> 00:50:41,038
[Sean] They asked questions
about how much computing power you needed,
805
00:50:41,872 --> 00:50:43,832
and we told them it was nothing special.
806
00:50:44,332 --> 00:50:48,211
Literally a standard run-of-the-mill,
six-year-old Mac.
807
00:50:49,546 --> 00:50:51,256
And that blew them away.
808
00:50:52,049 --> 00:50:54,468
[dramatic music continues]
809
00:50:54,968 --> 00:50:59,473
The folks that are in charge
of understanding chemical warfare agents
810
00:50:59,556 --> 00:51:04,019
and governmental agencies,
they had no idea of this potential.
811
00:51:04,102 --> 00:51:06,104
[music intensifies]
812
00:51:06,605 --> 00:51:09,816
We've got this cookbook
to make these chemical weapons,
813
00:51:10,650 --> 00:51:15,572
and in the hands of a bad actor
that has malicious intent
814
00:51:16,573 --> 00:51:18,992
it could be utterly horrifying.
815
00:51:19,534 --> 00:51:21,203
[grim music plays]
816
00:51:21,703 --> 00:51:23,205
People have to sit up and listen,
817
00:51:23,288 --> 00:51:26,458
and we have to take steps
to either regulate the technology
818
00:51:27,250 --> 00:51:30,670
or constrain it in a way
that it can't be misused.
819
00:51:31,880 --> 00:51:35,175
Because the potential for lethality...
820
00:51:36,176 --> 00:51:37,344
is terrifying.
821
00:51:40,680 --> 00:51:45,977
The question of the ethics of AI
is largely addressed by society,
822
00:51:46,561 --> 00:51:50,273
not by the engineers, technologists,
or mathematicians.
823
00:51:51,608 --> 00:51:52,442
[beeping]
824
00:51:52,526 --> 00:51:56,822
Every technology that we bring forth,
every novel innovation,
825
00:51:56,905 --> 00:52:00,617
ultimately falls under the purview
of how society believes we should use it.
826
00:52:00,700 --> 00:52:02,536
[somber music plays]
827
00:52:03,036 --> 00:52:06,498
[Bob] Right now,
the Department of Defense says,
828
00:52:06,581 --> 00:52:09,960
"The only thing that is saying
we are going to kill something
829
00:52:10,043 --> 00:52:11,920
on the battlefield is a human."
830
00:52:12,504 --> 00:52:14,548
A machine can do the killing,
831
00:52:15,674 --> 00:52:18,927
but only at the behest
of a human operator,
832
00:52:19,636 --> 00:52:21,429
and I don't see that ever changing.
833
00:52:21,513 --> 00:52:23,390
[somber music continues]
834
00:52:24,808 --> 00:52:28,103
[Arash] They assure us
that this type of technology will be safe.
835
00:52:29,729 --> 00:52:33,692
But the United States military just
doesn't have a trustworthy reputation
836
00:52:33,775 --> 00:52:35,485
with drone warfare.
837
00:52:36,820 --> 00:52:41,783
And so, when it comes
to trusting the U.S. military with AI,
838
00:52:41,867 --> 00:52:45,036
I would say, you know, the track record
kinda speaks for itself.
839
00:52:46,288 --> 00:52:50,125
[Paul] The U.S. Defense Department policy
on the use of autonomy in weapons
840
00:52:50,208 --> 00:52:52,919
does not ban any kind of weapon system.
841
00:52:53,003 --> 00:52:56,715
And even if militaries
might not want autonomous weapons,
842
00:52:56,798 --> 00:53:01,678
we could see militaries handing over
more decisions to machines
843
00:53:01,761 --> 00:53:03,680
just to keep pace with competitors.
844
00:53:04,764 --> 00:53:08,351
And that could drive militaries
to automate decisions
845
00:53:08,435 --> 00:53:09,978
that they may not want to.
846
00:53:12,022 --> 00:53:13,440
[Bob] Vladimir Putin said,
847
00:53:13,523 --> 00:53:16,693
"Whoever leads in AI
is going to rule the world."
848
00:53:17,736 --> 00:53:23,325
President Xi has made it clear that AI
is one of the number one technologies
849
00:53:23,408 --> 00:53:25,702
that China wants to dominate in.
850
00:53:26,786 --> 00:53:29,831
We're clearly
in a technological competition.
851
00:53:33,418 --> 00:53:35,670
[Brandon] You hear people talk
about guardrails,
852
00:53:36,463 --> 00:53:39,424
and I believe
that is what people should be doing.
853
00:53:40,467 --> 00:53:45,180
But there is a very real race
for AI superiority.
854
00:53:45,263 --> 00:53:46,264
[birds chirping]
855
00:53:47,265 --> 00:53:51,603
And our adversaries, whether it's China,
whether it's Russia, whether it's Iran,
856
00:53:52,187 --> 00:53:56,566
are not going to give two thoughts
to what our policy says around AI.
857
00:53:57,943 --> 00:54:00,320
[birds singing]
858
00:54:00,820 --> 00:54:04,324
You're seeing a lot more conversations
around AI policy,
859
00:54:06,117 --> 00:54:09,287
but I wish more leaders
would have the conversation
860
00:54:09,371 --> 00:54:11,706
saying, "How quickly
can we build this thing?
861
00:54:12,707 --> 00:54:15,168
Let's resource the heck out of it
and build it."
862
00:54:15,919 --> 00:54:17,462
[dramatic music plays]
863
00:54:17,545 --> 00:54:18,838
[horns honking]
864
00:54:24,177 --> 00:54:26,388
[indistinct chatter]
865
00:54:26,888 --> 00:54:29,057
[inaudible conversation]
866
00:54:29,557 --> 00:54:31,017
[beeping]
867
00:54:33,478 --> 00:54:38,358
We are at the Association of the U.S.
Army's biggest trade show of the year.
868
00:54:39,609 --> 00:54:44,572
Basically, any vendor who is selling
a product or technology into a military
869
00:54:44,656 --> 00:54:46,616
will be exhibiting.
870
00:54:47,659 --> 00:54:50,537
[man 18] Tyndall Air Force Base
has four of our robots
871
00:54:50,620 --> 00:54:53,748
that patrol their base
24 hours a day, 7 days a week.
872
00:54:55,750 --> 00:55:00,130
We can add everything from cameras
to sensors to whatever you need.
873
00:55:00,213 --> 00:55:05,135
Manipulator arms. Again, just to complete
the mission that the customer has in mind.
874
00:55:06,678 --> 00:55:08,638
[man 19] What if your enemy introduces AI?
875
00:55:09,431 --> 00:55:12,976
A fighting system that thinks
faster than you, responds faster
876
00:55:13,059 --> 00:55:16,062
than what a human being can do?
We've got to be prepared.
877
00:55:17,605 --> 00:55:21,359
We train our systems
to collect intel on the enemy,
878
00:55:22,027 --> 00:55:26,531
managing enemy targets
with humans supervising the kill chain.
879
00:55:27,115 --> 00:55:28,491
[music crescendos]
880
00:55:33,204 --> 00:55:34,956
- [Brandon] Hi, General.
- How you doing?
881
00:55:35,040 --> 00:55:37,208
[Brandon] Good, sir. How are you? Um...
882
00:55:37,709 --> 00:55:40,962
I'll just say
no one is investing more in an AI pilot.
883
00:55:41,046 --> 00:55:42,630
Our AI pilot's called Hivemind,
884
00:55:43,298 --> 00:55:47,135
so we applied it to our quadcopter Nova.
It goes inside buildings,
885
00:55:47,218 --> 00:55:49,387
explores them
ahead of special operation forces
886
00:55:49,471 --> 00:55:50,805
and infantry forces.
887
00:55:50,889 --> 00:55:52,891
We're applying Hivemind to V-BAT,
888
00:55:52,974 --> 00:55:56,811
so I think about, you know,
putting up hundreds of those teams.
889
00:55:56,895 --> 00:55:58,646
Whether it's the Taiwan Strait,
890
00:55:58,730 --> 00:56:01,608
whether it's in the Ukraine,
deterring our adversaries.
891
00:56:01,691 --> 00:56:03,360
So, pretty excited about it.
892
00:56:03,943 --> 00:56:06,196
- All right. Thank you.
- So. Thank you, General.
893
00:56:06,279 --> 00:56:07,906
[indistinct chatter]
894
00:56:07,989 --> 00:56:13,370
AI pilots should be ubiquitous,
and that should be the case by 2025, 2030.
895
00:56:14,204 --> 00:56:17,582
Its adoption will be rapid
throughout militaries across the world.
896
00:56:17,665 --> 00:56:19,167
[inaudible chatter]
897
00:56:19,250 --> 00:56:22,962
What do you do with the Romanian military,
their UAS guy?
898
00:56:23,046 --> 00:56:24,756
[soldier] High-tech in the military.
899
00:56:25,256 --> 00:56:29,803
We've spent half a billion dollars to date
on building an AI pilot.
900
00:56:29,886 --> 00:56:34,015
We will spend another billion dollars
over the next five years. And that is...
901
00:56:34,099 --> 00:56:37,769
It's a major reason why we're winning
the programs of record in the U.S.
902
00:56:37,852 --> 00:56:38,853
Nice.
903
00:56:38,937 --> 00:56:42,732
I mean, it's impressive.
You succeeded to weaponize that.
904
00:56:43,608 --> 00:56:45,944
Uh, it is... This is not weaponized yet.
905
00:56:46,027 --> 00:56:48,321
So not yet. But yes, in the future.
906
00:56:48,905 --> 00:56:52,534
Our customers think about it as a truck.
We think of it as an intelligent truck
907
00:56:52,617 --> 00:56:54,452
that can do a lot of different things.
908
00:56:54,536 --> 00:56:55,703
Thank you, buddy.
909
00:56:55,787 --> 00:56:57,956
[Brandon] I'll make sure
to follow up with you.
910
00:56:58,456 --> 00:57:00,458
[inaudible chatter]
911
00:57:01,668 --> 00:57:04,129
If you come back in 10 years,
912
00:57:04,212 --> 00:57:06,256
you'll see that, um...
913
00:57:06,339 --> 00:57:10,802
AI and autonomy
will have dominated this entire market.
914
00:57:10,885 --> 00:57:12,887
[cryptic music plays]
915
00:57:14,889 --> 00:57:19,352
[Nathan] Forces that are supported
by AI and autonomy
916
00:57:19,436 --> 00:57:25,108
will absolutely dominate,
crush, and destroy forces without.
917
00:57:26,401 --> 00:57:30,780
It'll be the equivalent
of horses going up against tanks,
918
00:57:31,781 --> 00:57:34,367
people with swords
going up against the machine gun.
919
00:57:34,909 --> 00:57:37,120
It will not even be close.
920
00:57:38,746 --> 00:57:42,750
[Brandon] It will become ubiquitous,
used at every spectrum of warfare,
921
00:57:43,585 --> 00:57:44,836
the tactical level,
922
00:57:45,336 --> 00:57:46,546
the strategic level,
923
00:57:47,172 --> 00:57:50,800
operating at speeds
that humans cannot fathom today.
924
00:57:53,094 --> 00:57:57,098
[Paul] Commanders are already overwhelmed
with too much information.
925
00:57:57,599 --> 00:58:01,019
Imagery from satellites,
and drones, and sensors.
926
00:58:02,103 --> 00:58:03,396
One of the things AI can do
927
00:58:03,480 --> 00:58:06,774
is help a commander
more rapidly understand what is occurring.
928
00:58:07,650 --> 00:58:09,903
And then, "What are the decisions
I need to make?"
929
00:58:11,237 --> 00:58:14,866
[Bob] Artificial intelligence
will take into account all the factors
930
00:58:14,949 --> 00:58:17,702
that determine the way war is fought,
931
00:58:18,286 --> 00:58:20,246
come up with strategies...
932
00:58:22,540 --> 00:58:25,668
and give recommendations
on how to win a battle.
933
00:58:25,752 --> 00:58:27,754
[cryptic music continues]
934
00:58:27,837 --> 00:58:29,589
[birds chirping]
935
00:58:36,971 --> 00:58:40,266
[man 20] We at Lockheed Martin,
like our Department of Defense customer,
936
00:58:40,350 --> 00:58:43,770
view artificial intelligence
as a key technology enabler
937
00:58:43,853 --> 00:58:45,104
for command and control.
938
00:58:47,065 --> 00:58:50,235
[analyst 1] The rate of spread
has an average of two feet per second.
939
00:58:50,318 --> 00:58:52,695
[analyst 2] This perimeter
is roughly 700 acres.
940
00:58:52,779 --> 00:58:55,782
[man 20] The fog of war is
a reality for us on the defense side,
941
00:58:57,158 --> 00:59:00,495
but it has parallels
to being in the environment
942
00:59:00,578 --> 00:59:03,206
and having to make decisions
for wildfires as well.
943
00:59:03,289 --> 00:59:06,543
[analyst 1] The Washburn fire
is just north of the city of Wawona.
944
00:59:06,626 --> 00:59:09,629
[man 20] You're having to make decisions
with imperfect data.
945
00:59:10,630 --> 00:59:14,300
And so how do we have AI help us
with that fog of war?
946
00:59:17,637 --> 00:59:19,597
Wildfires are very chaotic.
947
00:59:20,181 --> 00:59:21,266
They're very complex,
948
00:59:22,267 --> 00:59:26,354
and so we're working
to utilize artificial intelligence
949
00:59:26,437 --> 00:59:27,855
to help make decisions.
950
00:59:28,356 --> 00:59:30,942
[somber melodic music plays]
951
00:59:31,442 --> 00:59:33,987
The Cognitive Mission Manager
is a program we're building
952
00:59:34,070 --> 00:59:37,949
that takes aerial infrared video
953
00:59:38,032 --> 00:59:41,786
and then processes it
through our AI algorithms
954
00:59:41,869 --> 00:59:45,248
to be able to predict
the future state of the fire.
955
00:59:45,331 --> 00:59:47,083
[music intensifies]
956
00:59:48,042 --> 00:59:49,794
[man 21] As we move into the future,
957
00:59:49,877 --> 00:59:52,922
the Cognitive Mission Manager
will use simulation,
958
00:59:53,006 --> 00:59:57,093
running scenarios
over thousands of cycles,
959
00:59:57,176 --> 01:00:00,722
to recommend the most effective way
to deploy resources
960
01:00:00,805 --> 01:00:04,058
to suppress high-priority areas of fire.
961
01:00:04,142 --> 01:00:05,810
[tense music plays]
962
01:00:08,062 --> 01:00:12,400
It'll say, "Go perform an aerial
suppression with a Firehawk here."
963
01:00:13,568 --> 01:00:15,903
"Take ground crews that clear brush...
964
01:00:15,987 --> 01:00:16,904
[sirens wail]
965
01:00:16,988 --> 01:00:19,032
...firefighters that are hosing down,
966
01:00:19,657 --> 01:00:23,077
and deploy them
into the highest priority areas."
967
01:00:24,704 --> 01:00:26,205
[music intensifies]
968
01:00:26,289 --> 01:00:29,292
Those decisions
will be able to be generated faster
969
01:00:29,375 --> 01:00:31,044
and more efficiently.
970
01:00:35,840 --> 01:00:38,468
[Justin T.] We view AI
as uniquely allowing our humans
971
01:00:38,551 --> 01:00:42,305
to be able to keep up
with the ever-changing environment.
972
01:00:42,388 --> 01:00:43,431
[grim music plays]
973
01:00:43,514 --> 01:00:46,517
And there are
an incredible number of parallels
974
01:00:46,601 --> 01:00:50,355
to what we're used to at Lockheed Martin
on the defense side.
975
01:00:52,231 --> 01:00:55,860
[Emilia] The military is no longer
talking about just using AI
976
01:00:55,943 --> 01:01:00,323
in individual weapons systems
to make targeting and kill decisions,
977
01:01:00,865 --> 01:01:02,200
but integrating AI
978
01:01:02,283 --> 01:01:05,787
into the whole decision-making
architecture of the military.
979
01:01:05,870 --> 01:01:07,413
[beeping]
980
01:01:09,415 --> 01:01:12,960
[Bob] The Army has a big project
called Project Convergence.
981
01:01:14,545 --> 01:01:16,589
The Navy has Overmatch.
982
01:01:16,673 --> 01:01:19,592
And the Air Force has
Advanced Battle Management System.
983
01:01:19,676 --> 01:01:21,260
[beeping]
984
01:01:21,886 --> 01:01:24,138
The Department of Defense
is trying to figure out,
985
01:01:24,222 --> 01:01:26,683
"How do we put all these pieces together,
986
01:01:26,766 --> 01:01:29,268
so that we can operate
faster than our adversary
987
01:01:30,395 --> 01:01:32,355
and really gain an advantage?"
988
01:01:33,481 --> 01:01:37,527
[Stacie] An AI Battle Manager would be
like a fairly high-ranking general
989
01:01:37,610 --> 01:01:39,821
who's in charge of the battle.
990
01:01:39,904 --> 01:01:41,072
[grim music continues]
991
01:01:41,155 --> 01:01:44,158
Helping to give orders
to large numbers of forces,
992
01:01:45,118 --> 01:01:49,247
coordinating the actions
of all of the weapons that are out there,
993
01:01:49,330 --> 01:01:51,958
and doing it at a speed
that no human could keep up with.
994
01:01:52,041 --> 01:01:53,459
[music intensifies]
995
01:01:53,543 --> 01:01:55,294
[Emilia] We've spent the past 70 years
996
01:01:55,378 --> 01:01:58,798
building the most sophisticated military
on the planet,
997
01:01:58,881 --> 01:02:03,177
and now we're facing the decision
as to whether we want to cede control
998
01:02:03,261 --> 01:02:07,598
over that infrastructure to an algorithm,
to software.
999
01:02:08,141 --> 01:02:10,268
[indistinct chatter]
1000
01:02:10,351 --> 01:02:13,187
And the consequences of that decision
1001
01:02:13,271 --> 01:02:16,774
could trigger the full weight
of our military arsenals.
1002
01:02:16,858 --> 01:02:19,902
That's not one Hiroshima. That's hundreds.
1003
01:02:19,986 --> 01:02:21,404
[music crescendos]
1004
01:02:25,616 --> 01:02:27,869
This is the time that we need to act
1005
01:02:27,952 --> 01:02:33,166
because the window to actually
contain this risk is rapidly closing.
1006
01:02:33,249 --> 01:02:34,584
[melodic music plays]
1007
01:02:34,667 --> 01:02:39,338
[UN chair] This afternoon, we start
with international security challenges
1008
01:02:39,422 --> 01:02:41,132
posed by emerging technologies
1009
01:02:41,215 --> 01:02:44,343
in the area of lethal
autonomous weapons systems.
1010
01:02:44,427 --> 01:02:47,722
[Emilia] Conversations are happening
within the United Nations
1011
01:02:47,805 --> 01:02:50,016
about the threat
of lethal autonomous weapons
1012
01:02:50,600 --> 01:02:56,564
and our prohibition on systems
that use AI to select and target people.
1013
01:02:56,647 --> 01:03:00,777
Consensus amongst technologists
is clear and resounding.
1014
01:03:00,860 --> 01:03:04,030
We are opposed
to autonomous weapons that target humans.
1015
01:03:05,615 --> 01:03:08,993
[Izumi] For years, states have actually
discussed this issue
1016
01:03:09,076 --> 01:03:11,329
of lethal autonomous weapon systems.
1017
01:03:12,288 --> 01:03:16,793
This is about a common,
shared sense of security.
1018
01:03:17,752 --> 01:03:20,546
But of course, it's not easy.
1019
01:03:21,297 --> 01:03:25,968
Certain countries,
especially those military powers,
1020
01:03:26,052 --> 01:03:27,762
they want to be ahead of the curve,
1021
01:03:28,429 --> 01:03:31,599
so that they will be
ahead of their adversaries.
1022
01:03:32,600 --> 01:03:36,187
[Paul] The problem is, everyone
has to agree to get anything done.
1023
01:03:36,896 --> 01:03:39,065
There will be
at least one country that objects,
1024
01:03:39,148 --> 01:03:42,109
and certainly the United States
and Russia have both made clear
1025
01:03:42,193 --> 01:03:45,655
that they are opposed to a treaty
that would ban autonomous weapons.
1026
01:03:47,198 --> 01:03:53,246
[Emilia] When we think about the number
of people working to make AI more powerful
1027
01:03:54,747 --> 01:03:56,457
that room is very crowded.
1028
01:03:57,750 --> 01:04:02,880
When we think about the room of people
making sure that AI is safe,
1029
01:04:04,340 --> 01:04:07,552
that room's much more sparsely populated.
[chuckles]
1030
01:04:11,764 --> 01:04:13,724
But I'm also really optimistic.
1031
01:04:13,808 --> 01:04:15,560
[melodic music plays]
1032
01:04:16,060 --> 01:04:19,438
I look at something
like the Biological Weapons Convention,
1033
01:04:19,522 --> 01:04:21,774
which happened
in the middle of the Cold War...
1034
01:04:21,858 --> 01:04:22,942
[crowd cheers]
1035
01:04:23,025 --> 01:04:27,655
...despite tensions between the Soviet Union
and the United States.
1036
01:04:29,156 --> 01:04:33,536
They were able to realize
that the development of biological weapons
1037
01:04:33,619 --> 01:04:36,247
was in neither of their best interests,
1038
01:04:36,330 --> 01:04:38,749
and not in the best interests
of the world at large.
1039
01:04:38,833 --> 01:04:40,835
[music intensifies]
1040
01:04:41,335 --> 01:04:44,672
Arms race dynamics
favor speed over safety.
1041
01:04:44,755 --> 01:04:45,631
[beeping]
1042
01:04:45,715 --> 01:04:48,342
But I think
what's important to consider is,
1043
01:04:48,426 --> 01:04:52,430
at some point,
the cost of moving fast becomes too high.
1044
01:04:54,557 --> 01:04:56,142
[beeping]
1045
01:04:56,225 --> 01:04:59,270
[Sean] We can't just develop things
in isolation
1046
01:05:00,021 --> 01:05:06,611
and put them out there without any thought
of where they could go in the future.
1047
01:05:07,862 --> 01:05:10,823
We've got to prevent
that atom bomb moment.
1048
01:05:13,117 --> 01:05:14,493
[music intensifies]
1049
01:05:14,577 --> 01:05:17,705
[Brandon] The stakes in the AI race
are massive.
1050
01:05:18,289 --> 01:05:22,001
I don't think a lot of people
appreciate the global stability
1051
01:05:22,084 --> 01:05:28,174
that has been provided
by having a superior military force
1052
01:05:28,257 --> 01:05:31,135
for the past 75 years.
1053
01:05:31,218 --> 01:05:32,803
[beeping]
1054
01:05:32,887 --> 01:05:35,681
And so the United States
and our allied forces,
1055
01:05:36,557 --> 01:05:38,684
they need to outperform adversarial AI.
1056
01:05:40,436 --> 01:05:41,854
[music crescendos]
1057
01:05:43,314 --> 01:05:44,774
There is no second place in war.
1058
01:05:45,524 --> 01:05:47,485
[reporter 6] China laid out
an ambitious plan
1059
01:05:47,568 --> 01:05:49,111
to be the world leader in AI by 2030.
1060
01:05:49,195 --> 01:05:51,656
It's a race that some say America
is losing...
1061
01:05:51,739 --> 01:05:52,657
[tense music plays]
1062
01:05:52,740 --> 01:05:56,160
[official] He will accelerate
the adoption of artificial intelligence
1063
01:05:56,243 --> 01:05:59,246
to ensure
our competitive military advantage.
1064
01:05:59,330 --> 01:06:00,206
[beeping]
1065
01:06:00,289 --> 01:06:02,291
[music intensifies]
1066
01:06:02,875 --> 01:06:05,461
[Paul] We are racing forward
with this technology.
1067
01:06:05,544 --> 01:06:08,673
I think what's unclear
is how far are we going to go?
1068
01:06:09,507 --> 01:06:12,760
Do we control technology,
or does it control us?
1069
01:06:14,136 --> 01:06:16,973
[Emilia] There's really no opportunity
for do-overs.
1070
01:06:17,807 --> 01:06:20,977
Once the genie is out of the bottle,
it is out.
1071
01:06:22,019 --> 01:06:25,189
And it is very, very difficult
to put it back in.
1072
01:06:26,065 --> 01:06:27,566
[beeping]
1073
01:06:28,609 --> 01:06:31,737
[Sean] And if we don't act now,
it's too late.
1074
01:06:31,821 --> 01:06:33,030
[beeping]
1075
01:06:33,114 --> 01:06:34,657
It may already be too late.
1076
01:06:35,366 --> 01:06:36,366
[beeping]
1077
01:06:42,123 --> 01:06:43,499
[music crescendos]
1078
01:06:45,209 --> 01:06:48,212
[somber melodic music plays]
1079
01:08:53,629 --> 01:08:57,383
[harmonic tones chime]