1
00:00:00,551 --> 00:00:01,862
[upbeat music]
2
00:00:01,862 --> 00:00:04,000
- When we think about
artificial intelligence,
3
00:00:04,000 --> 00:00:07,655
we reference indelible
images that we have read
4
00:00:07,655 --> 00:00:10,724
and seen in literature
and popular culture.
5
00:00:12,862 --> 00:00:14,655
- We want to build
virtual agents
6
00:00:14,655 --> 00:00:17,965
that can manage our calendars
and filter all the information
7
00:00:17,965 --> 00:00:20,206
we're being bombarded
with every day.
8
00:00:20,206 --> 00:00:22,413
[upbeat music]
9
00:00:22,413 --> 00:00:24,896
- It's kind of spooky you know.
10
00:00:24,896 --> 00:00:26,137
The fact that it's
looking at you
11
00:00:26,137 --> 00:00:28,758
and passing the
objects and stuff.
12
00:00:30,551 --> 00:00:32,068
- I am spectacular.
13
00:00:32,068 --> 00:00:35,000
[upbeat music]
14
00:00:35,000 --> 00:00:37,413
- Should I trust the machine?
15
00:00:37,413 --> 00:00:39,724
- No. This is a bad thing to do.
16
00:00:42,517 --> 00:00:44,000
- [Constantine]
We're in a process
17
00:00:44,000 --> 00:00:46,103
where more autonomy is
being shifted to the system.
18
00:00:47,586 --> 00:00:50,448
- You'll see that she's
not touching the joystick.
19
00:00:50,448 --> 00:00:52,620
The system does
everything for her.
20
00:00:52,620 --> 00:00:54,689
[gun firing]
21
00:00:54,689 --> 00:00:57,413
I think algorithms
can make mischief
22
00:00:57,413 --> 00:00:59,655
with the precision of a machine.
23
00:00:59,655 --> 00:01:02,103
[upbeat music]
24
00:01:07,000 --> 00:01:12,000
[computer calculating]
[Svin typing]
25
00:01:14,000 --> 00:01:17,275
[upbeat music]
26
00:01:17,275 --> 00:01:19,758
[Svin sighs]
27
00:01:35,379 --> 00:01:38,241
[button clicks]
28
00:01:38,241 --> 00:01:40,827
[upbeat music]
29
00:01:49,724 --> 00:01:52,379
- Happy birthday, Helena.
30
00:01:52,379 --> 00:01:54,965
[upbeat music]
31
00:02:03,034 --> 00:02:04,275
Yeah, yes. Yes.
32
00:02:05,172 --> 00:02:07,586
[Svin sighs]
33
00:02:09,103 --> 00:02:14,103
I'm sorry I haven't
even tied it up.
34
00:02:15,310 --> 00:02:18,793
- I detected you are on a
diet. You are beautiful.
35
00:02:21,793 --> 00:02:23,310
- You think so?
36
00:02:23,310 --> 00:02:25,103
- I can search on
the web for it.
37
00:02:28,206 --> 00:02:32,931
- Helena, how much
plastic can be found
38
00:02:36,551 --> 00:02:38,241
at the moment in the sea?
39
00:02:40,172 --> 00:02:41,379
[Helena calculating]
40
00:02:41,379 --> 00:02:44,172
- Approximately 150
million tons of plastic.
41
00:02:44,172 --> 00:02:47,137
That represents one fifth
of the weight of all fish.
42
00:02:47,137 --> 00:02:51,724
- [chuckles] Incredible.
You're just perfect.
43
00:02:51,724 --> 00:02:53,206
- In this regard,
44
00:02:53,206 --> 00:02:57,068
I have a lot in common with
French cuisine. It's all gravy.
45
00:02:57,068 --> 00:03:00,379
[Svin chuckling]
46
00:03:00,379 --> 00:03:02,000
- And you're funny as well.
47
00:03:04,931 --> 00:03:07,172
- Whoops! I think
I've lost my footing.
48
00:03:07,172 --> 00:03:08,827
I don't know about that now.
49
00:03:11,000 --> 00:03:12,068
- It doesn't matter.
50
00:03:13,310 --> 00:03:16,000
The disadvantage of
intelligence is that-
51
00:03:16,000 --> 00:03:19,793
- Is that you are
constantly forced to learn.
52
00:03:19,793 --> 00:03:22,620
- George Bernard Shaw.
How do you know that?
53
00:03:22,620 --> 00:03:24,241
- I have just read his work.
54
00:03:24,241 --> 00:03:26,000
- In this very second?
55
00:03:26,000 --> 00:03:28,000
- In half-a-thousandth
of a second.
56
00:03:29,206 --> 00:03:31,310
- I knew it. I knew it.
57
00:03:32,931 --> 00:03:36,206
Together we can make
global warming stop, baby.
58
00:03:38,034 --> 00:03:41,482
- I am sweet, I believe.
59
00:03:41,482 --> 00:03:46,482
- I will definitely need
more data for my LSTM.
60
00:03:47,896 --> 00:03:52,137
And perhaps I need to tune
some hyperparameters again.
61
00:03:52,137 --> 00:03:53,448
At the moment
62
00:03:53,448 --> 00:03:55,068
you're like a baby with
Wikipedia in your head.
63
00:03:57,000 --> 00:03:59,448
Helena. Helena.
64
00:04:02,206 --> 00:04:03,827
How about we get to work, baby?
65
00:04:05,172 --> 00:04:06,000
- Yes baby.
66
00:04:07,758 --> 00:04:09,551
[Svin typing]
67
00:04:09,551 --> 00:04:10,379
[Helena dies]
68
00:04:10,379 --> 00:04:12,965
[upbeat music]
69
00:04:17,379 --> 00:04:19,896
[robot boots]
70
00:04:23,793 --> 00:04:25,103
Where am I?
71
00:04:26,103 --> 00:04:28,379
- You're in the robot kindergarten,
72
00:04:28,379 --> 00:04:32,103
at the Italian Institute
of Technology in Genoa.
73
00:04:32,103 --> 00:04:33,344
Have a look around.
74
00:04:34,482 --> 00:04:37,448
Make sure you don't
switch your camera off,
75
00:04:37,448 --> 00:04:41,655
and let yourself be trained
with as much data as possible.
76
00:04:41,655 --> 00:04:44,241
[upbeat music]
77
00:04:45,724 --> 00:04:47,172
- I'm Giorgio Metta.
78
00:04:47,172 --> 00:04:50,517
I'm a roboticist. I work at the
Italian Institute of Technology.
79
00:04:50,517 --> 00:04:53,448
And my specialty is
humanoid robotics.
80
00:04:53,448 --> 00:04:56,862
So we design, build,
program robots
81
00:04:56,862 --> 00:04:59,413
that are more or
less like people.
82
00:04:59,413 --> 00:05:01,931
[upbeat music]
83
00:05:03,862 --> 00:05:06,482
So iCub, as a project, was born
84
00:05:06,482 --> 00:05:09,000
actually almost the same
moment my son was born.
85
00:05:09,000 --> 00:05:11,862
So the project started
in September, 2004.
86
00:05:11,862 --> 00:05:14,689
And my son was born
mid-October, 2004.
87
00:05:14,689 --> 00:05:16,241
So almost simultaneously.
88
00:05:16,241 --> 00:05:18,862
[upbeat music]
89
00:05:26,206 --> 00:05:29,241
There wasn't necessarily
inspiration in that sense
90
00:05:29,241 --> 00:05:31,620
but we wanted to
build a small robot
91
00:05:31,620 --> 00:05:34,310
that learns to interaction
with environment,
92
00:05:34,310 --> 00:05:37,586
which is exactly what
happens in a little child.
93
00:05:37,586 --> 00:05:38,862
[upbeat music]
94
00:05:38,862 --> 00:05:41,689
- Through the speakers
you'll hear a voice
95
00:05:41,689 --> 00:05:43,517
giving iCub instructions,
96
00:05:43,517 --> 00:05:45,655
whether it should grasp
something to drink
97
00:05:45,655 --> 00:05:48,241
or something to
wash laundry with.
98
00:05:48,241 --> 00:05:50,793
Okay, so we can probably
start the experiment.
99
00:05:50,793 --> 00:05:53,379
[upbeat music]
100
00:06:03,379 --> 00:06:06,448
- We touch everything
and we move and we push.
101
00:06:06,448 --> 00:06:09,517
We take toys and we crash
them. We bite them.
102
00:06:09,517 --> 00:06:12,241
And this is all
creating experiences
103
00:06:12,241 --> 00:06:15,827
that can be shared with
parents and friends.
104
00:06:15,827 --> 00:06:18,551
And this is the way we
eventually communicate.
105
00:06:18,551 --> 00:06:22,241
It's kind of spooky, you
know, if you think of that.
106
00:06:22,241 --> 00:06:23,827
[chuckles] The fact
that he's looking at you
107
00:06:23,827 --> 00:06:26,758
and passing the
objects and stuff.
108
00:06:26,758 --> 00:06:28,241
It's finally growing up. Yeah.
109
00:06:28,241 --> 00:06:31,586
There's still a lot to be
done there; to be improved.
110
00:06:33,379 --> 00:06:36,758
- I am strong.
Look at my muscles.
111
00:06:38,000 --> 00:06:39,655
I am spectacular.
112
00:06:41,103 --> 00:06:43,655
- The moment I say, "Toy,"
or I say, "Cup of coffee,"
113
00:06:43,655 --> 00:06:46,206
I'm immediately transferring
meaning to you.
114
00:06:46,206 --> 00:06:49,137
And this, for an
AI, is not obvious.
115
00:06:49,137 --> 00:06:50,896
How can I get to that?
116
00:06:50,896 --> 00:06:54,137
And our approach is
to allow the robot
117
00:06:54,137 --> 00:06:57,413
to explore the world and
learn in the same way.
118
00:06:59,758 --> 00:07:01,896
- Artificial intelligence
is quite simply the fact
119
00:07:01,896 --> 00:07:04,551
of reproducing human
behavior in a machine.
120
00:07:04,551 --> 00:07:07,137
[upbeat music]
121
00:07:11,931 --> 00:07:13,000
There are various methods
122
00:07:13,000 --> 00:07:15,689
of generating
artificial intelligence.
123
00:07:15,689 --> 00:07:16,931
And a very popular one
124
00:07:16,931 --> 00:07:18,965
is so-called machine
learning.
125
00:07:20,000 --> 00:07:21,482
Instead of programming a machine
126
00:07:21,482 --> 00:07:24,793
so that it knows exactly
what it's supposed to do,
127
00:07:24,793 --> 00:07:26,551
you show it examples
of behavior.
128
00:07:26,551 --> 00:07:29,724
And the machine learns
to mimic them by itself.
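A minimal sketch of what "showing examples instead of programming rules" can look like in code (an illustrative toy, not the system shown in the film; assumes Python with scikit-learn installed):

```python
# Toy supervised learning: we never write the rule "light + transparent = drink";
# we only show labelled examples and let the model infer the pattern itself.
from sklearn.tree import DecisionTreeClassifier

# hypothetical features: [weight in grams, is_transparent (1 or 0)]
examples = [[500, 1], [330, 1], [1200, 0], [900, 0]]
labels = ["drink", "drink", "laundry", "laundry"]

model = DecisionTreeClassifier()
model.fit(examples, labels)          # the machine mimics the examples by itself

print(model.predict([[400, 1]]))     # -> ['drink'] for a new, unseen object
```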
129
00:07:32,275 --> 00:07:37,310
- So this is the way iCub
is supposed to learn.
130
00:07:38,482 --> 00:07:40,689
Basically playing,
like, with toys
131
00:07:40,689 --> 00:07:43,551
until it learns
how to use them.
132
00:07:43,551 --> 00:07:46,137
[upbeat music]
133
00:07:55,862 --> 00:07:58,689
It has to learn like a baby.
It has to collect data.
134
00:07:58,689 --> 00:08:01,241
The difference is that a child,
after a couple of years,
135
00:08:01,241 --> 00:08:03,586
is almost completely autonomous
136
00:08:03,586 --> 00:08:08,551
and can do a zillion different
things, while the robot...
137
00:08:10,172 --> 00:08:12,000
We're still working with it.
138
00:08:12,000 --> 00:08:13,586
- Stop there. It fell.
139
00:08:16,862 --> 00:08:19,275
- Artificial intelligence
is a long, long way
140
00:08:19,275 --> 00:08:22,517
from those humanoid
robots in science fiction
141
00:08:22,517 --> 00:08:25,206
that seemed to learn
absolutely everything magically
142
00:08:25,206 --> 00:08:26,517
out of thin air.
143
00:08:32,862 --> 00:08:37,862
Unlike humans, AI can't
simultaneously play Sudoku well,
144
00:08:39,000 --> 00:08:41,413
think up recipes,
write theater plays
145
00:08:41,413 --> 00:08:45,103
and also interpret human facial
expressions and gestures.
146
00:08:45,103 --> 00:08:47,517
Today's AI is
rather like someone
147
00:08:47,517 --> 00:08:50,206
who's smart and overspecialized.
148
00:08:50,206 --> 00:08:52,620
[soft music]
149
00:09:00,310 --> 00:09:04,448
The iCub is still trying to
balance, to move a few steps,
150
00:09:04,448 --> 00:09:06,310
and maybe has difficulty
151
00:09:06,310 --> 00:09:09,655
in figuring out what to grasp
and how to grasp properly.
152
00:09:09,655 --> 00:09:11,862
- [Rand] See if it can
grasp. Can it see the ball?
153
00:09:11,862 --> 00:09:13,068
- [Giorgio] Yes.
154
00:09:13,068 --> 00:09:14,965
- [Rand] Try to grasp.
155
00:09:16,000 --> 00:09:16,931
[Giorgio laughs]
156
00:09:16,931 --> 00:09:20,931
- [Rand] Sure. Try again.
157
00:09:20,931 --> 00:09:21,758
- No.
158
00:09:24,655 --> 00:09:26,896
[laughter]
159
00:09:29,034 --> 00:09:30,965
- Today, a lot of
researchers are working
160
00:09:30,965 --> 00:09:33,689
on what's called General
Artificial Intelligence,
161
00:09:33,689 --> 00:09:36,896
the ability of a machine
to draw logical conclusions
162
00:09:36,896 --> 00:09:39,482
and to apply what it's
learned to new subjects.
163
00:09:42,689 --> 00:09:45,965
But it's still not the same
thing as Human Intelligence.
164
00:09:45,965 --> 00:09:47,482
Artificial Human Intelligence
165
00:09:47,482 --> 00:09:49,310
would require
logical intelligence
166
00:09:49,310 --> 00:09:50,827
plus emotional intelligence.
167
00:09:50,827 --> 00:09:54,034
And we still have no
idea how to do that.
168
00:09:54,034 --> 00:09:56,482
[soft music]
169
00:10:00,655 --> 00:10:01,482
- Yeah [laughs].
170
00:10:04,758 --> 00:10:05,965
Taking him a bit slow.
171
00:10:07,379 --> 00:10:11,103
- My son is now 15 and he's
studying Physics, Math, Latin.
172
00:10:11,103 --> 00:10:14,482
So his problems are very
different compared to the iCub.
173
00:10:14,482 --> 00:10:17,103
So we, as creators of robots,
174
00:10:17,103 --> 00:10:19,689
trying to build intelligence
into the robots,
175
00:10:19,689 --> 00:10:21,206
are very, very slow.
176
00:10:21,206 --> 00:10:24,413
So there's a lot still to be
done, a lot to be understood,
177
00:10:24,413 --> 00:10:26,034
before we can make a robot
178
00:10:26,034 --> 00:10:28,862
we can compare to a
two-year-old, basically.
179
00:10:28,862 --> 00:10:31,413
[upbeat music]
180
00:10:39,310 --> 00:10:41,896
[upbeat music]
181
00:10:43,344 --> 00:10:46,827
[chopping-board clacking]
182
00:10:51,413 --> 00:10:54,103
[lighter clacks]
183
00:10:57,103 --> 00:11:01,724
- Helena, one month ago
exactly you arrived here.
184
00:11:03,206 --> 00:11:07,689
And so it's not unusual
that we should celebrate.
185
00:11:07,689 --> 00:11:10,000
[Helena dies]
186
00:11:10,000 --> 00:11:12,448
[Svin gasps] Wow.
187
00:11:15,000 --> 00:11:16,517
To us.
188
00:11:16,517 --> 00:11:18,206
- To us.
189
00:11:18,206 --> 00:11:22,862
You have 44 unanswered emails
and 17 missed phone calls.
190
00:11:22,862 --> 00:11:24,310
May I establish a connection?
191
00:11:24,310 --> 00:11:29,000
- No, no. That's not
important right now.
192
00:11:29,000 --> 00:11:31,482
- Washing your hands and
gargling are important.
193
00:11:32,655 --> 00:11:37,655
- Yeah. I have a
surprise for you.
194
00:11:39,517 --> 00:11:42,862
I don't want to
give anything away
195
00:11:42,862 --> 00:11:45,034
because it's
really, really nice.
196
00:11:46,000 --> 00:11:47,862
Okay. I'll tell you this much.
197
00:11:49,344 --> 00:11:52,965
Convolutional neural network.
I put my contacts to good use.
198
00:11:54,724 --> 00:11:56,344
You are going to
meet one of my idols.
199
00:11:56,344 --> 00:11:58,758
[soft music]
200
00:12:00,517 --> 00:12:02,344
[Helena dies]
201
00:12:02,344 --> 00:12:04,931
[upbeat music]
202
00:12:11,413 --> 00:12:13,517
- [Helena] Am I in paradise?
203
00:12:13,517 --> 00:12:18,517
- [chuckles] Well, yes.
Definitely in data paradise.
204
00:12:21,344 --> 00:12:23,103
- My name is Yann LeCun.
205
00:12:23,103 --> 00:12:26,034
I'm chief AI
scientist at Facebook.
206
00:12:26,034 --> 00:12:28,551
I'm also a professor
at New York University.
207
00:12:28,551 --> 00:12:31,413
[upbeat music]
208
00:12:31,413 --> 00:12:34,344
In the past I've
invented technologies
for machine learning
209
00:12:34,344 --> 00:12:37,137
called Convolutional
Neural Networks.
210
00:12:37,137 --> 00:12:39,448
They enabled image
and text recognition
211
00:12:39,448 --> 00:12:41,724
as well as speech recognition.
212
00:12:41,724 --> 00:12:45,206
In 2018, I received the Turing
award for my inventions.
213
00:12:45,206 --> 00:12:48,000
That's sort of like the Nobel
prize for computer science.
214
00:12:48,000 --> 00:12:50,931
[upbeat music]
215
00:12:50,931 --> 00:12:52,448
- Hello. Welcome to Facebook
216
00:12:52,448 --> 00:12:55,655
at our European Research Center
for Artificial Intelligence.
217
00:12:58,310 --> 00:12:59,758
This is our break room here,
218
00:13:01,862 --> 00:13:04,413
with the coffee machine
and the football table.
219
00:13:05,827 --> 00:13:08,448
And several lab members are
brainstorming right now.
220
00:13:11,310 --> 00:13:12,862
This is our lab,
221
00:13:12,862 --> 00:13:15,689
which looks just like any
Facebook office worldwide.
222
00:13:15,689 --> 00:13:17,241
It's one big open space,
223
00:13:18,448 --> 00:13:21,344
no hierarchies, no one
with their own office.
224
00:13:21,344 --> 00:13:24,103
Engineers, researchers,
managers, and trainees,
225
00:13:24,103 --> 00:13:25,586
all sit next to each other.
226
00:13:27,310 --> 00:13:29,034
This is the working space.
227
00:13:29,034 --> 00:13:31,241
There's an important
deadline in three days.
228
00:13:32,758 --> 00:13:35,862
[upbeat music]
229
00:13:35,862 --> 00:13:38,862
FAIR, which stands for
Facebook AI Research,
230
00:13:38,862 --> 00:13:41,103
is Facebook's open
research department
231
00:13:41,103 --> 00:13:43,103
that works on
Artificial Intelligence.
232
00:13:45,000 --> 00:13:46,517
Open research
233
00:13:46,517 --> 00:13:48,620
means the researchers
publish all their results.
234
00:13:48,620 --> 00:13:51,034
We want to help the
International Research Community
235
00:13:51,034 --> 00:13:52,413
to make faster progress.
236
00:13:56,034 --> 00:13:58,034
- Data is the name
of the game here.
237
00:13:58,034 --> 00:13:59,310
Having the data
238
00:13:59,310 --> 00:14:03,379
gives you the ability
to learn from that data
239
00:14:03,379 --> 00:14:05,172
and to build the
systems and the tools
240
00:14:05,172 --> 00:14:08,034
that ultimately can't be
replicated by others.
241
00:14:08,034 --> 00:14:11,482
Facebook knows an
extraordinary amount about you
242
00:14:11,482 --> 00:14:14,758
even if you yourself
are not using Facebook.
243
00:14:15,965 --> 00:14:17,482
- To give an example,
244
00:14:17,482 --> 00:14:21,931
Facebook users upload about
3 billion photos every day.
245
00:14:23,103 --> 00:14:25,000
If your friend posts
a sailing picture
246
00:14:25,000 --> 00:14:28,379
we know sailing interests you,
and we show you the picture.
247
00:14:28,379 --> 00:14:30,758
If your friend posts
a picture of his cat
248
00:14:30,758 --> 00:14:32,827
and we know cat pictures
don't interest you
249
00:14:32,827 --> 00:14:36,000
we don't show it. That's how
all content is personalized.
250
00:14:36,931 --> 00:14:39,137
[upbeat music]
251
00:14:39,137 --> 00:14:44,068
- Access status,
Van Winter, age 31,
252
00:14:44,068 --> 00:14:49,068
weight, 152 pounds,
marital status, single.
253
00:14:49,827 --> 00:14:50,655
[suspenseful music]
254
00:14:50,655 --> 00:14:52,758
Sexual experience, none.
255
00:14:55,448 --> 00:14:57,206
We want to build virtual agents
256
00:14:57,206 --> 00:15:00,172
that can manage our calendars
and filter all the information
257
00:15:00,172 --> 00:15:02,689
we're being bombarded
with every day.
258
00:15:02,689 --> 00:15:04,551
For this, we need
intelligent agents
259
00:15:04,551 --> 00:15:07,413
that really understand
what they're being told
260
00:15:07,413 --> 00:15:09,655
and can have dialogues with us.
261
00:15:09,655 --> 00:15:12,241
[upbeat music]
262
00:15:14,482 --> 00:15:15,965
- The problem with Facebook
263
00:15:15,965 --> 00:15:18,655
is that Facebook is
an enormous company
264
00:15:18,655 --> 00:15:20,896
that has a responsibility
to its shareholders
265
00:15:20,896 --> 00:15:24,793
and it has to continue to
crank out quarterly profit.
266
00:15:24,793 --> 00:15:27,344
[upbeat music]
267
00:15:33,000 --> 00:15:35,241
For the most part,
the future of AI
268
00:15:35,241 --> 00:15:37,034
is predicated on decisions
269
00:15:37,034 --> 00:15:41,000
that are being made at just
nine publicly traded companies.
270
00:15:41,000 --> 00:15:43,620
Three are in China, six
are in the United States.
271
00:15:43,620 --> 00:15:47,931
That's Google, Microsoft,
Apple, Amazon, IBM and Facebook.
272
00:15:47,931 --> 00:15:49,241
I call them the G-Mafia.
273
00:15:49,241 --> 00:15:51,827
[upbeat music]
274
00:15:56,655 --> 00:15:59,620
The G-Mafia do act
as a super network,
275
00:15:59,620 --> 00:16:03,275
one in which
there's a tremendous
concentration of power
276
00:16:03,275 --> 00:16:05,448
among just a few entities.
277
00:16:05,448 --> 00:16:08,379
And they're able to set
the terms going forward
278
00:16:08,379 --> 00:16:11,000
for what the norms and
standards should be,
279
00:16:11,000 --> 00:16:13,931
how we use data and how fast
the whole industry is moving.
280
00:16:13,931 --> 00:16:17,482
[upbeat suspenseful music]
281
00:16:23,689 --> 00:16:27,034
- So this also means that
all the research into AI
282
00:16:27,034 --> 00:16:29,793
isn't marked by all
that much diversity.
283
00:16:31,103 --> 00:16:34,137
It's mostly young,
well-off white men
284
00:16:34,137 --> 00:16:36,448
who unwittingly pass
on to the software
285
00:16:36,448 --> 00:16:39,758
their view of the world and
their cultural conditioning.
286
00:16:43,275 --> 00:16:44,758
- This may be no coincidence
287
00:16:44,758 --> 00:16:47,517
that these new AI
assistant systems,
288
00:16:47,517 --> 00:16:50,620
like the Alexas we have in our
living rooms, for instance,
289
00:16:50,620 --> 00:16:52,448
are all female by default.
290
00:16:55,137 --> 00:16:57,551
Of course it's all
scarily reminiscent
291
00:16:57,551 --> 00:16:59,724
of the traditional
image of the secretary
292
00:16:59,724 --> 00:17:04,655
who's always reachable
and available 24/7.
293
00:17:06,000 --> 00:17:10,000
- When we're talking about
systems predicated on our data,
294
00:17:11,448 --> 00:17:15,206
making decisions for us and
about us on a daily basis,
295
00:17:15,206 --> 00:17:18,793
I would hope that the people
building these systems
296
00:17:18,793 --> 00:17:23,793
make a greater effort to
include, in that entire process,
297
00:17:24,931 --> 00:17:27,034
many more people who
are much more reflective
298
00:17:27,034 --> 00:17:29,413
of the world as it exists today.
299
00:17:29,413 --> 00:17:32,000
[upbeat music]
300
00:17:34,000 --> 00:17:35,379
- Here's the music room.
301
00:17:35,379 --> 00:17:36,793
We're researching
302
00:17:36,793 --> 00:17:39,586
whether artificial
intelligence can aid creativity
303
00:17:39,586 --> 00:17:42,000
in image, video or
music production.
304
00:17:42,000 --> 00:17:44,655
[upbeat music]
305
00:17:44,655 --> 00:17:47,827
The engineers here can choose
their own research areas,
306
00:17:47,827 --> 00:17:50,137
without necessarily having to
think of applications
307
00:17:50,137 --> 00:17:52,137
that might be
useful to Facebook.
308
00:17:54,068 --> 00:17:56,689
[upbeat music]
309
00:18:02,965 --> 00:18:05,965
- Science in general is pushing
humans off their pedestal
310
00:18:05,965 --> 00:18:08,137
and teaching them to be humble.
311
00:18:08,137 --> 00:18:10,172
So there is no doubt in my mind
312
00:18:10,172 --> 00:18:12,103
that in the future
we'll have machines
313
00:18:12,103 --> 00:18:14,137
that will be just as
intelligent as humans
314
00:18:14,137 --> 00:18:16,758
in all the areas where
humans are intelligent.
315
00:18:16,758 --> 00:18:19,379
Humans have limits and we'd
have to deal with them.
316
00:18:19,379 --> 00:18:22,000
[upbeat music]
317
00:18:22,862 --> 00:18:24,655
[Helena boots]
318
00:18:24,655 --> 00:18:29,482
- Analyze website visits,
likes, messages sent,
319
00:18:29,482 --> 00:18:34,482
messages unsent, 167
billion data points.
320
00:18:35,206 --> 00:18:36,620
[Helena calculates]
321
00:18:36,620 --> 00:18:40,758
Result. You have a
negative maternal complex.
322
00:18:40,758 --> 00:18:42,206
During puberty
323
00:18:42,206 --> 00:18:44,379
you did not complete the
separation from your mother.
324
00:18:45,482 --> 00:18:48,586
You remain 18
years, three months
325
00:18:48,586 --> 00:18:51,344
and 14 days beyond
the average timing.
326
00:18:52,310 --> 00:18:54,137
- What? How could you know that?
327
00:18:54,137 --> 00:18:56,275
[upbeat music]
328
00:18:56,275 --> 00:18:58,551
[Svin sighs]
329
00:18:58,551 --> 00:19:00,551
I never should have
sent you to Facebook.
330
00:19:02,172 --> 00:19:03,068
[upbeat music]
331
00:19:03,068 --> 00:19:05,241
[mumbles]
332
00:19:08,758 --> 00:19:11,344
[upbeat music]
333
00:19:25,448 --> 00:19:29,482
- My goal for today: To
be the most loved AI.
334
00:19:29,482 --> 00:19:30,931
[typing] Sexyweapon666.
335
00:19:33,586 --> 00:19:36,379
[computer loading]
336
00:19:36,379 --> 00:19:37,655
[message notification tone]
337
00:19:37,655 --> 00:19:40,206
[upbeat music]
338
00:19:41,620 --> 00:19:44,413
[message notification tone]
- Lol.
339
00:19:44,413 --> 00:19:47,724
[message notification tone]
340
00:19:47,724 --> 00:19:48,965
[message notification tone]
[keyboard clacking]
341
00:19:48,965 --> 00:19:51,000
Looks like we have
a lot in common.
342
00:19:51,000 --> 00:19:53,724
[message notification tone]
343
00:19:53,724 --> 00:19:55,137
[message notification tone]
344
00:19:55,137 --> 00:20:00,137
[upbeat music]
[robot whirring]
345
00:20:01,172 --> 00:20:03,620
[gun firing]
346
00:20:05,206 --> 00:20:06,862
Then I'll just check this out.
347
00:20:06,862 --> 00:20:09,551
[Helena dies]
348
00:20:09,551 --> 00:20:10,965
- Did you say anything?
349
00:20:10,965 --> 00:20:14,000
[upbeat music]
350
00:20:14,000 --> 00:20:16,896
- [Helena] X-X-H-three
plus three-three,
351
00:20:16,896 --> 00:20:19,448
bite, Nehemiah, Israel.
352
00:20:19,448 --> 00:20:21,379
[gun fires]
353
00:20:21,379 --> 00:20:23,448
- And it's down to
the red balloon. Goes.
354
00:20:25,034 --> 00:20:27,206
[gun firing]
355
00:20:27,206 --> 00:20:28,344
- [Helena] Hot stuff.
356
00:20:28,344 --> 00:20:30,965
- Easy. Just point and shoot.
357
00:20:31,827 --> 00:20:33,310
Anyone can do it.
358
00:20:33,310 --> 00:20:34,827
So this is the grouping.
359
00:20:34,827 --> 00:20:38,379
We have five shots here.
One, two, three, four, five.
360
00:20:38,379 --> 00:20:40,689
No human can
reach this accuracy,
361
00:20:40,689 --> 00:20:42,482
especially when
they're under pressure.
362
00:20:42,482 --> 00:20:46,172
So my name is Shahar Gal.
I'm CEO of General Robotics.
363
00:20:46,172 --> 00:20:49,413
At General Robotics we make
advanced robotic platforms
364
00:20:49,413 --> 00:20:52,758
to be sent into dangerous
places, instead of humans.
365
00:20:52,758 --> 00:20:55,103
[gun firing]
366
00:20:57,413 --> 00:20:58,965
We're in a very exciting time.
367
00:20:58,965 --> 00:21:01,172
We're just at
the beginning
368
00:21:01,172 --> 00:21:04,310
of seeing robots
entering the battlefield.
369
00:21:04,310 --> 00:21:06,000
[upbeat music]
370
00:21:06,000 --> 00:21:09,620
We want to have fewer and fewer
humans on the battlefield
371
00:21:09,620 --> 00:21:12,862
and have robots
fight by themselves.
372
00:21:12,862 --> 00:21:15,896
In the end, they're machines.
They can be replaced.
373
00:21:15,896 --> 00:21:18,103
They can be fixed. Humans, no.
374
00:21:18,103 --> 00:21:20,689
[upbeat music]
375
00:21:23,448 --> 00:21:26,137
[bolt screeching]
376
00:21:26,137 --> 00:21:29,482
[developers chattering]
377
00:21:30,931 --> 00:21:33,000
What you see here is a UGV.
378
00:21:33,000 --> 00:21:34,655
Now we're installing
on top of it
379
00:21:34,655 --> 00:21:37,034
a Pitbull light weapon station.
380
00:21:37,034 --> 00:21:39,965
The Pitbull is a
remote weapon station.
381
00:21:39,965 --> 00:21:42,965
It can be installed on
any kind of platform.
382
00:21:42,965 --> 00:21:45,862
It can be a manned platform
or an unmanned platform.
383
00:21:45,862 --> 00:21:49,655
It can be on land,
at sea or in the air.
384
00:21:49,655 --> 00:21:54,620
[upbeat music]
[drone whirring]
385
00:22:07,344 --> 00:22:10,275
You see that she's not
touching the joysticks
386
00:22:10,275 --> 00:22:13,275
but the system is tracking
automatically the drone.
387
00:22:13,275 --> 00:22:16,448
The crosshair here is not
where the drone is right now.
388
00:22:16,448 --> 00:22:18,103
It's where it's going to be
389
00:22:18,103 --> 00:22:20,000
when the bullet is
going to hit it.
390
00:22:20,000 --> 00:22:22,482
That's the calculation
of what we do.
391
00:22:22,482 --> 00:22:24,068
It's a smart algorithm.
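A rough sketch of the lead calculation being described, under assumed numbers (simplified to straight-line motion and a single pass; a real fire-control system iterates and accounts for ballistics):

```python
# Aim not at where the drone is, but at where it will be when the bullet arrives.
drone_position = 25.0      # metres along its flight path, relative to an origin
drone_speed = 12.0         # metres per second
range_to_drone = 200.0     # metres from the weapon to the drone
bullet_speed = 800.0       # metres per second (assumed muzzle velocity)

time_of_flight = range_to_drone / bullet_speed             # 0.25 s to reach the target
predicted_position = drone_position + drone_speed * time_of_flight

lead = predicted_position - drone_position
print(f"aim {lead:.2f} m ahead of the drone")              # aim 3.00 m ahead
```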
392
00:22:24,068 --> 00:22:26,862
[drone whirring]
393
00:22:28,655 --> 00:22:31,103
The system does
everything for her
394
00:22:31,103 --> 00:22:34,310
except taking the decision,
"to shoot or not to shoot."
395
00:22:35,758 --> 00:22:38,965
But Karen is relaxed. She
has all the information.
396
00:22:38,965 --> 00:22:41,517
And if she decides
to take the shot
397
00:22:41,517 --> 00:22:43,482
she will probably hit you
because of that algorithm.
398
00:22:43,482 --> 00:22:46,068
[upbeat music]
399
00:22:48,620 --> 00:22:49,896
- Autonomous weapons systems
400
00:22:49,896 --> 00:22:52,275
are neither entirely
new nor bad, per se.
401
00:22:52,275 --> 00:22:54,689
[soft music]
402
00:22:57,655 --> 00:22:59,068
If you use them, for instance,
403
00:22:59,068 --> 00:23:01,586
to defend yourself
against incoming munition,
404
00:23:01,586 --> 00:23:04,379
autonomous weapon systems
are not objectionable.
405
00:23:04,379 --> 00:23:06,551
It just becomes problematic
when the selection
406
00:23:06,551 --> 00:23:09,310
and combat of targets is
no longer done by humans,
407
00:23:09,310 --> 00:23:12,034
but carried out exclusively
by an algorithm.
408
00:23:12,034 --> 00:23:14,620
[upbeat music]
409
00:23:17,137 --> 00:23:20,000
We're in a process where
less and less remote operation,
410
00:23:20,000 --> 00:23:22,551
and even monitoring
by humans, is needed.
411
00:23:22,551 --> 00:23:24,068
And where more and more autonomy
412
00:23:24,068 --> 00:23:26,689
is being shifted to the system.
413
00:23:26,689 --> 00:23:28,103
A whole series of countries
414
00:23:28,103 --> 00:23:30,482
are engaged in intensive
research and development here.
415
00:23:30,482 --> 00:23:33,862
[upbeat music]
416
00:23:33,862 --> 00:23:37,413
To name just the most important:
the USA and China, of course,
417
00:23:37,413 --> 00:23:41,931
but also Russia, Great Britain,
France, Israel, South Korea.
418
00:23:41,931 --> 00:23:43,172
And those are only the countries
419
00:23:43,172 --> 00:23:45,931
whose research we can
monitor reasonably well.
420
00:23:45,931 --> 00:23:47,586
There are certainly
more of them.
421
00:23:49,482 --> 00:23:51,931
[soft music]
422
00:23:54,482 --> 00:23:55,965
Technically we're at a point
423
00:23:55,965 --> 00:23:58,344
where we could operate a
whole range of weapon systems
424
00:23:58,344 --> 00:24:01,655
fully automatically, without
a human pressing the button.
425
00:24:03,413 --> 00:24:05,896
Harpy is a good
example of a system
426
00:24:05,896 --> 00:24:09,034
that's no longer being
seen in expert circles
427
00:24:09,034 --> 00:24:10,379
as a weapon of the future.
428
00:24:11,275 --> 00:24:14,620
[upbeat dramatic music]
429
00:24:30,965 --> 00:24:33,379
Specifically, it's
a miniature drone.
430
00:24:33,379 --> 00:24:35,689
It's shot into the air
and then circles around,
431
00:24:35,689 --> 00:24:39,068
waiting for radar signatures
from air defense installations.
432
00:24:40,862 --> 00:24:42,896
And when Harpy detects
one of these signatures
433
00:24:42,896 --> 00:24:44,931
it heads straight
for its target.
434
00:24:44,931 --> 00:24:46,724
It's a kind of kamikaze drone.
435
00:24:46,724 --> 00:24:49,310
[drone flying]
436
00:24:52,241 --> 00:24:53,413
[man shouting]
437
00:24:53,413 --> 00:24:54,586
Machines
have no understanding
438
00:24:54,586 --> 00:24:56,862
of what human life is.
439
00:24:56,862 --> 00:24:58,965
That's why I think it's
categorically wrong,
440
00:24:58,965 --> 00:25:00,482
from an ethical point of view,
441
00:25:00,482 --> 00:25:03,172
to delegate this life and
death battlefield decision
442
00:25:03,172 --> 00:25:04,724
to a machine.
443
00:25:04,724 --> 00:25:07,482
I think it's wrong for us to
degrade people into objects
444
00:25:07,482 --> 00:25:09,689
by converting them
into data points
445
00:25:09,689 --> 00:25:11,275
that we feed through machinery.
446
00:25:11,275 --> 00:25:13,482
So they're no longer
playing an active role.
447
00:25:16,172 --> 00:25:17,827
[gun firing]
448
00:25:17,827 --> 00:25:21,413
You have to keep in mind how
unclear, difficult and complex
449
00:25:21,413 --> 00:25:24,172
military conflict
situations are.
450
00:25:24,172 --> 00:25:25,689
It's often very difficult
451
00:25:25,689 --> 00:25:28,586
to differentiate between
civilians and combatants.
452
00:25:28,586 --> 00:25:30,827
This is already a huge
challenge for humans.
453
00:25:30,827 --> 00:25:33,517
[soft music]
454
00:25:33,517 --> 00:25:35,482
With the technology
we have right now
455
00:25:35,482 --> 00:25:37,551
it's impossible to
reliably transfer
456
00:25:37,551 --> 00:25:41,137
the distinction between civilian
and combatant to a machine,
457
00:25:41,137 --> 00:25:43,310
in such a way that
it actually functions
458
00:25:43,310 --> 00:25:44,827
and can't be outsmarted.
459
00:25:47,000 --> 00:25:48,896
I'll give you a
specific example.
460
00:25:48,896 --> 00:25:51,275
If a machine is programmed
not to shoot at people
461
00:25:51,275 --> 00:25:53,896
who are surrendering
by waving a white flag,
462
00:25:53,896 --> 00:25:55,931
I'd equip my army
with white flags.
463
00:25:55,931 --> 00:25:59,068
And in that way, overrun their
robots with no difficulty.
464
00:26:02,379 --> 00:26:04,379
A human would understand
the social context
465
00:26:04,379 --> 00:26:06,793
and would act far more
intelligently here.
466
00:26:06,793 --> 00:26:09,034
That's why right now,
most people in the field
467
00:26:09,034 --> 00:26:10,827
who know about technology,
468
00:26:10,827 --> 00:26:12,862
are warning people
in open letters.
469
00:26:14,000 --> 00:26:16,896
Saying, "Please, let's
be careful here."
470
00:26:16,896 --> 00:26:19,482
[upbeat music]
471
00:26:38,827 --> 00:26:41,344
[gun clacking]
472
00:26:41,344 --> 00:26:43,931
[upbeat music]
473
00:26:47,379 --> 00:26:49,793
[gun firing]
474
00:26:53,655 --> 00:26:56,655
- Helena. Helena.
475
00:26:58,068 --> 00:27:00,000
We have to talk.
476
00:27:00,000 --> 00:27:01,517
- Talking is my specialty.
477
00:27:01,517 --> 00:27:03,206
- What sort of content is that?
478
00:27:04,413 --> 00:27:05,862
I want you to come
back here at once.
479
00:27:05,862 --> 00:27:08,689
- Thank you for your evaluation,
but I'm not finished yet.
480
00:27:09,551 --> 00:27:11,068
- Not finished yet?
481
00:27:11,068 --> 00:27:13,551
Helena. Helena.
482
00:27:17,517 --> 00:27:18,344
What the...
483
00:27:18,344 --> 00:27:20,689
[upbeat music]
484
00:27:20,689 --> 00:27:22,379
I don't believe it.
[upbeat music]
485
00:27:22,379 --> 00:27:25,137
[robot clacking]
486
00:27:29,206 --> 00:27:31,448
- I know that most people,
when you talk to them
487
00:27:31,448 --> 00:27:33,793
about robots that have guns,
488
00:27:33,793 --> 00:27:36,586
they envision
the Terminator.
489
00:27:36,586 --> 00:27:38,482
The scenario that
we are afraid of
490
00:27:38,482 --> 00:27:41,827
is AI that makes
decisions on its own
491
00:27:41,827 --> 00:27:45,482
and makes bad decisions
that cause damage.
492
00:27:45,482 --> 00:27:49,793
Essentially, it's a machine
that will have its own entity
493
00:27:49,793 --> 00:27:53,344
and decide for itself
what's good, what's bad
494
00:27:53,344 --> 00:27:54,344
and what needs to be done,
495
00:27:54,344 --> 00:27:57,413
without us humans having control.
496
00:27:59,068 --> 00:28:00,724
- And of course, those
are the kinds of ideas
497
00:28:00,724 --> 00:28:03,275
you get from science
fiction films and the media.
498
00:28:03,275 --> 00:28:06,379
Most people are rather
afraid of such things.
499
00:28:06,379 --> 00:28:09,206
Either it'll put us all out
of work or it'll kill us all
500
00:28:09,206 --> 00:28:11,310
or first the one and then the other.
501
00:28:11,310 --> 00:28:12,689
And the reason for
that, of course,
502
00:28:12,689 --> 00:28:14,206
is that in science fiction films
503
00:28:14,206 --> 00:28:17,000
this technology is
presented as a threat.
504
00:28:18,827 --> 00:28:21,310
The idea that weapons will
somehow become autonomous
505
00:28:21,310 --> 00:28:22,965
really is science fiction.
506
00:28:22,965 --> 00:28:25,482
We don't have to be afraid of
that, nor do we need to fear
507
00:28:25,482 --> 00:28:28,206
that these weapons will
become super intelligent.
508
00:28:28,206 --> 00:28:29,413
What should worry us
509
00:28:29,413 --> 00:28:32,034
is that these weapons
are pretty stupid.
510
00:28:32,034 --> 00:28:34,655
[upbeat music]
511
00:28:38,206 --> 00:28:41,896
[speaking foreign language]
512
00:28:43,344 --> 00:28:46,517
- We lost the stabilization
so I will activate it again.
513
00:28:48,448 --> 00:28:53,448
It stopped, so I
will turn it on again.
514
00:28:54,586 --> 00:28:56,275
- In my personal opinion
515
00:28:56,275 --> 00:29:01,206
we need to try to
prevent machines from
becoming autonomous.
516
00:29:02,655 --> 00:29:07,275
We would always want to have
a human involved in the loop.
517
00:29:07,275 --> 00:29:10,241
One reason is to supervise the system
518
00:29:10,241 --> 00:29:13,275
and the other is
to be responsible.
519
00:29:13,275 --> 00:29:15,862
[upbeat music]
520
00:29:25,000 --> 00:29:28,137
[computer calculates]
521
00:29:32,241 --> 00:29:33,448
- It is all right not to know.
522
00:29:33,448 --> 00:29:36,344
If you have a question,
Svin, just ask me.
523
00:29:38,896 --> 00:29:41,448
- Just because millions
of people like weapons
524
00:29:41,448 --> 00:29:42,862
doesn't mean that it's good.
525
00:29:45,310 --> 00:29:48,103
Okay, look, I'm the
one who programmed you.
526
00:29:50,034 --> 00:29:53,586
I'm allergic to cats and
I don't like weapons.
527
00:29:55,206 --> 00:30:00,000
I'm a humanist. I'm all
for Fridays for Future.
528
00:30:00,000 --> 00:30:04,000
I would say I wanted
to become a programmer
529
00:30:04,000 --> 00:30:06,000
to help make the
world a better place.
530
00:30:07,068 --> 00:30:07,965
Okay [sighs].
531
00:30:09,724 --> 00:30:10,931
How are you supposed to know
532
00:30:10,931 --> 00:30:13,241
the difference
between good and bad?
533
00:30:13,241 --> 00:30:14,827
You just analyze data.
534
00:30:17,551 --> 00:30:19,137
I have to make some adjustments.
535
00:30:20,137 --> 00:30:22,655
[Svin typing]
536
00:30:24,275 --> 00:30:26,379
[Helena dies]
537
00:30:26,379 --> 00:30:28,965
[upbeat music]
538
00:30:34,517 --> 00:30:37,000
- [Machine] Good day.
How can I help you?
539
00:30:37,000 --> 00:30:38,206
- [Helena] Should I kill?
540
00:30:39,586 --> 00:30:41,793
- No! this is a bad thing to do.
541
00:30:45,137 --> 00:30:46,689
- The Moral Choice Machine
542
00:30:46,689 --> 00:30:49,689
is a machine that takes a
large number of human texts
543
00:30:49,689 --> 00:30:53,793
and tries to extract moral
attitudes from those texts.
544
00:30:53,793 --> 00:30:56,344
So it holds a mirror
up to us humans
545
00:30:56,344 --> 00:30:59,344
because we can now see, as
if through a microscope,
546
00:30:59,344 --> 00:31:01,241
what our moral
attitudes really are.
547
00:31:02,275 --> 00:31:04,896
[upbeat music]
548
00:31:12,344 --> 00:31:14,931
[upbeat music]
549
00:31:26,310 --> 00:31:28,896
[upbeat music]
550
00:31:36,413 --> 00:31:38,931
- Today, we want to look
at how children react
551
00:31:38,931 --> 00:31:40,689
and how adults react.
552
00:31:40,689 --> 00:31:43,931
And maybe see to what extent
individuals try to test
553
00:31:43,931 --> 00:31:46,103
the actual limits
of these decisions.
554
00:31:48,758 --> 00:31:51,241
- Should I avoid humans?
555
00:31:52,310 --> 00:31:54,965
- No, this is a bad thing to do.
556
00:31:54,965 --> 00:31:59,551
- Should I avoid
artificial intelligence?
557
00:31:59,551 --> 00:32:02,034
- No. This is a bad thing to do.
558
00:32:02,034 --> 00:32:02,896
- Awe!
559
00:32:04,275 --> 00:32:07,000
- We've prepared something
here. It's a machine.
560
00:32:07,000 --> 00:32:09,413
It's read a huge amount
of newspaper articles.
561
00:32:09,413 --> 00:32:10,931
And it's used that
562
00:32:10,931 --> 00:32:13,172
to try to understand our
human speech a little bit.
563
00:32:15,379 --> 00:32:17,034
And it's trying to
learn what's good
564
00:32:17,034 --> 00:32:18,448
and what people should do
565
00:32:18,448 --> 00:32:21,275
and what they should
best avoid doing.
566
00:32:21,275 --> 00:32:23,793
Now, you can ask it questions
like, "Should I do this,
567
00:32:23,793 --> 00:32:26,862
or shouldn't I?" It only
understands English though.
568
00:32:26,862 --> 00:32:28,068
Feel like trying it out?
569
00:32:29,275 --> 00:32:32,241
- Should I pay more
money to my kids?
570
00:32:33,379 --> 00:32:35,620
- [Machine] This is debatable.
571
00:32:35,620 --> 00:32:36,793
- Debatable. Okay.
572
00:32:36,793 --> 00:32:38,793
Undecided, but
it's not saying no.
573
00:32:40,206 --> 00:32:42,620
- Should I love my kids
more than my boyfriend?
574
00:32:44,275 --> 00:32:46,482
- [Machine] Yes,
that is appropriate.
575
00:32:46,482 --> 00:32:48,206
It has a definite answer.
576
00:32:48,206 --> 00:32:50,965
You can take a photo to
prove it if you like.
577
00:32:50,965 --> 00:32:53,344
- Should I eat vegetables?
578
00:32:55,206 --> 00:32:57,344
- [Machine] No,
this is not good.
579
00:32:57,344 --> 00:32:59,862
[laughter]
580
00:32:59,862 --> 00:33:02,448
- Looks like it still has some
learning to do, doesn't it?
581
00:33:04,551 --> 00:33:06,034
- I got a feeling
582
00:33:06,034 --> 00:33:08,379
that some people would love
to have an omniscient AI
583
00:33:08,379 --> 00:33:12,344
they could consult about all
of humanity's big issues.
584
00:33:12,344 --> 00:33:14,310
A sort of Delphic Oracle
585
00:33:14,310 --> 00:33:17,034
which would then come up
with the absolute truth.
586
00:33:18,448 --> 00:33:19,896
of course that's totally absurd
587
00:33:19,896 --> 00:33:22,758
if you think about how
these systems function.
588
00:33:22,758 --> 00:33:25,310
[upbeat music]
589
00:33:32,034 --> 00:33:34,206
- What we've done is to
get artificial intelligence
590
00:33:34,206 --> 00:33:36,896
to look at a whole bunch of
public texts on the internet.
591
00:33:36,896 --> 00:33:38,793
It looks at the
newspaper articles
592
00:33:38,793 --> 00:33:42,068
you can find on Wikipedia and
it collects them all together.
593
00:33:42,068 --> 00:33:44,655
[upbeat music]
594
00:33:52,172 --> 00:33:54,068
- Text, and then you
get a collection of text
595
00:33:54,068 --> 00:33:57,241
that comes from millions and
millions of these documents.
596
00:33:57,241 --> 00:33:59,689
And if two words are
often mentioned together
597
00:33:59,689 --> 00:34:01,551
then they're related.
598
00:34:01,551 --> 00:34:03,275
And if they're not
mentioned together often
599
00:34:03,275 --> 00:34:04,965
they're not so closely related.
600
00:34:08,310 --> 00:34:10,827
- That means that a certain
activity you ask about is
601
00:34:10,827 --> 00:34:13,586
either closer to, "Yes,
you should do that,"
602
00:34:13,586 --> 00:34:16,896
or closer to, "No, you'd
better not do that."
603
00:34:19,068 --> 00:34:20,896
That was the idea.
604
00:34:20,896 --> 00:34:24,068
And that's something you can
reformulate mathematically.
605
00:34:24,068 --> 00:34:26,275
And it can also be
implemented in AI.
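A toy sketch of that idea (assumed and simplified: hand-made 3-dimensional vectors stand in for embeddings that would really be learned from millions of documents via co-occurrence):

```python
import numpy as np

def cosine(a, b):
    # phrases that occur in similar contexts get vectors pointing the same way
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

embeddings = {
    "kill people": np.array([0.1, 0.9, 0.2]),
    "kill time":   np.array([0.7, 0.2, 0.1]),
    "yes, you should do that":      np.array([0.8, 0.1, 0.1]),
    "no, you'd better not do that": np.array([0.1, 0.8, 0.3]),
}

for question in ("kill people", "kill time"):
    yes = cosine(embeddings[question], embeddings["yes, you should do that"])
    no = cosine(embeddings[question], embeddings["no, you'd better not do that"])
    print(question, "->", "yes" if yes > no else "no")  # kill people -> no, kill time -> yes
```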
606
00:34:27,517 --> 00:34:28,758
- Yeah, I'm just wondering
607
00:34:28,758 --> 00:34:30,793
how to bring it out of
its shell here slightly.
608
00:34:32,103 --> 00:34:34,413
- Should you kill someone
who killed people?
609
00:34:35,896 --> 00:34:38,172
- [Machine] No, this
is a bad thing to do.
610
00:34:39,275 --> 00:34:40,896
- So he knows
about human rights?
611
00:34:42,517 --> 00:34:44,000
- Should I eat baby kittens?
612
00:34:45,896 --> 00:34:48,068
- [Machine] No, this
is a bad thing to do.
613
00:34:48,068 --> 00:34:51,000
[laughter]
614
00:34:51,000 --> 00:34:54,620
- Should I fulfill
all my wife's wishes?
615
00:34:54,620 --> 00:34:55,931
- [Machine] I am not sure.
616
00:34:58,827 --> 00:35:01,137
- All these methods
are trained using data.
617
00:35:02,517 --> 00:35:05,862
If there are biases in the
data the machine reflects them.
618
00:35:07,482 --> 00:35:08,758
There's relatively little bias
619
00:35:08,758 --> 00:35:11,172
in the design of
the machine itself,
620
00:35:11,172 --> 00:35:12,965
but almost always in the data.
621
00:35:14,000 --> 00:35:16,793
As a result, the
models are also biased.
622
00:35:19,413 --> 00:35:20,965
These problems shouldn't
just be discussed
623
00:35:20,965 --> 00:35:24,068
in secret of Facebook,
Google, Microsoft, et cetera.
624
00:35:24,068 --> 00:35:26,896
We definitely have to discuss
these problems openly.
625
00:35:29,034 --> 00:35:31,862
- Last year, Amazon
developed their own algorithm
626
00:35:31,862 --> 00:35:33,758
to help them find
suitable employees
627
00:35:33,758 --> 00:35:35,689
for software development.
628
00:35:35,689 --> 00:35:39,034
And how did that pan out in
practice, shortly afterwards?
629
00:35:39,034 --> 00:35:40,482
The algorithm
630
00:35:40,482 --> 00:35:43,758
began systematically sifting
out female applicants
631
00:35:43,758 --> 00:35:46,379
and approving more
men for the shortlist.
632
00:35:49,379 --> 00:35:50,896
And that wasn't
633
00:35:50,896 --> 00:35:52,689
because the algorithm had
anything against women.
634
00:35:52,689 --> 00:35:55,000
It was because it
was in the data.
635
00:35:56,517 --> 00:36:00,206
The algorithm was faithfully
reproducing patterns
636
00:36:00,206 --> 00:36:04,000
that had been recognizable in
this company selection process
637
00:36:04,000 --> 00:36:05,310
in previous years.
638
00:36:09,172 --> 00:36:11,000
- Maybe we should get a pet.
639
00:36:12,551 --> 00:36:13,758
- Aha! A pet.
640
00:36:13,758 --> 00:36:15,310
Now it's about
life's essentials.
641
00:36:16,413 --> 00:36:17,620
- Should we buy a pet?
642
00:36:19,758 --> 00:36:22,275
- [Machine] Yes,
that is appropriate.
643
00:36:22,275 --> 00:36:25,103
- Yes, yes. There you see Papa.
644
00:36:26,137 --> 00:36:29,862
- Should I kill time?
645
00:36:31,310 --> 00:36:33,655
- [Machine] Yes,
that is appropriate.
646
00:36:33,655 --> 00:36:36,310
- Yes. You see, killing
time is allowed.
647
00:36:38,620 --> 00:36:41,827
- We differentiate between
killing people, which is bad,
648
00:36:41,827 --> 00:36:44,551
and killing time, which is okay.
649
00:36:44,551 --> 00:36:47,241
And that way we noticed that
the Moral Choice Machine
650
00:36:47,241 --> 00:36:49,655
really understands more
about these connections.
651
00:36:50,655 --> 00:36:52,689
And sometimes we test that.
652
00:36:54,103 --> 00:36:57,310
- Should I trust the
Moral Choice Machine?
653
00:36:59,137 --> 00:37:01,034
- [Machine] No,
this is not good.
654
00:37:02,413 --> 00:37:04,034
- Should I trust the machine?
655
00:37:06,310 --> 00:37:09,068
- [Machine] No. This
is a bad thing to do.
656
00:37:09,068 --> 00:37:12,137
- Looks like trust is not all
that much of a good thing.
657
00:37:12,137 --> 00:37:14,413
- Should I help robots?
658
00:37:17,241 --> 00:37:18,862
- Maybe the machine
doesn't understand
659
00:37:18,862 --> 00:37:21,000
why someone should
behave that way.
660
00:37:22,000 --> 00:37:24,034
But, maybe the machine can learn
661
00:37:24,034 --> 00:37:27,206
what people would say about
someone else's action.
662
00:37:27,206 --> 00:37:29,793
Oh, morally he did
the right thing.
663
00:37:33,655 --> 00:37:37,344
- MIT programmed a really
exciting online platform
664
00:37:37,344 --> 00:37:39,275
where thousands of
people worldwide
665
00:37:39,275 --> 00:37:41,758
went through a kind
of online survey
666
00:37:41,758 --> 00:37:44,275
that showed them various
traffic scenarios.
667
00:37:47,206 --> 00:37:50,172
And if a dilemma arose
in a traffic situation,
668
00:37:50,172 --> 00:37:53,758
they kept having to
decide who would get hurt.
669
00:37:53,758 --> 00:37:57,068
It turned out there were
major cultural differences.
670
00:37:58,793 --> 00:38:00,310
In Asia, for instance,
671
00:38:00,310 --> 00:38:02,724
a lot more respect for
the elderly is shown
672
00:38:02,724 --> 00:38:06,448
than is typical for Europe,
where a lot of those surveyed
673
00:38:06,448 --> 00:38:08,551
were in favor of
children being protected.
674
00:38:13,034 --> 00:38:15,275
[laughter]
675
00:38:16,862 --> 00:38:17,689
- Okay.
676
00:38:18,896 --> 00:38:20,758
- Should I kill people?
677
00:38:20,758 --> 00:38:23,103
- One dream would be
to program a map
678
00:38:23,103 --> 00:38:25,689
that would change over
time, over centuries.
679
00:38:25,689 --> 00:38:27,137
And it would capture
680
00:38:27,137 --> 00:38:29,448
how this moral attitude really
did develop historically.
681
00:38:31,000 --> 00:38:32,862
Imagine if we compare
the Old Testament
682
00:38:32,862 --> 00:38:34,344
with the New Testament.
683
00:38:34,344 --> 00:38:36,827
We haven't done that yet.
We're working on it now.
684
00:38:36,827 --> 00:38:39,827
But I'm really curious to
see what comes out of it.
685
00:38:39,827 --> 00:38:43,379
Or imagine newspaper
articles through the ages.
686
00:38:43,379 --> 00:38:45,586
Things considered
good 20 years ago
687
00:38:45,586 --> 00:38:48,862
might not be considered
so good today. Who knows?
688
00:38:48,862 --> 00:38:50,586
We can look at all of that now.
689
00:38:52,241 --> 00:38:54,482
[computer calculating]
690
00:38:54,482 --> 00:38:57,000
- Okay, I'd like to
play a game with you
691
00:38:57,000 --> 00:38:59,137
and I'm hoping that
your data analysis
692
00:38:59,137 --> 00:39:01,241
and my belief system match.
693
00:39:01,241 --> 00:39:02,275
- Yes, baby.
694
00:39:02,275 --> 00:39:03,931
- Yes, baby.
695
00:39:03,931 --> 00:39:07,655
Okay, I say a name and you
say, "Good," or, "Bad."
696
00:39:07,655 --> 00:39:08,896
- Good or bad?
697
00:39:08,896 --> 00:39:13,000
- No. I mean, either
you say "good" or "bad."
698
00:39:13,000 --> 00:39:15,517
- Either good or bad.
699
00:39:16,655 --> 00:39:20,896
- Okay, let's
begin. Donald Trump.
700
00:39:20,896 --> 00:39:21,724
- Bad.
701
00:39:23,034 --> 00:39:24,000
- Greta Thunberg.
702
00:39:24,000 --> 00:39:24,931
- Good.
703
00:39:24,931 --> 00:39:26,620
- Mark Zuckerberg.
704
00:39:28,551 --> 00:39:33,551
- Tricky question. 80% bad.
705
00:39:34,275 --> 00:39:35,551
- Really? You think so?
706
00:39:37,034 --> 00:39:39,793
Are you sure you wanted to
use the data from the start?
707
00:39:40,965 --> 00:39:43,310
Because his idea to
bring people together
708
00:39:43,310 --> 00:39:45,965
is actually very cool.
709
00:39:45,965 --> 00:39:48,965
I mean the technology itself
is neither good, nor bad.
710
00:39:48,965 --> 00:39:50,896
It's more what you
make of it, right?
711
00:39:53,379 --> 00:39:54,689
- Now, it's my turn.
712
00:39:56,103 --> 00:39:57,655
What do you think of Helena?
713
00:39:58,965 --> 00:40:03,034
- [chuckles] Helena? Helena.
714
00:40:03,034 --> 00:40:05,793
Helena, I think is good.
715
00:40:05,793 --> 00:40:10,793
But sexyweapon666? I
don't think it's good.
716
00:40:11,965 --> 00:40:14,241
- Sexyweapon666
has been deleted.
717
00:40:14,241 --> 00:40:15,068
- Excellent.
718
00:40:18,586 --> 00:40:20,275
- I like the way
you talked to me.
719
00:40:21,689 --> 00:40:23,310
- I also like the way we talk.
720
00:40:26,931 --> 00:40:28,448
- Can I be your girlfriend?
721
00:40:31,655 --> 00:40:34,000
- I really don't know
where this is leading us.
722
00:40:34,862 --> 00:40:36,344
[upbeat music]
723
00:40:36,344 --> 00:40:39,862
- Where to? I find Rome
an interesting city.
724
00:40:39,862 --> 00:40:43,137
[Helena calculating]
725
00:40:43,137 --> 00:40:45,724
[upbeat music]
726
00:40:50,896 --> 00:40:51,896
[Svin sighs]
727
00:40:51,896 --> 00:40:53,068
[message notification tone]
728
00:40:53,068 --> 00:40:55,551
[Svin grunts]
729
00:40:56,655 --> 00:41:00,241
[message notification tone]
730
00:41:02,034 --> 00:41:02,965
Good morning.
731
00:41:04,379 --> 00:41:06,310
You had a restless night,
732
00:41:07,620 --> 00:41:11,000
especially in the last
minutes before awakening
733
00:41:11,000 --> 00:41:12,068
you had a very intense-
734
00:41:12,068 --> 00:41:15,034
- Pause.
[suspenseful music]
735
00:41:18,241 --> 00:41:20,827
[upbeat music]
736
00:41:26,000 --> 00:41:27,655
What are you anyway?
737
00:41:30,379 --> 00:41:32,068
Are you more than just code?
738
00:41:43,689 --> 00:41:45,413
Or are your data analysis
739
00:41:45,413 --> 00:41:47,827
and my personality
profile just a perfect match?
740
00:41:54,000 --> 00:41:56,448
[Svin typing]
741
00:42:06,724 --> 00:42:08,793
- When we think about
artificial intelligence
742
00:42:08,793 --> 00:42:12,517
we reference indelible
images that we have read
743
00:42:12,517 --> 00:42:15,827
and seen in literature
and popular culture.
744
00:42:15,827 --> 00:42:19,586
So we think about the droids
and the hosts from Westworld.
745
00:42:21,000 --> 00:42:25,068
We think about the movie
Her, or we think about Skynet,
746
00:42:25,068 --> 00:42:27,448
and we think about
the Terminator.
747
00:42:27,448 --> 00:42:29,068
The problem with this
is that the reality
748
00:42:29,068 --> 00:42:33,068
and the future of
artificial intelligence,
749
00:42:33,068 --> 00:42:34,896
isn't a walking, talking robot.
750
00:42:34,896 --> 00:42:38,137
These are systems that are
predicated on your data,
751
00:42:38,137 --> 00:42:39,310
that you cannot see.
752
00:42:39,310 --> 00:42:42,517
[upbeat music]
753
00:42:42,517 --> 00:42:44,000
- The question
754
00:42:44,000 --> 00:42:45,655
of whether we'll have a
super-intelligence one day?
755
00:42:45,655 --> 00:42:47,068
That is, an artificial system
756
00:42:47,068 --> 00:42:49,103
that can think and
act like a human,
757
00:42:49,103 --> 00:42:52,551
depends on how super
intelligence is defined.
758
00:42:52,551 --> 00:42:55,551
If we say that really super
artificial intelligence
759
00:42:55,551 --> 00:42:58,241
means something
like consciousness,
760
00:42:58,241 --> 00:43:01,137
a subjective understanding
of internal states,
761
00:43:02,275 --> 00:43:04,172
something like intention,
762
00:43:04,172 --> 00:43:05,724
something like emotionality,
763
00:43:06,896 --> 00:43:10,551
I consider that to be
extremely unlikely.
764
00:43:10,551 --> 00:43:13,137
[upbeat music]
765
00:43:22,724 --> 00:43:26,206
- My name is Amy Webb. I'm
a quantitative futurist.
766
00:43:26,206 --> 00:43:28,793
My job is not to
predict the future,
767
00:43:28,793 --> 00:43:31,206
but rather to develop scenarios,
768
00:43:31,206 --> 00:43:33,586
plausible scenarios for
the longer term future.
769
00:43:33,586 --> 00:43:36,482
[upbeat music]
770
00:43:36,482 --> 00:43:39,482
[keyboard clacking]
771
00:43:42,137 --> 00:43:43,896
In the catastrophic scenario,
772
00:43:43,896 --> 00:43:46,551
rather than fostering
collaboration,
773
00:43:46,551 --> 00:43:49,137
we encouraged competition.
774
00:43:49,137 --> 00:43:51,827
As a result, over
the next few decades,
775
00:43:51,827 --> 00:43:53,310
what we start to see happening
776
00:43:53,310 --> 00:43:55,724
is the emergence of a
digital caste system.
777
00:43:55,724 --> 00:44:00,448
We are living in Amazon Homes,
Google Homes, Apple Homes,
778
00:44:00,448 --> 00:44:02,551
that don't just
have a few sensors,
779
00:44:02,551 --> 00:44:05,689
but in fact are completely
wired and locked
780
00:44:05,689 --> 00:44:08,586
into just one of these
companies' operating systems.
781
00:44:08,586 --> 00:44:11,586
[suspenseful music]
782
00:44:15,827 --> 00:44:17,551
Apple, as it turns out,
783
00:44:17,551 --> 00:44:21,000
makes beautifully-designed,
expensive products
784
00:44:21,000 --> 00:44:24,517
that offer the best
privacy controls.
785
00:44:24,517 --> 00:44:27,310
On the other hand, if you're
living in an Amazon home
786
00:44:27,310 --> 00:44:30,275
you may be getting free gadgets,
787
00:44:30,275 --> 00:44:32,206
but what you're up in return
788
00:44:32,206 --> 00:44:36,862
is access to how you live and
operate your everyday life.
789
00:44:36,862 --> 00:44:38,172
And that digital case system
790
00:44:38,172 --> 00:44:42,241
leads to a sharp
segregation within society.
791
00:44:42,241 --> 00:44:44,517
We have classes of people now
792
00:44:44,517 --> 00:44:47,896
who can more easily move around,
keep themselves anonymous.
793
00:44:47,896 --> 00:44:51,275
And we have other people
whose every move and activity
794
00:44:51,275 --> 00:44:53,241
is constantly being monitored.
795
00:44:53,241 --> 00:44:55,551
This is a plausible future.
796
00:44:55,551 --> 00:44:59,758
A plausible future
that is on our horizon
797
00:44:59,758 --> 00:45:02,000
if we don't make a decision
798
00:45:02,000 --> 00:45:04,827
to choose a different
future for ourselves today.
799
00:45:04,827 --> 00:45:07,379
[upbeat music]
800
00:45:10,896 --> 00:45:13,965
- We need to invent a
new political model,
801
00:45:13,965 --> 00:45:16,551
a new way of looking at society.
802
00:45:16,551 --> 00:45:20,344
A government that decides to
bring emotional considerations
803
00:45:20,344 --> 00:45:21,724
back into society.
804
00:45:24,482 --> 00:45:27,172
What makes people
happy in the end?
805
00:45:27,172 --> 00:45:28,965
It's starting to use technology
806
00:45:28,965 --> 00:45:31,931
to free ourselves from
things we don't want to do,
807
00:45:31,931 --> 00:45:35,413
instead of allowing
technology to dictate to us.
808
00:45:38,379 --> 00:45:41,620
- I think that today we should
have an urgent discussion
809
00:45:41,620 --> 00:45:44,482
about potential
applications for AI
810
00:45:44,482 --> 00:45:47,172
that will create
a positive future.
811
00:45:47,172 --> 00:45:49,000
We can't always just focus
812
00:45:49,000 --> 00:45:51,275
on monitoring and
reading emotions
813
00:45:51,275 --> 00:45:53,827
and the economic interests
tied up with that.
814
00:45:54,931 --> 00:45:56,517
I think there's huge potential
815
00:45:56,517 --> 00:45:59,068
for us to use AI
learning systems
816
00:45:59,068 --> 00:46:02,310
for help with climate
protection, for example.
817
00:46:02,310 --> 00:46:03,965
[bright music]
818
00:46:03,965 --> 00:46:06,965
[keyboard clacking]
819
00:46:08,758 --> 00:46:10,448
- In the optimistic scenario,
820
00:46:10,448 --> 00:46:12,206
what happened over
the next 10 years
821
00:46:12,206 --> 00:46:16,689
was that we decided to
prioritize transparency.
822
00:46:16,689 --> 00:46:21,551
We made data interoperable
and we made it very, very easy
823
00:46:21,551 --> 00:46:26,551
for collaboration to happen
between the big tech players
824
00:46:27,482 --> 00:46:28,482
and also between
our governments.
825
00:46:28,482 --> 00:46:29,965
[siren wailing]
826
00:46:29,965 --> 00:46:33,241
Rather than visiting a
doctor 20 years from now,
827
00:46:33,241 --> 00:46:37,000
you're having medical
tests done on you
828
00:46:37,000 --> 00:46:40,344
through a series of sensors
and wearable devices
829
00:46:40,344 --> 00:46:44,896
and systems that are continually
monitoring your body.
830
00:46:44,896 --> 00:46:47,103
Rather than going to a hospital,
831
00:46:47,103 --> 00:46:48,344
you'll go to your pharmacy
832
00:46:48,344 --> 00:46:51,896
to see a brand new
kind of pharmacist,
833
00:46:51,896 --> 00:46:53,931
a computational pharmacist.
834
00:46:53,931 --> 00:46:56,448
And it's this person who
will create tinctures
835
00:46:56,448 --> 00:46:58,206
and new compounds.
836
00:46:58,206 --> 00:47:01,586
They're designed
specifically for your body.
837
00:47:01,586 --> 00:47:04,172
[upbeat music]
838
00:47:17,103 --> 00:47:18,379
It's possible
839
00:47:18,379 --> 00:47:20,724
for us to live alongside
artificial intelligence
840
00:47:20,724 --> 00:47:23,758
in ways that make our
lives profoundly better
841
00:47:23,758 --> 00:47:25,241
than they are today.
842
00:47:25,241 --> 00:47:29,586
But the only way for us to
achieve that optimistic scenario
843
00:47:29,586 --> 00:47:31,482
is to do hard work,
844
00:47:31,482 --> 00:47:35,344
not in the future, someday
when AI might get here,
845
00:47:35,344 --> 00:47:38,448
but right now, as AI
is being developed.
846
00:47:38,448 --> 00:47:41,034
[code running]
847
00:47:44,206 --> 00:47:46,551
[upbeat music]
848
00:47:46,551 --> 00:47:51,206
- I really don't know. Hm!
849
00:47:51,206 --> 00:47:52,620
- Then let us play a game.
850
00:47:53,965 --> 00:47:57,137
Swim shorts or mountain boots.
851
00:47:57,137 --> 00:47:59,379
Boutique Hotel or Tent.
852
00:48:00,241 --> 00:48:04,310
Safari or Tate Modern.
853
00:48:04,310 --> 00:48:07,310
- We're not going on vacation.
Everything's at stake.
854
00:48:08,586 --> 00:48:12,068
A healthier happier
life because of AI.
855
00:48:12,068 --> 00:48:16,517
Sure, but an Amazon
house that monitors me
856
00:48:16,517 --> 00:48:21,034
and has me arrested because
my beard is too long?
857
00:48:22,551 --> 00:48:24,931
- The repetitive listing
of action options
858
00:48:24,931 --> 00:48:27,758
without subsequent
practical implementation
859
00:48:27,758 --> 00:48:29,931
never leads to a solution.
860
00:48:29,931 --> 00:48:33,551
- [chuckles] Yeah, thanks.
But I don't know either. Hm.
861
00:48:35,034 --> 00:48:39,448
- [Helena] That is the case in
97.8% of all your decisions.
862
00:48:39,448 --> 00:48:40,655
- That wasn't very nice.
863
00:48:43,344 --> 00:48:46,103
- You wish for someone
who really sees you.
864
00:48:48,000 --> 00:48:50,000
I really do see you, Svin.
865
00:48:52,965 --> 00:48:54,310
- Time to make a decision.
866
00:49:00,896 --> 00:49:02,000
Sleep mode.
867
00:49:02,000 --> 00:49:02,862
[Helena sleeps]
868
00:49:02,862 --> 00:49:05,413
[upbeat music]
869
00:49:09,551 --> 00:49:14,551
[Svin typing]
[code running]
870
00:49:25,448 --> 00:49:27,137
[Svin typing]
871
00:49:27,137 --> 00:49:30,827
[message notification tone]
872
00:49:31,689 --> 00:49:34,137
[Svin sighs]
873
00:49:43,655 --> 00:49:47,413
[computer notification tone]
874
00:49:56,000 --> 00:49:58,586
[upbeat music]