1
00:00:02,000 --> 00:00:06,280
100 years ago, most lifts
were driven by trained operators.
2
00:00:06,280 --> 00:00:08,640
The technology was there
to replace them,
3
00:00:08,640 --> 00:00:10,800
but people just didn't feel
comfortable
4
00:00:10,800 --> 00:00:12,800
with the idea of automation.
5
00:00:12,800 --> 00:00:14,320
Level one, please.
6
00:00:18,360 --> 00:00:24,280
And then lift designers made some
small but ground-breaking changes.
7
00:00:28,640 --> 00:00:31,320
TANNOY: First floor,
Christmas lectures.
8
00:00:32,400 --> 00:00:35,400
Add that to a stop button
and some relaxing music
9
00:00:35,400 --> 00:00:39,360
and suddenly trust
in automated lifts soared.
10
00:00:39,360 --> 00:00:41,080
And here we are today.
11
00:00:41,080 --> 00:00:43,160
But what about today?
12
00:00:43,160 --> 00:00:45,760
Should we trust the machines
that surround us
13
00:00:45,760 --> 00:00:48,320
or are we right to be cautious?
14
00:01:13,800 --> 00:01:16,320
CHEERING AND APPLAUSE
15
00:01:31,840 --> 00:01:33,840
Welcome to the Christmas Lectures.
16
00:01:33,840 --> 00:01:36,320
I'm Dr Hannah Fry
and tonight we're going to ask
17
00:01:36,320 --> 00:01:38,400
whether we should trust the maths.
18
00:01:38,400 --> 00:01:42,360
Just how far should we be going
with our mathematical skills?
19
00:01:42,360 --> 00:01:45,640
And let's demonstrate those
mathematical skills first off
20
00:01:45,640 --> 00:01:49,880
because we are joined by Scott
Hamlin and your BMX bike. Hello.
21
00:01:49,880 --> 00:01:53,840
And a rather carefully placed ramp
just here.
22
00:01:53,840 --> 00:01:57,080
Very careful indeed, to make sure
it's absolutely amazing.
23
00:01:57,080 --> 00:02:00,600
And you've been in since 8:00am
this morning calculating exactly
24
00:02:00,600 --> 00:02:03,760
where this ramp should be,
the shape of the ramp, everything,
25
00:02:03,760 --> 00:02:06,840
because it's quite a short space
we've got here. Yes, absolutely.
26
00:02:06,840 --> 00:02:09,760
We need to make sure we can get
enough velocity to give us lift
27
00:02:09,760 --> 00:02:12,680
off the ramp and I need to use
my personal calculations...
28
00:02:12,680 --> 00:02:15,360
..OK, instincts, to be able
to perform my stunts
29
00:02:15,360 --> 00:02:18,360
and stop before we crash
into the wall.
30
00:02:18,360 --> 00:02:21,400
And when you say stunts, Scott,
what are you going to do?
31
00:02:21,400 --> 00:02:24,880
Well, it depends. It might be a
backflip if people want to see one.
32
00:02:24,880 --> 00:02:26,960
Do you want to see a backflip?
ALL: Yes!
33
00:02:26,960 --> 00:02:29,120
Yeah, they want to see a backflip.
34
00:02:29,120 --> 00:02:32,240
All right, Scott, you ready to
give this a go? You guys ready?
35
00:02:32,240 --> 00:02:34,560
ALL: Yes.
Right, here we go.
36
00:02:34,560 --> 00:02:37,320
Come on, then. We'll give you
a countdown, Scott,
37
00:02:37,320 --> 00:02:39,320
when you're in place. Cool.
38
00:02:42,800 --> 00:02:44,320
Here we go.
39
00:02:47,360 --> 00:02:49,920
And then I'm going to get
really far out of the way.
40
00:02:49,920 --> 00:02:51,280
LAUGHTER
41
00:02:51,280 --> 00:02:53,320
OK, whenever you're happy.
42
00:02:53,320 --> 00:02:55,400
Right, are we ready? Yeah.
43
00:02:55,400 --> 00:02:58,200
You happy, Scott?
I didn't hear the kids.
44
00:02:59,240 --> 00:03:01,880
Are you guys ready for this?
ALL: Yes!
45
00:03:01,880 --> 00:03:05,320
All right, we're going to give you
a countdown, Scott. Go for it.
46
00:03:05,320 --> 00:03:06,760
Five...
47
00:03:06,760 --> 00:03:11,400
ALL: ..four, three, two, one, go!
48
00:03:13,040 --> 00:03:14,760
Wow!
49
00:03:14,760 --> 00:03:17,080
APPLAUSE
50
00:03:20,640 --> 00:03:22,360
Woo!
51
00:03:24,080 --> 00:03:25,440
I'm alive.
52
00:03:29,720 --> 00:03:31,560
Well done, Scott.
53
00:03:31,560 --> 00:03:35,640
That was some tight calculations
going on. Yes, it certainly was.
54
00:03:35,640 --> 00:03:38,760
I'm glad it paid off. Thanks to
you guys for making the noise.
55
00:03:38,760 --> 00:03:42,080
Well, no, thank you to you.
Scott Hamlin, thank you very much.
56
00:03:42,080 --> 00:03:44,520
CHEERING AND APPLAUSE
57
00:03:47,880 --> 00:03:51,040
Now, OK, we all know that maths
is amazing at this kind of stuff.
58
00:03:51,040 --> 00:03:53,280
It's out there in physics
and engineering.
59
00:03:53,280 --> 00:03:56,600
It's doing a brilliant job, just as
long as you do your sums correctly.
60
00:03:56,600 --> 00:03:58,960
But I want to tell you a little
story about a bridge
61
00:03:58,960 --> 00:04:02,640
that I think demonstrates how it's
quite a lot easier said than done,
62
00:04:02,640 --> 00:04:06,000
because this here, this is
the Millennium Bridge in London.
63
00:04:06,000 --> 00:04:08,640
It had its grand opening
in the year 2000.
64
00:04:08,640 --> 00:04:11,560
But you might also know
this bridge by its nickname.
65
00:04:11,560 --> 00:04:14,400
This is known as the Wobbly Bridge
66
00:04:14,400 --> 00:04:17,880
because of something that happened
very soon after it opened.
67
00:04:17,880 --> 00:04:20,800
Now, all bridges, including
this one, they're built to move
68
00:04:20,800 --> 00:04:23,120
left and right just a little bit.
It's no biggie.
69
00:04:23,120 --> 00:04:25,720
But there was something about
this particular bridge
70
00:04:25,720 --> 00:04:27,800
that the designers
hadn't thought of.
71
00:04:27,800 --> 00:04:30,920
They'd missed off something quite
important in their equations.
72
00:04:30,920 --> 00:04:33,560
Let me explain what happened here
with one of these -
73
00:04:33,560 --> 00:04:35,160
a little metronome.
74
00:04:35,160 --> 00:04:38,400
Now, this thing here
ticks and tocks in time.
75
00:04:38,400 --> 00:04:41,600
If I wanted to write a set
of equations for this metronome,
76
00:04:41,600 --> 00:04:44,640
it would be quite simple -
some very straightforward physics.
77
00:04:44,640 --> 00:04:47,120
And if I wanted to write
some equations for a number
78
00:04:47,120 --> 00:04:50,080
of metronomes, I'd just do the
same thing over and over again.
79
00:04:50,080 --> 00:04:53,000
It's not like these things
can communicate with one another.
80
00:04:53,000 --> 00:04:56,120
I can treat each one as though
they're completely individual.
81
00:04:56,120 --> 00:04:59,560
But now let's see what happens
when they're on a bridge
82
00:04:59,560 --> 00:05:01,960
with just a little bit of movement,
83
00:05:01,960 --> 00:05:04,360
because something rather
intriguing happens.
84
00:05:04,360 --> 00:05:07,360
I don't want that to fall too far.
Let's go from there. OK.
85
00:05:07,360 --> 00:05:10,880
Let's see what happens when
these things are on a bridge
86
00:05:10,880 --> 00:05:15,640
that can move itself left and right
just a little bit.
87
00:05:15,640 --> 00:05:20,640
OK, so now that these things
can move left and right,
88
00:05:20,640 --> 00:05:23,480
something a little bit
unusual happens,
89
00:05:23,480 --> 00:05:27,080
because every time that a metronome
is ticking or tocking
90
00:05:27,080 --> 00:05:28,880
in one direction,
91
00:05:28,880 --> 00:05:32,080
that left and right movement
means that the whole bridge
92
00:05:32,080 --> 00:05:35,360
will just be knocked
ever so slightly left and right.
93
00:05:35,360 --> 00:05:37,360
What that means is that now
94
00:05:37,360 --> 00:05:40,600
these metronomes can effectively
start listening to each other.
95
00:05:40,600 --> 00:05:43,360
They're all now connected
by the bridge,
96
00:05:43,360 --> 00:05:46,160
which means that very, very slowly,
97
00:05:46,160 --> 00:05:49,400
you just see it moving very
slightly left and right
98
00:05:49,400 --> 00:05:54,120
and very slowly they start to
synchronise with one another...
99
00:05:55,120 --> 00:05:57,560
..like a creepy metronome army.
100
00:05:58,840 --> 00:06:01,400
Now, this is something
that my equations
101
00:06:01,400 --> 00:06:03,480
just wouldn't have taken
into account.
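A minimal sketch of the effect Hannah is describing, with each metronome treated as a simple phase oscillator and the shared, slightly movable platform supplying the coupling. The model and constants are illustrative (a standard Kuramoto-style toy, not the bridge designers' equations):

# Illustrative only: metronomes coupled through a shared, movable platform.
# With no coupling (K = 0) each keeps its own phase; with K > 0 they drift
# into step, which is what the audience sees on the bench.
import math, random

N = 5                       # number of metronomes
K = 1.0                     # coupling strength provided by the moving platform
omega = 2 * math.pi         # identical natural ticking rate, one tick per second
dt = 0.01
theta = [random.uniform(0, 2 * math.pi) for _ in range(N)]

def in_step(phases):
    # 1.0 means perfectly synchronised, near 0 means completely out of step
    re = sum(math.cos(p) for p in phases) / len(phases)
    im = sum(math.sin(p) for p in phases) / len(phases)
    return math.hypot(re, im)

print("before:", round(in_step(theta), 2))
for _ in range(20000):
    theta = [t + dt * (omega + (K / N) * sum(math.sin(s - t) for s in theta))
             for t in theta]
print("after: ", round(in_step(theta), 2))   # approaches 1.0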
102
00:06:03,480 --> 00:06:06,520
But humans, like metronomes,
actually have to be handled
103
00:06:06,520 --> 00:06:09,600
with quite a lot of care and so
for that I'm going to put you
104
00:06:09,600 --> 00:06:12,040
in the very capable hands
of my good friend
105
00:06:12,040 --> 00:06:14,040
and mathematician Matt Parker.
106
00:06:14,040 --> 00:06:16,640
Oh, hey. So, thanks, Hannah.
107
00:06:16,640 --> 00:06:19,320
I'm over here in the library
at the Royal Institution
108
00:06:19,320 --> 00:06:21,840
where, instead of very small
metronomes,
109
00:06:21,840 --> 00:06:25,880
we have a massive
wobbly bridge simulator.
110
00:06:25,880 --> 00:06:28,640
We've borrowed this from
the University of Cambridge.
111
00:06:28,640 --> 00:06:31,880
They use it to test things like,
well, full-size bridges.
112
00:06:31,880 --> 00:06:36,120
It's a very firm structure
with a tray attached to it,
113
00:06:36,120 --> 00:06:38,800
which is able to move a
little bit side to side.
114
00:06:38,800 --> 00:06:41,880
We've got two treadmills.
I'm joined by Dylan here,
115
00:06:41,880 --> 00:06:44,840
who's going to walk on one
of these treadmills with me.
116
00:06:44,840 --> 00:06:49,040
If we turn these on, in theory,
we'll start walking,
117
00:06:49,040 --> 00:06:51,440
slowly to start with. There we go.
118
00:06:51,440 --> 00:06:55,080
And then, shall we go up to four?
Let's try a speed of four.
119
00:06:55,080 --> 00:06:58,560
So now we are trying to
walk on a bridge
120
00:06:58,560 --> 00:07:01,400
which is not staying still at all
121
00:07:01,400 --> 00:07:04,640
and it feels a bit like
being on a boat, maybe,
122
00:07:04,640 --> 00:07:06,400
where it's moving around
123
00:07:06,400 --> 00:07:09,120
and you're trying to compensate
for that movement,
124
00:07:09,120 --> 00:07:12,320
and it means we have perfectly
synched up our walking
125
00:07:12,320 --> 00:07:14,320
and that's causing it to move...
126
00:07:14,320 --> 00:07:16,640
..I'll say a concerning amount.
127
00:07:16,640 --> 00:07:20,080
Yes. You're keeping a brave face
but it's terrifying up here.
128
00:07:20,080 --> 00:07:23,480
All our movements are causing it
to shift backwards and forwards.
129
00:07:23,480 --> 00:07:27,080
Hannah, you can imagine what would
happen if you had loads of people
130
00:07:27,080 --> 00:07:29,760
doing this on a much bigger
structure.
131
00:07:29,760 --> 00:07:31,640
Just imagine, indeed.
132
00:07:31,640 --> 00:07:34,360
But it turns out this is
precisely what had happened
133
00:07:34,360 --> 00:07:37,120
because the people who designed
the Millennium Bridge
134
00:07:37,120 --> 00:07:40,280
hadn't taken into account the fact
that people can affect each other
135
00:07:40,280 --> 00:07:42,200
with the way that they're walking.
136
00:07:42,200 --> 00:07:45,400
Those small movements left to right
suddenly became a very big deal
137
00:07:45,400 --> 00:07:48,640
and it meant that on the opening
day, you had hundreds of people
138
00:07:48,640 --> 00:07:52,840
walking over an £18 million bridge,
the beginning of a new millennium,
139
00:07:52,840 --> 00:07:56,120
and every single one of them
was hanging on for dear life.
140
00:07:56,120 --> 00:08:00,600
Look at that. That's British
ingenuity at its finest, just there.
141
00:08:00,600 --> 00:08:03,080
But there is an important point
in all of this,
142
00:08:03,080 --> 00:08:05,080
with wobbly bridges and BMX bikes.
143
00:08:05,080 --> 00:08:08,040
Maths can do an amazing job,
but only if your equations
144
00:08:08,040 --> 00:08:11,400
actually match up to the world
that you're describing.
145
00:08:11,400 --> 00:08:14,720
Just as long as you've got the right
equations for bikes and bridges,
146
00:08:14,720 --> 00:08:17,120
you can be certain of
what's going to happen next.
147
00:08:17,120 --> 00:08:20,200
But there are some things that are
quite hard to write equations for
148
00:08:20,200 --> 00:08:21,680
in the first place.
149
00:08:21,680 --> 00:08:25,080
OK, let's imagine that it's long
into the future and you're trying
150
00:08:25,080 --> 00:08:27,640
to write an algorithm
that can help a doctor
151
00:08:27,640 --> 00:08:30,120
work out what's wrong
with their patients.
152
00:08:30,120 --> 00:08:33,080
Now, a doctor's job is quite
different to that of an engineer
153
00:08:33,080 --> 00:08:36,240
or a physicist because if someone
just comes in with a headache,
154
00:08:36,240 --> 00:08:39,040
a doctor can't just take
measurements of what's wrong
155
00:08:39,040 --> 00:08:41,120
and end up with an exact answer
156
00:08:41,120 --> 00:08:44,360
because that headache could mean
a whole host of different things.
157
00:08:44,360 --> 00:08:47,840
Somehow, the doctor has to use
a whole bunch of clues
158
00:08:47,840 --> 00:08:51,120
to build a picture of what
might be wrong with you.
159
00:08:51,120 --> 00:08:54,320
Now, this is the difference
between calculating the answer
160
00:08:54,320 --> 00:08:58,360
and just making your best possible
guess, and no-one had any idea
161
00:08:58,360 --> 00:09:01,840
how to do that in an equation
until the 1700s,
162
00:09:01,840 --> 00:09:05,880
when the Reverend Thomas Bayes
thought of a very clever game.
163
00:09:05,880 --> 00:09:08,400
Now, we're going to play
a version of this game
164
00:09:08,400 --> 00:09:10,360
just with a little bit more fire,
165
00:09:10,360 --> 00:09:14,760
so who would like to come down
and volunteer for this?
166
00:09:14,760 --> 00:09:18,240
Erm, let's go...
Let's go for you just there.
167
00:09:18,240 --> 00:09:20,600
Round of applause
as she comes to the stage.
168
00:09:20,600 --> 00:09:22,280
APPLAUSE
169
00:09:22,280 --> 00:09:24,840
What's your name? Emma. Emma? Yeah.
170
00:09:24,840 --> 00:09:27,840
OK, Emma. Right, what we're going
to do is, we're going to play
171
00:09:27,840 --> 00:09:30,600
a version of this game
and it involves this red hat,
172
00:09:30,600 --> 00:09:33,120
if you don't mind just popping
that on your head.
173
00:09:33,120 --> 00:09:36,400
Now, just so that we can get your
view of things. Oh, it's a bit...
174
00:09:36,400 --> 00:09:38,880
Hold on one second.
Let me tighten this up.
175
00:09:38,880 --> 00:09:41,120
Now, just so we can get
your view of things,
176
00:09:41,120 --> 00:09:44,680
I've also got a version of this
red hat over here for this camera
177
00:09:44,680 --> 00:09:46,720
so we'll be able to see
what you see.
178
00:09:46,720 --> 00:09:49,680
The other thing that we need
for this game is a whole host
179
00:09:49,680 --> 00:09:53,080
of balloons, which is just coming
on...just coming on behind you.
180
00:09:53,080 --> 00:09:56,040
If you want to turn around, Emma.
Just have a little look...
181
00:09:56,040 --> 00:09:58,320
Just have a little look
at these balloons.
182
00:09:58,320 --> 00:10:01,640
What colour are these balloons,
Emma? You want to stand over here.
183
00:10:01,640 --> 00:10:04,400
Red. What colour are the balloons?
Red. Red. OK.
184
00:10:04,400 --> 00:10:06,720
So if we look through
this camera now,
185
00:10:06,720 --> 00:10:10,360
we will be able to see that
they do indeed all look red,
186
00:10:10,360 --> 00:10:13,360
and yet...
Just step over here. Sorry.
187
00:10:13,360 --> 00:10:17,000
And yet, to everyone in this
audience, we can see that, in fact,
188
00:10:17,000 --> 00:10:21,680
what you're looking at are 99
orange balloons and one yellow one.
189
00:10:21,680 --> 00:10:23,960
Now, we all know where
the yellow balloon is.
190
00:10:23,960 --> 00:10:25,840
No-one's allowed to give it away.
191
00:10:25,840 --> 00:10:29,600
But your job, Emma, is to try
and pop that yellow balloon.
192
00:10:29,600 --> 00:10:33,800
And if you manage it, we're all
going to explode in excitement.
193
00:10:33,800 --> 00:10:35,600
And if you fail,
194
00:10:35,600 --> 00:10:38,600
we're going to respond with
disappointed silence, OK?
195
00:10:38,600 --> 00:10:42,840
So here we go. You get to pick
a balloon at random and pop one.
196
00:10:42,840 --> 00:10:46,360
Which one do you want to pop? You
stand here and Matt will do it.
197
00:10:46,360 --> 00:10:48,280
Up? Up?
198
00:10:48,280 --> 00:10:50,280
Down. This one? Yeah.
199
00:10:50,280 --> 00:10:52,280
Perfect. That one there? Yeah.
200
00:10:52,280 --> 00:10:54,080
Ready? Here we go.
201
00:10:54,080 --> 00:10:57,120
That is not the yellow balloon.
That's OK.
202
00:10:57,120 --> 00:11:00,040
It's pretty hard in the beginning.
I mean, you know...
203
00:11:00,040 --> 00:11:02,360
How can you possibly guess it
first time?
204
00:11:02,360 --> 00:11:05,600
What we're going to do is, audience,
we're going to help her here.
205
00:11:05,600 --> 00:11:09,600
So, we are going to tell you,
on my cue, we're going to tell you,
206
00:11:09,600 --> 00:11:12,080
based on the last balloon
that you popped,
207
00:11:12,080 --> 00:11:14,680
whether you should go higher
or lower, left or right.
208
00:11:14,680 --> 00:11:16,920
If you think it's higher,
say higher.
209
00:11:16,920 --> 00:11:18,960
If you think it's lower, say lower.
210
00:11:18,960 --> 00:11:21,200
If you think it's neither
higher nor lower,
211
00:11:21,200 --> 00:11:23,840
I want exactly 50% to say higher,
50% to say lower,
212
00:11:23,840 --> 00:11:26,880
and I'll let you work out between
yourselves which one is which.
213
00:11:26,880 --> 00:11:29,800
OK, so, based on her last
balloon pop,
214
00:11:29,800 --> 00:11:32,040
should she go higher or lower?
215
00:11:32,040 --> 00:11:33,560
ALL: Lower.
216
00:11:33,560 --> 00:11:36,480
And should she go left or right?
217
00:11:36,480 --> 00:11:39,200
ALL: Left.
OK, where do you want to pop?
218
00:11:39,200 --> 00:11:40,640
Erm...
219
00:11:41,680 --> 00:11:43,040
..lower.
220
00:11:43,040 --> 00:11:44,760
Left a bit.
221
00:11:46,040 --> 00:11:48,160
That one. That one. Ready?
222
00:11:49,200 --> 00:11:51,640
Oh, OK.
We'll give her another go.
223
00:11:51,640 --> 00:11:54,120
Do you want to go,
based on the last balloon pop,
224
00:11:54,120 --> 00:11:57,320
should she go higher or lower?
ALL: Lower.
225
00:11:57,320 --> 00:12:00,360
And should she go left or right?
MIXED RESPONSES
226
00:12:00,360 --> 00:12:03,360
Oh, interesting.
OK, which way do you want to go?
227
00:12:04,400 --> 00:12:05,680
Down.
228
00:12:06,720 --> 00:12:08,280
Down. That one.
229
00:12:08,280 --> 00:12:09,680
Ready?
230
00:12:10,720 --> 00:12:12,320
EXPLOSION
231
00:12:12,320 --> 00:12:14,000
APPLAUSE
232
00:12:20,280 --> 00:12:22,600
You got there.
Thank you so much, Emma.
233
00:12:22,600 --> 00:12:25,680
You got there amazingly quickly.
OK, do you want to take this...?
234
00:12:25,680 --> 00:12:29,120
Tell me, what was your strategy
in the first place? Just pick one.
235
00:12:29,120 --> 00:12:31,880
Just pick one at random? Yes.
You were just guessing at random.
236
00:12:31,880 --> 00:12:33,680
Did our clues help you? Yes.
237
00:12:33,680 --> 00:12:36,360
You knew probably where
the balloon was by the end. Yeah.
238
00:12:36,360 --> 00:12:39,560
Were you getting more and more
confident? Yes. Yes. Perfect.
239
00:12:39,560 --> 00:12:42,040
All right, Emma, thank you so much.
Amazing.
240
00:12:42,040 --> 00:12:44,040
APPLAUSE
241
00:12:47,080 --> 00:12:50,280
What Emma was doing there,
she was demonstrating something
242
00:12:50,280 --> 00:12:53,320
that's called Bayesian thinking
and, actually, it's something
243
00:12:53,320 --> 00:12:55,120
that all of us do instinctively.
244
00:12:55,120 --> 00:12:57,880
But it wasn't until Bayes
thought of a version of that game
245
00:12:57,880 --> 00:13:00,720
that the world realised that
you can actually write down
246
00:13:00,720 --> 00:13:03,040
that way of thinking
into an equation.
247
00:13:03,040 --> 00:13:06,600
I'm really not exaggerating when
I tell you that that Bayes theorem
248
00:13:06,600 --> 00:13:09,400
is one of the most important
equations of all time,
249
00:13:09,400 --> 00:13:11,880
because, suddenly,
it doesn't matter if you're not
250
00:13:11,880 --> 00:13:13,880
completely sure of the answer.
251
00:13:13,880 --> 00:13:16,560
You can still get a really good
sense of the right answer,
252
00:13:16,560 --> 00:13:18,800
even from incomplete pieces
of information.
253
00:13:18,800 --> 00:13:21,560
And that is something
that is incredibly useful.
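As a concrete, hypothetical sketch of the balloon game in Bayes' language: start with an equal belief in every position, and each shouted clue reweights that belief by how likely the clue would be if the balloon really were there, so P(position | clue) is proportional to P(clue | position) x P(position). The accuracy figure and the one-dimensional grid of positions are made up for illustration:

# Illustrative only: Bayes' rule applied to a simplified balloon game.
# Prior: every position equally likely. Each (possibly noisy) "higher"/"lower"
# clue multiplies the belief by a likelihood, and we renormalise.
positions = list(range(10))                       # 10 balloon rows, 0 = bottom
belief = [1 / len(positions)] * len(positions)    # uniform prior

def update(belief, popped_row, clue, accuracy=0.9):
    # clue is "higher" or "lower" relative to the row just popped;
    # the audience is right with probability `accuracy` (a made-up number)
    new = []
    for row, p in zip(positions, belief):
        agrees = (row > popped_row) if clue == "higher" else (row < popped_row)
        likelihood = accuracy if agrees else 1 - accuracy
        new.append(p * likelihood)
    total = sum(new)
    return [p / total for p in new]

belief = update(belief, popped_row=7, clue="lower")
belief = update(belief, popped_row=4, clue="lower")
best = max(positions, key=lambda r: belief[r])
print("most probable row:", best, "with probability", round(belief[best], 2))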
254
00:13:21,560 --> 00:13:22,920
Let me show you.
255
00:13:22,920 --> 00:13:27,360
So, OK, let's imagine that you
are making a driverless car.
256
00:13:27,360 --> 00:13:30,360
Now, it hasn't got a driver
in it so you need to make sure
257
00:13:30,360 --> 00:13:35,040
that you know where you are and,
OK, you could use GPS to do that,
258
00:13:35,040 --> 00:13:37,400
but GPS isn't perfect.
259
00:13:37,400 --> 00:13:41,040
So, sometimes, your GPS
will get your position out
260
00:13:41,040 --> 00:13:42,880
by about a metre or so.
261
00:13:42,880 --> 00:13:45,160
And if you're a human,
that's fine. No big deal.
262
00:13:45,160 --> 00:13:47,040
You can work out where you are.
263
00:13:47,040 --> 00:13:50,640
But if you're a driverless car,
the difference of a few metres
264
00:13:50,640 --> 00:13:53,360
can mean the difference between
driving on the pavement
265
00:13:53,360 --> 00:13:56,400
and driving into oncoming
traffic, which isn't ideal.
266
00:13:56,400 --> 00:13:59,360
So driverless cars, they
also have cameras on board.
267
00:13:59,360 --> 00:14:02,400
Now, cameras are pretty good at
letting you know where you are,
268
00:14:02,400 --> 00:14:07,400
but, again, they're not perfect
because skies look a lot like water
269
00:14:07,400 --> 00:14:11,640
and, you know, lorry tarpaulin
looks a lot like a clouded sky.
270
00:14:11,640 --> 00:14:15,640
The point about driverless cars is
that you don't just have one thing
271
00:14:15,640 --> 00:14:17,880
that gives you exactly
where you are.
272
00:14:17,880 --> 00:14:21,320
You have lots of different
things that you use as clues
273
00:14:21,320 --> 00:14:23,400
to indicate where you are.
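A hypothetical sketch of the simplest version of that clue-combining: two imperfect position readings, each with its own uncertainty, merged by trusting each in proportion to how precise it is (inverse-variance weighting, the one-dimensional heart of the Bayesian filters such cars run; the numbers are invented):

# Illustrative only: fusing two imperfect position estimates the Bayesian way.
# Each sensor reports a position and an uncertainty (standard deviation);
# the fused estimate weights each reading by 1 / variance.
def fuse(reading_a, sigma_a, reading_b, sigma_b):
    w_a, w_b = 1 / sigma_a**2, 1 / sigma_b**2
    position = (w_a * reading_a + w_b * reading_b) / (w_a + w_b)
    sigma = (1 / (w_a + w_b)) ** 0.5
    return position, sigma

gps_pos, gps_sigma = 12.0, 1.0        # GPS: out by about a metre
cam_pos, cam_sigma = 10.5, 0.3        # camera: usually tighter, sometimes fooled
pos, sigma = fuse(gps_pos, gps_sigma, cam_pos, cam_sigma)
print(f"fused position: {pos:.2f} m, uncertainty: {sigma:.2f} m")
# The fused answer sits closest to the more trustworthy sensor,
# and its uncertainty is smaller than either sensor's alone.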
274
00:14:23,400 --> 00:14:26,360
And that is something
that is especially important
275
00:14:26,360 --> 00:14:29,880
when you are driving
at 200mph.
276
00:14:29,880 --> 00:14:34,640
So, this thing here, this is
the world's fastest autonomous car.
277
00:14:34,640 --> 00:14:37,760
It was built for racing and it's
got all kinds of different sensors
278
00:14:37,760 --> 00:14:39,720
to help it work out where it is.
279
00:14:39,720 --> 00:14:41,960
It's got little cameras here,
280
00:14:41,960 --> 00:14:44,920
it's got another kind of camera
called LiDAR over here.
281
00:14:44,920 --> 00:14:47,480
It's got radar over here
at the back.
282
00:14:47,480 --> 00:14:50,120
All of these are the clues
for the car,
283
00:14:50,120 --> 00:14:54,120
and GPS within the computer
that's just inside here.
284
00:14:54,120 --> 00:14:57,880
Now, this thing is basically
a Bayesian machine.
285
00:14:57,880 --> 00:15:01,360
So, that computer, that is
the size of just a lunchbox,
286
00:15:01,360 --> 00:15:04,840
is churning through trillions
of calculations every second
287
00:15:04,840 --> 00:15:07,880
to make sure that this car
knows where it is
288
00:15:07,880 --> 00:15:10,840
and finishes the race
as quickly as possible,
289
00:15:10,840 --> 00:15:14,080
all while avoiding
other competitors.
290
00:15:14,080 --> 00:15:17,360
I think that's the thing about
how these modern inventions work.
291
00:15:17,360 --> 00:15:20,400
That's how they deal
with uncertainty.
292
00:15:20,400 --> 00:15:24,080
They don't just have one sensor,
they don't just have two sensors,
293
00:15:24,080 --> 00:15:28,320
they have a whole host of sensors
that they use to layer up
294
00:15:28,320 --> 00:15:30,080
and give them information.
295
00:15:30,080 --> 00:15:33,080
That's something that's true
of driverless cars,
296
00:15:33,080 --> 00:15:36,840
but it's also true
of this little guy here,
297
00:15:36,840 --> 00:15:40,560
who I believe is going to
follow me into the studio.
298
00:15:40,560 --> 00:15:42,040
There we go.
299
00:15:42,040 --> 00:15:43,560
Come on, then.
300
00:15:46,320 --> 00:15:48,280
Come on. Come on.
301
00:15:48,280 --> 00:15:50,200
CHUCKLING
302
00:15:57,840 --> 00:16:00,800
Hey! Round of applause
for our little drone.
303
00:16:00,800 --> 00:16:02,480
APPLAUSE
304
00:16:13,560 --> 00:16:16,000
Oh, that was a lovely landing.
305
00:16:16,000 --> 00:16:18,760
Right, I want you to join me
in welcoming to the stage
306
00:16:18,760 --> 00:16:21,000
Duncan and the Skyports drone.
307
00:16:21,000 --> 00:16:23,000
CHEERING AND APPLAUSE
308
00:16:29,120 --> 00:16:32,080
This is quite some drone, Duncan.
This is quite a big one.
309
00:16:32,080 --> 00:16:35,120
This is one of our delivery drones,
so we can do medical samples
310
00:16:35,120 --> 00:16:37,240
or e-commerce deliveries
with this one.
311
00:16:37,240 --> 00:16:40,360
So what kind of things is this
used for? We do blood samples.
312
00:16:40,360 --> 00:16:43,200
We can do them between hospitals
and medical facilities.
313
00:16:43,200 --> 00:16:46,120
We can do packages. We were flying
them in Finland recently.
314
00:16:46,120 --> 00:16:49,040
Basically, anything you can fit
in that box, up to about 5kg,
315
00:16:49,040 --> 00:16:51,880
we can fly it. So how does
this thing avoid crashing?
316
00:16:51,880 --> 00:16:55,520
It's got a number of systems on it.
It's got 4G, like a mobile phone.
317
00:16:55,520 --> 00:16:59,040
It's got a Wi-Fi network of its own.
318
00:16:59,040 --> 00:17:01,120
And it's also got,
if all else fails,
319
00:17:01,120 --> 00:17:03,080
a satellite communications network.
320
00:17:03,080 --> 00:17:05,760
What's this thing over here?
That's the fail-safe.
321
00:17:05,760 --> 00:17:08,840
So, if everything goes wrong, that's
a parachute and that would deploy.
322
00:17:08,840 --> 00:17:11,160
So, for example,
if one of the rotors stops
323
00:17:11,160 --> 00:17:13,800
or one of the motors doesn't work,
a big alarm goes off,
324
00:17:13,800 --> 00:17:16,640
that will deploy, and it will come
down to Earth very safely.
325
00:17:16,640 --> 00:17:19,680
So it needs all of those different
systems running in parallel?
326
00:17:19,680 --> 00:17:22,880
It needs them running in parallel.
Hopefully you only ever use one.
327
00:17:22,880 --> 00:17:26,200
The rest we call redundancy. It's
there in case something goes wrong.
328
00:17:26,200 --> 00:17:28,400
What's the future of drones
like this, then?
329
00:17:28,400 --> 00:17:31,560
This is becoming more and more
prevalent. We're flying in Africa.
330
00:17:31,560 --> 00:17:34,120
We're doing snake bite anti-venom.
331
00:17:34,120 --> 00:17:37,640
So very urgent stuff,
often very bad road networks.
332
00:17:37,640 --> 00:17:40,360
We're flying in the
west coast of Scotland,
333
00:17:40,360 --> 00:17:42,360
doing some medical samples again.
334
00:17:42,360 --> 00:17:44,320
Ultimately, it will
come into cities,
335
00:17:44,320 --> 00:17:46,320
much more complex environments,
336
00:17:46,320 --> 00:17:48,720
you've got lots more people,
lots more buildings,
337
00:17:48,720 --> 00:17:50,920
lots more things to keep
out of the way of.
338
00:17:50,920 --> 00:17:53,120
But the technology
is good enough now
339
00:17:53,120 --> 00:17:55,640
that you can fly in
pretty much any environment.
340
00:17:55,640 --> 00:17:59,560
Talking of people, could I ever get
a personal passenger drone? You can.
341
00:17:59,560 --> 00:18:01,360
In fact, you can already.
342
00:18:01,360 --> 00:18:04,400
So this is your company. Personal
passenger drones, are they?
343
00:18:04,400 --> 00:18:07,360
Yes, this is a company called
Volocopter, and these are live.
344
00:18:07,360 --> 00:18:09,480
They're going through
certification now.
345
00:18:09,480 --> 00:18:12,720
Within two years, everybody will be
able to get in one and fly around.
346
00:18:12,720 --> 00:18:15,000
Within two years? Within two years.
Goodness me.
347
00:18:15,000 --> 00:18:17,640
Will our skies be full of them
in the future, do you think?
348
00:18:17,640 --> 00:18:20,040
The air space is vast.
Cities are now very dense.
349
00:18:20,040 --> 00:18:22,480
It's hard to put more
infrastructure into cities.
350
00:18:22,480 --> 00:18:24,920
These can fly in our
underutilised air. Amazing.
351
00:18:24,920 --> 00:18:26,920
Duncan, a view of the
future there, I think.
352
00:18:26,920 --> 00:18:29,080
A big round of applause, if you can.
353
00:18:29,080 --> 00:18:30,880
APPLAUSE
354
00:18:33,840 --> 00:18:36,840
Duncan was talking a lot there
about having backups on backups
355
00:18:36,840 --> 00:18:40,400
on backups, just to make sure
that if there's ever a problem,
356
00:18:40,400 --> 00:18:43,840
you know that the drone won't crash,
and that is something, actually,
357
00:18:43,840 --> 00:18:46,440
that Matt Parker has been
thinking about, too.
358
00:18:46,440 --> 00:18:51,120
Yes, and I've brought a comedy
oversized slice of cheese.
359
00:18:51,120 --> 00:18:53,560
A slice of cheese?
A slice of cheese. All right.
360
00:18:53,560 --> 00:18:56,840
Because when a lot of people are
thinking about things like drones
361
00:18:56,840 --> 00:19:00,000
and trying to avoid disasters, they
find it's useful to think of it
362
00:19:00,000 --> 00:19:01,800
in terms of cheese. OK.
363
00:19:01,800 --> 00:19:03,640
Things can go wrong with drones.
364
00:19:03,640 --> 00:19:05,840
You can have...
One of the motors might break,
365
00:19:05,840 --> 00:19:07,880
the battery might
run out of charge.
366
00:19:07,880 --> 00:19:11,000
When that happens, you don't want
it to crash and cause a disaster.
367
00:19:11,000 --> 00:19:14,360
So, you imagine these mistakes,
these errors, coming at your system
368
00:19:14,360 --> 00:19:16,000
and you put in barriers...
369
00:19:16,000 --> 00:19:19,720
so, like a slice of cheese, to stop
them from making it...bear with me,
370
00:19:19,720 --> 00:19:22,080
making it through
and becoming a disaster.
371
00:19:22,080 --> 00:19:25,400
So this could be, for example,
like, the GPS system. OK.
372
00:19:25,400 --> 00:19:27,880
It's tracking where it is,
anything that goes wrong,
373
00:19:27,880 --> 00:19:30,720
it shouldn't be a disaster.
What about the holes, though?
374
00:19:30,720 --> 00:19:33,360
Well spotted. So, GPS,
as you know, is not perfect.
375
00:19:33,360 --> 00:19:35,800
You could be on the sidewalk,
according to the GPS,
376
00:19:35,800 --> 00:19:38,840
and so it might be inaccurate,
it might give you the wrong data.
377
00:19:38,840 --> 00:19:41,600
No one layer to try and stop
disasters will be perfect.
378
00:19:41,600 --> 00:19:44,560
So this is trying to block
disasters from happening,
379
00:19:44,560 --> 00:19:47,360
mostly it works, but just
occasionally it's going to fail.
380
00:19:47,360 --> 00:19:50,440
Occasionally a mistake will slip
through and GPS won't be enough.
381
00:19:50,440 --> 00:19:52,800
But over here,
we've got more cheese.
382
00:19:52,800 --> 00:19:56,920
If you'd like to take a seat here,
under the cheese. It's fine.
383
00:19:56,920 --> 00:19:59,240
Are you sure?
You trust the maths. Here we go.
384
00:19:59,240 --> 00:20:02,080
I'm not sure I trust you though,
Matt. Wise. Very wise.
385
00:20:02,080 --> 00:20:06,400
So, this is another slice of cheese
and this one is the parachute.
386
00:20:06,400 --> 00:20:09,320
So, if the big drone fails
and the GPS is wrong,
387
00:20:09,320 --> 00:20:11,360
the parachute will deploy.
388
00:20:11,360 --> 00:20:14,400
With two layers together...
This has got holes in it as well.
389
00:20:14,400 --> 00:20:17,360
That's true, but what we hope is,
the holes in this layer
390
00:20:17,360 --> 00:20:20,040
don't line up with the holes
in the other layers.
391
00:20:20,040 --> 00:20:22,040
Actually, we can get another layer.
392
00:20:22,040 --> 00:20:24,880
This one's called rules,
which doesn't sound very exciting,
393
00:20:24,880 --> 00:20:26,600
but we all need rules.
394
00:20:26,600 --> 00:20:28,640
So, when we brought
the drone in here,
395
00:20:28,640 --> 00:20:31,080
we weren't allowed to fly it
above a crowd.
396
00:20:31,080 --> 00:20:33,400
That's one of the rules
for using a drone.
397
00:20:33,400 --> 00:20:36,080
So even if an accident happens...
Yeah.
398
00:20:36,080 --> 00:20:38,720
..even if the parachute fails
and the GPS fails,
399
00:20:38,720 --> 00:20:40,680
at least people won't get hurt.
400
00:20:40,680 --> 00:20:43,640
In theory, if you're following
the rules, it'll be fine,
401
00:20:43,640 --> 00:20:46,400
although, of course, sometimes
people break the rules
402
00:20:46,400 --> 00:20:48,880
and you hope the other layers
will help you out.
403
00:20:48,880 --> 00:20:52,040
And so...I'm going to grab a camera
so I can show you a point of view.
404
00:20:52,040 --> 00:20:54,000
Thank you. Can I just...
405
00:20:54,000 --> 00:20:57,040
You know what, Matt, I mean,
it's not that I don't trust you...
406
00:20:58,400 --> 00:21:02,040
No, it's literally that you don't
trust me. It's literally that. OK.
407
00:21:02,040 --> 00:21:05,200
So if you have a look from
Hannah's point of view underneath,
408
00:21:05,200 --> 00:21:08,640
you can see up through some holes,
but then, almost straight away,
409
00:21:08,640 --> 00:21:11,080
it's blocked by a different
slice of cheese.
410
00:21:11,080 --> 00:21:13,880
And the point of view at the top,
here you go.
411
00:21:13,880 --> 00:21:17,080
Again, there are some holes that go
partway down, but then they stop.
412
00:21:17,080 --> 00:21:20,800
What we're going to do now
is rain some errors down on you.
413
00:21:20,800 --> 00:21:22,920
We've got a whole bucket of errors
414
00:21:22,920 --> 00:21:25,880
and, in theory,
if we drip them down,
415
00:21:25,880 --> 00:21:29,360
while they will make it through
some of the slices of cheese
416
00:21:29,360 --> 00:21:32,600
they will be stopped by...
More errors, more errors.
417
00:21:32,600 --> 00:21:35,640
And so, some of them are being
stopped by the first layer,
418
00:21:35,640 --> 00:21:37,920
some of them are getting
through the first layer
419
00:21:37,920 --> 00:21:40,800
but they're stopped by the next
layer, and so, in theory...
420
00:21:40,800 --> 00:21:42,760
OK, I get it, I get it, I get it.
421
00:21:42,760 --> 00:21:45,400
Yeah, overall...
Overall, it's fine, right?
422
00:21:45,400 --> 00:21:48,400
All of these different layers
are blocking it from happening.
423
00:21:48,400 --> 00:21:50,800
And so even though
any one individual layer,
424
00:21:50,800 --> 00:21:53,160
you're like, oh, look
at all those holes in it,
425
00:21:53,160 --> 00:21:56,040
if you sit down and you look
through the whole lot at once,
426
00:21:56,040 --> 00:21:58,000
you're like, oh, that's amazing.
427
00:21:58,000 --> 00:22:01,320
From here, I can't see any hole
which goes the entire way through.
428
00:22:01,320 --> 00:22:03,120
That's good, but then, Matt,
429
00:22:03,120 --> 00:22:05,720
what happens if the holes
do go the entire way through?
430
00:22:05,720 --> 00:22:07,800
That's a good point,
because every...
431
00:22:07,800 --> 00:22:09,400
LAUGHTER
432
00:22:09,400 --> 00:22:11,200
You make a valid point.
433
00:22:11,200 --> 00:22:13,960
I think we need more errors.
Albeit in an interesting way.
434
00:22:13,960 --> 00:22:16,240
More errors. More errors.
435
00:22:16,240 --> 00:22:18,280
CHEERING
436
00:22:28,640 --> 00:22:31,760
I'm trying to make
a serious point here.
437
00:22:31,760 --> 00:22:34,840
Occasionally, your
cheese holes will line up
438
00:22:34,840 --> 00:22:38,040
and a few mistakes, normally,
will make it through
439
00:22:38,040 --> 00:22:39,600
and become a disaster.
440
00:22:39,600 --> 00:22:42,600
And there's nothing you can do
about that? Nothing you can do.
441
00:22:42,600 --> 00:22:45,120
Disasters, Matt, are inevitable.
Erm, yep.
442
00:22:45,120 --> 00:22:47,480
I'm coming to terms
with that as we speak.
443
00:22:47,480 --> 00:22:50,320
Matt Parker. Round of applause.
Thank you very much.
444
00:22:50,320 --> 00:22:52,840
CHEERING AND APPLAUSE
445
00:22:57,320 --> 00:23:00,600
So this is the really big
downside of uncertainty.
446
00:23:00,600 --> 00:23:03,480
You have to accept that
perfection is impossible.
447
00:23:03,480 --> 00:23:06,080
Mistakes are effectively inevitable.
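The arithmetic behind that inevitability, with invented numbers: if each independent layer of cheese only lets a given error through occasionally, stacking layers multiplies those small chances together, which makes disasters rare but never impossible:

# Illustrative only: the "Swiss cheese" arithmetic with made-up numbers.
# Each layer independently lets an error slip through with some probability;
# a disaster needs the error to slip through every layer at once.
layers = {"GPS check": 0.05, "parachute": 0.02, "rules": 0.10}

p_disaster = 1.0
for name, p_slip in layers.items():
    p_disaster *= p_slip

print(f"chance a single error becomes a disaster: {p_disaster:.6f}")  # about 1 in 10,000
# Tiny, but not zero - and if the layers' holes are correlated (they "line up"),
# the real chance is higher than this independent-layers estimate.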
448
00:23:06,080 --> 00:23:09,080
I think that that raises a very
big and important question.
449
00:23:09,080 --> 00:23:11,720
If we know for sure
that our algorithms
450
00:23:11,720 --> 00:23:13,880
are never going to be perfect,
451
00:23:13,880 --> 00:23:17,320
do we want to put them in
charge of making decisions,
452
00:23:17,320 --> 00:23:20,600
especially in situations where
people's lives are at stake,
453
00:23:20,600 --> 00:23:22,560
like in the courtroom?
454
00:23:22,560 --> 00:23:27,080
Someone who's thought about this
a lot is Professor Katie Atkinson.
455
00:23:27,080 --> 00:23:29,320
APPLAUSE
456
00:23:32,600 --> 00:23:36,120
Katie, the courtroom, it doesn't
feel like a natural place
457
00:23:36,120 --> 00:23:39,600
where you would find mathematics.
Well, perhaps not, but, actually,
458
00:23:39,600 --> 00:23:43,400
AI and law researchers
are working on building models
459
00:23:43,400 --> 00:23:46,640
of legal reasoning
using mathematical models
460
00:23:46,640 --> 00:23:49,120
that are then turned
into software programmes
461
00:23:49,120 --> 00:23:51,920
that can help judges and lawyers.
Why do they need them?
462
00:23:51,920 --> 00:23:54,400
Why can't the judges
just do it all themselves?
463
00:23:54,400 --> 00:23:57,440
Well, the point of using
these mathematical models
464
00:23:57,440 --> 00:24:00,360
is that we can get consistent,
efficient decisions,
465
00:24:00,360 --> 00:24:03,600
and we know that any kind of
unconscious biases are stripped out.
466
00:24:03,600 --> 00:24:06,640
So, judges make mistakes,
they have unconscious biases,
467
00:24:06,640 --> 00:24:08,960
and the idea is that
using algorithms
468
00:24:08,960 --> 00:24:11,720
can help to minimise that?
That's absolutely right.
469
00:24:11,720 --> 00:24:14,680
We're hoping these can help.
I understand that you've brought
470
00:24:14,680 --> 00:24:17,760
a little friend along to help us
understand what's going on here.
471
00:24:17,760 --> 00:24:20,800
Yes, this is Pepper the robot.
Pepper the robot. Hello, Pepper.
472
00:24:20,800 --> 00:24:24,840
All rise for the court that will
decide the case of Popov v Hayashi.
473
00:24:24,840 --> 00:24:28,400
So this links us to a legal case
that was in the United States
474
00:24:28,400 --> 00:24:32,680
and involved a baseball and people
catching and dropping a baseball.
475
00:24:32,680 --> 00:24:34,960
So someone hit a
baseball into the crowd
476
00:24:34,960 --> 00:24:37,360
and then two people
fought over the baseball,
477
00:24:37,360 --> 00:24:40,080
and this baseball was very
valuable, wasn't it? Indeed.
478
00:24:40,080 --> 00:24:42,600
It was worth $450,000
in the end. Oh, crikey!
479
00:24:42,600 --> 00:24:45,720
I can understand why two people
were particularly upset. Yeah.
480
00:24:45,720 --> 00:24:49,000
So what we have to do is, first
of all, work out what the facts
481
00:24:49,000 --> 00:24:52,400
of the case are and then we have
to work out what the arguments are
482
00:24:52,400 --> 00:24:55,080
for the two different sides
within the legal case.
483
00:24:55,080 --> 00:24:57,960
And that's what Pepper's going
to help us with? That's right.
484
00:24:57,960 --> 00:25:02,400
So the facts of the case are that
Mr Popov stopped the forward motion
485
00:25:02,400 --> 00:25:04,880
of the ball once it
was hit into the crowd.
486
00:25:04,880 --> 00:25:08,360
He tried to get it under control
but he was thrown to the ground
487
00:25:08,360 --> 00:25:11,360
by this mob who were also trying
to secure the baseball.
488
00:25:11,360 --> 00:25:14,080
It sounds a bit unfair, really,
if you're, you know...
489
00:25:14,080 --> 00:25:17,280
You've caught it and then...
It's not your fault. That's right.
490
00:25:17,280 --> 00:25:20,280
That's why he felt that he should
have been given the opportunity
491
00:25:20,280 --> 00:25:22,160
to complete the catch.
492
00:25:22,160 --> 00:25:25,600
And then Mr Hayashi found
the loose ball on the floor
493
00:25:25,600 --> 00:25:29,040
and he picked it up
and claimed it as his. OK.
494
00:25:29,040 --> 00:25:32,080
Well, I mean, he was the one
who ended up with it at the end.
495
00:25:32,080 --> 00:25:35,160
That kind of seems like he's got
a fair claim, too. That's right.
496
00:25:35,160 --> 00:25:37,480
And, in particular,
he wasn't part of this mob
497
00:25:37,480 --> 00:25:40,160
that threw Mr Popov to the ground,
so he did no wrong either,
498
00:25:40,160 --> 00:25:41,840
so he shouldn't be punished.
499
00:25:41,840 --> 00:25:44,680
And that's really the facts
and the arguments of the case.
500
00:25:44,680 --> 00:25:47,520
So you can take all of those facts
and arguments over the case,
501
00:25:47,520 --> 00:25:49,760
put them into equations,
compare them to cases
502
00:25:49,760 --> 00:25:51,560
that were like this in the past,
503
00:25:51,560 --> 00:25:53,440
and then, ultimately,
504
00:25:53,440 --> 00:25:56,880
can the algorithm give us
a verdict? Yes, that's right.
505
00:25:56,880 --> 00:25:59,880
OK, Pepper, what's the
verdict in this case, then?
506
00:25:59,880 --> 00:26:04,400
The decision of this court is that
the ball should be sold at auction
507
00:26:04,400 --> 00:26:09,400
and the proceeds split evenly
between Mr Popov and Mr Hayashi.
508
00:26:09,400 --> 00:26:12,680
OK, that's very clear. We're having
lots of fun with a humanoid robot,
509
00:26:12,680 --> 00:26:15,600
but it's not the intention to
actually have humanoid robots
510
00:26:15,600 --> 00:26:18,080
in courtrooms, is it?
That's absolutely right.
511
00:26:18,080 --> 00:26:20,680
We're aiming at writing
these mathematical models
512
00:26:20,680 --> 00:26:24,040
that we're turning into AI tools
that will be on the computer
513
00:26:24,040 --> 00:26:25,840
and helping judges and lawyers
514
00:26:25,840 --> 00:26:28,360
who are sat there using
these for decision support.
515
00:26:28,360 --> 00:26:31,600
It's all the stuff inside Pepper,
not Pepper herself. That's right.
516
00:26:31,600 --> 00:26:34,400
I can't imagine us seeing
Pepper in a court any time soon.
517
00:26:34,400 --> 00:26:37,280
If she's sending you off to jail,
going like that... No.
518
00:26:37,280 --> 00:26:39,720
This example sounds like
a very positive thing.
519
00:26:39,720 --> 00:26:42,360
You could get through, I imagine,
a big backlog of cases
520
00:26:42,360 --> 00:26:44,400
with something like this
on your side.
521
00:26:44,400 --> 00:26:47,520
But algorithms in the courtroom,
they're not...they haven't been
522
00:26:47,520 --> 00:26:50,120
universally positive, have they?
Yeah, that's right.
523
00:26:50,120 --> 00:26:51,760
There is a big issue of trust
524
00:26:51,760 --> 00:26:54,560
as well as whether the actual
technology works in itself.
525
00:26:54,560 --> 00:26:57,760
Because if you're putting all of
history into a series of equations,
526
00:26:57,760 --> 00:27:00,640
I mean, history wasn't exactly fair,
was it? That's right.
527
00:27:00,640 --> 00:27:03,640
And we want to make these decisions
as fair as we possibly can
528
00:27:03,640 --> 00:27:06,400
and get AI technologies to help
us do this. I quite agree.
529
00:27:06,400 --> 00:27:09,160
Katie, thank you very much
for joining us. You're welcome.
530
00:27:09,160 --> 00:27:10,800
APPLAUSE
531
00:27:15,600 --> 00:27:18,600
I think Katie made a really
important point in all that,
532
00:27:18,600 --> 00:27:22,080
because if all you're doing is just
chucking in everything that happened
533
00:27:22,080 --> 00:27:25,080
in the past to your equations,
then you're going to perpetuate
534
00:27:25,080 --> 00:27:27,880
all of society's biggest
imbalances going forward.
535
00:27:27,880 --> 00:27:32,400
And a machine is only ever as good
as the data that it's trained on.
536
00:27:32,400 --> 00:27:34,760
Let me show you what
I'm talking about here,
537
00:27:34,760 --> 00:27:38,280
because I'd like you to welcome
back to the stage Matt Parker
538
00:27:38,280 --> 00:27:40,640
with a very special
shoe-detecting machine.
539
00:27:40,640 --> 00:27:43,000
APPLAUSE
540
00:27:45,480 --> 00:27:49,840
Is that a new shirt, Matt?
I bring a range of shirts.
541
00:27:49,840 --> 00:27:54,280
So, this is my shoe-detecting
device. OK. You can have a go.
542
00:27:54,280 --> 00:27:57,600
I call it Shoe Do You Think
You Are? OK. Very nice.
543
00:27:57,600 --> 00:28:01,880
If you point it at something, it
can tell you if it's a shoe or not.
544
00:28:01,880 --> 00:28:03,600
All right. Let me give it a go.
Try your face.
545
00:28:03,600 --> 00:28:04,880
OK, that is not a shoe.
546
00:28:04,880 --> 00:28:06,720
Not a shoe. Correct.
Thank you very much.
547
00:28:06,720 --> 00:28:08,280
Let's try aiming at your shoes. Yep.
548
00:28:08,280 --> 00:28:10,160
Oh, hang on. It says
they're not shoes.
549
00:28:10,160 --> 00:28:12,520
The device is fine.
We just haven't trained it yet.
550
00:28:12,520 --> 00:28:14,360
We have to put in some
training data.
551
00:28:14,360 --> 00:28:16,520
You were here for lecture two.
I was.
552
00:28:16,520 --> 00:28:18,560
So we've got to teach it
what a shoe looks like
553
00:28:18,560 --> 00:28:20,920
and then it'll be amazing.
OK. All right, then.
554
00:28:20,920 --> 00:28:24,800
So let's get a group of people
to help me train this shoe.
555
00:28:24,800 --> 00:28:26,760
So let's get some people from
over here.
556
00:28:26,760 --> 00:28:28,800
Let's get that little column
up there. And up here.
557
00:28:28,800 --> 00:28:31,640
You guys all want to come down here
and help me train this machine.
558
00:28:31,640 --> 00:28:33,760
Round of applause as
they come to the stage.
559
00:28:33,760 --> 00:28:35,480
APPLAUSE
560
00:28:40,280 --> 00:28:41,880
OK, so here's
what we're going to do.
561
00:28:41,880 --> 00:28:43,280
We'll give you 20 seconds,
562
00:28:43,280 --> 00:28:46,200
and I would just like you
to draw a picture of a shoe
563
00:28:46,200 --> 00:28:47,640
that I can scan into the machine.
564
00:28:47,640 --> 00:28:49,080
Here we go.
565
00:28:57,200 --> 00:28:59,280
Two, one, stop.
566
00:28:59,280 --> 00:29:02,520
OK. Let's turn those round.
Hold them up to the camera.
567
00:29:02,520 --> 00:29:04,200
We'll have a look. Let's scan these.
568
00:29:04,200 --> 00:29:05,800
We've got some lace-y numbers.
569
00:29:05,800 --> 00:29:08,160
Some nice... A top shot of a
shoe there. This is great.
570
00:29:08,160 --> 00:29:10,520
Lots of laces. You have
some trainers there.
571
00:29:10,520 --> 00:29:12,680
This is all perfect.
OK, great. Perfect.
572
00:29:12,680 --> 00:29:14,680
It's all... I think
it's all trained, Matt.
573
00:29:14,680 --> 00:29:16,440
So let's give it a go.
574
00:29:16,440 --> 00:29:18,280
Let's try your shoes there.
575
00:29:18,280 --> 00:29:19,600
OK. Perfect.
576
00:29:19,600 --> 00:29:22,080
Spotted your shoes. Amazing.
It's great. It works.
577
00:29:22,080 --> 00:29:24,800
Is it detecting shoes? Yeah, look.
Yeah, perfect.
578
00:29:24,800 --> 00:29:27,240
That's amazing. Can I just pop
through and get a closer look?
579
00:29:27,240 --> 00:29:29,280
That's incredible. Oh.
580
00:29:29,280 --> 00:29:33,400
And it's not failed yet on any
of these near identical shoes?
581
00:29:33,400 --> 00:29:36,040
No. I mean you did all draw very
similar shoes. Give these a go.
582
00:29:36,040 --> 00:29:37,280
Let's try this one.
583
00:29:37,280 --> 00:29:40,120
Oh, no. It's saying no.
It doesn't detect your shoes at all.
584
00:29:40,120 --> 00:29:42,720
You need a much more
diverse range of training.
585
00:29:42,720 --> 00:29:44,960
I'm very disappointed in all of you.
586
00:29:44,960 --> 00:29:46,200
LAUGHTER
587
00:29:46,200 --> 00:29:48,560
I mean, guys, you did just
basically draw your own shoes.
588
00:29:48,560 --> 00:29:49,880
So let's try again.
589
00:29:49,880 --> 00:29:51,880
Turn your paper round, try again,
590
00:29:51,880 --> 00:29:55,640
and this time try and think of as
wide a range of shoes as possible.
591
00:29:55,640 --> 00:29:56,720
Off you go.
592
00:29:58,560 --> 00:30:01,600
Three, two, one, stop.
593
00:30:01,600 --> 00:30:04,040
OK. Here we go. Let's get
this in scanning mode.
594
00:30:04,040 --> 00:30:05,280
We're ready to go.
595
00:30:05,280 --> 00:30:07,440
Turn it around. Hold it up
to the camera. That's it.
596
00:30:07,440 --> 00:30:09,560
It looks pretty much like a shoe.
This is pretty good.
597
00:30:09,560 --> 00:30:11,160
This is great. Got some high heels.
598
00:30:11,160 --> 00:30:13,000
Lots of high heels.
Got flip-flops, got boots.
599
00:30:13,000 --> 00:30:14,680
We've got some wellies there.
600
00:30:14,680 --> 00:30:17,280
This is lovely. Ballerina shoe.
That's great. Perfect.
601
00:30:17,280 --> 00:30:19,800
Matt, I think it's good now.
I think it's got to be good now.
602
00:30:19,800 --> 00:30:22,320
Surely this works on all shoes now.
603
00:30:22,320 --> 00:30:24,480
Let's have a closer look at it.
604
00:30:24,480 --> 00:30:26,400
Excuse me.
MURMURS IN AUDIENCE
605
00:30:28,080 --> 00:30:29,640
Shoe? Shoe.
606
00:30:29,640 --> 00:30:31,560
Not shoe. Not a shoe?!
607
00:30:31,560 --> 00:30:33,120
I mean... Shoe.
608
00:30:33,120 --> 00:30:37,360
..that's debatable whether that's
a shoe. It's definitely footwear.
609
00:30:37,360 --> 00:30:38,640
OK. Let's give it a go.
610
00:30:38,640 --> 00:30:41,040
With the right data it could
have detected that as a shoe.
611
00:30:41,040 --> 00:30:42,880
This is true, with the right data.
612
00:30:42,880 --> 00:30:45,720
But I think the point here is
that even when you know to draw
613
00:30:45,720 --> 00:30:47,560
as diverse a range
of shoes as possible,
614
00:30:47,560 --> 00:30:50,160
it's actually just really
hard to think of everything.
615
00:30:50,160 --> 00:30:51,800
Just unbelievable.
616
00:30:54,000 --> 00:30:55,520
Thank you very much, Matt.
617
00:30:55,520 --> 00:30:57,680
A round of applause
for our volunteers.
618
00:30:57,680 --> 00:30:59,080
APPLAUSE
619
00:31:01,080 --> 00:31:03,400
Now, there is a really
important point in all of this,
620
00:31:03,400 --> 00:31:07,320
because if you don't think through
all of the possible situations
621
00:31:07,320 --> 00:31:09,680
that your machine needs to include,
622
00:31:09,680 --> 00:31:12,880
it can end up having very big,
real-life consequences.
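A toy illustration of that trap (nothing to do with Matt's actual device, and the "features" are invented): a detector that scores new drawings purely by similarity to its training examples will happily accept more of the same and reject any style it has never been shown:

# Illustrative only: a toy "shoe detector" that has only ever seen one style.
# It scores new drawings by distance to the average of its training examples,
# so anything unlike the training set is rejected, exactly as in the demo.
trained_on = [(9, 1), (10, 1), (9, 2), (10, 2)]          # trainers: (laces, heel height)
centre = (sum(x for x, _ in trained_on) / len(trained_on),
          sum(y for _, y in trained_on) / len(trained_on))

def looks_like_a_shoe(drawing, threshold=3.0):
    dist = ((drawing[0] - centre[0]) ** 2 + (drawing[1] - centre[1]) ** 2) ** 0.5
    return dist < threshold

print(looks_like_a_shoe((9, 1)))    # another trainer: True
print(looks_like_a_shoe((0, 12)))   # a high heel: False - not bad data, just missing data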
623
00:31:12,880 --> 00:31:14,480
To tell us a little bit more,
624
00:31:14,480 --> 00:31:17,440
please join me in welcoming
from the University of York
625
00:31:17,440 --> 00:31:19,640
an expert in image recognition
for surveillance,
626
00:31:19,640 --> 00:31:21,360
Dr Kofi Appiah.
627
00:31:21,360 --> 00:31:23,600
APPLAUSE
628
00:31:25,720 --> 00:31:26,960
Hey, Kofi.
629
00:31:32,560 --> 00:31:36,520
Now, Kofi, you've done a lot of work
in facial recognition before, right?
630
00:31:36,520 --> 00:31:39,080
That's right. Tell me, how does
facial detection work?
631
00:31:39,080 --> 00:31:42,720
So for face detection, all that
we need is to be able to pick out
632
00:31:42,720 --> 00:31:46,400
the elements of the face, which is
the eye - the key features -
633
00:31:46,400 --> 00:31:48,080
the eye, the nose, the mouth,
634
00:31:48,080 --> 00:31:50,400
and to be able to pick out
these features,
635
00:31:50,400 --> 00:31:53,520
there's a big contrast
that we can find between the eye
636
00:31:53,520 --> 00:31:56,800
and the eyelashes,
the eyelashes and the skin itself.
637
00:31:56,800 --> 00:31:59,400
When it comes to the mouth,
the lips, you have an edge there.
638
00:31:59,400 --> 00:32:01,760
So these are the big contrasts
that we are able to pick
639
00:32:01,760 --> 00:32:05,160
and as far as we're able to find
eyes, nose, mouth, we've got a face.
640
00:32:05,160 --> 00:32:07,800
So is it looking... If I come
over here, then, to this one here.
641
00:32:07,800 --> 00:32:09,560
Is it looking for areas
of light and dark?
642
00:32:09,560 --> 00:32:11,160
Say, on my nose, for
instance? Yes.
643
00:32:11,160 --> 00:32:13,920
It's kind of darker on either side
and then lighter down the middle.
644
00:32:13,920 --> 00:32:16,800
That's correct. And if it gets a
strip of darker pixels and lighter
645
00:32:16,800 --> 00:32:18,800
pixels, it knows it's found
a nose? That's right.
646
00:32:18,800 --> 00:32:20,400
It's looking for these key
features.
647
00:32:20,400 --> 00:32:22,880
This is what a face normally
will have - the key features.
648
00:32:22,880 --> 00:32:25,440
And that's what we train our systems
to use and recognise.
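The light-and-dark contrast features Kofi describes are essentially how the classic Haar-cascade face detectors work. A minimal sketch using OpenCV's bundled pre-trained cascade (this is OpenCV's stock detector and file name, not the system used on stage, and the input image name is hypothetical):

# Illustrative only: classic contrast-feature face detection with OpenCV's
# pre-trained Haar cascade (not the detector used in the demonstration).
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("audience.jpg")               # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)   # the features are light/dark contrasts

faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:                       # draw a bounding box, as on stage
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

print(f"faces found: {len(faces)}")
cv2.imwrite("audience_faces.jpg", image)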
649
00:32:25,440 --> 00:32:28,040
Perfect. Just one thing, though,
Kofi, that I'm noticing here.
650
00:32:28,040 --> 00:32:29,960
It's getting my face OK, but...
651
00:32:29,960 --> 00:32:33,520
Right. So it's not able to pick
my face,
652
00:32:33,520 --> 00:32:36,840
and it's relating to the data
that you just mentioned.
653
00:32:36,840 --> 00:32:39,560
This system has not been trained
with enough data.
654
00:32:39,560 --> 00:32:43,680
The contrasting features that
you've got between your eyebrows
655
00:32:43,680 --> 00:32:46,760
and the skin texture is
different from mine.
656
00:32:46,760 --> 00:32:48,080
And it's not picking it up.
657
00:32:48,080 --> 00:32:50,920
So in this case, I'm going
to have this...
658
00:32:52,880 --> 00:32:55,160
It's picking it up as a face
because it can pick some
659
00:32:55,160 --> 00:32:57,680
of the salient features
that I was talking about.
660
00:32:57,680 --> 00:32:59,760
It's going to put a bounding
box around it.
661
00:33:01,280 --> 00:33:03,240
Whereas in my case, no.
662
00:33:03,240 --> 00:33:06,000
But this stuff is actually, you
know, it's a really big deal.
663
00:33:06,000 --> 00:33:07,520
It's not just in cameras.
664
00:33:07,520 --> 00:33:10,040
We're now seeing facial
detection in all kinds of things.
665
00:33:10,040 --> 00:33:12,440
Passport queues, for instance.
That's right, yes.
666
00:33:12,440 --> 00:33:15,120
Obviously, in this case,
if you've got a face like me,
667
00:33:15,120 --> 00:33:17,800
unfortunately, it's going
to be a long delay for you.
668
00:33:17,800 --> 00:33:20,480
We're using it in law
enforcement as well.
669
00:33:20,480 --> 00:33:23,600
An example, if it's not able
to pick the right face,
670
00:33:23,600 --> 00:33:26,800
you're going to be prosecuted
for something that you've not done.
671
00:33:26,800 --> 00:33:30,240
We're using this facial recognition
to be able to unlock phones.
672
00:33:30,240 --> 00:33:32,080
It's like a password now.
673
00:33:32,080 --> 00:33:36,120
So if it's not working right,
look at the harm that it can do.
674
00:33:36,120 --> 00:33:37,800
Is it improving?
675
00:33:37,800 --> 00:33:39,440
Are the algorithms getting better?
676
00:33:39,440 --> 00:33:41,480
Yes, it's getting better and better.
677
00:33:41,480 --> 00:33:44,360
And what they're trying to do
is to make it non-biased
678
00:33:44,360 --> 00:33:47,240
by training with diverse,
non-biased data sets.
679
00:33:47,240 --> 00:33:49,560
As you can see, you can pick
some of the features,
680
00:33:49,560 --> 00:33:52,960
it can pick mine. And works with
a whole different range of faces.
681
00:33:52,960 --> 00:33:54,400
It's trying to fix that.
682
00:33:54,400 --> 00:33:57,720
So it's improving over time
and we're getting there.
683
00:33:57,720 --> 00:34:01,360
Obviously, the issues of bias and
fairness are incredibly important
684
00:34:01,360 --> 00:34:02,960
when it comes to facial recognition.
685
00:34:02,960 --> 00:34:05,480
But there is another concern
that people have when it comes
686
00:34:05,480 --> 00:34:08,680
to this technology, which is
that some people don't like the idea
687
00:34:08,680 --> 00:34:11,040
that they can be
identified in a crowd
688
00:34:11,040 --> 00:34:12,720
based on their facial features.
689
00:34:12,720 --> 00:34:15,440
Some people don't like the idea
of losing their anonymity.
690
00:34:15,440 --> 00:34:18,520
So some people have been
experimenting with different ways
691
00:34:18,520 --> 00:34:21,520
to try and trick facial
recognition cameras.
692
00:34:21,520 --> 00:34:25,640
And we are lucky to
be joined from Glow Up
693
00:34:25,640 --> 00:34:28,720
by make-up artist
Tiffany Hunt and Eva.
694
00:34:28,720 --> 00:34:30,480
APPLAUSE
695
00:34:34,440 --> 00:34:36,240
If you come and stand round here.
696
00:34:37,320 --> 00:34:41,000
Now, Eva, you have had,
as we can see,
697
00:34:41,000 --> 00:34:44,360
Tiffany has been doing your make-up
for a little while backstage.
698
00:34:44,360 --> 00:34:46,240
Looks very remarkable.
699
00:34:46,240 --> 00:34:50,080
Is this the kind of thing that would
work to fool facial detection?
700
00:34:50,080 --> 00:34:53,240
Oh, yes. So this kind of system,
because it's looking
701
00:34:53,240 --> 00:34:58,200
for that contrasting bit between
the eye, the eyelashes, the nose,
702
00:34:58,200 --> 00:35:00,560
but the way the face
has been painted now,
703
00:35:00,560 --> 00:35:02,200
it's breaking all the symmetry.
704
00:35:02,200 --> 00:35:04,080
It can't find the nose.
705
00:35:04,080 --> 00:35:06,280
Tiffany, talk us through
what you were going for here.
706
00:35:06,280 --> 00:35:08,160
It's quite dramatic!
707
00:35:08,160 --> 00:35:10,160
Just a bit.
Just something, like, natural!
708
00:35:10,160 --> 00:35:11,520
Basically, what I was thinking,
709
00:35:11,520 --> 00:35:13,640
was something that was a
monochromatic illusion.
710
00:35:13,640 --> 00:35:15,840
I really wanted to break up
specific facial features
711
00:35:15,840 --> 00:35:17,920
such as this area in the
central area of the face,
712
00:35:17,920 --> 00:35:19,920
which is the area that
gets picked up the most.
713
00:35:19,920 --> 00:35:22,720
Also changing the eye shape,
the lashes and just really going
714
00:35:22,720 --> 00:35:25,160
for something with colours
that are just so different
715
00:35:25,160 --> 00:35:27,840
to your usual skin tone and just
changing her face totally, really.
716
00:35:27,840 --> 00:35:29,320
Do you like it?
717
00:35:29,320 --> 00:35:32,360
Yeah. Friday night out.
718
00:35:32,360 --> 00:35:34,800
But the proof is in the pudding.
Let's have a look.
719
00:35:34,800 --> 00:35:37,840
So, Tiffany, we're picking you up.
If you step to one side.
720
00:35:37,840 --> 00:35:40,360
It's not getting you at all!
721
00:35:40,360 --> 00:35:43,480
I think a big round of applause
there for the dazzling make-up.
722
00:35:43,480 --> 00:35:45,160
Very impressive.
723
00:35:45,160 --> 00:35:47,000
To Eva, Tiffany and Kofi. Thank you.
724
00:35:47,000 --> 00:35:48,760
APPLAUSE
725
00:35:55,480 --> 00:35:58,600
We've got some ways now that we
know we can avoid being detected
726
00:35:58,600 --> 00:36:01,400
by facial recognition cameras.
Might look a little bit crazy
727
00:36:01,400 --> 00:36:03,520
if you're wandering
around with that all the time.
728
00:36:03,520 --> 00:36:06,240
But when it comes to protecting
our privacy, there are some people
729
00:36:06,240 --> 00:36:08,760
who are worried that
even this won't be enough.
730
00:36:08,760 --> 00:36:11,240
There are some people who are
worried that with the help
731
00:36:11,240 --> 00:36:14,800
of mathematical algorithms,
we are building and processing vast
732
00:36:14,800 --> 00:36:17,400
profiles on each
and every one of us,
733
00:36:17,400 --> 00:36:19,040
often without our knowledge.
734
00:36:19,040 --> 00:36:21,960
Now, to explain this,
please welcome to the stage
735
00:36:21,960 --> 00:36:25,280
computer scientist extraordinaire,
Dr Anne-Marie Imafidon.
736
00:36:25,280 --> 00:36:27,200
APPLAUSE
737
00:36:35,920 --> 00:36:39,680
Anne-Marie, do you think that we're
seeing the death of privacy?
738
00:36:39,680 --> 00:36:42,320
In some ways we are,
in other ways we aren't.
739
00:36:42,320 --> 00:36:46,080
I know if you go back far enough,
we used to communicate with fire
740
00:36:46,080 --> 00:36:48,920
and smoke signals or by
yelling things long distance.
741
00:36:48,920 --> 00:36:51,240
Not that long ago, really!
Yeah, basically.
742
00:36:51,240 --> 00:36:54,520
But now, with the algorithms
that we have and the time
743
00:36:54,520 --> 00:36:58,280
that we spend online, really
in-depth profiles are being built up
744
00:36:58,280 --> 00:37:00,120
on all of us continuously.
745
00:37:00,120 --> 00:37:03,520
And it's those profiles
that are connecting pieces
of information together, right?
746
00:37:03,520 --> 00:37:05,240
That's kind of the big thing.
Exactly.
747
00:37:05,240 --> 00:37:07,920
So where those things might
not have been private before,
748
00:37:07,920 --> 00:37:10,960
being able to connect them together
to build up that picture of someone,
749
00:37:10,960 --> 00:37:13,640
that was harder.
Whereas now it's very easy.
750
00:37:13,640 --> 00:37:15,320
And we use things like social media
751
00:37:15,320 --> 00:37:18,240
where we're putting that information
out. Voluntarily. Exactly.
752
00:37:18,240 --> 00:37:20,120
So the profiles are pretty complex.
753
00:37:20,120 --> 00:37:23,960
OK. So to give us a sense
of these kind of profiles,
754
00:37:23,960 --> 00:37:26,720
I wanted to get, let's say,
three volunteers here,
755
00:37:26,720 --> 00:37:29,840
because Anne-Marie's got a
little demonstration for us.
756
00:37:29,840 --> 00:37:32,240
OK, perfect. All right.
757
00:37:32,240 --> 00:37:35,120
OK. We'll go for you.
You get to come down.
758
00:37:35,120 --> 00:37:37,640
Let's go for... Let's go for
you there, in the jumper.
759
00:37:37,640 --> 00:37:39,880
And you there,
in the stripy jumper.
760
00:37:39,880 --> 00:37:41,680
APPLAUSE
761
00:37:41,680 --> 00:37:43,720
Let's bring you down
to the stage, come on.
762
00:37:46,160 --> 00:37:47,840
What's your name? Natasha.
763
00:37:47,840 --> 00:37:50,440
Natasha, OK, perfect. Natasha,
if you want to go there.
764
00:37:50,440 --> 00:37:52,280
What was your name?
Kieran. OK, perfect.
765
00:37:52,280 --> 00:37:54,720
Do you want to go there, Kieran?
And what was your name?
766
00:37:54,720 --> 00:37:57,320
Caitlin. Perfect.
You jump round there.
767
00:37:57,320 --> 00:38:00,440
Anne-Marie is going to talk us
through a little demonstration.
768
00:38:00,440 --> 00:38:03,960
Yes. So, each of you have got
a website that we've loaded up
769
00:38:03,960 --> 00:38:06,080
for you on your laptop there.
770
00:38:06,080 --> 00:38:08,240
And I want to make sure
you've accepted cookies.
771
00:38:08,240 --> 00:38:11,520
And I'd like you to get clicking
and browsing on these sites
772
00:38:11,520 --> 00:38:13,400
and doing some cool things.
773
00:38:13,400 --> 00:38:16,800
And as you've accepted cookies, I'm
going to give each of you a cookie.
774
00:38:20,680 --> 00:38:23,000
They are rather large cookies.
775
00:38:23,000 --> 00:38:24,760
Thank you very much.
776
00:38:26,480 --> 00:38:29,480
So as you're browsing
and you're clicking around,
777
00:38:29,480 --> 00:38:33,320
you have this cookie on
the site, on your laptop,
778
00:38:33,320 --> 00:38:36,560
and there's different data and
information that you're putting in.
779
00:38:36,560 --> 00:38:38,200
So I can see you here.
780
00:38:38,200 --> 00:38:40,680
You're about to watch
some YouTube videos.
781
00:38:40,680 --> 00:38:42,560
I can see which
channel that you're on.
782
00:38:42,560 --> 00:38:45,040
So I'm going to pop a little
bit of a chocolate chip there.
783
00:38:45,040 --> 00:38:46,360
That's some data.
784
00:38:46,360 --> 00:38:49,480
I can see the artist
that you've got as well.
785
00:38:49,480 --> 00:38:52,040
So I'm going to pop a little bit
more data
786
00:38:52,040 --> 00:38:54,200
on your chocolate chip cookie.
787
00:38:54,200 --> 00:38:56,360
I can see you're shopping here.
788
00:38:56,360 --> 00:38:58,080
Brilliant, curtains.
789
00:38:58,080 --> 00:38:59,720
That's a good bit of data.
790
00:38:59,720 --> 00:39:03,120
Just got a new house and she
likes blue and white curtains.
791
00:39:03,120 --> 00:39:05,840
So two more bits of data
that we've got there.
792
00:39:05,840 --> 00:39:07,280
Fantastic.
793
00:39:07,280 --> 00:39:09,520
I think you've added them
to your basket as well,
794
00:39:09,520 --> 00:39:11,320
so we're going to add
that into your cookie,
795
00:39:11,320 --> 00:39:13,680
just so when you come back,
you don't need to browse blue
796
00:39:13,680 --> 00:39:15,040
and white curtains again.
797
00:39:15,040 --> 00:39:18,160
And just over here, we're
looking at your address,
798
00:39:18,160 --> 00:39:20,080
so that's a bit of data.
799
00:39:20,080 --> 00:39:22,320
And I can see you're getting
directions to school,
800
00:39:22,320 --> 00:39:24,760
so that's even more data
that we've got popped in there.
801
00:39:24,760 --> 00:39:26,720
This is a lot of data now,
Anne-Marie.
802
00:39:26,720 --> 00:39:29,680
A lot of data just from browsing
and entering information
803
00:39:29,680 --> 00:39:33,120
that is going into these cookies
that are stored on their laptops.
804
00:39:33,120 --> 00:39:34,720
Now, if you stop browsing now,
805
00:39:34,720 --> 00:39:38,280
you'll see that we've got quite
a lot of data on these cookies.
806
00:39:38,280 --> 00:39:41,400
And these cookies don't
just stay within your laptop.
807
00:39:41,400 --> 00:39:43,840
Many of you might have
accepted cookies
808
00:39:43,840 --> 00:39:46,240
that have got third
party sites as well.
809
00:39:46,240 --> 00:39:48,640
So here's our third party site.
What does that mean then?
810
00:39:48,640 --> 00:39:50,960
And you actually get that data
as well from their cookies.
811
00:39:50,960 --> 00:39:54,120
So you know it's blue and white
curtains that she's looking for.
812
00:39:54,120 --> 00:39:57,200
So if you accept third party
cookies, that means another person
813
00:39:57,200 --> 00:40:00,040
can just buy that data and add
it to a whole host of other data
814
00:40:00,040 --> 00:40:02,080
that they've got from
loads of other websites
815
00:40:02,080 --> 00:40:04,040
and build this incredibly
detailed profile.
816
00:40:04,040 --> 00:40:05,560
That's exactly what's happening.
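[Editor's note: the oversized biscuits stand in for browser cookies. Each site drops in small fragments of data, and anyone who can read cookies from several sites can join those fragments into one profile. A minimal sketch of that joining step follows; the site names, keys and values are all invented for illustration.]

```python
# Toy illustration of merging per-site cookie fragments into one profile.
# All site names, keys and values here are invented.
from collections import defaultdict

# Fragments each first-party site might store about the same visitor.
site_cookies = {
    "video-site.example":    {"visitor_id": "abc123", "channel": "maths lectures"},
    "shopping-site.example": {"visitor_id": "abc123", "basket": "blue and white curtains"},
    "maps-site.example":     {"visitor_id": "abc123", "route": "home to school"},
}

def merge_profiles(cookies_by_site):
    """Join fragments that share a visitor id into a single profile."""
    profiles = defaultdict(dict)
    for site, data in cookies_by_site.items():
        visitor = data["visitor_id"]
        for key, value in data.items():
            if key != "visitor_id":
                profiles[visitor][f"{site}/{key}"] = value
    return dict(profiles)

# One visitor id now links viewing habits, shopping and a home-to-school route.
print(merge_profiles(site_cookies))
```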
817
00:40:05,560 --> 00:40:08,240
Do we realise that's what we're
really doing when clicking around?
818
00:40:08,240 --> 00:40:09,520
Probably not. OK.
819
00:40:09,520 --> 00:40:11,960
But the thing is, it's
not just the information
820
00:40:11,960 --> 00:40:16,440
that you're clicking on that
helps to build this profile of you.
821
00:40:16,440 --> 00:40:19,760
It's also things that you
are not clicking on, too.
822
00:40:19,760 --> 00:40:23,800
And so for this, let me welcome
to the stage Marc Kerstein.
823
00:40:23,800 --> 00:40:27,520
APPLAUSE
824
00:40:31,960 --> 00:40:34,600
You build websites, don't you?
Yes, that's right.
825
00:40:34,600 --> 00:40:36,720
And you in fact built
a website for us.
826
00:40:36,720 --> 00:40:39,480
Yes. So, let's have a look at this.
Talk us through what you've built.
827
00:40:39,480 --> 00:40:41,800
This is just a web page that
I've created that lists
828
00:40:41,800 --> 00:40:44,320
a bunch of animals. OK.
Perfect. All right.
829
00:40:44,320 --> 00:40:46,240
So, what I'd like you
to do, if it's OK,
830
00:40:46,240 --> 00:40:48,320
is just have a little
look through this
831
00:40:48,320 --> 00:40:51,760
and have a little flick through
and stop and have a little
832
00:40:51,760 --> 00:40:56,720
read of whichever one you think ends
up being particularly interesting.
833
00:40:56,720 --> 00:40:59,720
Happy? Have you stopped on one?
Have you picked one?
834
00:40:59,720 --> 00:41:04,600
Yup. OK, perfect. All right.
Now, Marc, which one did she pick?
835
00:41:04,600 --> 00:41:08,360
That would be the dog, is that
correct? And how did you know that?
836
00:41:08,360 --> 00:41:10,440
Yes, so, it's not just
about the information
837
00:41:10,440 --> 00:41:12,680
you're actively putting in,
838
00:41:12,680 --> 00:41:16,080
but it's enough to simply scroll
through something to find out
839
00:41:16,080 --> 00:41:19,160
what someone's thinking of,
so in this case, this web page,
840
00:41:19,160 --> 00:41:22,280
as you scroll through, you're
seeing the different animals that
841
00:41:22,280 --> 00:41:25,560
are being looked at in real time,
being transmitted to the server.
842
00:41:25,560 --> 00:41:28,000
So, Anne-Marie, websites
aren't just tracking what you're
843
00:41:28,000 --> 00:41:31,240
clicking on, they're tracking
what you're pausing on...? Yeah.
844
00:41:31,240 --> 00:41:32,800
How fast your mouse is moving,
845
00:41:32,800 --> 00:41:35,520
the kind of device that you're
using to access the website.
846
00:41:35,520 --> 00:41:38,840
They can even tell the difference
between a click and a tap.
847
00:41:38,840 --> 00:41:41,840
But often, there's even more,
so things like your IP address,
848
00:41:41,840 --> 00:41:45,120
which might give a clue as to where
you are physically browsing from.
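[Editor's note: Marc's animal page illustrates that no click is needed. Simply reporting which item is in view, and for how long, reveals what the reader paused on. Below is a minimal server-side sketch of that dwell-time idea, with an invented event stream.]

```python
# Toy dwell-time tracker: given a stream of "item currently in view" events,
# work out which item the visitor lingered on longest. Event data is invented.
from collections import defaultdict

# (timestamp_in_seconds, item_in_view) events, as a page might report them
# while the visitor scrolls through the list of animals.
events = [
    (0.0, "cat"), (1.2, "dog"), (9.8, "elephant"), (11.0, "fox"), (12.5, None),
]

def dwell_times(view_events):
    """Accumulate how long each item stayed in view between successive events."""
    totals = defaultdict(float)
    for (t0, item), (t1, _next_item) in zip(view_events, view_events[1:]):
        if item is not None:
            totals[item] += t1 - t0
    return dict(totals)

times = dwell_times(events)
print(times)                                  # seconds each item was in view
print("longest pause:", max(times, key=times.get))   # 'dog' in this toy data
```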
849
00:41:45,120 --> 00:41:48,640
Just imagine how much information
you could get on someone
850
00:41:48,640 --> 00:41:52,200
if you're a professional company
that had been doing it for years.
851
00:41:52,200 --> 00:41:54,520
Exactly, and that's
why it's so valuable.
852
00:41:54,520 --> 00:41:57,400
There's so many insights that we
can pick up from lots of different
853
00:41:57,400 --> 00:41:59,960
websites. There's so many of us
that are using these platforms
854
00:41:59,960 --> 00:42:02,120
and so those cookies
are pretty valuable
855
00:42:02,120 --> 00:42:04,960
over time to lots of different
people. Incredibly valuable indeed.
856
00:42:04,960 --> 00:42:07,560
Thank you.
APPLAUSE
857
00:42:07,560 --> 00:42:08,880
And thank you, Anne-Marie.
858
00:42:13,160 --> 00:42:14,920
Now, with all of this stuff,
859
00:42:14,920 --> 00:42:16,760
we're not saying that
it's necessarily bad,
860
00:42:16,760 --> 00:42:19,400
but I think it's important
to realise exactly how
861
00:42:19,400 --> 00:42:21,200
much we are giving away
862
00:42:21,200 --> 00:42:24,640
because I think if someone can
work out what kind of person
863
00:42:24,640 --> 00:42:27,520
you are, they can use that
information to target you
864
00:42:27,520 --> 00:42:30,640
with very precisely
tailored messages.
865
00:42:30,640 --> 00:42:33,640
Now, that might be adverts to
persuade you to buy something,
866
00:42:33,640 --> 00:42:36,480
but it might also be linked
to political messages to
867
00:42:36,480 --> 00:42:39,280
persuade you to
vote in a certain way.
868
00:42:39,280 --> 00:42:42,400
And that is something that's
been in the news a lot recently.
869
00:42:42,400 --> 00:42:46,920
I think it is important to remember
this is kind of how our whole
870
00:42:46,920 --> 00:42:51,240
online world is designed to work,
but what makes people
871
00:42:51,240 --> 00:42:56,200
particularly unhappy is when that
targeting happens with fake news.
872
00:42:56,200 --> 00:43:00,000
But even there, there is
maths hiding behind the scenes.
873
00:43:00,000 --> 00:43:02,480
Because with algorithms
on your side, it is
874
00:43:02,480 --> 00:43:06,360
easier to create realistic fake
stuff now than it's ever been before
875
00:43:06,360 --> 00:43:10,400
and I want to show you just how easy
it can be to create fake stuff.
876
00:43:10,400 --> 00:43:14,400
I want to see if you as an audience
are capable of spotting some
877
00:43:14,400 --> 00:43:16,720
fake classical music.
878
00:43:16,720 --> 00:43:18,600
We're going to play
a little game that
879
00:43:18,600 --> 00:43:22,120
I like to call
Is The Bach Worse Than The Byte?
880
00:43:22,120 --> 00:43:25,880
OK? I was pretty proud of that.
Thanks. What they're going to do,
881
00:43:25,880 --> 00:43:29,040
they're going to play you two
pieces of classical music.
882
00:43:29,040 --> 00:43:33,960
One of them was composed by the
very great Vivaldi and the other
883
00:43:33,960 --> 00:43:39,440
one was composed by a mathematical
algorithm in the style of Vivaldi.
884
00:43:39,440 --> 00:43:41,120
What I want you to do is to see
885
00:43:41,120 --> 00:43:43,640
if you can guess which
one is which, OK?
886
00:43:43,640 --> 00:43:47,760
So, two pieces of music, your job,
spot the real Vivaldi.
887
00:43:47,760 --> 00:43:49,720
OK, here we go,
piece of music number one.
888
00:43:53,920 --> 00:43:57,680
THEY PLAY
889
00:44:22,920 --> 00:44:25,720
That was lovely.
APPLAUSE
890
00:44:29,360 --> 00:44:31,840
OK.
That was piece of music number one.
891
00:44:31,840 --> 00:44:33,520
Here is piece of music number two.
892
00:44:37,040 --> 00:44:40,760
THEY PLAY
893
00:44:59,880 --> 00:45:02,680
APPLAUSE
894
00:45:04,360 --> 00:45:06,760
OK.
895
00:45:06,760 --> 00:45:11,720
But one of those was a fake, OK?
So you've got to spot which one.
896
00:45:11,720 --> 00:45:16,080
If you think that the real Vivaldi
was the first one, give me a cheer.
897
00:45:16,080 --> 00:45:17,760
THEY CHEER
898
00:45:17,760 --> 00:45:21,000
If you think the second song was
the real Vivaldi, give me a cheer.
899
00:45:21,000 --> 00:45:23,040
THEY CHEER
900
00:45:23,040 --> 00:45:25,280
I mean, that's
basically 50-50, guys.
901
00:45:25,280 --> 00:45:28,520
Just guessing at random, I see. And
the real answer was...? Which one?
902
00:45:28,520 --> 00:45:30,960
Number one.
Number one was the real Vivaldi.
903
00:45:30,960 --> 00:45:33,520
Well done, if you got that right.
APPLAUSE
904
00:45:38,280 --> 00:45:41,400
Now, OK, that fake Vivaldi
was impressive enough to
905
00:45:41,400 --> 00:45:45,040
kind of fool half a room full
of people as to which one was real,
906
00:45:45,040 --> 00:45:47,760
but the algorithm itself is
actually incredibly simple.
907
00:45:47,760 --> 00:45:51,320
All you do is you take
a catalogue of all of the songs that
908
00:45:51,320 --> 00:45:55,880
Vivaldi has ever written and then
you give the algorithm a chord
909
00:45:55,880 --> 00:45:59,800
and then it will tell you with
what probability each chord
910
00:45:59,800 --> 00:46:03,360
was likely to come
next in Vivaldi's original music.
911
00:46:03,360 --> 00:46:06,280
So all you do, give it a chord,
it gives you a chord back based
912
00:46:06,280 --> 00:46:08,120
on probability,
you give it a chord,
913
00:46:08,120 --> 00:46:10,200
it gives you a chord back
based on probability,
914
00:46:10,200 --> 00:46:13,200
you chain those chords together,
one after the other,
915
00:46:13,200 --> 00:46:16,280
until you end up with
something that is entirely original.
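[Editor's note: described this way, the fake Vivaldi is essentially a Markov chain over chords. Count which chord follows which in the real catalogue, then repeatedly sample the next chord from those counts. A toy sketch follows; the "catalogue" here is an invented chord list, not real Vivaldi.]

```python
# Minimal Markov-chain chord generator in the spirit described above.
import random
from collections import defaultdict

# Invented stand-in for a catalogue of chord sequences.
catalogue = ["Am", "Dm", "G", "C", "F", "Dm", "E", "Am", "Dm", "G", "C", "E", "Am"]

# Count, for each chord, which chords followed it and how often.
transitions = defaultdict(list)
for current, following in zip(catalogue, catalogue[1:]):
    transitions[current].append(following)

def generate(start, length, seed=None):
    """Chain chords together, each sampled from what followed the last one."""
    rng = random.Random(seed)
    sequence = [start]
    for _ in range(length - 1):
        options = transitions.get(sequence[-1])
        if not options:                 # dead end: no observed successor
            break
        sequence.append(rng.choice(options))   # duplicates weight the choice
    return sequence

print(generate("Am", 16, seed=1))
```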
916
00:46:16,280 --> 00:46:17,760
But that, I think,
917
00:46:17,760 --> 00:46:21,640
is the real giveaway here about
the fact that this is the fake.
918
00:46:21,640 --> 00:46:24,800
Those very simple chord transitions
that go on in the background.
919
00:46:24,800 --> 00:46:26,920
And there are other
bits though, right?
920
00:46:26,920 --> 00:46:29,400
There was one bit at the end
there particularly which
921
00:46:29,400 --> 00:46:32,520
looked like it was quite difficult.
I'm afraid there are passages later
922
00:46:32,520 --> 00:46:36,960
on which are impossible to play.
Impossible to play? Impossible, how?
923
00:46:36,960 --> 00:46:40,520
At the speed that the artificial
intelligence has asked us to play,
924
00:46:40,520 --> 00:46:44,240
you physically can't reach extreme
ends of the instrument quick enough.
925
00:46:44,240 --> 00:46:46,120
And I think that's
an important point.
926
00:46:46,120 --> 00:46:48,800
Vivaldi, he had knowledge of how
to play instruments,
927
00:46:48,800 --> 00:46:52,720
he had knowledge of human hands
and human bodies, but the algorithm
928
00:46:52,720 --> 00:46:56,560
doesn't, it just kind of shoves
loads of stuff in together.
929
00:46:56,560 --> 00:47:00,000
But I should tell you that it
turns out that you can use this
930
00:47:00,000 --> 00:47:02,360
very same idea for lots
of other things.
931
00:47:02,360 --> 00:47:05,200
You can use it for music,
but you can also use it for lyrics
932
00:47:05,200 --> 00:47:07,640
and because it's Christmas,
what I decided to do,
933
00:47:07,640 --> 00:47:11,920
I decided to feed in some classic
Christmas carols
934
00:47:11,920 --> 00:47:16,880
and I decided to train my very own
algorithm to generate a whole
935
00:47:16,880 --> 00:47:22,000
new carol, and so for this, please
welcome to the stage Rob Levey,
936
00:47:22,000 --> 00:47:25,520
and join us for a rendition
of a mathematical Christmas carol.
937
00:47:25,520 --> 00:47:27,920
Round of applause for Rob Levey.
APPLAUSE
938
00:47:31,880 --> 00:47:34,320
Feel free to sing along.
939
00:48:19,680 --> 00:48:23,560
CHEERS AND APPLAUSE
940
00:48:25,240 --> 00:48:28,600
Rob Levey and the Aeolian
String Quartet, everybody!
941
00:48:28,600 --> 00:48:30,200
Thank you very much!
942
00:48:34,360 --> 00:48:37,040
Here's the thing about maths
used in this very creative way:
943
00:48:37,040 --> 00:48:40,360
it's very impressive,
but it's also not really human
944
00:48:40,360 --> 00:48:43,560
and there is something a little bit
uncomfortable about a world
945
00:48:43,560 --> 00:48:45,800
where fakes can fool,
946
00:48:45,800 --> 00:48:49,800
especially when we aren't just
mimicking music, but people.
947
00:48:49,800 --> 00:48:52,640
To explain,
let's welcome Dr Alex Adam.
948
00:48:52,640 --> 00:48:55,280
APPLAUSE
949
00:49:00,280 --> 00:49:03,240
Now, Alex, you work in an area
called deep fakes, right?
950
00:49:03,240 --> 00:49:04,800
That's right. What are they?
951
00:49:04,800 --> 00:49:08,160
So, deep fake algorithms
are a special kind of machine
952
00:49:08,160 --> 00:49:12,000
learning algorithm that can
take say one person's face,
953
00:49:12,000 --> 00:49:14,840
or their head, or their
entire body, or their voice,
954
00:49:14,840 --> 00:49:16,840
and turn it into another
person's head.
955
00:49:16,840 --> 00:49:19,400
So, to make people look like they've
done stuff in videos that
956
00:49:19,400 --> 00:49:21,080
they've never done. Absolutely.
957
00:49:21,080 --> 00:49:24,240
OK. I mean, that's quite something,
right? You can manipulate someone.
958
00:49:24,240 --> 00:49:27,280
Yes, it's quite scary. You can
essentially puppet people.
959
00:49:27,280 --> 00:49:28,640
So, how does it work?
960
00:49:28,640 --> 00:49:32,160
So you imagine if I have a video
of you and a video of me.
961
00:49:32,160 --> 00:49:34,600
So we take those two videos
and we split them
962
00:49:34,600 --> 00:49:36,680
up into their sort
of individual frames,
963
00:49:36,680 --> 00:49:38,840
so the images that
make up that video,
964
00:49:38,840 --> 00:49:42,000
and we'll show a computer
all of those different images of me
965
00:49:42,000 --> 00:49:45,000
and you and what the algorithms will
start to learn is they'll start
966
00:49:45,000 --> 00:49:48,320
to learn things like what's
the structure of our different
967
00:49:48,320 --> 00:49:51,280
faces and what are the different
kinds of expressions that we
968
00:49:51,280 --> 00:49:54,600
make, but critically, they learn
how to distinguish the two,
969
00:49:54,600 --> 00:49:56,800
so one of them will know,
OK, I can make red hair
970
00:49:56,800 --> 00:49:59,480
and a particular kind of skin tone
and the other one will say,
971
00:49:59,480 --> 00:50:01,960
I can make brown hair
and a particular kind of skin tone.
972
00:50:01,960 --> 00:50:05,080
But they'll separate
out our expressions, so I'll be able
973
00:50:05,080 --> 00:50:10,360
to say, OK, I want to take you
smiling and put that smile on to me.
974
00:50:10,360 --> 00:50:14,320
So, you're splitting out
what your face looks like
975
00:50:14,320 --> 00:50:17,080
and what your face is doing.
That's exactly right.
976
00:50:17,080 --> 00:50:19,880
And then you can put what your face
is doing on my face.
977
00:50:19,880 --> 00:50:22,840
Whilst keeping all of your other
features the same. OK, all right.
978
00:50:22,840 --> 00:50:25,880
So we've got a couple of puzzles
here. So here's a picture of you.
979
00:50:25,880 --> 00:50:30,800
You're doing a little snarl.
A bit of an eyebrow raise as well.
980
00:50:30,800 --> 00:50:34,640
OK, so you can make me do this face.
Yeah, that's exactly right.
981
00:50:34,640 --> 00:50:37,720
How do you do it?
So, if we just flip that over,
982
00:50:37,720 --> 00:50:40,520
so what we can do is
think of my expression
983
00:50:40,520 --> 00:50:43,480
as having some kind of abstract
mathematical representation,
984
00:50:43,480 --> 00:50:45,800
which I'm sort of thinking
of as these puzzle pieces
985
00:50:45,800 --> 00:50:50,120
and what I can do is I can say,
well, piece 32 and say 37 over there
986
00:50:50,120 --> 00:50:52,640
correspond to me making this snarl
987
00:50:52,640 --> 00:50:55,720
and what I can do is I can just take
those pieces, let's imagine I've got
988
00:50:55,720 --> 00:51:00,440
them over here, and I can just slot
them in over here into your face.
989
00:51:00,440 --> 00:51:02,640
You're shuffling my
face around, essentially.
990
00:51:02,640 --> 00:51:04,000
Yeah, I'm basically saying,
991
00:51:04,000 --> 00:51:07,240
turn on the bits of your face
that will make you make that snarl.
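[Editor's note: the jigsaw pieces stand for a shared, abstract representation of expression. One encoder maps any frame into that representation, and a separate decoder per person maps it back out as that person's face; swapping means encoding a frame of one person and decoding it with the other person's decoder. The sketch below shows only that data flow, with random, untrained matrices standing in for the learned networks.]

```python
# Structural sketch of the shared-encoder / per-person-decoder idea behind
# face-swap deep fakes. The matrices are random and untrained: this shows
# the data flow, not a working face generator.
import numpy as np

rng = np.random.default_rng(0)
FRAME_SIZE, LATENT_SIZE = 64 * 64, 32           # toy "image" and latent sizes

# One shared encoder: frame -> expression/pose code (the "puzzle pieces").
W_encode = rng.normal(size=(LATENT_SIZE, FRAME_SIZE))

# One decoder per person: expression code -> that person's face.
W_decode_alex = rng.normal(size=(FRAME_SIZE, LATENT_SIZE))
W_decode_hannah = rng.normal(size=(FRAME_SIZE, LATENT_SIZE))

def encode(frame):
    return W_encode @ frame

def decode(latent, person_matrix):
    return person_matrix @ latent

# A frame of Alex snarling (random stand-in pixels).
alex_snarl_frame = rng.normal(size=FRAME_SIZE)

# Take the expression from Alex's frame...
expression = encode(alex_snarl_frame)

# ...and render it with Hannah's decoder: Hannah's face, Alex's snarl.
hannah_with_snarl = decode(expression, W_decode_hannah)
print(hannah_with_snarl.shape)                  # (4096,)
```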
992
00:51:07,240 --> 00:51:09,800
OK, so once you've
completed that jigsaw,
993
00:51:09,800 --> 00:51:12,840
if you can flip it over,
the idea is...
994
00:51:12,840 --> 00:51:14,560
Ah, there we go.
995
00:51:14,560 --> 00:51:16,720
That's a lovely, snarly face,
isn't it? Exactly.
996
00:51:16,720 --> 00:51:18,720
Now we get Hannah with the snarl.
997
00:51:18,720 --> 00:51:22,160
Are there concerns about
this kind of technology?
998
00:51:22,160 --> 00:51:25,320
So, like, my day job is as a data
scientist at Faculty
999
00:51:25,320 --> 00:51:29,000
and I work on using artificial
intelligence techniques to
1000
00:51:29,000 --> 00:51:31,480
sort of detect these
sort of video manipulations.
1001
00:51:31,480 --> 00:51:34,120
Obviously, there's lots of
implications for the fact that you
1002
00:51:34,120 --> 00:51:37,080
can just puppet people. I could
record a video of myself saying
1003
00:51:37,080 --> 00:51:40,120
something and just transform it into
some celebrity, for example,
1004
00:51:40,120 --> 00:51:42,680
and this has
implications for say privacy
1005
00:51:42,680 --> 00:51:45,160
but also for say democracy
and political disinformation.
1006
00:51:45,160 --> 00:51:47,760
So there are lots of sort
of concerns about that, which is
1007
00:51:47,760 --> 00:51:50,600
why it's important to be able to
detect this kind of content online.
1008
00:51:50,600 --> 00:51:52,640
There are also many great
applications though,
1009
00:51:52,640 --> 00:51:55,080
like dubbing, special effects
and things like that.
1010
00:51:55,080 --> 00:51:57,320
Do you have any top
tips for spotting deep fakes?
1011
00:51:57,320 --> 00:52:00,160
Yeah, so I think my top tip for sort
of spotting deep fakes is
1012
00:52:00,160 --> 00:52:02,600
always just if you're watching
a video of something
1013
00:52:02,600 --> 00:52:04,800
and you sort of suspect that it
might be a deep fake,
1014
00:52:04,800 --> 00:52:07,320
just ask the question, do you really
believe that this person
1015
00:52:07,320 --> 00:52:09,800
would do or say these
things that the video is portraying?
1016
00:52:09,800 --> 00:52:11,760
So that's like my number one
suggestion.
1017
00:52:11,760 --> 00:52:14,360
My second suggestion would be
sort of do some fact checking.
1018
00:52:14,360 --> 00:52:17,600
Sort of try to see if you can find
that video on some trusted news
1019
00:52:17,600 --> 00:52:20,840
outlets, particularly if you found
it posted on social media or
1020
00:52:20,840 --> 00:52:23,720
something that's harder to verify.
And from a technical perspective,
1021
00:52:23,720 --> 00:52:26,680
I think it's useful to look for
things like objects in the
1022
00:52:26,680 --> 00:52:29,400
background of a video,
so if as I move you notice objects
1023
00:52:29,400 --> 00:52:31,680
in the background of the video
moving with me,
1024
00:52:31,680 --> 00:52:34,680
that's a pretty strong indication
that the video's been manipulated.
1025
00:52:34,680 --> 00:52:37,520
They, I think, are some excellent
tips that will only serve us
1026
00:52:37,520 --> 00:52:40,760
well in the future. Alex, thank you
very much indeed. Thank you.
1027
00:52:40,760 --> 00:52:44,160
APPLAUSE
1028
00:52:44,160 --> 00:52:48,480
Now, we really wanted to show you
how this deep fake stuff works,
1029
00:52:48,480 --> 00:52:51,320
so what we've done is we've been
working with Alex's team
1030
00:52:51,320 --> 00:52:55,000
and we've created a deep
fake of someone in the audience.
1031
00:52:55,000 --> 00:52:58,040
We always look to break boundaries
here at the Christmas Lectures.
1032
00:52:58,040 --> 00:53:01,600
So, to give this moment the real
sense of occasion that it
1033
00:53:01,600 --> 00:53:05,840
deserves, I would like to introduce
you to a brand-new talk show
1034
00:53:05,840 --> 00:53:08,440
because tonight,
for one night only,
1035
00:53:08,440 --> 00:53:12,960
please welcome your host, it's
Matt Parker and This Is Your Face.
1036
00:53:12,960 --> 00:53:15,680
CHEERS AND APPLAUSE
1037
00:53:18,520 --> 00:53:20,640
Thank you.
1038
00:53:20,640 --> 00:53:24,840
Welcome to This Is Your Face.
Can you please welcome to the stage
1039
00:53:24,840 --> 00:53:27,320
tonight's face? It's Kaya.
1040
00:53:27,320 --> 00:53:30,520
CHEERS AND APPLAUSE
1041
00:53:32,840 --> 00:53:36,480
Come on down, Kaya. If you'd like
to take a seat. Thank you.
1042
00:53:36,480 --> 00:53:40,680
So, Kaya, it's great to meet
the person behind the face and to
1043
00:53:40,680 --> 00:53:44,720
get to know you, my first question
is, do you have a favourite food?
1044
00:53:44,720 --> 00:53:48,800
Yes, I like pizza and pasta.
Pizza and pasta. Yes.
1045
00:53:48,800 --> 00:53:51,480
That's pretty uncontroversial.
Pizza and pasta?! No.
1046
00:53:51,480 --> 00:53:54,600
We're not having that.
Let's have another go.
1047
00:53:54,600 --> 00:53:57,920
I really like vegetables,
1048
00:53:57,920 --> 00:54:01,600
just plain old vegetables.
1049
00:54:01,600 --> 00:54:04,960
Especially Brussels sprouts,
actually. That's number one.
1050
00:54:04,960 --> 00:54:09,160
And often maybe with
a side of extra vegetables.
1051
00:54:09,160 --> 00:54:12,240
Well, Kaya, that is your face.
1052
00:54:12,240 --> 00:54:16,040
My next question, do you have a
favourite type of animal? Yeah.
1053
00:54:16,040 --> 00:54:19,200
I love cats. You love cats? They're
pretty adorable, aren't they? Yeah.
1054
00:54:19,200 --> 00:54:22,920
Have you got like a least favourite
creature? Erm, probably insects.
1055
00:54:22,920 --> 00:54:24,560
You don't like insects? No.
1056
00:54:24,560 --> 00:54:27,000
Interesting.
Hm, let's see, shall we?
1057
00:54:29,280 --> 00:54:34,080
I just love ants. I think
they're amazing.
1058
00:54:34,080 --> 00:54:37,480
I love the idea of them
crawling all over me,
1059
00:54:37,480 --> 00:54:41,520
in my hair, up my nose, in my ears.
1060
00:54:41,520 --> 00:54:46,320
Just like a massive
swarm of ants everywhere.
1061
00:54:46,320 --> 00:54:48,760
Oh, Kaya,
you'd better talk to your face.
1062
00:54:48,760 --> 00:54:51,920
Now, my last question, you've got
a younger sister, haven't you? Yes.
1063
00:54:51,920 --> 00:54:54,920
Are they in tonight? Yes, just right
there. Over there. Excellent.
1064
00:54:54,920 --> 00:54:56,560
And you get pocket money. Yeah.
1065
00:54:56,560 --> 00:54:58,920
OK, my final question,
just hypothetically,
1066
00:54:58,920 --> 00:55:02,560
your pocket money, you wouldn't mind
giving all of that to let's
1067
00:55:02,560 --> 00:55:06,000
say your sister for the next
roughly five years?
1068
00:55:06,000 --> 00:55:07,360
No! No?
1069
00:55:07,360 --> 00:55:08,880
Hm.
1070
00:55:08,880 --> 00:55:11,600
That's your sister? Yeah.
Don't worry, we've got you.
1071
00:55:13,480 --> 00:55:16,520
I've decided to give my sister
all of my pocket
1072
00:55:16,520 --> 00:55:21,280
money for the next five,
ten, 20 years.
1073
00:55:21,280 --> 00:55:24,320
I think she deserves it, you know,
she can have it.
1074
00:55:24,320 --> 00:55:27,760
And she can have all of my desserts
too. I don't need those.
1075
00:55:27,760 --> 00:55:31,720
And she can also have my phone
and my room, which is
1076
00:55:31,720 --> 00:55:35,440
where she can keep all of my clothes
because they are now hers.
1077
00:55:35,440 --> 00:55:37,400
LAUGHTER
1078
00:55:37,400 --> 00:55:41,400
Wow! That is your face, so thank
you very much for coming along.
1079
00:55:41,400 --> 00:55:43,240
No problem. No, not you. Your face!
1080
00:55:43,240 --> 00:55:45,360
Thanks,
it's been wonderful to be here
1081
00:55:45,360 --> 00:55:48,440
and I completely
stand by all of my answers.
1082
00:55:48,440 --> 00:55:52,040
APPLAUSE
Thank you very much, Kaya.
1083
00:55:56,400 --> 00:55:59,920
There is a point to all of this
1084
00:55:59,920 --> 00:56:03,440
because whether you're talking
about deep fakes or driverless cars
1085
00:56:03,440 --> 00:56:05,920
or facial recognition,
under the surface,
1086
00:56:05,920 --> 00:56:09,200
all of these things that have such
potential to change our world,
1087
00:56:09,200 --> 00:56:12,520
they're ultimately
mathematical creations
1088
00:56:12,520 --> 00:56:15,520
and maths isn't just about bridges
and bicycles and buildings.
1089
00:56:15,520 --> 00:56:17,000
Behind the scenes,
1090
00:56:17,000 --> 00:56:20,760
it's mathematical levers that are
powering the changes to our society.
1091
00:56:20,760 --> 00:56:23,600
It's got the potential to change
everything, what we know,
1092
00:56:23,600 --> 00:56:28,040
how we talk to each other, even how
our whole democracy is structured.
1093
00:56:28,040 --> 00:56:29,200
Now, don't get me wrong,
1094
00:56:29,200 --> 00:56:32,040
I don't think this is about being
afraid of the advancement
1095
00:56:32,040 --> 00:56:35,040
of machines, but I do think that
we need to be honest with
1096
00:56:35,040 --> 00:56:38,360
ourselves about the awesome
power of mathematics
1097
00:56:38,360 --> 00:56:40,560
and I think we need to be
careful of the very real
1098
00:56:40,560 --> 00:56:44,360
limitations of something
that will never be perfect,
1099
00:56:44,360 --> 00:56:46,760
but I think
if we can be aware of the pitfalls,
1100
00:56:46,760 --> 00:56:49,000
if we can work our way
around the challenges,
1101
00:56:49,000 --> 00:56:52,040
I am optimistic about
a future where humans
1102
00:56:52,040 --> 00:56:55,200
and machines can work together,
exploiting each other's strengths
1103
00:56:55,200 --> 00:56:57,520
and acknowledging
each other's weaknesses,
1104
00:56:57,520 --> 00:57:01,440
a partnership that has the real
potential to be a force for good.
1105
00:57:01,440 --> 00:57:04,440
And so to finish,
let's combine human
1106
00:57:04,440 --> 00:57:08,640
and machine in a performance
of collaboration.
1107
00:57:08,640 --> 00:57:11,720
Please welcome the
Chineke! Orchestra.
1108
00:57:11,720 --> 00:57:14,000
APPLAUSE
1109
00:57:14,000 --> 00:57:18,720
Your job is to guess
which parts are human
1110
00:57:18,720 --> 00:57:21,360
and which were the
mathematical algorithm.
1111
00:57:21,360 --> 00:57:23,960
Enjoy.
1112
00:57:23,960 --> 00:57:27,240
THEY PLAY