1
00:00:00,080 --> 00:00:05,759
warm welcome to the Royal Palace and the
2
00:00:02,600 --> 00:00:09,240
Bernadotte Library here you will find over
3
00:00:05,759 --> 00:00:12,000
100,000 books a collection that used to
4
00:00:09,240 --> 00:00:15,000
belong to previous kings and queens of
5
00:00:12,000 --> 00:00:17,480
the House of Bernadotte offering a glimpse
6
00:00:15,000 --> 00:00:20,480
into their history and
7
00:00:17,480 --> 00:00:23,720
interests however today we're here to
8
00:00:20,480 --> 00:00:26,960
listen to our esteemed Nobel laureates to
9
00:00:23,720 --> 00:00:30,800
their insights their expertise and their
10
00:00:26,960 --> 00:00:34,120
invaluable contributions to science and
11
00:00:30,800 --> 00:00:37,320
economics once again a very warm welcome
12
00:00:34,120 --> 00:00:39,120
to the Royal Palace in this program
13
00:00:37,320 --> 00:00:41,879
we'll be looking at the potential and
14
00:00:39,120 --> 00:00:44,079
pitfalls of artificial intelligence why
15
00:00:41,879 --> 00:00:46,280
some countries are richer than others
16
00:00:44,079 --> 00:00:48,650
and what a worm tells us about the
17
00:00:46,280 --> 00:00:51,829
origins of life
18
00:00:48,650 --> 00:00:51,829
[Applause]
19
00:00:58,950 --> 00:01:08,199
[Music]
20
00:01:08,210 --> 00:01:11,400
[Applause]
21
00:01:08,800 --> 00:01:13,320
[Music]
22
00:01:11,400 --> 00:01:15,960
your Royal Highness thank you for that
23
00:01:13,320 --> 00:01:19,119
very warm welcome to your palace here in
24
00:01:15,960 --> 00:01:20,960
Stockholm and uh Nobel laureates this is
25
00:01:19,119 --> 00:01:23,360
the first time that some of you have
26
00:01:20,960 --> 00:01:25,119
been brought together in discussion on
27
00:01:23,360 --> 00:01:27,040
um television and we're also joined by
28
00:01:25,119 --> 00:01:30,680
some of your family and friends as well
29
00:01:27,040 --> 00:01:32,560
as students from here in Stockholm um
30
00:01:30,680 --> 00:01:34,880
before we start let's just give them a
31
00:01:32,560 --> 00:01:38,200
really big round of applause renewed
32
00:01:34,880 --> 00:01:38,200
congratulations to all of
33
00:01:48,240 --> 00:01:52,960
you I guess you're all getting very used
34
00:01:50,759 --> 00:01:56,680
to the sound of Applause now aren't you
35
00:01:52,960 --> 00:02:00,439
so tell me um how has winning the Nobel
36
00:01:56,680 --> 00:02:01,520
Prize changed your life uh who shall I
37
00:02:00,439 --> 00:02:04,039
start with
38
00:02:01,520 --> 00:02:05,560
Gary well the level of attention is
39
00:02:04,039 --> 00:02:09,239
something that's a
40
00:02:05,560 --> 00:02:11,360
10x what it ever was for other awards
41
00:02:09,239 --> 00:02:15,280
it's you know the Nobel is a it's a
42
00:02:11,360 --> 00:02:17,160
brand and it's 120 something years of
43
00:02:15,280 --> 00:02:20,840
History to
44
00:02:17,160 --> 00:02:23,000
completely uh mesmerizing Daron one of
45
00:02:20,840 --> 00:02:25,560
the uh economists what about you how's
46
00:02:23,000 --> 00:02:27,760
it changed your life I mean I'm here
47
00:02:25,560 --> 00:02:29,920
well being in Stockholm for one week in
48
00:02:27,760 --> 00:02:32,560
December that's a life-changing event
49
00:02:29,920 --> 00:02:35,519
but I am amazingly grateful and happy
50
00:02:32,560 --> 00:02:37,080
honored and I'll take it as it comes
51
00:02:35,519 --> 00:02:38,920
your diary is going to be super full
52
00:02:37,080 --> 00:02:41,879
from now on you're going to be running
53
00:02:38,920 --> 00:02:44,760
around from lecture to lecture and guest
54
00:02:41,879 --> 00:02:47,720
appearances so Professor Geoffrey Hinton
55
00:02:44,760 --> 00:02:49,920
what about you um yeah it makes an
56
00:02:47,720 --> 00:02:52,800
amazing change I get huge amounts of
57
00:02:49,920 --> 00:02:54,400
email asking me to do things um I
58
00:02:52,800 --> 00:02:57,080
luckily have an assistant who deals with
59
00:02:54,400 --> 00:03:00,319
most of it um I get stopped for selfies
60
00:02:57,080 --> 00:03:02,440
in the street which is um it's very
61
00:03:00,319 --> 00:03:04,200
annoying but if it went away I'd be
62
00:03:02,440 --> 00:03:05,480
disappointed but also you've been
63
00:03:04,200 --> 00:03:07,200
teaching for many years at the
64
00:03:05,480 --> 00:03:09,200
University of Toronto and you said after
65
00:03:07,200 --> 00:03:11,400
you won the Nobel Prize They At Last
66
00:03:09,200 --> 00:03:13,560
gave you an office yes they didn't think
67
00:03:11,400 --> 00:03:17,879
I was worth an office before
68
00:03:13,560 --> 00:03:19,799
that James um I've noticed that people
69
00:03:17,879 --> 00:03:21,480
take what I say much more seriously I've
70
00:03:19,799 --> 00:03:23,319
always proceeded on the assumption that
71
00:03:21,480 --> 00:03:25,799
no one was ever actually listening to
72
00:03:23,319 --> 00:03:28,040
anything I said so now I have to really
73
00:03:25,799 --> 00:03:29,599
choose my words carefully yeah and does
74
00:03:28,040 --> 00:03:31,000
that extend to your family members as
75
00:03:29,599 --> 00:03:34,200
well or do they listen to what you say
76
00:03:31,000 --> 00:03:37,599
now um I'd have to think about that
77
00:03:34,200 --> 00:03:39,280
one David Baker uh well actually a
78
00:03:37,599 --> 00:03:41,560
highlight has really been this week and
79
00:03:39,280 --> 00:03:43,959
having all my my family and and
80
00:03:41,560 --> 00:03:47,080
colleagues here it's been a great
81
00:03:43,959 --> 00:03:50,560
celebration and um yeah I've had to give
82
00:03:47,080 --> 00:03:52,439
up email which has been positive um and
83
00:03:50,560 --> 00:03:55,319
I've learned to completely avoid selfies
84
00:03:52,439 --> 00:03:58,040
so um but uh on the whole it's been very
85
00:03:55,319 --> 00:03:59,959
exciting and you don't travel light do
86
00:03:58,040 --> 00:04:01,680
you if I could put it that way just
87
00:03:59,959 --> 00:04:04,519
remind me how many people have you come
88
00:04:01,680 --> 00:04:06,159
with to Stockholm uh 185 I think that
89
00:04:04,519 --> 00:04:08,280
must be a record I'm going to have to
90
00:04:06,159 --> 00:04:09,680
check but I'm pretty sure that must be a
91
00:04:08,280 --> 00:04:13,239
record well it's quite a party you're
92
00:04:09,680 --> 00:04:14,720
going to have and uh Sir Demis Hassabis
93
00:04:13,239 --> 00:04:17,040
um well of course it's been an honor of
94
00:04:14,720 --> 00:04:19,479
a lifetime and um to tell you the truth
95
00:04:17,040 --> 00:04:21,440
it hasn't really sunk in yet so uh maybe
96
00:04:19,479 --> 00:04:23,560
I'll do that over the Christmas holidays
97
00:04:21,440 --> 00:04:25,880
um but it's also you know an amazing
98
00:04:23,560 --> 00:04:27,520
platform to talk about your subject more
99
00:04:25,880 --> 00:04:29,800
widely and have to think about that
100
00:04:27,520 --> 00:04:31,639
responsibility in the coming years mhm
101
00:04:29,800 --> 00:04:33,280
okay let's turn now to the awards that
102
00:04:31,639 --> 00:04:35,400
were made this year and let's start with
103
00:04:33,280 --> 00:04:39,759
the physics prize and here's a brief
104
00:04:35,400 --> 00:04:42,800
summary of the research behind that
105
00:04:39,759 --> 00:04:44,960
prize this year's physics prize rewards
106
00:04:42,800 --> 00:04:47,919
research that laid the foundations for
107
00:04:44,960 --> 00:04:51,479
the development of AI enabling machine
108
00:04:47,919 --> 00:04:53,880
learning with artificial neural networks
109
00:04:51,479 --> 00:04:57,360
John Hopfield created a structure that
110
00:04:53,880 --> 00:05:00,240
can store and reconstruct information
111
00:04:57,360 --> 00:05:02,479
Geoffrey Hinton built on his ideas and
112
00:05:00,240 --> 00:05:05,360
made it possible to create completely
113
00:05:02,479 --> 00:05:09,479
new content with the help of AI
114
00:05:05,360 --> 00:05:12,039
so-called generative AI this opens up
115
00:05:09,479 --> 00:05:14,320
numerous potential areas of use for
116
00:05:12,039 --> 00:05:16,759
instance by providing techniques for
117
00:05:14,320 --> 00:05:20,120
calculating and predicting the
118
00:05:16,759 --> 00:05:22,520
properties of molecules and materials
119
00:05:20,120 --> 00:05:24,840
their research has also prompted
120
00:05:22,520 --> 00:05:27,680
extensive discussion of the ethics
121
00:05:24,840 --> 00:05:30,919
around how the technology is developed
122
00:05:27,680 --> 00:05:30,919
and used
123
00:05:33,000 --> 00:05:40,360
so Geoffrey Hinton you actually wanted to
124
00:05:36,440 --> 00:05:43,240
find out how the human brain works so
125
00:05:40,360 --> 00:05:45,319
how does it work we still don't know
126
00:05:43,240 --> 00:05:47,840
we've made lots of efforts to figure out
127
00:05:45,319 --> 00:05:49,560
how the brain figures out how to change
128
00:05:47,840 --> 00:05:52,240
the strength of connections between two
129
00:05:49,560 --> 00:05:54,319
neurons we've learned a lot from these
130
00:05:52,240 --> 00:05:56,360
big systems that we've built which is if
131
00:05:54,319 --> 00:05:57,840
you could find any way to know whether
132
00:05:56,360 --> 00:05:59,919
you should increase or decrease the
133
00:05:57,840 --> 00:06:01,800
strength and then you just did did that
134
00:05:59,919 --> 00:06:03,840
for all of the connections all 100
135
00:06:01,800 --> 00:06:05,440
trillion connections and you just kept
136
00:06:03,840 --> 00:06:07,160
doing that with lots of examples
137
00:06:05,440 --> 00:06:09,280
slightly increasing or decreasing the
138
00:06:07,160 --> 00:06:12,360
strength then you would get fantastic
139
00:06:09,280 --> 00:06:14,199
systems like GPT-4 these big chatbots
140
00:06:12,360 --> 00:06:16,440
learn thousands of times more than any
141
00:06:14,199 --> 00:06:18,319
one person so they can compress all of
142
00:06:16,440 --> 00:06:20,599
human knowledge into only a trillion
143
00:06:18,319 --> 00:06:23,199
connections and we have a 100 trillion
144
00:06:20,599 --> 00:06:26,280
connections and none of us know much but
145
00:06:23,199 --> 00:06:28,360
that's but that's interesting he says
146
00:06:26,280 --> 00:06:32,280
speak for yourself but anyway he does
147
00:06:28,360 --> 00:06:35,199
know a lot actually but compared with
148
00:06:32,280 --> 00:06:37,960
GPT so you you make it sound
149
00:06:35,199 --> 00:06:39,919
though as if this is the best computer
150
00:06:37,960 --> 00:06:42,160
and it's never been bettered we don't
151
00:06:39,919 --> 00:06:44,680
quite know how it works and yet you also
152
00:06:42,160 --> 00:06:47,039
say that artificial intelligence
153
00:06:44,680 --> 00:06:50,039
artificial neural networks could
154
00:06:47,039 --> 00:06:52,639
outsmart humans oh I think we've been
155
00:06:50,039 --> 00:06:54,319
bettered already if you look at GPT-4 it
156
00:06:52,639 --> 00:06:55,800
knows much more than any one person it's
157
00:06:54,319 --> 00:06:57,960
like a not very good expert at
158
00:06:55,800 --> 00:06:59,800
everything um so it's got much more
159
00:06:57,960 --> 00:07:02,000
knowledge in far fewer connections and
160
00:06:59,800 --> 00:07:03,680
we've been bettered in that sense do you
161
00:07:02,000 --> 00:07:04,800
agree with that Demis well look I think
162
00:07:03,680 --> 00:07:07,160
so I mean just going back to your
163
00:07:04,800 --> 00:07:08,960
initial question um originally in with
164
00:07:07,160 --> 00:07:10,919
with the with the field of AI there's a
165
00:07:08,960 --> 00:07:12,440
lot of inspiration taken from
166
00:07:10,919 --> 00:07:14,360
architectures of the brain including
167
00:07:12,440 --> 00:07:16,479
neural networks and an algorithm called
168
00:07:14,360 --> 00:07:18,560
reinforcement learning um then we've
169
00:07:16,479 --> 00:07:20,240
gone into a kind of engineering phase
170
00:07:18,560 --> 00:07:22,160
now where we're scaling these systems up
171
00:07:20,240 --> 00:07:24,400
to massive size all of these large
172
00:07:22,160 --> 00:07:26,400
Foundation models or language models uh
173
00:07:24,400 --> 00:07:28,000
and there's many leading models now and
174
00:07:26,400 --> 00:07:29,919
I think we'll we'll end up in the next
175
00:07:28,000 --> 00:07:32,639
phase where we'll start using these AI
176
00:07:29,919 --> 00:07:34,560
models to analyze our own brains and to
177
00:07:32,639 --> 00:07:36,960
help with Neuroscience as one of the
178
00:07:34,560 --> 00:07:38,280
sciences that AI helps with so actually
179
00:07:36,960 --> 00:07:41,319
I think it's going to come sort of full
180
00:07:38,280 --> 00:07:44,159
circle neuroscience sort of inspired
181
00:07:41,319 --> 00:07:46,199
modern Ai and then AI will come back and
182
00:07:44,159 --> 00:07:48,240
um help us I think understand what's
183
00:07:46,199 --> 00:07:50,560
special about the brain will machine
184
00:07:48,240 --> 00:07:52,039
intelligence outsmart humans I mean what
185
00:07:50,560 --> 00:07:53,440
what kind of time frame are you talking
186
00:07:52,039 --> 00:07:56,199
about are you saying it's already
187
00:07:53,440 --> 00:07:57,479
happened so in terms of the amount of
188
00:07:56,199 --> 00:07:59,639
knowledge you can have in one system
189
00:07:57,479 --> 00:08:01,720
it's clearly already happened right GPT-4
190
00:07:59,639 --> 00:08:03,479
knows much more than any human yeah and
191
00:08:01,720 --> 00:08:06,080
it it does make stuff up but it still
192
00:08:03,479 --> 00:08:08,759
knows a lot yeah um in terms of the
193
00:08:06,080 --> 00:08:11,560
timing I think all the leading experts I
194
00:08:08,759 --> 00:08:12,960
know people like Demis um they believe
195
00:08:11,560 --> 00:08:14,440
it's going to happen they believe these
196
00:08:12,960 --> 00:08:17,080
machines are going to get smarter than
197
00:08:14,440 --> 00:08:18,680
people at general intelligence and they
198
00:08:17,080 --> 00:08:20,120
just differ in how long they think
199
00:08:18,680 --> 00:08:22,039
that's going to take well we're going to
200
00:08:20,120 --> 00:08:24,280
start being bossed around by machines
201
00:08:22,039 --> 00:08:27,000
and robots is that what you're suggesting
202
00:08:24,280 --> 00:08:28,879
well that's the question um can you have
203
00:08:27,000 --> 00:08:30,720
things more intelligent than you and you
204
00:08:28,879 --> 00:08:32,000
still stay in control once they're more
205
00:08:30,720 --> 00:08:33,800
intelligent than us will they be the
206
00:08:32,000 --> 00:08:35,719
bosses or will we still be the bosses
207
00:08:33,800 --> 00:08:38,159
and what do you think I think we need to
208
00:08:35,719 --> 00:08:40,000
do a lot of research right now on how we
209
00:08:38,159 --> 00:08:42,399
Remain the bosses you didn't actually
210
00:08:40,000 --> 00:08:44,880
answer that question Demis do you think
211
00:08:42,399 --> 00:08:47,279
that machine intelligence could outsmart
212
00:08:44,880 --> 00:08:49,480
outwit us to the extent that actually they
213
00:08:47,279 --> 00:08:51,519
start ruling the roost no well look I
214
00:08:49,480 --> 00:08:53,519
think for now so I disagree with with
215
00:08:51,519 --> 00:08:55,880
with with Jeff on the fact that today's
216
00:08:53,519 --> 00:08:57,519
systems are still not that good they're
217
00:08:55,880 --> 00:08:59,640
impressive they can talk to us and other
218
00:08:57,519 --> 00:09:01,200
things they acquire a lot of knowledge
219
00:08:59,640 --> 00:09:02,560
um but they're pretty weak at a lot of
220
00:09:01,200 --> 00:09:06,399
things they're not very good at planning
221
00:09:02,560 --> 00:09:07,880
yet or reasoning um or uh imagining and
222
00:09:06,399 --> 00:09:09,600
you know creativity those kinds of
223
00:09:07,880 --> 00:09:11,880
things but they they are going to get
224
00:09:09,600 --> 00:09:14,880
better rapidly so it depends now on how
225
00:09:11,880 --> 00:09:17,079
we design those systems and how we um
226
00:09:14,880 --> 00:09:18,440
decide to sort of as a society deploy
227
00:09:17,079 --> 00:09:20,200
those systems and build those systems
228
00:09:18,440 --> 00:09:22,320
all right so we'll look at what we do
229
00:09:20,200 --> 00:09:25,040
about it but gentlemen this is a very
230
00:09:22,320 --> 00:09:27,040
big fundamental question Gary and then
231
00:09:25,040 --> 00:09:30,560
you yeah I think you're overrating
232
00:09:27,040 --> 00:09:34,279
humans in this so we make up a lot of
233
00:09:30,560 --> 00:09:36,760
untruths as well and uh there's so many
234
00:09:34,279 --> 00:09:39,040
examples of false ideas that get
235
00:09:36,760 --> 00:09:41,800
propagated and and it's getting worse of
236
00:09:39,040 --> 00:09:46,120
course with social networks so the
237
00:09:41,800 --> 00:09:49,399
standards for AI to do well it is it's
238
00:09:46,120 --> 00:09:54,839
pretty low you know humanity is way
239
00:09:49,399 --> 00:09:56,680
overrated right okay David humans I I'll
240
00:09:54,839 --> 00:09:58,279
I'll take a contrarian view here um you
241
00:09:56,680 --> 00:10:00,040
know humans since the really the
242
00:09:58,279 --> 00:10:02,079
beginning of civilization have have
243
00:10:00,040 --> 00:10:04,040
created things that are um better
244
00:10:02,079 --> 00:10:06,760
than them in almost every domain you
245
00:10:04,040 --> 00:10:09,800
know cars can go infinitely faster
246
00:10:06,760 --> 00:10:11,519
planes can fly humans can't um you know
247
00:10:09,800 --> 00:10:14,040
for a long time we've had computers that
248
00:10:11,519 --> 00:10:16,200
can do calculations that humans can't do
249
00:10:14,040 --> 00:10:18,519
um Demis has developed you know programs
250
00:10:16,200 --> 00:10:20,079
that solve Go and chess so we're very
251
00:10:18,519 --> 00:10:22,440
comfortable I think with machines being
252
00:10:20,079 --> 00:10:25,920
able to do things that we can't do
253
00:10:22,440 --> 00:10:27,440
ChatGPT you know GPT-4 um has much more
254
00:10:25,920 --> 00:10:28,720
knowledge than any human being I think
255
00:10:27,440 --> 00:10:30,519
we just take this kind of thing in
256
00:10:28,720 --> 00:10:32,920
stride I don't think we worry about
257
00:10:30,519 --> 00:10:35,120
losing control so I guess that's the key
258
00:10:32,920 --> 00:10:37,120
issue that we know that computers can do
259
00:10:35,120 --> 00:10:39,440
a lot that we can't but it's a question of
260
00:10:37,120 --> 00:10:41,480
control I mean planes fly but it's the
261
00:10:39,440 --> 00:10:44,200
human pilot who's in the cockpit
262
00:10:41,480 --> 00:10:47,639
assisted by technology obviously and we
263
00:10:44,200 --> 00:10:50,200
still drive cars yeah um what about you
264
00:10:47,639 --> 00:10:51,399
two the economists where do you stand on
265
00:10:50,200 --> 00:10:53,040
this question I'll take the opposite
266
00:10:51,399 --> 00:10:55,560
position to Gary I think humans are
267
00:10:53,040 --> 00:11:00,040
incredibly underrated right
268
00:10:55,560 --> 00:11:03,600
now human adaptability fluidity
269
00:11:00,040 --> 00:11:06,639
creativity but also Community I think
270
00:11:03,600 --> 00:11:09,000
humans are just amazing social animals
271
00:11:06,639 --> 00:11:11,760
we learn as collectives and as
272
00:11:09,000 --> 00:11:14,800
collectives we are able to do a huge
273
00:11:11,760 --> 00:11:17,600
number of things in very quick
274
00:11:14,800 --> 00:11:20,160
succession so I would worry about those
275
00:11:17,600 --> 00:11:24,000
people controlling AI before AI itself
276
00:11:20,160 --> 00:11:26,560
turning on us humankind's greatest enemy
277
00:11:24,000 --> 00:11:29,920
is humankind the sort of do evils that
278
00:11:26,560 --> 00:11:33,800
we see in popular science fiction who think
279
00:11:29,920 --> 00:11:36,519
they're doing good I wouldn't put those
280
00:11:33,800 --> 00:11:38,120
past doing huge damage yeah I would
281
00:11:36,519 --> 00:11:39,720
agree I mean as the tools get more
282
00:11:38,120 --> 00:11:41,519
powerful I think the worry is not the
283
00:11:39,720 --> 00:11:44,680
machines themselves but people using the
284
00:11:41,519 --> 00:11:46,720
tools misinformation autonomous military
285
00:11:44,680 --> 00:11:49,000
weapons all kinds of things humans have
286
00:11:46,720 --> 00:11:51,000
a great track record of inventing things
287
00:11:49,000 --> 00:11:52,800
you know that jeopardize the human race
288
00:11:51,000 --> 00:11:55,800
such as nuclear weapons I mean just
289
00:11:52,800 --> 00:11:58,120
think about how close we've been to
290
00:11:55,800 --> 00:11:59,800
obliterating the planet with the Cuba
291
00:11:58,120 --> 00:12:02,000
Cuban Missile Crisis
292
00:11:59,800 --> 00:12:03,920
and you know so so we've done it already
293
00:12:02,000 --> 00:12:06,160
we can do it again in a different form
294
00:12:03,920 --> 00:12:07,839
or you know with a diff so so so I guess
295
00:12:06,160 --> 00:12:09,639
I would I would like to ask Demis you
296
00:12:07,839 --> 00:12:11,279
know I I I take the point of view
297
00:12:09,639 --> 00:12:13,040
everyone's saying yes we need to
298
00:12:11,279 --> 00:12:14,920
regulate we need to but who has the
299
00:12:13,040 --> 00:12:17,440
incentive to do that I don't you know
300
00:12:14,920 --> 00:12:19,079
like it's one thing to say that but but
301
00:12:17,440 --> 00:12:20,920
I I suspect the politicians and the
302
00:12:19,079 --> 00:12:22,519
governments they're just playing catchup
303
00:12:20,920 --> 00:12:24,480
that you know the thing is moving faster
304
00:12:22,519 --> 00:12:26,040
than they can get their hands on and who
305
00:12:24,480 --> 00:12:27,920
in the private sector they just want to
306
00:12:26,040 --> 00:12:30,199
make money and get this stuff out there
307
00:12:27,920 --> 00:12:31,959
and so where where are the incentives to
308
00:12:30,199 --> 00:12:34,399
actually do something about that yeah
309
00:12:31,959 --> 00:12:36,160
well look I mean there's obviously the
310
00:12:34,399 --> 00:12:37,800
reason that many of us are working on AI
311
00:12:36,160 --> 00:12:39,680
is because we want to bring to bear all
312
00:12:37,800 --> 00:12:42,279
of the incredible benefits that can
313
00:12:39,680 --> 00:12:44,199
happen with AI from in medicine but also
314
00:12:42,279 --> 00:12:45,519
productivity and so on but I agree with
315
00:12:44,199 --> 00:12:48,440
you there is a going to be a kind of
316
00:12:45,519 --> 00:12:49,720
coordination problem where um I think
317
00:12:48,440 --> 00:12:51,199
there has to be some form of
318
00:12:49,720 --> 00:12:53,360
international cooperation on these
319
00:12:51,199 --> 00:12:55,399
issues I think we've got a few years to
320
00:12:53,360 --> 00:12:58,600
get our act together on that and I think
321
00:12:55,399 --> 00:13:00,839
leading researchers and leading um labs
322
00:12:58,600 --> 00:13:02,519
in industry and Academia need to come
323
00:13:00,839 --> 00:13:04,560
together to kind of demand that sort of
324
00:13:02,519 --> 00:13:06,920
cooperation as we get closer to
325
00:13:04,560 --> 00:13:08,920
artificial general intelligence um and
326
00:13:06,920 --> 00:13:11,199
have more information about what that
327
00:13:08,920 --> 00:13:14,199
might look like but I'm I'm I'm a big
328
00:13:11,199 --> 00:13:15,880
believer in human Ingenuity and um as
329
00:13:14,199 --> 00:13:18,040
David says you know we're unbelievably
330
00:13:15,880 --> 00:13:19,560
adaptive as a species and you know look
331
00:13:18,040 --> 00:13:21,279
at our modern technology we already use
332
00:13:19,560 --> 00:13:22,760
today that we sort of seamlessly the
333
00:13:21,279 --> 00:13:25,040
younger generation just seamlessly
334
00:13:22,760 --> 00:13:26,440
adapts to and takes as a given and I
335
00:13:25,040 --> 00:13:29,680
think that's also happened with these
336
00:13:26,440 --> 00:13:31,279
chat Bots which you know 25 years ago we
337
00:13:29,680 --> 00:13:33,040
would have been amazed those of us in
338
00:13:31,279 --> 00:13:34,760
this in the area of AI if you were to
339
00:13:33,040 --> 00:13:37,720
transport the Technologies back we have
340
00:13:34,760 --> 00:13:39,519
today back then um and yet we've all um
341
00:13:37,720 --> 00:13:41,800
Society seems to have sort of seamlessly
342
00:13:39,519 --> 00:13:43,560
adapted to that as well Geoffrey Hinton
343
00:13:41,800 --> 00:13:45,399
do you see that happening you've raised
344
00:13:43,560 --> 00:13:48,560
the alarm Bells about humans becoming
345
00:13:45,399 --> 00:13:50,040
subservient in a way to to machines um
346
00:13:48,560 --> 00:13:52,160
do you think that there's enough of a
347
00:13:50,040 --> 00:13:56,199
debate at an international level do we
348
00:13:52,160 --> 00:13:58,040
need more Ethics in science to debate
349
00:13:56,199 --> 00:14:00,759
these kind of issues do you see that
350
00:13:58,040 --> 00:14:03,160
happening so to distinguish two kinds of
351
00:14:00,759 --> 00:14:04,720
risks from AI one is relatively
352
00:14:03,160 --> 00:14:08,199
short-term and that's to do with bad
353
00:14:04,720 --> 00:14:09,759
actors and that's much more urgent um
354
00:14:08,199 --> 00:14:11,560
that were that's going to be obvious
355
00:14:09,759 --> 00:14:12,800
with lethal autonomous weapons which all
356
00:14:11,560 --> 00:14:15,240
the big defense departments are
357
00:14:12,800 --> 00:14:17,440
developing and they have no intention of
358
00:14:15,240 --> 00:14:20,240
not doing it the European regulations on
359
00:14:17,440 --> 00:14:22,680
AI say none of these regulations apply
360
00:14:20,240 --> 00:14:25,000
to military uses of AI so they clearly
361
00:14:22,680 --> 00:14:26,560
intend to go ahead with all that and
362
00:14:25,000 --> 00:14:30,120
there's many other short-term risks like
363
00:14:26,560 --> 00:14:32,639
cyber crime um generating
364
00:14:30,120 --> 00:14:34,680
fake videos surveillance all of those
365
00:14:32,639 --> 00:14:36,720
short-term risks are very serious and we
366
00:14:34,680 --> 00:14:37,839
need to take them seriously and it's
367
00:14:36,720 --> 00:14:39,720
going to be very hard to get
368
00:14:37,839 --> 00:14:41,639
collaboration on those then the
369
00:14:39,720 --> 00:14:43,320
long-term risk that these things will
370
00:14:41,639 --> 00:14:45,440
get more intelligent than us and they'll
371
00:14:43,320 --> 00:14:47,120
be agents they'll act in the world and
372
00:14:45,440 --> 00:14:48,920
they'll decide that they can achieve
373
00:14:47,120 --> 00:14:50,160
their goals better which we gave them
374
00:14:48,920 --> 00:14:51,480
the goals and they can achieve them
375
00:14:50,160 --> 00:14:55,079
better if they just brush us aside and
376
00:14:51,480 --> 00:14:56,959
get on with it um that particular risk
377
00:14:55,079 --> 00:14:59,040
the existential threat is a place where
378
00:14:56,959 --> 00:15:00,839
people will cooperate and that's because
379
00:14:59,040 --> 00:15:03,800
because we're all in the same boat
380
00:15:00,839 --> 00:15:06,079
nobody wants these AIS to take over from
381
00:15:03,800 --> 00:15:08,279
people and so the Chinese Communist
382
00:15:06,079 --> 00:15:09,639
party doesn't want AI to be in control
383
00:15:08,279 --> 00:15:11,440
it wants the Chinese Communist party to
384
00:15:09,639 --> 00:15:14,680
be in control you know for somebody
385
00:15:11,440 --> 00:15:16,680
who's described as the Godfather of AI
386
00:15:14,680 --> 00:15:19,360
you sound quite a bit down on it in so
387
00:15:16,680 --> 00:15:21,519
many ways well it's potentially very
388
00:15:19,360 --> 00:15:23,639
dangerous it's potentially very good and
389
00:15:21,519 --> 00:15:25,399
potentially very dangerous and I you
390
00:15:23,639 --> 00:15:28,120
know I think we should be making a huge
391
00:15:25,399 --> 00:15:30,399
effort now into making sure we can get
392
00:15:28,120 --> 00:15:32,759
the good aspects of of it without the
393
00:15:30,399 --> 00:15:35,680
bad possibilities and it's not going to
394
00:15:32,759 --> 00:15:37,319
happen automatically like he says well
395
00:15:35,680 --> 00:15:38,839
we've got some students in the audience
396
00:15:37,319 --> 00:15:41,199
here and I know that some of them want
397
00:15:38,839 --> 00:15:44,319
to pose a question to you Lau prashan
398
00:15:41,199 --> 00:15:47,319
Yadava from the KTH AI Society your
399
00:15:44,319 --> 00:15:50,199
question please I'd like to know in what
400
00:15:47,319 --> 00:15:52,440
ways AI can be put to use uh in bringing
401
00:15:50,199 --> 00:15:54,519
truly democratic values and bringing
402
00:15:52,440 --> 00:15:58,639
economic equalities to the
403
00:15:54,519 --> 00:16:01,399
world so in what way can AI promote
404
00:15:58,639 --> 00:16:04,680
democracy and equality in the world who's
405
00:16:01,399 --> 00:16:04,680
going to answer that
406
00:16:04,800 --> 00:16:10,240
one I can start off I mean I I think um
407
00:16:08,880 --> 00:16:12,399
as we've discussed actually for most of
408
00:16:10,240 --> 00:16:16,240
the conversation you know I think powerful
409
00:16:12,399 --> 00:16:18,319
Technologies in of themselves are um
410
00:16:16,240 --> 00:16:20,959
kind of like neutral they could go good
411
00:16:18,319 --> 00:16:22,959
or bad depending on what we as society
412
00:16:20,959 --> 00:16:24,720
decide to do with them and I think AI is
413
00:16:22,959 --> 00:16:26,160
just the latest example of that in that
414
00:16:24,720 --> 00:16:28,240
case maybe it's going to be the most
415
00:16:26,160 --> 00:16:30,199
powerful thing and most important that
416
00:16:28,240 --> 00:16:31,680
we get right but it's also on the
417
00:16:30,199 --> 00:16:33,040
optimistic end I think it's one of the
418
00:16:31,680 --> 00:16:34,800
challenges it's the only challenge I can
419
00:16:33,040 --> 00:16:37,480
think of that could be useful to address
420
00:16:34,800 --> 00:16:40,560
the other challenges if we get it right
421
00:16:37,480 --> 00:16:41,759
so um so that's the key um I don't know
422
00:16:40,560 --> 00:16:43,040
you know democracy and other things
423
00:16:41,759 --> 00:16:44,680
that's a bit out of scope maybe it's for
424
00:16:43,040 --> 00:16:46,759
the economist to talk about well I'll
425
00:16:44,680 --> 00:16:48,839
I'll just say I think AI is an
426
00:16:46,759 --> 00:16:53,120
informational tool and it would be most
427
00:16:48,839 --> 00:16:56,319
useful and most uh enriching for us in
428
00:16:53,120 --> 00:16:58,560
every respect if it's useful reliable
429
00:16:56,319 --> 00:17:00,399
and enabling information for everybody
430
00:16:58,560 --> 00:17:02,480
not just for somebody sitting at the top to
431
00:17:00,399 --> 00:17:04,199
manipulate others but enabling for
432
00:17:02,480 --> 00:17:06,079
Citizens for example enabling for
433
00:17:04,199 --> 00:17:08,679
workers of different skills to do their
434
00:17:06,079 --> 00:17:11,240
tasks all of those are aspects of
435
00:17:08,679 --> 00:17:13,520
democratization but we still have a long
436
00:17:11,240 --> 00:17:15,679
way to go for that sort of tool to be
437
00:17:13,520 --> 00:17:17,720
available in a widespread way and not be
438
00:17:15,679 --> 00:17:20,400
manipulable let's go for another
439
00:17:17,720 --> 00:17:23,199
question now from our audience Al karini
440
00:17:20,400 --> 00:17:25,880
papasu from the Stockholm School of
441
00:17:23,199 --> 00:17:28,120
Economics your question please uh hello
442
00:17:25,880 --> 00:17:31,440
thank you uh my question regards how do
443
00:17:28,120 --> 00:17:34,200
you think philosophy and science coexist
444
00:17:31,440 --> 00:17:36,640
um we have a very deep need for more
445
00:17:34,200 --> 00:17:38,240
philosophy and great endon perhaps
446
00:17:36,640 --> 00:17:40,480
There's an opportunity for some new
447
00:17:38,240 --> 00:17:42,679
great philosophers to appear um to help
448
00:17:40,480 --> 00:17:45,120
us through the next phase of
449
00:17:42,679 --> 00:17:46,960
technological development um in my view
450
00:17:45,120 --> 00:17:49,880
that that is going to require depending
451
00:17:46,960 --> 00:17:51,720
on your definition of philosophy um some
452
00:17:49,880 --> 00:17:53,720
uh uh deep thinking and and wider
453
00:17:51,720 --> 00:17:55,919
Thinking Beyond the technology itself
454
00:17:53,720 --> 00:17:59,600
yeah absolutely I think actually one of
455
00:17:55,919 --> 00:18:03,039
the things with the advances in AI
456
00:17:59,600 --> 00:18:04,720
we will need to understand much better
457
00:18:03,039 --> 00:18:07,080
what makes us conscious what makes us
458
00:18:04,720 --> 00:18:11,159
human there might be some stumbling
459
00:18:07,080 --> 00:18:13,600
blocks that will make us delve deeper
460
00:18:11,159 --> 00:18:15,919
into some of these questions but even if
461
00:18:13,600 --> 00:18:19,440
advances in AI are very fast we will
462
00:18:15,919 --> 00:18:21,559
need to question our own existence and
463
00:18:19,440 --> 00:18:24,080
what makes that more meaningful
464
00:18:21,559 --> 00:18:25,679
certainly we need ethics but the kind of
465
00:18:24,080 --> 00:18:27,760
philosophy we don't need I think is
466
00:18:25,679 --> 00:18:30,120
philosophers talking about Consciousness
467
00:18:27,760 --> 00:18:31,720
and sentience and subjective experience
468
00:18:30,120 --> 00:18:34,200
I think understanding those is a
469
00:18:31,720 --> 00:18:36,679
scientific problem and we'll be better
470
00:18:34,200 --> 00:18:36,679
off without
471
00:18:36,799 --> 00:18:41,840
philosophers anybody else on this no all
472
00:18:39,679 --> 00:18:44,400
right thank you very much but let's turn
473
00:18:41,840 --> 00:18:47,240
now to um some of the work that has
474
00:18:44,400 --> 00:18:49,440
contributed to the award for the
475
00:18:47,240 --> 00:18:52,240
chemistry prize this year for Demis
476
00:18:49,440 --> 00:18:54,200
Hassabis David Baker along with John Jumper
477
00:18:52,240 --> 00:18:57,960
and let's just get a brief idea of the
478
00:18:54,200 --> 00:19:00,039
research that led to the chemistry Nobel
479
00:18:57,960 --> 00:19:02,559
Prize award
480
00:19:00,039 --> 00:19:04,919
the ability to figure out quickly what
481
00:19:02,559 --> 00:19:07,600
proteins look like and to create
482
00:19:04,919 --> 00:19:09,919
proteins of your own has fundamentally
483
00:19:07,600 --> 00:19:13,640
changed the development of chemistry
484
00:19:09,919 --> 00:19:17,320
biology and medical science by creating
485
00:19:13,640 --> 00:19:20,280
the AI program AlphaFold 2 this year's
486
00:19:17,320 --> 00:19:22,880
chemistry laureates Demis Hassabis and
487
00:19:20,280 --> 00:19:25,600
John Jumper have made it possible to
488
00:19:22,880 --> 00:19:27,919
calculate the shape of proteins and
489
00:19:25,600 --> 00:19:29,880
thereby understand how the building
490
00:19:27,919 --> 00:19:32,320
blocks of life
491
00:19:29,880 --> 00:19:34,640
work the second half of this year's
492
00:19:32,320 --> 00:19:37,200
award goes to David Baker for what's
493
00:19:34,640 --> 00:19:40,480
been described as the almost impossible
494
00:19:37,200 --> 00:19:43,640
feat of building entirely new kinds of
495
00:19:40,480 --> 00:19:47,520
proteins useful not least for producing
496
00:19:43,640 --> 00:19:50,760
what could block the SARS-CoV-2 virus
497
00:19:47,520 --> 00:19:52,799
making new proteins can simply open up
498
00:19:50,760 --> 00:19:55,120
whole new
499
00:19:52,799 --> 00:19:57,720
worlds so let's start with you David
500
00:19:55,120 --> 00:19:59,159
Baker you've been applauded for creating
501
00:19:57,720 --> 00:20:00,679
these new
502
00:19:59,159 --> 00:20:02,559
proteins and actually you didn't even
503
00:20:00,679 --> 00:20:04,360
want to become a scientist in the first
504
00:20:02,559 --> 00:20:06,600
place so it's quite amazing that you've
505
00:20:04,360 --> 00:20:09,080
now got this Nobel Prize but just tell
506
00:20:06,600 --> 00:20:12,240
us what kind of um applications
507
00:20:09,080 --> 00:20:14,280
implications do you think your work has
508
00:20:12,240 --> 00:20:16,320
uh led to or could lead to yeah I think
509
00:20:14,280 --> 00:20:18,480
following up on our previous discussion
510
00:20:16,320 --> 00:20:21,440
um I think I can really talk about the
511
00:20:18,480 --> 00:20:24,720
the really real power of AI to do good
512
00:20:21,440 --> 00:20:26,280
so um some of the proteins in nature uh
513
00:20:24,720 --> 00:20:28,280
solve all the problems that came up
514
00:20:26,280 --> 00:20:29,840
during Evolution and we face all kinds
515
00:20:28,280 --> 00:20:31,280
of new problems in the world today you
516
00:20:29,840 --> 00:20:33,320
know we live longer so neur
517
00:20:31,280 --> 00:20:35,360
neurodegenerative diseases are important
518
00:20:33,320 --> 00:20:37,880
we're heating up and polluting the planet and
519
00:20:35,360 --> 00:20:41,080
these are really existential problems
520
00:20:37,880 --> 00:20:43,159
and uh now um you know maybe with
521
00:20:41,080 --> 00:20:44,960
Evolution another 100 million years
522
00:20:43,159 --> 00:20:46,799
proteins would evolve that would help
523
00:20:44,960 --> 00:20:49,120
address these but with protein design we
524
00:20:46,799 --> 00:20:51,480
can now design proteins to uh try and
525
00:20:49,120 --> 00:20:53,240
deal with these today and so we're
526
00:20:51,480 --> 00:20:55,480
designing proteins completely new
527
00:20:53,240 --> 00:20:57,080
proteins to do things ranging from
528
00:20:55,480 --> 00:21:00,280
breaking down plastic that's been
529
00:20:57,080 --> 00:21:02,360
released into the environment to um uh
530
00:21:00,280 --> 00:21:05,400
combating neurod degenerative disease
531
00:21:02,360 --> 00:21:07,559
and and cancer and Demis of course
532
00:21:05,400 --> 00:21:09,440
you're well known for being a co-founder
533
00:21:07,559 --> 00:21:11,840
of DeepMind the machine learning
534
00:21:09,440 --> 00:21:15,080
company and uh I mean you're a
535
00:21:11,840 --> 00:21:18,039
chess uh Champion you're a child prodigy
536
00:21:15,080 --> 00:21:19,760
really um you know making video games
537
00:21:18,039 --> 00:21:21,919
when you were only in your teens so here
538
00:21:19,760 --> 00:21:24,880
you are you've got a Nobel Prize under
539
00:21:21,919 --> 00:21:27,440
your belt as well um but you've already
540
00:21:24,880 --> 00:21:29,400
actually started using the research that
541
00:21:27,440 --> 00:21:31,039
for which you were awarded the prize
542
00:21:29,400 --> 00:21:33,480
along with John jumper that's right so
543
00:21:31,039 --> 00:21:35,440
we we are with our own collaborations um
544
00:21:33,480 --> 00:21:37,120
uh we've been uh working with uh
545
00:21:35,440 --> 00:21:41,039
institutes like the Drugs for Neglected
546
00:21:37,120 --> 00:21:43,520
Diseases part of the WHO uh and uh
547
00:21:41,039 --> 00:21:45,640
indeed because if you reduce the cost of
548
00:21:43,520 --> 00:21:48,120
of understanding what these proteins do
549
00:21:45,640 --> 00:21:50,240
you can go straight to drug design um
550
00:21:48,120 --> 00:21:51,360
that can help uh with a lot of the
551
00:21:50,240 --> 00:21:53,159
diseases that affect the poorer
552
00:21:51,360 --> 00:21:55,200
countries of the world where big pharma
553
00:21:53,159 --> 00:21:57,840
won't go because there isn't a return to
554
00:21:55,200 --> 00:21:59,480
be made so but in fact it affects uh you
555
00:21:57,840 --> 00:22:02,279
know a larger part of the of the world's
556
00:21:59,480 --> 00:22:03,559
population so um I think these these
557
00:22:02,279 --> 00:22:05,640
Technologies actually going back to our
558
00:22:03,559 --> 00:22:07,480
earlier conversation will help a lot of
559
00:22:05,640 --> 00:22:10,320
the poorer parts of the world by making
560
00:22:07,480 --> 00:22:11,679
the cost of Discovery so much um so much
561
00:22:10,320 --> 00:22:14,120
lower you know that it's within the
562
00:22:11,679 --> 00:22:16,120
scope then of NGOs and nonprofits anybody
563
00:22:14,120 --> 00:22:19,000
else want to chip in on this I mean I
564
00:22:16,120 --> 00:22:22,120
mean obviously I think this is just an
565
00:22:19,000 --> 00:22:25,880
amazing opportunity for science anything
566
00:22:22,120 --> 00:22:27,799
we can use to improve the scientific
567
00:22:25,880 --> 00:22:30,279
process can have can have not
568
00:22:27,799 --> 00:22:33,919
necessarily will have have can have
569
00:22:30,279 --> 00:22:36,200
great benefits but that doesn't change
570
00:22:33,919 --> 00:22:38,600
some of the tenor of the earlier
571
00:22:36,200 --> 00:22:41,480
conversation great tools also still
572
00:22:38,600 --> 00:22:44,080
create great risks Fritz Haber you know
573
00:22:41,480 --> 00:22:46,520
a Nobel Prize winner for work on which
574
00:22:44,080 --> 00:22:50,000
we depend every day with uh synthetic
575
00:22:46,520 --> 00:22:52,640
fertilizers you know also made chemical
576
00:22:50,000 --> 00:22:55,440
weapons for the German Army in World War
577
00:22:52,640 --> 00:22:57,279
one and directly causing the deaths of
578
00:22:55,440 --> 00:22:59,039
hundreds of thousands of people so the
579
00:22:57,279 --> 00:23:01,279
responsibility of scientist with
580
00:22:59,039 --> 00:23:03,840
powerful tools is no
581
00:23:01,279 --> 00:23:06,520
less we're seeing skepticism in all
582
00:23:03,840 --> 00:23:09,000
sorts of positions of power now aren't
583
00:23:06,520 --> 00:23:10,799
we um all over the world is is that
584
00:23:09,000 --> 00:23:12,919
something that worries you that policy
585
00:23:10,799 --> 00:23:16,240
makers don't perhaps understand the full
586
00:23:12,919 --> 00:23:19,360
complexity of science be it climate science
587
00:23:16,240 --> 00:23:22,000
or or you know other difficult
588
00:23:19,360 --> 00:23:24,240
issues well I would say it's also part
589
00:23:22,000 --> 00:23:25,480
of our responsibility that we have to
590
00:23:24,240 --> 00:23:29,960
work
591
00:23:25,480 --> 00:23:33,120
harder in getting people to trust
592
00:23:29,960 --> 00:23:35,240
science I think there is much greater
593
00:23:33,120 --> 00:23:38,039
skepticism about
594
00:23:35,240 --> 00:23:40,520
science and I don't know I don't think
595
00:23:38,039 --> 00:23:42,520
anybody knows exactly why but it is part
596
00:23:40,520 --> 00:23:45,960
of the general polarization but it's
597
00:23:42,520 --> 00:23:48,159
also probably the way that we are not
598
00:23:45,960 --> 00:23:50,120
properly communicating the uncertainties
599
00:23:48,159 --> 00:23:51,679
in science the disagreements in science
600
00:23:50,120 --> 00:23:54,320
what we are sure and what we are not
601
00:23:51,679 --> 00:23:58,960
sure so I think we do have a lot more
602
00:23:54,320 --> 00:24:01,600
responsibilities in building the
603
00:23:58,960 --> 00:24:03,799
Public's trust in the knowledge that's
604
00:24:01,600 --> 00:24:06,440
usable in order for that knowledge to be
605
00:24:03,799 --> 00:24:08,400
seamlessly applicable to good things
606
00:24:06,440 --> 00:24:10,799
Demis and then maybe Gary yeah yeah I
607
00:24:08,400 --> 00:24:12,960
think um I I agree with that and uh I
608
00:24:10,799 --> 00:24:16,360
think in the in just in the realm of AI
609
00:24:12,960 --> 00:24:18,440
I feel like um one of the benefits of uh
610
00:24:16,360 --> 00:24:20,000
the sort of chatbot era is AI is much
611
00:24:18,440 --> 00:24:21,720
more than just chatbots you know it's
612
00:24:20,000 --> 00:24:23,039
scientific tools and other things and
613
00:24:21,720 --> 00:24:25,039
and but that it has brought it to the
614
00:24:23,039 --> 00:24:26,520
Public's Consciousness and also made
615
00:24:25,039 --> 00:24:27,799
governments more aware of it and sort of
616
00:24:26,520 --> 00:24:29,559
brought it out of the realm of Science
617
00:24:27,799 --> 00:24:30,919
Fiction and I think that's good because
618
00:24:29,559 --> 00:24:33,559
I think in the last couple of years I've
619
00:24:30,919 --> 00:24:36,600
seen a lot more convening of government
620
00:24:33,559 --> 00:24:39,200
Civil Society um academic institutes to
621
00:24:36,600 --> 00:24:40,720
discuss the broader uh issues beyond the
622
00:24:39,200 --> 00:24:42,279
Technologies which I totally agree with
623
00:24:40,720 --> 00:24:44,000
by the way including things like what
624
00:24:42,279 --> 00:24:46,480
new institutes do we need how do do we
625
00:24:44,000 --> 00:24:49,279
distribute the the benefits of this uh
626
00:24:46,480 --> 00:24:51,720
uh widely um that's a societal problem
627
00:24:49,279 --> 00:24:53,640
it's not a technal technological problem
628
00:24:51,720 --> 00:24:55,480
and um we need to have a broad debate
629
00:24:53,640 --> 00:24:57,440
about that and we've started seeing that
630
00:24:55,480 --> 00:24:59,799
we've had a couple of um Global safety
631
00:24:57,440 --> 00:25:02,080
Summits about AI one in the UK one in
632
00:24:59,799 --> 00:25:03,399
South Korea next one's in France um and
633
00:25:02,080 --> 00:25:06,200
I think we need actually a higher
634
00:25:03,399 --> 00:25:08,559
intensity and more rapid discussion
635
00:25:06,200 --> 00:25:11,919
around those issues Gary do you want to
636
00:25:08,559 --> 00:25:14,080
come in here yeah I the engine of
637
00:25:11,919 --> 00:25:17,159
Western economies in terms of the
638
00:25:14,080 --> 00:25:19,960
revolution in the last 50 years has been
639
00:25:17,159 --> 00:25:21,640
technology and and uh science and
640
00:25:19,960 --> 00:25:26,679
Silicon Valley and that sort of thing in
641
00:25:21,640 --> 00:25:30,200
terms of and and um if you wanted to if
642
00:25:26,679 --> 00:25:32,120
you're an enemy of the West you want to
643
00:25:30,200 --> 00:25:34,840
destabilize that and so I think this
644
00:25:32,120 --> 00:25:37,480
whole Social Network I don't trust
645
00:25:34,840 --> 00:25:39,880
technology I don't trust any of the
646
00:25:37,480 --> 00:25:41,559
Enterprises I don't think that's evolved
647
00:25:39,880 --> 00:25:46,320
naturally I think that's been
648
00:25:41,559 --> 00:25:49,440
manipulated by bad agents and uh we have
649
00:25:46,320 --> 00:25:52,919
to be aware of that which bad agents I I
650
00:25:49,440 --> 00:25:55,799
think it's Russia and Iran I I don't
651
00:25:52,919 --> 00:25:59,440
think it's stupid to say that and uh
652
00:25:55,799 --> 00:26:01,279
politics yeah they're not not looking in
653
00:25:59,440 --> 00:26:03,640
our best interests I think there's other
654
00:26:01,279 --> 00:26:05,240
bad agents too I probably the energy
655
00:26:03,640 --> 00:26:07,520
industry would like you not to believe
656
00:26:05,240 --> 00:26:09,600
in climate change just like the tobacco
657
00:26:07,520 --> 00:26:11,240
industry knew very well that cigarettes
658
00:26:09,600 --> 00:26:13,360
cause cancer but they hid that fact for
659
00:26:11,240 --> 00:26:14,960
a long time you know if we cannot trust
660
00:26:13,360 --> 00:26:16,840
the energy companies we cannot trust
661
00:26:14,960 --> 00:26:18,880
pharmaceutical companies tobacco
662
00:26:16,840 --> 00:26:20,720
companies can we trust the tech
663
00:26:18,880 --> 00:26:23,760
companies which are extremely
664
00:26:20,720 --> 00:26:24,559
concentrated and if AI is so important
665
00:26:23,760 --> 00:26:27,679
what
666
00:26:24,559 --> 00:26:30,559
about uh the power of tech companies I
667
00:26:27,679 --> 00:26:33,039
don't know why ask me I don't work for a
668
00:26:30,559 --> 00:26:34,880
tech so you have an objective opinion no
669
00:26:33,039 --> 00:26:37,159
no but that's that's one aspect of the
670
00:26:34,880 --> 00:26:39,600
risks of AI that we didn't talk
671
00:26:37,159 --> 00:26:40,960
about okay well but just to take a more
672
00:26:39,600 --> 00:26:43,440
positive point of view again I mean
673
00:26:40,960 --> 00:26:45,559
despite the skepticism of about science
674
00:26:43,440 --> 00:26:48,279
and certainly um you don't have to look
675
00:26:45,559 --> 00:26:50,320
far in the US it should be pointed out
676
00:26:48,279 --> 00:26:52,919
that it was the response to covid with
677
00:26:50,320 --> 00:26:55,399
the mRNA vaccines was truly miraculous
678
00:26:52,919 --> 00:26:58,480
it was a technology that really had not
679
00:26:55,399 --> 00:27:00,200
been proven at at all and in very little
680
00:26:58,480 --> 00:27:02,480
time because it was it was this thing
681
00:27:00,200 --> 00:27:05,120
about having a common enemy and and a
682
00:27:02,480 --> 00:27:07,080
threat um you know we were able to
683
00:27:05,120 --> 00:27:09,200
mobilize very quickly try something
684
00:27:07,080 --> 00:27:11,760
really completely new and bring it to
685
00:27:09,200 --> 00:27:15,039
the point where it uh did a huge amount
686
00:27:11,760 --> 00:27:17,320
of good so um uh so there are reasons
687
00:27:15,039 --> 00:27:20,360
to be optimistic that were other threats
688
00:27:17,320 --> 00:27:22,960
to appear that uh a lot of the silliness
689
00:27:20,360 --> 00:27:24,640
would sort of filter out and um the
690
00:27:22,960 --> 00:27:28,440
correct actions would be taken and the
691
00:27:24,640 --> 00:27:30,240
Skeptics died okay well on that positive
692
00:27:28,440 --> 00:27:33,039
note let's just pause there for a moment
693
00:27:30,240 --> 00:27:35,399
and let's turn to the economics Nobel
694
00:27:33,039 --> 00:27:36,370
Prize and let's see why the award was
695
00:27:35,399 --> 00:27:37,840
made this
696
00:27:36,370 --> 00:27:40,960
[Music]
697
00:27:37,840 --> 00:27:43,880
year this year's prize in economics
698
00:27:40,960 --> 00:27:47,440
touches on historical injustices and
699
00:27:43,880 --> 00:27:49,720
cruelties as well as current events too
700
00:27:47,440 --> 00:27:51,919
the question of how economic development
701
00:27:49,720 --> 00:27:54,840
is connected to individual rights
702
00:27:51,919 --> 00:27:57,279
equality and decent political
703
00:27:54,840 --> 00:28:00,360
leaders when large parts of the world
704
00:27:57,279 --> 00:28:04,360
were colonized by European powers their
705
00:28:00,360 --> 00:28:07,039
approaches varied Daron Acemoglu Simon
706
00:28:04,360 --> 00:28:09,960
Johnson and James Robinson have shown
707
00:28:07,039 --> 00:28:12,600
that prosperity rose in places where the
708
00:28:09,960 --> 00:28:15,720
colonial authorities built functioning
709
00:28:12,600 --> 00:28:18,440
social institutions rather than simply
710
00:28:15,720 --> 00:28:21,480
exploiting the locals and their
711
00:28:18,440 --> 00:28:24,159
resources but no growth or improvements
712
00:28:21,480 --> 00:28:27,200
in lifestyle were created in societies
713
00:28:24,159 --> 00:28:28,799
where democracy and legal certainties
714
00:28:27,200 --> 00:28:31,679
were lacking
715
00:28:28,799 --> 00:28:34,559
the laureates' research also helps us
716
00:28:31,679 --> 00:28:37,720
understand why this is the case and
717
00:28:34,559 --> 00:28:39,570
could contribute to reducing income gaps
718
00:28:37,720 --> 00:28:41,919
between
719
00:28:39,570 --> 00:28:45,399
[Music]
720
00:28:41,919 --> 00:28:47,000
nations so Daron when you're both
721
00:28:45,399 --> 00:28:48,840
talking about the importance of
722
00:28:47,000 --> 00:28:50,840
democratic institutions what kind of
723
00:28:48,840 --> 00:28:56,200
Institutions are you talking
724
00:28:50,840 --> 00:28:58,480
about the uh label that we Simon Jim and
725
00:28:56,200 --> 00:29:01,399
I use is inclusive institutions
726
00:28:58,480 --> 00:29:04,960
meaning institutions that distribute
727
00:29:01,399 --> 00:29:07,519
political power and economic power and
728
00:29:04,960 --> 00:29:09,919
opportunity broadly in society and that
729
00:29:07,519 --> 00:29:11,519
requires certain political institutions
730
00:29:09,919 --> 00:29:13,799
that provide voice to people so that
731
00:29:11,519 --> 00:29:16,559
they can participate their views are
732
00:29:13,799 --> 00:29:18,480
expressed and also constraints on the
733
00:29:16,559 --> 00:29:19,679
exercise of that power so you just
734
00:29:18,480 --> 00:29:22,399
talking about really the checks and
735
00:29:19,679 --> 00:29:24,799
balances we see in you know set down in
736
00:29:22,399 --> 00:29:28,240
constitutions like a an independent
737
00:29:24,799 --> 00:29:30,000
legislature a free Judiciary freedom of
738
00:29:28,240 --> 00:29:31,960
speech with you know a me the media
739
00:29:30,000 --> 00:29:35,080
being able to operate as it absolutely
740
00:29:31,960 --> 00:29:37,720
absolutely but that's not enough partly
741
00:29:35,080 --> 00:29:41,200
because what you write in a constitution
742
00:29:37,720 --> 00:29:43,120
is not going to get enforced unless
743
00:29:41,200 --> 00:29:45,840
there is a general empowerment of the
744
00:29:43,120 --> 00:29:49,559
people so Constitutions are sometimes
745
00:29:45,840 --> 00:29:52,880
changed just like shirts and uh it
746
00:29:49,559 --> 00:29:56,360
doesn't mean anything unless it becomes
747
00:29:52,880 --> 00:29:58,480
enforced but you have also seen um
748
00:29:56,360 --> 00:30:00,720
countries Prosper economically which
749
00:29:58,480 --> 00:30:02,919
have been governed by fairly
750
00:30:00,720 --> 00:30:05,640
authoritarian governments haven't you I
751
00:30:02,919 --> 00:30:08,320
mean often we talk about Lee Kuan Yew in
752
00:30:05,640 --> 00:30:10,120
Singapore or Mahathir Mohamad in Malaysia
753
00:30:08,320 --> 00:30:12,279
for instance yeah I think that's not the
754
00:30:10,120 --> 00:30:14,200
general pattern I mean so there are
755
00:30:12,279 --> 00:30:16,120
examples like that of course but for
756
00:30:14,200 --> 00:30:18,240
every example like that there's far more
757
00:30:16,120 --> 00:30:20,440
examples of autocratic societies that
758
00:30:18,240 --> 00:30:23,320
have not flourished economically you
759
00:30:20,440 --> 00:30:25,519
know if if you can create inclusive
760
00:30:23,320 --> 00:30:27,600
Economic Institutions even under a
761
00:30:25,519 --> 00:30:29,919
politically kind of autocratic Society
762
00:30:27,600 --> 00:30:31,600
you can flourish economically at least
763
00:30:29,919 --> 00:30:34,039
transitorily you know that's what
764
00:30:31,600 --> 00:30:36,080
happened in China you know starting in
765
00:30:34,039 --> 00:30:37,960
the late 1970s it was the movement
766
00:30:36,080 --> 00:30:39,960
towards a much more inclusive economy
767
00:30:37,960 --> 00:30:42,000
giving people the right to make
768
00:30:39,960 --> 00:30:44,440
decisions making them residual claimants
769
00:30:42,000 --> 00:30:46,080
on their own efforts and you know so so
770
00:30:44,440 --> 00:30:48,039
that that's what generated economic
771
00:30:46,080 --> 00:30:50,159
growth but our our view is that you know
772
00:30:48,039 --> 00:30:53,159
you can't sustain an economy like that
773
00:30:50,159 --> 00:30:56,320
under a autocratic political system it
774
00:30:53,159 --> 00:30:58,519
can be there for a transitory period but
775
00:30:56,320 --> 00:31:00,679
it's not sustainable a lot of research
776
00:30:58,519 --> 00:31:03,080
is based on countries which have been
777
00:31:00,679 --> 00:31:04,559
colonized and um there's been a lot of
778
00:31:03,080 --> 00:31:06,880
debate of course particularly in the
779
00:31:04,559 --> 00:31:10,080
United Kingdom because of the historical
780
00:31:06,880 --> 00:31:12,080
you know Great British Empire and um
781
00:31:10,080 --> 00:31:13,559
whether it was good or bad for the
782
00:31:12,080 --> 00:31:15,639
countries that were colonized
783
00:31:13,559 --> 00:31:18,080
practically all of Africa but you say
784
00:31:15,639 --> 00:31:21,760
that colonization often brought about a
785
00:31:18,080 --> 00:31:24,399
reversal in economic fortunes of the
786
00:31:21,760 --> 00:31:25,919
colonized people so just unpack for us
787
00:31:24,399 --> 00:31:29,200
why you say that because it sounds like
788
00:31:25,919 --> 00:31:31,840
you're saying colonization was bad for
789
00:31:29,200 --> 00:31:34,120
the people I I think colonization was a
790
00:31:31,840 --> 00:31:37,639
disaster AB absolutely but of course it
791
00:31:34,120 --> 00:31:39,159
did create prosperous Societies in parts
792
00:31:37,639 --> 00:31:41,360
of the world in North America and
793
00:31:39,159 --> 00:31:43,480
Australasia but for the indigenous
794
00:31:41,360 --> 00:31:46,600
people it was a catastrophe you know
795
00:31:43,480 --> 00:31:48,880
diseases wiped out 90% of the population
796
00:31:46,600 --> 00:31:50,919
of the Americas people were exploited
797
00:31:48,880 --> 00:31:52,960
they had their lands and livelihoods
798
00:31:50,919 --> 00:31:56,120
destroyed their communities destroyed I
799
00:31:52,960 --> 00:31:58,120
mean absolutely yes so so so I don't you
800
00:31:56,120 --> 00:31:59,840
know so I don't think there's much to
801
00:31:58,120 --> 00:32:02,159
debate about that in my view I think this
802
00:31:59,840 --> 00:32:03,760
notion of reversal you know the Americas
803
00:32:02,159 --> 00:32:06,240
is very clear in the Americas you know
804
00:32:03,760 --> 00:32:07,880
at the time you go back 500 years
805
00:32:06,240 --> 00:32:09,919
where were the prosperous parts of the
806
00:32:07,880 --> 00:32:13,919
Americas Central America the Central
807
00:32:09,919 --> 00:32:16,519
Valley of Mexico the Andean region the Inca Empire
808
00:32:13,919 --> 00:32:18,600
you know the Mexicas the Valley of Oaxaca
809
00:32:16,519 --> 00:32:21,120
you know there you had writing
810
00:32:18,600 --> 00:32:23,559
you had political complexity you had
811
00:32:21,120 --> 00:32:25,399
economic organization sophistication
812
00:32:23,559 --> 00:32:28,360
whatever the southern cone of Latin
813
00:32:25,399 --> 00:32:30,519
America North America far behind
814
00:32:28,360 --> 00:32:33,000
you know and then this gets completely
815
00:32:30,519 --> 00:32:35,000
reversed during the colonial period and
816
00:32:33,000 --> 00:32:36,880
the places that were relatively poor
817
00:32:35,000 --> 00:32:38,799
then become relatively prosperous so
818
00:32:36,880 --> 00:32:41,399
that's where you see the reversal in a
819
00:32:38,799 --> 00:32:43,440
very clear way right I want to bring you
820
00:32:41,399 --> 00:32:45,880
in Demis because your mother is
821
00:32:43,440 --> 00:32:47,960
Singaporean or Singaporean born uh you
822
00:32:45,880 --> 00:32:50,080
were brought up in Britain of course but
823
00:32:47,960 --> 00:32:52,360
um what do you think when you hear about
824
00:32:50,080 --> 00:32:54,200
this kind of thing about democracy and
825
00:32:52,360 --> 00:32:55,360
prosperity and well it's very I mean
826
00:32:54,200 --> 00:32:56,720
it's very interesting obviously I've
827
00:32:55,360 --> 00:32:58,919
heard from my mother the sort of
828
00:32:56,720 --> 00:33:00,240
economic miracle that Lee Kuan Yew brought to
829
00:32:58,919 --> 00:33:02,320
Singapore and he's
830
00:33:00,240 --> 00:33:04,039
rightly revered for that I don't know
831
00:33:02,320 --> 00:33:06,480
obviously this is not my area but it's
832
00:33:04,039 --> 00:33:08,159
it's how do you how do you try and you
833
00:33:06,480 --> 00:33:10,399
know how are these institutions going to
834
00:33:08,159 --> 00:33:12,159
be built in the places where they aren't
835
00:33:10,399 --> 00:33:13,880
um is there external is it going to be
836
00:33:12,159 --> 00:33:15,760
external encouragement or does it have to
837
00:33:13,880 --> 00:33:17,240
happen internally or you know how is
838
00:33:15,760 --> 00:33:19,559
that going to happen or do you just have to be
839
00:33:17,240 --> 00:33:21,840
lucky with finding the right leader like
840
00:33:19,559 --> 00:33:23,440
like a Lee Kuan Yew yeah I mean I think you
841
00:33:21,840 --> 00:33:25,320
know the success stories are all they
842
00:33:23,440 --> 00:33:27,799
all come from within people build the
843
00:33:25,320 --> 00:33:29,600
institutions in their own context I mean
844
00:33:27,799 --> 00:33:31,120
there are you know I think Lee Kuan Yew is a
845
00:33:29,600 --> 00:33:33,159
sort of fascinating person he's not the
846
00:33:31,120 --> 00:33:36,240
only person in the world like that you
847
00:33:33,159 --> 00:33:38,639
know you had Seretse Khama in
848
00:33:36,240 --> 00:33:41,440
Botswana you know you have other kind of
849
00:33:38,639 --> 00:33:44,039
outstanding leaders but I think on
850
00:33:41,440 --> 00:33:46,480
average you know the evidence suggests
851
00:33:44,039 --> 00:33:48,919
autocratic regimes don't do as well as
852
00:33:46,480 --> 00:33:50,919
Democratic ones and sure you know people
853
00:33:48,919 --> 00:33:53,840
matter individuals matter having good
854
00:33:50,919 --> 00:33:55,200
leaders matter where do you find a Lee Kuan Yew you
855
00:33:53,840 --> 00:33:57,279
know that's what I was going to ask you so
856
00:33:55,200 --> 00:33:59,000
then if it has to come from within you
857
00:33:57,279 --> 00:34:01,000
know what so you're pointing out with
858
00:33:59,000 --> 00:34:03,200
your great work like what the issues are
859
00:34:01,000 --> 00:34:05,279
but how other than wait for the
860
00:34:03,200 --> 00:34:07,559
right you know Mandela or Lee Kuan Yew to come
861
00:34:05,279 --> 00:34:09,480
along which is very rare as you say what
862
00:34:07,559 --> 00:34:11,119
else can be done to you know encourage
863
00:34:09,480 --> 00:34:12,599
those institutions to be built yeah but
864
00:34:11,119 --> 00:34:14,679
there are lots of institutions that are built
865
00:34:12,599 --> 00:34:17,240
without famous leaders I think the track
866
00:34:14,679 --> 00:34:18,679
record of external imposition of
867
00:34:17,240 --> 00:34:21,599
Institutions is not very good there are
868
00:34:18,679 --> 00:34:24,839
a few cases where you can point to but
869
00:34:21,599 --> 00:34:27,240
uh but generally institutions are built
870
00:34:24,839 --> 00:34:30,440
organically but there are influences out
871
00:34:27,240 --> 00:34:34,159
there so one of the cases Jim already
872
00:34:30,440 --> 00:34:36,879
hinted at Seretse Khama's Botswana you know an
873
00:34:34,159 --> 00:34:38,399
amazingly successful democracy in
874
00:34:36,879 --> 00:34:40,200
sub-Saharan Africa an amazingly
875
00:34:38,399 --> 00:34:43,280
successful country in terms of economic
876
00:34:40,200 --> 00:34:45,879
growth uh very rapid growth on the whole
877
00:34:43,280 --> 00:34:48,200
and it was actually all existing
878
00:34:45,879 --> 00:34:50,399
pre-colonial institutions that were the
879
00:34:48,200 --> 00:34:52,520
basis of something more democratic but leadership
880
00:34:50,399 --> 00:34:55,960
there mattered too so you need
881
00:34:52,520 --> 00:34:57,720
a combination so I think facilitating
882
00:34:55,960 --> 00:34:59,839
institution building domestically
883
00:34:57,720 --> 00:35:03,079
providing tools for them and getting rid
884
00:34:59,839 --> 00:35:05,680
of our hindrances often you know Western
885
00:35:03,079 --> 00:35:08,480
and Russian powers or sometimes Chinese
886
00:35:05,680 --> 00:35:10,280
powers interfering in other countries' uh
887
00:35:08,480 --> 00:35:11,960
domestic affairs is not conducive to
888
00:35:10,280 --> 00:35:13,280
better institution building but at the
889
00:35:11,960 --> 00:35:16,040
end of the day institutions are going to
890
00:35:13,280 --> 00:35:18,560
be built bottom up okay so look a major
891
00:35:16,040 --> 00:35:20,599
theme of this year's Nobel prizes has
892
00:35:18,560 --> 00:35:21,920
been artificial intelligence so James
893
00:35:20,599 --> 00:35:26,280
let me ask you then if you think
894
00:35:21,920 --> 00:35:28,280
technology AI could help Africa develop
895
00:35:26,280 --> 00:35:30,320
but Africa has not been benefiting from
896
00:35:28,280 --> 00:35:32,920
all this technology I'm saying it
897
00:35:30,320 --> 00:35:34,520
could but to do that many things have to
898
00:35:32,920 --> 00:35:36,680
change many things have to change
899
00:35:34,520 --> 00:35:38,680
institutions have to change politics has
900
00:35:36,680 --> 00:35:41,200
to change you know people's trust all
901
00:35:38,680 --> 00:35:43,560
sorts of things have to change and what
902
00:35:41,200 --> 00:35:46,359
about the impact of technology AI for
903
00:35:43,560 --> 00:35:48,680
instance on Democracy really I am
904
00:35:46,359 --> 00:35:51,119
talking about the impact on jobs to what
905
00:35:48,680 --> 00:35:54,480
extent there'll be displacement of human
906
00:35:51,119 --> 00:35:58,319
activity and jobs by machines yeah I
907
00:35:54,480 --> 00:36:01,359
mean I think that's a huge risk I
908
00:35:58,319 --> 00:36:03,480
believe that humans would have a very
909
00:36:01,359 --> 00:36:06,680
difficult time building their social
910
00:36:03,480 --> 00:36:09,640
systems and communities if they become
911
00:36:06,680 --> 00:36:13,280
majorly sidelined and they feel they
912
00:36:09,640 --> 00:36:15,839
don't have dignity or use or a way to
913
00:36:13,280 --> 00:36:17,160
contribute to the social good from your
914
00:36:15,839 --> 00:36:19,200
perspective I mean there have been a lot
915
00:36:17,160 --> 00:36:21,560
of advances in technology over the last
916
00:36:19,200 --> 00:36:23,640
100 years have any of them really
917
00:36:21,560 --> 00:36:26,480
caused massive displacement of jobs I
918
00:36:23,640 --> 00:36:28,200
mean already you know um a lot
919
00:36:26,480 --> 00:36:29,680
of these Technologies are out there but
920
00:36:28,200 --> 00:36:31,520
have they reduced the number of jobs
921
00:36:29,680 --> 00:36:33,119
yeah yeah it has happened it has
922
00:36:31,520 --> 00:36:34,839
happened I mean the early phase of the
923
00:36:33,119 --> 00:36:36,960
Industrial Revolution which was a
924
00:36:34,839 --> 00:36:40,000
lot about automation there were huge
925
00:36:36,960 --> 00:36:42,040
displacements huge wage losses
926
00:36:40,000 --> 00:36:44,240
you know people's wages within 20
927
00:36:42,040 --> 00:36:46,640
years in real terms fell for some
928
00:36:44,240 --> 00:36:48,400
people to one-third of what they were
929
00:36:46,640 --> 00:36:52,240
that's just a tremendous yes but
930
00:36:48,400 --> 00:36:53,960
then in the end it became better so I
931
00:36:52,240 --> 00:36:55,720
yes so in my view there will be a lot of
932
00:36:53,960 --> 00:36:57,680
disruption like these other like the
933
00:36:55,720 --> 00:37:00,319
Industrial Revolution
934
00:36:57,680 --> 00:37:02,000
90 years it took 90 years I don't think
935
00:37:00,319 --> 00:37:05,440
that's what we want to put up with but there
936
00:37:02,000 --> 00:37:07,480
could be new classes of jobs I mean most
937
00:37:05,440 --> 00:37:09,079
but those new classes of jobs they're
938
00:37:07,480 --> 00:37:13,000
not automatic so there are like two ways
939
00:37:09,079 --> 00:37:14,560
of thinking uh on this beyond
940
00:37:13,000 --> 00:37:16,200
artificial general intelligence one is
941
00:37:14,560 --> 00:37:17,839
that you introduce these disruptive
942
00:37:16,200 --> 00:37:19,920
Technologies and the system
943
00:37:17,839 --> 00:37:21,880
automatically adjusts nobody needs to do
944
00:37:19,920 --> 00:37:24,680
anything no policy maker no scientist
945
00:37:21,880 --> 00:37:26,040
nor technologist the system will adjust
946
00:37:24,680 --> 00:37:28,040
I think that is just
947
00:37:26,040 --> 00:37:30,160
contradicted by history the way that it
948
00:37:28,040 --> 00:37:32,280
works is that we all have to work in
949
00:37:30,160 --> 00:37:34,400
order to make things better including
950
00:37:32,280 --> 00:37:36,480
technologists so that we actually use the
951
00:37:34,400 --> 00:37:38,440
scientific knowledge to create new tasks
952
00:37:36,480 --> 00:37:40,319
and more capabilities for humans rather than
953
00:37:38,440 --> 00:37:41,720
just sidelining them I mean we've
954
00:37:40,319 --> 00:37:43,560
talked about the lessons of
955
00:37:41,720 --> 00:37:45,880
history but we saw with the printing
956
00:37:43,560 --> 00:37:49,040
press Revolution people who were writing
957
00:37:45,880 --> 00:37:51,720
books were put out of business but then
958
00:37:49,040 --> 00:37:53,800
lots of new jobs were created through
959
00:37:51,720 --> 00:37:56,440
publishing but look at the last 40 years
960
00:37:53,800 --> 00:37:59,560
the US is an extreme case yeah
961
00:37:56,440 --> 00:38:01,640
but roughly speaking I'm exaggerating a
962
00:37:59,560 --> 00:38:03,880
little bit but about half of the US
963
00:38:01,640 --> 00:38:07,000
population those who don't have college
964
00:38:03,880 --> 00:38:11,520
degrees have had almost no growth in
965
00:38:07,000 --> 00:38:15,560
their real incomes until about 2015 from
966
00:38:11,520 --> 00:38:17,119
1980 so no new jobs of import were
967
00:38:15,560 --> 00:38:19,400
created for them there were a lot of new
968
00:38:17,119 --> 00:38:21,560
jobs in the 1990s and 2000s but they
969
00:38:19,400 --> 00:38:23,960
were all for people with postgraduate
970
00:38:21,560 --> 00:38:26,119
degrees and specialized
971
00:38:23,960 --> 00:38:28,079
knowledge okay Geoff Hinton do you think
972
00:38:26,119 --> 00:38:30,560
that this increase in productivity
973
00:38:28,079 --> 00:38:33,079
essentially that will come with um
974
00:38:30,560 --> 00:38:35,480
automation and so on and so forth is
975
00:38:33,079 --> 00:38:37,359
a good thing for society well it ought
976
00:38:35,480 --> 00:38:41,040
to be right I mean it's crazy we're
977
00:38:37,359 --> 00:38:42,960
talking about um having a huge
978
00:38:41,040 --> 00:38:44,680
increase in productivity so there's
979
00:38:42,960 --> 00:38:46,599
going to be more goods and services for
980
00:38:44,680 --> 00:38:48,839
everybody so everybody ought to be
981
00:38:46,599 --> 00:38:50,599
better off but actually it's going to be
982
00:38:48,839 --> 00:38:52,599
the other way around and it's because we
983
00:38:50,599 --> 00:38:54,599
live in a capitalist society and so
984
00:38:52,599 --> 00:38:56,560
what's going to happen is this huge
985
00:38:54,599 --> 00:38:58,040
increase in productivity is going to
986
00:38:56,560 --> 00:38:59,960
make much more money for the big
987
00:38:58,040 --> 00:39:01,720
companies and the rich and it's going to
988
00:38:59,960 --> 00:39:03,800
increase the gap between the rich and
989
00:39:01,720 --> 00:39:06,200
the people who lose their jobs and as
990
00:39:03,800 --> 00:39:09,079
soon as you increase that Gap you get
991
00:39:06,200 --> 00:39:13,079
fertile ground for fascism and so it's
992
00:39:09,079 --> 00:39:14,599
very scary that um we may be at a point
993
00:39:13,079 --> 00:39:17,359
where we're just making things worse and
994
00:39:14,599 --> 00:39:19,240
worse and it's crazy because we're doing
995
00:39:17,359 --> 00:39:20,800
something that should help everybody and
996
00:39:19,240 --> 00:39:23,760
obviously it will help in healthcare it
997
00:39:20,800 --> 00:39:25,440
will help in education but if the profits
998
00:39:23,760 --> 00:39:28,520
just go to the rich that's going to make
999
00:39:25,440 --> 00:39:31,040
Society worse so okay let's look at the
1000
00:39:28,520 --> 00:39:34,520
last award and that's the Nobel Prize
1001
00:39:31,040 --> 00:39:37,960
for medicine or physiology and this is
1002
00:39:34,520 --> 00:39:40,880
why it was awarded this
1003
00:39:37,960 --> 00:39:44,040
year our organs and tissues are made up
1004
00:39:40,880 --> 00:39:47,079
of many varied types of cells they all
1005
00:39:44,040 --> 00:39:48,520
have identical genetic material but
1006
00:39:47,079 --> 00:39:50,640
different
1007
00:39:48,520 --> 00:39:54,160
characteristics this year's medicine
1008
00:39:50,640 --> 00:39:56,319
laureates Gary Ruvkun and Victor Ambros
1009
00:39:54,160 --> 00:39:59,720
have shown how a new form of Gene
1010
00:39:56,319 --> 00:40:01,920
regulation microRNA is crucial in
1011
00:39:59,720 --> 00:40:05,400
ensuring that the different cells of
1012
00:40:01,920 --> 00:40:08,319
organisms such as muscle or nerve cells
1013
00:40:05,400 --> 00:40:11,720
get the functions they need it's already
1014
00:40:08,319 --> 00:40:14,280
known that abnormal levels of microRNA
1015
00:40:11,720 --> 00:40:16,920
increase the risk of cancer but the
1016
00:40:14,280 --> 00:40:19,920
laureates' research could lead to developing
1017
00:40:16,920 --> 00:40:23,520
new Diagnostics and treatments for
1018
00:40:19,920 --> 00:40:26,319
example it could map how microRNA varies
1019
00:40:23,520 --> 00:40:30,440
in different diseases helping unlock
1020
00:40:26,319 --> 00:40:30,440
prognosis for the development of
1021
00:40:31,079 --> 00:40:37,520
diseases so Gary your research was based
1022
00:40:34,680 --> 00:40:39,720
on um looking at mutant strains of the
1023
00:40:37,520 --> 00:40:41,440
roundworm um actually it should have
1024
00:40:39,720 --> 00:40:44,800
its own Nobel Prize shouldn't it It's
1025
00:40:41,440 --> 00:40:48,359
featured so much in research that's led
1026
00:40:44,800 --> 00:40:50,920
to Nobel prizes but um just tell us what
1027
00:40:48,359 --> 00:40:55,119
does your work with roundworms tell us
1028
00:40:50,920 --> 00:40:58,599
about genetic mutations in humans doing
1029
00:40:55,119 --> 00:41:00,720
genetics is a form of doing what
1030
00:40:58,599 --> 00:41:04,800
evolution has been doing for 4 billion
1031
00:41:00,720 --> 00:41:07,760
years our planet is a genetic
1032
00:41:04,800 --> 00:41:11,079
experiment that has been generating
1033
00:41:07,760 --> 00:41:14,560
diverse life from primitive life over
1034
00:41:11,079 --> 00:41:16,520
four billion years by inducing variation
1035
00:41:14,560 --> 00:41:19,599
to give you the tree of life that goes
1036
00:41:16,520 --> 00:41:23,079
you know to bats and to plants and to
1037
00:41:19,599 --> 00:41:26,040
bacteria and we do that on one organism
1038
00:41:23,079 --> 00:41:28,680
and the reason it works so well is that
1039
00:41:26,040 --> 00:41:32,160
Evolution has evolved a way to generate
1040
00:41:28,680 --> 00:41:34,400
diversity by mutating that's what's
1041
00:41:32,160 --> 00:41:36,720
all around us you know when you see a
1042
00:41:34,400 --> 00:41:39,280
green tree it's because
1043
00:41:36,720 --> 00:41:42,000
photosynthesis was developed two billion
1044
00:41:39,280 --> 00:41:45,680
years ago the reason we can breathe
1045
00:41:42,000 --> 00:41:48,720
oxygen is because photosynthesis evolved
1046
00:41:45,680 --> 00:41:50,800
and it wasn't there beforehand and so
1047
00:41:48,720 --> 00:41:52,680
what we're doing is that process and
1048
00:41:50,800 --> 00:41:55,720
that's why it works so well so the
1049
00:41:52,680 --> 00:41:58,520
reason uh the worm has gotten four Nobel
1050
00:41:55,720 --> 00:42:01,960
prizes is that and it's the worm that
1051
00:41:58,520 --> 00:42:04,359
got it so you know we're just the
1052
00:42:01,960 --> 00:42:07,839
operators uh and how big is it
1053
00:42:04,359 --> 00:42:10,640
yeah yeah it's very tiny it's very tiny
1054
00:42:07,839 --> 00:42:14,040
it's a millimeter long it you know it
1055
00:42:10,640 --> 00:42:15,680
has 959 cells that's different from us
1056
00:42:14,040 --> 00:42:18,400
right not every one of our cells
1057
00:42:15,680 --> 00:42:21,280
has a name you know but
1058
00:42:18,400 --> 00:42:23,520
every cell in a worm has a name and that
1059
00:42:21,280 --> 00:42:27,200
attracted a kind of cohort of people who
1060
00:42:23,520 --> 00:42:29,119
like names names are important and we
1061
00:42:27,200 --> 00:42:31,240
were thinking well we can learn a
1062
00:42:29,119 --> 00:42:33,880
lot about how biology works by sort of
1063
00:42:31,240 --> 00:42:35,760
following cells what's their history
1064
00:42:33,880 --> 00:42:38,000
what do they become how much do they
1065
00:42:35,760 --> 00:42:39,800
talk to each other but we figure it out
1066
00:42:38,000 --> 00:42:41,400
by breaking it but it's I mean it's
1067
00:42:39,800 --> 00:42:46,200
extraordinary that a human has about
1068
00:42:41,400 --> 00:42:47,440
20,000 genes and a worm has 20,000 genes
1069
00:42:46,200 --> 00:42:50,319
that's why we really are too
1070
00:42:47,440 --> 00:42:52,480
self-important we're just not you know
1071
00:42:50,319 --> 00:42:55,480
humans are just not that great you know
1072
00:42:52,480 --> 00:42:58,359
we're we're fine I'm happy to be a human
1073
00:42:55,480 --> 00:43:02,119
I don't want to be a worm but you
1074
00:42:58,359 --> 00:43:04,280
know you know a bacteria has 4,000 genes
1075
00:43:02,119 --> 00:43:07,640
that's not very different from 20,000
1076
00:43:04,280 --> 00:43:10,000
I'm sorry right and you know people say
1077
00:43:07,640 --> 00:43:13,480
oh geez if you look for life on other
1078
00:43:10,000 --> 00:43:16,000
planets it's bacteria how boring you got
1079
00:43:13,480 --> 00:43:17,839
it all wrong folks bacteria are totally
1080
00:43:16,000 --> 00:43:19,880
awesome yeah yeah but so what you're
1081
00:43:17,839 --> 00:43:22,440
saying essentially is that mutations
1082
00:43:19,880 --> 00:43:25,240
obviously can be bad because they can
1083
00:43:22,440 --> 00:43:27,359
lead to all sorts of genetic um illnesses
1084
00:43:25,240 --> 00:43:29,520
and so on but they're not always bad and
1085
00:43:27,359 --> 00:43:31,720
some are actually relatively
1086
00:43:29,520 --> 00:43:34,200
insignificant like you're color blind
1087
00:43:31,720 --> 00:43:36,200
aren't you for instance I mean that's a
1088
00:43:34,200 --> 00:43:39,240
genetic mutation it is and it's a
1089
00:43:36,200 --> 00:43:42,280
debilitating mutation for me because uh in
1090
00:43:39,240 --> 00:43:45,920
the days of black and white um
1091
00:43:42,280 --> 00:43:48,839
publishing I was king things were fine
1092
00:43:45,920 --> 00:43:52,040
and then everything became color you
1093
00:43:48,839 --> 00:43:55,000
know I have to say so like with uh our
1094
00:43:52,040 --> 00:43:57,240
little worm I'll go to a seminar and
1095
00:43:55,000 --> 00:43:59,520
people are presenting graphs with red
1096
00:43:57,240 --> 00:44:02,200
green and I come out going geez that
1097
00:43:59,520 --> 00:44:04,000
was just complete horse what I didn't
1098
00:44:02,200 --> 00:44:07,640
and people said oh it was fantastic you
1099
00:44:04,000 --> 00:44:11,000
didn't see I wrote to Google Maps and
1100
00:44:07,640 --> 00:44:14,160
said you guys you do traffic in red and
1101
00:44:11,000 --> 00:44:17,839
green and green means things are fine I can't see it
1102
00:44:14,160 --> 00:44:20,680
and you're losing 4% of the users
1103
00:44:17,839 --> 00:44:20,680
and it's the best
1104
00:44:20,920 --> 00:44:25,680
4% but I mean they did not
1105
00:44:23,480 --> 00:44:27,599
respond you wanted to be an electrical
1106
00:44:25,680 --> 00:44:30,599
engineer originally didn't you
1107
00:44:27,599 --> 00:44:34,280
well yes I did Electronics as a kid cuz
1108
00:44:30,599 --> 00:44:38,280
I loved electronics and I built kits uh
1109
00:44:34,280 --> 00:44:41,000
I built a shortwave radio with a $39 kit
1110
00:44:38,280 --> 00:44:44,400
made with vacuum tubes this is before
1111
00:44:41,000 --> 00:44:45,800
you know transistors but the resistors
1112
00:44:44,400 --> 00:44:48,800
have a color
1113
00:44:45,800 --> 00:44:51,160
code right and it tells you how
1114
00:44:48,800 --> 00:44:54,640
many ohms it is and that's how much
1115
00:44:51,160 --> 00:44:56,200
resistance it has and so I didn't know
1116
00:44:54,640 --> 00:44:58,960
it I didn't know I was color blind at
1117
00:44:56,200 --> 00:45:02,040
the time so I put it together and the
1118
00:44:58,960 --> 00:45:03,280
test for how well you did whether it's
1119
00:45:02,040 --> 00:45:07,480
going to work is you turn it on and if
1120
00:45:03,280 --> 00:45:09,440
it doesn't smoke that's good and my
1121
00:45:07,480 --> 00:45:11,160
electronic assembly didn't pass the
1122
00:45:09,440 --> 00:45:15,000
smoke
1123
00:45:11,160 --> 00:45:18,599
test so um let's go for another question
1124
00:45:15,000 --> 00:45:21,280
now from our audience Jasmin Kovitz what
1125
00:45:18,599 --> 00:45:24,040
do you want to ask Professor Ruvkun was
1126
00:45:21,280 --> 00:45:25,960
microRNA an unexpected find or was it a
1127
00:45:24,040 --> 00:45:28,760
part of your hypothesis while conducting
1128
00:45:25,960 --> 00:45:31,559
your research there was no hypothesis on
1129
00:45:28,760 --> 00:45:34,119
that no no no no no it was a complete
1130
00:45:31,559 --> 00:45:37,640
surprise and I love
1131
00:45:34,119 --> 00:45:41,520
surprises and really you know that's the
1132
00:45:37,640 --> 00:45:44,720
beauty of doing genetics is that what
1133
00:45:41,520 --> 00:45:47,040
comes out is what teaches you right it's
1134
00:45:44,720 --> 00:45:49,440
and you know you do a mutagenesis you
1135
00:45:47,040 --> 00:45:51,800
get an animal that looks like what you
1136
00:45:49,440 --> 00:45:53,640
were looking for part of the search is
1137
00:45:51,800 --> 00:45:56,319
saying what am I going to look for
1138
00:45:53,640 --> 00:45:59,359
that's the art of it how did your
1139
00:45:56,319 --> 00:46:00,839
research with Victor Ambros um go
1140
00:45:59,359 --> 00:46:03,599
down when you first published it in the
1141
00:46:00,839 --> 00:46:06,319
early 1990s it was in a little
1142
00:46:03,599 --> 00:46:08,280
corner of biology this worm and there
1143
00:46:06,319 --> 00:46:11,240
was a sense when you would deliver a
1144
00:46:08,280 --> 00:46:13,040
paper or go to give a talk about it that
1145
00:46:11,240 --> 00:46:15,400
well it's a worm who cares you know and
1146
00:46:13,040 --> 00:46:18,359
it's a weird little animal until we
1147
00:46:15,400 --> 00:46:22,040
discovered that it was in the human genome
1148
00:46:18,359 --> 00:46:24,559
and then many other genomes and it's
1149
00:46:22,040 --> 00:46:26,960
been embraced and what was
1150
00:46:24,559 --> 00:46:28,960
especially sort of empowering to it
1151
00:46:26,960 --> 00:46:31,240
was that it intersected with RNA
1152
00:46:28,960 --> 00:46:34,160
interference which is an antiviral
1153
00:46:31,240 --> 00:46:36,640
response and people really care now of
1154
00:46:34,160 --> 00:46:39,440
course about antiviral responses of
1155
00:46:36,640 --> 00:46:41,040
course I mean how did you all find doing
1156
00:46:39,440 --> 00:46:43,520
your research I mean just listening to
1157
00:46:41,040 --> 00:46:46,079
what Gary is saying did you encounter
1158
00:46:43,520 --> 00:46:49,559
setbacks can you define particular
1159
00:46:46,079 --> 00:46:52,680
moments in your research when you really felt
1160
00:46:49,559 --> 00:46:54,960
that you were on a winning streak
1161
00:46:52,680 --> 00:46:57,440
did people discourage you from what you
1162
00:46:54,960 --> 00:47:01,720
were doing all of the above
1163
00:46:57,440 --> 00:47:05,359
really I mean you know Academia is
1164
00:47:01,720 --> 00:47:09,400
really hard at some level you know you
1165
00:47:05,359 --> 00:47:12,000
work sometimes three years on a project
1166
00:47:09,400 --> 00:47:16,000
and then somebody anonymously destroys
1167
00:47:12,000 --> 00:47:18,960
it so that's very very difficult to get
1168
00:47:16,000 --> 00:47:21,160
used to so I do a lot of coaching with
1169
00:47:18,960 --> 00:47:22,960
my graduate students to get them ready
1170
00:47:21,160 --> 00:47:26,319
for that but on the other hand I found
1171
00:47:22,960 --> 00:47:28,800
Academia to be quite open-minded as well
1172
00:47:26,319 --> 00:47:31,119
you know when
1173
00:47:28,800 --> 00:47:33,480
Jim, Simon Johnson and I for example
1174
00:47:31,119 --> 00:47:36,119
started doing our work you know I think
1175
00:47:33,480 --> 00:47:38,599
there was not much of the sort in
1176
00:47:36,119 --> 00:47:40,359
economics and people could have said no
1177
00:47:38,599 --> 00:47:42,440
this is not economics and some people
1178
00:47:40,359 --> 00:47:44,000
did and people could have said this is
1179
00:47:42,440 --> 00:47:45,520
crazy and some people did but there were
1180
00:47:44,000 --> 00:47:47,920
a lot of people who were open-minded
1181
00:47:45,520 --> 00:47:50,800
especially young researchers you know
1182
00:47:47,920 --> 00:47:54,119
they're hungry for new angles so I found
1183
00:47:50,800 --> 00:47:56,839
Academia to be quite open-minded as well
1184
00:47:54,119 --> 00:47:58,920
but a tough place Demis David
1185
00:47:56,839 --> 00:48:01,839
I think you've both said that research
1186
00:47:58,920 --> 00:48:05,079
in proteins was kind of seen at the
1187
00:48:01,839 --> 00:48:08,400
time as being on the lunatic fringe of uh
1188
00:48:05,079 --> 00:48:10,319
science um I mean how did you cope
1189
00:48:08,400 --> 00:48:12,040
with that kind of perception that you
1190
00:48:10,319 --> 00:48:14,319
were doing something that was a bit out
1191
00:48:12,040 --> 00:48:15,920
there well I think when um when we
1192
00:48:14,319 --> 00:48:17,559
started trying to design proteins
1193
00:48:15,920 --> 00:48:19,720
everyone thought it was a crazy way to
1194
00:48:17,559 --> 00:48:21,200
try and solve hard problems the only
1195
00:48:19,720 --> 00:48:23,000
proteins we knew at the time were the
1196
00:48:21,200 --> 00:48:24,720
ones that came down through you know
1197
00:48:23,000 --> 00:48:26,599
through Evolution the ones in us and in
1198
00:48:24,720 --> 00:48:28,200
all living things so the idea that you
1199
00:48:26,599 --> 00:48:30,960
could make completely new ones and that
1200
00:48:28,200 --> 00:48:34,280
they could do new things
1201
00:48:30,960 --> 00:48:36,480
was really seen as um you know Lunatic
1202
00:48:34,280 --> 00:48:37,880
Fringe as you said but um I think the
1203
00:48:36,480 --> 00:48:40,760
way you deal with that is you work
1204
00:48:37,880 --> 00:48:42,359
on the problem and you make progress and
1205
00:48:40,760 --> 00:48:44,839
uh you know now it's gotten to the point
1206
00:48:42,359 --> 00:48:46,200
where um every other day there's another
1207
00:48:44,839 --> 00:48:47,920
company saying they're joining the
1208
00:48:46,200 --> 00:48:49,760
protein design Revolution and they're
1209
00:48:47,920 --> 00:48:51,599
going to be solving so you can go from
1210
00:48:49,760 --> 00:48:53,559
The Lunatic Fringe to the mainstream
1211
00:48:51,599 --> 00:48:55,839
faster than you might expect utterly
1212
00:48:53,559 --> 00:48:58,640
Vindicated weren't you I mean yeah it's
1213
00:48:55,839 --> 00:49:00,559
very similar right you know I think
1214
00:48:58,640 --> 00:49:01,960
if you're fascinated enough and
1215
00:49:00,559 --> 00:49:03,280
passionate enough about the area you're
1216
00:49:01,960 --> 00:49:05,720
going to you know I was going to do it
1217
00:49:03,280 --> 00:49:06,760
no matter what you know and and actually
1218
00:49:05,720 --> 00:49:08,520
um I can't think of anything more
1219
00:49:06,760 --> 00:49:10,720
interesting to work on than the
1220
00:49:08,520 --> 00:49:12,400
nature of intelligence and
1221
00:49:10,720 --> 00:49:14,640
computation you know the computational
1222
00:49:12,400 --> 00:49:17,640
principles underpinning that and when we
1223
00:49:14,640 --> 00:49:19,319
started DeepMind in 2010 um nobody
1224
00:49:17,640 --> 00:49:22,160
was working on AI pretty much there was
1225
00:49:19,319 --> 00:49:25,960
some Yes except for very few people very
1226
00:49:22,160 --> 00:49:28,440
few forward-thinking people in um in
1227
00:49:25,960 --> 00:49:30,480
Academia and uh and then now fast
1228
00:49:28,440 --> 00:49:31,920
forward 15 years which is not very much
1229
00:49:30,480 --> 00:49:33,359
time and obviously the whole world's
1230
00:49:31,920 --> 00:49:35,640
talking about it and certainly in
1231
00:49:33,359 --> 00:49:38,200
Industry no one was doing that in 2010
1232
00:49:35,640 --> 00:49:39,680
but we already foresaw um building
1233
00:49:38,200 --> 00:49:42,160
on you know the great work of people
1234
00:49:39,680 --> 00:49:43,839
like Professor Hinton that um uh this
1235
00:49:42,160 --> 00:49:45,319
would be one of the most consequential
1236
00:49:43,839 --> 00:49:46,960
transformative Technologies in the world
1237
00:49:45,319 --> 00:49:48,480
if it could be done and if you
1238
00:49:46,960 --> 00:49:50,760
see something like that then it's worth
1239
00:49:48,480 --> 00:49:52,319
doing you know in and of itself I
1240
00:49:50,760 --> 00:49:54,520
mean you Professor Hinton along with
1241
00:49:52,319 --> 00:49:58,200
your co-recipient of the physics Nobel
1242
00:49:54,520 --> 00:50:00,559
Prize Professor John Hopfield who is 91
1243
00:49:58,200 --> 00:50:03,920
a real Pioneer in this field of
1244
00:50:00,559 --> 00:50:06,000
technology also I mean does this what
1245
00:50:03,920 --> 00:50:08,799
you've just heard here resonate with you
1246
00:50:06,000 --> 00:50:10,599
that work that at one stage was seen as
1247
00:50:08,799 --> 00:50:12,880
being on The Lunatic Fringe and then
1248
00:50:10,599 --> 00:50:17,440
here you are years later
1249
00:50:12,880 --> 00:50:19,079
vindicated uh yes so um students
1250
00:50:17,440 --> 00:50:20,960
would apply to my department to work
1251
00:50:19,079 --> 00:50:22,720
with me and other professors in my
1252
00:50:20,960 --> 00:50:23,920
department would say oh if you work with
1253
00:50:22,720 --> 00:50:25,079
Hinton that's the end of your career
1254
00:50:23,920 --> 00:50:27,400
this stuff is
1255
00:50:25,079 --> 00:50:30,760
rubbish how did that make you feel I
1256
00:50:27,400 --> 00:50:33,760
mean did you still luckily at the
1257
00:50:30,760 --> 00:50:33,760
time I didn't know about
1258
00:50:35,079 --> 00:50:40,559
it time I think for another question
1259
00:50:37,400 --> 00:50:43,760
from our audience and Mano Dinakaran
1260
00:50:40,559 --> 00:50:45,079
from the Karolinska Institute wants to ask
1261
00:50:43,760 --> 00:50:48,079
this what's your question thank you
1262
00:50:45,079 --> 00:50:50,760
laureates um science is all about being
1263
00:50:48,079 --> 00:50:53,359
motivated when things don't go the way
1264
00:50:50,760 --> 00:50:55,880
we expect them to so what kept you all
1265
00:50:53,359 --> 00:50:57,760
motivated when things didn't go the way
1266
00:50:55,880 --> 00:51:00,200
you expected in times of hardship and
1267
00:50:57,760 --> 00:51:03,280
helped you adapt who'd like to pick it
1268
00:51:00,200 --> 00:51:05,400
up there that's the best bit that
1269
00:51:03,280 --> 00:51:07,359
when it doesn't when what you expected
1270
00:51:05,400 --> 00:51:08,520
to happen doesn't happen and then you
1271
00:51:07,359 --> 00:51:10,319
really learn that's when you really
1272
00:51:08,520 --> 00:51:12,960
learn something I mean so that's the
1273
00:51:10,319 --> 00:51:15,880
best bit it's really crushing for the
1274
00:51:12,960 --> 00:51:18,720
first couple of days and then
1275
00:51:15,880 --> 00:51:20,520
you're like oh now I
1276
00:51:18,720 --> 00:51:22,359
learned something that's like I didn't
1277
00:51:20,520 --> 00:51:24,359
understand that there is ascertainment
1278
00:51:22,359 --> 00:51:27,599
bias here cuz you have the people who've
1279
00:51:24,359 --> 00:51:30,000
gotten winning hands in the poker
1280
00:51:27,599 --> 00:51:33,160
game of life you
1281
00:51:30,000 --> 00:51:35,559
know right well gentlemen thanks to all
1282
00:51:33,160 --> 00:51:37,760
of you and renewed congratulations on
1283
00:51:35,559 --> 00:51:39,920
your Nobel prizes that's all from this
1284
00:51:37,760 --> 00:51:41,680
year's Nobel Minds from the Royal
1285
00:51:39,920 --> 00:51:43,720
Palace in Stockholm it's been a
1286
00:51:41,680 --> 00:51:46,040
privilege having this discussion with
1287
00:51:43,720 --> 00:51:47,839
you thank you to their Royal Highnesses
1288
00:51:46,040 --> 00:51:49,680
the Crown Princess Victoria and Prince
1289
00:51:47,839 --> 00:51:51,799
Daniel for being with us of course
1290
00:51:49,680 --> 00:51:54,040
everybody else in the audience and you
1291
00:51:51,799 --> 00:51:56,240
also at home for watching from me Zeinab
1292
00:51:54,040 --> 00:51:59,880
Badawi and the rest of the Nobel Minds
1293
00:51:56,240 --> 00:51:59,880
team goodbye
1294
00:52:01,020 --> 00:52:10,810
[Applause]
1295
00:52:05,450 --> 00:52:13,910
[Music]
1296
00:52:10,810 --> 00:52:19,470
[Applause]
1297
00:52:13,910 --> 00:52:22,540
[Music]
1298
00:52:19,470 --> 00:52:22,540
[Applause]