All language subtitles for The Thinking Game 2024 1080p AMZN WEB-DL DDP5 1 H264-GPRS

af Afrikaans
ak Akan
sq Albanian
am Amharic
ar Arabic
hy Armenian
az Azerbaijani
eu Basque
be Belarusian
bem Bemba
bn Bengali
bh Bihari
bs Bosnian
br Breton
bg Bulgarian
km Cambodian
ca Catalan
ceb Cebuano
chr Cherokee
ny Chichewa
zh-CN Chinese (Simplified)
zh-TW Chinese (Traditional)
co Corsican
hr Croatian
cs Czech
da Danish
nl Dutch
eo Esperanto
et Estonian
ee Ewe
fo Faroese
tl Filipino
fi Finnish
fr French
fy Frisian
gaa Ga
gl Galician
ka Georgian
de German
el Greek
gn Guarani
gu Gujarati
ht Haitian Creole
ha Hausa
haw Hawaiian
iw Hebrew
hi Hindi
hmn Hmong
hu Hungarian
is Icelandic
ig Igbo
id Indonesian
ia Interlingua
ga Irish
it Italian
ja Japanese
jw Javanese
kn Kannada
kk Kazakh
rw Kinyarwanda
rn Kirundi
kg Kongo
ko Korean
kri Krio (Sierra Leone)
ku Kurdish
ckb Kurdish (Soranî)
ky Kyrgyz
lo Laothian
la Latin
lv Latvian
ln Lingala
lt Lithuanian
loz Lozi
lg Luganda
ach Luo
lb Luxembourgish
mk Macedonian
mg Malagasy
ms Malay
ml Malayalam
mt Maltese
mi Maori
mr Marathi
mfe Mauritian Creole
mo Moldavian
mn Mongolian
sr-ME Montenegrin
my Myanmar (Burmese)
ne Nepali
pcm Nigerian Pidgin
nso Northern Sotho
no Norwegian
nn Norwegian (Nynorsk)
oc Occitan
or Oriya
om Oromo
ps Pashto
fa Persian
pl Polish
pt-BR Portuguese (Brazil)
pt Portuguese (Portugal)
pa Punjabi
qu Quechua
ro Romanian
rm Romansh
nyn Runyakitara
ru Russian
sm Samoan
gd Scots Gaelic
sr Serbian
sh Serbo-Croatian
st Sesotho
tn Setswana
crs Seychellois Creole
sn Shona
sd Sindhi
si Sinhalese
sk Slovak
sl Slovenian
so Somali
es Spanish
es-419 Spanish (Latin American)
su Sundanese
sw Swahili
sv Swedish
tg Tajik
ta Tamil
tt Tatar
te Telugu
th Thai
ti Tigrinya
to Tonga
lua Tshiluba
tum Tumbuka
tr Turkish
tk Turkmen
tw Twi
ug Uighur
uk Ukrainian
ur Urdu
uz Uzbek
vi Vietnamese
cy Welsh
wo Wolof
xh Xhosa
yi Yiddish
yo Yoruba
zu Zulu
These are the user-uploaded subtitles that are being translated:

Hi, Alpha.
Hello.
Can you help me write code?
I was trained to answer questions, but I'm able to learn.
That's very open-minded of you.
Thank you. I'm glad you're happy with me.

What's this guy doing?
That's a developer.
What do you think he's working on?
That's a tough question. He might be working on a new feature, a bug fix or something else.
It's quite possible.
Yes.

Do you see my backpack?
That's a badminton racket.
It's a squash racket, but that's pretty close.
That's a badminton racket.
No, but you're not the first person to make that mistake.

AI, the technology that has been advancing at breakneck speed.
Artificial intelligence is all the rage.
Some are now raising alarm about...
It is definitely concerning.
This is an AI arms race.
We don't know how this is all going to shake out, but it's clear something is happening.

I'm kind of restless. Trying to build AGI is the most exciting journey, in my opinion, that humans have ever embarked on. If you're really going to take that seriously, there isn't a lot of time. Life's very short.

My whole life goal is to solve artificial general intelligence, and on the way, use AI as the ultimate tool to solve all the world's most complex scientific problems. I think that's bigger than the Internet. I think that's bigger than mobile. I think it's more like the advent of electricity or fire.

World leaders and artificial intelligence experts are gathering for the first ever global AI safety summit, set to look at the risks of the fast-growing technology and also...

I think this is a hugely critical moment for all humanity.
It feels like we're on the cusp of some incredible things happening.
Let me take you through some of the reactions in today's papers.
AGI is pretty close, I think.
There's clearly huge interest in what it is capable of, where it's taking us.
This is the moment I've been living my whole life for.

I've always been fascinated by the mind. So I set my heart on studying neuroscience, because I wanted to get inspiration from the brain for AI.

I remember asking Demis, "What's the end game?" You know? So you're going to come here and you're going to study neuroscience, and you're going to maybe get a Ph.D. if you work hard. And he said, "You know, I want to be able to solve AI. I want to be able to solve intelligence."

The human brain is the only existence proof we have, perhaps in the entire universe, that general intelligence is possible at all.

And I thought someone in this building should be interested in general intelligence like I am. And then Shane's name popped up.

Our next speaker today is Shane Legg. He's from New Zealand, where he trained in math and classical ballet.

Are machines actually becoming more intelligent? Some people say yes, some people say no. It's not really clear. We know they're getting a lot faster at doing computations. But are we actually going forwards in terms of general intelligence?

We were both obsessed with AGI, artificial general intelligence.

So today I'm going to be talking about different approaches to building AGI. With my colleague Demis Hassabis, we're looking at ways to bring in ideas from theoretical neuroscience.

I felt like we were the keepers of a secret that no one else knew.

Shane and I knew no one in academia would be supportive of what we were doing. AI was almost an embarrassing word to use in academic circles, right?
If you said you were working on AI, then you clearly weren't a serious scientist. So I convinced Shane the right way to do it would be to start a company.

Okay, we're going to try to do artificial general intelligence. It may not even be possible. We're not quite sure how we're going to do it, but we have some ideas or, kind of, approaches. Huge amounts of money, huge amounts of risk, lots and lots of compute. And if we pull this off, it'll be the biggest thing ever, right? That is a very hard thing for a typical investor to put their money on. It's almost like buying a lottery ticket.

I'm going to be speaking about systems neuroscience and how it might be used to help us build AGI.

Finding initial funding for this was very hard. "We're going to solve all of intelligence." You can imagine some of the looks I got when we were pitching that around.

So I'm a V.C., and I look at about 700 to 1,000 projects a year. And I fund literally 1% of those, about eight projects a year. So that means 99% of the time, you're in "No" mode.

"Wait a minute. I'm telling you, this is the most important thing of all time. I'm giving you all this build-up about how... explaining how it connects with the brain, why the time's right now, and then you're asking me, 'But what's your... how are you going to make money? What's your product?'" It's, like, so prosaic a question. You know? "Have you not been listening to what I've been saying?"

We needed investors who aren't necessarily going to invest because they think it's the best investment decision. They're probably going to invest because they just think it's really cool.

He's the Silicon Valley version of the man behind the curtain in The Wizard of Oz. He had a lot to do with giving you PayPal, Facebook, YouTube and Yelp.
If everyone says "X," Peter Thiel suspects that the opposite of X is quite possibly true.

So Peter Thiel was our first big investor. But he insisted that we come to Silicon Valley, because that was the only place we could... there would be the talent, and we could build that kind of company. But I was pretty adamant we should be in London, because I think London's an amazing city. Plus, I knew there were really amazing people trained at Cambridge and Oxford and UCL.

In Silicon Valley, everybody's founding a company every year, and then if it doesn't work, you chuck it and you start something new. That is not conducive to a long-term research challenge. So we were totally an outlier for him.

Hi, everyone. Welcome to DeepMind. So, what is our mission? We summarize it as... DeepMind's mission is to build the world's first general learning machine. So we always stress that the words "general" and "learning" here are the key things.

Our mission was to build an AGI, an artificial general intelligence. And so that means that we need a system which is general. It doesn't learn to do one specific thing. That's a really key part of human intelligence. We can learn to do many, many things.

It's going to, of course, be a lot of hard work. But one of the things that keeps me up at night is to not waste this opportunity, to, you know, really make a difference here and have a big impact on the world.

The first people that came and joined DeepMind really believed in the dream. But this was, I think, one of the first times they found a place full of other dreamers.

You know, we collected this Manhattan Project, if you like, together to solve AI.

In the first two years, we were in total stealth mode. And so we couldn't say to anyone what we were doing or where we worked. It was all quite vague. It had no public presence at all.
You couldn't look at a website. The office was at a secret location.

When we would interview people in those early days, they would show up very nervously. I had at least one candidate who said, "I just messaged my wife to tell her exactly where I'm going, just in case this turns out to be some kind of horrible scam and I'm going to get kidnapped."

Well, my favorite new person who's an investor, who I've been working with for a year, is Elon Musk. So for those of you who don't know, this is what he looks like. And he hadn't really thought much about AI until we chatted. His mission is to die on Mars or something. But not on impact. So...

We made some big decisions about how we were going to approach building AI.

This is a reinforcement learning setup. This is the kind of setup that we think about when we say we're building, you know, an AI agent. It's basically the agent, which is the AI, and then there's the environment that it's interacting with.

We decided that games, as long as you're very disciplined about how you use them, are the perfect training ground for AI development. We wanted to try to create one algorithm that could be trained up to play several dozen different Atari games. So just like a human, you have to use the same brain to play all the games.

You can think of it as providing the agent with the cartridge. And you say, "Okay, imagine you're born into that world with that cartridge, and you just get to interact with the pixels and see the score. What can you do?"

So what you're going to do is take your Q function, Q-k...

Q-learning is one of the oldest methods for reinforcement learning. And what we did was combine reinforcement learning with deep learning in one system.
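To make that concrete, here is a minimal tabular Q-learning sketch in Python. It is illustrative only: the `env` interface (`reset`, `step`, `actions`) and the hyperparameters are assumptions modeled on common RL toolkits, and DQN's key move was to replace the lookup table below with a deep neural network reading raw pixels.

```python
import random
from collections import defaultdict

def q_learning(env, episodes=500, alpha=0.1, gamma=0.99, epsilon=0.1):
    """Tabular Q-learning: learn Q[(state, action)], the expected future score."""
    Q = defaultdict(float)
    for _ in range(episodes):
        state = env.reset()
        done = False
        while not done:
            # Epsilon-greedy: mostly exploit the best known action, sometimes explore.
            if random.random() < epsilon:
                action = random.choice(env.actions)
            else:
                action = max(env.actions, key=lambda a: Q[(state, a)])
            next_state, reward, done = env.step(action)
            # Core update: nudge the estimate toward the observed reward plus
            # the discounted value of the best action available afterwards.
            best_next = max(Q[(next_state, a)] for a in env.actions)
            Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
            state = next_state
    return Q
```

Note that the agent is told nothing about the game beyond the score it is maximizing, which matches the "cartridge" framing above.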
No one had ever combined those two things together at scale to do anything impressive, and we needed to prove out this thesis.

We tried doing Pong as the first game. It seemed like the simplest.

It hasn't been told anything about what it's controlling or what it's supposed to do. All it knows is that score is good, and it has to learn what its controls do and build everything from first principles.

It wasn't really working. I was just saying to Shane, "Maybe we're just wrong, and we can't even do Pong."

It was a bit nerve-racking, thinking how far we had to go if we were going to really build a generally intelligent system. And it felt like it was time to give up and move on.

And then suddenly... we got our first point. And then it was like, "Is this random?" "No, no, it's really getting a point now." It was really exciting that this thing that previously couldn't even figure out how to move a paddle had suddenly been able to totally get it right. Then it was getting a few points. And then it won its first game. And then three months later, no human could beat it.

You hadn't told it the rules, how to get the score, nothing. And you just tell it to maximize the score, and it goes away and does it. This is the first time anyone had done this end-to-end learning.

"Okay, so we have this working in quite a general way. Now let's try another game." So then we tried Breakout.

At the beginning, after 100 games, the agent is not very good. It's missing the ball most of the time, but it's starting to get the hang of the idea that the bat should go towards the ball. Now, after 300 games, it's about as good as any human can play this. We thought, "Well, that's pretty cool," but we left the system playing for another 200 games, and it did this amazing thing. It found the optimal strategy was to dig a tunnel around the side and put the ball around the back of the wall.
Finally, the agent is actually achieving what you thought it would achieve. That is a great feeling, right? Like, I mean, when we do research, that is the best we can hope for.

We started generalizing to 50 games, and we basically created a recipe. We could just take a game that we had never seen before, we would run the algorithm on that, and DQN could train itself from scratch, achieving human level or sometimes better than human level. We didn't build it to play any of them. We could just give it a bunch of games, and it would figure it out for itself. And there was something quite magical in that.

Suddenly you had something that would respond and learn whatever situation it was parachuted into. And that was like a huge, huge breakthrough. It was in many respects the first example of any kind of thing you could call a general intelligence.
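The "recipe" is notable precisely because nothing changes between games. A hedged sketch of that loop, with the agent constructor and the training and evaluation routines passed in as hypothetical stand-ins (none of this is DeepMind's actual API):

```python
def run_recipe(games, make_agent, train, evaluate):
    """One fixed recipe across many games, with no per-game tuning.

    make_agent() builds a fresh learner with the same architecture and
    hyperparameters every time; train() sees only pixels and score.
    """
    results = {}
    for game in games:
        agent = make_agent()               # same algorithm for every cartridge
        train(agent, game)                 # learns the game from scratch
        results[game] = evaluate(agent, game)
    return results
```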
Although we were a well-funded startup, what was holding us back was not enough compute power. I realized that this would accelerate our timescale to AGI massively.

I used to see Demis quite frequently. We'd have lunch, and he did say to me that he had two companies that were involved in buying DeepMind, and he didn't know which one to go with. The issue was, would any commercial company appreciate the real importance of the research? And give the research time to come to fruition, and not be breathing down their necks, saying, "We want some kind of commercial benefit from this."

Google has bought DeepMind for a reported £400 million, making the artificial intelligence firm its largest European acquisition so far. The company was founded by 37-year-old entrepreneur Demis Hassabis.

After the acquisition, I started mentoring and spending time with Demis, and just listening to him. And this is a person who fundamentally is a scientist, and a natural scientist. He wants science to solve every problem in the world, and he believes it can do so. That's not a normal person you find in a tech company.

We were able to not only join Google but run independently in London, build our culture, which was optimized for breakthroughs, and not deal with products: do pure research.

Our investors didn't want to sell, but we decided that this was the best thing for the mission. In many senses, we were underselling in terms of value before it had matured, and you could have sold it for a lot more money. And the reason is because there's no time to waste. There's so many things that have got to be cracked while the brain is still in gear. You know, I'm still alive. There's all these things that gotta be done. I mean, how many billions would you trade for another five years of life, you know, to do what you set out to do?

Okay, all of a sudden, we've got this massive-scale compute available to us. What can we do with that?

Go is the pinnacle of board games. It is the most complex game ever devised by man. There are more possible board configurations in the game of Go than there are atoms in the universe. Go is the holy grail of artificial intelligence.

For many years, people have looked at this game and they've thought, "Wow, this is just too hard." Everything we've ever tried in AI just falls over when you try the game of Go. And so that's why it feels like a real litmus test of progress.

We had just bought DeepMind. They were working on reinforcement learning, and they were the world's experts in games.
And so when they introduced the idea that they could beat the top-level Go players, in a game that was thought to be incomputable, I thought, "Well, that's pretty interesting."

Our ultimate next step is to play the legendary Lee Sedol in just over two weeks.

A match like no other is about to get underway in South Korea. Lee Sedol is getting ready to rumble.

Lee Sedol is probably one of the greatest players of the last decade. I describe him as the Roger Federer of Go.

He showed up, and all of a sudden we have a thousand Koreans who represent all of Korean society, the top Go players. And then we have Demis. And the great engineering team.

He's very famous for very creative fighting play. So this could be difficult for us.

I figured Lee Sedol is going to beat these guys, but they'll make a good showing. Good for a startup.

I went over to the technical group and they said, "Let me show you how our algorithm works."

If you step through the actual game, we can see, kind of, how AlphaGo thinks. The way we start off on training AlphaGo is by showing it 100,000 games that strong amateurs have played. And we first initially get AlphaGo to mimic the human player, and then, through reinforcement learning, it plays against different versions of itself many millions of times and learns from its errors.
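A compact way to picture the two-stage recipe just described: first mimic human moves (supervised learning), then improve by playing frozen earlier copies of itself (reinforcement learning). The toy below uses a lookup-table policy and assumes a caller-supplied play_game(policy, opponent) returning a (position, move) trajectory and a win flag; AlphaGo's real policy was a deep network guided by tree search.

```python
from collections import defaultdict

def train_two_stage(human_games, num_self_play_games, play_game):
    # policy[position][move] -> preference score; play by taking the argmax.
    policy = defaultdict(lambda: defaultdict(float))

    # Stage 1: supervised learning. Nudge the policy toward the moves
    # strong human players chose in each position.
    for position, human_move in human_games:
        policy[position][human_move] += 1.0

    # Stage 2: self-play reinforcement learning. Play a frozen earlier
    # version of itself and reinforce the moves from games it won.
    for _ in range(num_self_play_games):
        frozen = {p: dict(moves) for p, moves in policy.items()}
        trajectory, won = play_game(policy, frozen)   # list of (position, move)
        for position, move in trajectory:
            policy[position][move] += 1.0 if won else -1.0
    return policy
```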
Hmm, this is interesting.

All right, folks, you're going to see history made.

So the game starts. He's really concentrating. If you really look at the...

That's a very surprising move. I think we're seeing an original move here. Yeah, that's an exciting move. I like...

Professional commentators almost unanimously said that not a single human player would have chosen move 37. So I actually had a poke around in AlphaGo to see what AlphaGo thought. And AlphaGo actually agreed with that assessment. AlphaGo said there was a one-in-10,000 probability that move 37 would have been played by a human player.

The game of Go has been studied for thousands of years. And AlphaGo discovered something completely new.

He resigned. Lee Sedol has just resigned. He's beaten.

The battle between man versus machine: a computer just came out the victor. Google put its DeepMind team to the test against one of the brightest minds in the world, and won.

That's when we realized the DeepMind people knew what they were doing, and to pay attention to reinforcement learning, as they had invented it.

Based on that experience, AlphaGo got better and better and better. And they had a little chart of how much better they were getting. And I said, "When does this stop?" And Demis said, "When we beat the Chinese guy, the top-rated player in the world."

Ke Jie versus AlphaGo.

And I think we will see AlphaGo pushing through there. AlphaGo is ahead quite a bit.

About halfway through the first game, the best player in the world was not doing so well.

What can black do here? Looks difficult.

And at a critical moment... the Chinese government ordered the feed cut off.

It was at that moment we were telling the world that something new had arrived on earth.

In the 1950s, when Russia's Sputnik satellite was launched, it changed the course of history. "It is a challenge that America must meet to survive in the Space Age." This has been called the Sputnik moment. The Sputnik moment created a massive reaction in the US in terms of funding for science and engineering, and particularly space technology. For China, AlphaGo was the wake-up call, the Sputnik moment. It launched an AI space race.

We had this huge idea that worked, and now the whole world knows.
It's always easier to land on the moon if someone's already landed there.

It is going to matter who builds AI, and how it gets built. I always feel that pressure.

There's been a big chain of events that followed on from all of the excitement of AlphaGo. When we played against Lee Sedol, we actually had a system that had been trained on human data, on all of the millions of games that have been played by human experts. We eventually found a new algorithm, a much more elegant approach to the whole system, which actually stripped out all of the human knowledge and just started completely from scratch. And that became a project which we called AlphaZero. Zero, meaning having zero human knowledge in the loop.

Instead of learning from human data, it learned from its own games. So it actually became its own teacher.

AlphaZero is an experiment in how little knowledge we can put into these systems, and how quickly and how efficiently they can learn. But the other thing is, AlphaZero doesn't have any rules. It learns through experience.

The next stage was to make it more general, so that it could play any two-player game. Things like chess, and in fact any kind of two-player perfect-information game.

It's going really well. It's going really, really well.
-Oh, wow.
-It's going down, like, fast.

AlphaGo used to take a few months to train, but AlphaZero could start in the morning playing completely randomly, and then by tea be at superhuman level. And by dinner it will be the strongest chess entity there's ever been.

-Amazing, it's amazing.
-Yeah.

It's discovered its own attacking style, you know, to take on the current level of defense.

I mean, I never in my wildest dreams...

I agree. Actually, I was not expecting that either.

And it's fun for me. I mean, it's inspired me to get back into chess again, because it's cool to see that there's even more depth than we thought in chess.
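The "its own teacher" loop above, sketched in code as a hedged outline rather than DeepMind's implementation; `network`, `self_play`, and `train_on` are placeholders supplied by the caller:

```python
def zero_style_training(network, self_play, train_on,
                        iterations=100, games_per_iter=1000):
    """Learn with zero human data: train only on games the system generates itself."""
    experience = []
    for _ in range(iterations):
        # 1. Self-play: the current network plays itself. In the first iteration
        #    this is essentially random play; no human games appear anywhere.
        for _ in range(games_per_iter):
            states, outcome = self_play(network)
            experience.extend((state, outcome) for state in states)
        # 2. Learn: fit the network to predict the outcomes of its own games.
        #    The slightly stronger network then generates stronger games.
        network = train_on(network, experience)
    return network
```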
I actually got into AI through games. Initially, it was board games. I was thinking, "How is my brain doing this?" Like, what is it doing? I was very aware of that from a very young age. So I've always been thinking about thinking.

The British and American chess champions meet to begin a series of matches. Playing alongside them are the cream of Britain and America's youngest players. Demis Hassabis is representing Britain.

When Demis was four, he first showed an aptitude for chess. By the time he was six, he became London under-eight champion.

My parents were very interesting and unusual, actually. I'd probably describe them as quite bohemian. My father was a singer-songwriter when he was younger, and Bob Dylan was his hero.

Yeah, yeah.

What is it that you like about this game?
It's just a good thinking game.

At the time, I was the second-highest-rated chess player in the world for my age. I was on track to be a professional chess player; I thought that was what I was going to do. But no matter how much I loved the game, it was incredibly stressful. It definitely was not fun and games for me.

My parents used to, you know, get very upset when I lost a game, and angry if I forgot something. And because it was quite high stakes for them, you know, it cost a lot of money to go to these tournaments. And my parents didn't have much money.

My parents thought, you know, "If you're interested in being a chess professional, this is really important. It's like your exams."

I remember I was about 12 years old, and I was at this international chess tournament in Liechtenstein, up in the mountains. And we were in this huge church hall with, you know, hundreds of international chess players.
And I was playing the former Danish champion. He must have been in his 30s, probably. In those days, there was a long time limit; the games could literally last all day. We were into our tenth hour. And we were in this incredibly unusual ending. I think it should be a draw. But he kept on trying to win for hours.

Finally, he tried one last cheap trick. All I had to do was give away my queen; then it would be stalemate. But I was so tired, I thought it was inevitable I was going to be checkmated. And so I resigned.

He jumped up. Just started laughing. And he went, "Why have you resigned? It's a draw." And he immediately, with a flourish, sort of showed me the drawing move.

I felt so sick to my stomach. It made me think of the rest of that tournament. Like, are we wasting our minds? Is this the best use of all this brain power, everybody's, collectively, in that building? If you could somehow plug those 300 brains into a system, you might be able to solve cancer with that level of brain power.

This intuitive feeling came over me that although I love chess, this is not the right thing to spend my whole life on.

For Demis and myself, the plan was always to fill DeepMind with some of the most brilliant scientists in the world, so we had the human brains necessary to create an AGI system.

By definition, the "G" in AGI is about generality. What I imagine is being able to talk to an agent, the agent can talk back, and the agent is able to solve novel problems that it hasn't seen before. That's a really key part of human intelligence, and it's that cognitive breadth and flexibility that's incredible.

Humans are the only natural general intelligence we know of, and we obviously learn a lot from our environment. So we think that simulated environments are one of the ways to create an AGI.
The very early humans were having to solve logic problems. They were having to solve navigation, memory, and we evolved in that environment. If we can create a virtual recreation of that kind of environment, that's the perfect testing ground and training ground for everything we do at DeepMind.

What they were doing here was creating environments for childlike beings, the agents, to exist within and play. That just sounded like the most interesting thing in all the world.

A child learns by tearing things up and then throwing food around and getting a response from mommy or daddy. This seems like an important idea to incorporate in the way you train an agent.

The humanoid is supposed to stand up. As its center of gravity rises, it gets more points.

You have a reward, and the agent learns from the reward. Like, you do something well, you get a positive reward. You do something bad, you get a negative reward.

It looks like it's standing.
It's still a bit drunk.
It likes to walk backwards.
Yeah.

The whole algorithm is trying to optimize for receiving as much reward as possible, and it's found that walking backwards is good enough to get very good scores.

When we learn to navigate, when we learn to get around in our world, we don't start with maps. We just start with our own exploration: adventuring off across the park without our parents by our side, or finding our way home from school when we're young.

A few of us came up with this idea that if we had an environment where a simulated robot just had to run forward, we could put all sorts of obstacles in its way and see if it could manage to navigate different types of terrain. The idea would be like a parkour challenge.

It's not graceful, but it was never trained to hold a glass whilst it was running and not spill water.

You set this objective that says, "Just move forward, forward velocity, and you'll get a reward for that." And the learning algorithm figures out how to move this complex set of joints. That's the power of reward-based reinforcement learning.
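The reward signals described above are strikingly simple, and everything else (balance, gait) is left to the optimizer. A sketch with an assumed minimal body state (the field names are illustrative, not from the actual simulator); the last function shows one plausible way a "walks backwards for points" loophole arises:

```python
from dataclasses import dataclass

@dataclass
class BodyState:
    center_of_mass_height: float   # metres above the ground
    velocity: float                # signed: positive means "forward"

def stand_up_reward(s: BodyState) -> float:
    # "As its center of gravity rises, it gets more points."
    return s.center_of_mass_height

def run_forward_reward(s: BodyState) -> float:
    # "Just move forward, forward velocity, and you'll get a reward for that."
    return s.velocity

def sloppy_reward(s: BodyState) -> float:
    # Cautionary variant: rewarding raw speed in any direction means walking
    # backwards scores just as well. The agent optimizes the reward you
    # wrote, not the behavior you meant.
    return abs(s.velocity)
```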
Our goal is to try and build agents which, when we drop them in, know nothing, get to play around in whatever problem you give them, and eventually figure out how to solve it for themselves. Now we want something which can do that in as many different types of problems as possible.

A human needs diverse skills to interact with the world: how to deal with complex images, how to manipulate thousands of things at once, how to deal with missing information. We think all of these things together are represented by this game called StarCraft.

All it's being trained to do is: given this situation, this screen, what would a human do? We took inspiration from large language models, where you simply train a model to predict the next word, which is exactly the same as predicting the next StarCraft move.
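To make the "next word, next move" analogy concrete, here is a tiny self-contained toy: a bigram counting model over action sequences from hypothetical human replays. AlphaStar used a large neural sequence model over game observations; the training objective, predicting what a human would do next, is the same idea.

```python
from collections import Counter, defaultdict

def fit_next_action_model(replays):
    """replays: lists of actions taken by human players, in order."""
    counts = defaultdict(Counter)
    for actions in replays:
        for prev, nxt in zip(actions, actions[1:]):
            counts[prev][nxt] += 1   # same objective as next-word prediction
    return counts

def predict_next(counts, prev_action):
    """Answer "given this situation, what would a human do?" """
    options = counts.get(prev_action)
    return options.most_common(1)[0][0] if options else None

# Hypothetical replays, just to exercise the model:
model = fit_next_action_model([
    ["build_worker", "build_supply", "build_worker"],
    ["build_worker", "build_worker", "build_worker", "attack"],
])
print(predict_next(model, "build_worker"))  # -> 'build_worker'
```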
Unlike chess or Go, where players take turns to make moves, in StarCraft there's a continuous flow of decisions. On top of that, you can't even see what the opponent is doing. There is no longer a clear definition of what it means to play the best way; it depends on what your opponent does.

This is the way that we'll get to a much more fluid, more natural, faster, more reactive agent. This is a huge challenge, and let's see how far we can push.

Oh! Holy monkey!

I'm a pretty low-level amateur. I'm okay, but I'm a pretty low-level amateur. These agents have a long ways to go.

We couldn't beat someone of Tim's level. You know, that was a little bit alarming. At that point, it felt like it was going to be, like, a really big, long challenge, maybe a couple of years.

Dani is the best DeepMind StarCraft 2 player.

I've been playing the agent every day for a few weeks now. I could feel that the agent was getting better really fast.

Wow, we beat Dani. That, for me, was already like a huge achievement. The next step is we're going to book in a pro to play.

It feels a bit unfair. All you guys against me.

We're way ahead of what I thought we would do, given where we were two months ago. Just trying to digest it all, actually. But it's very, very cool. Now we're in a position where we can finally share the work that we've done with the public. This is a big step. We are really putting ourselves on the line here.

-Take it away. Cheers.
-Thank you.

We're going to be live from London. It's happening.

Welcome to London. We are going to have a live exhibition match: MaNa against AlphaStar. At this point now, AlphaStar is 10 and 0 against professional gamers. Any thoughts before we get into this game?

I just want to see a good game, yeah. I want to see a good game.

Absolutely, good game. We're all excited. All right. Let's see what MaNa can pull off.

AlphaStar is definitely dominating the pace of this game. Wow. AlphaStar is playing so smartly. This really looks like I'm watching a professional human gamer, from the AlphaStar point of view.

I hadn't really seen a pro play StarCraft up close, and the 800 clicks per minute. I don't understand how anyone can even click 800 times, let alone do 800 useful clicks.

Oh, another good hit. AlphaStar is just completely relentless.

We need to be careful, because many of us grew up as gamers and are gamers. And so to us, it's very natural to view games as what they are, which is pure vehicles for fun, and not to see that more militaristic side that the public might see if they looked at this.
798 00:34:32,896 --> 00:34:37,553 You can't look at gunpowder and only make a firecracker. 799 00:34:37,597 --> 00:34:41,209 All technologies inherently point into certain directions. 800 00:34:43,124 --> 00:34:44,691 I'm very worried about 801 00:34:44,734 --> 00:34:46,606 certain ways in which AI 802 00:34:46,649 --> 00:34:49,652 will be used for military purposes. 803 00:34:51,306 --> 00:34:55,049 And that makes it even clearer how important it is 804 00:34:55,093 --> 00:34:58,357 for our societies to be in control 805 00:34:58,400 --> 00:35:01,055 of these new technologies. 806 00:35:01,099 --> 00:35:05,190 The potential for abuse from AI will be significant. 807 00:35:05,233 --> 00:35:08,758 Wars that occur faster than humans can comprehend 808 00:35:08,802 --> 00:35:11,065 and more powerful surveillance. 809 00:35:12,458 --> 00:35:15,765 How do you keep power forever 810 00:35:15,809 --> 00:35:19,421 over something that's much more powerful than you? 811 00:35:43,053 --> 00:35:45,752 Technologies can be used to do terrible things. 812 00:35:47,493 --> 00:35:50,452 And technology can be used to do wonderful things 813 00:35:50,496 --> 00:35:52,150 and solve all kinds of problems. 814 00:35:53,586 --> 00:35:54,978 When DeepMind was acquired by Google... 815 00:35:55,022 --> 00:35:56,589 -Yeah. -...you got Google to promise 816 00:35:56,632 --> 00:35:58,025 that technology you developed won't be used by the military 817 00:35:58,068 --> 00:35:59,418 -for surveillance. -Right. 818 00:35:59,461 --> 00:36:00,593 -Yes. -Tell us about that. 819 00:36:00,636 --> 00:36:03,161 I think technology is neutral in itself, 820 00:36:03,204 --> 00:36:05,598 um, but how, you know, we as a society 821 00:36:05,641 --> 00:36:07,382 or humans and companies and other things, 822 00:36:07,426 --> 00:36:09,515 other entities and governments decide to use it 823 00:36:09,558 --> 00:36:12,561 is what determines whether things become good or bad. 824 00:36:12,605 --> 00:36:16,261 You know, I personally think having autonomous weaponry 825 00:36:16,304 --> 00:36:17,479 is just a very bad idea. 826 00:36:19,177 --> 00:36:21,266 AlphaStar is playing 827 00:36:21,309 --> 00:36:24,094 an extremely intelligent game right now. 828 00:36:24,138 --> 00:36:27,359 There is an element to what's being created 829 00:36:27,402 --> 00:36:28,882 at DeepMind in London 830 00:36:28,925 --> 00:36:34,148 that does seem like the Manhattan Project. 831 00:36:34,192 --> 00:36:37,586 There's a relationship between Robert Oppenheimer 832 00:36:37,630 --> 00:36:39,675 and Demis Hassabis 833 00:36:39,719 --> 00:36:44,202 in which they're unleashing a new force upon humanity. 834 00:36:44,245 --> 00:36:46,204 MaNa is fighting back, though. 835 00:36:46,247 --> 00:36:48,162 Oh, man! 836 00:36:48,206 --> 00:36:50,208 I think that Oppenheimer 837 00:36:50,251 --> 00:36:52,471 and some of the other leaders of that project got caught up 838 00:36:52,514 --> 00:36:54,908 in the excitement of building the technology 839 00:36:54,951 --> 00:36:56,170 and seeing if it was possible. 840 00:36:56,214 --> 00:36:58,520 Where is AlphaStar? 841 00:36:58,564 --> 00:36:59,782 Where is AlphaStar? 842 00:36:59,826 --> 00:37:01,958 I don't see AlphaStar's units anywhere. 843 00:37:02,002 --> 00:37:03,525 They did not think carefully enough, early enough, 844 00:37:03,569 --> 00:37:07,312 about the morals of what they were doing.
845 00:37:07,355 --> 00:37:08,965 What we should do as scientists 846 00:37:09,009 --> 00:37:11,011 with powerful new technologies 847 00:37:11,054 --> 00:37:13,883 is try and understand them in controlled conditions first. 848 00:37:14,928 --> 00:37:16,799 And that is that. 849 00:37:16,843 --> 00:37:19,411 MaNa has defeated AlphaStar. 850 00:37:29,551 --> 00:37:31,336 I mean, my honest feeling is that I think it is 851 00:37:31,379 --> 00:37:33,207 a fair representation of where we are. 852 00:37:33,251 --> 00:37:35,949 And I think that part feels... feels okay. 853 00:37:35,992 --> 00:37:37,429 -I'm very happy for you. -I'm happy. 854 00:37:37,472 --> 00:37:38,865 So well... well done. 855 00:37:38,908 --> 00:37:40,867 My view is that the approach to building technology 856 00:37:40,910 --> 00:37:43,348 which is embodied by "move fast and break things" 857 00:37:43,391 --> 00:37:46,220 is exactly what we should not be doing, 858 00:37:46,264 --> 00:37:47,961 because you can't afford to break things 859 00:37:48,004 --> 00:37:49,049 and then fix them afterwards. 860 00:37:49,092 --> 00:37:50,398 -Cheers. -Thank you so much. 861 00:37:50,442 --> 00:37:52,008 Yeah, get... get some rest. You did really well. 862 00:37:52,052 --> 00:37:53,923 -Cheers, yeah? -Thank you for having us. 863 00:38:04,238 --> 00:38:05,500 When I was eight, 864 00:38:05,544 --> 00:38:06,849 I bought my first computer 865 00:38:06,893 --> 00:38:09,548 with the winnings from a chess tournament. 866 00:38:09,591 --> 00:38:11,158 I sort of had this intuition 867 00:38:11,201 --> 00:38:13,726 that computers are this magical device 868 00:38:13,769 --> 00:38:15,902 that can extend the power of the mind. 869 00:38:15,945 --> 00:38:17,382 I had a couple of school friends, 870 00:38:17,425 --> 00:38:19,166 and we used to have a hacking club, 871 00:38:19,209 --> 00:38:21,908 writing code, making games. 872 00:38:26,260 --> 00:38:27,827 And then over the summer holidays, 873 00:38:27,870 --> 00:38:29,219 I'd spend the whole day 874 00:38:29,263 --> 00:38:31,526 flicking through games magazines. 875 00:38:31,570 --> 00:38:33,441 And one day I noticed there was a competition 876 00:38:33,485 --> 00:38:35,878 to write an original version of Space Invaders. 877 00:38:35,922 --> 00:38:39,621 And the winner won a job at Bullfrog. 878 00:38:39,665 --> 00:38:42,320 Bullfrog at the time was the best game development house 879 00:38:42,363 --> 00:38:43,756 in all of Europe. 880 00:38:43,799 --> 00:38:45,279 You know, I really wanted to work at this place 881 00:38:45,323 --> 00:38:48,587 and see how they build games. 882 00:38:48,630 --> 00:38:50,415 Bullfrog, based here in Guildford, 883 00:38:50,458 --> 00:38:52,286 began with a big idea. 884 00:38:52,330 --> 00:38:54,680 That idea turned into the game Populous, 885 00:38:54,723 --> 00:38:56,551 which became a global bestseller. 886 00:38:56,595 --> 00:38:59,859 In the '90s, there were no recruitment agencies. 887 00:38:59,902 --> 00:39:02,122 You couldn't go out and say, you know, 888 00:39:02,165 --> 00:39:04,951 "Come and work in the games industry." 889 00:39:04,994 --> 00:39:08,171 It was still not even considered an industry. 890 00:39:08,215 --> 00:39:11,218 So we came up with the idea to have a competition 891 00:39:11,261 --> 00:39:13,655 and we got a lot of applicants. 892 00:39:14,700 --> 00:39:17,616 And one of those was Demis's. 893 00:39:17,659 --> 00:39:20,706 I can still remember clearly 894 00:39:20,749 --> 00:39:23,970 the day that Demis came in.
895 00:39:24,013 --> 00:39:27,147 He walked in the door, he looked about 12. 896 00:39:28,670 --> 00:39:30,019 I thought, "Oh, my God, 897 00:39:30,063 --> 00:39:31,586 "what the hell are we going to do with this guy?" 898 00:39:31,630 --> 00:39:32,979 I applied to Cambridge. 899 00:39:33,022 --> 00:39:35,373 I got in but they said I was way too young. 900 00:39:35,416 --> 00:39:37,853 So... So I needed to take a year off 901 00:39:37,897 --> 00:39:39,899 so I'd be at least 17 before I got there. 902 00:39:39,942 --> 00:39:42,771 And that's when I decided to spend that entire gap year 903 00:39:42,815 --> 00:39:44,469 working at Bullfrog. 904 00:39:44,512 --> 00:39:46,166 They couldn't even legally employ me, 905 00:39:46,209 --> 00:39:48,298 so I ended up being paid in brown paper envelopes. 906 00:39:50,823 --> 00:39:54,304 I got a feeling of being really at the cutting edge 907 00:39:54,348 --> 00:39:58,047 and how much fun that was to invent things every day. 908 00:39:58,091 --> 00:40:00,572 And then you know, a few months later, 909 00:40:00,615 --> 00:40:03,662 maybe everyone... a million people will be playing it. 910 00:40:03,705 --> 00:40:06,665 In those days computer games had to evolve. 911 00:40:06,708 --> 00:40:08,536 There had to be new genres 912 00:40:08,580 --> 00:40:11,757 which were more than just shooting things. 913 00:40:11,800 --> 00:40:14,063 Wouldn't it be amazing to have a game 914 00:40:14,107 --> 00:40:18,807 where you design and build your own theme park? 915 00:40:22,594 --> 00:40:25,945 Demis and I started to talk about Theme Park. 916 00:40:25,988 --> 00:40:28,904 It allows the player to build a world 917 00:40:28,948 --> 00:40:31,864 and see the consequences of the choices 918 00:40:31,907 --> 00:40:34,127 they've made in that world. 919 00:40:34,170 --> 00:40:36,085 A human player set out the layout 920 00:40:36,129 --> 00:40:38,566 of the theme park and designed the roller coaster 921 00:40:38,610 --> 00:40:41,351 and set the prices in the chip shop. 922 00:40:41,395 --> 00:40:43,615 What I was working on was the behaviors of the people. 923 00:40:43,658 --> 00:40:45,138 They were autonomous 924 00:40:45,181 --> 00:40:47,314 and that was the AI in this case. 925 00:40:47,357 --> 00:40:48,881 So what I was trying to do was mimic 926 00:40:48,924 --> 00:40:51,013 interesting human behavior 927 00:40:51,057 --> 00:40:52,319 so that the simulation would be 928 00:40:52,362 --> 00:40:54,582 more interesting to interact with. 929 00:40:54,626 --> 00:40:56,541 Demis worked on ridiculous things, 930 00:40:56,584 --> 00:40:59,413 like you could place down these shops 931 00:40:59,457 --> 00:41:03,591 and if you put a shop too near a very dangerous ride, 932 00:41:03,635 --> 00:41:05,375 then people on the ride would throw up 933 00:41:05,419 --> 00:41:08,030 because they'd just eaten. 934 00:41:08,074 --> 00:41:09,641 And then that would make other people throw up 935 00:41:09,684 --> 00:41:12,121 when they saw the throwing-up on the floor, 936 00:41:12,165 --> 00:41:14,559 so you then had to have lots of sweepers 937 00:41:14,602 --> 00:41:17,823 to quickly sweep it up before the people saw it. 938 00:41:17,866 --> 00:41:19,520 That's the cool thing about it. 939 00:41:19,564 --> 00:41:22,784 You as the player tinker with it and then it reacts to you. 940 00:41:22,828 --> 00:41:25,874 All those nuanced simulation things he did 941 00:41:25,918 --> 00:41:28,094 and that was an invention 942 00:41:28,137 --> 00:41:31,227 which never really existed before.
943 00:41:31,271 --> 00:41:34,230 It was unbelievably successful. 944 00:41:34,274 --> 00:41:35,710 Theme Park actually turned out 945 00:41:35,754 --> 00:41:37,190 to be a top ten title 946 00:41:37,233 --> 00:41:39,932 and that was the first time we were starting to see 947 00:41:39,975 --> 00:41:43,022 how AI could make a difference. 948 00:41:46,155 --> 00:41:47,592 We were doing some Christmas shopping 949 00:41:47,635 --> 00:41:51,247 and were waiting for the taxi to take us home. 950 00:41:51,291 --> 00:41:54,947 I have this very clear memory of Demis talking about AI 951 00:41:54,990 --> 00:41:56,209 in a very different way, 952 00:41:56,252 --> 00:41:58,428 in a way that we didn't commonly talk about. 953 00:41:58,472 --> 00:42:02,345 This idea of AI being useful for things 954 00:42:02,389 --> 00:42:04,086 other than entertainment. 955 00:42:04,130 --> 00:42:07,437 So being useful for, um, helping the world 956 00:42:07,481 --> 00:42:10,310 and the potential of AI to change the world. 957 00:42:10,353 --> 00:42:13,226 I just said to Demis, "What is it you want to do?" 958 00:42:13,269 --> 00:42:14,532 And he said to me, 959 00:42:14,575 --> 00:42:16,795 "I want to be the person that solves AI." 960 00:42:22,670 --> 00:42:25,760 Peter offered me £1 million 961 00:42:25,804 --> 00:42:27,675 to not go to university. 962 00:42:30,199 --> 00:42:32,593 But I had a plan from the beginning. 963 00:42:32,637 --> 00:42:35,814 And my plan was always to go to Cambridge. 964 00:42:35,857 --> 00:42:36,902 I think a lot of my schoolfriends 965 00:42:36,945 --> 00:42:38,033 thought I was mad. 966 00:42:38,077 --> 00:42:39,252 Why would you not... 967 00:42:39,295 --> 00:42:40,688 I mean, £1 million, that's a lot of money. 968 00:42:40,732 --> 00:42:43,517 In the '90s, that is a lot of money, right? 969 00:42:43,561 --> 00:42:46,346 For a... For a poor 17-year-old kid. 970 00:42:46,389 --> 00:42:50,219 He's like this little seed that's going to burst through, 971 00:42:50,263 --> 00:42:53,658 and he's not going to be able to do that at Bullfrog. 972 00:42:56,443 --> 00:42:59,098 I had to drop him off at the train station 973 00:42:59,141 --> 00:43:02,580 and I can still see that picture 974 00:43:02,623 --> 00:43:07,019 of this little elven character disappearing down that tunnel. 975 00:43:07,062 --> 00:43:09,804 That was an incredibly sad moment. 976 00:43:13,242 --> 00:43:14,635 I had this romantic ideal 977 00:43:14,679 --> 00:43:16,942 of what Cambridge would be like, 978 00:43:16,985 --> 00:43:18,639 1,000 years of history, 979 00:43:18,683 --> 00:43:21,033 walking the same streets that Turing, 980 00:43:21,076 --> 00:43:23,601 Newton and Crick had walked. 981 00:43:23,644 --> 00:43:26,647 I wanted to explore the edge of the universe. 982 00:43:29,084 --> 00:43:30,346 When I got to Cambridge, 983 00:43:30,390 --> 00:43:32,653 I'd basically been working my whole life. 984 00:43:33,741 --> 00:43:35,090 Every single summer, 985 00:43:35,134 --> 00:43:37,136 I was either playing chess professionally, 986 00:43:37,179 --> 00:43:39,704 or I was working, doing an internship. 987 00:43:39,747 --> 00:43:43,708 So I was, like, "Right, I am gonna have fun now 988 00:43:43,751 --> 00:43:46,711 "and explore what it means to be a normal teenager." 989 00:43:50,236 --> 00:43:52,238 Come on! Go, boy, go! 990 00:43:52,281 --> 00:43:54,022 It was work hard and play hard. 991 00:43:55,850 --> 00:43:57,025 I first met Demis 992 00:43:57,069 --> 00:43:59,201 because we both attended Queens' College.
993 00:44:00,115 --> 00:44:01,203 Our group of friends, 994 00:44:01,247 --> 00:44:03,205 we'd often drink beer in the bar, 995 00:44:03,249 --> 00:44:04,946 play table football. 996 00:44:04,990 --> 00:44:07,340 In the bar, I used to play speed chess, 997 00:44:07,383 --> 00:44:09,255 pieces flying off the board, 998 00:44:09,298 --> 00:44:11,083 you know, the whole game in one minute. 999 00:44:11,126 --> 00:44:12,301 Demis sat down opposite me. 1000 00:44:12,345 --> 00:44:13,563 And I looked at him and I thought, 1001 00:44:13,607 --> 00:44:15,217 "I remember you from when we were kids." 1002 00:44:15,261 --> 00:44:17,176 I had actually been in the same chess tournament 1003 00:44:17,219 --> 00:44:18,786 as Dave in Ipswich, 1004 00:44:18,830 --> 00:44:20,440 where I used to go and try and raid his local chess club 1005 00:44:20,483 --> 00:44:22,703 to win a bit of prize money. 1006 00:44:22,747 --> 00:44:24,618 We were studying computer science. 1007 00:44:24,662 --> 00:44:26,794 Some people, at the age of 17, 1008 00:44:26,838 --> 00:44:28,404 would have come in and made sure to tell everybody 1009 00:44:28,448 --> 00:44:29,492 everything about themselves. 1010 00:44:29,536 --> 00:44:30,972 "Hey, I worked at Bullfrog 1011 00:44:31,016 --> 00:44:33,018 "and built the world's most successful video game." 1012 00:44:33,061 --> 00:44:34,715 But he wasn't like that at all. 1013 00:44:34,759 --> 00:44:36,412 At Cambridge, Demis and myself 1014 00:44:36,456 --> 00:44:38,414 both had an interest in computational neuroscience 1015 00:44:38,458 --> 00:44:40,242 and trying to understand how computers and brains 1016 00:44:40,286 --> 00:44:42,636 intertwined and linked together. 1017 00:44:42,680 --> 00:44:44,290 Both David and Demis 1018 00:44:44,333 --> 00:44:46,422 came to me for supervisions. 1019 00:44:46,466 --> 00:44:49,774 It happens just by coincidence that the year 1997, 1020 00:44:49,817 --> 00:44:51,645 their third and final year at Cambridge, 1021 00:44:51,689 --> 00:44:55,301 was also the year when the first chess grandmaster 1022 00:44:55,344 --> 00:44:56,781 was beaten by a computer program. 1023 00:44:58,304 --> 00:45:00,088 Round one today of a chess match 1024 00:45:00,132 --> 00:45:03,701 between the ranking world champion Garry Kasparov 1025 00:45:03,744 --> 00:45:06,007 and an opponent named Deep Blue 1026 00:45:06,051 --> 00:45:10,490 to see if the human brain can outwit a machine. 1027 00:45:10,533 --> 00:45:11,621 I remember the drama 1028 00:45:11,665 --> 00:45:13,798 of Kasparov losing the last match. 1029 00:45:13,841 --> 00:45:15,234 Whoa! 1030 00:45:15,277 --> 00:45:17,192 Kasparov has resigned! 1031 00:45:17,236 --> 00:45:19,586 When Deep Blue beat Garry Kasparov, 1032 00:45:19,629 --> 00:45:21,457 that was a real watershed event. 1033 00:45:21,501 --> 00:45:23,155 My main memory of it was 1034 00:45:23,198 --> 00:45:25,331 I wasn't that impressed with Deep Blue. 1035 00:45:25,374 --> 00:45:27,246 I was more impressed with Kasparov's mind. 1036 00:45:27,289 --> 00:45:29,509 That he could play chess to this level, 1037 00:45:29,552 --> 00:45:31,641 where he could compete on an equal footing 1038 00:45:31,685 --> 00:45:33,252 with the brute of a machine, 1039 00:45:33,295 --> 00:45:35,167 but of course, Kasparov can do 1040 00:45:35,210 --> 00:45:36,951 everything else humans can do, too. 1041 00:45:36,995 --> 00:45:38,257 It was a huge achievement.
1042 00:45:38,300 --> 00:45:39,388 But the truth of the matter was, 1043 00:45:39,432 --> 00:45:40,868 Deep Blue could only play chess. 1044 00:45:42,435 --> 00:45:44,524 What we would regard as intelligence 1045 00:45:44,567 --> 00:45:46,874 was missing from that system. 1046 00:45:46,918 --> 00:45:49,834 This idea of generality and also learning. 1047 00:45:53,751 --> 00:45:55,404 Cambridge was amazing, because of course, you know, 1048 00:45:55,448 --> 00:45:56,666 you're mixing with people 1049 00:45:56,710 --> 00:45:58,233 who are studying many different subjects. 1050 00:45:58,277 --> 00:46:01,410 There were scientists, philosophers, artists... 1051 00:46:01,454 --> 00:46:04,457 ...geologists, biologists, ecologists. 1052 00:46:04,500 --> 00:46:07,416 You know, everybody is talking about everything all the time. 1053 00:46:07,460 --> 00:46:10,768 I was obsessed with the protein folding problem. 1054 00:46:10,811 --> 00:46:13,248 Tim Stevens used to talk obsessively, 1055 00:46:13,292 --> 00:46:15,381 almost like religiously, about this problem, 1056 00:46:15,424 --> 00:46:17,165 the protein folding problem. 1057 00:46:17,209 --> 00:46:18,863 Proteins are, you know, 1058 00:46:18,906 --> 00:46:22,083 one of the most beautiful and elegant things about biology. 1059 00:46:22,127 --> 00:46:24,738 They are the machines of life. 1060 00:46:24,782 --> 00:46:27,001 They build everything, they control everything, 1061 00:46:27,045 --> 00:46:29,569 they're why biology works. 1062 00:46:29,612 --> 00:46:32,659 Proteins are made from strings of amino acids 1063 00:46:32,702 --> 00:46:37,055 that fold up to create a protein structure. 1064 00:46:37,098 --> 00:46:39,884 If we can predict the structure of proteins 1065 00:46:39,927 --> 00:46:43,104 from just their amino acid sequences, 1066 00:46:43,148 --> 00:46:46,107 then a new protein to cure cancer 1067 00:46:46,151 --> 00:46:49,415 or break down plastic to help the environment 1068 00:46:49,458 --> 00:46:50,808 is definitely something 1069 00:46:50,851 --> 00:46:52,940 that you could begin to think about. 1070 00:46:53,941 --> 00:46:55,029 I kind of thought, 1071 00:46:55,073 --> 00:46:58,293 "Well, is a human being clever enough 1072 00:46:58,337 --> 00:46:59,991 "to actually fold a protein?" 1073 00:47:00,034 --> 00:47:02,080 We can't work it out. 1074 00:47:02,123 --> 00:47:04,082 Since the 1960s, 1075 00:47:04,125 --> 00:47:05,953 we thought that in principle, 1076 00:47:05,997 --> 00:47:08,913 if I know what the amino acid sequence of a protein is, 1077 00:47:08,956 --> 00:47:11,437 I should be able to compute what the structure's like. 1078 00:47:11,480 --> 00:47:13,700 So, if you could just press a button, 1079 00:47:13,743 --> 00:47:16,311 and they'd all come popping out, that would be... 1080 00:47:16,355 --> 00:47:18,009 that would have some impact. 1081 00:47:20,272 --> 00:47:21,577 It stuck in my mind. 1082 00:47:21,621 --> 00:47:23,405 "Oh, this is a very interesting problem." 1083 00:47:23,449 --> 00:47:26,844 And it felt to me like it would be solvable. 1084 00:47:26,887 --> 00:47:29,934 But I thought it would need AI to do it. 1085 00:47:31,500 --> 00:47:34,025 If we could just solve protein folding, 1086 00:47:34,068 --> 00:47:35,635 it could change the world. 1087 00:47:50,868 --> 00:47:52,826 Ever since I was a student at Cambridge, 1088 00:47:54,001 --> 00:47:55,611 I've never stopped thinking about 1089 00:47:55,655 --> 00:47:57,135 the protein folding problem.
1090 00:47:59,789 --> 00:48:02,792 If you were to solve protein folding, 1091 00:48:02,836 --> 00:48:05,665 then the potential to help solve problems like 1092 00:48:05,708 --> 00:48:09,669 Alzheimer's, dementia and drug discovery is huge. 1093 00:48:09,712 --> 00:48:11,671 Solving disease is probably 1094 00:48:11,714 --> 00:48:13,325 the most major impact we could have. 1095 00:48:15,457 --> 00:48:16,676 Thousands of very smart people 1096 00:48:16,719 --> 00:48:18,765 have tried to solve protein folding. 1097 00:48:18,808 --> 00:48:20,985 I just think now is the right time 1098 00:48:21,028 --> 00:48:22,508 for AI to crack it. 1099 00:48:26,555 --> 00:48:28,383 We needed a reasonable way 1100 00:48:28,427 --> 00:48:29,515 to apply machine learning 1101 00:48:29,558 --> 00:48:30,516 to the protein folding problem. 1102 00:48:32,431 --> 00:48:35,173 We came across this Foldit game. 1103 00:48:35,216 --> 00:48:38,785 The goal is to move around this 3D model of a protein 1104 00:48:38,828 --> 00:48:41,701 and you get a score every time you move it. 1105 00:48:41,744 --> 00:48:43,050 The more accurate you make these structures, 1106 00:48:43,094 --> 00:48:45,531 the more useful they will be to biologists. 1107 00:48:46,227 --> 00:48:47,489 I spent a few days 1108 00:48:47,533 --> 00:48:48,838 just kind of seeing how well we could do. 1109 00:48:50,666 --> 00:48:52,407 We did reasonably well. 1110 00:48:52,451 --> 00:48:53,843 But even if you were 1111 00:48:53,887 --> 00:48:55,280 the world's best Foldit player, 1112 00:48:55,323 --> 00:48:57,499 you wouldn't solve protein folding. 1113 00:48:57,543 --> 00:48:59,501 That's why we had to move beyond the game. 1114 00:48:59,545 --> 00:49:00,720 Games are always just 1115 00:49:00,763 --> 00:49:03,723 the proving ground for our algorithms. 1116 00:49:03,766 --> 00:49:07,727 The ultimate goal was not just to crack Go and StarCraft. 1117 00:49:07,770 --> 00:49:10,034 It was to crack real-world challenges. 1118 00:49:16,127 --> 00:49:18,520 I remember hearing this rumor 1119 00:49:18,564 --> 00:49:21,175 that Demis was getting into proteins. 1120 00:49:21,219 --> 00:49:23,743 I talked to some people at DeepMind and I would ask, 1121 00:49:23,786 --> 00:49:25,049 "So are you doing protein folding?" 1122 00:49:25,092 --> 00:49:26,920 And they would artfully change the subject. 1123 00:49:26,964 --> 00:49:30,097 And when that happened twice, I pretty much figured it out. 1124 00:49:30,141 --> 00:49:32,839 So I thought I should submit a resume. 1125 00:49:32,882 --> 00:49:35,668 All right, everyone, welcome to DeepMind. 1126 00:49:35,711 --> 00:49:37,626 I know some of you, this may be your first week, 1127 00:49:37,670 --> 00:49:39,193 but I hope you all set... 1128 00:49:39,237 --> 00:49:40,890 The really appealing part for me about the job 1129 00:49:40,934 --> 00:49:42,675 was this, like, sense of connection 1130 00:49:42,718 --> 00:49:44,503 to the larger purpose. 1131 00:49:44,546 --> 00:49:45,591 If we can crack 1132 00:49:45,634 --> 00:49:48,028 some fundamental problems in science, 1133 00:49:48,072 --> 00:49:49,160 many other people 1134 00:49:49,203 --> 00:49:50,900 and other companies and labs and so on 1135 00:49:50,944 --> 00:49:52,467 could build on top of our work. 1136 00:49:52,511 --> 00:49:53,816 This is your chance now 1137 00:49:53,860 --> 00:49:55,818 to add your chapter to this story. 1138 00:49:55,862 --> 00:49:57,298 When I arrived, 1139 00:49:57,342 --> 00:49:59,605 I was definitely quite a bit nervous. 
1140 00:49:59,648 --> 00:50:00,736 I'm still trying to keep... 1141 00:50:00,780 --> 00:50:02,912 I haven't taken any biology courses. 1142 00:50:02,956 --> 00:50:05,393 We haven't spent years of our lives 1143 00:50:05,437 --> 00:50:07,917 looking at these structures and understanding them. 1144 00:50:07,961 --> 00:50:09,963 We are just going off the data 1145 00:50:10,007 --> 00:50:11,269 and our machine learning models. 1146 00:50:12,705 --> 00:50:13,836 In machine learning, 1147 00:50:13,880 --> 00:50:15,795 you train a network like flashcards. 1148 00:50:15,838 --> 00:50:18,798 Here's the question. Here's the answer. 1149 00:50:18,841 --> 00:50:20,756 Here's the question. Here's the answer. 1150 00:50:20,800 --> 00:50:22,323 But in protein folding, 1151 00:50:22,367 --> 00:50:25,457 we're not doing the kind of standard task at DeepMind 1152 00:50:25,500 --> 00:50:28,155 where you have unlimited data. 1153 00:50:28,199 --> 00:50:30,766 Your job is to get better at chess or Go 1154 00:50:30,810 --> 00:50:32,986 and you can play as many games of chess or Go 1155 00:50:33,030 --> 00:50:34,814 as your computers will allow. 1156 00:50:35,510 --> 00:50:36,772 With proteins, 1157 00:50:36,816 --> 00:50:39,732 we're sitting on a fixed set of data 1158 00:50:39,775 --> 00:50:41,995 that's been determined by a half century 1159 00:50:42,039 --> 00:50:46,478 of time-consuming experimental methods in laboratories. 1160 00:50:46,521 --> 00:50:49,698 These painstaking methods can take months or years 1161 00:50:49,742 --> 00:50:52,353 to determine a single protein structure, 1162 00:50:52,397 --> 00:50:55,791 and sometimes, a structure can never be determined. 1163 00:50:57,315 --> 00:51:00,274 That's why we're working with such small datasets 1164 00:51:00,318 --> 00:51:02,233 to train our algorithms. 1165 00:51:02,276 --> 00:51:04,365 When DeepMind started to explore 1166 00:51:04,409 --> 00:51:05,975 the folding problem, 1167 00:51:06,019 --> 00:51:07,977 they were talking to us about which datasets they were using 1168 00:51:08,021 --> 00:51:09,892 and what would be the possibilities 1169 00:51:09,936 --> 00:51:11,633 if they did solve this problem. 1170 00:51:12,373 --> 00:51:13,940 Many people have tried, 1171 00:51:13,983 --> 00:51:16,638 and yet no one on the planet has solved protein folding. 1172 00:51:16,682 --> 00:51:18,205 I did think to myself, 1173 00:51:18,249 --> 00:51:19,946 "Well, you know, good luck." 1174 00:51:19,989 --> 00:51:22,731 If we can solve the protein folding problem, 1175 00:51:22,775 --> 00:51:25,691 it would have an incredible kind of medical relevance. 1176 00:51:25,734 --> 00:51:27,736 This is the cycle of science. 1177 00:51:27,780 --> 00:51:30,043 You do a huge amount of exploration, 1178 00:51:30,087 --> 00:51:31,914 and then you go into exploitation mode, 1179 00:51:31,958 --> 00:51:33,568 and you focus and you see 1180 00:51:33,612 --> 00:51:35,353 how good are those ideas, really? 1181 00:51:35,396 --> 00:51:36,528 And there's nothing better 1182 00:51:36,571 --> 00:51:38,095 than external competition for that. 1183 00:51:39,835 --> 00:51:43,056 So we decided to enter the CASP competition. 1184 00:51:43,100 --> 00:51:47,234 With CASP, we started to try and speed up 1185 00:51:47,278 --> 00:51:49,802 the solution to the protein folding problem.
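The "flashcards" picture of machine learning described above, question in, answer out, repeat, can be made concrete with a toy supervised-learning loop. This is a minimal sketch, not AlphaFold's training code; the one-weight linear model, the data, and the learning rate are all invented for illustration.

```python
# A minimal sketch of "flashcards" training: show a question (input)
# and the answer (label), nudge the model's parameter toward the
# answer, repeat. Not AlphaFold's setup; purely illustrative.
pairs = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # question -> answer (y = 2x)

w = 0.0      # the model's single weight
lr = 0.05    # learning rate

for epoch in range(200):
    for question, answer in pairs:      # one flashcard at a time
        prediction = w * question
        error = prediction - answer
        w -= lr * error * question      # gradient step on squared error

print(round(w, 3))   # converges to ~2.0: the model has learned the answers
```

The constraint the scene describes is that, for protein structures, the deck of flashcards is small and expensive to grow, unlike self-play games where training data is effectively unlimited.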
1186 00:51:49,845 --> 00:51:52,065 CASP is when we say, 1187 00:51:52,109 --> 00:51:54,372 "Look, DeepMind is doing protein folding, 1188 00:51:54,415 --> 00:51:55,590 "this is how good we are, 1189 00:51:55,634 --> 00:51:57,592 "and maybe it's better than everybody else. 1190 00:51:57,636 --> 00:51:58,680 "Maybe it isn't." 1191 00:51:58,724 --> 00:52:00,073 CASP is a bit like 1192 00:52:00,117 --> 00:52:02,075 the Olympic Games of protein folding. 1193 00:52:03,642 --> 00:52:06,035 CASP is a community-wide assessment 1194 00:52:06,079 --> 00:52:08,125 that's held every two years. 1195 00:52:09,561 --> 00:52:10,866 Teams are given 1196 00:52:10,910 --> 00:52:14,261 the amino acid sequences of about 100 proteins, 1197 00:52:14,305 --> 00:52:17,525 and then they try to solve this folding problem 1198 00:52:17,569 --> 00:52:20,833 using computational methods. 1199 00:52:20,876 --> 00:52:23,401 These proteins have already been determined 1200 00:52:23,444 --> 00:52:25,968 by experiments in a laboratory, 1201 00:52:26,012 --> 00:52:29,276 but have not yet been revealed publicly. 1202 00:52:29,320 --> 00:52:30,756 And these known structures 1203 00:52:30,799 --> 00:52:33,280 represent the gold standard against which 1204 00:52:33,324 --> 00:52:36,936 all the computational predictions will be compared. 1205 00:52:37,937 --> 00:52:39,330 We've got a score 1206 00:52:39,373 --> 00:52:42,159 that measures the accuracy of the predictions. 1207 00:52:42,202 --> 00:52:44,726 And you would expect a score of over 90 1208 00:52:44,770 --> 00:52:47,425 to be a solution to the protein folding problem. 1209 00:52:48,556 --> 00:52:50,036 Welcome, everyone, 1210 00:52:50,079 --> 00:52:52,038 to our first, uh, semifinals in the winners' bracket. 1211 00:52:52,081 --> 00:52:54,954 Nick and John versus Demis and Frank. 1212 00:52:54,997 --> 00:52:57,217 Please join us, come around. This will be an intense match. 1213 00:52:57,261 --> 00:52:59,306 When I learned that Demis was 1214 00:52:59,350 --> 00:53:02,048 going to tackle the protein folding issue, 1215 00:53:02,091 --> 00:53:04,790 um, I wasn't at all surprised. 1216 00:53:04,833 --> 00:53:06,792 It's very typical of Demis. 1217 00:53:06,835 --> 00:53:08,924 You know, he loves competition. 1218 00:53:08,968 --> 00:53:10,143 And that's the end 1219 00:53:10,187 --> 00:53:12,972 of the first game, 10-7. 1220 00:53:13,015 --> 00:53:14,103 The aim for CASP would be 1221 00:53:14,147 --> 00:53:15,975 to not just win the competition, 1222 00:53:16,018 --> 00:53:19,892 but sort of, um, retire the need for it. 1223 00:53:19,935 --> 00:53:23,417 So, 20 targets total have been released by CASP. 1224 00:53:23,461 --> 00:53:24,810 We were thinking maybe 1225 00:53:24,853 --> 00:53:26,899 throw in the standard kind of machine learning 1226 00:53:26,942 --> 00:53:28,814 and see how far that could take us. 1227 00:53:28,857 --> 00:53:30,729 Instead of having a couple of days on an experiment, 1228 00:53:30,772 --> 00:53:33,558 we can turn around five experiments a day. 1229 00:53:33,601 --> 00:53:35,342 Great. Well done, everyone. 1230 00:53:36,778 --> 00:53:38,693 Can you show me the real one instead of ours? 1231 00:53:38,737 --> 00:53:39,781 The true answer is 1232 00:53:39,825 --> 00:53:42,044 supposed to look something like that. 1233 00:53:42,088 --> 00:53:45,047 It's a lot more cylindrical than I thought. 1234 00:53:45,091 --> 00:53:47,398 The results were not very good. 1235 00:53:47,441 --> 00:53:48,703 Okay. 
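The CASP score described above, where over 90 counts as a solution, is in the real assessment the GDT_TS metric. A simplified sketch, assuming the predicted and experimental structures are already superimposed (the real assessment optimizes the superposition): take the fraction of residues placed within 1, 2, 4, and 8 angstroms of their experimental positions, average the four fractions, and scale to 0-100. The coordinates below are made up for illustration.

```python
# A simplified, illustrative GDT_TS-style score; not CASP's actual
# implementation, and the superposition step is assumed already done.
import math

CUTOFFS = (1.0, 2.0, 4.0, 8.0)   # distance thresholds in angstroms

def gdt_ts(predicted, experimental):
    """Average, over the four cutoffs, of the fraction of residues
    placed within that distance of the experimental position,
    scaled to 0-100 (above ~90 is considered a solution)."""
    assert len(predicted) == len(experimental)
    fractions = []
    for cutoff in CUTOFFS:
        within = sum(math.dist(p, e) <= cutoff
                     for p, e in zip(predicted, experimental))
        fractions.append(within / len(predicted))
    return 100.0 * sum(fractions) / len(fractions)

# Toy "protein" of three residues: two placed well, one badly.
experimental = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (6.0, 0.0, 0.0)]
predicted    = [(0.2, 0.0, 0.0), (3.1, 0.0, 0.0), (11.0, 0.0, 0.0)]
print(gdt_ts(predicted, experimental))   # 75.0
```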
1236 00:53:48,747 --> 00:53:49,835 We throw all the obvious ideas at it 1237 00:53:49,878 --> 00:53:51,793 and the problem laughs at you. 1238 00:53:52,881 --> 00:53:54,361 This makes no sense. 1239 00:53:54,405 --> 00:53:56,015 We thought we could just throw 1240 00:53:56,058 --> 00:53:58,322 some of our best algorithms at the problem. 1241 00:53:59,366 --> 00:54:00,976 We were slightly naive. 1242 00:54:01,020 --> 00:54:02,282 We should be learning this, 1243 00:54:02,326 --> 00:54:04,284 you know, in the blink of an eye. 1244 00:54:05,372 --> 00:54:06,982 The thing I'm worried about is, 1245 00:54:07,026 --> 00:54:08,375 we take the field from 1246 00:54:08,419 --> 00:54:10,899 really bad answers to moderately bad answers. 1247 00:54:10,943 --> 00:54:13,946 I feel like we need some sort of new technology 1248 00:54:13,989 --> 00:54:15,164 for moving around these things. 1249 00:54:20,431 --> 00:54:22,215 With only a week left of CASP, 1250 00:54:22,259 --> 00:54:24,348 it's now a sprint to get it deployed. 1251 00:54:26,654 --> 00:54:28,090 You've done your best. 1252 00:54:28,134 --> 00:54:29,875 Then there's nothing more you can do 1253 00:54:29,918 --> 00:54:32,399 but wait for CASP to deliver the results. 1254 00:54:52,593 --> 00:54:53,725 This famous thing of Einstein, 1255 00:54:53,768 --> 00:54:55,030 the last couple of years of his life, 1256 00:54:55,074 --> 00:54:57,381 when he was here, he overlapped with Kurt Gödel 1257 00:54:57,424 --> 00:54:59,774 and he said one of the reasons he still comes in to work 1258 00:54:59,818 --> 00:55:01,646 is so that he gets to walk home 1259 00:55:01,689 --> 00:55:03,517 and discuss things with Gödel. 1260 00:55:03,561 --> 00:55:05,911 It's a pretty big compliment for Kurt Gödel, 1261 00:55:05,954 --> 00:55:07,478 it shows you how amazing he was. 1262 00:55:09,131 --> 00:55:10,568 The Institute for Advanced Study 1263 00:55:10,611 --> 00:55:12,918 was formed in 1933. 1264 00:55:12,961 --> 00:55:14,223 In the early years, 1265 00:55:14,267 --> 00:55:16,487 the intense scientific atmosphere attracted 1266 00:55:16,530 --> 00:55:19,359 some of the most brilliant mathematicians and physicists 1267 00:55:19,403 --> 00:55:22,536 ever concentrated in a single place and time. 1268 00:55:22,580 --> 00:55:24,582 The founding principle of this place 1269 00:55:24,625 --> 00:55:28,020 is the idea that unfettered intellectual pursuits, 1270 00:55:28,063 --> 00:55:30,152 even if you don't know what you're exploring, 1271 00:55:30,196 --> 00:55:32,154 will result in some cool things, 1272 00:55:32,198 --> 00:55:34,896 and sometimes that then ends up being useful, 1273 00:55:34,940 --> 00:55:36,376 which, of course, 1274 00:55:36,420 --> 00:55:37,986 is partially what I've been trying to do at DeepMind. 1275 00:55:38,030 --> 00:55:39,988 How many big breakthroughs do you think are required 1276 00:55:40,032 --> 00:55:41,686 to get all the way to AGI? 1277 00:55:41,729 --> 00:55:43,078 And, you know, I estimate maybe 1278 00:55:43,122 --> 00:55:44,341 there's about a dozen of those. 1279 00:55:44,384 --> 00:55:46,125 You know, I hope it's within my lifetime. 1280 00:55:46,168 --> 00:55:47,605 - Yes, okay. - But then, 1281 00:55:47,648 --> 00:55:49,171 all scientists hope that, right? 1282 00:55:49,215 --> 00:55:51,130 Demis has many accolades. 1283 00:55:51,173 --> 00:55:54,002 He was elected a Fellow of the Royal Society last year. 1284 00:55:54,046 --> 00:55:55,961 He is also a Fellow of the Royal Society of Arts.
1285 00:55:56,004 --> 00:55:57,528 A big hand for Demis Hassabis. 1286 00:56:04,230 --> 00:56:05,927 My dream has always been to try 1287 00:56:05,971 --> 00:56:08,103 and make AI-assisted science possible. 1288 00:56:08,147 --> 00:56:09,235 And what I think is 1289 00:56:09,278 --> 00:56:11,150 our most exciting project of the last year 1290 00:56:11,193 --> 00:56:13,152 is our work in protein folding. 1291 00:56:13,195 --> 00:56:15,328 Uh, and we call this system AlphaFold. 1292 00:56:15,372 --> 00:56:18,331 We entered it into CASP and our system, uh, 1293 00:56:18,375 --> 00:56:20,507 was the most accurate, uh, predicting structures 1294 00:56:20,551 --> 00:56:24,946 for 25 out of the 43 proteins in the hardest category. 1295 00:56:24,990 --> 00:56:26,208 So we're state of the art, 1296 00:56:26,252 --> 00:56:27,514 but I have to be clear, 1297 00:56:27,558 --> 00:56:28,559 we're still a long way from 1298 00:56:28,602 --> 00:56:30,474 solving the protein folding problem. 1299 00:56:30,517 --> 00:56:31,866 We're working hard on this, though, 1300 00:56:31,910 --> 00:56:33,738 and we're exploring many other techniques. 1301 00:56:49,188 --> 00:56:50,232 Let's get started. 1302 00:56:50,276 --> 00:56:53,148 So kind of a rapid debrief, 1303 00:56:53,192 --> 00:56:55,455 these are our final rankings for CASP. 1304 00:56:56,500 --> 00:56:57,544 We beat the second team 1305 00:56:57,588 --> 00:57:00,155 in this competition by nearly 50%, 1306 00:57:00,199 --> 00:57:01,592 but we've still got a long way to go 1307 00:57:01,635 --> 00:57:04,333 before we've solved the protein folding problem 1308 00:57:04,377 --> 00:57:07,032 in the sense that a biologist could use it. 1309 00:57:07,075 --> 00:57:08,990 It is an area of concern. 1310 00:57:11,602 --> 00:57:14,213 The quality of predictions varied 1311 00:57:14,256 --> 00:57:16,737 and they were no more useful than the previous methods. 1312 00:57:16,781 --> 00:57:19,914 AlphaFold didn't produce good enough data 1313 00:57:19,958 --> 00:57:22,526 for it to be useful in a practical way 1314 00:57:22,569 --> 00:57:24,005 to, say, somebody like me 1315 00:57:24,049 --> 00:57:28,227 investigating my own biological problems. 1316 00:57:28,270 --> 00:57:30,316 That was kind of a humbling moment 1317 00:57:30,359 --> 00:57:32,753 'cause we thought we'd worked very hard and succeeded. 1318 00:57:32,797 --> 00:57:34,886 And what we'd found is we were the best in the world 1319 00:57:34,929 --> 00:57:36,453 at a problem the world's not good at. 1320 00:57:37,671 --> 00:57:38,933 We knew we sucked. 1321 00:57:40,457 --> 00:57:42,328 It doesn't help if you have the tallest ladder 1322 00:57:42,371 --> 00:57:44,635 when you're going to the moon. 1323 00:57:44,678 --> 00:57:47,115 The opinion of quite a few people on the team was 1324 00:57:47,159 --> 00:57:51,468 that this was sort of a fool's errand in some ways. 1325 00:57:51,511 --> 00:57:54,079 And I might have been wrong with protein folding. 1326 00:57:54,122 --> 00:57:55,559 Maybe it's too hard still 1327 00:57:55,602 --> 00:57:58,431 for where we're at generally with AI. 1328 00:57:58,475 --> 00:58:01,173 If you want to do biological research, 1329 00:58:01,216 --> 00:58:03,044 you have to be prepared to fail 1330 00:58:03,088 --> 00:58:06,570 because biology is very complicated.
1331 00:58:06,613 --> 00:58:09,790 I've run a laboratory for nearly 50 years, 1332 00:58:09,834 --> 00:58:11,096 and half my time, 1333 00:58:11,139 --> 00:58:12,619 I'm just an amateur psychiatrist 1334 00:58:12,663 --> 00:58:18,103 to keep, um, my colleagues cheerful when nothing works. 1335 00:58:18,146 --> 00:58:22,542 And quite a lot of the time, and I mean 80, 90%, 1336 00:58:22,586 --> 00:58:24,413 it does not work. 1337 00:58:24,457 --> 00:58:26,720 If you are at the forefront of science, 1338 00:58:26,764 --> 00:58:30,115 I can tell you, you will fail a great deal. 1339 00:58:35,163 --> 00:58:37,165 I just felt disappointed. 1340 00:58:38,689 --> 00:58:41,605 The lesson I learned is that ambition is a good thing, 1341 00:58:41,648 --> 00:58:43,694 but you need to get the timing right. 1342 00:58:43,737 --> 00:58:46,784 There's no point being 50 years ahead of your time. 1343 00:58:46,827 --> 00:58:48,133 You will never survive 1344 00:58:48,176 --> 00:58:49,917 50 years of that kind of endeavor 1345 00:58:49,961 --> 00:58:51,963 before it yields something. 1346 00:58:52,006 --> 00:58:53,268 You'll literally die trying. 1347 00:59:08,936 --> 00:59:11,286 When we talk about AGI, 1348 00:59:11,330 --> 00:59:14,376 the holy grail of artificial intelligence, 1349 00:59:14,420 --> 00:59:15,508 it becomes really difficult 1350 00:59:15,552 --> 00:59:17,815 to know what we're even talking about. 1351 00:59:17,858 --> 00:59:19,643 Which bits are we gonna see today? 1352 00:59:19,686 --> 00:59:21,645 We're going to start in the garden. 1353 00:59:23,124 --> 00:59:25,649 This is the garden looking from the observation area. 1354 00:59:25,692 --> 00:59:27,433 Research scientists and engineers 1355 00:59:27,476 --> 00:59:30,871 can analyze and collaborate and evaluate 1356 00:59:30,915 --> 00:59:33,004 what's going on in real time. 1357 00:59:33,047 --> 00:59:34,614 So in the 1800s, 1358 00:59:34,658 --> 00:59:37,008 we'd think of things like television and the submarine 1359 00:59:37,051 --> 00:59:38,139 or a rocket ship to the moon 1360 00:59:38,183 --> 00:59:40,228 and say these things are impossible. 1361 00:59:40,272 --> 00:59:41,490 Yet Jules Verne wrote about them and, 1362 00:59:41,534 --> 00:59:44,406 a century and a half later, they happened. 1363 00:59:44,450 --> 00:59:45,451 We'll be experimenting 1364 00:59:45,494 --> 00:59:47,888 on civilizations really, 1365 00:59:47,932 --> 00:59:50,587 civilizations of AI agents. 1366 00:59:50,630 --> 00:59:52,719 Once the experiments start going, 1367 00:59:52,763 --> 00:59:54,242 it's going to be the most exciting thing ever. 1368 00:59:54,286 --> 00:59:56,984 So how will we get sleep? 1369 00:59:57,028 --> 00:59:58,682 I won't be able to sleep. 1370 00:59:58,725 --> 01:00:00,684 Full AGI will be able to do 1371 01:00:00,727 --> 01:00:03,861 any cognitive task a person can do. 1372 01:00:03,904 --> 01:00:08,387 It will be at a scale, potentially, far beyond that. 1373 01:00:08,430 --> 01:00:10,302 It's really impossible for us 1374 01:00:10,345 --> 01:00:14,828 to imagine the outputs of a superintelligent entity. 1375 01:00:14,872 --> 01:00:18,963 It's like asking a gorilla to imagine, you know, 1376 01:00:19,006 --> 01:00:20,181 what Einstein does 1377 01:00:20,225 --> 01:00:23,402 when he produces the theory of relativity. 1378 01:00:23,445 --> 01:00:25,491 People often ask me these questions like, 1379 01:00:25,534 --> 01:00:29,495 "What happens if you're wrong, and AGI is quite far away?"
1380 01:00:29,538 --> 01:00:31,453 And I'm like, I never worry about that. 1381 01:00:31,497 --> 01:00:33,847 I actually worry about the reverse. 1382 01:00:33,891 --> 01:00:37,242 I actually worry that it's coming faster 1383 01:00:37,285 --> 01:00:39,723 than we can really prepare for. 1384 01:00:42,073 --> 01:00:45,859 It really feels like we're in a race to AGI. 1385 01:00:45,903 --> 01:00:49,907 The prototypes and the models that we are developing now 1386 01:00:49,950 --> 01:00:51,822 are actually transforming 1387 01:00:51,865 --> 01:00:54,215 the space of what we know about intelligence. 1388 01:00:57,349 --> 01:00:58,785 Recently, we've had agents 1389 01:00:58,829 --> 01:01:00,047 that are powerful enough 1390 01:01:00,091 --> 01:01:03,442 to actually start playing games in teams, 1391 01:01:03,485 --> 01:01:06,140 then competing against other teams. 1392 01:01:06,184 --> 01:01:08,795 We're seeing co-operative social dynamics 1393 01:01:08,839 --> 01:01:10,492 coming out of agents 1394 01:01:10,536 --> 01:01:13,321 where we haven't pre-programmed in 1395 01:01:13,365 --> 01:01:15,584 any of these sorts of dynamics. 1396 01:01:15,628 --> 01:01:19,240 It's completely learned from their own experiences. 1397 01:01:20,807 --> 01:01:23,288 When we started, we thought we were 1398 01:01:23,331 --> 01:01:25,725 out to build an intelligence system 1399 01:01:25,769 --> 01:01:28,336 and convince the world that we'd done it. 1400 01:01:28,380 --> 01:01:29,947 We're now starting to wonder whether 1401 01:01:29,990 --> 01:01:31,296 we're gonna build systems 1402 01:01:31,339 --> 01:01:32,906 that we're not convinced are fully intelligent, 1403 01:01:32,950 --> 01:01:34,691 and we're trying to convince the world that they're not. 1404 01:01:38,651 --> 01:01:40,000 Hi, Alpha. 1405 01:01:40,044 --> 01:01:41,523 Hello there. 1406 01:01:41,567 --> 01:01:43,917 Where are we today? 1407 01:01:43,961 --> 01:01:46,659 You're at the Museum of Modern Art in New York City. 1408 01:01:48,400 --> 01:01:53,013 Kind of. Um, what painting is this? 1409 01:01:53,057 --> 01:01:55,494 This is The Creation of Adam by Michelangelo. 1410 01:01:55,537 --> 01:01:58,410 I don't think that painting is in New York City. 1411 01:01:58,453 --> 01:02:01,543 You are right. It's in the Vatican City. 1412 01:02:01,587 --> 01:02:02,806 Do you think that's where we are? 1413 01:02:04,677 --> 01:02:05,809 No. 1414 01:02:05,852 --> 01:02:07,593 Then how am I looking at this? 1415 01:02:10,509 --> 01:02:11,640 You're looking at 1416 01:02:11,684 --> 01:02:12,946 a digital copy of the painting. 1417 01:02:12,990 --> 01:02:13,991 Yes. 1418 01:02:15,035 --> 01:02:16,341 Who lives in this house? 1419 01:02:19,213 --> 01:02:20,345 An astronaut. 1420 01:02:26,046 --> 01:02:28,005 What's this astronaut's name? 1421 01:02:28,048 --> 01:02:29,397 His name is Mark. 1422 01:02:29,441 --> 01:02:32,052 This is a famous equation by Albert Einstein. 1423 01:02:32,096 --> 01:02:35,577 It shows that energy and mass are equivalent. 1424 01:02:35,621 --> 01:02:37,666 How would you explain it to a five-year-old? 1425 01:02:40,931 --> 01:02:43,542 Well, imagine you have a big rock, 1426 01:02:43,585 --> 01:02:45,326 and you want to know how much it weighs. 1427 01:02:45,370 --> 01:02:47,589 Is Mark in danger? 1428 01:02:47,633 --> 01:02:49,722 No. Mark is not in danger. 1429 01:02:49,766 --> 01:02:50,767 Why is that? 1430 01:02:54,422 --> 01:02:56,163 Because the dinosaur is not real. 1431 01:02:57,338 --> 01:02:58,470 All right.
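The famous equation Alpha is shown in this scene is Einstein's mass-energy relation. One worked magnitude, standard physics rather than anything from the film, shows why "energy and mass are equivalent" is so striking:

```latex
% Einstein's mass-energy equivalence, with a worked magnitude
% (standard physics, not from the film): even one kilogram of
% mass corresponds to an enormous amount of energy.
\[
  E = mc^{2}, \qquad c \approx 3 \times 10^{8}\ \mathrm{m/s}
\]
\[
  E_{1\,\mathrm{kg}} \approx (1\,\mathrm{kg}) \left(3 \times 10^{8}\ \mathrm{m/s}\right)^{2}
                     = 9 \times 10^{16}\ \mathrm{J}
\]
```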
1432 01:02:58,513 --> 01:03:00,515 Why could you not answer my question? 1433 01:03:00,559 --> 01:03:01,908 I think I couldn't answer your question 1434 01:03:01,952 --> 01:03:04,041 because I don't know enough about the subject. 1435 01:03:04,911 --> 01:03:06,130 Okay. 1436 01:03:06,173 --> 01:03:08,654 How could you learn more about the subject? 1437 01:03:08,697 --> 01:03:09,960 I can learn more about the subject 1438 01:03:10,003 --> 01:03:12,876 by reading books, articles, and websites. 1439 01:03:12,919 --> 01:03:14,268 You can teach me. 1440 01:03:14,312 --> 01:03:15,835 My conclusion is, 1441 01:03:15,879 --> 01:03:17,619 if you now think about what we're doing, 1442 01:03:17,663 --> 01:03:20,057 which is learning from all humans, 1443 01:03:20,100 --> 01:03:22,581 all their knowledge at once put on the Internet, 1444 01:03:22,624 --> 01:03:24,931 you would actually know a lot about the world. 1445 01:03:24,975 --> 01:03:27,716 Like a significant portion of everything humans can do. 1446 01:03:27,760 --> 01:03:29,631 And now, I think it's more like, 1447 01:03:29,675 --> 01:03:30,937 "Well, it might just work." 1448 01:03:30,981 --> 01:03:33,113 This is a big moment. 1449 01:03:33,157 --> 01:03:34,636 Who is this? 1450 01:03:34,680 --> 01:03:36,551 This is God reaching out to Adam. 1451 01:03:36,595 --> 01:03:37,988 The advent of AGI 1452 01:03:38,031 --> 01:03:40,860 will divide human history into two parts. 1453 01:03:40,904 --> 01:03:43,950 The part up to that point and the part after that point. 1454 01:03:43,994 --> 01:03:47,258 Why is he reaching out to touch Adam? 1455 01:03:47,301 --> 01:03:48,912 God is reaching out to touch Adam 1456 01:03:48,955 --> 01:03:50,217 to give him life. 1457 01:03:50,261 --> 01:03:52,219 It will give us a tool 1458 01:03:52,263 --> 01:03:57,529 that can completely reinvent our entire civilization. 1459 01:03:57,572 --> 01:03:59,400 What does this painting mean to you? 1460 01:04:01,750 --> 01:04:03,752 The painting means a lot to me. 1461 01:04:03,796 --> 01:04:04,797 Okay. Like what? 1462 01:04:10,324 --> 01:04:11,456 I think the painting is a reminder 1463 01:04:11,499 --> 01:04:12,674 that we are all connected to each other 1464 01:04:12,718 --> 01:04:13,937 and that we are all part of something 1465 01:04:13,980 --> 01:04:15,112 bigger than ourselves. 1466 01:04:16,461 --> 01:04:17,766 That's pretty nice. 1467 01:04:19,029 --> 01:04:21,379 When you cross that barrier of 1468 01:04:21,422 --> 01:04:23,947 "AGI might happen one day in the future" 1469 01:04:23,990 --> 01:04:26,645 to "No, actually, this could really happen in a time frame 1470 01:04:26,688 --> 01:04:28,690 "that is sort of, like, on my watch, you know," 1471 01:04:28,734 --> 01:04:30,475 something changes in your thinking. 1472 01:04:30,518 --> 01:04:32,694 ...learned to orient itself by looking... 1473 01:04:32,738 --> 01:04:35,045 We have to be careful with how we use it 1474 01:04:35,088 --> 01:04:37,177 and thoughtful about how we deploy it. 1475 01:04:39,832 --> 01:04:41,138 You'd have to consider 1476 01:04:41,181 --> 01:04:42,487 what its top-level goal is. 1477 01:04:42,530 --> 01:04:45,011 If it's to keep humans happy, 1478 01:04:45,055 --> 01:04:48,928 which set of humans? What does happiness mean? 1479 01:04:48,972 --> 01:04:52,018 A lot of our collective goals are very tricky, 1480 01:04:52,062 --> 01:04:54,891 even for humans to figure out. 1481 01:04:54,934 --> 01:04:58,503 Technology always embeds our values.
1482 01:04:58,546 --> 01:05:01,680 It's not just technical, it's ethical as well. 1483 01:05:01,723 --> 01:05:02,899 So we've got to be really cautious 1484 01:05:02,942 --> 01:05:04,291 about what we're building into it. 1485 01:05:04,335 --> 01:05:06,076 We're trying to find a single algorithm which... 1486 01:05:06,119 --> 01:05:07,816 The reality is that this is an algorithm 1487 01:05:07,860 --> 01:05:11,037 that has been created by people, by us. 1488 01:05:11,081 --> 01:05:13,213 You know, what does it mean to endow our agents 1489 01:05:13,257 --> 01:05:15,607 with the same kind of values that we hold dear? 1490 01:05:15,650 --> 01:05:17,652 What is the purpose of making these AI systems 1491 01:05:17,696 --> 01:05:19,045 appear so humanlike 1492 01:05:19,089 --> 01:05:20,742 so that they do capture hearts and minds 1493 01:05:20,786 --> 01:05:21,961 because they're kind of 1494 01:05:22,005 --> 01:05:24,703 exploiting a human vulnerability also? 1495 01:05:24,746 --> 01:05:26,531 The heart and mind of these systems 1496 01:05:26,574 --> 01:05:28,054 are very much human-generated data... 1497 01:05:28,098 --> 01:05:29,055 Mmm-hmm. 1498 01:05:29,099 --> 01:05:30,491 ...for all the good and the bad. 1499 01:05:30,535 --> 01:05:32,015 There is a parallel 1500 01:05:32,058 --> 01:05:34,017 between the Industrial Revolution, 1501 01:05:34,060 --> 01:05:36,758 which was an incredible moment of displacement 1502 01:05:36,802 --> 01:05:42,373 and the current technological change created by AI. 1503 01:05:42,416 --> 01:05:43,722 Pause AI! 1504 01:05:43,765 --> 01:05:45,724 We have to think about who's displaced 1505 01:05:45,767 --> 01:05:48,596 and how we're going to support them. 1506 01:05:48,640 --> 01:05:50,076 This technology is coming a lot sooner, 1507 01:05:50,120 --> 01:05:52,426 uh, than really the world knows or kind of 1508 01:05:52,470 --> 01:05:55,908 even we 18, 24 months ago thought. 1509 01:05:55,952 --> 01:05:57,257 So there's a tremendous opportunity, 1510 01:05:57,301 --> 01:05:58,389 tremendous excitement, 1511 01:05:58,432 --> 01:06:00,391 but also tremendous responsibility. 1512 01:06:00,434 --> 01:06:01,740 It's happening so fast. 1513 01:06:02,654 --> 01:06:04,003 How will we govern it? 1514 01:06:05,135 --> 01:06:06,223 How will we decide 1515 01:06:06,266 --> 01:06:08,181 what is okay and what is not okay? 1516 01:06:08,225 --> 01:06:10,923 AI-generated images are getting more sophisticated. 1517 01:06:10,967 --> 01:06:14,535 The use of AI for generating disinformation 1518 01:06:14,579 --> 01:06:17,016 and manipulating human psychology 1519 01:06:17,060 --> 01:06:20,237 is only going to get much, much worse. 1520 01:06:21,194 --> 01:06:22,587 AGI is coming, 1521 01:06:22,630 --> 01:06:24,632 whether we do it here at DeepMind or not. 1522 01:06:25,459 --> 01:06:26,765 It's gonna happen, 1523 01:06:26,808 --> 01:06:29,028 so we better create institutions to protect us. 1524 01:06:29,072 --> 01:06:30,595 It's gonna require global coordination. 1525 01:06:30,638 --> 01:06:32,727 And I worry that humanity is 1526 01:06:32,771 --> 01:06:35,382 increasingly getting worse at that rather than better. 1527 01:06:35,426 --> 01:06:37,123 We need a lot more people 1528 01:06:37,167 --> 01:06:40,039 really taking this seriously and thinking about this. 1529 01:06:40,083 --> 01:06:42,999 It's, yeah, it's serious. It worries me. 1530 01:06:44,043 --> 01:06:45,871 It worries me. Yeah. 
1531
01:06:45,914 --> 01:06:48,613
If you received an email saying

1532
01:06:48,656 --> 01:06:50,832
this superior alien civilization

1533
01:06:50,876 --> 01:06:52,791
is going to arrive on Earth,

1534
01:06:52,834 --> 01:06:54,575
there would be emergency meetings

1535
01:06:54,619 --> 01:06:56,273
of all the governments.

1536
01:06:56,316 --> 01:06:58,144
We would go into overdrive

1537
01:06:58,188 --> 01:07:00,103
trying to figure out how to prepare.

1538
01:07:01,669 --> 01:07:03,976
The arrival of AGI will be

1539
01:07:04,020 --> 01:07:06,935
the most important moment that we have ever faced.

1540
01:07:14,378 --> 01:07:17,555
My dream was that on the way to AGI,

1541
01:07:17,598 --> 01:07:20,688
we would create revolutionary technologies

1542
01:07:20,732 --> 01:07:23,082
that would be of use to humanity.

1543
01:07:23,126 --> 01:07:25,171
That's what I wanted with AlphaFold.

1544
01:07:26,694 --> 01:07:28,653
I think it's more important than ever

1545
01:07:28,696 --> 01:07:31,047
that we should solve the protein folding problem.

1546
01:07:32,004 --> 01:07:34,224
This is gonna be really hard,

1547
01:07:34,267 --> 01:07:36,791
but I won't give up until it's done.

1548
01:07:36,835 --> 01:07:37,879
You know, we need to double down

1549
01:07:37,923 --> 01:07:40,317
and go as fast as possible from here.

1550
01:07:40,360 --> 01:07:41,796
I think we've got no time to lose.

1551
01:07:41,840 --> 01:07:45,757
So we are going to make a protein folding strike team.

1552
01:07:45,800 --> 01:07:47,541
Team lead for the strike team will be John.

1553
01:07:47,585 --> 01:07:48,673
Yeah, we've seen Alpha...

1554
01:07:48,716 --> 01:07:50,283
You know, we're gonna try everything,

1555
01:07:50,327 --> 01:07:51,328
kitchen sink, the whole lot.

1556
01:07:52,198 --> 01:07:53,330
CASP14 is about

1557
01:07:53,373 --> 01:07:55,158
proving we can solve the whole problem.

1558
01:07:56,333 --> 01:07:57,725
And I felt that to do that,

1559
01:07:57,769 --> 01:08:00,337
we would need to incorporate some domain knowledge.

1560
01:08:01,903 --> 01:08:03,731
We had some fantastic engineers on it,

1561
01:08:03,775 --> 01:08:05,733
but they were not trained in biology.

1562
01:08:08,475 --> 01:08:10,260
As a computational biologist,

1563
01:08:10,303 --> 01:08:12,131
when I initially joined the AlphaFold team,

1564
01:08:12,175 --> 01:08:14,220
I didn't immediately feel confident about anything.

1565
01:08:14,264 --> 01:08:15,352
You know,

1566
01:08:15,395 --> 01:08:17,223
whether we were gonna be successful.

1567
01:08:17,267 --> 01:08:21,097
Biology is so ridiculously complicated.

1568
01:08:21,140 --> 01:08:25,101
It just felt like this very far-off mountain to climb.

1569
01:08:25,144 --> 01:08:26,754
I'm starting to play with the underlying temperatures

1570
01:08:26,798 --> 01:08:27,973
to see if we can get...

1571
01:08:28,016 --> 01:08:29,148
As one of the few people on the team

1572
01:08:29,192 --> 01:08:31,846
who's done work in biology before,

1573
01:08:31,890 --> 01:08:34,849
you feel this huge sense of responsibility.

1574
01:08:34,893 --> 01:08:36,112
"We're expecting you to do

1575
01:08:36,155 --> 01:08:37,678
"great things on this strike team."

1576
01:08:37,722 --> 01:08:38,897
That's terrifying.

1577
01:08:40,464 --> 01:08:42,727
But one of the reasons why I wanted to come here

1578
01:08:42,770 --> 01:08:45,556
was to do something that matters.

1579
01:08:45,599 --> 01:08:48,472
This is the number of missing things.

1580
01:08:48,515 --> 01:08:49,951
What about making use

1581
01:08:49,995 --> 01:08:52,563
of whatever understanding you have of physics?

1582
01:08:52,606 --> 01:08:54,391
Using that as a source of data?

1583
01:08:54,434 --> 01:08:55,479
But if it's systematic...

1584
01:08:55,522 --> 01:08:56,784
Then, that can't be right, though.

1585
01:08:56,828 --> 01:08:58,308
If it's systematically wrong in some weird way,

1586
01:08:58,351 --> 01:09:01,224
you might be learning that systematically wrong physics.

1587
01:09:01,267 --> 01:09:02,355
The team is already

1588
01:09:02,399 --> 01:09:04,749
trying to think of multiple ways that...

1589
01:09:04,792 --> 01:09:06,229
Biological relevance

1590
01:09:06,272 --> 01:09:07,795
is what we're going for.

1591
01:09:09,057 --> 01:09:11,364
So we rewrote the whole data pipeline

1592
01:09:11,408 --> 01:09:13,279
that AlphaFold uses to learn.

1593
01:09:13,323 --> 01:09:15,586
You can't force the creative phase.

1594
01:09:15,629 --> 01:09:18,241
You have to give it space for those flowers to bloom.

1595
01:09:19,242 --> 01:09:20,286
We won CASP.

1596
01:09:20,330 --> 01:09:22,070
Then it was back to the drawing board

1597
01:09:22,114 --> 01:09:24,116
and like, what are our new ideas?

1598
01:09:24,160 --> 01:09:26,945
Um, and then it's taken a little while, I would say,

1599
01:09:26,988 --> 01:09:28,686
for them to get back to where they were,

1600
01:09:28,729 --> 01:09:30,340
but with the new ideas.

1601
01:09:30,383 --> 01:09:31,515
And then now I think

1602
01:09:31,558 --> 01:09:33,952
we're seeing the benefits of the new ideas.

1603
01:09:33,995 --> 01:09:35,736
They can go further, right?

1604
01:09:35,780 --> 01:09:38,130
So, um, that's a really important moment.

1605
01:09:38,174 --> 01:09:40,959
I've seen that moment so many times now,

1606
01:09:41,002 --> 01:09:42,613
but I know what that means now.

1607
01:09:42,656 --> 01:09:44,484
And I know this is the time now to press.

1608
01:09:45,920 --> 01:09:48,009
Adding side-chains improves direct folding.

1609
01:09:48,053 --> 01:09:49,663
That drove a lot of the progress.

1610
01:09:49,707 --> 01:09:51,012
-We'll talk about that. -Great.

1611
01:09:51,056 --> 01:09:54,799
The last four months, we've made enormous gains.

1612
01:09:54,842 --> 01:09:56,453
During CASP13,

1613
01:09:56,496 --> 01:09:59,499
it would take us a day or two to fold one of the proteins,

1614
01:09:59,543 --> 01:10:01,762
and now we're folding, like,

1615
01:10:01,806 --> 01:10:03,938
hundreds of thousands a second.

1616
01:10:03,982 --> 01:10:05,636
Yeah, it's just insane.

1617
01:10:05,679 --> 01:10:06,985
Now, this is a model

1618
01:10:07,028 --> 01:10:09,901
that is orders of magnitude faster,

1619
01:10:09,944 --> 01:10:12,251
while at the same time being better.

1620
01:10:12,295 --> 01:10:13,644
We're getting a lot of structures

1621
01:10:13,687 --> 01:10:15,254
into the high-accuracy regime.

1622
01:10:15,298 --> 01:10:17,517
We're rapidly improving to a system

1623
01:10:17,561 --> 01:10:18,823
that is starting to really

1624
01:10:18,866 --> 01:10:20,477
get at the core and heart of the problem.

1625
01:10:20,520 --> 01:10:21,695
It's great work.

1626
01:10:21,739 --> 01:10:23,088
It looks like we're in good shape.

1627
01:10:23,131 --> 01:10:26,222
So we got, what, six, five weeks left? Six weeks?

1628
01:10:26,265 --> 01:10:29,616
So what's, uh... Is it... You got enough compute power?

1629
01:10:29,660 --> 01:10:31,531
I... We could use more.

1630
01:10:32,967 --> 01:10:34,360
I was nervous about CASP

1631
01:10:34,404 --> 01:10:36,580
but as the system is starting to come together,

1632
01:10:36,623 --> 01:10:37,972
I don't feel as nervous.

1633
01:10:38,016 --> 01:10:39,496
I feel like things have, sort of,

1634
01:10:39,539 --> 01:10:41,193
come into perspective recently,

1635
01:10:41,237 --> 01:10:44,240
and, you know, it's gonna be fine.

1636
01:10:47,330 --> 01:10:48,853
The Prime Minister has announced

1637
01:10:48,896 --> 01:10:51,290
the most drastic limits to our lives

1638
01:10:51,334 --> 01:10:53,858
the U.K. has ever seen in living memory.

1639
01:10:53,901 --> 01:10:55,033
I must give the British people

1640
01:10:55,076 --> 01:10:56,904
a very simple instruction.

1641
01:10:56,948 --> 01:10:59,037
You must stay at home.

1642
01:10:59,080 --> 01:11:02,519
It feels like we're in a science fiction novel.

1643
01:11:02,562 --> 01:11:04,869
You know, I'm delivering food to my parents,

1644
01:11:04,912 --> 01:11:08,220
making sure they stay isolated and safe.

1645
01:11:08,264 --> 01:11:10,570
I think it just highlights the incredible need

1646
01:11:10,614 --> 01:11:12,877
for AI-assisted science.

1647
01:11:17,098 --> 01:11:18,361
You always know that

1648
01:11:18,404 --> 01:11:21,015
something like this is a possibility.

1649
01:11:21,059 --> 01:11:23,888
But nobody ever really believes it's gonna happen

1650
01:11:23,931 --> 01:11:25,585
in their lifetime, though.

1651
01:11:26,978 --> 01:11:29,154
- Are you recording yet? - Yes.

1652
01:11:29,197 --> 01:11:31,025
-Okay, morning, all. -Hey.

1653
01:11:31,069 --> 01:11:32,679
Good. CASP has started.

1654
01:11:32,723 --> 01:11:36,074
It's nice I get to sit around in my pajama bottoms all day.

1655
01:11:36,117 --> 01:11:37,597
I never thought I'd live in a house

1656
01:11:37,641 --> 01:11:39,164
where so much was going on.

1657
01:11:39,207 --> 01:11:41,427
I would be trying to solve protein folding in one room,

1658
01:11:41,471 --> 01:11:42,559
and my husband would be trying

1659
01:11:42,602 --> 01:11:43,908
to make robots walk in the other.

1660
01:11:46,954 --> 01:11:49,392
One of the hardest proteins we've gotten in CASP thus far

1661
01:11:49,435 --> 01:11:51,219
is the SARS-CoV-2 protein

1662
01:11:51,263 --> 01:11:52,220
called ORF8.

1663
01:11:52,264 --> 01:11:54,919
ORF8 is a coronavirus protein.

1664
01:11:54,962 --> 01:11:56,964
It's one of the main proteins, um,

1665
01:11:57,008 --> 01:11:58,749
that dampens the immune system.

1666
01:11:58,792 --> 01:12:00,054
We tried really hard

1667
01:12:00,098 --> 01:12:01,752
to improve our prediction.

1668
01:12:01,795 --> 01:12:03,493
Like, really, really hard.

1669
01:12:03,536 --> 01:12:05,582
Probably the most time that we have ever spent

1670
01:12:05,625 --> 01:12:07,105
on a single target.

1671
01:12:07,148 --> 01:12:08,933
To the point where my husband is, like,

1672
01:12:08,976 --> 01:12:12,197
"It's midnight. You need to go to bed."

1673
01:12:12,240 --> 01:12:16,419
So I think we're at Day 102 since lockdown.

1674
01:12:16,462 --> 01:12:19,944
My daughter is keeping a journal.

1675
01:12:19,987 --> 01:12:22,120
Now you can go out as much as you want.

1676
01:12:25,036 --> 01:12:27,212
We have received the last target.

1677
01:12:27,255 --> 01:12:29,649
They've said they will be sending out no more targets

1678
01:12:29,693 --> 01:12:31,347
in our category of CASP.

1679
01:12:32,652 --> 01:12:33,653
So we're just making sure

1680
01:12:33,697 --> 01:12:35,481
we get the best possible answer.

1681
01:12:40,530 --> 01:12:43,315
As soon as we started to get the results,

1682
01:12:43,359 --> 01:12:48,233
I'd sit down and start looking at how close did anybody come

1683
01:12:48,276 --> 01:12:50,583
to getting the protein structures correct.

1684
01:13:00,245 --> 01:13:01,551
- Oh, hi there. - Hello.

1685
01:13:03,814 --> 01:13:07,078
It is an unbelievable thing, CASP has finally ended.

1686
01:13:07,121 --> 01:13:09,472
I think it's at least time to raise a glass.

1687
01:13:09,515 --> 01:13:11,212
Um, I don't know if everyone has a glass

1688
01:13:11,256 --> 01:13:12,823
of something that they can raise.

1689
01:13:12,866 --> 01:13:14,955
If not, raise, I don't know, your laptops.

1690
01:13:14,999 --> 01:13:17,088
Um...

1691
01:13:17,131 --> 01:13:18,611
I'll probably make a speech in a minute.

1692
01:13:18,655 --> 01:13:20,483
I feel like I should but I just have no idea what to say.

1693
01:13:21,005 --> 01:13:24,269
So... let's see.

1694
01:13:24,312 --> 01:13:27,054
I feel like a reading of email...

1695
01:13:27,098 --> 01:13:28,534
is the right thing to do.

1696
01:13:29,927 --> 01:13:31,232
When John said,

1697
01:13:31,276 --> 01:13:33,191
"I'm gonna read an email," at a team social,

1698
01:13:33,234 --> 01:13:35,498
I thought, "Wow, John, you know how to have fun."

1699
01:13:35,541 --> 01:13:38,370
We're gonna read an email now.

1700
01:13:38,414 --> 01:13:41,634
Uh, I got this about four o'clock today.

1701
01:13:42,722 --> 01:13:44,724
Um, it is from John Moult.

1702
01:13:45,725 --> 01:13:47,031
And I'll just read it.

1703
01:13:47,074 --> 01:13:49,381
It says, "As I expect you know,

1704
01:13:49,425 --> 01:13:53,603
"your group has performed amazingly well in CASP 14,

1705
01:13:53,646 --> 01:13:55,387
"both relative to other groups

1706
01:13:55,431 --> 01:13:57,911
"and in absolute model accuracy."

1707
01:13:59,826 --> 01:14:01,219
"Congratulations on this work.

1708
01:14:01,262 --> 01:14:03,047
"It is really outstanding."

1709
01:14:03,090 --> 01:14:05,266
The structures were so good,

1710
01:14:05,310 --> 01:14:07,443
it was... it was just amazing.

1711
01:14:09,140 --> 01:14:10,750
After half a century,

1712
01:14:10,794 --> 01:14:12,230
we finally have a solution

1713
01:14:12,273 --> 01:14:14,928
to the protein folding problem.

1714
01:14:14,972 --> 01:14:17,409
When I saw this email, I read it,

1715
01:14:17,453 --> 01:14:19,585
I go, "Oh, shit!"

1716
01:14:19,629 --> 01:14:21,587
And my wife goes, "Is everything okay?"

1717
01:14:21,631 --> 01:14:24,242
I call my parents, and just, like, "Hey, Mum.

1718
01:14:24,285 --> 01:14:26,244
"Um, got something to tell you.

1719
01:14:26,287 --> 01:14:27,550
"We've done this thing

1720
01:14:27,593 --> 01:14:29,813
"and it might be kind of a big deal."

1721
01:14:29,856 --> 01:14:31,641
When I learned of the CASP 14 results,

1722
01:14:32,642 --> 01:14:34,034
I was gobsmacked.

1723
01:14:34,078 --> 01:14:35,819
I was just excited.

1724
01:14:35,862 --> 01:14:38,909
This is a problem that I was beginning to think

1725
01:14:38,952 --> 01:14:42,086
would not get solved in my lifetime.

1726
01:14:42,129 --> 01:14:44,741
Now we have a tool that can be used

1727
01:14:44,784 --> 01:14:46,612
practically by scientists.

1728
01:14:46,656 --> 01:14:48,440
These people are asking us, you know,

1729
01:14:48,484 --> 01:14:50,224
"I've got this protein involved in malaria,"

1730
01:14:50,268 --> 01:14:52,139
or, you know, some infectious disease.

1731
01:14:52,183 --> 01:14:53,227
"We don't know the structure.

1732
01:14:53,271 --> 01:14:55,186
"Can we use AlphaFold to solve it?"

1733
01:14:55,229 --> 01:14:56,970
We can easily predict all known sequences

1734
01:14:57,014 --> 01:14:58,276
in a month.

1735
01:14:58,319 --> 01:14:59,973
All known sequences in a month?

1736
01:15:00,017 --> 01:15:01,279
-Yeah, easily. -Mmm-hmm?

1737
01:15:01,322 --> 01:15:02,585
A billion, two billion.

1738
01:15:02,628 --> 01:15:03,673
Um, and they're...

1739
01:15:03,716 --> 01:15:05,196
So why don't we just do that? Yeah.

1740
01:15:05,239 --> 01:15:07,111
-We should just do that a lot. -Well, I mean...

1741
01:15:07,154 --> 01:15:09,243
That's way better. Why don't we just do that?

1742
01:15:09,287 --> 01:15:11,115
So that's one of the options.

1743
01:15:11,158 --> 01:15:12,638
- Right. - There's this...

1744
01:15:12,682 --> 01:15:15,119
We should just... Right, that's a great idea.

1745
01:15:15,162 --> 01:15:17,513
We should just run every protein in existence.

1746
01:15:18,296 --> 01:15:19,471
And then release that.

1747
01:15:19,515 --> 01:15:20,994
Why didn't someone suggest this before?

1748
01:15:21,038 --> 01:15:22,126
Of course that's what we should do.

1749
01:15:22,169 --> 01:15:23,954
Why are we thinking about making a service

1750
01:15:23,997 --> 01:15:25,651
and then people submit their protein?

1751
01:15:25,695 --> 01:15:26,913
We just fold everything.

1752
01:15:26,957 --> 01:15:28,654
And then give it to everyone in the world.

1753
01:15:28,698 --> 01:15:31,483
Who knows how many discoveries will be made from that?

1754
01:15:31,527 --> 01:15:33,790
Demis called us up and said,

1755
01:15:33,833 --> 01:15:35,618
"We want to make this open.

1756
01:15:35,661 --> 01:15:37,837
"Not just make sure the code is open,

1757
01:15:37,881 --> 01:15:39,578
"but we're gonna make it really easy

1758
01:15:39,622 --> 01:15:42,668
"for everybody to get access to the predictions."

1759
01:15:45,062 --> 01:15:47,238
That is fantastic.

1760
01:15:47,281 --> 01:15:49,327
It's like drawing back the curtain

1761
01:15:49,370 --> 01:15:52,852
and seeing the whole world of protein structures.

1762
01:15:55,246 --> 01:15:56,987
They released the structures

1763
01:15:57,030 --> 01:15:59,772
of 200 million proteins.

1764
01:15:59,816 --> 01:16:01,818
These are gifts to humanity.

1765
01:16:07,650 --> 01:16:10,914
The moment AlphaFold is live to the world,

1766
01:16:10,957 --> 01:16:13,873
we will no longer be the most important people

1767
01:16:13,917 --> 01:16:15,222
in AlphaFold's story.

1768
01:16:15,266 --> 01:16:16,833
Can't quite believe it's all out.

1769
01:16:16,876 --> 01:16:18,356
Aw!

1770
01:16:18,399 --> 01:16:20,314
A hundred and sixty-four users.

1771
01:16:20,358 --> 01:16:22,578
Loads of activity in Japan.

1772
01:16:22,621 --> 01:16:24,928
We have 655 users currently.

1773
01:16:24,971 --> 01:16:26,930
We currently have 100,000 concurrent users.

1774
01:16:26,973 --> 01:16:28,192
Wow!

1775
01:16:31,108 --> 01:16:33,893
Today is just crazy.

1776
01:16:33,937 --> 01:16:36,504
What an absolutely unbelievable effort

1777
01:16:36,548 --> 01:16:37,723
from everyone.

1778
01:16:37,767 --> 01:16:38,550
We're gonna all remember these moments

1779
01:16:38,594 --> 01:16:40,030
for the rest of our lives.

1780
01:16:40,073 --> 01:16:41,727
I'm excited about AlphaFold.

1781
01:16:41,771 --> 01:16:45,601
For my research, it's already propelling lots of progress.

1782
01:16:45,644 --> 01:16:47,385
And this is just the beginning.

1783
01:16:47,428 --> 01:16:48,908
My guess is,

1784
01:16:48,952 --> 01:16:53,043
every single biological and chemistry achievement

1785
01:16:53,086 --> 01:16:55,698
will be related to AlphaFold in some way.

1786
01:17:13,367 --> 01:17:15,413
AlphaFold is an index moment.

1787
01:17:15,456 --> 01:17:18,068
It's a moment that people will not forget

1788
01:17:18,111 --> 01:17:20,244
because the world changed.

1789
01:17:39,655 --> 01:17:41,482
Everybody's realized now

1790
01:17:41,526 --> 01:17:43,746
what Shane and I have known for more than 20 years,

1791
01:17:43,789 --> 01:17:46,618
that AI is going to be the most important thing

1792
01:17:46,662 --> 01:17:48,446
humanity's ever gonna invent.

1793
01:17:48,489 --> 01:17:50,230
We will shortly be arriving

1794
01:17:50,274 --> 01:17:52,058
at our final destination.

1795
01:18:02,068 --> 01:18:04,767
The pace of innovation and capabilities

1796
01:18:04,810 --> 01:18:06,507
is accelerating,

1797
01:18:06,551 --> 01:18:09,293
like a boulder rolling down a hill that we've kicked off

1798
01:18:09,336 --> 01:18:12,644
and now it's continuing to gather speed.

1799
01:18:12,688 --> 01:18:15,299
We are at a crossroads in human history.

1800
01:18:15,342 --> 01:18:16,735
AI has the potential

1801
01:18:16,779 --> 01:18:19,172
to transform our lives in every aspect.

1802
01:18:19,216 --> 01:18:23,786
It's no less important than the discovery of electricity.

1803
01:18:23,829 --> 01:18:26,484
We should be looking at the scientific method

1804
01:18:26,527 --> 01:18:28,834
and trying to understand each step of the way

1805
01:18:28,878 --> 01:18:30,096
in a rigorous way.

1806
01:18:30,140 --> 01:18:32,664
This is a moment of profound opportunity.

1807
01:18:32,708 --> 01:18:34,753
Harnessing this technology

1808
01:18:34,797 --> 01:18:37,713
could eclipse anything we have ever known.

1809
01:18:42,195 --> 01:18:43,675
Hi, Alpha.

1810
01:18:44,676 --> 01:18:45,764
Hi.

1811
01:18:47,157 --> 01:18:48,419
What is this?

1812
01:18:50,682 --> 01:18:53,729
This is a chessboard.

1813
01:18:53,772 --> 01:18:56,514
If I was to play white, what move would you recommend?

1814
01:18:59,865 --> 01:19:00,953
I would recommend

1815
01:19:00,997 --> 01:19:02,781
that you move your pawn from E2 to E4.

1816
01:19:05,871 --> 01:19:08,787
And now if you were black, what would you play now?

1817
01:19:11,572 --> 01:19:13,618
I would play the Sicilian Defense.

1818
01:19:15,838 --> 01:19:16,882
That's a good choice.

1819
01:19:19,406 --> 01:19:21,452
Thanks.

1820
01:19:23,715 --> 01:19:25,891
So what do you see? What is this object?

1821
01:19:28,546 --> 01:19:30,504
This is a pencil sculpture.

1822
01:19:32,811 --> 01:19:35,031
What happens if I move one of the pencils?

1823
01:19:37,990 --> 01:19:39,470
If you move one of the pencils,

1824
01:19:39,513 --> 01:19:42,081
the sculpture will fall apart.

1825
01:19:42,125 --> 01:19:44,301
I'd better leave it alone, then.

1826
01:19:44,344 --> 01:19:45,868
That's probably a good idea.

1827
01:19:50,568 --> 01:19:52,744
AGI is on the horizon now.

1828
01:19:54,833 --> 01:19:56,661
Very clearly the next generation

1829
01:19:56,704 --> 01:19:58,141
is going to live in a future world

1830
01:19:58,184 --> 01:20:01,057
where things will be radically different because of AI.

1831
01:20:02,493 --> 01:20:05,496
And if you want to steward that responsibly,

1832
01:20:05,539 --> 01:20:09,239
every moment is vital.

1833
01:20:09,282 --> 01:20:12,677
This is the moment I've been living my whole life for.
1834
01:20:19,162 --> 01:20:21,120
It's just a good thinking game.
