English (SDH) subtitles for The.Thinking.Game.2024.1080p.WEBRip.x264.AAC5.1-[YTS.MX]

[RUMINATIVE MUSIC PLAYING]

[DEVICE CHIMES]

JULIETTE LOVE: Hi, Alpha.

ALPHA: Hello.

LOVE: Can you help me write code?

ALPHA: I was trained to answer questions, but I'm able to learn.

LOVE: That's very open-minded of you.

ALPHA: Thank you. I'm glad you're happy with me.

LOVE: What's this guy doing?

ALPHA: That's a developer.

LOVE: What do you think he's working on?

ALPHA: That's a tough question. He might be working on a new feature, a bug fix or something else.

LOVE: It's quite possible.

ALPHA: Yes.

LOVE: Do you see my backpack?

ALPHA: That's a badminton racket.

LOVE: It's a squash racket, but that's pretty close.

ALPHA: That's a badminton racket.

LOVE: No, but you're not the first person to make that mistake.

[UPBEAT MUSIC PLAYING]

NEWSREADER 1: AI, the technology that has been advancing at breakneck speed.

NEWSREADER 2: Artificial intelligence is all the rage.

NEWSREADER 3: Some are now raising alarm about...

NEWSREADER 4: It is definitely concerning.

NEWSREADER 5: This is an AI arms race.

NEWSREADER 6: We don't know how this is all going to shake out, but it's clear something is happening.

DEMIS HASSABIS: I'm kind of restless. Trying to build AGI is the most exciting journey, in my opinion, that humans have ever embarked on. If you're really going to take that seriously, there isn't a lot of time. Life's very short. My whole life goal is to solve artificial general intelligence, and on the way, use AI as the ultimate tool to solve all the world's most complex scientific problems. I think that's bigger than the Internet. I think that's bigger than mobile. I think it's more like the advent of electricity or fire.
ANNOUNCER: World leaders and artificial intelligence experts are gathering for the first ever global AI safety summit, set to look at the risks of the fast-growing technology and also...

HASSABIS: I think this is a hugely critical moment for all humanity. It feels like we're on the cusp of some incredible things happening.

NEWSREADER: Let me take you through some of the reactions in today's papers.

HASSABIS: AGI is pretty close, I think.

NEWSREADER: There's clearly huge interest in what it is capable of, where it's taking us.

HASSABIS: This is the moment I've been living my whole life for.

[MID-TEMPO ELECTRONIC MUSIC PLAYS]

HASSABIS: I've always been fascinated by the mind. So I set my heart on studying neuroscience, because I wanted to get inspiration from the brain for AI.

ELEANOR MAGUIRE: I remember asking Demis, "What's the end game?" You know? So you're going to come here and you're going to study neuroscience, and you're going to maybe get a Ph.D. if you work hard. And he said, "You know, I want to be able to solve AI. I want to be able to solve intelligence."

HASSABIS: The human brain is the only existence proof we have, perhaps in the entire universe, that general intelligence is possible at all. And I thought someone in this building should be interested in general intelligence like I am. And then Shane's name popped up.

HOST: Our next speaker today is Shane Legg. He's from New Zealand, where he trained in math and classical ballet.

SHANE LEGG: Are machines actually becoming more intelligent? Some people say yes, some people say no. It's not really clear. We know they're getting a lot faster at doing computations. But are we actually going forwards in terms of general intelligence?

HASSABIS: We were both obsessed with AGI, artificial general intelligence.

LEGG: So today I'm going to be talking about different approaches to building AGI.
With my colleague Demis Hassabis, we're looking at ways to bring in ideas from theoretical neuroscience.

HASSABIS: I felt like we were the keepers of a secret that no one else knew. Shane and I knew no one in academia would be supportive of what we were doing. AI was almost an embarrassing word to use in academic circles, right? If you said you were working on AI, then you clearly weren't a serious scientist. So I convinced Shane the right way to do it would be to start a company.

LEGG: Okay, we're going to try to do artificial general intelligence. It may not even be possible. We're not quite sure how we're going to do it, but we have some ideas or, kind of, approaches. Huge amounts of money, huge amounts of risk, lots and lots of compute. And if we pull this off, it'll be the biggest thing ever, right? That is a very hard thing for a typical investor to put their money on. It's almost like buying a lottery ticket.

I'm going to be speaking about systems neuroscience and how it might be used to help us build AGI.

HASSABIS: Finding initial funding for this was very hard. "We're going to solve all of intelligence." You can imagine some of the looks I got when we were pitching that around.

So I'm a V.C., and I look at about 700 to 1,000 projects a year. And I fund literally 1% of those, about eight projects a year. So that means 99% of the time, you're in "no" mode.

HASSABIS: "Wait a minute. I'm telling you, this is the most important thing of all time. I'm giving you all this build-up about how... explaining how it connects with the brain, why the time's right now. And then you're asking me, 'But what's your... how are you going to make money? What's your product?'" It's like, so prosaic a question. You know? "Have you not been listening to what I've been saying?"
LEGG: We needed investors who aren't necessarily going to invest because they think it's the best investment decision. They're probably going to invest because they just think it's really cool.

NEWSREADER: He's the Silicon Valley version of the man behind the curtain in The Wizard of Oz. He had a lot to do with giving you PayPal, Facebook, YouTube and Yelp.

LEGG: If everyone says "X," Peter Thiel suspects that the opposite of X is quite possibly true.

HASSABIS: So Peter Thiel was our first big investor. But he insisted that we come to Silicon Valley, because that was the only place we could... there would be the talent, and we could build that kind of company. But I was pretty adamant we should be in London, because I think London's an amazing city. Plus, I knew there were really amazing people trained at Cambridge and Oxford and UCL. In Silicon Valley, everybody's founding a company every year, and then if it doesn't work, you chuck it and you start something new. That is not conducive to a long-term research challenge. So we were totally an outlier for him.

HASSABIS: Hi, everyone. Welcome to DeepMind. So, what is our mission? We summarize it as... DeepMind's mission is to build the world's first general learning machine. So we always stress that the words "general" and "learning" here are the key things.

LEGG: Our mission was to build an AGI, an artificial general intelligence. And so that means that we need a system which is general. It doesn't learn to do one specific thing. That's a really key part of human intelligence. We can learn to do many, many things.

It's going to, of course, be a lot of hard work. But one of the things that keeps me up at night is to not waste this opportunity, to, you know, really make a difference here and have a big impact on the world.
LEGG: The first people that came and joined DeepMind really believed in the dream. But this was, I think, one of the first times they found a place full of other dreamers. You know, we collected this Manhattan Project, if you like, together to solve AI.

HELEN KING: In the first two years, we were in total stealth mode. And so we couldn't say to anyone what we were doing or where we worked. It was all quite vague.

BEN COPPIN: It had no public presence at all. You couldn't look at a website. The office was at a secret location. When we would interview people in those early days, they would show up very nervously. [LAUGHS] I had at least one candidate who said, "I just messaged my wife to tell her exactly where I'm going, just in case this turns out to be some kind of horrible scam and I'm going to get kidnapped."

HASSABIS: Well, my favorite new person who's an investor, who I've been working on for a year, is Elon Musk. So for those of you who don't know, this is what he looks like. And he hadn't really thought much about AI until we chatted. His mission is to die on Mars or something. But not on impact. [LAUGHTER] So...

HASSABIS: We made some big decisions about how we were going to approach building AI. This is a reinforcement learning setup. This is the kind of setup that we think about when we say we're building, you know, an AI agent. It's basically the agent, which is the AI, and then there's the environment that it's interacting with. We decided that games, as long as you're very disciplined about how you use them, are the perfect training ground for AI development.

LEGG: We wanted to try to create one algorithm that could be trained up to play several dozen different Atari games. So just like a human, you have to use the same brain to play all the games.
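The agent-environment loop described here is the standard reinforcement learning abstraction: the agent picks actions, the environment returns observations and a score, and learning happens only through that interface. The sketch below is purely illustrative, not DeepMind's code; the environment dynamics, reward, and random agent are hypothetical stand-ins that just show the shape of the loop.

    import random

    class Environment:
        """A toy stand-in for a game: it hands out states and a score signal."""
        def reset(self):
            self.state = 0
            return self.state

        def step(self, action):
            # Made-up dynamics: the action scores when it matches the state's parity.
            reward = 1 if action == self.state % 2 else 0
            self.state += 1
            done = self.state >= 10
            return self.state, reward, done

    class Agent:
        """Picks actions; a learning agent would update itself in observe()."""
        def act(self, state):
            return random.choice([0, 1])

        def observe(self, state, action, reward, next_state):
            pass  # a real agent's learning update would go here

    env, agent = Environment(), Agent()
    state, total, done = env.reset(), 0, False
    while not done:
        action = agent.act(state)
        next_state, reward, done = env.step(action)
        agent.observe(state, action, reward, next_state)
        state, total = next_state, total + reward
    print("episode return:", total)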
DAVID SILVER: You can think of it that you provide the agent with the cartridge. And you say, "Okay, imagine you're born into that world with that cartridge, and you just get to interact with the pixels and see the score. What can you do?"

So what you're going to do is take your Q function, Q-K...

HASSABIS: Q-learning is one of the oldest methods for reinforcement learning. And what we did was combine reinforcement learning with deep learning in one system. No one had ever combined those two things together at scale to do anything impressive, and we needed to prove out this thesis.

LEGG: We tried doing Pong as the first game. It seemed like the simplest. It hasn't been told anything about what it's controlling or what it's supposed to do. All it knows is that score is good, and it has to learn what its controls do and build everything from first principles.

[GAME BEEPING]

LEGG: It wasn't really working.

HASSABIS: I was just saying to Shane, "Maybe we're just wrong, and we can't even do Pong."

LEGG: It was a bit nerve-racking, thinking how far we had to go if we were going to really build a generally intelligent system.

HASSABIS: And it felt like it was time to give up and move on. And then suddenly...

[STIRRING MUSIC PLAYS]

HASSABIS: We got our first point. And then it was like, "Is this random?" "No, no, it's really getting a point now."

LEGG: It was really exciting that this thing that previously couldn't even figure out how to move a paddle had suddenly been able to totally get it right.

HASSABIS: Then it was getting a few points. And then it won its first game. And then three months later, no human could beat it. You hadn't told it the rules, how to get the score, nothing. And you just tell it to maximize the score, and it goes away and does it. This is the first time anyone had done this end-to-end learning.
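The combination Hassabis describes became DQN: Q-learning supplies the update rule, and a deep network (in place of the lookup table below) estimates the Q function from raw pixels. A minimal tabular sketch of just that update rule, with a hypothetical two-action game standing in for Pong:

    import random
    from collections import defaultdict

    # Core Q-learning target: Q(s,a) <- Q(s,a) + lr * (r + gamma * max_a' Q(s',a') - Q(s,a)).
    # DQN trains a deep network toward this same target (plus tricks such as
    # experience replay); this sketch keeps only the update itself.
    Q = defaultdict(float)
    actions = [0, 1]                 # e.g. paddle up / paddle down
    lr, gamma, epsilon = 0.1, 0.99, 0.1

    def choose(state):
        """Epsilon-greedy: mostly exploit current estimates, occasionally explore."""
        if random.random() < epsilon:
            return random.choice(actions)
        return max(actions, key=lambda a: Q[(state, a)])

    def update(state, action, reward, next_state):
        best_next = max(Q[(next_state, a)] for a in actions)
        Q[(state, action)] += lr * (reward + gamma * best_next - Q[(state, action)])

    # One hypothetical transition: in state 0, action 1 earned a point.
    update(state=0, action=1, reward=1.0, next_state=1)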
LEGG: "Okay, so we have this working in quite a general way. Now let's try another game."

HASSABIS: So then we tried Breakout. At the beginning, after 100 games, the agent is not very good. It's missing the ball most of the time, but it's starting to get the hang of the idea that the bat should go towards the ball. Now, after 300 games, it's about as good as any human can play this. We thought, "Well, that's pretty cool," but we left the system playing for another 200 games, and it did this amazing thing. It found the optimal strategy was to dig a tunnel around the side and put the ball around the back of the wall.

KORAY KAVUKCUOGLU: Finally, the agent is actually achieving what you thought it would achieve. That is a great feeling, right? Like, I mean, when we do research, that is the best we can hope for. We started generalizing to 50 games, and we basically created a recipe. We could just take a game that we have never seen before, we would run the algorithm on that, and DQN could train itself from scratch, achieving human level or sometimes better than human level.

LEGG: We didn't build it to play any of them. We could just give it a bunch of games, and it would figure it out for itself. And there was something quite magical in that.

MURRAY SHANAHAN: Suddenly you had something that would respond and learn whatever situation it was parachuted into. And that was like a huge, huge breakthrough. It was in many respects the first example of any kind of thing you could call a general intelligence.

HASSABIS: Although we were a well-funded startup, what was holding us back was not having enough compute power. I realized that this would accelerate our time scale to AGI massively.

I used to see Demis quite frequently. We'd have lunch, and he did... say to me that he had two companies that were involved in buying DeepMind.
And he didn't know which one to go with. The issue was, would any commercial company appreciate the real importance of the research, and give the research time to come to fruition, and not be breathing down their necks, saying, "We want some kind of commercial benefit from this"?

[MACHINERY HUMMING]

NEWSREADER: Google has bought DeepMind for a reported £400 million, making the artificial intelligence firm its largest European acquisition so far. The company was founded by 37-year-old entrepreneur Demis Hassabis.

After the acquisition, I started mentoring and spending time with Demis, and just listening to him. And this is a person who fundamentally is a scientist, a natural scientist. He wants science to solve every problem in the world, and he believes it can do so. That's not a normal person you find in a tech company.

HASSABIS: We were able to not only join Google but run independently in London, build our culture, which was optimized for breakthroughs and not deal with products, do pure research. Our investors didn't want to sell, but we decided that this was the best thing for the mission. In many senses, we were underselling in terms of value before it matured more, and you could have sold it for a lot more money. And the reason is because there's no time to waste. There are so many things that have got to be cracked while the brain is still in gear. You know, I'm still alive. There's all these things that gotta be done. So you haven't got... I mean, how many billions would you trade for another five years of life, you know, to do what you set out to do?

Okay, all of a sudden, we've got this massive-scale compute available to us. What can we do with that?

HASSABIS: Go is the pinnacle of board games. It is the most complex game ever devised by man. There are more possible board configurations in the game of Go than there are atoms in the universe.
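That comparison checks out on the back of an envelope: each of the 361 points on a 19x19 board can be empty, black, or white, so an upper bound is 3^361, about 1.7 x 10^172 configurations, against the commonly quoted ~10^80 atoms in the observable universe. A quick verification (the atom count is the usual rough estimate, not an exact figure):

    # Upper bound on Go board configurations: 3 states for each of 361 points.
    configs = 3 ** 361
    atoms = 10 ** 80   # standard rough estimate for the observable universe

    print(f"3^361 is about 10^{len(str(configs)) - 1}")                            # 10^172
    print(f"configurations per atom: about 10^{len(str(configs // atoms)) - 1}")   # 10^92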
SILVER: Go is the holy grail of artificial intelligence. For many years, people have looked at this game and they've thought, "Wow, this is just too hard." Everything we've ever tried in AI just falls over when you try the game of Go. And so that's why it feels like a real litmus test of progress.

ERIC SCHMIDT: We had just bought DeepMind. They were working on reinforcement learning, and they were the world's experts in games. And so when they introduced the idea that they could beat the top-level Go players in a game that was thought to be incomputable, I thought, "Well, that's pretty interesting."

HASSABIS: Our ultimate next step is to play the legendary Lee Sedol in just over two weeks.

NEWSREADER 1: A match like no other is about to get underway in South Korea.

NEWSREADER 2: Lee Sedol is getting ready to rumble.

HASSABIS: Lee Sedol is probably one of the greatest players of the last decade. I describe him as the Roger Federer of Go.

SCHMIDT: He showed up, and all of a sudden we have a thousand Koreans who represent all of Korean society, the top Go players. And then we have Demis. And the great engineering team.

HASSABIS: He's very famous for very creative fighting play. So this could be difficult for us.

SCHMIDT: I figured Lee Sedol is going to beat these guys, but they'll make a good showing. Good for a startup. I went over to the technical group, and they said, "Let me show you how our algorithm works."

RESEARCHER: If you step through the actual game, we can see, kind of, how AlphaGo thinks.

HASSABIS: The way we start off on training AlphaGo is by showing it 100,000 games that strong amateurs have played. And we first initially get AlphaGo to mimic the human player, and then, through reinforcement learning, it plays against different versions of itself many millions of times and learns from its errors.

HASSABIS: Hmm, this is interesting.
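The recipe Hassabis sketches has two stages: supervised imitation of human games, then reinforcement learning against copies of itself. The toy below is a drastic simplification for illustration only; the real system used deep networks, tree search, and full self-play games, whereas here the "game" is a made-up one-move exercise, a preference table stands in for the network, and the self-play tournament is collapsed into a single-move reward.

    import random

    # Hypothetical one-move "game": positions are 0..4 and the winning
    # move for a position happens to be position % 3.
    def winning_move(position):
        return position % 3

    class Policy:
        """A table of move preferences per position, standing in for a network."""
        def __init__(self):
            self.prefs = {p: [0.0, 0.0, 0.0] for p in range(5)}

        def act(self, position, greedy=False):
            if not greedy and random.random() < 0.1:   # exploration
                return random.randrange(3)
            return max(range(3), key=lambda m: self.prefs[position][m])

        def reinforce(self, position, move, reward):
            self.prefs[position][move] += reward

    policy = Policy()

    # Stage 1: supervised imitation, nudging preferences toward human moves.
    human_games = [(p, winning_move(p)) for p in range(5)] * 20
    for position, human_move in human_games:
        policy.reinforce(position, human_move, reward=0.1)

    # Stage 2: reinforcement learning from its own play, rewarding wins.
    for _ in range(1000):
        position = random.randrange(5)
        move = policy.act(position)
        policy.reinforce(position, move, 1.0 if move == winning_move(position) else -0.2)

    print(all(policy.act(p, greedy=True) == winning_move(p) for p in range(5)))  # True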
ANNOUNCER 1: All right, folks, you're going to see history made.

[ANNOUNCER 2 SPEAKING KOREAN]

SCHMIDT: So the game starts.

ANNOUNCER 1: He's really concentrating.

ANNOUNCER 3: If you really look at the...

[ANNOUNCERS EXCLAIM]

That's a very surprising move.

ANNOUNCER 3: I think we're seeing an original move here.

Yeah, that's an exciting move. I like...

SILVER: Professional commentators almost unanimously said that not a single human player would have chosen move 37. So I actually had a poke around in AlphaGo to see what AlphaGo thought. And AlphaGo actually agreed with that assessment. AlphaGo said there was a one in 10,000 probability that move 37 would have been played by a human player.

[SEDOL SPEAKING IN KOREAN]

SILVER: The game of Go has been studied for thousands of years. And AlphaGo discovered something completely new.

ANNOUNCER: He resigned. Lee Sedol has just resigned. He's beaten.

[ELECTRONIC MUSIC PLAYING]

NEWSREADER 1: The battle between man and machine, and a computer just came out the victor.

NEWSREADER 2: Google put its DeepMind team to the test against one of the brightest minds in the world, and won.

SCHMIDT: That's when we realized the DeepMind people knew what they were doing, and to pay attention to reinforcement learning as they had invented it. Based on that experience, AlphaGo got better and better and better. And they had a little chart of how much better they were getting. And I said, "When does this stop?" And Demis said, "When we beat the Chinese guy, the top-rated player in the world."

ANNOUNCER 1: Ke Jie versus AlphaGo.

ANNOUNCER 2: And I think we will see AlphaGo pushing through there.

ANNOUNCER 1: AlphaGo is ahead quite a bit.

SCHMIDT: About halfway through the first game, the best player in the world was not doing so well.

ANNOUNCER 1: What can black do here?
ANNOUNCER 2: Looks difficult.

SCHMIDT: And at a critical moment... the Chinese government ordered the feed cut off. It was at that moment we were telling the world that something new had arrived on earth. In the 1950s, when Russia's Sputnik satellite was launched, it changed the course of history.

TV HOST: It is a challenge that America must meet to survive in the Space Age.

SCHMIDT: This has been called the Sputnik moment. The Sputnik moment created a massive reaction in the US in terms of funding for science and engineering, and particularly for space technology. For China, AlphaGo was the wake-up call, the Sputnik moment. It launched an AI space race.

HASSABIS: We had this huge idea that worked, and now the whole world knows. It's always easier to land on the moon if someone's already landed there. It is going to matter who builds AI, and how it gets built. I always feel that pressure.

SILVER: There's been a big chain of events that followed on from all of the excitement of AlphaGo. When we played against Lee Sedol, we actually had a system that had been trained on human data, on all of the millions of games that have been played by human experts. We eventually found a new algorithm, a much more elegant approach to the whole system, which actually stripped out all of the human knowledge and just started completely from scratch. And that became a project which we called AlphaZero. Zero, meaning having zero human knowledge in the loop. Instead of learning from human data, it learned from its own games. So it actually became its own teacher.

HASSABIS: AlphaZero is an experiment in how little knowledge can we put into these systems, and how quickly and how efficiently can they learn? But the other thing is, AlphaZero doesn't have any rules. It learns through experience. The next stage was to make it more general, so that it could play any two-player game: things like chess, and in fact any kind of two-player perfect-information game.
542 00:21:41,038 --> 00:21:42,344 Things like chess, 543 00:21:42,388 --> 00:21:44,085 and in fact, any kind of two-player 544 00:21:44,128 --> 00:21:45,391 perfect information game. 545 00:21:45,434 --> 00:21:46,653 It's going really well. 546 00:21:46,696 --> 00:21:47,828 It's going really, really well. 547 00:21:47,871 --> 00:21:50,091 - Oh, wow. - It's going down, like fast. 548 00:21:50,134 --> 00:21:53,050 HASSABIS: AlphaGo used to take a few months to train, 549 00:21:53,094 --> 00:21:55,662 but AlphaZero could start in the morning 550 00:21:55,705 --> 00:21:57,794 playing completely randomly 551 00:21:57,838 --> 00:22:01,015 and then by tea be at superhuman level. 552 00:22:01,058 --> 00:22:03,365 And by dinner it will be the strongest chess entity 553 00:22:03,409 --> 00:22:04,758 there's ever been. 554 00:22:04,801 --> 00:22:06,629 - Amazing, it's amazing. - Yeah. 555 00:22:06,673 --> 00:22:09,371 It's discovered its own attacking style, you know, 556 00:22:09,415 --> 00:22:11,417 to take on the current level of defense. 557 00:22:11,460 --> 00:22:12,766 I mean, I never in my wildest dreams... 558 00:22:12,809 --> 00:22:14,942 I agree. Actually, I was not expecting that either. 559 00:22:14,985 --> 00:22:16,422 And it's fun for me. 560 00:22:16,465 --> 00:22:18,598 I mean, it's inspired me to get back into chess again, 561 00:22:18,641 --> 00:22:20,164 because it's cool to see 562 00:22:20,208 --> 00:22:22,384 that there's even more depth than we thought in chess. 563 00:22:24,473 --> 00:22:25,431 [HORN BLOWS] 564 00:22:31,611 --> 00:22:34,440 HASSABIS: I actually got into AI through games. 565 00:22:35,658 --> 00:22:37,660 Initially, it was board games. 566 00:22:37,704 --> 00:22:40,054 I was thinking, "How is my brain doing this?" 567 00:22:40,097 --> 00:22:41,969 Like, what is it doing? 568 00:22:43,362 --> 00:22:47,148 I was very aware of that from a very young age. 569 00:22:47,191 --> 00:22:50,064 So I've always been thinking about thinking. 570 00:22:50,107 --> 00:22:52,762 NEWSREADER: The British and American chess champions 571 00:22:52,806 --> 00:22:55,025 meet to begin a series of matches. 572 00:22:55,069 --> 00:22:56,592 Playing alongside them are the cream 573 00:22:56,636 --> 00:22:59,073 of Britain and America's youngest players. 574 00:22:59,116 --> 00:23:01,467 NEWSREADER 2: Demis Hassabis is representing Britain. 575 00:23:06,210 --> 00:23:07,864 COSTAS HASSABIS: When Demis was four, 576 00:23:07,908 --> 00:23:11,259 he first showed an aptitude for chess. 577 00:23:12,739 --> 00:23:14,044 By the time he was six, 578 00:23:14,088 --> 00:23:18,048 he became London under-eight champion. 579 00:23:18,092 --> 00:23:19,485 HASSABIS: My parents were very interesting 580 00:23:19,528 --> 00:23:20,834 and unusual, actually. 581 00:23:20,877 --> 00:23:23,750 I'd probably describe them as quite bohemian. 582 00:23:23,793 --> 00:23:25,491 My father was a singer-songwriter 583 00:23:25,534 --> 00:23:26,709 when he was younger, 584 00:23:26,753 --> 00:23:28,363 and Bob Dylan was his hero. 585 00:23:32,846 --> 00:23:34,500 [HORN HONKS] 586 00:23:34,543 --> 00:23:36,110 [ANGELA HASSABIS SPEAKING] 587 00:23:38,068 --> 00:23:39,287 Yeah, yeah. 588 00:23:41,637 --> 00:23:44,031 HOST: What is it that you like about this game? 589 00:23:45,075 --> 00:23:47,121 It's just a good thinking game. 590 00:23:49,253 --> 00:23:51,038 HASSABIS: At the time, I was the second-highest rated 591 00:23:51,081 --> 00:23:52,692 chess player in the world for my age. 
I was on track to be a professional chess player, and I thought that was what I was going to do. But no matter how much I loved the game, it was incredibly stressful. It definitely was not fun and games for me. My parents used to, you know, get very upset when I lost a game, and angry if I forgot something. And because it was quite high stakes for them... you know, it cost a lot of money to go to these tournaments, and my parents didn't have much money. My parents thought, you know, "If you're interested in being a chess professional, this is really important. It's like your exams."

I remember I was about 12 years old, and I was at this international chess tournament in Liechtenstein, up in the mountains.

[BELL TOLLING]

And we were in this huge church hall with, you know, hundreds of international chess players. And I was playing the ex-Danish champion. He must have been in his 30s, probably. In those days, there was a long time limit. The games could literally last all day.

[YAWNS]

[TIMER TICKING]

We were into our tenth hour.

[TIMER TICKS FRANTICALLY]

[MOUSE GASPS]

And we were in this incredibly unusual ending. I think it should be a draw. But he kept on trying to win for hours.

[HORSE NEIGHS]

Finally, he tried one last cheap trick. All I had to do was give away my queen; then it would be stalemate. But I was so tired, I thought it was inevitable I was going to be checkmated. And so I resigned.

He jumped up. Just started laughing. [LAUGHING] And he went, "Why have you resigned? It's a draw." And he immediately, with a flourish, sort of showed me the drawing move.

I felt so sick to my stomach. It made me think of the rest of that tournament. Like, are we wasting our minds? Is this the best use of all this brain power, everybody's, collectively, in that building?
If you could somehow plug in those 300 brains into a system, you might be able to solve cancer with that level of brain power. This intuitive feeling came over me that although I love chess, this is not the right thing to spend my whole life on.

LEGG: Demis and myself, our plan was always to fill DeepMind with some of the most brilliant scientists in the world, so we had the human brains necessary to create an AGI system. By definition, the "G" in AGI is about generality. What I imagine is being able to talk to an agent, the agent can talk back, and the agent is able to solve novel problems that it hasn't seen before. That's a really key part of human intelligence, and it's that cognitive breadth and flexibility that's incredible. As the only natural general intelligence we know of, we humans obviously learn a lot from our environment. So we think that simulated environments are one of the ways to create an AGI.

SIMON CARTER: The very early humans were having to solve logic problems. They were having to solve navigation, memory, and we evolved in that environment. If we can create a virtual recreation of that kind of environment, that's the perfect testing ground and training ground for everything we do at DeepMind.

GUY SIMMONS: What they were doing here was creating environments for childlike beings, the agents, to exist within and play. That just sounded like the most interesting thing in all the world.

SHANAHAN: A child learns by tearing things up and then throwing food around and getting a response from mommy or daddy. This seems like an important idea to incorporate in the way you train an agent.

RESEARCHER 1: The humanoid is supposed to stand up. As its center of gravity rises, it gets more points.
You have a reward, and the agent learns from the reward: you do something well, you get a positive reward; you do something bad, you get a negative reward.

RESEARCHER 2: [EXCLAIMS] It looks like it's standing. It's still a bit drunk.

RESEARCHER 1: It likes to walk backwards.

RESEARCHER 2: [CHUCKLES] Yeah. The whole algorithm is trying to optimize for receiving as much reward as possible, and it's found that walking backwards is good enough to get very good scores.

RAIA HADSELL: When we learn to navigate, when we learn to get around in our world, we don't start with maps. We just start with our own exploration, adventuring off across the park without our parents by our side, or finding our way home from school when we're young.

[FAST ELECTRONIC MUSIC PLAYING]

HADSELL: A few of us came up with this idea that if we had an environment where a simulated robot just had to run forward, we could put all sorts of obstacles in its way and see if it could manage to navigate different types of terrain. The idea would be like a parkour challenge. It's not graceful, but it was never trained to hold a glass whilst it was running and not spill water. You set this objective that says, "Just move forward, forward velocity, and you'll get a reward for that." And the learning algorithm figures out how to move this complex set of joints. That's the power of reward-based reinforcement learning.

SILVER: Our goal is to try and build agents which, we drop them in, they know nothing, they get to play around in whatever problem you give them, and eventually figure out how to solve it for themselves. Now we want something which can do that in as many different types of problems as possible. A human needs diverse skills to interact with the world: how to deal with complex images, how to manipulate thousands of things at once, how to deal with missing information.
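The "just move forward" objective Hadsell describes is a reward function: a single scalar signal the learner maximizes, with nothing about gait or style specified by hand. A toy, hypothetical version of such a reward (the state fields and the fall threshold are made up for illustration):

    from dataclasses import dataclass

    @dataclass
    class RobotState:
        forward_velocity: float   # meters per second along the course
        torso_height: float       # meters; used only to notice falls

    def locomotion_reward(state: RobotState) -> float:
        """Reward forward progress; leave everything else for the learner to discover."""
        reward = state.forward_velocity
        if state.torso_height < 0.8:   # hypothetical threshold: the robot has fallen
            reward -= 1.0              # a penalty instead of hand-coded gait rules
        return reward

    # A fast, upright stride scores well; walking backwards scores negatively.
    print(locomotion_reward(RobotState(forward_velocity=1.5, torso_height=1.2)))   # 1.5
    print(locomotion_reward(RobotState(forward_velocity=-0.5, torso_height=1.2)))  # -0.5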
ORIOL VINYALS: We think all of these things together are represented by this game called StarCraft. All it's being trained to do is: given this situation, this screen, what would a human do? We took inspiration from large language models, where you simply train a model to predict the next word, which is exactly the same as predicting the next StarCraft move.

SILVER: Unlike chess or Go, where players take turns to make moves, in StarCraft there's a continuous flow of decisions. On top of that, you can't even see what the opponent is doing. There is no longer a clear definition of what it means to play the best way. It depends on what your opponent does.

HADSELL: This is the way that we'll get to a much more fluid, more natural, faster, more reactive agent.

VINYALS: This is a huge challenge, and let's see how far we can push.

TIM LILLICRAP: Oh! Holy monkey! I'm a pretty low-level amateur. I'm okay, but I'm a pretty low-level amateur. These agents have a long ways to go.

HASSABIS: We couldn't beat someone of Tim's level. You know, that was a little bit alarming.

LILLICRAP: At that point, it felt like it was going to be, like, a really big, long challenge, maybe a couple of years.

VINYALS: Dani is the best DeepMind StarCraft 2 player.

DANI: I've been playing the agent every day for a few weeks now. I could feel that the agent was getting better really fast.

[CHEERING, LAUGHTER]

VINYALS: Wow, we beat Dani. That, for me, was already like a huge achievement.

HASSABIS: The next step is we're going to book in a pro to play.

[KEYBOARD TAPPING]

[GROANS]

[CHEERING, WHOOPING]

[LAUGHS]

[PEOPLE CLAPPING]

It feels a bit unfair. All you guys against me.

[ALL LAUGH]

HASSABIS: We're way ahead of what I thought we would do, given where we were two months ago. Just trying to digest it all, actually. But it's very, very cool.
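Vinyals's analogy is behavioral cloning framed as sequence prediction: the same objective a language model uses for the next word, with recorded human actions in place of words. A minimal, hypothetical sketch over made-up replays, where frequency counts stand in for a neural network estimating P(action | observation):

    from collections import Counter, defaultdict

    # Hypothetical replays: sequences of (observation, human_action) pairs.
    replays = [
        [("start", "build_worker"), ("low_minerals", "gather"), ("army_ready", "attack")],
        [("start", "build_worker"), ("low_minerals", "gather"), ("army_ready", "expand")],
    ]

    # "Predict the next move" as supervised learning: count what humans did.
    counts = defaultdict(Counter)
    for game in replays:
        for observation, action in game:
            counts[observation][action] += 1

    def predict(observation):
        """Return the action a human most often took in this situation."""
        return counts[observation].most_common(1)[0][0]

    print(predict("start"))        # build_worker
    print(predict("army_ready"))   # attack (tied with expand; first seen wins)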
789
00:32:49,445 --> 00:32:50,794
Just trying to digest it all, actually.
790
00:32:50,838 --> 00:32:52,753
But it's very, very cool.
791
00:32:52,796 --> 00:32:54,146
SILVER: Now we're in a position where
792
00:32:54,189 --> 00:32:56,061
we can finally share the work that we've done
793
00:32:56,104 --> 00:32:57,192
with the public.
794
00:32:57,236 --> 00:32:58,585
This is a big step.
795
00:32:58,628 --> 00:33:00,674
We are really putting ourselves on the line here.
796
00:33:00,717 --> 00:33:02,589
- Take it away. Cheers. - Thank you.
797
00:33:02,632 --> 00:33:04,460
We're going to be live from London.
798
00:33:04,504 --> 00:33:05,722
It's happening.
799
00:33:08,638 --> 00:33:10,597
ANNOUNCER 1: Welcome to London.
800
00:33:10,640 --> 00:33:13,252
We are going to have a live exhibition match,
801
00:33:13,295 --> 00:33:15,341
MaNa against AlphaStar.
802
00:33:15,384 --> 00:33:17,169
[CHEERING, APPLAUSE]
803
00:33:18,344 --> 00:33:19,998
At this point now,
804
00:33:20,041 --> 00:33:23,827
AlphaStar, 10 and 0 against professional gamers.
805
00:33:23,871 --> 00:33:25,873
Any thoughts before we get into this game?
806
00:33:25,916 --> 00:33:27,483
VINYALS: I just want to see a good game, yeah.
807
00:33:27,527 --> 00:33:28,963
I want to see a good game.
808
00:33:29,007 --> 00:33:30,486
SILVER: Absolutely, good game. We're all excited.
809
00:33:30,530 --> 00:33:33,011
ANNOUNCER: All right. Let's see what MaNa can pull off.
810
00:33:34,969 --> 00:33:36,405
ANNOUNCER 2: AlphaStar is definitely
811
00:33:36,449 --> 00:33:38,364
dominating the pace of this game.
812
00:33:38,407 --> 00:33:41,106
[SPORADIC CHEERING]
813
00:33:41,149 --> 00:33:44,152
ANNOUNCER 1: Wow. AlphaStar is playing so smartly.
814
00:33:44,196 --> 00:33:46,807
[LAUGHTER]
815
00:33:46,850 --> 00:33:48,461
This really looks like I'm watching
816
00:33:48,504 --> 00:33:49,940
a professional human gamer
817
00:33:49,984 --> 00:33:51,290
from the AlphaStar point of view.
818
00:33:53,031 --> 00:33:54,858
[KEYBOARD TAPPING]
819
00:33:57,252 --> 00:34:01,952
HASSABIS: I hadn't really seen a pro play StarCraft up close,
820
00:34:01,996 --> 00:34:03,563
and the 800 clicks per minute.
821
00:34:03,606 --> 00:34:06,000
I don't understand how anyone can even click 800 times,
822
00:34:06,044 --> 00:34:09,482
let alone do 800 useful clicks.
823
00:34:09,525 --> 00:34:11,092
ANNOUNCER 1: Oh, another good hit.
824
00:34:11,136 --> 00:34:13,094
- [ALL GROAN] - AlphaStar is just
825
00:34:13,138 --> 00:34:14,617
completely relentless.
826
00:34:14,661 --> 00:34:16,141
SILVER: We need to be careful
827
00:34:16,184 --> 00:34:19,361
because many of us grew up as gamers and are gamers.
828
00:34:19,405 --> 00:34:21,363
And so to us, it's very natural
829
00:34:21,407 --> 00:34:23,800
to view games as what they are,
830
00:34:23,844 --> 00:34:26,977
which is pure vehicles for fun,
831
00:34:27,021 --> 00:34:29,719
and not to see that more militaristic side
832
00:34:29,763 --> 00:34:32,853
that the public might see if they looked at this.
833
00:34:32,896 --> 00:34:37,553
You can't look at gunpowder and only make a firecracker.
834
00:34:37,597 --> 00:34:41,209
All technologies inherently point into certain directions.
835
00:34:43,124 --> 00:34:44,691
MARGARET LEVI: I'm very worried about
836
00:34:44,734 --> 00:34:46,606
certain ways in which AI
837
00:34:46,649 --> 00:34:49,652
will be used for military purposes.
838 00:34:51,306 --> 00:34:55,049 And that makes it even clearer how important it is 839 00:34:55,093 --> 00:34:58,357 for our societies to be in control 840 00:34:58,400 --> 00:35:01,055 of these new technologies. 841 00:35:01,099 --> 00:35:05,190 The potential for abuse from AI will be significant. 842 00:35:05,233 --> 00:35:08,758 Wars that occur faster than humans can comprehend 843 00:35:08,802 --> 00:35:11,065 and more powerful surveillance. 844 00:35:12,458 --> 00:35:15,765 How do you keep power forever 845 00:35:15,809 --> 00:35:19,421 over something that's much more powerful than you? 846 00:35:19,465 --> 00:35:21,380 [STEPHEN HAWKING SPEAKING] 847 00:35:43,053 --> 00:35:45,752 Technologies can be used to do terrible things. 848 00:35:47,493 --> 00:35:50,452 And technology can be used to do wonderful things 849 00:35:50,496 --> 00:35:52,150 and solve all kinds of problems. 850 00:35:53,586 --> 00:35:54,978 When DeepMind was acquired by Google... 851 00:35:55,022 --> 00:35:56,589 - Yeah. - ...you got Google to promise 852 00:35:56,632 --> 00:35:58,025 that technology you developed won't be used by the military 853 00:35:58,068 --> 00:35:59,418 - for surveillance. - Right. 854 00:35:59,461 --> 00:36:00,593 - Yes. - Tell us about that. 855 00:36:00,636 --> 00:36:03,161 I think technology is neutral in itself, 856 00:36:03,204 --> 00:36:05,598 um, but how, you know, we as a society 857 00:36:05,641 --> 00:36:07,382 or humans and companies and other things, 858 00:36:07,426 --> 00:36:09,515 other entities and governments decide to use it 859 00:36:09,558 --> 00:36:12,561 is what determines whether things become good or bad. 860 00:36:12,605 --> 00:36:16,261 You know, I personally think having autonomous weaponry 861 00:36:16,304 --> 00:36:17,479 is just a very bad idea. 862 00:36:19,177 --> 00:36:21,266 ANNOUNCER 1: AlphaStar is playing 863 00:36:21,309 --> 00:36:24,094 an extremely intelligent game right now. 864 00:36:24,138 --> 00:36:27,359 CUKIER: There is an element to what's being created 865 00:36:27,402 --> 00:36:28,882 at DeepMind in London 866 00:36:28,925 --> 00:36:34,148 that does seem like the Manhattan Project. 867 00:36:34,192 --> 00:36:37,586 There's a relationship between Robert Oppenheimer 868 00:36:37,630 --> 00:36:39,675 and Demis Hassabis 869 00:36:39,719 --> 00:36:44,202 in which they're unleashing a new force upon humanity. 870 00:36:44,245 --> 00:36:46,204 ANNOUNCER 1: MaNa is fighting back, though. 871 00:36:46,247 --> 00:36:48,162 Oh, man! 872 00:36:48,206 --> 00:36:50,208 HASSABIS: I think that Oppenheimer 873 00:36:50,251 --> 00:36:52,471 and some of the other leaders of that project got caught up 874 00:36:52,514 --> 00:36:54,908 in the excitement of building the technology 875 00:36:54,951 --> 00:36:56,170 and seeing if it was possible. 876 00:36:56,214 --> 00:36:58,520 ANNOUNCER 1: Where is AlphaStar? 877 00:36:58,564 --> 00:36:59,782 Where is AlphaStar? 878 00:36:59,826 --> 00:37:01,958 I don't see AlphaStar's units anywhere. 879 00:37:02,002 --> 00:37:03,525 HASSABIS: They did not think carefully enough 880 00:37:03,569 --> 00:37:07,312 about the morals of what they were doing early enough. 881 00:37:07,355 --> 00:37:08,965 What we should do as scientists 882 00:37:09,009 --> 00:37:11,011 with powerful new technologies 883 00:37:11,054 --> 00:37:13,883 is try and understand it in controlled conditions first. 884 00:37:14,928 --> 00:37:16,799 ANNOUNCER 1: And that is that. 885 00:37:16,843 --> 00:37:19,411 MaNa has defeated AlphaStar. 
886
00:37:29,551 --> 00:37:31,336
I mean, my honest feeling is that I think it is
887
00:37:31,379 --> 00:37:33,207
a fair representation of where we are.
888
00:37:33,251 --> 00:37:35,949
And I think that part feels... feels okay.
889
00:37:35,992 --> 00:37:37,429
- I'm very happy for you. - I'm happy.
890
00:37:37,472 --> 00:37:38,865
So well... well done.
891
00:37:38,908 --> 00:37:40,867
My view is that the approach to building technology
892
00:37:40,910 --> 00:37:43,348
which is embodied by "move fast and break things"
893
00:37:43,391 --> 00:37:46,220
is exactly what we should not be doing,
894
00:37:46,264 --> 00:37:47,961
because you can't afford to break things
895
00:37:48,004 --> 00:37:49,049
and then fix them afterwards.
896
00:37:49,092 --> 00:37:50,398
- Cheers. - Thank you so much.
897
00:37:50,442 --> 00:37:52,008
Yeah, get... get some rest. You did really well.
898
00:37:52,052 --> 00:37:53,923
- Cheers, yeah? - Thank you for having us.
899
00:38:01,627 --> 00:38:03,281
[ELECTRONIC MUSIC PLAYING]
900
00:38:04,238 --> 00:38:05,500
HASSABIS: When I was eight,
901
00:38:05,544 --> 00:38:06,849
I bought my first computer
902
00:38:06,893 --> 00:38:09,548
with the winnings from a chess tournament.
903
00:38:09,591 --> 00:38:11,158
I sort of had this intuition
904
00:38:11,201 --> 00:38:13,726
that computers are this magical device
905
00:38:13,769 --> 00:38:15,902
that can extend the power of the mind.
906
00:38:15,945 --> 00:38:17,382
I had a couple of school friends,
907
00:38:17,425 --> 00:38:19,166
and we used to have a hacking club,
908
00:38:19,209 --> 00:38:21,908
writing code, making games.
909
00:38:26,260 --> 00:38:27,827
And then over the summer holidays,
910
00:38:27,870 --> 00:38:29,219
I'd spend the whole day
911
00:38:29,263 --> 00:38:31,526
flicking through games magazines.
912
00:38:31,570 --> 00:38:33,441
And one day I noticed there was a competition
913
00:38:33,485 --> 00:38:35,878
to write an original version of Space Invaders.
914
00:38:35,922 --> 00:38:39,621
And the winner won a job at Bullfrog.
915
00:38:39,665 --> 00:38:42,320
Bullfrog at the time was the best game development house
916
00:38:42,363 --> 00:38:43,756
in all of Europe.
917
00:38:43,799 --> 00:38:45,279
You know, I really wanted to work at this place
918
00:38:45,323 --> 00:38:48,587
and see how they build games.
919
00:38:48,630 --> 00:38:50,415
NEWSCASTER: Bullfrog, based here in Guildford,
920
00:38:50,458 --> 00:38:52,286
began with a big idea.
921
00:38:52,330 --> 00:38:54,680
That idea turned into the game Populous,
922
00:38:54,723 --> 00:38:56,551
which became a global bestseller.
923
00:38:56,595 --> 00:38:59,859
In the '90s, there were no recruitment agencies.
924
00:38:59,902 --> 00:39:02,122
You couldn't go out and say, you know,
925
00:39:02,165 --> 00:39:04,951
"Come and work in the games industry."
926
00:39:04,994 --> 00:39:08,171
It was still not even considered an industry.
927
00:39:08,215 --> 00:39:11,218
So we came up with the idea to have a competition
928
00:39:11,261 --> 00:39:13,655
and we got a lot of applicants.
929
00:39:14,700 --> 00:39:17,616
And one of those was Demis's.
930
00:39:17,659 --> 00:39:20,706
I can still remember clearly
931
00:39:20,749 --> 00:39:23,970
the day that Demis came in.
932
00:39:24,013 --> 00:39:27,147
He walked in the door, he looked about 12.
933
00:39:28,670 --> 00:39:30,019
I thought, "Oh, my God,
934
00:39:30,063 --> 00:39:31,586
"what the hell are we going to do with this guy?"
935
00:39:31,630 --> 00:39:32,979
I applied to Cambridge.
936
00:39:33,022 --> 00:39:35,373
I got in but they said I was way too young.
937
00:39:35,416 --> 00:39:37,853
So... So I needed to take a year off
938
00:39:37,897 --> 00:39:39,899
so I'd be at least 17 before I got there.
939
00:39:39,942 --> 00:39:42,771
And that's when I decided to spend that entire gap year
940
00:39:42,815 --> 00:39:44,469
working at Bullfrog.
941
00:39:44,512 --> 00:39:46,166
They couldn't even legally employ me,
942
00:39:46,209 --> 00:39:48,298
so I ended up being paid in brown paper envelopes.
943
00:39:48,342 --> 00:39:49,343
[CHUCKLES]
944
00:39:50,823 --> 00:39:54,304
I got a feeling of being really at the cutting edge
945
00:39:54,348 --> 00:39:58,047
and how much fun that was to invent things every day.
946
00:39:58,091 --> 00:40:00,572
And then you know, a few months later,
947
00:40:00,615 --> 00:40:03,662
maybe everyone... a million people will be playing it.
948
00:40:03,705 --> 00:40:06,665
MOLYNEUX: In those days computer games had to evolve.
949
00:40:06,708 --> 00:40:08,536
There had to be new genres
950
00:40:08,580 --> 00:40:11,757
which were more than just shooting things.
951
00:40:11,800 --> 00:40:14,063
Wouldn't it be amazing to have a game
952
00:40:14,107 --> 00:40:18,807
where you design and build your own theme park?
953
00:40:18,851 --> 00:40:21,027
[GAME CHARACTERS SCREAMING]
954
00:40:22,594 --> 00:40:25,945
Demis and I started to talk about Theme Park.
955
00:40:25,988 --> 00:40:28,904
It allows the player to build a world
956
00:40:28,948 --> 00:40:31,864
and see the consequences of your choices
957
00:40:31,907 --> 00:40:34,127
that you've made in that world.
958
00:40:34,170 --> 00:40:36,085
HASSABIS: A human player set out the layout
959
00:40:36,129 --> 00:40:38,566
of the theme park and designed the roller coaster
960
00:40:38,610 --> 00:40:41,351
and set the prices in the chip shop.
961
00:40:41,395 --> 00:40:43,615
What I was working on was the behaviors of the people.
962
00:40:43,658 --> 00:40:45,138
They were autonomous
963
00:40:45,181 --> 00:40:47,314
and that was the AI in this case.
964
00:40:47,357 --> 00:40:48,881
So what I was trying to do was mimic
965
00:40:48,924 --> 00:40:51,013
interesting human behavior
966
00:40:51,057 --> 00:40:52,319
so that the simulation would be
967
00:40:52,362 --> 00:40:54,582
more interesting to interact with.
968
00:40:54,626 --> 00:40:56,541
MOLYNEUX: Demis worked on ridiculous things,
969
00:40:56,584 --> 00:40:59,413
like you could place down these shops
970
00:40:59,457 --> 00:41:03,591
and if you put a shop too near a very dangerous ride,
971
00:41:03,635 --> 00:41:05,375
then people on the ride would throw up
972
00:41:05,419 --> 00:41:08,030
because they'd just eaten.
973
00:41:08,074 --> 00:41:09,641
And then that would make other people throw up
974
00:41:09,684 --> 00:41:12,121
when they saw the throwing-up on the floor,
975
00:41:12,165 --> 00:41:14,559
so you then had to have lots of sweepers
976
00:41:14,602 --> 00:41:17,823
to quickly sweep it up before the people saw it.
977
00:41:17,866 --> 00:41:19,520
That's the cool thing about it.
978
00:41:19,564 --> 00:41:22,784
You as the player tinker with it and then it reacts to you.
979
00:41:22,828 --> 00:41:25,874
MOLYNEUX: All those nuanced simulation things he did
980
00:41:25,918 --> 00:41:28,094
and that was an invention
981
00:41:28,137 --> 00:41:31,227
which never really existed before.
982
00:41:31,271 --> 00:41:34,230
It was unbelievably successful.
983 00:41:34,274 --> 00:41:35,710 DAVID GARDNER: Theme Park actually turned out 984 00:41:35,754 --> 00:41:37,190 to be a top ten title 985 00:41:37,233 --> 00:41:39,932 and that was the first time we were starting to see 986 00:41:39,975 --> 00:41:43,022 how AI could make a difference. 987 00:41:43,065 --> 00:41:44,806 [BRASS BAND PLAYING] 988 00:41:46,155 --> 00:41:47,592 CARTER: We were doing some Christmas shopping 989 00:41:47,635 --> 00:41:51,247 and were waiting for the taxi to take us home. 990 00:41:51,291 --> 00:41:54,947 I have this very clear memory of Demis talking about AI 991 00:41:54,990 --> 00:41:56,209 in a very different way, 992 00:41:56,252 --> 00:41:58,428 in a way that we didn't commonly talk about. 993 00:41:58,472 --> 00:42:02,345 This idea of AI being useful for other things 994 00:42:02,389 --> 00:42:04,086 other than entertainment. 995 00:42:04,130 --> 00:42:07,437 So being useful for, um, helping the world 996 00:42:07,481 --> 00:42:10,310 and the potential of AI to change the world. 997 00:42:10,353 --> 00:42:13,226 I just said to Demis, "What is it you want to do?" 998 00:42:13,269 --> 00:42:14,532 And he said to me, 999 00:42:14,575 --> 00:42:16,795 "I want to be the person that solves AI." 1000 00:42:22,670 --> 00:42:25,760 HASSABIS: Peter offered me £1 million 1001 00:42:25,804 --> 00:42:27,675 to not go to university. 1002 00:42:30,199 --> 00:42:32,593 But I had a plan from the beginning. 1003 00:42:32,637 --> 00:42:35,814 And my plan was always to go to Cambridge. 1004 00:42:35,857 --> 00:42:36,902 I think a lot of my schoolfriends 1005 00:42:36,945 --> 00:42:38,033 thought I was mad. 1006 00:42:38,077 --> 00:42:39,252 Why would you not... 1007 00:42:39,295 --> 00:42:40,688 I mean, £1 million, that's a lot of money. 1008 00:42:40,732 --> 00:42:43,517 In the '90s, that is a lot of money, right? 1009 00:42:43,561 --> 00:42:46,346 For a... For a poor 17-year-old kid. 1010 00:42:46,389 --> 00:42:50,219 He's like this little seed that's going to burst through, 1011 00:42:50,263 --> 00:42:53,658 and he's not going to be able to do that at Bullfrog. 1012 00:42:56,443 --> 00:42:59,098 I had to drop him off at the train station 1013 00:42:59,141 --> 00:43:02,580 and I can still see that picture 1014 00:43:02,623 --> 00:43:07,019 of this little elven character disappear down that tunnel. 1015 00:43:07,062 --> 00:43:09,804 That was an incredibly sad moment. 1016 00:43:13,242 --> 00:43:14,635 HASSABIS: I had this romantic ideal 1017 00:43:14,679 --> 00:43:16,942 of what Cambridge would be like, 1018 00:43:16,985 --> 00:43:18,639 1,000 years of history, 1019 00:43:18,683 --> 00:43:21,033 walking the same streets that Turing, 1020 00:43:21,076 --> 00:43:23,601 Newton and Crick had walked. 1021 00:43:23,644 --> 00:43:26,647 I wanted to explore the edge of the universe. 1022 00:43:26,691 --> 00:43:27,735 [CHURCH BELLS TOLLING] 1023 00:43:29,084 --> 00:43:30,346 When I got to Cambridge, 1024 00:43:30,390 --> 00:43:32,653 I'd basically been working my whole life. 1025 00:43:33,741 --> 00:43:35,090 Every single summer, 1026 00:43:35,134 --> 00:43:37,136 I was either playing chess professionally, 1027 00:43:37,179 --> 00:43:39,704 or I was working, doing an internship. 1028 00:43:39,747 --> 00:43:43,708 So I was, like, "Right, I am gonna have fun now 1029 00:43:43,751 --> 00:43:46,711 "and explore what it means to be a normal teenager." 1030 00:43:47,973 --> 00:43:50,192 [PEOPLE CHEERING, LAUGHING] 1031 00:43:50,236 --> 00:43:52,238 Come on! Go, boy, go! 
1032
00:43:52,281 --> 00:43:54,022
TIM STEVENS: It was work hard and play hard.
1033
00:43:54,066 --> 00:43:55,807
[ALL SINGING]
1034
00:43:55,850 --> 00:43:57,025
I first met Demis
1035
00:43:57,069 --> 00:43:59,201
because we both attended Queens' College.
1036
00:44:00,115 --> 00:44:01,203
Our group of friends,
1037
00:44:01,247 --> 00:44:03,205
we'd often drink beer in the bar,
1038
00:44:03,249 --> 00:44:04,946
play table football.
1039
00:44:04,990 --> 00:44:07,340
HASSABIS: In the bar, I used to play speed chess,
1040
00:44:07,383 --> 00:44:09,255
pieces flying off the board,
1041
00:44:09,298 --> 00:44:11,083
you know, the whole game in one minute.
1042
00:44:11,126 --> 00:44:12,301
Demis sat down opposite me.
1043
00:44:12,345 --> 00:44:13,563
And I looked at him and I thought,
1044
00:44:13,607 --> 00:44:15,217
"I remember you from when we were kids."
1045
00:44:15,261 --> 00:44:17,176
HASSABIS: I had actually been in the same chess tournament
1046
00:44:17,219 --> 00:44:18,786
as Dave in Ipswich,
1047
00:44:18,830 --> 00:44:20,440
where I used to go and try and raid his local chess club
1048
00:44:20,483 --> 00:44:22,703
to win a bit of prize money.
1049
00:44:22,747 --> 00:44:24,618
COPPIN: We were studying computer science.
1050
00:44:24,662 --> 00:44:26,794
Some people, at the age of 17,
1051
00:44:26,838 --> 00:44:28,404
would have come in and made sure to tell everybody
1052
00:44:28,448 --> 00:44:29,492
everything about themselves.
1053
00:44:29,536 --> 00:44:30,972
"Hey, I worked at Bullfrog
1054
00:44:31,016 --> 00:44:33,018
"and built the world's most successful video game."
1055
00:44:33,061 --> 00:44:34,715
But he wasn't like that at all.
1056
00:44:34,759 --> 00:44:36,412
SILVER: At Cambridge, Demis and myself
1057
00:44:36,456 --> 00:44:38,414
both had an interest in computational neuroscience
1058
00:44:38,458 --> 00:44:40,242
and trying to understand how computers and brains
1059
00:44:40,286 --> 00:44:42,636
intertwined and linked together.
1060
00:44:42,680 --> 00:44:44,290
JOHN DAUGMAN: Both David and Demis
1061
00:44:44,333 --> 00:44:46,422
came to me for supervisions.
1062
00:44:46,466 --> 00:44:49,774
It happens just by coincidence that the year 1997,
1063
00:44:49,817 --> 00:44:51,645
their third and final year at Cambridge,
1064
00:44:51,689 --> 00:44:55,301
was also the year when the first chess grandmaster
1065
00:44:55,344 --> 00:44:56,781
was beaten by a computer program.
1066
00:44:56,824 --> 00:44:58,260
[CAMERA SHUTTERS CLICKING]
1067
00:44:58,304 --> 00:45:00,088
NEWSCASTER: Round one today of a chess match
1068
00:45:00,132 --> 00:45:03,701
between the ranking world champion Garry Kasparov
1069
00:45:03,744 --> 00:45:06,007
and an opponent named Deep Blue
1070
00:45:06,051 --> 00:45:10,490
to test to see if the human brain can outwit a machine.
1071
00:45:10,533 --> 00:45:11,621
HASSABIS: I remember the drama
1072
00:45:11,665 --> 00:45:13,798
of Kasparov losing the last match.
1073
00:45:13,841 --> 00:45:15,234
NEWSCASTER 2: Whoa!
1074
00:45:15,277 --> 00:45:17,192
Kasparov has resigned!
1075
00:45:17,236 --> 00:45:19,586
When Deep Blue beat Garry Kasparov,
1076
00:45:19,629 --> 00:45:21,457
that was a real watershed event.
1077
00:45:21,501 --> 00:45:23,155
HASSABIS: My main memory of it was
1078
00:45:23,198 --> 00:45:25,331
I wasn't that impressed with Deep Blue.
1079
00:45:25,374 --> 00:45:27,246
I was more impressed with Kasparov's mind.
1080
00:45:27,289 --> 00:45:29,509
That he could play chess to this level,
1081
00:45:29,552 --> 00:45:31,641
where he could compete on an equal footing
1082
00:45:31,685 --> 00:45:33,252
with the brute force of a machine,
1083
00:45:33,295 --> 00:45:35,167
but of course, Kasparov can do
1084
00:45:35,210 --> 00:45:36,951
everything else humans can do, too.
1085
00:45:36,995 --> 00:45:38,257
It was a huge achievement.
1086
00:45:38,300 --> 00:45:39,388
But the truth of the matter was,
1087
00:45:39,432 --> 00:45:40,868
Deep Blue could only play chess.
1088
00:45:42,435 --> 00:45:44,524
What we would regard as intelligence
1089
00:45:44,567 --> 00:45:46,874
was missing from that system.
1090
00:45:46,918 --> 00:45:49,834
This idea of generality and also learning.
1091
00:45:53,751 --> 00:45:55,404
Cambridge was amazing, because of course, you know,
1092
00:45:55,448 --> 00:45:56,666
you're mixing with people
1093
00:45:56,710 --> 00:45:58,233
who are studying many different subjects.
1094
00:45:58,277 --> 00:46:01,410
SILVER: There were scientists, philosophers, artists...
1095
00:46:01,454 --> 00:46:04,457
STEVENS: ...geologists, biologists, ecologists.
1096
00:46:04,500 --> 00:46:07,416
You know, everybody is talking about everything all the time.
1097
00:46:07,460 --> 00:46:10,768
I was obsessed with the protein folding problem.
1098
00:46:10,811 --> 00:46:13,248
HASSABIS: Tim Stevens used to talk obsessively,
1099
00:46:13,292 --> 00:46:15,381
almost like religiously about this problem,
1100
00:46:15,424 --> 00:46:17,165
the protein folding problem.
1101
00:46:17,209 --> 00:46:18,863
STEVENS: Proteins are, you know,
1102
00:46:18,906 --> 00:46:22,083
one of the most beautiful and elegant things about biology.
1103
00:46:22,127 --> 00:46:24,738
They are the machines of life.
1104
00:46:24,782 --> 00:46:27,001
They build everything, they control everything,
1105
00:46:27,045 --> 00:46:29,569
they're why biology works.
1106
00:46:29,612 --> 00:46:32,659
Proteins are made from strings of amino acids
1107
00:46:32,702 --> 00:46:37,055
that fold up to create a protein structure.
1108
00:46:37,098 --> 00:46:39,884
If we can predict the structure of proteins
1109
00:46:39,927 --> 00:46:43,104
from just their amino acid sequences,
1110
00:46:43,148 --> 00:46:46,107
then a new protein to cure cancer
1111
00:46:46,151 --> 00:46:49,415
or break down plastic to help the environment
1112
00:46:49,458 --> 00:46:50,808
is definitely something
1113
00:46:50,851 --> 00:46:52,940
that you could begin to think about.
1114
00:46:53,941 --> 00:46:55,029
I kind of thought,
1115
00:46:55,073 --> 00:46:58,293
"Well, is a human being clever enough
1116
00:46:58,337 --> 00:46:59,991
"to actually fold a protein?"
1117
00:47:00,034 --> 00:47:02,080
We can't work it out.
1118
00:47:02,123 --> 00:47:04,082
JOHN MOULT: Since the 1960s,
1119
00:47:04,125 --> 00:47:05,953
we thought that in principle,
1120
00:47:05,997 --> 00:47:08,913
if I know what the amino acid sequence of a protein is,
1121
00:47:08,956 --> 00:47:11,437
I should be able to compute what the structure's like.
1122
00:47:11,480 --> 00:47:13,700
So, if you could just press a button,
1123
00:47:13,743 --> 00:47:16,311
and they'd all come popping out, that would be...
1124
00:47:16,355 --> 00:47:18,009
that would have some impact.
1125
00:47:20,272 --> 00:47:21,577
HASSABIS: It stuck in my mind.
1126
00:47:21,621 --> 00:47:23,405
"Oh, this is a very interesting problem."
1127
00:47:23,449 --> 00:47:26,844
And it felt to me like it would be solvable.
1128 00:47:26,887 --> 00:47:29,934 But I thought it would need AI to do it. 1129 00:47:31,500 --> 00:47:34,025 If we could just solve protein folding, 1130 00:47:34,068 --> 00:47:35,635 it could change the world. 1131 00:47:50,868 --> 00:47:52,826 HASSABIS: Ever since I was a student at Cambridge, 1132 00:47:54,001 --> 00:47:55,611 I've never stopped thinking about 1133 00:47:55,655 --> 00:47:57,135 the protein folding problem. 1134 00:47:59,789 --> 00:48:02,792 If you were to solve protein folding, 1135 00:48:02,836 --> 00:48:05,665 then the potential to help solve problems like 1136 00:48:05,708 --> 00:48:09,669 Alzheimer's, dementia and drug discovery is huge. 1137 00:48:09,712 --> 00:48:11,671 Solving disease is probably 1138 00:48:11,714 --> 00:48:13,325 the most major impact we could have. 1139 00:48:13,368 --> 00:48:14,587 [CLICKS MOUSE] 1140 00:48:15,457 --> 00:48:16,676 Thousands of very smart people 1141 00:48:16,719 --> 00:48:18,765 have tried to solve protein folding. 1142 00:48:18,808 --> 00:48:20,985 I just think now is the right time 1143 00:48:21,028 --> 00:48:22,508 for AI to crack it. 1144 00:48:22,551 --> 00:48:24,292 [THRILLING MUSIC PLAYING] 1145 00:48:24,336 --> 00:48:26,512 [INDISTINCT CONVERSATION] 1146 00:48:26,555 --> 00:48:28,383 RICHARD EVANS: We needed a reasonable way 1147 00:48:28,427 --> 00:48:29,515 to apply machine learning 1148 00:48:29,558 --> 00:48:30,516 to the protein folding problem. 1149 00:48:30,559 --> 00:48:32,387 [CLICKING MOUSE] 1150 00:48:32,431 --> 00:48:35,173 We came across this Foldit game. 1151 00:48:35,216 --> 00:48:38,785 The goal is to move around this 3D model of a protein 1152 00:48:38,828 --> 00:48:41,701 and you get a score every time you move it. 1153 00:48:41,744 --> 00:48:43,050 The more accurate you make these structures, 1154 00:48:43,094 --> 00:48:45,531 the more useful they will be to biologists. 1155 00:48:46,227 --> 00:48:47,489 I spent a few days 1156 00:48:47,533 --> 00:48:48,838 just kind of seeing how well we could do. 1157 00:48:48,882 --> 00:48:50,623 [GAME DINGING] 1158 00:48:50,666 --> 00:48:52,407 We did reasonably well. 1159 00:48:52,451 --> 00:48:53,843 But even if you were 1160 00:48:53,887 --> 00:48:55,280 the world's best Foldit player, 1161 00:48:55,323 --> 00:48:57,499 you wouldn't solve protein folding. 1162 00:48:57,543 --> 00:48:59,501 That's why we had to move beyond the game. 1163 00:48:59,545 --> 00:49:00,720 HASSABIS: Games are always just 1164 00:49:00,763 --> 00:49:03,723 the proving ground for our algorithms. 1165 00:49:03,766 --> 00:49:07,727 The ultimate goal was not just to crack Go and StarCraft. 1166 00:49:07,770 --> 00:49:10,034 It was to crack real-world challenges. 1167 00:49:10,904 --> 00:49:13,037 [THRILLING MUSIC CONTINUES] 1168 00:49:16,127 --> 00:49:18,520 JOHN JUMPER: I remember hearing this rumor 1169 00:49:18,564 --> 00:49:21,175 that Demis was getting into proteins. 1170 00:49:21,219 --> 00:49:23,743 I talked to some people at DeepMind and I would ask, 1171 00:49:23,786 --> 00:49:25,049 "So are you doing protein folding?" 1172 00:49:25,092 --> 00:49:26,920 And they would artfully change the subject. 1173 00:49:26,964 --> 00:49:30,097 And when that happened twice, I pretty much figured it out. 1174 00:49:30,141 --> 00:49:32,839 So I thought I should submit a resume. 1175 00:49:32,882 --> 00:49:35,668 HASSABIS: All right, everyone, welcome to DeepMind. 
1176
00:49:35,711 --> 00:49:37,626
I know some of you, this may be your first week,
1177
00:49:37,670 --> 00:49:39,193
but I hope you all set...
1178
00:49:39,237 --> 00:49:40,890
JUMPER: The really appealing part for me about the job
1179
00:49:40,934 --> 00:49:42,675
was this, like, sense of connection
1180
00:49:42,718 --> 00:49:44,503
to the larger purpose.
1181
00:49:44,546 --> 00:49:45,591
HASSABIS: If we can crack
1182
00:49:45,634 --> 00:49:48,028
some fundamental problems in science,
1183
00:49:48,072 --> 00:49:49,160
many other people
1184
00:49:49,203 --> 00:49:50,900
and other companies and labs and so on
1185
00:49:50,944 --> 00:49:52,467
could build on top of our work.
1186
00:49:52,511 --> 00:49:53,816
This is your chance now
1187
00:49:53,860 --> 00:49:55,818
to add your chapter to this story.
1188
00:49:55,862 --> 00:49:57,298
JUMPER: When I arrived,
1189
00:49:57,342 --> 00:49:59,605
I was definitely [CHUCKLES] quite a bit nervous.
1190
00:49:59,648 --> 00:50:00,736
I'm still trying to keep...
1191
00:50:00,780 --> 00:50:02,912
I haven't taken any biology courses.
1192
00:50:02,956 --> 00:50:05,393
We haven't spent years of our lives
1193
00:50:05,437 --> 00:50:07,917
looking at these structures and understanding them.
1194
00:50:07,961 --> 00:50:09,963
We are just going off the data
1195
00:50:10,007 --> 00:50:11,269
and our machine learning models.
1196
00:50:12,705 --> 00:50:13,836
JUMPER: In machine learning,
1197
00:50:13,880 --> 00:50:15,795
you train a network like flashcards.
1198
00:50:15,838 --> 00:50:18,798
Here's the question. Here's the answer.
1199
00:50:18,841 --> 00:50:20,756
Here's the question. Here's the answer.
1200
00:50:20,800 --> 00:50:22,323
But in protein folding,
1201
00:50:22,367 --> 00:50:25,457
we're not doing the kind of standard task at DeepMind
1202
00:50:25,500 --> 00:50:28,155
where you have unlimited data.
1203
00:50:28,199 --> 00:50:30,766
Your job is to get better at chess or Go
1204
00:50:30,810 --> 00:50:32,986
and you can play as many games of chess or Go
1205
00:50:33,030 --> 00:50:34,814
as your computers will allow.
1206
00:50:35,510 --> 00:50:36,772
With proteins,
1207
00:50:36,816 --> 00:50:39,732
we're sitting on a very fixed size of data
1208
00:50:39,775 --> 00:50:41,995
that's been determined by a half century
1209
00:50:42,039 --> 00:50:46,478
of time-consuming experimental methods in laboratories.
1210
00:50:46,521 --> 00:50:49,698
These painstaking methods can take months or years
1211
00:50:49,742 --> 00:50:52,353
to determine a single protein structure,
1212
00:50:52,397 --> 00:50:55,791
and sometimes, a structure can never be determined.
1213
00:50:55,835 --> 00:50:57,271
[TYPING]
1214
00:50:57,315 --> 00:51:00,274
That's why we're working with such small datasets
1215
00:51:00,318 --> 00:51:02,233
to train our algorithms.
1216
00:51:02,276 --> 00:51:04,365
EWAN BIRNEY: When DeepMind started to explore
1217
00:51:04,409 --> 00:51:05,975
the folding problem,
1218
00:51:06,019 --> 00:51:07,977
they were talking to us about which datasets they were using
1219
00:51:08,021 --> 00:51:09,892
and what would be the possibilities
1220
00:51:09,936 --> 00:51:11,633
if they did solve this problem.
1221
00:51:12,373 --> 00:51:13,940
Many people have tried,
1222
00:51:13,983 --> 00:51:16,638
and yet no one on the planet has solved protein folding.
1223
00:51:16,682 --> 00:51:18,205
[CHUCKLES] I did think to myself,
1224
00:51:18,249 --> 00:51:19,946
"Well, you know, good luck."
1225
00:51:19,989 --> 00:51:22,731
JUMPER: If we can solve the protein folding problem,
1226
00:51:22,775 --> 00:51:25,691
it would have an incredible kind of medical relevance.
1227
00:51:25,734 --> 00:51:27,736
HASSABIS: This is the cycle of science.
1228
00:51:27,780 --> 00:51:30,043
You do a huge amount of exploration,
1229
00:51:30,087 --> 00:51:31,914
and then you go into exploitation mode,
1230
00:51:31,958 --> 00:51:33,568
and you focus and you see
1231
00:51:33,612 --> 00:51:35,353
how good are those ideas, really?
1232
00:51:35,396 --> 00:51:36,528
And there's nothing better
1233
00:51:36,571 --> 00:51:38,095
than external competition for that.
1234
00:51:39,835 --> 00:51:43,056
So we decided to enter the CASP competition.
1235
00:51:43,100 --> 00:51:47,234
We started CASP to try and speed up
1236
00:51:47,278 --> 00:51:49,802
the solution to the protein folding problem.
1237
00:51:49,845 --> 00:51:52,065
CASP is when we say,
1238
00:51:52,109 --> 00:51:54,372
"Look, DeepMind is doing protein folding,
1239
00:51:54,415 --> 00:51:55,590
"this is how good we are,
1240
00:51:55,634 --> 00:51:57,592
"and maybe it's better than everybody else.
1241
00:51:57,636 --> 00:51:58,680
"Maybe it isn't."
1242
00:51:58,724 --> 00:52:00,073
CASP is a bit like
1243
00:52:00,117 --> 00:52:02,075
the Olympic Games of protein folding.
1244
00:52:03,642 --> 00:52:06,035
CASP is a community-wide assessment
1245
00:52:06,079 --> 00:52:08,125
that's held every two years.
1246
00:52:09,561 --> 00:52:10,866
Teams are given
1247
00:52:10,910 --> 00:52:14,261
the amino acid sequences of about 100 proteins,
1248
00:52:14,305 --> 00:52:17,525
and then they try to solve this folding problem
1249
00:52:17,569 --> 00:52:20,833
using computational methods.
1250
00:52:20,876 --> 00:52:23,401
These proteins have already been determined
1251
00:52:23,444 --> 00:52:25,968
by experiments in a laboratory,
1252
00:52:26,012 --> 00:52:29,276
but have not yet been revealed publicly.
1253
00:52:29,320 --> 00:52:30,756
And these known structures
1254
00:52:30,799 --> 00:52:33,280
represent the gold standard against which
1255
00:52:33,324 --> 00:52:36,936
all the computational predictions will be compared.
1256
00:52:37,937 --> 00:52:39,330
MOULT: We've got a score
1257
00:52:39,373 --> 00:52:42,159
that measures the accuracy of the predictions.
1258
00:52:42,202 --> 00:52:44,726
And you would expect a score of over 90
1259
00:52:44,770 --> 00:52:47,425
to be a solution to the protein folding problem.
1260
00:52:47,468 --> 00:52:48,513
[INDISTINCT CHATTER]
1261
00:52:48,556 --> 00:52:50,036
MAN: Welcome, everyone,
1262
00:52:50,079 --> 00:52:52,038
to our first, uh, semifinals in the winners' bracket.
1263
00:52:52,081 --> 00:52:54,954
Nick and John versus Demis and Frank.
1264
00:52:54,997 --> 00:52:57,217
Please join us, come around. This will be an intense match.
1265
00:52:57,261 --> 00:52:59,306
STEVENS: When I learned that Demis was
1266
00:52:59,350 --> 00:53:02,048
going to tackle the protein folding issue,
1267
00:53:02,091 --> 00:53:04,790
um, I wasn't at all surprised.
1268
00:53:04,833 --> 00:53:06,792
It's very typical of Demis.
1269
00:53:06,835 --> 00:53:08,924
You know, he loves competition.
1270
00:53:08,968 --> 00:53:10,143
And that's the end
1271
00:53:10,187 --> 00:53:12,972
- of the first game, 10-7.
- [ALL CHEERING]
1272
00:53:13,015 --> 00:53:14,103
HASSABIS: The aim for CASP would be
1273
00:53:14,147 --> 00:53:15,975
to not just win the competition,
1274
00:53:16,018 --> 00:53:19,892
but sort of, um, retire the need for it.
1275
00:53:19,935 --> 00:53:23,417
So, 20 targets total have been released by CASP.
1276
00:53:23,461 --> 00:53:24,810
JUMPER: We were thinking maybe
1277
00:53:24,853 --> 00:53:26,899
throw in the standard kind of machine learning
1278
00:53:26,942 --> 00:53:28,814
and see how far that could take us.
1279
00:53:28,857 --> 00:53:30,729
Instead of having a couple of days on an experiment,
1280
00:53:30,772 --> 00:53:33,558
we can turn around five experiments a day.
1281
00:53:33,601 --> 00:53:35,342
Great. Well done, everyone.
1282
00:53:35,386 --> 00:53:36,735
[TYPING]
1283
00:53:36,778 --> 00:53:38,693
Can you show me the real one instead of ours?
1284
00:53:38,737 --> 00:53:39,781
MAN 1: The true answer is
1285
00:53:39,825 --> 00:53:42,044
supposed to look something like that.
1286
00:53:42,088 --> 00:53:45,047
MAN 2: It's a lot more cylindrical than I thought.
1287
00:53:45,091 --> 00:53:47,398
JUMPER: The results were not very good.
1288
00:53:47,441 --> 00:53:48,703
Okay.
1289
00:53:48,747 --> 00:53:49,835
JUMPER: We throw all the obvious ideas at it
1290
00:53:49,878 --> 00:53:51,793
and the problem laughs at you.
1291
00:53:52,881 --> 00:53:54,361
This makes no sense.
1292
00:53:54,405 --> 00:53:56,015
EVANS: We thought we could just throw
1293
00:53:56,058 --> 00:53:58,322
some of our best algorithms at the problem.
1294
00:53:59,366 --> 00:54:00,976
We were slightly naive.
1295
00:54:01,020 --> 00:54:02,282
JUMPER: We should be learning this,
1296
00:54:02,326 --> 00:54:04,284
you know, in the blink of an eye.
1297
00:54:05,372 --> 00:54:06,982
The thing I'm worried about is,
1298
00:54:07,026 --> 00:54:08,375
we take the field from
1299
00:54:08,419 --> 00:54:10,899
really bad answers to moderately bad answers.
1300
00:54:10,943 --> 00:54:13,946
I feel like we need some sort of new technology
1301
00:54:13,989 --> 00:54:15,164
for moving around these things.
1302
00:54:15,208 --> 00:54:17,341
[THRILLING MUSIC CONTINUES]
1303
00:54:20,431 --> 00:54:22,215
HASSABIS: With only a week left of CASP,
1304
00:54:22,259 --> 00:54:24,348
it's now a sprint to get it deployed.
1305
00:54:24,391 --> 00:54:25,349
[MUSIC FADES]
1306
00:54:26,654 --> 00:54:28,090
You've done your best.
1307
00:54:28,134 --> 00:54:29,875
Then there's nothing more you can do
1308
00:54:29,918 --> 00:54:32,399
but wait for CASP to deliver the results.
1309
00:54:32,443 --> 00:54:34,401
[HOPEFUL MUSIC PLAYING]
1310
00:54:52,593 --> 00:54:53,725
This famous thing of Einstein,
1311
00:54:53,768 --> 00:54:55,030
the last couple of years of his life,
1312
00:54:55,074 --> 00:54:57,381
when he was here, he overlapped with Kurt Gödel
1313
00:54:57,424 --> 00:54:59,774
and he said one of the reasons he still comes in to work
1314
00:54:59,818 --> 00:55:01,646
is so that he gets to walk home
1315
00:55:01,689 --> 00:55:03,517
and discuss things with Gödel.
1316
00:55:03,561 --> 00:55:05,911
It's a pretty big compliment for Kurt Gödel,
1317
00:55:05,954 --> 00:55:07,478
shows you how amazing he was.
1318
00:55:09,131 --> 00:55:10,568
MAN: The Institute for Advanced Study
1319
00:55:10,611 --> 00:55:12,918
was formed in 1933.
1320
00:55:12,961 --> 00:55:14,223
In the early years,
1321
00:55:14,267 --> 00:55:16,487
the intense scientific atmosphere attracted
1322
00:55:16,530 --> 00:55:19,359
some of the most brilliant mathematicians and physicists
1323
00:55:19,403 --> 00:55:22,536
ever concentrated in a single place and time.
1324
00:55:22,580 --> 00:55:24,582
HASSABIS: The founding principle of this place,
1325
00:55:24,625 --> 00:55:28,020
it's the idea of unfettered intellectual pursuits,
1326
00:55:28,063 --> 00:55:30,152
even if you don't know what you're exploring,
1327
00:55:30,196 --> 00:55:32,154
will result in some cool things,
1328
00:55:32,198 --> 00:55:34,896
and sometimes that then ends up being useful,
1329
00:55:34,940 --> 00:55:36,376
which, of course,
1330
00:55:36,420 --> 00:55:37,986
is partially what I've been trying to do at DeepMind.
1331
00:55:38,030 --> 00:55:39,988
How many big breakthroughs do you think are required
1332
00:55:40,032 --> 00:55:41,686
to get all the way to AGI?
1333
00:55:41,729 --> 00:55:43,078
And, you know, I estimate maybe
1334
00:55:43,122 --> 00:55:44,341
there's about a dozen of those.
1335
00:55:44,384 --> 00:55:46,125
You know, I hope it's within my lifetime.
1336
00:55:46,168 --> 00:55:47,605
- Yes, okay. - HASSABIS: But then,
1337
00:55:47,648 --> 00:55:49,171
all scientists hope that, right?
1338
00:55:49,215 --> 00:55:51,130
EMCEE: Demis has many accolades.
1339
00:55:51,173 --> 00:55:54,002
He was elected a Fellow of the Royal Society last year.
1340
00:55:54,046 --> 00:55:55,961
He is also a Fellow of the Royal Society of Arts.
1341
00:55:56,004 --> 00:55:57,528
A big hand for Demis Hassabis.
1342
00:56:02,968 --> 00:56:04,186
[MUSIC FADES]
1343
00:56:04,230 --> 00:56:05,927
HASSABIS: My dream has always been to try
1344
00:56:05,971 --> 00:56:08,103
and make AI-assisted science possible.
1345
00:56:08,147 --> 00:56:09,235
And what I think is
1346
00:56:09,278 --> 00:56:11,150
our most exciting project, last year,
1347
00:56:11,193 --> 00:56:13,152
which is our work in protein folding.
1348
00:56:13,195 --> 00:56:15,328
Uh, and we call this system AlphaFold.
1349
00:56:15,372 --> 00:56:18,331
We entered it into CASP and our system, uh,
1350
00:56:18,375 --> 00:56:20,507
was the most accurate, uh, predicting structures
1351
00:56:20,551 --> 00:56:24,946
for 25 out of the 43 proteins in the hardest category.
1352
00:56:24,990 --> 00:56:26,208
So we're state of the art,
1353
00:56:26,252 --> 00:56:27,514
but we still... I have to make... be clear,
1354
00:56:27,558 --> 00:56:28,559
we're still a long way from
1355
00:56:28,602 --> 00:56:30,474
solving the protein folding problem.
1356
00:56:30,517 --> 00:56:31,866
We're working hard on this, though,
1357
00:56:31,910 --> 00:56:33,738
and we're exploring many other techniques.
1358
00:56:33,781 --> 00:56:35,479
[SOMBER MUSIC PLAYING]
1359
00:56:49,188 --> 00:56:50,232
Let's get started.
1360
00:56:50,276 --> 00:56:53,148
JUMPER: So kind of a rapid debrief,
1361
00:56:53,192 --> 00:56:55,455
these are our final rankings for CASP.
1362
00:56:56,500 --> 00:56:57,544
HASSABIS: We beat the second team
1363
00:56:57,588 --> 00:57:00,155
in this competition by nearly 50%,
1364
00:57:00,199 --> 00:57:01,592
but we've still got a long way to go
1365
00:57:01,635 --> 00:57:04,333
before we've solved the protein folding problem
1366
00:57:04,377 --> 00:57:07,032
in a sense that a biologist could use it.
1367
00:57:07,075 --> 00:57:08,990
JUMPER: It is an area of concern.
1368
00:57:11,602 --> 00:57:14,213
JANET THORNTON: The quality of predictions varied
1369
00:57:14,256 --> 00:57:16,737
and they were no more useful than the previous methods.
1370
00:57:16,781 --> 00:57:19,914
PAUL NURSE: AlphaFold didn't produce good enough data
1371
00:57:19,958 --> 00:57:22,526
for it to be useful in a practical way
1372
00:57:22,569 --> 00:57:24,005
to, say, somebody like me
1373
00:57:24,049 --> 00:57:28,227
investigating my own biological problems.
1374
00:57:28,270 --> 00:57:30,316
JUMPER: That was kind of a humbling moment
1375
00:57:30,359 --> 00:57:32,753
'cause we thought we'd worked very hard and succeeded.
1376
00:57:32,797 --> 00:57:34,886
And what we'd found is we were the best in the world
1377
00:57:34,929 --> 00:57:36,453
at a problem the world's not good at.
1378
00:57:37,671 --> 00:57:38,933
We knew we sucked.
1379
00:57:38,977 --> 00:57:40,413
[INDISTINCT CHATTER]
1380
00:57:40,457 --> 00:57:42,328
JUMPER: It doesn't help if you have the tallest ladder
1381
00:57:42,371 --> 00:57:44,635
when you're going to the moon.
1382
00:57:44,678 --> 00:57:47,115
HASSABIS: The opinion of quite a few people on the team
1383
00:57:47,159 --> 00:57:51,468
was that this is sort of a fool's errand in some ways.
1384
00:57:51,511 --> 00:57:54,079
And I might have been wrong with protein folding.
1385
00:57:54,122 --> 00:57:55,559
Maybe it's too hard still
1386
00:57:55,602 --> 00:57:58,431
for where we're at generally with AI.
1387
00:57:58,475 --> 00:58:01,173
If you want to do biological research,
1388
00:58:01,216 --> 00:58:03,044
you have to be prepared to fail
1389
00:58:03,088 --> 00:58:06,570
because biology is very complicated.
1390
00:58:06,613 --> 00:58:09,790
I've run a laboratory for nearly 50 years,
1391
00:58:09,834 --> 00:58:11,096
and half my time,
1392
00:58:11,139 --> 00:58:12,619
I'm just an amateur psychiatrist
1393
00:58:12,663 --> 00:58:18,103
to keep, um, my colleagues cheerful when nothing works.
1394
00:58:18,146 --> 00:58:22,542
And quite a lot of the time, and I mean 80, 90%,
1395
00:58:22,586 --> 00:58:24,413
it does not work.
1396
00:58:24,457 --> 00:58:26,720
If you are at the forefront of science,
1397
00:58:26,764 --> 00:58:30,115
I can tell you, you will fail a great deal.
1398
00:58:32,465 --> 00:58:33,466
[CLICKS MOUSE]
1399
00:58:35,163 --> 00:58:37,165
HASSABIS: I just felt disappointed.
1400
00:58:38,689 --> 00:58:41,605
Lesson I learned is that ambition is a good thing,
1401
00:58:41,648 --> 00:58:43,694
but you need to get the timing right.
1402
00:58:43,737 --> 00:58:46,784
There's no point being 50 years ahead of your time.
1403
00:58:46,827 --> 00:58:48,133
You will never survive
1404
00:58:48,176 --> 00:58:49,917
50 years of that kind of endeavor
1405
00:58:49,961 --> 00:58:51,963
before it yields something.
1406
00:58:52,006 --> 00:58:53,268
You'll literally die trying.
1407
00:58:53,312 --> 00:58:55,096
[TENSE MUSIC PLAYING]
1408
00:59:08,936 --> 00:59:11,286
CUKIER: When we talk about AGI,
1409
00:59:11,330 --> 00:59:14,376
the holy grail of artificial intelligence,
1410
00:59:14,420 --> 00:59:15,508
it becomes really difficult
1411
00:59:15,552 --> 00:59:17,815
to know what we're even talking about.
1412
00:59:17,858 --> 00:59:19,643
HASSABIS: Which bits are we gonna see today?
1413
00:59:19,686 --> 00:59:21,645
MAN: We're going to start in the garden.
1414
00:59:21,688 --> 00:59:23,081
[MACHINE BEEPS]
1415
00:59:23,124 --> 00:59:25,649
This is the garden looking from the observation area.
1416 00:59:25,692 --> 00:59:27,433 Research scientists and engineers 1417 00:59:27,476 --> 00:59:30,871 can analyze and collaborate and evaluate 1418 00:59:30,915 --> 00:59:33,004 what's going on in real time. 1419 00:59:33,047 --> 00:59:34,614 CUKIER: So in the 1800s, 1420 00:59:34,658 --> 00:59:37,008 we'd think of things like television and the submarine 1421 00:59:37,051 --> 00:59:38,139 or a rocket ship to the moon 1422 00:59:38,183 --> 00:59:40,228 and say these things are impossible. 1423 00:59:40,272 --> 00:59:41,490 Yet Jules Verne wrote about them and, 1424 00:59:41,534 --> 00:59:44,406 a century and a half later, they happened. 1425 00:59:44,450 --> 00:59:45,451 HASSABIS: We'll be experimenting 1426 00:59:45,494 --> 00:59:47,888 on civilizations really, 1427 00:59:47,932 --> 00:59:50,587 civilizations of AI agents. 1428 00:59:50,630 --> 00:59:52,719 Once the experiments start going, 1429 00:59:52,763 --> 00:59:54,242 it's going to be the most exciting thing ever. 1430 00:59:54,286 --> 00:59:56,984 - So how will we get sleep? - [MAN LAUGHS] 1431 00:59:57,028 --> 00:59:58,682 I won't be able to sleep. 1432 00:59:58,725 --> 01:00:00,684 LEGG: Full AGI will be able to do 1433 01:00:00,727 --> 01:00:03,861 any cognitive task a person can do. 1434 01:00:03,904 --> 01:00:08,387 It will be at a scale, potentially, far beyond that. 1435 01:00:08,430 --> 01:00:10,302 STUART RUSSELL: It's really impossible for us 1436 01:00:10,345 --> 01:00:14,828 to imagine the outputs of a superintelligent entity. 1437 01:00:14,872 --> 01:00:18,963 It's like asking a gorilla to imagine, you know, 1438 01:00:19,006 --> 01:00:20,181 what Einstein does 1439 01:00:20,225 --> 01:00:23,402 when he produces the theory of relativity. 1440 01:00:23,445 --> 01:00:25,491 LEGG: People often ask me these questions like, 1441 01:00:25,534 --> 01:00:29,495 "What happens if you're wrong, and AGI is quite far away?" 1442 01:00:29,538 --> 01:00:31,453 And I'm like, I never worry about that. 1443 01:00:31,497 --> 01:00:33,847 I actually worry about the reverse. 1444 01:00:33,891 --> 01:00:37,242 I actually worry that it's coming faster 1445 01:00:37,285 --> 01:00:39,723 than we can really prepare for. 1446 01:00:39,766 --> 01:00:42,029 [ROBOTIC ARM WHIRRING] 1447 01:00:42,073 --> 01:00:45,859 HADSELL: It really feels like we're in a race to AGI. 1448 01:00:45,903 --> 01:00:49,907 The prototypes and the models that we are developing now 1449 01:00:49,950 --> 01:00:51,822 are actually transforming 1450 01:00:51,865 --> 01:00:54,215 the space of what we know about intelligence. 1451 01:00:54,259 --> 01:00:57,305 [WHIRRING] 1452 01:00:57,349 --> 01:00:58,785 LEGG: Recently, we've had agents 1453 01:00:58,829 --> 01:01:00,047 that are powerful enough 1454 01:01:00,091 --> 01:01:03,442 to actually start playing games in teams, 1455 01:01:03,485 --> 01:01:06,140 then competing against other teams. 1456 01:01:06,184 --> 01:01:08,795 We're seeing co-operative social dynamics 1457 01:01:08,839 --> 01:01:10,492 coming out of agents 1458 01:01:10,536 --> 01:01:13,321 where we haven't pre-programmed in 1459 01:01:13,365 --> 01:01:15,584 any of these sorts of dynamics. 1460 01:01:15,628 --> 01:01:19,240 It's completely learned from their own experiences. 1461 01:01:20,807 --> 01:01:23,288 When we started, we thought we were 1462 01:01:23,331 --> 01:01:25,725 out to build an intelligence system 1463 01:01:25,769 --> 01:01:28,336 and convince the world that we'd done it. 
1464
01:01:28,380 --> 01:01:29,947
We're now starting to wonder whether
1465
01:01:29,990 --> 01:01:31,296
we're gonna build systems
1466
01:01:31,339 --> 01:01:32,906
that we're not convinced are fully intelligent,
1467
01:01:32,950 --> 01:01:34,691
and we're trying to convince the world that they're not.
1468
01:01:34,734 --> 01:01:35,779
[CHUCKLES]
1469
01:01:35,822 --> 01:01:36,780
[CELL PHONE DINGS]
1470
01:01:38,651 --> 01:01:40,000
Hi, Alpha.
1471
01:01:40,044 --> 01:01:41,523
ALPHA: Hello there.
1472
01:01:41,567 --> 01:01:43,917
LOVE: Where are we today?
1473
01:01:43,961 --> 01:01:46,659
You're at the Museum of Modern Art in New York City.
1474
01:01:48,400 --> 01:01:53,013
Kind of. Um, what painting is this?
1475
01:01:53,057 --> 01:01:55,494
This is The Creation of Adam by Michelangelo.
1476
01:01:55,537 --> 01:01:58,410
I don't think that painting is in New York City.
1477
01:01:58,453 --> 01:02:01,543
You are right. It's in the Vatican City.
1478
01:02:01,587 --> 01:02:02,806
LOVE: Do you think that's where we are?
1479
01:02:04,677 --> 01:02:05,809
ALPHA: No.
1480
01:02:05,852 --> 01:02:07,593
LOVE: Then how am I looking at this?
1481
01:02:10,509 --> 01:02:11,640
ALPHA: You're looking at
1482
01:02:11,684 --> 01:02:12,946
a digital copy of the painting.
1483
01:02:12,990 --> 01:02:13,991
Yes.
1484
01:02:15,035 --> 01:02:16,341
MAN 1: Who lives in this house?
1485
01:02:16,384 --> 01:02:19,170
[EXCITING MUSIC PLAYING]
1486
01:02:19,213 --> 01:02:20,345
ALPHA: An astronaut.
1487
01:02:26,046 --> 01:02:28,005
What's this astronaut's name?
1488
01:02:28,048 --> 01:02:29,397
ALPHA: His name is Mark.
1489
01:02:29,441 --> 01:02:32,052
This is a famous equation by Albert Einstein.
1490
01:02:32,096 --> 01:02:35,577
It shows that energy and mass are equivalent.
1491
01:02:35,621 --> 01:02:37,666
How would you explain it to a five-year-old?
1492
01:02:40,931 --> 01:02:43,542
ALPHA: Well, imagine you have a big rock,
1493
01:02:43,585 --> 01:02:45,326
and you want to know how much it weighs.
1494
01:02:45,370 --> 01:02:47,589
MAN 2: Is Mark in danger?
1495
01:02:47,633 --> 01:02:49,722
ALPHA: No. Mark is not in danger.
1496
01:02:49,766 --> 01:02:50,767
Why is that?
1497
01:02:54,422 --> 01:02:56,163
ALPHA: Because the dinosaur is not real.
1498
01:02:56,207 --> 01:02:57,295
[MEN CHUCKLING]
1499
01:02:57,338 --> 01:02:58,470
LOVE: All right.
1500
01:02:58,513 --> 01:03:00,515
Why could you not answer my question?
1501
01:03:00,559 --> 01:03:01,908
ALPHA: I think I couldn't answer your question
1502
01:03:01,952 --> 01:03:04,041
because I don't know enough about the subject.
1503
01:03:04,911 --> 01:03:06,130
Okay.
1504
01:03:06,173 --> 01:03:08,654
How could you learn more about the subject?
1505
01:03:08,697 --> 01:03:09,960
ALPHA: I can learn more about the subject
1506
01:03:10,003 --> 01:03:12,876
by reading books, articles, and websites.
1507
01:03:12,919 --> 01:03:14,268
You can teach me.
1508
01:03:14,312 --> 01:03:15,835
HASSABIS: My conclusion is,
1509
01:03:15,879 --> 01:03:17,619
if you now think about what we're doing,
1510
01:03:17,663 --> 01:03:20,057
is learning from all humans,
1511
01:03:20,100 --> 01:03:22,581
all their knowledge at once put on the Internet,
1512
01:03:22,624 --> 01:03:24,931
you would actually know a lot about the world.
1513
01:03:24,975 --> 01:03:27,716
Like a significant portion of everything humans can do.
1514
01:03:27,760 --> 01:03:29,631
And now, I think it's more like,
1515
01:03:29,675 --> 01:03:30,937
"Well, it might just work."
1516 01:03:30,981 --> 01:03:33,113 This is a big moment. 1517 01:03:33,157 --> 01:03:34,636 Who is this? 1518 01:03:34,680 --> 01:03:36,551 ALPHA: This is God reaching out to Adam. 1519 01:03:36,595 --> 01:03:37,988 RUSSELL: The advent of AGI 1520 01:03:38,031 --> 01:03:40,860 will divide human history into two parts. 1521 01:03:40,904 --> 01:03:43,950 The part up to that point and the part after that point. 1522 01:03:43,994 --> 01:03:47,258 LOVE: Why is he reaching out to touch Adam? 1523 01:03:47,301 --> 01:03:48,912 ALPHA: God is reaching out to touch Adam 1524 01:03:48,955 --> 01:03:50,217 to give him life. 1525 01:03:50,261 --> 01:03:52,219 RUSSELL: It will give us a tool 1526 01:03:52,263 --> 01:03:57,529 that can completely reinvent our entire civilization. 1527 01:03:57,572 --> 01:03:59,400 LOVE: What does this painting mean to you? 1528 01:04:01,750 --> 01:04:03,752 ALPHA: The painting means a lot to me. 1529 01:04:03,796 --> 01:04:04,797 Okay. Like what? 1530 01:04:09,062 --> 01:04:10,281 [MUSIC FADES] 1531 01:04:10,324 --> 01:04:11,456 ALPHA: I think the painting is a reminder 1532 01:04:11,499 --> 01:04:12,674 that we are all connected to each other 1533 01:04:12,718 --> 01:04:13,937 and that we are all part of something 1534 01:04:13,980 --> 01:04:15,112 bigger than ourselves. 1535 01:04:16,461 --> 01:04:17,766 That's pretty nice. 1536 01:04:19,029 --> 01:04:21,379 LEGG: When you cross that barrier of 1537 01:04:21,422 --> 01:04:23,947 "AGI might happen one day in the future" 1538 01:04:23,990 --> 01:04:26,645 to "No, actually, this could really happen in a time frame 1539 01:04:26,688 --> 01:04:28,690 "that is sort of, like, on my watch, you know," 1540 01:04:28,734 --> 01:04:30,475 something changes in your thinking. 1541 01:04:30,518 --> 01:04:32,694 MAN: ...learned to orient itself by looking... 1542 01:04:32,738 --> 01:04:35,045 HASSABIS: We have to be careful with how we use it 1543 01:04:35,088 --> 01:04:37,177 and thoughtful about how we deploy it. 1544 01:04:37,221 --> 01:04:39,788 [GRIPPING MUSIC BUILDING] 1545 01:04:39,832 --> 01:04:41,138 HASSABIS: You'd have to consider 1546 01:04:41,181 --> 01:04:42,487 what's its top level goal. 1547 01:04:42,530 --> 01:04:45,011 If it's to keep humans happy, 1548 01:04:45,055 --> 01:04:48,928 which set of humans? What does happiness mean? 1549 01:04:48,972 --> 01:04:52,018 A lot of our collective goals are very tricky, 1550 01:04:52,062 --> 01:04:54,891 even for humans to figure out. 1551 01:04:54,934 --> 01:04:58,503 CUKIER: Technology always embeds our values. 1552 01:04:58,546 --> 01:05:01,680 It's not just technical, it's ethical as well. 1553 01:05:01,723 --> 01:05:02,899 So we've got to be really cautious 1554 01:05:02,942 --> 01:05:04,291 about what we're building into it. 1555 01:05:04,335 --> 01:05:06,076 MAN: We're trying to find a single algorithm which... 1556 01:05:06,119 --> 01:05:07,816 SILVER: The reality is that this is an algorithm 1557 01:05:07,860 --> 01:05:11,037 that has been created by people, by us. 1558 01:05:11,081 --> 01:05:13,213 You know, what does it mean to endow our agents 1559 01:05:13,257 --> 01:05:15,607 with the same kind of values that we hold dear? 1560 01:05:15,650 --> 01:05:17,652 What is the purpose of making these AI systems 1561 01:05:17,696 --> 01:05:19,045 appear so humanlike 1562 01:05:19,089 --> 01:05:20,742 so that they do capture hearts and minds 1563 01:05:20,786 --> 01:05:21,961 because they're kind of 1564 01:05:22,005 --> 01:05:24,703 exploiting a human vulnerability also? 
1565 01:05:24,746 --> 01:05:26,531 The heart and mind of these systems 1566 01:05:26,574 --> 01:05:28,054 are very much human-generated data... 1567 01:05:28,098 --> 01:05:29,055 WOMAN: Mmm-hmm. 1568 01:05:29,099 --> 01:05:30,491 ...for all the good and the bad. 1569 01:05:30,535 --> 01:05:32,015 LEVI: There is a parallel 1570 01:05:32,058 --> 01:05:34,017 between the Industrial Revolution, 1571 01:05:34,060 --> 01:05:36,758 which was an incredible moment of displacement 1572 01:05:36,802 --> 01:05:42,373 and the current technological change created by AI. 1573 01:05:42,416 --> 01:05:43,722 [CHANTING] Pause AI! 1574 01:05:43,765 --> 01:05:45,724 LEVI: We have to think about who's displaced 1575 01:05:45,767 --> 01:05:48,596 and how we're going to support them. 1576 01:05:48,640 --> 01:05:50,076 This technology is coming a lot sooner, 1577 01:05:50,120 --> 01:05:52,426 uh, than really the world knows or kind of 1578 01:05:52,470 --> 01:05:55,908 even we 18, 24 months ago thought. 1579 01:05:55,952 --> 01:05:57,257 So there's a tremendous opportunity, 1580 01:05:57,301 --> 01:05:58,389 tremendous excitement, 1581 01:05:58,432 --> 01:06:00,391 but also tremendous responsibility. 1582 01:06:00,434 --> 01:06:01,740 It's happening so fast. 1583 01:06:02,654 --> 01:06:04,003 How will we govern it? 1584 01:06:05,135 --> 01:06:06,223 How will we decide 1585 01:06:06,266 --> 01:06:08,181 what is okay and what is not okay? 1586 01:06:08,225 --> 01:06:10,923 AI-generated images are getting more sophisticated. 1587 01:06:10,967 --> 01:06:14,535 RUSSELL: The use of AI for generating disinformation 1588 01:06:14,579 --> 01:06:17,016 and manipulating human psychology 1589 01:06:17,060 --> 01:06:20,237 is only going to get much, much worse. 1590 01:06:21,194 --> 01:06:22,587 LEGG: AGI is coming, 1591 01:06:22,630 --> 01:06:24,632 whether we do it here at DeepMind or not. 1592 01:06:25,459 --> 01:06:26,765 CUKIER: It's gonna happen, 1593 01:06:26,808 --> 01:06:29,028 so we better create institutions to protect us. 1594 01:06:29,072 --> 01:06:30,595 It's gonna require global coordination. 1595 01:06:30,638 --> 01:06:32,727 And I worry that humanity is 1596 01:06:32,771 --> 01:06:35,382 increasingly getting worse at that rather than better. 1597 01:06:35,426 --> 01:06:37,123 LEGG: We need a lot more people 1598 01:06:37,167 --> 01:06:40,039 really taking this seriously and thinking about this. 1599 01:06:40,083 --> 01:06:42,999 It's, yeah, it's serious. It worries me. 1600 01:06:44,043 --> 01:06:45,871 It worries me. Yeah. 1601 01:06:45,914 --> 01:06:48,613 RUSSELL: If you received an email saying 1602 01:06:48,656 --> 01:06:50,832 this superior alien civilization 1603 01:06:50,876 --> 01:06:52,791 is going to arrive on Earth, 1604 01:06:52,834 --> 01:06:54,575 there would be emergency meetings 1605 01:06:54,619 --> 01:06:56,273 of all the governments. 1606 01:06:56,316 --> 01:06:58,144 We would go into overdrive 1607 01:06:58,188 --> 01:07:00,103 trying to figure out how to prepare. 1608 01:07:00,146 --> 01:07:01,626 - [MUSIC FADES] - [BELL TOLLING FAINTLY] 1609 01:07:01,669 --> 01:07:03,976 The arrival of AGI will be 1610 01:07:04,020 --> 01:07:06,935 the most important moment that we have ever faced. 1611 01:07:06,979 --> 01:07:09,155 [BELL CONTINUES TOLLING FAINTLY] 1612 01:07:14,378 --> 01:07:17,555 HASSABIS: My dream was that on the way to AGI, 1613 01:07:17,598 --> 01:07:20,688 we would create revolutionary technologies 1614 01:07:20,732 --> 01:07:23,082 that would be of use to humanity. 
1615
01:07:23,126 --> 01:07:25,171
That's what I wanted with AlphaFold.

1616
01:07:26,694 --> 01:07:28,653
I think it's more important than ever

1617
01:07:28,696 --> 01:07:31,047
that we should solve the protein folding problem.

1618
01:07:32,004 --> 01:07:34,224
This is gonna be really hard,

1619
01:07:34,267 --> 01:07:36,791
but I won't give up until it's done.

1620
01:07:36,835 --> 01:07:37,879
You know, we need to double down

1621
01:07:37,923 --> 01:07:40,317
and go as fast as possible from here.

1622
01:07:40,360 --> 01:07:41,796
I think we've got no time to lose.

1623
01:07:41,840 --> 01:07:45,757
So we are going to make a protein folding strike team.

1624
01:07:45,800 --> 01:07:47,541
Team lead for the strike team will be John.

1625
01:07:47,585 --> 01:07:48,673
Yeah, we've seen Alpha...

1626
01:07:48,716 --> 01:07:50,283
You know, we're gonna try everything,

1627
01:07:50,327 --> 01:07:51,328
kitchen sink, the whole lot.

1628
01:07:52,198 --> 01:07:53,330
CASP14 is about

1629
01:07:53,373 --> 01:07:55,158
proving we can solve the whole problem.

1630
01:07:56,333 --> 01:07:57,725
And I felt that to do that,

1631
01:07:57,769 --> 01:08:00,337
we would need to incorporate some domain knowledge.

1632
01:08:00,380 --> 01:08:01,860
[EXCITING MUSIC PLAYING]

1633
01:08:01,903 --> 01:08:03,731
We had some fantastic engineers on it,

1634
01:08:03,775 --> 01:08:05,733
but they were not trained in biology.

1635
01:08:08,475 --> 01:08:10,260
KATHRYN TUNYASUVUNAKOOL: As a computational biologist,

1636
01:08:10,303 --> 01:08:12,131
when I initially joined the AlphaFold team,

1637
01:08:12,175 --> 01:08:14,220
I didn't immediately feel confident about anything.

1638
01:08:14,264 --> 01:08:15,352
[CHUCKLES] You know,

1639
01:08:15,395 --> 01:08:17,223
whether we were gonna be successful.

1640
01:08:17,267 --> 01:08:21,097
Biology is so ridiculously complicated.

1641
01:08:21,140 --> 01:08:25,101
It just felt like this very far-off mountain to climb.

1642
01:08:25,144 --> 01:08:26,754
MAN: I'm starting to play with the underlying temperatures

1643
01:08:26,798 --> 01:08:27,973
to see if we can get...

1644
01:08:28,016 --> 01:08:29,148
As one of the few people on the team

1645
01:08:29,192 --> 01:08:31,846
who's done work in biology before,

1646
01:08:31,890 --> 01:08:34,849
you feel this huge sense of responsibility.

1647
01:08:34,893 --> 01:08:36,112
"We're expecting you to do

1648
01:08:36,155 --> 01:08:37,678
"great things on this strike team."

1649
01:08:37,722 --> 01:08:38,897
That's terrifying.

1650
01:08:40,464 --> 01:08:42,727
But one of the reasons why I wanted to come here

1651
01:08:42,770 --> 01:08:45,556
was to do something that matters.

1652
01:08:45,599 --> 01:08:48,472
This is the number of missing things.

1653
01:08:48,515 --> 01:08:49,951
What about making use

1654
01:08:49,995 --> 01:08:52,563
of whatever understanding you have of physics?

1655
01:08:52,606 --> 01:08:54,391
Using that as a source of data?

1656
01:08:54,434 --> 01:08:55,479
But if it's systematic...

1657
01:08:55,522 --> 01:08:56,784
Then, that can't be right, though.

1658
01:08:56,828 --> 01:08:58,308
If it's systematically wrong in some weird way,

1659
01:08:58,351 --> 01:09:01,224
you might be learning that systematically wrong physics.

1660
01:09:01,267 --> 01:09:02,355
The team is already

1661
01:09:02,399 --> 01:09:04,749
trying to think of multiple ways that...

1662
01:09:04,792 --> 01:09:06,229
TUNYASUVUNAKOOL: Biological relevance

1663
01:09:06,272 --> 01:09:07,795
is what we're going for.
1664
01:09:09,057 --> 01:09:11,364
So we rewrote the whole data pipeline

1665
01:09:11,408 --> 01:09:13,279
that AlphaFold uses to learn.

1666
01:09:13,323 --> 01:09:15,586
HASSABIS: You can't force the creative phase.

1667
01:09:15,629 --> 01:09:18,241
You have to give it space for those flowers to bloom.

1668
01:09:19,242 --> 01:09:20,286
We won CASP.

1669
01:09:20,330 --> 01:09:22,070
Then it was back to the drawing board

1670
01:09:22,114 --> 01:09:24,116
and like, what are our new ideas?

1671
01:09:24,160 --> 01:09:26,945
Um, and then it's taken a little while, I would say,

1672
01:09:26,988 --> 01:09:28,686
for them to get back to where they were,

1673
01:09:28,729 --> 01:09:30,340
but with the new ideas.

1674
01:09:30,383 --> 01:09:31,515
And then now I think

1675
01:09:31,558 --> 01:09:33,952
we're seeing the benefits of the new ideas.

1676
01:09:33,995 --> 01:09:35,736
They can go further, right?

1677
01:09:35,780 --> 01:09:38,130
So, um, that's a really important moment.

1678
01:09:38,174 --> 01:09:40,959
I've seen that moment so many times now,

1679
01:09:41,002 --> 01:09:42,613
but I know what that means now.

1680
01:09:42,656 --> 01:09:44,484
And I know this is the time now to press.

1681
01:09:44,528 --> 01:09:45,877
[EXCITING MUSIC CONTINUES]

1682
01:09:45,920 --> 01:09:48,009
JUMPER: Adding side-chains improves direct folding.

1683
01:09:48,053 --> 01:09:49,663
That drove a lot of the progress.

1684
01:09:49,707 --> 01:09:51,012
- We'll talk about that.
- Great.

1685
01:09:51,056 --> 01:09:54,799
The last four months, we've made enormous gains.

1686
01:09:54,842 --> 01:09:56,453
EVANS: During CASP13,

1687
01:09:56,496 --> 01:09:59,499
it would take us a day or two to fold one of the proteins,

1688
01:09:59,543 --> 01:10:01,762
and now we're folding, like,

1689
01:10:01,806 --> 01:10:03,938
hundreds of thousands a second.

1690
01:10:03,982 --> 01:10:05,636
Yeah, it's just insane. [CHUCKLES]

1691
01:10:05,679 --> 01:10:06,985
KAVUKCUOGLU: Now, this is a model

1692
01:10:07,028 --> 01:10:09,901
that is orders of magnitude faster,

1693
01:10:09,944 --> 01:10:12,251
while at the same time being better.

1694
01:10:12,295 --> 01:10:13,644
We're getting a lot of structures

1695
01:10:13,687 --> 01:10:15,254
into the high-accuracy regime.

1696
01:10:15,298 --> 01:10:17,517
We're rapidly improving to a system

1697
01:10:17,561 --> 01:10:18,823
that is starting to really

1698
01:10:18,866 --> 01:10:20,477
get at the core and heart of the problem.

1699
01:10:20,520 --> 01:10:21,695
HASSABIS: It's great work.

1700
01:10:21,739 --> 01:10:23,088
It looks like we're in good shape.

1701
01:10:23,131 --> 01:10:26,222
So we got, what, six, five weeks left? Six weeks?

1702
01:10:26,265 --> 01:10:29,616
So what's, uh... Is it... You got enough compute power?

1703
01:10:29,660 --> 01:10:31,531
MAN: I... We could use more.

1704
01:10:31,575 --> 01:10:32,924
[ALL LAUGHING]

1705
01:10:32,967 --> 01:10:34,360
TUNYASUVUNAKOOL: I was nervous about CASP

1706
01:10:34,404 --> 01:10:36,580
but as the system is starting to come together,

1707
01:10:36,623 --> 01:10:37,972
I don't feel as nervous.

1708
01:10:38,016 --> 01:10:39,496
I feel like things have, sort of,

1709
01:10:39,539 --> 01:10:41,193
come into perspective recently,

1710
01:10:41,237 --> 01:10:44,240
and, you know, it's gonna be fine.

1711
01:10:47,330 --> 01:10:48,853
NEWSCASTER: The Prime Minister has announced

1712
01:10:48,896 --> 01:10:51,290
the most drastic limits to our lives

1713
01:10:51,334 --> 01:10:53,858
the U.K. has ever seen in living memory.

1714
01:10:53,901 --> 01:10:55,033
BORIS JOHNSON: I must give the British people

1715
01:10:55,076 --> 01:10:56,904
a very simple instruction.

1716
01:10:56,948 --> 01:10:59,037
You must stay at home.

1717
01:10:59,080 --> 01:11:02,519
HASSABIS: It feels like we're in a science fiction novel.

1718
01:11:02,562 --> 01:11:04,869
You know, I'm delivering food to my parents,

1719
01:11:04,912 --> 01:11:08,220
making sure they stay isolated and safe.

1720
01:11:08,264 --> 01:11:10,570
I think it just highlights the incredible need

1721
01:11:10,614 --> 01:11:12,877
for AI-assisted science.

1722
01:11:17,098 --> 01:11:18,361
TUNYASUVUNAKOOL: You always know that

1723
01:11:18,404 --> 01:11:21,015
something like this is a possibility.

1724
01:11:21,059 --> 01:11:23,888
But nobody ever really believes it's gonna happen

1725
01:11:23,931 --> 01:11:25,585
in their lifetime, though.

1726
01:11:25,629 --> 01:11:26,934
[COMPUTER BEEPS]

1727
01:11:26,978 --> 01:11:29,154
- JUMPER: Are you recording yet?
- RESEARCHER: Yes.

1728
01:11:29,197 --> 01:11:31,025
- Okay, morning, all.
- Hey.

1729
01:11:31,069 --> 01:11:32,679
Good. CASP has started.

1730
01:11:32,723 --> 01:11:36,074
It's nice I get to sit around in my pajama bottoms all day.

1731
01:11:36,117 --> 01:11:37,597
TUNYASUVUNAKOOL: I never thought I'd live in a house

1732
01:11:37,641 --> 01:11:39,164
where so much was going on.

1733
01:11:39,207 --> 01:11:41,427
I would be trying to solve protein folding in one room,

1734
01:11:41,471 --> 01:11:42,559
and my husband would be trying

1735
01:11:42,602 --> 01:11:43,908
to make robots walk in the other.

1736
01:11:45,388 --> 01:11:46,911
[EXHALES]

1737
01:11:46,954 --> 01:11:49,392
One of the hardest proteins we've gotten in CASP thus far

1738
01:11:49,435 --> 01:11:51,219
is the SARS-CoV-2 protein

1739
01:11:51,263 --> 01:11:52,220
called ORF8.

1740
01:11:52,264 --> 01:11:54,919
ORF8 is a coronavirus protein.

1741
01:11:54,962 --> 01:11:56,964
It's one of the main proteins, um,

1742
01:11:57,008 --> 01:11:58,749
that dampens the immune system.

1743
01:11:58,792 --> 01:12:00,054
TUNYASUVUNAKOOL: We tried really hard

1744
01:12:00,098 --> 01:12:01,752
to improve our prediction.

1745
01:12:01,795 --> 01:12:03,493
Like, really, really hard.

1746
01:12:03,536 --> 01:12:05,582
Probably the most time that we have ever spent

1747
01:12:05,625 --> 01:12:07,105
on a single target.

1748
01:12:07,148 --> 01:12:08,933
To the point where my husband is, like,

1749
01:12:08,976 --> 01:12:12,197
"It's midnight. You need to go to bed."

1750
01:12:12,240 --> 01:12:16,419
So I think we're at Day 102 since lockdown.

1751
01:12:16,462 --> 01:12:19,944
My daughter is keeping a journal.

1752
01:12:19,987 --> 01:12:22,120
Now you can go out as much as you want.

1753
01:12:25,036 --> 01:12:27,212
JUMPER: We have received the last target.

1754
01:12:27,255 --> 01:12:29,649
They've said they will be sending out no more targets

1755
01:12:29,693 --> 01:12:31,347
in our category of CASP.

1756
01:12:32,652 --> 01:12:33,653
So we're just making sure

1757
01:12:33,697 --> 01:12:35,481
we get the best possible answer.

1758
01:12:40,530 --> 01:12:43,315
MOULT: As soon as we started to get the results,

1759
01:12:43,359 --> 01:12:48,233
I'd sit down and start looking at how close did anybody come

1760
01:12:48,276 --> 01:12:50,583
to getting the protein structures correct.
1761
01:12:54,848 --> 01:12:56,937
[ROBOT SQUEAKING]

1762
01:12:59,113 --> 01:13:00,201
[INCOMING CALL BEEPING]

1763
01:13:00,245 --> 01:13:01,551
- Oh, hi there.
- MAN: Hello.

1764
01:13:01,594 --> 01:13:03,770
[ALL CHUCKLING]

1765
01:13:03,814 --> 01:13:07,078
It is an unbelievable thing, CASP has finally ended.

1766
01:13:07,121 --> 01:13:09,472
I think it's at least time to raise a glass.

1767
01:13:09,515 --> 01:13:11,212
Um, I don't know if everyone has a glass

1768
01:13:11,256 --> 01:13:12,823
of something that they can raise.

1769
01:13:12,866 --> 01:13:14,955
If not, raise, I don't know, your laptops.

1770
01:13:14,999 --> 01:13:17,088
- Um...
- [LAUGHTER]

1771
01:13:17,131 --> 01:13:18,611
I'll probably make a speech in a minute.

1772
01:13:18,655 --> 01:13:20,483
I feel like I should but I just have no idea what to say.

1773
01:13:21,005 --> 01:13:24,269
So... let's see.

1774
01:13:24,312 --> 01:13:27,054
I feel like a reading of email...

1775
01:13:27,098 --> 01:13:28,534
is the right thing to do.

1776
01:13:28,578 --> 01:13:29,883
[ALL CHUCKLING]

1777
01:13:29,927 --> 01:13:31,232
TUNYASUVUNAKOOL: When John said,

1778
01:13:31,276 --> 01:13:33,191
"I'm gonna read an email," at a team social,

1779
01:13:33,234 --> 01:13:35,498
I thought, "Wow, John, you know how to have fun."

1780
01:13:35,541 --> 01:13:38,370
We're gonna read an email now. [LAUGHS]

1781
01:13:38,414 --> 01:13:41,634
Uh, I got this about four o'clock today.

1782
01:13:42,722 --> 01:13:44,724
Um, it is from John Moult.

1783
01:13:45,725 --> 01:13:47,031
And I'll just read it.

1784
01:13:47,074 --> 01:13:49,381
It says, "As I expect you know,

1785
01:13:49,425 --> 01:13:53,603
"your group has performed amazingly well in CASP 14,

1786
01:13:53,646 --> 01:13:55,387
"both relative to other groups

1787
01:13:55,431 --> 01:13:57,911
"and in absolute model accuracy."

1788
01:13:57,955 --> 01:13:59,783
[PEOPLE CLAPPING]

1789
01:13:59,826 --> 01:14:01,219
"Congratulations on this work.

1790
01:14:01,262 --> 01:14:03,047
"It is really outstanding."

1791
01:14:03,090 --> 01:14:05,266
The structures were so good,

1792
01:14:05,310 --> 01:14:07,443
it was... it was just amazing.

1793
01:14:07,486 --> 01:14:09,096
[TRIUMPHANT INSTRUMENTAL MUSIC PLAYING]

1794
01:14:09,140 --> 01:14:10,750
After half a century,

1795
01:14:10,794 --> 01:14:12,230
we finally have a solution

1796
01:14:12,273 --> 01:14:14,928
to the protein folding problem.

1797
01:14:14,972 --> 01:14:17,409
When I saw this email, I read it,

1798
01:14:17,453 --> 01:14:19,585
I go, "Oh, shit!"

1799
01:14:19,629 --> 01:14:21,587
And my wife goes, "Is everything okay?"

1800
01:14:21,631 --> 01:14:24,242
I call my parents, and just, like, "Hey, Mum.

1801
01:14:24,285 --> 01:14:26,244
"Um, got something to tell you.

1802
01:14:26,287 --> 01:14:27,550
"We've done this thing

1803
01:14:27,593 --> 01:14:29,813
"and it might be kind of a big deal." [LAUGHS]

1804
01:14:29,856 --> 01:14:31,641
When I learned of the CASP 14 results,

1805
01:14:32,642 --> 01:14:34,034
I was gobsmacked.

1806
01:14:34,078 --> 01:14:35,819
I was just excited.

1807
01:14:35,862 --> 01:14:38,909
This is a problem that I was beginning to think

1808
01:14:38,952 --> 01:14:42,086
would not get solved in my lifetime.

1809
01:14:42,129 --> 01:14:44,741
NURSE: Now we have a tool that can be used

1810
01:14:44,784 --> 01:14:46,612
practically by scientists.
1811
01:14:46,656 --> 01:14:48,440
SENIOR: These people are asking us, you know,

1812
01:14:48,484 --> 01:14:50,224
"I've got this protein involved in malaria,"

1813
01:14:50,268 --> 01:14:52,139
or, you know, some infectious disease.

1814
01:14:52,183 --> 01:14:53,227
"We don't know the structure.

1815
01:14:53,271 --> 01:14:55,186
"Can we use AlphaFold to solve it?"

1816
01:14:55,229 --> 01:14:56,970
JUMPER: We can easily predict all known sequences

1817
01:14:57,014 --> 01:14:58,276
in a month.

1818
01:14:58,319 --> 01:14:59,973
All known sequences in a month?

1819
01:15:00,017 --> 01:15:01,279
- Yeah, easily.
- Mmm-hmm?

1820
01:15:01,322 --> 01:15:02,585
JUMPER: A billion, two billion.

1821
01:15:02,628 --> 01:15:03,673
Um, and they're...

1822
01:15:03,716 --> 01:15:05,196
So why don't we just do that? Yeah.

1823
01:15:05,239 --> 01:15:07,111
- We should just do that a lot.
- Well, I mean...

1824
01:15:07,154 --> 01:15:09,243
That's way better. Why don't we just do that?

1825
01:15:09,287 --> 01:15:11,115
SENIOR: So that's one of the options.

1826
01:15:11,158 --> 01:15:12,638
- HASSABIS: Right.
- There's this...

1827
01:15:12,682 --> 01:15:15,119
We should just... Right, that's a great idea.

1828
01:15:15,162 --> 01:15:17,513
We should just run every protein in existence.

1829
01:15:18,296 --> 01:15:19,471
And then release that.

1830
01:15:19,515 --> 01:15:20,994
Why didn't someone suggest this before?

1831
01:15:21,038 --> 01:15:22,126
Of course that's what we should do.

1832
01:15:22,169 --> 01:15:23,954
Why are we thinking about making a service

1833
01:15:23,997 --> 01:15:25,651
and then people submit their protein?

1834
01:15:25,695 --> 01:15:26,913
We just fold everything.

1835
01:15:26,957 --> 01:15:28,654
And then give it to everyone in the world.

1836
01:15:28,698 --> 01:15:31,483
Who knows how many discoveries will be made from that?

1837
01:15:31,527 --> 01:15:33,790
BIRNEY: Demis called us up and said,

1838
01:15:33,833 --> 01:15:35,618
"We want to make this open.

1839
01:15:35,661 --> 01:15:37,837
"Not just make sure the code is open,

1840
01:15:37,881 --> 01:15:39,578
"but we're gonna make it really easy

1841
01:15:39,622 --> 01:15:42,668
"for everybody to get access to the predictions."

1842
01:15:45,062 --> 01:15:47,238
THORNTON: That is fantastic.

1843
01:15:47,281 --> 01:15:49,327
It's like drawing back the curtain

1844
01:15:49,370 --> 01:15:52,852
and seeing the whole world of protein structures.

1845
01:15:52,896 --> 01:15:55,202
[ETHEREAL MUSIC PLAYING]

1846
01:15:55,246 --> 01:15:56,987
SCHMIDT: They released the structures

1847
01:15:57,030 --> 01:15:59,772
of 200 million proteins.

1848
01:15:59,816 --> 01:16:01,818
These are gifts to humanity.

1849
01:16:07,650 --> 01:16:10,914
JUMPER: The moment AlphaFold is live to the world,

1850
01:16:10,957 --> 01:16:13,873
we will no longer be the most important people

1851
01:16:13,917 --> 01:16:15,222
in AlphaFold's story.

1852
01:16:15,266 --> 01:16:16,833
HASSABIS: Can't quite believe it's all out.

1853
01:16:16,876 --> 01:16:18,356
PEOPLE: Aw!

1854
01:16:18,399 --> 01:16:20,314
WOMAN: A hundred and sixty-four users.

1855
01:16:20,358 --> 01:16:22,578
HASSABIS: Loads of activity in Japan.

1856
01:16:22,621 --> 01:16:24,928
RESEARCHER 1: We have 655 users currently.

1857
01:16:24,971 --> 01:16:26,930
RESEARCHER 2: We currently have 100,000 concurrent users.

1858
01:16:26,973 --> 01:16:28,192
Wow!

1859
01:16:31,108 --> 01:16:33,893
Today is just crazy.
1860
01:16:33,937 --> 01:16:36,504
HASSABIS: What an absolutely unbelievable effort

1861
01:16:36,548 --> 01:16:37,723
from everyone.

1862
01:16:37,767 --> 01:16:38,550
We're gonna all remember these moments

1863
01:16:38,594 --> 01:16:40,030
for the rest of our lives.

1864
01:16:40,073 --> 01:16:41,727
I'm excited about AlphaFold.

1865
01:16:41,771 --> 01:16:45,601
For my research, it's already propelling lots of progress.

1866
01:16:45,644 --> 01:16:47,385
And this is just the beginning.

1867
01:16:47,428 --> 01:16:48,908
SCHMIDT: My guess is,

1868
01:16:48,952 --> 01:16:53,043
every single biological and chemistry achievement

1869
01:16:53,086 --> 01:16:55,698
will be related to AlphaFold in some way.

1870
01:16:55,741 --> 01:16:57,874
[TRIUMPHANT INSTRUMENTAL MUSIC PLAYING]

1871
01:17:13,367 --> 01:17:15,413
AlphaFold is an index moment.

1872
01:17:15,456 --> 01:17:18,068
It's a moment that people will not forget

1873
01:17:18,111 --> 01:17:20,244
because the world changed.

1874
01:17:39,655 --> 01:17:41,482
HASSABIS: Everybody's realized now

1875
01:17:41,526 --> 01:17:43,746
what Shane and I have known for more than 20 years,

1876
01:17:43,789 --> 01:17:46,618
that AI is going to be the most important thing

1877
01:17:46,662 --> 01:17:48,446
humanity's ever gonna invent.

1878
01:17:48,489 --> 01:17:50,230
TRAIN ANNOUNCER: We will shortly be arriving

1879
01:17:50,274 --> 01:17:52,058
at our final destination.

1880
01:17:52,102 --> 01:17:53,581
[ELECTRONIC MUSIC PLAYING]

1881
01:18:02,068 --> 01:18:04,767
HASSABIS: The pace of innovation and capabilities

1882
01:18:04,810 --> 01:18:06,507
is accelerating,

1883
01:18:06,551 --> 01:18:09,293
like a boulder rolling down a hill that we've kicked off

1884
01:18:09,336 --> 01:18:12,644
and now it's continuing to gather speed.

1885
01:18:12,688 --> 01:18:15,299
NEWSCASTER: We are at a crossroads in human history.

1886
01:18:15,342 --> 01:18:16,735
AI has the potential

1887
01:18:16,779 --> 01:18:19,172
to transform our lives in every aspect.

1888
01:18:19,216 --> 01:18:23,786
It's no less important than the discovery of electricity.

1889
01:18:23,829 --> 01:18:26,484
HASSABIS: We should be looking at the scientific method

1890
01:18:26,527 --> 01:18:28,834
and trying to understand each step of the way

1891
01:18:28,878 --> 01:18:30,096
in a rigorous way.

1892
01:18:30,140 --> 01:18:32,664
This is a moment of profound opportunity.

1893
01:18:32,708 --> 01:18:34,753
SUNAK: Harnessing this technology

1894
01:18:34,797 --> 01:18:37,713
could eclipse anything we have ever known.

1895
01:18:40,411 --> 01:18:42,152
[ELECTRONIC DEVICE BEEPS]

1896
01:18:42,195 --> 01:18:43,675
HASSABIS: Hi, Alpha.

1897
01:18:44,676 --> 01:18:45,764
ALPHA: Hi.

1898
01:18:47,157 --> 01:18:48,419
What is this?

1899
01:18:50,682 --> 01:18:53,729
ALPHA: This is a chessboard.

1900
01:18:53,772 --> 01:18:56,514
If I was to play white, what move would you recommend?

1901
01:18:59,865 --> 01:19:00,953
ALPHA: I would recommend

1902
01:19:00,997 --> 01:19:02,781
that you move your pawn from E2 to E4.

1903
01:19:05,871 --> 01:19:08,787
And now if you were black, what would you play now?

1904
01:19:11,572 --> 01:19:13,618
ALPHA: I would play the Sicilian Defense.

1905
01:19:15,838 --> 01:19:16,882
That's a good choice.

1906
01:19:19,406 --> 01:19:21,452
- ALPHA: Thanks.
- [CHUCKLES]

1907
01:19:23,715 --> 01:19:25,891
So what do you see? What is this object?

1908
01:19:28,546 --> 01:19:30,504
ALPHA: This is a pencil sculpture.
1909
01:19:32,811 --> 01:19:35,031
What happens if I move one of the pencils?

1910
01:19:37,990 --> 01:19:39,470
ALPHA: If you move one of the pencils,

1911
01:19:39,513 --> 01:19:42,081
the sculpture will fall apart.

1912
01:19:42,125 --> 01:19:44,301
I'd better leave it alone, then.

1913
01:19:44,344 --> 01:19:45,868
ALPHA: That's probably a good idea.

1914
01:19:45,911 --> 01:19:47,391
[HASSABIS CHUCKLES]

1915
01:19:50,568 --> 01:19:52,744
HASSABIS: AGI is on the horizon now.

1916
01:19:54,833 --> 01:19:56,661
Very clearly the next generation

1917
01:19:56,704 --> 01:19:58,141
is going to live in a future world

1918
01:19:58,184 --> 01:20:01,057
where things will be radically different because of AI.

1919
01:20:02,493 --> 01:20:05,496
And if you want to steward that responsibly,

1920
01:20:05,539 --> 01:20:09,239
every moment is vital.

1921
01:20:09,282 --> 01:20:12,677
This is the moment I've been living my whole life for.

1922
01:20:19,162 --> 01:20:21,120
It's just a good thinking game.

1923
01:20:22,469 --> 01:20:24,602
[UPLIFTING INSTRUMENTAL MUSIC PLAYING]
