I Am Human (2019) — English subtitles (track3_eng)

1 00:00:17,396 --> 00:00:19,571 [light music] 2 00:00:28,407 --> 00:00:31,410 [machinery beeping] 3 00:00:40,730 --> 00:00:43,974 [light electronic music] 4 00:01:10,380 --> 00:01:12,175 - [Bryan] We are about to enter into 5 00:01:12,210 --> 00:01:14,867 the most consequential revolution 6 00:01:14,902 --> 00:01:17,249 in the history of the human race. 7 00:01:19,148 --> 00:01:21,943 Where we can take control of our cognitive evolution. 8 00:01:25,326 --> 00:01:26,914 - [Sara] We're talking about technologies 9 00:01:26,948 --> 00:01:29,710 that could radically alter the way we are as human beings. 10 00:01:30,573 --> 00:01:32,230 [whirring] 11 00:01:32,264 --> 00:01:34,232 - [Dr. Nicolelis] What we're seeing is that 12 00:01:34,266 --> 00:01:35,957 technology is becoming part of us. 13 00:01:39,306 --> 00:01:42,136 Because we're linking biological brains 14 00:01:42,171 --> 00:01:43,448 directly to machines. 15 00:01:45,691 --> 00:01:48,832 - [Nita] What if we could expand your capabilities beyond 16 00:01:48,867 --> 00:01:51,525 what we as humans have never had before? 17 00:01:54,286 --> 00:01:56,737 - [Ramez] There are a set of people who are going to say, 18 00:01:56,771 --> 00:01:59,809 this is unnatural, this is inhuman, 19 00:01:59,843 --> 00:02:01,397 we shouldn't allow this to happen. 20 00:02:04,054 --> 00:02:06,333 - [Man] The fact is, is that, we are transforming 21 00:02:06,367 --> 00:02:07,299 into a new species, 22 00:02:07,334 --> 00:02:09,129 a technological species. 23 00:02:10,440 --> 00:02:12,546 - [Bryan] You don't know what the outcome is going to be. 24 00:02:12,580 --> 00:02:14,341 You don't know how it will be used. 25 00:02:19,967 --> 00:02:22,349 - [Nita] But if we start tinkering with the brain, 26 00:02:23,867 --> 00:02:25,938 if we start changing it... 27 00:02:29,597 --> 00:02:31,358 Are we about to fundamentally change 28 00:02:31,392 --> 00:02:32,566 what it means to be human? 29 00:02:34,947 --> 00:02:37,950 And if so, are we okay with that? 30 00:02:55,347 --> 00:02:57,556 [beeping] 31 00:03:07,117 --> 00:03:08,360 - [Bill] Wake up. - [Computer] Wake up. 32 00:03:09,292 --> 00:03:11,329 - [Bill] Drapes. - [Computer] Drapes. 33 00:03:11,363 --> 00:03:12,916 - [Bill] One. - [Computer] One. 34 00:03:12,951 --> 00:03:15,022 - [Bill] Open. - [Computer] Open. 35 00:03:15,056 --> 00:03:18,128 [machinery whirring] 36 00:03:21,442 --> 00:03:24,825 - [Bill] I was riding a bicycle in a charity event... 37 00:03:29,312 --> 00:03:31,797 it was raining really badly, 38 00:03:32,936 --> 00:03:34,490 and I was following a mail truck. 39 00:03:38,563 --> 00:03:40,461 And then all of a sudden it stopped 40 00:03:41,600 --> 00:03:43,119 and I didn't. 41 00:03:43,153 --> 00:03:45,949 [machine beeping] 42 00:03:47,986 --> 00:03:49,746 Drapes - [Computer] Drapes. 43 00:03:49,781 --> 00:03:51,679 - Two. - [Computer] Two. 44 00:03:51,714 --> 00:03:53,440 - Open. - [Computer] Open. 45 00:03:54,372 --> 00:03:55,614 [machinery whirring] 46 00:03:55,649 --> 00:03:57,409 - Head up. - [Computer] Head up. 47 00:04:00,240 --> 00:04:01,793 - Do it. - [Computer] Head up. 48 00:04:04,589 --> 00:04:06,315 - Do it. - [Computer] Head up. 49 00:04:14,323 --> 00:04:17,291 - [Bill] Things are different now. 50 00:04:25,817 --> 00:04:27,370 - [Danielle] Good morning Mr. Kochevar. 51 00:04:27,405 --> 00:04:29,027 - [Bill] Hi, Danielle.
52 00:04:29,061 --> 00:04:30,304 - [Danielle] I'm gonna go ahead 53 00:04:30,339 --> 00:04:31,788 and get you up in the chair, okay? 54 00:04:40,556 --> 00:04:43,386 [somber music] 55 00:04:45,733 --> 00:04:47,494 - I'm a tetraplegic, 56 00:04:51,152 --> 00:04:53,258 which means I can't move my arms or legs 57 00:04:55,225 --> 00:04:58,401 and I only have feeling from my mid-chest up. 58 00:04:59,816 --> 00:05:02,923 [machinery whirring] 59 00:05:05,235 --> 00:05:08,100 It's okay that I can't feel anything but 60 00:05:08,998 --> 00:05:10,551 sometimes I wish I could. 61 00:05:16,246 --> 00:05:17,386 After my injury 62 00:05:18,525 --> 00:05:23,323 I lived with my parents then 63 00:05:23,357 --> 00:05:26,671 when my mother and father both passed away, I moved here. 64 00:05:27,810 --> 00:05:29,708 - [Yvonne] Good morning Mr. Kochevar. 65 00:05:29,743 --> 00:05:31,503 - [Bill] Morning Yvonne. 66 00:05:31,538 --> 00:05:32,746 - [Yvonne] Well hello. 67 00:05:32,780 --> 00:05:34,057 - [Bill] How are you this morning? 68 00:05:34,092 --> 00:05:36,543 - [Yvonne] I am just lovely, Bill. 69 00:05:37,785 --> 00:05:40,029 - [Robert] In people that have paralysis, 70 00:05:40,063 --> 00:05:42,376 the nerves to their muscles are still intact, 71 00:05:42,411 --> 00:05:44,585 but they're cut off from the brain. 72 00:05:44,620 --> 00:05:47,036 - [Yvonne] Let's get you up on your side. 73 00:05:47,070 --> 00:05:49,556 - [Robert] So they lose the ability to voluntarily 74 00:05:49,590 --> 00:05:50,833 control their movements. 75 00:05:50,867 --> 00:05:52,282 - [Yvonne] Got your glasses here. 76 00:05:54,975 --> 00:05:56,217 - [Bolu] If we can understand the brain's 77 00:05:56,252 --> 00:05:57,598 internal code of movement, 78 00:05:58,668 --> 00:06:00,256 it would significantly assist us in 79 00:06:00,290 --> 00:06:03,397 better treating a wide range of medical issues. 80 00:06:06,607 --> 00:06:08,022 But we're not quite there. 81 00:06:09,299 --> 00:06:11,232 [machinery whirring] 82 00:06:11,267 --> 00:06:14,753 - [Announcer] Dr. Carradine, pick up at 2185. 83 00:06:14,788 --> 00:06:17,307 - [Bill] People in my situation, 84 00:06:17,342 --> 00:06:19,379 you know, they just never move again. 85 00:06:23,866 --> 00:06:26,696 It's hard sometimes to think about that. 86 00:06:27,870 --> 00:06:29,389 I always want to do more. 87 00:06:31,874 --> 00:06:33,841 Like, I wanna move from this point 88 00:06:33,876 --> 00:06:36,844 to that point without help. 89 00:06:38,639 --> 00:06:40,814 [light music] 90 00:06:40,848 --> 00:06:43,989 I don't know, probably eat something by myself. 91 00:06:44,818 --> 00:06:46,060 If it ever happens. 92 00:06:47,303 --> 00:06:50,237 You know, have somebody bring me a plate of something, 93 00:06:50,271 --> 00:06:53,343 and then I'll say "Okay, you can go away, 94 00:06:53,378 --> 00:06:55,415 "I'm gonna eat this. 95 00:06:55,449 --> 00:06:57,071 "Come back in about half an hour." 96 00:06:59,557 --> 00:07:02,111 But I don't think about it a lot 97 00:07:02,145 --> 00:07:04,389 because I know there's a lot of research 98 00:07:04,424 --> 00:07:06,460 yet to go before that happens. 99 00:07:20,060 --> 00:07:22,856 - [Bryan] Everything we're trying to do, 100 00:07:22,890 --> 00:07:25,410 everything we're trying to become, 101 00:07:25,445 --> 00:07:26,894 everything we're trying to fix, 102 00:07:30,795 --> 00:07:34,557 sits on the other side of the brain. 
103 00:07:40,529 --> 00:07:43,428 The brain is the single highest potential 104 00:07:43,463 --> 00:07:45,879 area of focus we have in existence. 105 00:07:49,261 --> 00:07:52,437 It is our best tool to do the things we want 106 00:07:52,472 --> 00:07:54,163 to achieve in life, 107 00:07:54,197 --> 00:07:57,235 it is also the biggest limiter we have. 108 00:07:58,478 --> 00:08:02,205 - [Ramez] This three pounds, is our entire universe. 109 00:08:03,310 --> 00:08:05,174 Our senses, our emotions. 110 00:08:05,208 --> 00:08:07,383 - [Chantel] Love, hate, fear, jealousy, 111 00:08:07,417 --> 00:08:09,281 - [Dustin] The way we move, the way we control our body. 112 00:08:09,316 --> 00:08:10,662 - [Scientist 1] Our culture, our language, 113 00:08:10,697 --> 00:08:12,664 - [Scientist 2] Our hopes, our dreams, our aspirations 114 00:08:12,699 --> 00:08:15,460 - [Chantel] Motivation, pride, wonder. 115 00:08:17,635 --> 00:08:19,740 - [Manuel] The brain defines what we are. 116 00:08:22,087 --> 00:08:24,849 - [David] But because the brain runs so efficiently 117 00:08:24,883 --> 00:08:26,436 behind the scenes, 118 00:08:27,299 --> 00:08:30,061 it's typically only when something goes wrong 119 00:08:30,095 --> 00:08:32,339 that we appreciate the magnitude 120 00:08:32,373 --> 00:08:34,272 of what's actually happening there. 121 00:08:38,863 --> 00:08:41,555 [birds chirping] 122 00:09:08,444 --> 00:09:11,620 - [Anne] I'm not really sure what's happening in my brain. 123 00:09:15,382 --> 00:09:19,420 Anxiety, insomnia, paralysis. 124 00:09:26,393 --> 00:09:28,395 Parkinson's is different for everybody. 125 00:09:30,604 --> 00:09:32,572 Just depends on what you get. 126 00:09:44,307 --> 00:09:46,102 [Anne struggles] 127 00:09:54,283 --> 00:09:57,251 - [Stan] That's a cardinal, right? 128 00:09:57,286 --> 00:09:59,771 The one with that whistle, that one. 129 00:10:01,393 --> 00:10:02,153 - [Anne] Yeah. 130 00:10:06,053 --> 00:10:07,917 - [Stan] Anne and I have been married for 38 years. 131 00:10:07,952 --> 00:10:12,439 - [Anne] Look, the hydrangeas, it hasn't bloomed. 132 00:10:12,473 --> 00:10:15,545 - [Stan] I think Anne has one of the most gentle hearts on earth. 133 00:10:21,206 --> 00:10:24,485 Things have gradually changed over the years. 134 00:10:29,111 --> 00:10:30,526 She used to be an artist. 135 00:10:31,561 --> 00:10:35,255 She was a hospice volunteer. 136 00:10:38,879 --> 00:10:41,088 All that had to be left behind. 137 00:10:42,745 --> 00:10:45,299 [somber music] 138 00:10:45,334 --> 00:10:47,267 - [Anne] When I was diagnosed, 139 00:10:51,547 --> 00:10:53,929 I think the biggest thing for me was that 140 00:10:55,033 --> 00:10:57,001 I would become useless. 141 00:10:58,347 --> 00:11:01,384 You know, I'll be a burden to my to my children, 142 00:11:02,696 --> 00:11:03,870 and to my husband. 143 00:11:05,561 --> 00:11:08,253 And a burden just in the world. 144 00:11:11,498 --> 00:11:13,017 [beeping] 145 00:11:13,051 --> 00:11:14,294 One of the Parkinson's symptoms 146 00:11:14,328 --> 00:11:15,985 that I was always afraid of 147 00:11:18,401 --> 00:11:20,438 was that you couldn't smile 148 00:11:20,472 --> 00:11:24,131 and you smiled you had a stony expression. 149 00:11:27,756 --> 00:11:29,516 It's hard to connect with people. 150 00:11:32,553 --> 00:11:37,248 I'm just way too exhausted and way too disorganized mentally 151 00:11:38,767 --> 00:11:40,561 to be with people the way I used to. 
152 00:11:43,806 --> 00:11:47,499 This illness, you know, 153 00:11:50,433 --> 00:11:52,815 all you can do is accept it. 154 00:12:03,619 --> 00:12:06,208 - [Bryan] There's nothing more dehumanizing 155 00:12:06,242 --> 00:12:10,074 than watching someone you love slowly lose their humanhood 156 00:12:12,559 --> 00:12:15,942 and yet if you look at the state of neuroscience right now, 157 00:12:17,391 --> 00:12:20,394 we still have an incredibly long way to go. 158 00:12:25,296 --> 00:12:28,748 - [Ramez] The brain has always been a black box to us. 159 00:12:30,301 --> 00:12:33,718 In the past, we've had rudimentary ideas 160 00:12:33,753 --> 00:12:34,754 of how it works. 161 00:12:36,272 --> 00:12:39,241 - We know that there are circuits that control your vision, 162 00:12:39,275 --> 00:12:41,657 your memory, your movement. 163 00:12:43,624 --> 00:12:45,316 - But we still don't know the code. 164 00:12:47,145 --> 00:12:48,768 We don't know the internal language. 165 00:12:51,425 --> 00:12:55,222 That's because the brain is the most complicated object 166 00:12:55,257 --> 00:12:57,880 we've ever encountered in nature. 167 00:12:59,088 --> 00:13:00,849 - [Dr. Nicolelis] You know, think about it 168 00:13:00,883 --> 00:13:02,989 we have in each human brain, 169 00:13:04,231 --> 00:13:05,957 roughly a hundred billion neurons. 170 00:13:07,303 --> 00:13:09,409 100 billion was the old estimate 171 00:13:09,443 --> 00:13:11,549 of the number of galaxies in the universe. 172 00:13:13,620 --> 00:13:15,864 - [David] Each neuron is as complicated 173 00:13:15,898 --> 00:13:17,797 as the city of Los Angeles. 174 00:13:19,315 --> 00:13:22,836 It's connecting to about 10,000 of its neighbors 175 00:13:22,871 --> 00:13:26,253 so you have, 500 trillion connections. 176 00:13:27,772 --> 00:13:30,775 - [Bobby] And these neurons they communicate with each other. 177 00:13:33,467 --> 00:13:36,125 And they're firing electrical signals to each other 178 00:13:37,333 --> 00:13:41,303 the same way our computers fire zeros and ones 179 00:13:41,337 --> 00:13:42,545 inside of themselves. 180 00:13:44,064 --> 00:13:47,343 - [Tracy] To think about those individual neurons firing, 181 00:13:47,378 --> 00:13:49,380 at a speed that we can't even understand yet, 182 00:13:49,414 --> 00:13:50,795 it's mind-boggling. 183 00:13:52,693 --> 00:13:57,526 - [David] It's so unbelievably complicated and yet somehow it is us. 184 00:14:01,323 --> 00:14:03,739 - [Bobby] The electrical firing of these little neurons 185 00:14:03,773 --> 00:14:05,085 is who you are. 186 00:14:08,399 --> 00:14:09,262 What is it saying? 187 00:14:14,439 --> 00:14:16,027 What is it communicating? 188 00:14:37,911 --> 00:14:40,120 - [Narrator] Jim drinks and hands the flask back to McCoy. 189 00:14:40,155 --> 00:14:42,226 The shuttle rises among the mass of pipes 190 00:14:42,260 --> 00:14:43,537 and fence in the shipyard 191 00:14:43,572 --> 00:14:45,332 and arcs up into the sky. 192 00:14:47,438 --> 00:14:50,372 [dramatic music] 193 00:14:58,173 --> 00:15:00,451 - [Stephen] I was born with a condition 194 00:15:03,143 --> 00:15:05,663 that didn't surface until later in life. 195 00:15:08,977 --> 00:15:09,909 Everything's white. 196 00:15:11,841 --> 00:15:12,704 It's blank. 197 00:15:14,154 --> 00:15:15,259 It's like having a book, you know? 198 00:15:15,293 --> 00:15:16,294 You've got a written page 199 00:15:16,329 --> 00:15:17,744 and then you've got a blank page. 200 00:15:20,333 --> 00:15:21,817 So I'm looking at a blank page. 
201 00:15:32,690 --> 00:15:34,002 I live on my own. 202 00:15:35,279 --> 00:15:36,556 I have a cat. 203 00:15:36,590 --> 00:15:37,350 Come on, Angie. 204 00:15:37,384 --> 00:15:39,248 It's a second-hand cat. 205 00:15:39,283 --> 00:15:40,077 Treat time. 206 00:15:42,044 --> 00:15:43,252 Oh, look at that. 207 00:15:44,736 --> 00:15:46,876 Her name originally was Angel. 208 00:15:46,911 --> 00:15:48,430 And I said, "Nah, she's not an angel," 209 00:15:48,464 --> 00:15:50,225 so I gave her Angie. 210 00:15:51,467 --> 00:15:53,055 Yeah, she's hiding somewhere. 211 00:15:55,955 --> 00:15:58,267 [knocking] 212 00:15:58,302 --> 00:16:00,062 When I lost my vision, 213 00:16:01,236 --> 00:16:02,962 it's like the whole world collapsed. 214 00:16:04,273 --> 00:16:07,518 And probably one of the most difficult thing to do 215 00:16:07,552 --> 00:16:09,382 is to ask for help. 216 00:16:12,385 --> 00:16:14,007 You want a coffee, Denise? 217 00:16:14,042 --> 00:16:15,560 - [Denise] Yeah, that'd be great. 218 00:16:17,217 --> 00:16:18,253 - [Stephen] Is Dark Roast fine? 219 00:16:18,287 --> 00:16:19,495 - [Denise] That's good, sounds good. 220 00:16:19,530 --> 00:16:21,187 - [Stephen] Good, cause that's all I got. 221 00:16:21,221 --> 00:16:23,154 - [Denise] [laughs] As long as it's coffee. 222 00:16:23,189 --> 00:16:26,399 - [Stephen] Okay let's see what else we have here. 223 00:16:26,433 --> 00:16:28,884 - [Denise] We fight like brothers and sisters do. 224 00:16:30,403 --> 00:16:34,510 But in the last couple years we've got very very close. 225 00:16:36,064 --> 00:16:38,756 - [Stephen] Oh, that's a toaster Denise. 226 00:16:38,790 --> 00:16:42,863 Yeah I got distracted. [laughing] 227 00:16:42,898 --> 00:16:45,418 - [Denise] He's had relationships, he's had girlfriends. 228 00:16:46,557 --> 00:16:50,043 But through the years he just sort of 229 00:16:50,078 --> 00:16:51,389 pushed everybody away. 230 00:16:53,460 --> 00:16:55,393 What time did you tell me tomorrow? 231 00:16:56,360 --> 00:16:58,017 - [Stephen] 11? 232 00:16:58,051 --> 00:17:01,227 It's really very difficult to have a conversation. 233 00:17:01,261 --> 00:17:04,575 Because you can't see them, you can't see their expression. 234 00:17:06,646 --> 00:17:08,648 You can only go by the tone of voice. 235 00:17:12,548 --> 00:17:16,069 - [Denise] The last time Stephen saw me was probably 236 00:17:16,104 --> 00:17:17,795 a year and a half to two years ago. 237 00:17:21,523 --> 00:17:23,249 That's been a really sad thing. 238 00:17:24,284 --> 00:17:25,768 - [Denise] Oops, careful. - [Stephen] I thought you're 239 00:17:25,803 --> 00:17:26,942 gonna grab it. 240 00:17:26,976 --> 00:17:28,426 - [Denise] Oop, I got it. - [Stephen] Okay, thanks. 241 00:17:28,461 --> 00:17:30,359 - [Denise] I'm not as quick as you are. - [Stephen] You gotta be quick. 242 00:17:30,394 --> 00:17:31,326 - [Denise] I know. 243 00:17:31,360 --> 00:17:33,397 - [Stephen] I pretty well depend on her 100% 244 00:17:34,398 --> 00:17:37,090 for emotional support and as an intervener. 245 00:17:40,507 --> 00:17:41,888 And also to check my mail. 246 00:17:46,237 --> 00:17:48,377 We love each other very much. 247 00:17:49,620 --> 00:17:51,380 - [Clock] 4:06 p.m. 248 00:17:54,280 --> 00:17:57,904 - [Stephen] But I just miss being independent. 249 00:18:00,769 --> 00:18:03,151 - [Dr. Nicolelis] An estimated one billion people around the world 250 00:18:03,185 --> 00:18:05,222 that suffer from some sort of brain disorder. 
251 00:18:05,256 --> 00:18:07,327 - [Maria Shriver] Every 66 seconds a new brain 252 00:18:07,362 --> 00:18:09,812 in this country develops Alzheimer's. 253 00:18:09,847 --> 00:18:11,124 - [Newscaster] Without a breakthrough right now 254 00:18:11,159 --> 00:18:13,678 the dementia rate will double every 20 years. 255 00:18:13,713 --> 00:18:17,579 - [John Oliver] An Estimated 43.8 million American adults 256 00:18:17,613 --> 00:18:18,890 dealt with a mental illness 257 00:18:18,925 --> 00:18:21,514 and an estimated 10 million of them suffer 258 00:18:21,548 --> 00:18:23,619 from a serious mental illness. 259 00:18:23,654 --> 00:18:27,313 - [Bryan] I think everybody has personal experience to some degree 260 00:18:28,900 --> 00:18:33,422 with depression or anxiety, Parkinson's, Alzheimer's. 261 00:18:36,045 --> 00:18:37,599 Whatever the condition is, 262 00:18:39,635 --> 00:18:43,225 it's devastating to that person, to their loved ones, 263 00:18:43,260 --> 00:18:44,123 and to society. 264 00:18:46,159 --> 00:18:48,713 I experienced a decade of chronic depression 265 00:18:48,748 --> 00:18:52,993 and I know what it's like to feel hopelessness at a depth. 266 00:18:56,514 --> 00:19:00,691 Nothing is more important than addressing a broken brain. 267 00:19:04,281 --> 00:19:08,285 To me, it's the most important conversation 268 00:19:08,319 --> 00:19:09,769 in the entire world. 269 00:19:12,116 --> 00:19:13,635 - [Man] We have a real treat for you today, 270 00:19:13,669 --> 00:19:15,188 a great entrepreneur... 271 00:19:15,223 --> 00:19:16,569 - [Podcast] Today's guest is the modern American 272 00:19:16,603 --> 00:19:17,742 rags to riches story, 273 00:19:17,777 --> 00:19:19,054 he started with nothing. 274 00:19:19,088 --> 00:19:20,366 - [Man] Today Bryan Johnson is with us 275 00:19:20,400 --> 00:19:22,609 who's investing in some of the most important areas 276 00:19:22,644 --> 00:19:25,094 of biology and technology. 277 00:19:27,683 --> 00:19:29,340 - [Bryan] Over the past couple of years 278 00:19:29,375 --> 00:19:32,205 I built a neuroscience company 279 00:19:32,240 --> 00:19:34,966 where I'm pursuing breakthrough discoveries in the brain. 280 00:19:35,001 --> 00:19:36,899 - [Woman] Bryan Johnson invested $100 million dollars 281 00:19:36,934 --> 00:19:38,315 in his own neuroscience company. 282 00:19:38,349 --> 00:19:40,972 - [Man] With the belief that unlocking our brain 283 00:19:41,007 --> 00:19:44,493 is the most significant opportunity in history. 284 00:19:50,568 --> 00:19:52,260 - [Bryan] Morning everyone. 285 00:19:52,294 --> 00:19:54,434 A lot of times when I talk to people about the future 286 00:19:54,469 --> 00:19:56,816 they'll take what exists today and they...[trails off] 287 00:19:56,850 --> 00:19:59,163 I basically spend every waking moment 288 00:19:59,198 --> 00:20:02,235 trying to convince people that working on the brain 289 00:20:02,270 --> 00:20:04,237 is the single most consequential thing 290 00:20:04,272 --> 00:20:06,722 we could be doing as a species. 291 00:20:06,757 --> 00:20:09,691 Today I would like to try and make the case 292 00:20:09,725 --> 00:20:12,763 on why I think the brain is important. 293 00:20:12,797 --> 00:20:15,214 We now have the tools to make attempts 294 00:20:15,248 --> 00:20:16,215 at these breakthroughs. 
295 00:20:16,249 --> 00:20:17,664 We're one of the first generations in the world 296 00:20:17,699 --> 00:20:20,184 that cannot see past 15-20 years 297 00:20:20,219 --> 00:20:24,499 and so as a society, how do we become interested 298 00:20:24,533 --> 00:20:27,226 in this radical cognitive evolution? 299 00:20:28,296 --> 00:20:31,333 I've spoken at countless conferences. 300 00:20:31,368 --> 00:20:34,888 I write an email newsletter, blog posts, 301 00:20:34,923 --> 00:20:35,889 I'm writing a book, 302 00:20:37,477 --> 00:20:40,722 because I am consumed with this discussion, 303 00:20:40,756 --> 00:20:42,482 our situation, our potential. 304 00:20:43,828 --> 00:20:46,555 Fortunately there are these fascinating developments 305 00:20:46,590 --> 00:20:48,419 going on in neuroscience 306 00:20:48,454 --> 00:20:51,284 where our technology is enabling possibilities 307 00:20:51,319 --> 00:20:53,148 we never had before. 308 00:20:55,875 --> 00:20:57,773 Possibilities that will undoubtedly 309 00:20:58,774 --> 00:21:00,293 transform the human race. 310 00:21:04,504 --> 00:21:08,267 But the first humans already walking down 311 00:21:08,301 --> 00:21:09,578 this new evolutionary path 312 00:21:13,513 --> 00:21:16,275 are not who most people expect. 313 00:21:26,250 --> 00:21:28,735 [Anne grunting] 314 00:21:28,770 --> 00:21:31,359 [phone beeping] 315 00:21:33,361 --> 00:21:36,433 [phone ringing] 316 00:21:36,467 --> 00:21:38,573 - [Stan] Oh hi, It's Stan Shabason speaking. 317 00:21:38,607 --> 00:21:40,333 I'm just calling to confirm Anne's 318 00:21:40,368 --> 00:21:43,371 appointment this Tuesday at three p.m. 319 00:21:47,375 --> 00:21:49,031 - [Anne] Someone who I was seeing 320 00:21:49,066 --> 00:21:51,344 through the movement disorder clinic 321 00:21:51,379 --> 00:21:56,384 said there's this procedure called Deep Brain Stimulation 322 00:21:57,385 --> 00:22:00,526 that I should be open to. 323 00:22:00,560 --> 00:22:03,114 - [Stan] Great, thank you very much, we'll see you then. 324 00:22:04,702 --> 00:22:07,912 She tried alternative therapies. 325 00:22:07,947 --> 00:22:11,675 Everything from naturopath, osteopaths, 326 00:22:11,709 --> 00:22:13,297 nothing seemed to help. 327 00:22:15,161 --> 00:22:17,474 - [Dr. Lozano] In Parkinson's disease, the neurons 328 00:22:17,508 --> 00:22:20,339 that produce dopamine in your brain are dying. 329 00:22:21,995 --> 00:22:27,000 That causes tremor, rigidity, akinesia or inability to move. 330 00:22:28,899 --> 00:22:31,073 [tense music] 331 00:22:31,108 --> 00:22:34,180 So we implant electrodes inside the brain 332 00:22:35,595 --> 00:22:37,390 to suppress this abnormal activity 333 00:22:37,425 --> 00:22:41,360 that is causing the motor system to shut down. 334 00:22:41,394 --> 00:22:45,364 And we use electrical stimulation 24 hours a day. 335 00:22:48,436 --> 00:22:53,302 - [Stan] We were told this operation would offer her relief. 336 00:22:54,718 --> 00:22:59,723 But Anne's very wary of any kind of dramatic procedure. 337 00:23:02,415 --> 00:23:04,003 - [Dr. Lozano] Patients are worried, you know, 338 00:23:04,037 --> 00:23:07,282 "Will I be the same, coming out, as I was going in?" 339 00:23:08,697 --> 00:23:11,390 - [Anne] Someone's cutting into your brain. 340 00:23:11,424 --> 00:23:14,220 You don't know what's going to happen. 341 00:23:14,254 --> 00:23:17,050 - [Dr. 
Lozano] There's a tremendous amount of anxiety about 342 00:23:17,085 --> 00:23:19,639 whether they are going to change in their outlook, 343 00:23:19,674 --> 00:23:22,539 in their personality, in their motivation, in their drive. 344 00:23:23,885 --> 00:23:25,127 You know, this is brain surgery. 345 00:23:25,162 --> 00:23:26,819 It's invasive. 346 00:23:26,853 --> 00:23:28,337 It is a scary thought. 347 00:23:32,238 --> 00:23:36,173 - [Anne] They say my symptoms might improve. 348 00:23:43,422 --> 00:23:46,355 But cutting your body open and putting things in your body, 349 00:23:48,012 --> 00:23:52,361 I don't think I'm capable of risking that. 350 00:23:54,433 --> 00:23:57,056 [tense music] 351 00:24:08,032 --> 00:24:11,346 - [John] The idea of having electrodes that are in your brain, 352 00:24:11,380 --> 00:24:12,485 that stimulate your brain, 353 00:24:15,039 --> 00:24:17,456 is something that's a concern for people. 354 00:24:22,288 --> 00:24:26,361 And it just puts the burden on being extremely careful 355 00:24:26,395 --> 00:24:28,467 in the design of these electronics. 356 00:24:34,473 --> 00:24:35,404 Here at the Wyss Center... 357 00:24:36,923 --> 00:24:38,062 [machinery beeping] 358 00:24:38,097 --> 00:24:39,754 ...we develop implantable technology 359 00:24:41,618 --> 00:24:43,447 and test it to show that it's safe, 360 00:24:45,276 --> 00:24:48,245 to show that it's effective. 361 00:24:53,319 --> 00:24:54,734 It's a very complex process 362 00:24:57,461 --> 00:24:59,705 and it's a very expensive process. 363 00:25:04,088 --> 00:25:08,886 First, we have to develop this brain electrode 364 00:25:12,580 --> 00:25:14,305 [whirring] 365 00:25:14,340 --> 00:25:16,687 - [Claude] This is a technically very difficult challenge. 366 00:25:22,831 --> 00:25:27,733 We need to avoid dust, biological contamination, 367 00:25:27,767 --> 00:25:29,079 ionic contamination. 368 00:25:29,942 --> 00:25:32,289 [whirring] 369 00:25:33,566 --> 00:25:37,674 And so this electrode consists of very special material. 370 00:25:38,502 --> 00:25:40,746 [whirring, electronic noises] 371 00:25:42,195 --> 00:25:46,337 We are touching a very sensitive part of the human body 372 00:25:46,372 --> 00:25:48,374 so we can not fool around. 373 00:25:51,895 --> 00:25:54,414 - [John] Anything that's powered gets warm, 374 00:25:54,449 --> 00:25:57,210 so we use thermal imaging to understand 375 00:25:57,245 --> 00:25:58,246 how hot does it get 376 00:26:00,006 --> 00:26:01,387 and where does it get hot. 377 00:26:06,012 --> 00:26:09,913 The next step is an extremely complicated surgical procedure 378 00:26:13,572 --> 00:26:16,298 - [Claude] We need to pass the barrier of the scalp, 379 00:26:17,127 --> 00:26:18,197 of the skull. 380 00:26:20,820 --> 00:26:24,237 - [John] And implant the electrode into or onto the brain. 381 00:26:26,412 --> 00:26:29,104 [light clicking] 382 00:26:34,109 --> 00:26:37,975 Once inside, these electrodes can pick up 383 00:26:38,010 --> 00:26:40,564 the tiny electrical impulses of the neurons 384 00:26:41,772 --> 00:26:45,362 and convert them into a signal we can understand 385 00:26:45,396 --> 00:26:46,432 like a digital code. 386 00:26:48,917 --> 00:26:51,506 - [Ramez] It's a messy messy messy code. 387 00:26:51,540 --> 00:26:53,853 There's a hundred billion neurons. 388 00:26:55,061 --> 00:26:57,546 But once you're getting data out of the brain, 389 00:26:57,581 --> 00:26:59,894 you have digital information. 
390 00:26:59,928 --> 00:27:01,965 You can do with it anything you can do 391 00:27:01,999 --> 00:27:03,414 with digital information. 392 00:27:04,899 --> 00:27:07,004 - [John÷} It's called a brain computer interface. 393 00:27:08,385 --> 00:27:10,387 - [Claude] Brain computer interfaces 394 00:27:11,388 --> 00:27:14,702 give us the ability to understand the brain, 395 00:27:15,944 --> 00:27:19,016 to understand its electrical signals 396 00:27:19,051 --> 00:27:21,398 in order to replace lost function. 397 00:27:23,745 --> 00:27:25,885 - [Bobby] Brain science for the first time 398 00:27:25,920 --> 00:27:28,370 is showing us how we can understand ourselves 399 00:27:29,717 --> 00:27:31,201 at the level of the machine. 400 00:27:33,617 --> 00:27:36,068 - [John] The spectrum of areas where you can help 401 00:27:36,102 --> 00:27:40,728 in brain-computer interfaces is really tremendous. 402 00:27:42,730 --> 00:27:44,593 - [Ramez] Could we solve mental illness? 403 00:27:46,457 --> 00:27:47,873 Cure people who are sick? 404 00:27:49,564 --> 00:27:51,359 Could we give sight to the blind, 405 00:27:52,498 --> 00:27:54,327 restore hearing to the deaf? 406 00:27:59,401 --> 00:28:02,681 Could we restore our unique human capabilities? 407 00:28:05,304 --> 00:28:08,687 The capabilities that have been taken away? 408 00:28:41,650 --> 00:28:44,446 - [Bill] When my doctor told me about it, 409 00:28:45,689 --> 00:28:47,691 he said, "We have a research project 410 00:28:49,279 --> 00:28:51,074 and it's a little Star Treky" 411 00:28:59,047 --> 00:29:00,531 And I perked up and said, 412 00:29:00,566 --> 00:29:02,741 "Oh, well this must be interesting." 413 00:29:08,436 --> 00:29:12,440 It did seem like something out of science fiction. 414 00:29:18,411 --> 00:29:19,896 So everything's gonna work today, right? 415 00:29:19,930 --> 00:29:21,207 - [Robert] That's up to you, Bill. 416 00:29:21,242 --> 00:29:24,383 [everyone laughing] 417 00:29:26,143 --> 00:29:29,077 [Robert] I met Bill about, I would say a year 418 00:29:29,112 --> 00:29:32,011 before we did the brain implant. 419 00:29:32,046 --> 00:29:33,979 - [Man] Is that good? 420 00:29:34,013 --> 00:29:36,395 - We said, "You know there's surgery involved, 421 00:29:36,429 --> 00:29:39,087 and this is experimental. 422 00:29:39,122 --> 00:29:42,159 It's a little bit on the edge 423 00:29:42,194 --> 00:29:45,404 of what our colleagues are ready to accept." 424 00:29:49,995 --> 00:29:52,238 - [Bill] They had been looking for a patient 425 00:29:52,273 --> 00:29:53,515 for quite a while. 426 00:29:55,690 --> 00:29:56,829 - [Man] How's that look? 427 00:29:56,864 --> 00:29:58,210 - [Robert] Good. 428 00:29:58,244 --> 00:30:00,453 - [Bill] And I couldn't understand why 429 00:30:00,488 --> 00:30:03,146 somebody along the way wouldn't want to do this. 430 00:30:04,457 --> 00:30:05,907 - [Man] Everything's looking good. 431 00:30:07,702 --> 00:30:10,049 - [Bolu] We have two microelectrode arrays implanted 432 00:30:10,084 --> 00:30:12,396 into Bill's primary motor cortex. 433 00:30:12,431 --> 00:30:14,364 Four by four millimeters in size 434 00:30:14,398 --> 00:30:16,780 and ninety-six recording electrodes. 435 00:30:18,299 --> 00:30:20,819 Each electrode penetrates the cortical tissue, 436 00:30:20,853 --> 00:30:23,373 1.5 millimeters in depth 437 00:30:23,407 --> 00:30:26,894 and can record the activity of individual neurons. 
438 00:30:26,928 --> 00:30:28,412 So Bill one of the things we need to do 439 00:30:28,447 --> 00:30:31,553 is we need to calibrate the decoder, 440 00:30:31,588 --> 00:30:33,624 so watch and then also imagine 441 00:30:33,659 --> 00:30:35,626 you're performing those same movements. 442 00:30:37,387 --> 00:30:41,046 - We have them watch an animation of an arm, 443 00:30:41,080 --> 00:30:44,394 imagine themselves as controlling those movements, 444 00:30:46,775 --> 00:30:48,294 and we'll build this algorithm 445 00:30:48,329 --> 00:30:50,538 decoding the movement intention. 446 00:30:52,333 --> 00:30:53,058 Excellent. 447 00:30:55,405 --> 00:30:56,785 - [Bolu] We then send that intention 448 00:30:56,820 --> 00:31:00,962 to electrodes implanted in Bill's arm and hand. 449 00:31:00,997 --> 00:31:03,344 And our goal is to restore movement. 450 00:31:05,415 --> 00:31:07,831 [light electronic music] 451 00:31:07,866 --> 00:31:09,143 - [Man] Okay Bill, I'm gonna go ahead 452 00:31:09,177 --> 00:31:10,351 and connect you to that, all right? 453 00:31:10,385 --> 00:31:11,214 - [Bill] Yes. 454 00:31:12,422 --> 00:31:13,872 - [Man] Okay, it's on. 455 00:31:13,906 --> 00:31:16,322 - [Bolu] If the electrical activity in Bill's brain 456 00:31:16,357 --> 00:31:18,497 can stimulate in the right pattern, 457 00:31:18,531 --> 00:31:22,190 you can get an arm to move, a hand to open, a hand to close. 458 00:31:24,330 --> 00:31:25,849 - [Man] Connect that to the card. 459 00:31:28,334 --> 00:31:31,027 - [Robert] Imagine reaching out, imagine reaching up. 460 00:31:32,269 --> 00:31:34,754 - [Man] All right, here we go. 461 00:31:44,143 --> 00:31:45,317 - [Bill] But it may not work. 462 00:31:49,079 --> 00:31:51,392 The signals change in my brain, 463 00:31:51,426 --> 00:31:56,431 so that's a task they struggle with a lot. 464 00:32:09,410 --> 00:32:11,239 - [Bolu] We believe the potential is massive. 465 00:32:13,759 --> 00:32:16,727 But there are some technical challenges 466 00:32:16,762 --> 00:32:18,867 we are working towards overcoming. 467 00:32:22,526 --> 00:32:25,460 [chair whirring] 468 00:32:34,400 --> 00:32:36,333 - [Bryan] It's extraordinarily difficult 469 00:32:36,368 --> 00:32:39,164 to make breakthroughs in neuroscience. 470 00:32:41,131 --> 00:32:44,997 Scientists are tackling these really complicated problems, 471 00:32:45,032 --> 00:32:46,378 trying to do things that other people 472 00:32:46,412 --> 00:32:47,827 consider to be impossible. 473 00:32:49,933 --> 00:32:52,418 And it makes it both an extremely exciting time 474 00:32:52,453 --> 00:32:53,661 but also, it's daunting 475 00:32:53,695 --> 00:32:58,321 because we don't know the best approach 476 00:32:58,355 --> 00:32:59,287 to make progress. 477 00:33:01,289 --> 00:33:04,465 [lightly tense music] 478 00:33:05,431 --> 00:33:07,192 [Bryan] There's many reasons why neuroscience 479 00:33:07,226 --> 00:33:09,470 is an extremely difficult field to build in. 480 00:33:11,368 --> 00:33:13,336 You not only have the technological challenge 481 00:33:13,370 --> 00:33:15,234 of getting good neural signal, 482 00:33:16,442 --> 00:33:19,445 but then you have the scientific challenge 483 00:33:19,480 --> 00:33:22,241 of figuring out how to decode that signal. 484 00:33:23,380 --> 00:33:25,037 Optics, we're looking at geometries 485 00:33:25,072 --> 00:33:28,385 in our design of the hardware. 
486 00:33:28,420 --> 00:33:31,492 Then you have intense capital requirements often times 487 00:33:31,526 --> 00:33:33,735 in the hundreds of millions of dollars. 488 00:33:33,770 --> 00:33:36,186 Brandon, how are we doing on budget? 489 00:33:36,221 --> 00:33:38,223 We need to get this done in that timeframe. 490 00:33:38,257 --> 00:33:39,569 I know it's gonna be a lot. 491 00:33:39,603 --> 00:33:41,847 And finally, you have important safety 492 00:33:41,881 --> 00:33:43,400 and ethical considerations. 493 00:33:47,266 --> 00:33:50,338 It's challenging, to say the least. 494 00:33:53,307 --> 00:33:56,620 [Bryan] But I think it is absolutely essential 495 00:33:56,655 --> 00:33:58,657 that we make breakthroughs in the brain. 496 00:34:00,038 --> 00:34:01,246 If we can do that, 497 00:34:02,868 --> 00:34:06,147 we can overcome our biological limitations. 498 00:34:08,908 --> 00:34:13,499 We can reject the things that stop us from moving forward. 499 00:34:16,399 --> 00:34:19,195 [tense music] 500 00:34:19,229 --> 00:34:22,681 - [Announcer] Dr. Ford, please dial 114. 501 00:34:26,409 --> 00:34:28,790 - [Stephen] This procedure was purely by accident. 502 00:34:31,138 --> 00:34:34,210 - [Denise] Stephen had been to his eye specialist. 503 00:34:34,244 --> 00:34:38,869 He said, "I know a doctor who is doing some amazing things." 504 00:34:41,562 --> 00:34:44,047 - [Stephen] It's called the Argus procedure. 505 00:34:44,082 --> 00:34:46,463 They've done between eight and ten patients already. 506 00:34:48,534 --> 00:34:51,365 They implant a chip underneath the eye, 507 00:34:51,399 --> 00:34:54,299 and electrodes are hooked up to the brain. 508 00:34:54,333 --> 00:34:57,336 - [Dr.Devenyi] Hi, Stephen, it's Dr. Devenyi, how are you? 509 00:34:57,371 --> 00:35:00,995 This procedure, it's for sure the most complicated 510 00:35:01,029 --> 00:35:02,100 operation we do. 511 00:35:02,134 --> 00:35:03,308 This won't be bad. - [Stephen] Sure. 512 00:35:03,342 --> 00:35:04,343 I cleared my schedule today 513 00:35:04,378 --> 00:35:05,827 just for this occasion. - [Dr. Devenyi] Good. 514 00:35:05,862 --> 00:35:09,314 And it's a significant intrusion into the eye. 515 00:35:09,348 --> 00:35:11,488 Basically there's a band that goes around the eye. 516 00:35:11,523 --> 00:35:13,249 Far back, we don't see it. 517 00:35:13,283 --> 00:35:16,183 And that has the receiving and transmitting electrodes. 518 00:35:16,217 --> 00:35:17,701 And then there's a portion from that... 519 00:35:17,736 --> 00:35:19,324 - [Stephen] It's a new technology, 520 00:35:20,256 --> 00:35:22,189 so I'm not like I got a brochure. 521 00:35:23,914 --> 00:35:25,261 It's not a hundred percent, so 522 00:35:27,194 --> 00:35:28,126 something could go wrong. 523 00:35:28,160 --> 00:35:29,334 - [Dr. Devenyi] I'll see you soon. 524 00:35:29,368 --> 00:35:30,818 - We'll get this done. - [Stephen] Thank you very much. 525 00:35:30,852 --> 00:35:31,888 - [Dr. Devenyi] Okay, take care. 526 00:35:33,338 --> 00:35:35,995 - [Stephen] But you reach a point when you're 55 years old, 527 00:35:37,411 --> 00:35:38,481 you're blind. 528 00:35:41,518 --> 00:35:43,037 Where do you go from there? 529 00:35:45,212 --> 00:35:47,697 So I said, yeah sure, why not? 530 00:35:48,629 --> 00:35:49,561 I got lots of time. 531 00:35:53,358 --> 00:35:55,463 - [Anne] If you could do this list, 532 00:35:55,498 --> 00:35:58,294 we have to remember to tell the doctor tomorrow. 533 00:35:58,328 --> 00:35:59,364 - [Stan] Okay. 
534 00:36:01,676 --> 00:36:04,058 So half hour before your meds. 535 00:36:04,092 --> 00:36:05,542 - [Anne] Everything slows down. 536 00:36:06,957 --> 00:36:08,407 That writhing of my leg... 537 00:36:08,442 --> 00:36:10,754 - [Stan] It's been several months. 538 00:36:10,789 --> 00:36:14,344 And this decision for the surgery, 539 00:36:15,380 --> 00:36:17,520 it's been very difficult for her of course 540 00:36:19,038 --> 00:36:20,488 and challenging for me. 541 00:36:24,354 --> 00:36:26,184 But my kids and I 542 00:36:27,392 --> 00:36:30,015 are encouraging her more and more to consider it. 543 00:36:30,049 --> 00:36:31,223 Both legs are cramping or just your right leg? 544 00:36:31,258 --> 00:36:32,880 - [Anne] No just my right leg. 545 00:36:32,914 --> 00:36:36,194 - [Stan] Right foot and leg cramping. 546 00:36:37,195 --> 00:36:38,472 [somber music] 547 00:36:38,506 --> 00:36:40,405 Right foot over your left knee. 548 00:36:42,786 --> 00:36:43,925 Let's switch. 549 00:36:43,960 --> 00:36:46,618 Her symptoms are growing worse and worse. 550 00:36:46,652 --> 00:36:47,826 Come over to the left. 551 00:36:50,484 --> 00:36:51,864 Pull your right leg in, 552 00:36:55,109 --> 00:36:57,249 - [Anne] My husband's been incredible. 553 00:37:01,357 --> 00:37:05,119 But the thing that I'm concerned about is that 554 00:37:07,432 --> 00:37:09,365 not only do I have the illness, 555 00:37:12,747 --> 00:37:14,197 he has the illness. 556 00:37:17,649 --> 00:37:19,271 My family has the illness. 557 00:37:23,310 --> 00:37:26,002 - [Stan] Last time that we were sleeping in the same bed 558 00:37:28,453 --> 00:37:32,767 it's been at least a few years, I can't remember anymore. 559 00:37:34,976 --> 00:37:37,427 [somber music] 560 00:37:40,223 --> 00:37:42,363 - [Anne] I don't want to be a masochist 561 00:37:43,916 --> 00:37:46,954 because of my insistence on one way being the right way. 562 00:37:50,268 --> 00:37:52,753 But when it affects so many people, 563 00:37:55,618 --> 00:37:58,276 I think you also have a duty 564 00:38:00,139 --> 00:38:03,142 to try what's reasonable. 565 00:38:14,706 --> 00:38:15,500 So... 566 00:38:18,226 --> 00:38:19,918 I've decided to do it. 567 00:38:26,027 --> 00:38:27,443 [music crescendos, machine beeping] 568 00:38:27,477 --> 00:38:29,307 - [Stephen] I don't expect a miracle. 569 00:38:35,382 --> 00:38:37,418 I don't expect to be boarding a plane 570 00:38:37,453 --> 00:38:38,557 and going to Turkey. 571 00:38:43,735 --> 00:38:47,325 But I would like to move about on my own freely, 572 00:38:50,328 --> 00:38:51,398 within reason. 573 00:38:52,537 --> 00:38:54,124 - [Bill] Drapes. - [Computer] Drapes. 574 00:38:54,918 --> 00:38:55,747 - [Bill] Open. - [Computer] Open. 575 00:38:57,127 --> 00:38:59,509 [whirring] 576 00:38:59,544 --> 00:39:01,166 - [Bill] After my accident, 577 00:39:02,236 --> 00:39:03,306 - [Computer] Head up. 578 00:39:06,171 --> 00:39:08,069 - [Bill] They gave me the diagnosis and told me 579 00:39:08,104 --> 00:39:10,175 "You'll probably be a tetraplegic 580 00:39:10,209 --> 00:39:11,901 for the rest of your life." 581 00:39:11,935 --> 00:39:13,040 - [Computer] Head up. 582 00:39:16,906 --> 00:39:19,287 - [Bill] But who knows what breakthroughs will happen? 583 00:39:26,156 --> 00:39:29,884 - [Anne] Before, there didn't seem to be any other solution 584 00:39:29,919 --> 00:39:33,405 to break out of this except to make peace with it. 585 00:39:35,890 --> 00:39:38,755 But I'm not so interested in making peace with it anymore. 
586 00:39:41,896 --> 00:39:44,692 - [Stephen] I would like to be able to live independently 587 00:39:47,212 --> 00:39:48,247 and travel independently. 588 00:39:52,148 --> 00:39:54,253 - [Anne] I would love to be doing artwork again. 589 00:39:55,565 --> 00:39:58,085 You know, just being able to express myself artistically. 590 00:39:59,431 --> 00:40:00,467 - [Bill] Maybe I'll be able 591 00:40:00,501 --> 00:40:03,055 to move around in my chair, 592 00:40:03,090 --> 00:40:05,023 pick something up, eat something. 593 00:40:06,231 --> 00:40:07,750 Yeah I'm hoping for that day. 594 00:40:09,993 --> 00:40:11,754 - [Anne] I don't need a perfect life. 595 00:40:13,272 --> 00:40:14,929 I don't need the best of anything. 596 00:40:17,932 --> 00:40:21,349 I just want to feel human again. 597 00:40:23,006 --> 00:40:25,388 [tense music] 598 00:40:28,495 --> 00:40:30,876 - [Dr. Lozano] The first thing we do is bolt a frame 599 00:40:30,911 --> 00:40:32,326 onto their skull. 600 00:40:32,360 --> 00:40:33,914 What are the measurements please? 601 00:40:34,984 --> 00:40:36,192 And they have to be awake. 602 00:40:36,226 --> 00:40:38,159 Anne, how are you doing? 603 00:40:38,194 --> 00:40:40,783 [tools hissing] 604 00:40:45,373 --> 00:40:49,550 We then drill two holes in their head. 605 00:40:49,585 --> 00:40:50,896 Okay, let's go. 606 00:40:51,897 --> 00:40:54,521 [drill whirring] 607 00:40:58,594 --> 00:41:01,700 We then place electrodes within trouble-making areas 608 00:41:01,735 --> 00:41:02,494 in the brain. 609 00:41:07,188 --> 00:41:10,191 This is a game where you have to be within one millimeter. 610 00:41:10,226 --> 00:41:11,296 What are the measurements please? 611 00:41:11,330 --> 00:41:13,298 That one millimeter means a difference 612 00:41:13,332 --> 00:41:15,196 between success and failure. 613 00:41:15,231 --> 00:41:17,544 We're gonna go down to zero. 614 00:41:19,753 --> 00:41:21,202 We put it about there. 615 00:41:22,410 --> 00:41:23,481 Alright, so we're in. 616 00:41:25,724 --> 00:41:27,519 So we should start recording neurons. 617 00:41:30,142 --> 00:41:31,454 [clicking] 618 00:41:31,489 --> 00:41:33,007 Anne? 619 00:41:33,042 --> 00:41:34,319 We're starting to hear some of the neurons 620 00:41:34,353 --> 00:41:35,354 in your brain now. 621 00:41:38,012 --> 00:41:39,531 Once we're inside, 622 00:41:39,566 --> 00:41:42,258 we get the patient to move their arm or their leg, 623 00:41:44,122 --> 00:41:47,228 and we see whether those neurons are activated. 624 00:41:47,263 --> 00:41:49,783 So this neurons is involved with moving the shoulder. 625 00:41:51,785 --> 00:41:53,407 - [Woman] Cleaned? 626 00:41:53,441 --> 00:41:54,477 - [Nurse] Yes. 627 00:41:56,617 --> 00:41:58,792 [beeping, music transition] 628 00:42:03,590 --> 00:42:04,522 [Dr. Devenyi] Forceps? 629 00:42:06,075 --> 00:42:08,353 - Steven's procedure is truly the most remarkable 630 00:42:08,387 --> 00:42:11,667 development that I've witnessed in my career 631 00:42:11,701 --> 00:42:13,254 and truly something I didn't think any of us 632 00:42:13,289 --> 00:42:14,980 would see in our lifetimes. 633 00:42:15,015 --> 00:42:16,672 [Dr. Devenyi] Ready for the implants. 634 00:42:18,294 --> 00:42:21,228 - It essentially is a coalescence 635 00:42:21,262 --> 00:42:22,574 of machine and man 636 00:42:22,609 --> 00:42:26,958 in a way that is really quite hard to fathom. 637 00:42:26,992 --> 00:42:28,097 - We're good. 638 00:42:28,891 --> 00:42:29,823 Calipers please? 639 00:42:31,238 --> 00:42:32,342 - [Dr. Lozano] Anne? 
640 00:42:32,377 --> 00:42:34,793 We're now gonna start to put some electricity 641 00:42:34,828 --> 00:42:36,312 through your electrode. 642 00:42:37,624 --> 00:42:39,833 - [Dr. Nicolelis] What we're seeing is that technology's 643 00:42:39,867 --> 00:42:41,213 becoming part of us. 644 00:42:41,248 --> 00:42:42,801 - [Dr. Lozano] When I see three tell me 645 00:42:42,836 --> 00:42:44,216 if you feel anything, ready? 646 00:42:44,251 --> 00:42:47,392 - [Dr. Nicolelis] Because we are linking biological brains 647 00:42:47,426 --> 00:42:48,738 directly to machines. 648 00:42:49,774 --> 00:42:51,016 - [Dr. Lozano] Look at her face carefully, 649 00:42:51,051 --> 00:42:52,431 there was a contraction in her lip. 650 00:42:52,466 --> 00:42:53,467 1, 2, 3. 651 00:42:54,882 --> 00:42:57,402 So four is definitely moving her face. 652 00:42:57,436 --> 00:42:59,335 - [Dustin] If you look at machines, 653 00:42:59,369 --> 00:43:00,716 they're capable, right? 654 00:43:00,750 --> 00:43:02,131 - [Dr. Lozano] Still there. 655 00:43:02,165 --> 00:43:03,373 - [Dustin] They're bigger, they're faster, they're stronger, 656 00:43:03,408 --> 00:43:04,582 all that stuff. 657 00:43:04,616 --> 00:43:06,100 - [Dr. Devenyi] Go ahead with the vitrectomy now. 658 00:43:06,135 --> 00:43:07,654 - [Dustin] The human's a fragile beast, 659 00:43:07,688 --> 00:43:09,759 but it's creative and it's smart and it can push beyond. 660 00:43:09,794 --> 00:43:11,450 [Dr. Devenyi] Electrodes are functioning? 661 00:43:11,485 --> 00:43:12,866 [Nurse] Yes. 662 00:43:12,900 --> 00:43:14,350 - [Dustin] Bring 'em together in symbiosis, 663 00:43:14,384 --> 00:43:16,179 [Dr. Devenyi] Focus on. 664 00:43:16,214 --> 00:43:16,973 That's good. 665 00:43:18,216 --> 00:43:19,493 - [Dustin] Where you can keep the pieces 666 00:43:19,527 --> 00:43:20,528 that make us human, 667 00:43:21,599 --> 00:43:22,738 but use the advantages of the machine 668 00:43:22,772 --> 00:43:24,532 in ways we haven't been able to do before. 669 00:43:24,567 --> 00:43:26,224 [Dr. Devenyi] Marking pen please. 670 00:43:27,777 --> 00:43:30,435 - By this combination of electronics 671 00:43:30,469 --> 00:43:31,781 and the human body, 672 00:43:31,816 --> 00:43:33,196 Is the infusion on? 673 00:43:33,231 --> 00:43:36,337 - I would hazard to say it's one of the most amazing 674 00:43:36,372 --> 00:43:37,856 achievements in all of medicine. 675 00:43:39,720 --> 00:43:41,377 Forceps. 676 00:43:42,550 --> 00:43:44,345 - To be able to give back abilities 677 00:43:44,380 --> 00:43:47,866 to people that otherwise had no options whatsoever, 678 00:43:47,901 --> 00:43:49,247 Load the tac. 679 00:43:54,183 --> 00:43:54,942 Secure. 680 00:43:57,876 --> 00:43:59,395 - It's really quite remarkable. 681 00:44:02,881 --> 00:44:04,365 [Dr. Lozano] Anne? 682 00:44:04,400 --> 00:44:06,298 You're going to go to sleep. 683 00:44:06,333 --> 00:44:09,992 And after that we're going to put the battery in your chest. 684 00:44:10,026 --> 00:44:11,407 Okay. 685 00:44:11,441 --> 00:44:15,273 We are just seeing the very beginnings of this technology. 686 00:44:15,307 --> 00:44:16,930 So we're gonna leave an electrode here. 687 00:44:16,964 --> 00:44:19,691 Where there will be a blurring of the boundary 688 00:44:19,726 --> 00:44:22,867 between machine and biology 689 00:44:24,075 --> 00:44:26,767 and the two will, one day, merge. 690 00:44:26,802 --> 00:44:27,595 We're done. 
691 00:44:45,406 --> 00:44:47,408 - [VOICE] The fact is is that we are transforming 692 00:44:47,443 --> 00:44:50,273 into a new species, a technological species. 693 00:44:53,483 --> 00:44:55,002 - [Dustin] It's really crossing the skin into 694 00:44:55,037 --> 00:44:57,177 the holy grail, the brain, and all these things 695 00:44:57,211 --> 00:44:59,420 that people associate with being who we are 696 00:44:59,455 --> 00:45:01,457 that people are somewhat uncomfortable with. 697 00:45:03,355 --> 00:45:06,013 - [Ramez] If we can change what our brains can do, 698 00:45:07,394 --> 00:45:08,740 what does that mean? 699 00:45:12,019 --> 00:45:14,573 - [Sara] As we're trying to intervene in the brain, 700 00:45:14,608 --> 00:45:17,922 we need to be very conscious of the ripple effects. 701 00:45:20,269 --> 00:45:23,203 - [Voice] We don't know what the outcome is going to be. 702 00:45:23,237 --> 00:45:27,241 - [Man] For some people, it's a sci-fi step too far. 703 00:45:32,419 --> 00:45:34,455 - [Bryan] There's a few hundred thousand people in the world 704 00:45:34,490 --> 00:45:37,251 who currently have implanted technology in their brain, 705 00:45:40,151 --> 00:45:42,394 and not many people know that. 706 00:45:45,328 --> 00:45:46,847 And knowing that of course 707 00:45:46,882 --> 00:45:48,228 I think changes people's perspectives 708 00:45:48,262 --> 00:45:50,402 that this has happened.... 709 00:45:50,437 --> 00:45:52,128 [Man] Okay we're gonna put your forearm in a splint. 710 00:45:52,163 --> 00:45:53,233 - [Bryan] It's happening. 711 00:45:54,924 --> 00:45:56,305 [Man] All right, here we go. 712 00:45:56,339 --> 00:45:59,239 - [Bryan] And it's now moving at a speed 713 00:45:59,273 --> 00:46:00,930 that is much faster than before. 714 00:46:04,209 --> 00:46:05,279 - [Man] That's good. 715 00:46:05,314 --> 00:46:06,694 Try it again. 716 00:46:06,729 --> 00:46:08,248 - [Bryan] But the reality is 717 00:46:08,282 --> 00:46:11,147 implantable technology is intimidating. 718 00:46:12,183 --> 00:46:14,979 It's expensive, invasive, 719 00:46:15,013 --> 00:46:16,497 and only recommended for people 720 00:46:16,532 --> 00:46:18,776 with severe disease and dysfunction. 721 00:46:23,608 --> 00:46:25,852 But what if that wasn't the case? 722 00:46:27,336 --> 00:46:30,408 What if we had brain technology that was broadly accessible? 723 00:46:33,204 --> 00:46:34,032 And what if... 724 00:46:38,105 --> 00:46:39,762 it didn't require surgery? 725 00:46:43,939 --> 00:46:46,355 [tense music] 726 00:46:47,908 --> 00:46:49,772 Fortunately we are reaching a point 727 00:46:51,084 --> 00:46:52,706 where these possibilities are emerging. 728 00:46:57,228 --> 00:46:59,264 Imagine you had a brain interface 729 00:47:00,369 --> 00:47:01,680 you could just put on your head, 730 00:47:01,715 --> 00:47:04,442 no surgeries required and you could see 731 00:47:04,476 --> 00:47:05,684 all of your brain activity. 732 00:47:05,719 --> 00:47:06,616 Not just what you're aware of, 733 00:47:06,651 --> 00:47:08,308 but all of your brain activity. 734 00:47:09,274 --> 00:47:11,656 All of your thoughts, concerns, 735 00:47:11,690 --> 00:47:13,416 it was just out in front of you. 736 00:47:15,936 --> 00:47:18,387 So for example if you struggle with anxiety, 737 00:47:19,215 --> 00:47:20,527 you'd see in your brain activity 738 00:47:20,561 --> 00:47:24,358 that you've had this thought 94 times today 739 00:47:24,393 --> 00:47:25,877 and it's probably not necessary. 
740 00:47:27,741 --> 00:47:29,156 Could this help you 741 00:47:29,191 --> 00:47:31,365 so you don't have that thought 94 times? 742 00:47:32,366 --> 00:47:34,437 Or can you see in this thought process 743 00:47:34,472 --> 00:47:36,405 that you were limiting your ability to do something 744 00:47:36,439 --> 00:47:38,269 because you didn't believe you could do it? 745 00:47:39,235 --> 00:47:40,202 But that's a limiting belief. 746 00:47:40,236 --> 00:47:41,479 You really can do it. 747 00:47:43,239 --> 00:47:45,621 And if we have the technology that gives us the ability 748 00:47:45,655 --> 00:47:47,623 to improve how our brains function, 749 00:47:49,245 --> 00:47:53,111 can we rethink the things that have always held us back? 750 00:47:57,805 --> 00:48:00,015 As someone who's building these tools 751 00:48:00,049 --> 00:48:01,292 to interface with the brain, 752 00:48:03,397 --> 00:48:06,262 my hope is that we get to a point 753 00:48:06,297 --> 00:48:08,851 in technological advancements 754 00:48:08,886 --> 00:48:11,785 where we are not limited by our technology, 755 00:48:11,819 --> 00:48:15,064 we're empowered by it, so it is a matter of choice 756 00:48:16,859 --> 00:48:18,205 of what we want to become. 757 00:48:23,659 --> 00:48:25,799 - [Nita] We're at the moment where there are a lot 758 00:48:25,833 --> 00:48:28,215 of very rapidly emerging technologies. 759 00:48:29,976 --> 00:48:32,495 Brain computer interfaces are starting to become part 760 00:48:32,530 --> 00:48:33,807 of mainstream society. 761 00:48:35,464 --> 00:48:38,156 There already is technology that can pick up simple, 762 00:48:38,191 --> 00:48:40,365 electrical activity from your brain 763 00:48:40,400 --> 00:48:43,196 through these things, like consumer EEG devices. 764 00:48:43,230 --> 00:48:45,129 - [Woman] The Emotiv Insight is a portable, 765 00:48:45,163 --> 00:48:47,096 wearable device that allows you to 766 00:48:47,131 --> 00:48:49,650 capture what's going on in your brain in real time. 767 00:48:51,135 --> 00:48:54,345 - [Nita] If I wear one of these devices I can start to decode 768 00:48:54,379 --> 00:48:56,830 whether or not I'm drowsy or awake, 769 00:48:56,864 --> 00:49:01,214 whether or not paying attention or I'm unfocused, 770 00:49:01,248 --> 00:49:04,251 angry or experiencing some other kind of emotional state. 771 00:49:04,286 --> 00:49:05,839 - [Woman 2] So this is my brain right now? 772 00:49:05,873 --> 00:49:07,185 - [Woman] This is your brain right now. 773 00:49:07,220 --> 00:49:08,393 - [Woman 2] Oh my gosh. - [Woman] In real time. 774 00:49:08,428 --> 00:49:09,601 - [Woman 2] This is wild. 775 00:49:09,636 --> 00:49:12,673 - [Nita] There are also a number of people using headsets 776 00:49:12,708 --> 00:49:14,468 to focus and perform better. 777 00:49:14,503 --> 00:49:17,161 - [Man] You can buy a TDCS device online 778 00:49:17,195 --> 00:49:20,267 or you can go to youtube and build one yourself. 779 00:49:20,302 --> 00:49:21,959 - [YouTuber] I have been experimenting in an effort 780 00:49:21,993 --> 00:49:26,170 to hopefully see some really awesome cognitive enhancement. 781 00:49:27,274 --> 00:49:28,689 - [Dr. Nicolelis] The future of brain interfaces is probably 782 00:49:28,724 --> 00:49:30,415 the most fascinating unknown 783 00:49:32,314 --> 00:49:34,178 because we are at the infancy of this. 
784 00:49:36,594 --> 00:49:39,321 - [Ramez] As we crack the code of the brain, 785 00:49:39,355 --> 00:49:41,979 we're handing the reins of our own lives, 786 00:49:42,013 --> 00:49:44,705 of our own minds to us. 787 00:49:44,740 --> 00:49:47,743 We get to choose how we think, who we are. 788 00:49:53,024 --> 00:49:55,682 - [David] As we get better and better as a society 789 00:49:55,716 --> 00:49:58,305 at building brain interfaces, 790 00:49:59,203 --> 00:50:01,274 these can be applied to every aspect 791 00:50:01,308 --> 00:50:02,344 of our biological existence. 792 00:50:06,348 --> 00:50:08,591 [Birds chirping] 793 00:50:08,626 --> 00:50:10,455 - [Ramez] Could we teach you a new skill? 794 00:50:14,356 --> 00:50:16,530 Could we help you communicate the emotion 795 00:50:16,565 --> 00:50:18,222 you're feeling to a loved one? 796 00:50:21,363 --> 00:50:23,020 What can and can't we do? 797 00:50:30,958 --> 00:50:32,132 - [Justin] Alright then Preston. 798 00:50:32,167 --> 00:50:33,789 The experiment will begin shortly. 799 00:50:36,274 --> 00:50:39,346 Once you are asked a question, look on the left hand side 800 00:50:39,381 --> 00:50:41,831 if the answer to that question is yes. 801 00:50:41,866 --> 00:50:44,248 Look on the right hand side if the answer 802 00:50:44,282 --> 00:50:45,525 to that question is no. 803 00:50:46,491 --> 00:50:48,355 Does that make sense to you? 804 00:50:48,390 --> 00:50:49,391 - [Preston] Yes. 805 00:50:49,425 --> 00:50:50,495 - [Justin] Wonderful. 806 00:50:51,910 --> 00:50:53,464 Are you comfortable? 807 00:50:53,498 --> 00:50:54,223 - [Preston] Mm. 808 00:50:58,193 --> 00:51:02,231 - [Man] Subject number one is wearing an EEG cap 809 00:51:02,266 --> 00:51:04,854 to record electrical fluctuations 810 00:51:04,889 --> 00:51:05,717 from their brain. 811 00:51:09,273 --> 00:51:12,241 Subject number two is receiving the information 812 00:51:12,276 --> 00:51:15,382 through a magnetic pulse to the back of the brain. 813 00:51:18,213 --> 00:51:19,317 - [Chantel] In the current experiment, 814 00:51:19,352 --> 00:51:22,217 we have two subjects in two different rooms, 815 00:51:22,251 --> 00:51:24,633 playing a game of 20 questions. 816 00:51:24,667 --> 00:51:27,498 The big difference is that the two people asking 817 00:51:27,532 --> 00:51:31,329 and answering questions are doing so using only signals 818 00:51:31,364 --> 00:51:33,090 from their brains. 819 00:51:33,124 --> 00:51:34,884 - [Steven] TMS is armed. 820 00:51:34,919 --> 00:51:36,645 - [Computer] The category is animals. 821 00:51:36,679 --> 00:51:38,233 There are eight possible answers. 822 00:51:40,718 --> 00:51:42,340 - [Justin] Data is looking good. 823 00:51:42,375 --> 00:51:43,341 Let's go ahead. 824 00:51:43,376 --> 00:51:45,550 - [Steven] You can send the first question. 825 00:52:03,050 --> 00:52:04,224 - [Woman] The answer is yes. 826 00:52:06,261 --> 00:52:07,124 - [Justin] Excellent. 827 00:52:11,266 --> 00:52:13,613 - [Chantel] It doesn't matter if they're in the same room 828 00:52:13,647 --> 00:52:16,063 or on the other side of the world. 829 00:52:16,098 --> 00:52:19,860 We can extract the information from one brain, 830 00:52:19,895 --> 00:52:22,069 transmit it to a device that can encode 831 00:52:22,104 --> 00:52:23,450 information into another brain. 832 00:52:23,485 --> 00:52:25,176 - [Woman] No. 833 00:52:25,211 --> 00:52:26,729 - [Chantel] Ultimately, the goal of this technology 834 00:52:26,764 --> 00:52:29,042 is to improve the human experience. 
835 00:52:30,181 --> 00:52:33,046 To offer up the possibility of sharing 836 00:52:33,080 --> 00:52:36,567 the contents of my mind, my emotional state 837 00:52:37,706 --> 00:52:41,158 into somebody else, without words. 838 00:52:43,919 --> 00:52:46,197 - [Ramez] When we can understand someone 839 00:52:46,232 --> 00:52:49,269 without the barrier of different languages, 840 00:52:49,304 --> 00:52:53,687 we could start to blur the boundaries between us. 841 00:52:53,722 --> 00:52:56,207 We could start to see the whole world 842 00:52:56,242 --> 00:52:58,761 as members of our tribe 843 00:52:58,796 --> 00:53:01,626 and have more empathy and compassion for them. 844 00:53:03,904 --> 00:53:06,735 - [Nita] It's always really hard in advance to know 845 00:53:06,769 --> 00:53:09,669 when you're at a turning point in society, 846 00:53:09,703 --> 00:53:12,327 when technology is gonna be so transformative 847 00:53:12,361 --> 00:53:14,363 that everything will change. 848 00:53:14,398 --> 00:53:16,676 And the question is, are we there right now? 849 00:53:18,229 --> 00:53:19,713 Are we about to fundamentally change 850 00:53:19,748 --> 00:53:21,198 what it means to be human? 851 00:53:22,509 --> 00:53:23,579 And if so, 852 00:53:26,341 --> 00:53:27,376 are we okay with that? 853 00:53:33,140 --> 00:53:34,072 - [Woman] Safety test. 854 00:53:43,530 --> 00:53:46,395 - [Dr. Devenyi] Stephen's implant is a brilliant device 855 00:53:47,327 --> 00:53:49,398 that essentially has two components. 856 00:53:50,330 --> 00:53:53,368 A band that's sewn around the eye 857 00:53:53,402 --> 00:53:56,854 with 60 electrodes on it that stimulate the retina. 858 00:53:57,648 --> 00:53:58,752 - [Woman] Glasses on. 859 00:54:02,100 --> 00:54:03,516 - [Dr. Devenyi] And a pair of glasses 860 00:54:03,550 --> 00:54:05,207 with a video camera on it 861 00:54:06,657 --> 00:54:08,486 that sends images wirelessly 862 00:54:08,521 --> 00:54:12,007 to the electrodes on the surface of the eyeball. 863 00:54:13,491 --> 00:54:14,492 - [Woman] Beginning. 864 00:54:16,874 --> 00:54:18,703 - [Devenyi] Those images then travel along 865 00:54:18,738 --> 00:54:21,741 the natural pathway of the optic nerve to the brain. 866 00:54:22,742 --> 00:54:24,399 - [Woman] Electrodes activated. 867 00:54:28,368 --> 00:54:30,336 - [Woman] You have good connection. 868 00:54:31,406 --> 00:54:33,753 We're gonna activate your device. 869 00:54:35,513 --> 00:54:36,307 Light off. 870 00:54:40,760 --> 00:54:41,795 Big light on please. 871 00:54:49,389 --> 00:54:51,322 Respond yes or no as to whether 872 00:54:51,357 --> 00:54:53,566 you're able to perceive something or not 873 00:54:53,600 --> 00:54:55,740 using the game controller. 874 00:54:56,914 --> 00:54:58,191 - [Stephen] Okay. 875 00:54:58,225 --> 00:55:00,331 - [Woman] So we wanna see how much stimulation 876 00:55:00,366 --> 00:55:01,988 each of your electrodes needs. 877 00:55:02,022 --> 00:55:02,885 - [Stephen] Right. 878 00:55:06,026 --> 00:55:07,373 - [Woman] Sending stimulation. 879 00:55:09,927 --> 00:55:12,136 [beeping] 880 00:55:13,275 --> 00:55:14,828 - [Stephen] I see something flashing. 881 00:55:14,863 --> 00:55:16,416 - [Woman] Got it, okay. 882 00:55:17,624 --> 00:55:19,868 [beeping] 883 00:55:22,388 --> 00:55:23,596 - [Stephen] Another flash. 884 00:55:23,630 --> 00:55:24,976 - [Woman] Okay. 885 00:55:25,874 --> 00:55:27,876 Does it seem any brighter? 886 00:55:27,910 --> 00:55:29,360 - [Stephen] Too bright. 887 00:55:29,395 --> 00:55:30,534 - [Woman] Reset. 
888 00:55:33,364 --> 00:55:34,538 - [Woman] Okay then I'm just gonna tweak 889 00:55:34,572 --> 00:55:35,470 some settings a little bit. 890 00:55:38,300 --> 00:55:40,302 Go ahead and respond to this last electrode. 891 00:55:43,409 --> 00:55:44,444 - [Stephen] Oh. 892 00:55:47,447 --> 00:55:52,107 Very faintly, I can see a bit of the hand, yeah. 893 00:55:53,729 --> 00:55:54,868 - [Woman] That's good. 894 00:56:00,426 --> 00:56:02,704 [somber music] 895 00:56:02,738 --> 00:56:04,361 Go ahead and scan and see 896 00:56:04,395 --> 00:56:06,086 what you are able to perceive. 897 00:56:08,434 --> 00:56:11,402 - You know, to the left, to the right. 898 00:56:11,437 --> 00:56:12,369 There's nothing there. 899 00:56:17,201 --> 00:56:19,030 Yeah I'm not picking up anything. 900 00:56:21,447 --> 00:56:23,276 - [Woman] Doing some adjustments right now. 901 00:56:25,209 --> 00:56:26,797 - Setting stimulation. 902 00:56:29,455 --> 00:56:32,078 [pensive music] 903 00:56:33,459 --> 00:56:35,978 - [Stephen] Oh! 904 00:56:36,013 --> 00:56:37,359 I see something there. 905 00:56:37,394 --> 00:56:38,187 - [Woman] Okay. 906 00:56:40,397 --> 00:56:45,402 - I see, I mean, I've got the silhouette right there. 907 00:56:45,885 --> 00:56:46,575 - [Woman] Okay. 908 00:56:46,610 --> 00:56:47,921 - [Woman] Trigger. 909 00:56:48,957 --> 00:56:50,372 - [Woman] Continue responding. 910 00:56:55,412 --> 00:56:56,516 - [Stephen] I see you. 911 00:56:59,208 --> 00:57:00,071 Am I right? 912 00:57:01,418 --> 00:57:03,385 - It's been a long time, Stephen. 913 00:57:04,179 --> 00:57:05,283 - Denise? 914 00:57:05,318 --> 00:57:06,319 - [Denise] Yes. 915 00:57:08,701 --> 00:57:10,185 You're looking right at me. 916 00:57:15,190 --> 00:57:15,984 - Sorry. 917 00:57:17,744 --> 00:57:19,436 - [Woman] That's huge, Stephen. 918 00:57:20,575 --> 00:57:21,921 - [Stephen] It's a nice feeling. 919 00:57:23,992 --> 00:57:24,751 - That's great. 920 00:57:27,098 --> 00:57:28,686 - [Woman] Read that. 921 00:57:33,553 --> 00:57:34,381 - Okay. 922 00:57:42,942 --> 00:57:44,081 I see something there. 923 00:57:45,427 --> 00:57:48,326 - [Denise] I think he'll get out more, he'll do more, 924 00:57:48,361 --> 00:57:49,224 he'll meet people. 925 00:57:50,639 --> 00:57:53,366 - Okay I see her right in front of me. 926 00:57:53,400 --> 00:57:55,368 - [Denise] It'll open up a whole new world. 927 00:57:56,403 --> 00:57:57,232 - [Stephen] It's her. 928 00:57:58,233 --> 00:57:59,234 - [Woman] Exactly. 929 00:58:00,580 --> 00:58:02,513 - [Stephen] My, Denise, how you've grown. 930 00:58:02,548 --> 00:58:04,481 [chuckling] 931 00:58:09,486 --> 00:58:13,006 [light electronic music] 932 00:58:15,319 --> 00:58:18,253 - [Dr. Devenyi] The hope is our patients see edges of objects, 933 00:58:19,565 --> 00:58:23,361 empty seats on subways, the outlines of people. 934 00:58:24,362 --> 00:58:25,536 Even fireworks. 935 00:58:28,332 --> 00:58:30,679 But as the technology's been improving, 936 00:58:30,714 --> 00:58:32,060 they see more and more. 937 00:58:33,233 --> 00:58:34,856 There's no limit to how much you could 938 00:58:34,890 --> 00:58:36,271 crank up the magnification 939 00:58:36,305 --> 00:58:38,687 and what sort of details we could potentially see. 940 00:58:40,206 --> 00:58:43,312 - [Ramez] Right now, the prosthetic eyes that we have 941 00:58:43,347 --> 00:58:46,488 are not anywhere near as good as human eyes, 942 00:58:47,593 --> 00:58:50,665 but eventually they'll be better. 
943 00:58:50,699 --> 00:58:53,495 And why couldn't we add infrared vision to them, 944 00:58:53,530 --> 00:58:55,186 or UV vision to them? 945 00:58:58,293 --> 00:58:59,466 - [Man] So one of the latest experiments 946 00:58:59,501 --> 00:59:00,951 that we've been working on is 947 00:59:00,985 --> 00:59:03,609 using an infrared camera instead of the regular camera 948 00:59:03,643 --> 00:59:06,577 so patients could actually see in the dark. 949 00:59:06,612 --> 00:59:09,338 - [David] Our technology right now is about helping people 950 00:59:09,373 --> 00:59:12,238 with diseases or deficits of some sort 951 00:59:15,344 --> 00:59:17,174 but it's this exact same technology 952 00:59:17,208 --> 00:59:20,418 that will allow us to expand ourselves. 953 00:59:20,453 --> 00:59:22,351 Expand our reality, expand our senses, 954 00:59:22,386 --> 00:59:24,388 expand what we can do with our bodies. 955 00:59:31,533 --> 00:59:33,570 - [Dr. Munoz] The left side. 956 00:59:34,605 --> 00:59:38,022 About 130 hertz, 60 microseconds. 957 00:59:39,403 --> 00:59:41,405 Let me check your baseline. 958 00:59:41,439 --> 00:59:42,302 So this is... 959 00:59:43,441 --> 00:59:44,477 Relax. 960 00:59:45,443 --> 00:59:47,342 - [Anne] It was an extreme procedure. 961 00:59:50,414 --> 00:59:52,589 I mean they sliced my head open, 962 00:59:53,590 --> 00:59:55,039 they stuck tubes in. 963 00:59:58,180 --> 00:59:59,665 It's not for the faint of heart. 964 01:00:05,325 --> 01:00:06,499 - [Dr. Lozano] Deep Brain Stimulation 965 01:00:06,533 --> 01:00:08,432 is a two part surgical procedure. 966 01:00:11,608 --> 01:00:13,540 The electrodes are implanted in the brain 967 01:00:15,059 --> 01:00:17,752 and then connected to a pacemaker in their chest. 968 01:00:19,546 --> 01:00:20,617 [Dr. Munoz] 3.5 volts. 969 01:00:22,446 --> 01:00:25,345 - [Dr. Lozano] With a remote control, we are able to adjust 970 01:00:25,380 --> 01:00:27,934 how much current is delivered to the brain. 971 01:00:27,969 --> 01:00:29,384 Almost no rigidity. 972 01:00:29,418 --> 01:00:31,317 Very much like you would control the volume 973 01:00:31,351 --> 01:00:34,389 or change the channels on your television set. 974 01:00:34,423 --> 01:00:38,186 [Dr. Munoz] I will build up the voltage by .5 volts every step. 975 01:00:38,220 --> 01:00:41,085 In between steps I will check your rigidity. 976 01:00:41,120 --> 01:00:42,500 And the symptoms. 977 01:00:42,535 --> 01:00:44,710 [beeping] 978 01:00:47,091 --> 01:00:48,506 Did you feel anything? 979 01:00:48,541 --> 01:00:50,129 - Not yet, no. 980 01:00:50,163 --> 01:00:51,820 [Dr. Munoz] - So I'll go a little higher. 981 01:00:56,860 --> 01:00:59,345 1.5 here and here 1.5. 982 01:01:03,418 --> 01:01:04,177 3.5 volts, 80%. 983 01:01:10,045 --> 01:01:11,288 - Well this, whoa, 984 01:01:11,322 --> 01:01:13,393 my toes are really pulling on my left foot. 985 01:01:13,428 --> 01:01:15,706 They're going way up like that. 986 01:01:17,190 --> 01:01:19,365 - [Dr. Munoz] It's a good sign. 987 01:01:21,988 --> 01:01:24,025 - I feel my left side. - Left side? 988 01:01:24,059 --> 01:01:26,579 - Yeah, tingling on the bottom of my feet. 989 01:01:28,892 --> 01:01:30,548 - [Dr. Lozano] By using electrical stimulation, 990 01:01:31,688 --> 01:01:34,035 not only can we suppress the symptoms 991 01:01:35,277 --> 01:01:37,383 but we're able to discover new areas 992 01:01:37,417 --> 01:01:38,833 and new functions of the brain 993 01:01:38,867 --> 01:01:40,904 where no man has ever gone before. 
994 01:01:42,353 --> 01:01:43,769 - Ugh, my tongue is getting bigger and bigger. 995 01:01:43,803 --> 01:01:46,185 Feels like, uh, down my throat. 996 01:01:46,219 --> 01:01:48,394 - [Dr. Munoz] Tell me when it gets better. 997 01:01:48,428 --> 01:01:50,051 - [Dr. Lozano] We're also realizing that this technique 998 01:01:50,085 --> 01:01:52,122 could be used to treat other disorders 999 01:01:52,156 --> 01:01:54,503 like Alzheimer's Disease, like depression, 1000 01:01:54,538 --> 01:01:56,126 maybe even obesity. 1001 01:01:57,679 --> 01:01:59,474 - Oh my goodness, I can't describe it. 1002 01:01:59,508 --> 01:02:01,372 - Is it better now? 1003 01:02:01,407 --> 01:02:02,304 Improving? 1004 01:02:02,339 --> 01:02:03,236 - It's improving. - Yeah. 1005 01:02:03,271 --> 01:02:04,893 Okay, all right. 1006 01:02:04,928 --> 01:02:09,242 [Dr. Munoz] Let's just walk to that corner back and forth. 1007 01:02:09,277 --> 01:02:10,381 Very good. 1008 01:02:10,416 --> 01:02:13,281 Let's see, so we'll pick contact three. 1009 01:02:15,110 --> 01:02:17,250 - [Dr. Lozano] I am convinced that in the future, 1010 01:02:17,285 --> 01:02:19,425 I don't know if it'll be 10, 20, 30 years from now, 1011 01:02:19,459 --> 01:02:22,393 if you said I really would like to be smarter, 1012 01:02:22,428 --> 01:02:25,603 to be happier, I'd like to have more drive, more ambition. 1013 01:02:27,364 --> 01:02:29,228 There's no reason why, at least in theory, 1014 01:02:29,262 --> 01:02:32,403 you couldn't go to these areas and adjust the activity. 1015 01:02:33,680 --> 01:02:35,475 - [Doctor] For emergencies, you need to turn it off, 1016 01:02:35,510 --> 01:02:36,476 this shuts it off. 1017 01:02:36,511 --> 01:02:37,719 - [Anne] Okay. 1018 01:02:37,754 --> 01:02:39,307 - [Ramez] Imagine you have a set of dials 1019 01:02:41,309 --> 01:02:43,621 and you can say, "Oh, I want to be more extroverted, 1020 01:02:43,656 --> 01:02:46,659 more empathetic and harder working." 1021 01:02:47,867 --> 01:02:50,352 Imagine you could just do that on-demand 1022 01:02:50,387 --> 01:02:51,388 by pushing a button. 1023 01:02:52,389 --> 01:02:53,770 - [Doctor] Just like that. 1024 01:02:56,842 --> 01:02:59,258 - [Dr. Devenyi] It's very interesting to contemplate 1025 01:02:59,292 --> 01:03:01,018 where will technology take us. 1026 01:03:02,709 --> 01:03:05,160 And as humans will it be able to make us 1027 01:03:05,195 --> 01:03:06,506 better than we currently are? 1028 01:03:27,113 --> 01:03:28,183 - [Woman's Voice] Welcome, class. 1029 01:03:31,048 --> 01:03:34,120 Today's lecture is about neuroethics. 1030 01:03:36,778 --> 01:03:41,369 We are on the cusp of being able to truly unlock 1031 01:03:41,403 --> 01:03:45,960 the secrets of the human brain. 1032 01:03:48,963 --> 01:03:50,205 [Nita] But more than that, 1033 01:03:50,240 --> 01:03:52,311 more than being able to understand 1034 01:03:52,345 --> 01:03:54,900 what's happening in the brain, 1035 01:03:54,934 --> 01:03:59,939 now we're able to make a lot of very precise changes to it. 1036 01:04:06,256 --> 01:04:07,913 How many of you feel like your sense of self 1037 01:04:07,947 --> 01:04:08,914 is in your brain? 1038 01:04:11,951 --> 01:04:14,264 Many of you, right? 1039 01:04:15,644 --> 01:04:18,233 Many of you identify your brain as the place 1040 01:04:18,268 --> 01:04:21,409 that you recognize your sense of self identity. 1041 01:04:22,893 --> 01:04:26,345 The place that you sort of think of you as being located. 
1042 01:04:30,107 --> 01:04:32,109 And if you change that, 1043 01:04:34,077 --> 01:04:36,734 if you start to add machines to it, 1044 01:04:36,769 --> 01:04:39,013 if you start to implant electrodes to it 1045 01:04:41,325 --> 01:04:44,087 at what point does that start to become problematic? 1046 01:04:47,228 --> 01:04:50,300 There is great benefit to these technologies, 1047 01:04:50,334 --> 01:04:51,404 but there are risks. 1048 01:04:53,890 --> 01:04:57,238 For example who has access to those devices? 1049 01:04:59,102 --> 01:05:01,518 Who gets to drive and change those devices? 1050 01:05:04,555 --> 01:05:07,593 Whether it's hackers, corporations, 1051 01:05:07,627 --> 01:05:11,355 governments, other people, 1052 01:05:11,390 --> 01:05:15,394 what protections if any exist within society today? 1053 01:05:18,052 --> 01:05:19,225 It's arriving. 1054 01:05:20,364 --> 01:05:23,402 And unless we start deciding today 1055 01:05:23,436 --> 01:05:26,336 what kinds of protections you want for your brain, 1056 01:05:28,510 --> 01:05:29,615 it may be too late. 1057 01:05:33,308 --> 01:05:35,000 Does that make you uncomfortable? 1058 01:05:39,314 --> 01:05:41,075 Because that's where we're going. 1059 01:05:46,321 --> 01:05:48,668 [crowd applauding] 1060 01:05:48,703 --> 01:05:49,980 - [Zuckerberg] The work that we're doing in 1061 01:05:50,015 --> 01:05:51,154 direct brain interfaces 1062 01:05:51,188 --> 01:05:53,501 that are gonna eventually one day 1063 01:05:53,535 --> 01:05:56,159 let you communicate using only your mind. 1064 01:05:56,193 --> 01:05:58,437 - [Ramez] In just the last few years, there's been 1065 01:05:58,471 --> 01:06:02,475 this sudden rush to fund more aggressive, 1066 01:06:02,510 --> 01:06:05,306 more radical work in interfacing to the brain. 1067 01:06:05,340 --> 01:06:07,377 - [Newscaster] Elon Musk is creating a big stir 1068 01:06:07,411 --> 01:06:08,688 on social media this morning. 1069 01:06:08,723 --> 01:06:10,276 - [Newscaster] Elon Musk has launched Neuralink. 1070 01:06:10,311 --> 01:06:13,176 - They will aim to connect the human brain with computers. 1071 01:06:13,210 --> 01:06:15,178 - [Elon Musk] That will enable anyone who wants 1072 01:06:15,212 --> 01:06:17,974 to have super human cognition. 1073 01:06:18,008 --> 01:06:19,320 - [Joe Rogan] But billions of people 1074 01:06:19,354 --> 01:06:21,736 with enhanced cognitive ability? 1075 01:06:21,770 --> 01:06:23,324 Radically enhanced? - [Elon] Yes, yes. 1076 01:06:23,358 --> 01:06:24,981 - [Joe] Pass that whiskey. 1077 01:06:25,015 --> 01:06:26,327 This is getting ridiculous. 1078 01:06:26,361 --> 01:06:29,192 - [Ramez] What's happening in the brain is unprecedented. 1079 01:06:29,226 --> 01:06:31,608 It's moving faster now than ever before. 1080 01:06:31,642 --> 01:06:34,300 - [Regina] Together we have a goal of typing, 1081 01:06:34,335 --> 01:06:35,543 five times faster 1082 01:06:35,577 --> 01:06:37,303 than you can type on your phone, 1083 01:06:37,338 --> 01:06:39,202 straight from your brain. 1084 01:06:39,236 --> 01:06:42,032 And we're just getting started. 1085 01:06:42,067 --> 01:06:43,689 - [Bobby] If you understand the brain, 1086 01:06:43,723 --> 01:06:46,795 you get a tremendous amount of power to leverage. 1087 01:06:46,830 --> 01:06:49,350 - [Anderson Cooper] Some programmers call it brain hacking 1088 01:06:49,384 --> 01:06:51,697 and the tech world would probably prefer 1089 01:06:51,731 --> 01:06:53,112 you didn't hear about it. 
1090 01:06:54,286 --> 01:06:57,047 - [Tristan] We can't underestimate the incredible 1091 01:06:57,082 --> 01:07:01,120 unchecked power that Google, Apple, Facebook have 1092 01:07:01,155 --> 01:07:04,365 over what two billion people are thinking every day. 1093 01:07:04,399 --> 01:07:08,196 That's more than the number of followers of Christianity. 1094 01:07:08,231 --> 01:07:09,853 These are like private superpowers 1095 01:07:09,887 --> 01:07:12,614 whose goals are not the same as our goals. 1096 01:07:12,649 --> 01:07:13,857 - [Chris Hardwick] Last week Facebook revealed 1097 01:07:13,891 --> 01:07:15,100 that they are working 1098 01:07:15,134 --> 01:07:16,860 on technology to read people's minds. 1099 01:07:16,894 --> 01:07:18,275 Ah, yeah, we're in [beep]ing hell you guys, 1100 01:07:18,310 --> 01:07:19,345 I don't know what to say. 1101 01:07:19,380 --> 01:07:20,795 [laughter] 1102 01:07:20,829 --> 01:07:23,073 - [Nita] People are far more likely to give up information 1103 01:07:23,108 --> 01:07:25,938 about their brain to what they think are, 1104 01:07:25,972 --> 01:07:28,768 these faceless corporations 1105 01:07:28,803 --> 01:07:31,254 but they're really people who are driving that. 1106 01:07:32,634 --> 01:07:35,120 - [Bobby] One of the best algorithms that we have 1107 01:07:35,154 --> 01:07:38,192 for mapping the brain comes from Google. 1108 01:07:38,226 --> 01:07:40,090 - [Congressman] Google is able to collect 1109 01:07:40,125 --> 01:07:42,058 an amount of information about its users 1110 01:07:42,092 --> 01:07:45,475 that would even make the NSA blush. 1111 01:07:45,509 --> 01:07:47,546 - [Bobby] Do I want my brain data 1112 01:07:47,580 --> 01:07:49,203 ultimately owned by Google? 1113 01:07:49,237 --> 01:07:51,998 I'm not sure I'm comfortable with that just yet. 1114 01:07:52,033 --> 01:07:53,172 - [Eric Schmidt] The Google policy 1115 01:07:53,207 --> 01:07:54,518 about a lot of these things 1116 01:07:54,553 --> 01:07:56,486 is to get right up to the creepy line but not cross it. 1117 01:07:56,520 --> 01:07:58,522 Implanting things in your brain 1118 01:07:58,557 --> 01:08:00,869 is beyond the creepy line 1119 01:08:00,904 --> 01:08:02,250 at least for the moment. 1120 01:08:02,285 --> 01:08:04,494 [laughing] 1121 01:08:04,528 --> 01:08:06,910 - [Tristan] Whether we're just using phones 1122 01:08:06,944 --> 01:08:09,119 or we have brain implants, 1123 01:08:09,154 --> 01:08:12,847 history is littered with good intentions. 1124 01:08:12,881 --> 01:08:15,574 Who's to say where we should go? 1125 01:08:15,608 --> 01:08:17,748 Who's to say what's best for people? 1126 01:08:17,783 --> 01:08:19,992 - [Anchor] Is it going to be chancing corporations, 1127 01:08:20,026 --> 01:08:21,683 is it going to be panicking governments, 1128 01:08:21,718 --> 01:08:23,858 an ill informed public, over this thing? 1129 01:08:23,892 --> 01:08:26,136 Somebody's got to keep close watch on this. 1130 01:08:26,171 --> 01:08:27,379 Who's going to do that? 1131 01:08:45,500 --> 01:08:48,365 - [Woman] The risk is we don't know what the future holds. 1132 01:08:54,199 --> 01:08:56,201 Should we even attempt to introduce technology 1133 01:08:56,235 --> 01:08:58,893 like this on a mass large scale? 1134 01:09:00,481 --> 01:09:01,240 - [Bryan] That's right. 1135 01:09:04,209 --> 01:09:06,452 These are tricky questions 1136 01:09:06,487 --> 01:09:08,592 that have all kind of implications. 
1137 01:09:10,146 --> 01:09:14,184 The technology is transitioning from the medical domain 1138 01:09:14,219 --> 01:09:15,323 into the mainstream. 1139 01:09:16,497 --> 01:09:19,845 And as it does, the opportunities and challenges 1140 01:09:19,879 --> 01:09:23,193 that this creates for us, we can't even begin 1141 01:09:23,228 --> 01:09:24,505 to get our heads around. 1142 01:09:27,991 --> 01:09:30,718 I'm very concerned about who makes 1143 01:09:30,752 --> 01:09:32,237 breakthroughs in neuroscience. 1144 01:09:33,514 --> 01:09:35,171 In the behaviors of people who've built 1145 01:09:35,205 --> 01:09:37,034 the technology companies over the past decade, 1146 01:09:37,069 --> 01:09:39,175 their decision making processes matter a lot 1147 01:09:39,209 --> 01:09:41,522 and they influence society dramatically. 1148 01:09:41,556 --> 01:09:43,972 - [Man] I think everybody that started social media 1149 01:09:44,007 --> 01:09:45,457 had the best intention 1150 01:09:45,491 --> 01:09:47,873 and I don't think anybody foresaw 1151 01:09:47,907 --> 01:09:51,152 the levels of depression and anxiety and addiction 1152 01:09:51,187 --> 01:09:52,257 that it's caused. 1153 01:09:53,258 --> 01:09:55,191 - [Man 2] Most of us use social networks. 1154 01:09:55,225 --> 01:09:57,089 Our opinions are being manipulated in 1155 01:09:57,123 --> 01:09:59,091 ways that we don't even understand. 1156 01:09:59,125 --> 01:10:01,231 - [Man] How do we try to mitigate for all 1157 01:10:01,266 --> 01:10:04,890 of the potential things that could go wrong on a mass scale 1158 01:10:04,924 --> 01:10:06,340 that we can't even predict? 1159 01:10:11,931 --> 01:10:16,246 - [Bryan] As a society it's important we be ready for this. 1160 01:10:17,868 --> 01:10:21,217 So I'm trying to figure out how to talk about these topics 1161 01:10:21,251 --> 01:10:24,220 in a way that doesn't threaten people, 1162 01:10:25,221 --> 01:10:27,464 that enables them to explore them with me. 1163 01:10:30,364 --> 01:10:33,608 We are about to enter into the most consequential 1164 01:10:33,643 --> 01:10:37,371 revolution in the history of the human race. 1165 01:10:37,405 --> 01:10:40,339 Where we can take control of our cognitive evolution. 1166 01:10:41,306 --> 01:10:43,239 It's a very big challenge 1167 01:10:43,273 --> 01:10:46,242 because people are considering what they're going to lose. 1168 01:10:46,276 --> 01:10:47,726 Who's going to do bad things? 1169 01:10:48,899 --> 01:10:50,694 But I think one of the greatest limiters 1170 01:10:50,729 --> 01:10:53,663 to progress is fear. 1171 01:10:55,285 --> 01:10:58,530 Imagine if I had a tool to interface with my brain, 1172 01:10:58,564 --> 01:11:03,051 where I could walk a mile in someone else's shoes. 1173 01:11:03,086 --> 01:11:06,814 What if I could feel what it was like to be you? 1174 01:11:06,848 --> 01:11:09,126 What if I could understand your contextual framework? 1175 01:11:09,161 --> 01:11:11,991 What if I understood your memories and your emotions? 1176 01:11:12,026 --> 01:11:13,890 Would that change the way we deal with each other? 1177 01:11:13,924 --> 01:11:15,823 The way we cooperate, the way we make decisions? 1178 01:11:15,857 --> 01:11:17,756 Would that change our creative ability? 1179 01:11:19,482 --> 01:11:22,485 It's gonna be on our front doorsteps in 15-20 years 1180 01:11:23,693 --> 01:11:25,626 on a scale we've never seen before. 1181 01:11:25,660 --> 01:11:27,800 So how might we be thoughtful 1182 01:11:27,835 --> 01:11:29,250 in building these technologies? 
1183 01:11:31,010 --> 01:11:33,910 The impact is going to be substantial. 1184 01:11:35,049 --> 01:11:38,017 And to acknowledge that that is the case, 1185 01:11:38,052 --> 01:11:40,157 and reconcile what that means 1186 01:11:40,192 --> 01:11:43,816 will allow us to have productive conversations 1187 01:11:43,851 --> 01:11:48,649 of how we might explore this, develop this, deal with this. 1188 01:11:48,683 --> 01:11:49,477 Thank you. 1189 01:11:53,032 --> 01:11:55,483 The reality is it's up to us. 1190 01:11:58,072 --> 01:12:00,177 The key to our continued thriving, 1191 01:12:00,212 --> 01:12:01,489 the key to everything we want 1192 01:12:02,387 --> 01:12:03,526 is to open up 1193 01:12:05,321 --> 01:12:07,944 and consider this new set of possibilities 1194 01:12:09,325 --> 01:12:11,257 that we've never considered before. 1195 01:12:25,789 --> 01:12:28,551 [birds chirping] 1196 01:12:31,588 --> 01:12:33,245 - [Anne] Before the operation, 1197 01:12:36,421 --> 01:12:39,355 I went in hoping everything would be fine. 1198 01:12:43,359 --> 01:12:45,188 That I would be a new person. 1199 01:12:50,745 --> 01:12:54,370 But I was forced to accept the unknown 1200 01:12:57,269 --> 01:13:00,755 and open myself up to something 1201 01:13:02,861 --> 01:13:05,346 a little bit different. 1202 01:13:17,082 --> 01:13:19,325 I can't really explain it. 1203 01:13:23,675 --> 01:13:25,884 But when I try to see what it would be like 1204 01:13:27,368 --> 01:13:29,681 if I didn't have the device on. 1205 01:13:31,717 --> 01:13:34,410 [somber music] 1206 01:13:43,039 --> 01:13:46,629 I wait a few seconds and all the symptoms 1207 01:13:46,663 --> 01:13:49,735 of Parkinson's are back. 1208 01:13:51,772 --> 01:13:56,397 My toes are all curling under and I'm starting to sway. 1209 01:14:01,367 --> 01:14:02,714 I get stiff. 1210 01:14:06,372 --> 01:14:07,615 And I can't walk. 1211 01:14:15,416 --> 01:14:18,384 My tongue gets even thicker. 1212 01:14:20,317 --> 01:14:23,217 And it feels like I can't speak at all. 1213 01:14:30,707 --> 01:14:34,331 And then when the device is turned on again, 1214 01:14:38,404 --> 01:14:40,406 my movements become more fluid. 1215 01:14:43,686 --> 01:14:48,691 It's as if this machine is motivating me 1216 01:14:51,383 --> 01:14:52,660 to get it right. 1217 01:14:55,836 --> 01:14:58,217 [light music] 1218 01:14:59,356 --> 01:15:03,222 It's been almost a year since the surgery. 1219 01:15:06,363 --> 01:15:08,193 I used to come down here and stare 1220 01:15:08,227 --> 01:15:11,403 at my boxes of paper and wonder 1221 01:15:11,437 --> 01:15:15,027 how in the world I was gonna make something. 1222 01:15:19,963 --> 01:15:24,968 But now it's as if my motivation and creative ability 1223 01:15:25,624 --> 01:15:27,350 has just clicked in. 1224 01:15:29,386 --> 01:15:33,149 There's a connection that my brain is making to my hands 1225 01:15:33,183 --> 01:15:34,806 that it never made before, 1226 01:15:36,393 --> 01:15:40,018 and anything I want to do, I can do. 1227 01:15:41,571 --> 01:15:42,365 Hi, Tootsie! 1228 01:15:43,849 --> 01:15:46,507 We had our first grandchild about three months ago. 1229 01:15:46,542 --> 01:15:48,854 - [Man] Let's go Mr. Jellybean. 1230 01:15:48,889 --> 01:15:50,304 - [Woman] The neurostimulator's stopping you 1231 01:15:50,338 --> 01:15:53,169 from shaking your legs so now you can't jostle him. 1232 01:15:53,203 --> 01:15:54,826 [laughing] 1233 01:15:54,860 --> 01:15:58,208 - [Stan] Her anxiety, which was horrible, has disappeared. 
1234 01:15:59,347 --> 01:16:02,558 Now we can go to the city, take in a concert, 1235 01:16:02,592 --> 01:16:03,697 take in a play. 1236 01:16:07,355 --> 01:16:11,014 - [Anne] It all comes together the same way it used to. 1237 01:16:11,049 --> 01:16:13,534 - [Stan] Anne and I started taking dance lessons together. 1238 01:16:13,569 --> 01:16:14,328 - [Woman] Oh, you did? 1239 01:16:14,362 --> 01:16:15,467 - [Stan and Anne] Yep. 1240 01:16:15,501 --> 01:16:16,951 - [Stan] The procedure didn't stop 1241 01:16:16,986 --> 01:16:19,298 the disease from progressing. 1242 01:16:19,333 --> 01:16:20,955 But we bought a lot of time. 1243 01:16:24,649 --> 01:16:26,202 - [Anne] I get teary because 1244 01:16:28,411 --> 01:16:29,757 you can see just 1245 01:16:32,346 --> 01:16:34,417 that the things you did were worth doing. 1246 01:16:41,907 --> 01:16:44,461 It's an amazing feeling 1247 01:16:48,189 --> 01:16:49,363 that I can trust, 1248 01:16:51,917 --> 01:16:53,574 I don't know how to even say this. 1249 01:16:55,265 --> 01:17:00,270 but I can trust my brain to work better now. 1250 01:17:03,411 --> 01:17:06,000 [pensive music] 1251 01:17:15,423 --> 01:17:17,149 - [Stephen] When I lost my vision, 1252 01:17:17,184 --> 01:17:20,118 the hardest part was trying to figure out 1253 01:17:20,152 --> 01:17:22,223 how to ask for help. 1254 01:17:29,886 --> 01:17:32,855 Fortunately, I had my sister. 1255 01:17:32,889 --> 01:17:35,478 - [Denise] Right by Roy Thomson Hall again. 1256 01:17:35,512 --> 01:17:36,652 - [Stephen] Okay. 1257 01:17:36,686 --> 01:17:38,170 - [Denise] We're walking into the sunset. 1258 01:17:38,205 --> 01:17:39,275 - [Stephen] Why don't we go in there 1259 01:17:39,309 --> 01:17:40,517 and get tickets to do something? 1260 01:17:40,552 --> 01:17:42,416 At least it'll be warm. 1261 01:17:46,213 --> 01:17:48,974 - I'm hoping that over time I'll see 1262 01:17:49,009 --> 01:17:50,320 outline of objects. 1263 01:17:51,667 --> 01:17:53,565 Tables and doors. 1264 01:17:55,809 --> 01:17:58,363 And when I'm crossing the street, 1265 01:17:58,397 --> 01:17:59,985 I can distinguish, 1266 01:18:00,020 --> 01:18:02,850 oh ok this is a car and it's coming right for me, 1267 01:18:02,885 --> 01:18:04,334 [chuckling] get out of the way. 1268 01:18:07,372 --> 01:18:10,720 I had to learn to retrain my mind 1269 01:18:12,066 --> 01:18:14,241 and have a new perspective of my future. 1270 01:18:17,106 --> 01:18:20,005 - [Denise] I think his independence will come back. 1271 01:18:21,248 --> 01:18:24,147 He's definitely got the motivation and the drive. 1272 01:18:27,392 --> 01:18:29,152 I'm very excited for him. 1273 01:18:38,092 --> 01:18:40,716 [pensive music] 1274 01:18:44,892 --> 01:18:46,411 - [Bill] Before my accident, 1275 01:18:48,068 --> 01:18:50,415 I didn't really think about medical research. 1276 01:18:51,209 --> 01:18:52,210 - [Bill's Friend] You the star. 1277 01:18:53,383 --> 01:18:55,800 - [Bill] I was just a pee-on too. 1278 01:18:55,834 --> 01:18:57,525 - [Bill's Friend] You was a pee-on? 1279 01:18:57,560 --> 01:19:01,012 [Bill] Yeah, I was nobody. 1280 01:19:01,046 --> 01:19:02,220 [Bill's Friend] You were nobody. 1281 01:19:02,254 --> 01:19:03,704 [Bill] But now I'm somebody. 1282 01:19:03,739 --> 01:19:05,775 [Bill's Friend] But now you're somebody. 1283 01:19:05,810 --> 01:19:09,020 [Bill] But somebody has to do research 1284 01:19:10,746 --> 01:19:12,644 or no breakthroughs are gonna happen. 
1285 01:19:25,243 --> 01:19:28,315 And even if I don't get any benefit out of this, 1286 01:19:29,661 --> 01:19:32,698 eventually somebody's gonna benefit from what I'm doing 1287 01:19:34,493 --> 01:19:36,185 and that makes me feel special. 1288 01:19:37,738 --> 01:19:38,981 - [Man] Right, Bill, you ready? 1289 01:19:39,015 --> 01:19:40,292 This time it'll be you with no assistance. 1290 01:19:40,327 --> 01:19:42,398 So it'll just be all you here. 1291 01:19:42,432 --> 01:19:43,364 - [Bill] I'm ready. 1292 01:20:02,142 --> 01:20:04,523 - [Man] All right, here we go. 1293 01:20:14,223 --> 01:20:16,915 [beeping] - [Computer] Open. 1294 01:20:20,298 --> 01:20:23,025 [beeping] Open. 1295 01:20:35,244 --> 01:20:38,005 [beeping] Open. 1296 01:20:39,351 --> 01:20:41,250 - [Man] Try it again. 1297 01:20:42,976 --> 01:20:44,322 See if you can open your hand. 1298 01:20:49,741 --> 01:20:50,535 - [Bolu] Bill, focus. 1299 01:20:59,820 --> 01:21:01,304 - [Computer] Out. 1300 01:21:02,409 --> 01:21:03,755 - [Man] Just checking triceps. 1301 01:21:05,412 --> 01:21:06,551 - [Computer] Up. 1302 01:21:10,555 --> 01:21:11,280 [beeping] Close. 1303 01:21:12,798 --> 01:21:13,696 - [Man] Close your hand. 1304 01:21:18,321 --> 01:21:19,357 Reach in. 1305 01:21:25,259 --> 01:21:28,055 [uplifting music] 1306 01:21:32,853 --> 01:21:35,614 [beeping] - [Computer] Down. 1307 01:21:38,514 --> 01:21:40,723 [beeping] 1308 01:21:43,243 --> 01:21:45,901 [beeping] Out. 1309 01:21:47,316 --> 01:21:48,558 - [Man] Biceps. 1310 01:21:48,593 --> 01:21:49,835 Just checking. 1311 01:21:50,767 --> 01:21:52,977 [beeping] 1312 01:21:59,052 --> 01:22:01,709 [beeping] - [Computer] In. 1313 01:22:01,744 --> 01:22:03,297 - [Bolu] It's amazing. 1314 01:22:03,332 --> 01:22:05,161 [beeping] - [Computer] Out. 1315 01:22:05,196 --> 01:22:07,577 - [Man] Ready for some mashed potatoes? 1316 01:22:07,612 --> 01:22:08,371 - [Bill] Yes I am. 1317 01:22:10,960 --> 01:22:13,998 Like most people, you know, 1318 01:22:14,032 --> 01:22:17,346 I thought quadriplegics all were, you know 1319 01:22:17,380 --> 01:22:20,590 stuck in bed, they didn't do anything, didn't move around. 1320 01:22:20,625 --> 01:22:21,591 - [Man] Bring your arm down. 1321 01:22:23,283 --> 01:22:26,458 - [Bolu] He seems to have a very strong elbow movements. 1322 01:22:28,564 --> 01:22:29,875 - [Bill] But now.... 1323 01:22:29,910 --> 01:22:31,325 - [Man] Raise up again. 1324 01:22:31,360 --> 01:22:32,948 - [Bill] I'm learning 1325 01:22:34,570 --> 01:22:36,020 that's not really true. 1326 01:22:45,753 --> 01:22:49,826 Mashed Potatoes. [chuckling] 1327 01:22:49,861 --> 01:22:50,620 I love it. 1328 01:22:51,794 --> 01:22:53,451 To be able to eat something with a fork. 1329 01:22:54,935 --> 01:22:56,316 It's wonderful. 1330 01:22:56,350 --> 01:22:57,386 - [Bolu] Good, good good. 1331 01:23:02,046 --> 01:23:03,288 This is groundbreaking work. 1332 01:23:04,841 --> 01:23:07,154 Bill Kochevar, is the first person to our knowledge 1333 01:23:07,189 --> 01:23:09,881 to use an implanted brain computer interface 1334 01:23:09,915 --> 01:23:13,091 to control both reaching and grasping 1335 01:23:13,126 --> 01:23:13,989 just by thinking. 1336 01:23:17,337 --> 01:23:19,132 - [Man] Mashed potatoes now, steak next? 1337 01:23:19,166 --> 01:23:21,651 [laughing] 1338 01:23:21,686 --> 01:23:23,412 - [Bolu] He is a pioneer. 1339 01:23:23,446 --> 01:23:25,655 [laughing] 1340 01:23:25,690 --> 01:23:27,002 - [Man] You ready? 1341 01:23:27,036 --> 01:23:28,417 - [Bill] Yes I am! 
1342 01:23:31,454 --> 01:23:34,112 - [Bolu] Bill is always asking us what is the next step. 1343 01:23:35,355 --> 01:23:38,806 My hope is that it will eventually become available 1344 01:23:38,841 --> 01:23:40,463 outside of a laboratory setting, 1345 01:23:41,292 --> 01:23:44,157 invisible to the outside user, 1346 01:23:44,191 --> 01:23:46,297 and give persons with spinal cord injury 1347 01:23:46,331 --> 01:23:48,816 the ability to perform tasks on their own, 1348 01:23:48,851 --> 01:23:50,197 to increase their independence, 1349 01:23:50,232 --> 01:23:51,888 to increase their quality of life. 1350 01:23:53,545 --> 01:23:57,446 I believe that we can harness the power of the brain 1351 01:23:57,480 --> 01:23:59,379 and harness the power of computers 1352 01:24:01,174 --> 01:24:04,694 to overcome some of the most critical limitations we face. 1353 01:24:14,497 --> 01:24:16,361 - [Bryan] We are walking down this path 1354 01:24:18,294 --> 01:24:19,709 of merging with our technology. 1355 01:24:21,573 --> 01:24:23,886 The insights we make with the brain 1356 01:24:25,577 --> 01:24:28,511 will invite all of us to ask questions of ourselves 1357 01:24:29,995 --> 01:24:32,377 that challenge our deepest held beliefs 1358 01:24:34,310 --> 01:24:37,348 on identity and purpose. 1359 01:24:41,076 --> 01:24:42,663 So what does it mean to be human? 1360 01:24:46,357 --> 01:24:49,325 To me, it means we can become anything we want. 1361 01:24:59,197 --> 01:25:01,130 - [Bill] Sleep. 1362 01:25:01,165 --> 01:25:02,649 - [Computer] Sleep. 1363 01:25:16,525 --> 01:25:19,666 [deep pensive music] 1364 01:27:18,094 --> 01:27:20,580 [light music]
