English subtitles for Year.Million.S01E04.Mind.Meld.1080p.AMZN.WEB-DL.DD+5.1.H.264-Cinefeel_track3_[eng]

NARRATOR: Imagine deep in the future, you and your loved ones have carved out time to take in a concert. But this isn't just your average jam session. Let me take you behind the scenes. This is what's actually happening. You and all your fellow concert-goers, your brains are hardwired together. Still not getting it? For the price of admission, when all of your brains are connected, you can be the performer, the audience, the orchestra, the vibrations in the air itself.

You hear a melody, it sparks an emotion, a memory. Down the rabbit hole you go. Imagine literally teleporting yourself to that moment in time and actually living the experience. Welcome to your future, in the hive mind.

It's the deep future. Your body, gone. You're all computer, all the time. Your brain is way more powerful than even a billion supercomputers. Jobs, food, language, water, even traditional thought, all of humanity's building blocks, all that's done. And you are immortal.
NARRATOR: Squirming in your chair yet? You should be. This isn't science fiction. Today's visionary thinkers say it's a strong probability that this is what your world is going to look like. Tonight, they'll guide you toward that spectacular future, and we'll see how one family navigates it, one invention at a time. This is the story of your future. This is the road to Year Million.

MICHAEL GRAZIANO: Human beings are the cooperative species. I mean, that's what made us so successful. There's no other species on Earth that can come together in such groups, instantly intuit what everyone else is thinking, and cooperate on large-scale projects.

NARRATOR: That's right, folks, communication is our superpower, whether it's through the spoken or the written word. Ever since that first Homo sapiens turned a primal grunt into an actual word, humans have used word-based language to make us the alpha species on this planet.

CHUCK NICE: Communication is going exactly where it's been going since the first primates started talking. Men will say, 'I don't want to talk about it,' and women will say, 'Why not?'
NICE: Okay, okay, I had to do it. I'm sorry. I just had to do it.

NARRATOR: We forgive you, Chuck, but he does have a point. While word-based language may be humanity's superpower, when communication breaks down, it is also our kryptonite. Let's go back to scripture.

CHARLES SOULE: The Tower of Babel is one of those proto-myths in human society suggesting that there was a time when humans all spoke the same language. They decided to get together and do the most incredible thing that they could, which was to build a tower so high that it would reach all the way to heaven. And so they start building this thing and it gets really tall. But then God looks over the edge of heaven and says, uh, wait a minute, this is not what I want to have happen. And so he does something clever, because he is, he is God. And he makes it so that none of those people who are building the tower together speak the same language anymore. And the minute they stop being able to speak the same language, they can't work together anymore. And so the Tower of Babel falls and is never able to be built again.
SOULE: I think it's such a brilliant explanation of the way that human beings think. If you could just talk to somebody and make sure that you were understood in a clear way, I think we'd be able to work together in a really beautiful way. I think that'd be incredible.

NARRATOR: You know where this is headed, right?

PETER DIAMANDIS: We're about to see this explosion in the way we communicate, and it's these next 20 or 30 years that we are really plugging the brain into the Internet.

NARRATOR: We're headed to a future of pure, seamless, unadulterated communication that will enable levels of cooperation and intelligence that will make the Tower of Babel look like a Lego set. And the communication revolution has already begun. Billions are spent every year on communication apps. Twitter, emojis, Google Translate are all breaking down language barriers. But these are just the first baby steps in the evolution of communication. Flash forward a few thousand years and traditional word-based language will be ancient history.
NARRATOR: We'll be communicating effortlessly and at the speed of light, and that will seismically transform the very nature of our existence. This is how we'll do it.

First stage: telepathy. Using tiny nanochips implanted in our brains and connected to an ultra-high-speed Internet, we will finally realize the dream of actual brain-to-brain communication. Opening our brains to one another will be a transformational moment in human communication, as well as the end of privacy as we know it. But when we get there, we will be ready for the next step: swarm intelligence.

Combining our high-speed connectivity with our brain-to-brain communication, we'll pool our diverse outlooks to exponentially boost our intelligence and work together in swarms to solve problems in groups that we never could alone. We'll need it when we come face to face with alien intelligence. Figuring out how to communicate will require all of our ingenuity and will only be possible because of our communication revolution.
NARRATOR: And when we've mastered swarm intelligence and become super-intelligent beings, the final step in human communication will be when we merge our minds into a single consciousness. Like that super-trippy concert we just witnessed. We'll evolve beyond individuality and shed our very sense of self. And when we've united humanity into an enormous superintelligence, eliminating the barriers between us, then we can finally build on our limitless imagination, and everything will be possible.

So, what will our Tower of Babel look like in the future? Stay with us and find out. But first let's roll back the clock to witness the first stage, when we humans are just beginning to embrace telepathy.

[muffled chatter, laughter]

NARRATOR: This is what your future dinner party might look and sound like.

[muffled speech]

NARRATOR: Oops, I forgot. Since our brains really do think faster than we speak, let me slow it down so you can hear. Our brains work faster than our mouths.

EVA: Mm, it's good, it's really good.

MAN: Jess, your mom's a lightweight.
NARRATOR: How bizarre will it be when our mouths are used solely for eating and breathing? In the future, a dinner party will take place in total silence, because words as we know them will disappear. Tiny chips in our brains will enable us to communicate telepathically via the Internet.

WOMAN: So, Johnny, did you hear about the tsunami in Morocco?

WOMAN: The family that I saw today were really interesting.

WOMAN: The project on Europa, it's amazing.

NARRATOR: Some people, like Oscar here, will resist the tech implants, because they also come with a risk of hacking, but we'll get into that later.

OSCAR: So, how was your day?

MAN: You still haven't upgraded to telepathy?

NARRATOR: So how do we get to that telepathic dinner party in the future from where we are today? The key will be to find a common language that can connect all of humanity. I wonder where we'd find something like that.

NICE: I will say this, and I say it without compunction and with a great deal of confidence: the Internet is our universal language. It's already there.
NICE: And you would think that we would now have this incredible exchange of ideas and exciting means of transferring information, but instead what do we do? We send emojis that talk to us in little faces. Why? 'Cause they're cute and you can understand them.

NARRATOR: Call them cute or really irritating, the truth is there's no denying the emoji is part of a new grammar connecting people all around the world in a way never before seen in history. And it's just the beginning.

RAY KURZWEIL: We're going to connect our neocortex to a synthetic neocortex in the cloud. And I'm thinking that it will be a hybrid of our biological brains with the non-biological extension in the cloud.

NARRATOR: When our brains are directly linked to the cloud, then the dream of telepathic communication will finally be realized.

THOMAS WEBSTER: It's actually not so far-fetched, I think, if you think about it. So if you're able to create nano-implants or nano-sensors to put in the brain of one person, and then put it in the brain of another person.
WEBSTER: That's a way to communicate in a way that we have never really thought about in society so far.

ANNALEE NEWITZ: There are experiments now where we have brain-computer interfaces, which really does suggest that something like telepathy could exist.

BARATUNDE THURSTON: Oh, that's so dangerous. Oh, man, we're going to have so many broken relationships. And look at what people tweet. That takes effort. You have to launch an app, you know, open up the compose window, tap out a message, and press send. And people still say things they regret, and lose their jobs over it and get divorced over it. Thinking to communication? That's, that's real sloppy. That's going to be a hot mess.

NICE: You know, it's funny, because here's how telepathy is awesome: when you're the only person who has it. [laughs] When that's like your superpower and you're going around reading everybody's mind, but nobody can read yours. That's when telepathy is great.

NARRATOR: That's probably not going to be how it works. Telepathy will be accessible to everyone.
NARRATOR: But telepathy will definitely require some adjustments, and there will be growing pains. It won't be all sitting around the campfire singing Kumbaya.

N.K. JEMISIN: Lord help us, what if something like a Twitter mob existed for the mind? Um, no, I can't think of anything more horrific. But if we could control it, if it was really just another way of connecting, like AOL, but you know, instead of 'You've got mail,' it's, you know, 'You've got thoughts.'

THURSTON: On the other hand, it could lead to a more subtle set of interactions, because you would feel the weight not just of your words, but of your thoughts.

NARRATOR: That will be amazing. Think of it: instantaneous, immersive, empathic communication. Humanity will never be the same again. But it's not going to be easy. Wireless devices are manufactured to conform to an agreed-upon standard, like Bluetooth or Wi-Fi. But telepathy involves human beings, and we can't agree on anything.

ANDERS SANDBERG: The fundamental problem of brain-to-brain communication is that every brain is unique.
SANDBERG: When I think about the concept of a mountain, I might envision something like the Matterhorn, because I saw it as a child. And my neurons firing when I think 'mountain' might be very different from your neurons that fire. So I need to find a mapping, so that when my mountain neurons fire in a particular pattern, we can activate the right ones. That's a very tough machine learning problem.

NARRATOR: And it's not just going to be a tough problem for machines to learn. It's also going to be a big adjustment for humans as well. Not everyone is going to be an early adopter. There will be those who don't want their innermost thoughts accessible to others. Or those worried about hacking, like Oscar, the patriarch of our future family, who hasn't yet bought into this newfangled technology.

JESS: Hey, did Sajani tell you their news?

EVA: No, what's up?

NARRATOR: Oscar is old school, but how long can he hold out when everyone around him is communicating telepathically?

JESS: Damon wants to marry another woman.

EVA: What? Does he want a divorce?

JESS: No, no, not at all.
OSCAR: What?

JESS: Apparently, they met someone they both like and...

OSCAR: What is it?

EVA: Nothing.

JESS: Nothing.

OSCAR: Okay, that's it.

JESS: Dad.

NARRATOR: At some point he might have to give in, just to join the conversation.

OSCAR: Telepathy upgrade, dosage Oscar.

OSCAR: So, what's for dessert?

NARRATOR: Telepathy is on its way. When we connect our minds together, it will be a singular moment in human history, sending human intelligence into overdrive; a first step on our path back to that Tower of Babel. But it won't come without a sacrifice. Opening our innermost thoughts to each other will begin to blur the differences between us, and will be the beginning of the end of one of our most cherished possessions: our privacy.

NARRATOR: In the future we're headed to a world where Internet-enabled telepathy will connect us all brain-to-brain. This is going to catapult humanity forward on our journey to that future Tower of Babel. Just imagine the implications.
GEORGE DVORSKY: You know, just with the flick of a thought, you could start to engage in a conversation with someone miles away, and not even have to talk. That's going to change a lot of things in terms of the level of engagement that we have with others, and even the sense of intimacy that we will have with others.

DAVID BYRNE: Ooh, that's a risky place to go, I think. Sometimes it might benefit both parties if you withhold a little bit of information and give it some time.

BRIAN BENDIS: Even as we're sitting here talking, you're thinking other things about me, right? You're lost in the little bald spot. Would you want to share that?

NARRATOR: Bald spot? What bald spot? I totally didn't even notice. Okay, yes, it's true, I was thinking about the bald spot. He has a point. There are definitely thoughts I would prefer not to share with the world. But in this new, highly connected world where we're communicating mind-to-mind, we may not have a choice.

DIAMANDIS: In 2010, we had 1.8 billion people connected.
DIAMANDIS: By 2020 or 2025, that number is expected to grow to all 8 billion humans on the planet. So imagine a time in the near future where every single person on the planet is connected to the Internet. And these 8 billion people with a megabit connection now have access at a brand-new level.

MICHIO KAKU: The Internet will evolve into brain-net. That is, we'll send emotions, memories, feelings. And that's going to change everything.

NARRATOR: So if the signal is coming from within our brain, how do we police what we unconsciously send? Our deepest thoughts and feelings could be hacked and littered across the Internet. Who will have access to these extremely personal and private parts of ourselves? How does that change how we communicate?

MORGAN MARQUIS-BOIRE: Privacy is at the heart of many treasured parts of the human experience, like love and like family. It is sort of a basic right of humans. The worry, of course, about the sort of erosion of privacy is that it changes, you know, sort of the nature of who we are as humans, and the nature of our relationship with others in ways that are not positive.
MARQUIS-BOIRE: We become afraid to think certain thoughts, because we know that we're constantly being watched.

NARRATOR: You're worried about the government monitoring your search history? Well, in the future it won't just be your search history, it will be your entire history. And that raises a serious question.

ROSE EVELETH: Are we going to have privacy in the future? No. It's just going to be a different conversation.

THURSTON: We've been sold convenience and efficiency as a tradeoff for letting essentially surveillance, at scale, into our lives. So we get free information online by giving up our information online. And if every choice you make is mediated by an algorithm, from what you eat, to who you love, to where you go to lunch, then that's kind of a destruction of self and a destruction of independence and free will that gets very philosophical at that point.

EVELETH: In the sort of most dystopian version of this, you're being surveilled all the time.

TREVOR PAGLEN: When you put that picture on Facebook, it becomes part of a body of data attached to that specific person and to the people around them.
NARRATOR: And that's just what's going on today.

ADAM HARVEY: You begin to feel very watched when you know how powerful computer vision is, in terms of extracting knowledge and building a narrative.

NARRATOR: In the future, when we communicate telepathically, technology might stop tracking key words or faces, and start tracking your thoughts and your feelings.

THURSTON: That's pretty terrifying.

PAGLEN: It's a question of freedom, and it's a question of rights.

NARRATOR: We risk becoming a kind of surveillance state in the future. Some say we already are. And as we spend more and more of our time online, watching and learning on the Internet, the Internet is also watching and learning us. It's a two-way street. And that could be a frightening proposition.

PAGLEN: For a number of years in the studio, we've been developing tools to work with machine vision.

NARRATOR: Trevor Paglen is an artist in San Francisco. He created this performance piece with the famed Kronos Quartet. It looks like a normal concert, right? Except it's anything but.
417 00:19:43,883 --> 00:19:45,818 These musicians are being watched, 418 00:19:45,852 --> 00:19:47,420 and not just by the audience. 419 00:19:47,454 --> 00:19:50,089 Cameras project the performers on the screen, 420 00:19:50,122 --> 00:19:55,362 and using algorithms, interpret what they're seeing. 421 00:19:55,395 --> 00:19:58,097 PAGLEN: You start to see a very sharp contrast 422 00:19:58,130 --> 00:19:59,932 between how you, as a human audience, 423 00:19:59,966 --> 00:20:01,401 are perceiving the performance, 424 00:20:01,434 --> 00:20:03,736 and how these machinic forms of seeing 425 00:20:03,770 --> 00:20:05,405 are perceiving the performance. 426 00:20:05,438 --> 00:20:08,541 NARRATOR: As you can tell, it's not always completely accurate. 427 00:20:08,575 --> 00:20:11,210 But using facial recognition software, 428 00:20:11,244 --> 00:20:13,079 Trevor is using this performance 429 00:20:13,112 --> 00:20:16,249 to demonstrate how computers watch and record us. 430 00:20:16,283 --> 00:20:18,718 And just how easy it would be for computers 431 00:20:18,751 --> 00:20:22,422 to build a digital profile of us based on algorithms 432 00:20:22,455 --> 00:20:26,593 that may or may not have our best interests in mind. 433 00:20:26,626 --> 00:20:32,231 PAGLEN: My concern is that this very intimate quantification 434 00:20:32,265 --> 00:20:33,400 of everyday life 435 00:20:33,433 --> 00:20:38,070 adds up to an extremely conformist society. 436 00:20:38,104 --> 00:20:39,739 It's very easy to imagine a future 437 00:20:39,772 --> 00:20:43,543 in which you put a picture of you drinking a beer on Facebook, 438 00:20:43,576 --> 00:20:45,144 that automatically translating 439 00:20:45,177 --> 00:20:48,415 into an increase in your car insurance. 440 00:20:48,448 --> 00:20:52,319 NARRATOR: That's bad, but it could get much worse. 
441 00:20:55,788 --> 00:20:57,557 KAKU: Let's say a crime is committed, 442 00:20:57,590 --> 00:21:02,228 and the CIA or the FBI wants to have access to everyone 443 00:21:02,261 --> 00:21:04,964 who has an inclination to do something 444 00:21:04,997 --> 00:21:07,900 like the crime that was just committed. 445 00:21:07,934 --> 00:21:10,069 It scans a database of everybody, 446 00:21:10,102 --> 00:21:11,638 and then just prints out the names. 447 00:21:11,671 --> 00:21:13,072 And you could be totally innocent 448 00:21:13,105 --> 00:21:15,742 and your name could be picked out. 449 00:21:15,775 --> 00:21:18,611 NARRATOR: You might be thinking of Edward Snowden right now. 450 00:21:18,645 --> 00:21:22,349 You can see how, unchecked, this kind of access and power 451 00:21:22,382 --> 00:21:24,784 is wide open for abuse. 452 00:21:24,817 --> 00:21:27,119 NEWITZ: How do we make sure that this kind of technology 453 00:21:27,153 --> 00:21:31,290 isn't being used to categorize people unfairly? 454 00:21:31,324 --> 00:21:34,126 It leads right into issues around racial profiling, 455 00:21:34,160 --> 00:21:35,462 it leads into issues 456 00:21:35,495 --> 00:21:38,197 around other kinds of profiling as well. 457 00:21:38,230 --> 00:21:40,099 NARRATOR: And what about identity theft? 458 00:21:40,132 --> 00:21:41,634 Sure, it's a problem today, 459 00:21:41,668 --> 00:21:44,103 but when the Internet is a part of your brain, 460 00:21:44,136 --> 00:21:46,473 hackers might not just be able to steal your data, 461 00:21:46,506 --> 00:21:49,542 they might also be able to hack your mind! 
462 00:21:49,576 --> 00:21:51,444 MARQUIS-BOIRE: As anyone who's sat in a plane 463 00:21:51,478 --> 00:21:53,813 and worried about the hundreds of thousands of lines of code 464 00:21:53,846 --> 00:21:57,350 that keep them floating in the air, 465 00:21:57,384 --> 00:21:58,951 you know, similarly, 466 00:21:58,985 --> 00:22:01,788 the exploitability of our digital lives 467 00:22:01,821 --> 00:22:05,024 is something that, you know, 468 00:22:05,057 --> 00:22:06,893 it's sort of like an ever-pressing concern 469 00:22:06,926 --> 00:22:08,728 in the back of my mind. 470 00:22:08,761 --> 00:22:11,898 NARRATOR: These are serious issues with major repercussions 471 00:22:11,931 --> 00:22:13,966 that our newly connected society 472 00:22:14,000 --> 00:22:15,568 is going to have to grapple with, 473 00:22:15,602 --> 00:22:17,704 because the Internet isn't going anywhere. 474 00:22:17,737 --> 00:22:19,406 BRYAN JOHNSON: In the history of the human race, 475 00:22:19,439 --> 00:22:23,075 we have never stopped the development of a technology. 476 00:22:23,109 --> 00:22:24,977 No matter how dangerous it is, 477 00:22:25,011 --> 00:22:27,146 we have never been able to stop it. 478 00:22:27,179 --> 00:22:28,381 NARRATOR: The trend of human history 479 00:22:28,415 --> 00:22:30,717 is greater and greater connection. 480 00:22:30,750 --> 00:22:32,251 There's no turning back. 481 00:22:32,284 --> 00:22:35,221 We have opened Pandora's box. 482 00:22:35,254 --> 00:22:36,489 DIAMANDIS: I see us going, 483 00:22:36,523 --> 00:22:40,593 over the next 30, 40 years at the outside, 484 00:22:40,627 --> 00:22:43,162 from individuals, me and you, 485 00:22:43,195 --> 00:22:47,767 to a meta-intelligence, where 8 billion people are plugged in, 486 00:22:47,800 --> 00:22:51,704 through the cloud, knowing each other's thoughts and feelings, 487 00:22:51,738 --> 00:22:55,608 and becoming conscious at a brand-new level. 
488 00:22:55,642 --> 00:22:59,178 NARRATOR: Our current notion of privacy will be ancient history. 489 00:22:59,211 --> 00:23:01,247 But that is a sacrifice we'll have to make, 490 00:23:01,280 --> 00:23:05,585 because we will gain so much more in our new open society. 491 00:23:05,618 --> 00:23:07,454 SANDBERG: One could imagine a world 492 00:23:07,487 --> 00:23:11,524 where thoughts are flowing freely between different minds, 493 00:23:11,558 --> 00:23:13,760 but different minds are solving problems 494 00:23:13,793 --> 00:23:17,229 or looking at things from a different perspective. 495 00:23:17,263 --> 00:23:19,632 NARRATOR: When we master telepathy and redefine privacy, 496 00:23:19,666 --> 00:23:22,735 we'll be on our way to that shining tower of the future. 497 00:23:22,769 --> 00:23:23,936 What comes next? 498 00:23:23,970 --> 00:23:26,105 A revolutionary form of communication 499 00:23:26,138 --> 00:23:27,940 that will unlock the collective power 500 00:23:27,974 --> 00:23:29,576 of our supercharged brains. 501 00:23:29,609 --> 00:23:31,978 Nothing can stop us when we launch human cooperation 502 00:23:32,011 --> 00:23:35,848 into overdrive with swarm intelligence. 503 00:23:43,355 --> 00:23:46,493 NARRATOR: They say two heads are better than one. 504 00:23:46,526 --> 00:23:50,162 Well, in the future we're not going to settle for just two, 505 00:23:50,196 --> 00:23:55,802 try 200, 2,000, 2 million! 506 00:23:55,835 --> 00:24:00,006 Powered by high-speed Internet connected directly to our brain, 507 00:24:00,039 --> 00:24:02,509 we'll all be communicating telepathically 508 00:24:02,542 --> 00:24:07,279 and working together at the speed of light. 509 00:24:07,313 --> 00:24:09,281 And that's going to blow the walls 510 00:24:09,315 --> 00:24:12,418 off what humanity is capable of. 
511 00:24:12,451 --> 00:24:14,320 And whom does the hyper-connected, 512 00:24:14,353 --> 00:24:16,589 telepathic society of Year Million 513 00:24:16,623 --> 00:24:19,125 have to thank for their superpower? 514 00:24:19,158 --> 00:24:21,861 That's right, bees. 515 00:24:21,894 --> 00:24:25,264 Welcome to swarm intelligence. 516 00:24:25,297 --> 00:24:27,867 ['Flight of the Bumblebee' playing] 517 00:24:27,900 --> 00:24:29,268 LOUIS ROSENBERG: Bees go out, 518 00:24:29,301 --> 00:24:31,504 and every year they have to find a new home. 519 00:24:31,538 --> 00:24:34,674 And so what they do is they form a swarm. 520 00:24:34,707 --> 00:24:36,709 And that swarm will negotiate 521 00:24:36,743 --> 00:24:41,213 and find the best possible site among all the options. 522 00:24:41,247 --> 00:24:43,550 And what's amazing is that an individual bee can't conceive 523 00:24:43,583 --> 00:24:46,653 of the problem of finding the best possible site. 524 00:24:46,686 --> 00:24:49,221 But when they work together as a swarm, 525 00:24:49,255 --> 00:24:52,559 they converge on that best answer. 526 00:24:54,727 --> 00:24:57,396 EVELETH: You might know a ton about Chinese geography, 527 00:24:57,429 --> 00:24:58,965 which I don't know anything about. 528 00:24:58,998 --> 00:25:01,233 And I might know a ton about krill, 529 00:25:01,267 --> 00:25:03,035 and you might not know anything about that. 530 00:25:03,069 --> 00:25:04,503 I actually do know a lot about krill. 531 00:25:04,537 --> 00:25:06,205 And then together, we're really good at Jeopardy, 532 00:25:06,238 --> 00:25:07,540 or whatever it is, you know. 533 00:25:07,574 --> 00:25:10,042 And that's kind of the idea, right? 534 00:25:10,076 --> 00:25:13,379 THURSTON: Connectivity breeds connection. 535 00:25:13,412 --> 00:25:15,214 I think that there's something real powerful, 536 00:25:15,247 --> 00:25:17,349 if thought becomes communication. 
537 00:25:17,383 --> 00:25:19,151 ROSENBERG: 'Cause ultimately, 538 00:25:19,185 --> 00:25:24,090 it's about collecting input from diverse groups. 539 00:25:26,726 --> 00:25:29,328 NARRATOR: For animals, the key to swarm intelligence 540 00:25:29,361 --> 00:25:32,398 is rapid communication within the group. 541 00:25:32,431 --> 00:25:34,200 One of the things that's held back humans 542 00:25:34,233 --> 00:25:37,336 from tapping the full potential of swarm intelligence 543 00:25:37,369 --> 00:25:40,039 is our traditional word-based language. 544 00:25:40,072 --> 00:25:43,576 Powerful as it is, it's just too slow. 545 00:25:43,610 --> 00:25:45,344 JOHNSON: Right now, we communicate 546 00:25:45,377 --> 00:25:49,616 at something like 40 to 60 bits per second via voice. 547 00:25:49,649 --> 00:25:53,452 But our brains can process information much faster. 548 00:25:53,485 --> 00:25:55,521 ROSENBERG: If we're trying to solve problems, 549 00:25:55,554 --> 00:25:58,090 and we work together as a system, 550 00:25:58,124 --> 00:25:59,892 we should find solutions to problems 551 00:25:59,926 --> 00:26:02,895 that over time, as technology becomes more seamless, 552 00:26:02,929 --> 00:26:05,364 they'll just think, and they'll think together as a system, 553 00:26:05,397 --> 00:26:06,733 they'll think together as a swarm, 554 00:26:06,766 --> 00:26:08,234 and they'll converge on answers 555 00:26:08,267 --> 00:26:13,272 that optimize the satisfaction of the whole population. 556 00:26:13,305 --> 00:26:15,041 SANDBERG: Once we figure out the science 557 00:26:15,074 --> 00:26:17,610 of deliberately swarming people, 558 00:26:17,644 --> 00:26:19,078 we're going to unleash 559 00:26:19,111 --> 00:26:23,582 a tremendous form of collective intelligence. 560 00:26:23,616 --> 00:26:26,085 NARRATOR: Swarm intelligence could well be the only way 561 00:26:26,118 --> 00:26:27,419 in the far future 562 00:26:27,453 --> 00:26:31,758 that we can compete with artificial intelligence. 
563 00:26:31,791 --> 00:26:34,593 When we combine our collective brain power together, 564 00:26:34,627 --> 00:26:37,596 it will be like millions of incredibly powerful computers 565 00:26:37,630 --> 00:26:42,601 uniting to solve the world's most pressing problems. 566 00:26:42,635 --> 00:26:46,739 Like how to house refugees whose homes have been destroyed by war 567 00:26:46,773 --> 00:26:50,509 or the effects of an increasingly warming planet. 568 00:26:57,516 --> 00:26:59,285 [muffled voices] 569 00:26:59,318 --> 00:27:02,154 This is what swarm intelligence may look like in the future. 570 00:27:02,188 --> 00:27:03,856 Jess is telepathically swarming with other people 571 00:27:03,890 --> 00:27:05,357 around the world 572 00:27:05,391 --> 00:27:08,761 trying to come up with a solution to a refugee crisis. 573 00:27:08,795 --> 00:27:10,362 But where are my manners? 574 00:27:10,396 --> 00:27:13,766 Let me slow this down again so your minds can process it. 575 00:27:13,800 --> 00:27:17,103 Really, this conversation happened in the blink of an eye. 576 00:27:17,136 --> 00:27:19,171 MAN: Another tsunami in less than a month. 577 00:27:19,205 --> 00:27:22,374 WOMAN: Not to mention the hurricanes in North America. 578 00:27:22,408 --> 00:27:24,543 WOMAN: And the drought in Central Asia. 579 00:27:24,576 --> 00:27:27,479 MAN: Climate change is wreaking havoc in our cities. 580 00:27:27,513 --> 00:27:30,149 WOMAN: Millions of people have been displaced. 581 00:27:30,182 --> 00:27:32,484 JESS: We have to do something to help them. 582 00:27:32,518 --> 00:27:34,620 MAN: Can we stabilize the climate? 583 00:27:34,653 --> 00:27:38,224 WOMAN: Eventually, yes, but in the meantime? 584 00:27:38,257 --> 00:27:40,827 JESS: These people need homes. What can we do? 585 00:27:40,860 --> 00:27:42,862 WOMAN: We redesign major cities. 586 00:27:42,895 --> 00:27:44,797 MAN: Relocate to other planets. 587 00:27:44,831 --> 00:27:46,232 WOMAN: Too much time. 
588 00:27:46,265 --> 00:27:49,001 JESS: There has to be an inexpensive and quick solution. 589 00:27:49,035 --> 00:27:50,803 NARRATOR: Working together as a swarm, 590 00:27:50,837 --> 00:27:54,073 they're able to come up with a creative and fast solution 591 00:27:54,106 --> 00:27:55,341 to save the planet. 592 00:27:55,374 --> 00:27:56,642 MAN: We need mobility. 593 00:27:56,675 --> 00:27:57,676 JESS: Mobile. 594 00:27:57,710 --> 00:27:59,045 MAN: Inexpensive. 595 00:27:59,078 --> 00:28:00,980 WOMAN: Clean energy. 596 00:28:01,013 --> 00:28:02,648 JESS: That's it! 597 00:28:02,681 --> 00:28:06,452 NARRATOR: This is the future of communication and cooperation. 598 00:28:12,725 --> 00:28:14,193 JEMISIN: We've got the potential to harness 599 00:28:14,226 --> 00:28:16,595 a tremendously democratizing force. 600 00:28:16,628 --> 00:28:18,597 A planet-wide e-democracy. 601 00:28:18,630 --> 00:28:20,166 Which would be awesome, 602 00:28:20,199 --> 00:28:22,534 if we can manage to do it in a way that's safe. 603 00:28:22,568 --> 00:28:25,404 NARRATOR: Swarms of doctors could find cures for diseases 604 00:28:25,437 --> 00:28:27,039 faster than they could alone. 605 00:28:27,073 --> 00:28:30,409 Swarms of engineers could invent machines and build structures 606 00:28:30,442 --> 00:28:32,444 no individual can imagine. 607 00:28:32,478 --> 00:28:34,847 The bigger and more connected the swarm, 608 00:28:34,881 --> 00:28:38,084 the more powerful it could be. 609 00:28:38,117 --> 00:28:39,418 But like anything powerful, 610 00:28:39,451 --> 00:28:43,622 there is a dark side to the swarm. 611 00:28:43,655 --> 00:28:45,457 NEWITZ: As we all know, big groups of people 612 00:28:45,491 --> 00:28:49,929 sometimes get together and, you know, do really dumb things. 613 00:28:49,962 --> 00:28:51,597 NARRATOR: The Internet is already full 614 00:28:51,630 --> 00:28:52,799 of hackers and trolls. 
615 00:28:52,832 --> 00:28:56,435 Now imagine a global swarm of snooping hackers, 616 00:28:56,468 --> 00:28:59,071 connected directly to your brain. 617 00:28:59,105 --> 00:29:00,773 AMY WEBB: As with everything, 618 00:29:00,807 --> 00:29:06,545 there is the technology and then the way that we use technology. 619 00:29:06,578 --> 00:29:08,881 MARQUIS-BOIRE: Technology acts as a power amplifier, right? 620 00:29:08,915 --> 00:29:12,651 And so it is neither sort of inherently good nor bad. 621 00:29:12,684 --> 00:29:15,321 It's simply a tool that amplifies the desires 622 00:29:15,354 --> 00:29:18,324 of the individual or the institution. 623 00:29:18,357 --> 00:29:22,061 WEBB: Part of our obligations as humans 624 00:29:22,094 --> 00:29:26,598 is to inject ourselves in the process. 625 00:29:26,632 --> 00:29:28,300 THURSTON: We have to learn from the mistakes we've made 626 00:29:28,334 --> 00:29:32,471 with past technologies and with past human interaction. 627 00:29:32,504 --> 00:29:34,941 NARRATOR: Swarm intelligence is powerful, 628 00:29:34,974 --> 00:29:36,408 and we'll have to make sure that 629 00:29:36,442 --> 00:29:38,811 we use our massive new intelligence 630 00:29:38,845 --> 00:29:42,481 to unite humanity, not to oppress others. 631 00:29:42,514 --> 00:29:45,517 But the true test of our elevated ability to cooperate 632 00:29:45,551 --> 00:29:49,655 will come when we encounter something other than ourselves. 633 00:29:49,688 --> 00:29:53,492 That's right. Aliens. 634 00:29:53,525 --> 00:29:56,195 KAKU: Let's say one day we're scanning the heavens, 635 00:29:56,228 --> 00:29:58,697 and we pick up a regular message. 636 00:29:58,730 --> 00:30:01,633 Not random noise, but a regular message. 637 00:30:01,667 --> 00:30:03,369 [humming tune from Close Encounters of the Third Kind] 638 00:30:03,402 --> 00:30:04,937 SOULE: Something like that, right? 639 00:30:04,971 --> 00:30:08,707 KAKU: That, of course, is going to be earthshaking. 
640 00:30:08,740 --> 00:30:10,542 NARRATOR: It certainly will be. 641 00:30:10,576 --> 00:30:13,279 How will we communicate with them? 642 00:30:13,312 --> 00:30:15,814 Understanding what extraterrestrials are doing 643 00:30:15,848 --> 00:30:19,118 in the skies above us will be a major test 644 00:30:19,151 --> 00:30:22,221 of our newfound communication skills. 645 00:30:22,254 --> 00:30:29,028 It may be the difference between survival and extinction. 646 00:30:29,061 --> 00:30:36,068 * 647 00:30:50,749 --> 00:30:52,218 NARRATOR: As we trip the light fantastic 648 00:30:52,251 --> 00:30:53,920 down the path to Year Million, 649 00:30:53,953 --> 00:30:55,187 toward a time when we can build 650 00:30:55,221 --> 00:30:57,323 our very own tower to the heavens, 651 00:30:57,356 --> 00:31:00,759 communication will be completely redefined. 652 00:31:00,792 --> 00:31:03,329 But let's be clear when we talk about the Year Million. 653 00:31:03,362 --> 00:31:06,098 We're not really talking about a specific year. 654 00:31:06,132 --> 00:31:10,069 It's our way of saying a future so different, so transformative, 655 00:31:10,102 --> 00:31:13,172 that it's just a glimmer on the horizon of our imagination. 656 00:31:13,205 --> 00:31:16,108 And only the boldest thinkers are able to see 657 00:31:16,142 --> 00:31:17,543 where we're headed. 658 00:31:17,576 --> 00:31:20,046 We just might make some astounding discoveries 659 00:31:20,079 --> 00:31:23,049 along the way. 660 00:31:23,082 --> 00:31:24,416 KAKU: I'm going to stick my neck out 661 00:31:24,450 --> 00:31:27,586 and say that we will probably make contact 662 00:31:27,619 --> 00:31:30,056 with an extraterrestrial civilization. 663 00:31:30,089 --> 00:31:33,926 That's going to be one of the greatest turning points 664 00:31:33,960 --> 00:31:35,794 in human history. 
665 00:31:35,827 --> 00:31:41,533 Every single historical account of the evolution of our species 666 00:31:41,567 --> 00:31:43,369 will have to take into account 667 00:31:43,402 --> 00:31:45,637 the fact that we have finally made contact 668 00:31:45,671 --> 00:31:48,607 with another intelligent life-form. 669 00:31:48,640 --> 00:31:50,809 NARRATOR: And when that day finally arrives, 670 00:31:50,842 --> 00:31:53,545 the question is, what then? 671 00:31:53,579 --> 00:31:55,047 EVELETH: You want to make sure that, 672 00:31:55,081 --> 00:31:56,415 like, they don't want to kill you. 673 00:31:56,448 --> 00:31:58,117 Very quickly, as quickly as you can, 674 00:31:58,150 --> 00:31:59,585 you want to make sure 675 00:31:59,618 --> 00:32:01,453 that they are not trying to kill and eat you, right. 676 00:32:01,487 --> 00:32:03,789 NARRATOR: Yes, that would be tops on the list, 677 00:32:03,822 --> 00:32:05,224 I would imagine. 678 00:32:05,257 --> 00:32:07,459 Assuming we get past that, what next? 679 00:32:07,493 --> 00:32:10,796 It could be the greatest moment in human history, or not. 680 00:32:10,829 --> 00:32:12,398 After all, first contact 681 00:32:12,431 --> 00:32:15,567 could be 'Arrival' or 'Independence Day.' 682 00:32:15,601 --> 00:32:17,169 THURSTON: I loved 'Arrival.' 683 00:32:17,203 --> 00:32:19,738 It's about feelings and communication. 684 00:32:19,771 --> 00:32:21,507 EVELETH: 'Arrival' is a great example 685 00:32:21,540 --> 00:32:23,775 of being patient and trying to actually communicate, 686 00:32:23,809 --> 00:32:26,245 and trying to think about things scientifically 687 00:32:26,278 --> 00:32:28,414 and not rush into anything. 688 00:32:28,447 --> 00:32:31,083 When we are looking at these species, or whatever 689 00:32:31,117 --> 00:32:32,584 aliens come, 690 00:32:32,618 --> 00:32:35,687 that is a better method than just trying to blow them up. 
691 00:32:35,721 --> 00:32:39,258 NEGIN FARSAD: I like the idea of aliens being like prettier 692 00:32:39,291 --> 00:32:41,293 than what we've thought of them. 693 00:32:41,327 --> 00:32:43,862 You know, we've kind of made them ugly 694 00:32:43,895 --> 00:32:45,531 over the last several decades. 695 00:32:45,564 --> 00:32:47,099 There's no need for that. 696 00:32:47,133 --> 00:32:50,469 They can actually be quite gorgeous, you know what I mean? 697 00:32:50,502 --> 00:32:52,104 NARRATOR: That's a good point, Negin, 698 00:32:52,138 --> 00:32:53,439 but whatever they look like, 699 00:32:53,472 --> 00:32:55,207 let's just assume for the sake of argument 700 00:32:55,241 --> 00:32:57,376 that first contact with extraterrestrials 701 00:32:57,409 --> 00:32:59,545 goes more in the direction of 'Arrival.' 702 00:32:59,578 --> 00:33:00,846 Then our greatest challenge, 703 00:33:00,879 --> 00:33:02,548 what all of our super-charged intelligence 704 00:33:02,581 --> 00:33:05,884 will need to figure out, is how to communicate with them. 705 00:33:05,917 --> 00:33:07,386 But it's not going to be easy. 706 00:33:07,419 --> 00:33:11,690 Without an Alien Dictionary, where do we even begin? 707 00:33:13,059 --> 00:33:14,493 KAKU: There are three features 708 00:33:14,526 --> 00:33:16,728 that we think intelligent alien life will have. 709 00:33:16,762 --> 00:33:19,398 First of all is vision, some kind of stereovision, 710 00:33:19,431 --> 00:33:21,233 the vision of a hunter. 711 00:33:21,267 --> 00:33:24,070 Second is a thumb, a grappling instrument, 712 00:33:24,103 --> 00:33:25,737 a tentacle of some sort. 713 00:33:25,771 --> 00:33:29,308 And third, a language by which you can hand down information 714 00:33:29,341 --> 00:33:31,377 from generation to generation. 
715 00:33:31,410 --> 00:33:33,245 But they're not going to communicate 716 00:33:33,279 --> 00:33:34,880 using American English, 717 00:33:34,913 --> 00:33:37,349 and they're not going to have subject, verb, predicate, 718 00:33:37,383 --> 00:33:39,885 the way we construct sentences. 719 00:33:39,918 --> 00:33:43,889 NICE: I hope, I can only hope 720 00:33:43,922 --> 00:33:49,228 that they all look something like Selma Hayek. 721 00:33:49,261 --> 00:33:50,896 That would be really good. 722 00:33:50,929 --> 00:33:53,031 NARRATOR: We're talking about communication here, Chuck, 723 00:33:53,065 --> 00:33:54,400 let's stick to the subject. 724 00:33:54,433 --> 00:33:55,767 EVELETH: What are the linguistics of this? 725 00:33:55,801 --> 00:33:57,369 What does this actually look like? 726 00:33:57,403 --> 00:33:58,870 How do we figure out, 727 00:33:58,904 --> 00:34:00,772 when they're painting these weird circles, 728 00:34:00,806 --> 00:34:03,109 and we're using these weird lines and sticks, 729 00:34:03,142 --> 00:34:05,577 how do we figure out how to communicate with them? 730 00:34:05,611 --> 00:34:07,446 NARRATOR: How we figure out the aliens' language 731 00:34:07,479 --> 00:34:09,348 will make or break us, 732 00:34:09,381 --> 00:34:11,783 and scientists are already working on it. 733 00:34:11,817 --> 00:34:13,085 How, you might ask? 734 00:34:13,119 --> 00:34:15,921 Well, by taking a page from Dr. Doolittle's book 735 00:34:15,954 --> 00:34:19,925 and starting right here with the animals on Earth. 736 00:34:19,958 --> 00:34:22,361 Dolphins to be specific. 737 00:34:22,394 --> 00:34:24,296 At the National Aquarium in Maryland, 738 00:34:24,330 --> 00:34:27,599 Dr. Diana Reiss and her team are studying dolphins 739 00:34:27,633 --> 00:34:31,103 and how one day we might be able to not just understand them, 740 00:34:31,137 --> 00:34:33,605 but communicate with them. 
741 00:34:33,639 --> 00:34:34,806 DIANA REISS: I got really interested 742 00:34:34,840 --> 00:34:36,775 in working with dolphins particularly, 743 00:34:36,808 --> 00:34:39,111 because they were so different from us. 744 00:34:39,145 --> 00:34:41,947 These animals are truly non-terrestrials 745 00:34:41,980 --> 00:34:44,283 in every sense of the word. 746 00:34:44,316 --> 00:34:45,951 ANA HOCEVAR: We're trying to understand 747 00:34:45,984 --> 00:34:49,488 how we could communicate to a completely different species 748 00:34:49,521 --> 00:34:53,925 that is as close to an alien as you can get, for a human. 749 00:34:53,959 --> 00:34:55,494 NARRATOR: 95 million years ago 750 00:34:55,527 --> 00:34:59,298 dolphins and primates parted ways on the evolutionary chain. 751 00:34:59,331 --> 00:35:01,400 But like us, dolphins have big brains 752 00:35:01,433 --> 00:35:03,935 and a sophisticated social intelligence. 753 00:35:03,969 --> 00:35:06,838 So, as far as working with any other animals on the planet, 754 00:35:06,872 --> 00:35:09,341 there is none better suited to being a test case 755 00:35:09,375 --> 00:35:11,910 for learning to speak to aliens than dolphins. 756 00:35:11,943 --> 00:35:16,248 REISS: What we did was we created an underwater keyboard. 757 00:35:16,282 --> 00:35:18,150 It's like a big iPhone. 758 00:35:18,184 --> 00:35:20,486 And if you touch it, something happens. 759 00:35:20,519 --> 00:35:23,489 I want to give us this interface, 760 00:35:23,522 --> 00:35:26,458 a window where we can exchange things. 761 00:35:26,492 --> 00:35:27,759 HOCEVAR: It really opens the door 762 00:35:27,793 --> 00:35:29,861 to understanding their vocalizations. 763 00:35:29,895 --> 00:35:32,664 NARRATOR: Of course! An interspecies iPhone. 764 00:35:32,698 --> 00:35:34,333 Dr. 
Reiss and her team are hopeful 764 00:35:34,366 --> 00:35:36,302 that this technology will be a platform 765 00:35:36,335 --> 00:35:37,669 through which humans and dolphins 766 00:35:37,703 --> 00:35:40,872 can one day learn to understand one another. 767 00:35:40,906 --> 00:35:43,108 REISS: Wouldn't it be amazing when they hit a key, 768 00:35:43,141 --> 00:35:44,643 it translates to English? 769 00:35:44,676 --> 00:35:48,347 You can hear that and you can respond. 770 00:35:48,380 --> 00:35:49,548 NARRATOR: Amazingly, their efforts 771 00:35:49,581 --> 00:35:51,383 are already being rewarded. 772 00:35:51,417 --> 00:35:52,851 The dolphins have already figured out 773 00:35:52,884 --> 00:35:54,886 that if they touch the screen with their beaks, 774 00:35:54,920 --> 00:35:57,189 they get a reaction. 775 00:35:57,223 --> 00:35:59,991 MATT MIRA: What are dolphins going to be talking about? 776 00:36:00,025 --> 00:36:03,962 Yeah, water's kind of warm today, huh? Yep. 777 00:36:03,995 --> 00:36:06,164 SOULE: I hope that that's basically what they're saying. 778 00:36:06,198 --> 00:36:09,668 These fish are great. I love to swim. Let's jump. 779 00:36:09,701 --> 00:36:11,737 NARRATOR: Or maybe they're discussing dolphin politics 780 00:36:11,770 --> 00:36:13,004 and dolphin philosophy. 781 00:36:13,038 --> 00:36:15,907 We just don't know, but one day we might. 782 00:36:15,941 --> 00:36:19,578 REISS: In a way, the touchscreen is a true window in itself, 783 00:36:19,611 --> 00:36:21,380 into the minds of these animals. 784 00:36:21,413 --> 00:36:25,083 I think technology can make what's invisible to us 785 00:36:25,116 --> 00:36:26,518 perhaps more visible; 786 00:36:26,552 --> 00:36:29,688 what's inaudible to us more audible. 
788 00:36:29,721 --> 00:36:31,423 MARCELO MAGNASCO: If we were to succeed, 789 00:36:31,457 --> 00:36:34,059 we would succeed in actually 790 00:36:34,092 --> 00:36:36,862 not being alone in the universe anymore, 791 00:36:36,895 --> 00:36:38,997 which is a pretty sweet thought. 792 00:36:39,030 --> 00:36:40,599 NARRATOR: And that's why 793 00:36:40,632 --> 00:36:44,536 the Search for Extraterrestrial Intelligence, or SETI Institute, 794 00:36:44,570 --> 00:36:46,272 has been tracking her work with dolphins. 795 00:36:46,305 --> 00:36:49,441 The challenges she faces are the same ones they anticipate 796 00:36:49,475 --> 00:36:52,511 they'll face when aliens show up. 797 00:36:52,544 --> 00:36:53,845 We'll need a plan. 798 00:36:53,879 --> 00:36:57,549 And this technology just may be the answer. 799 00:36:57,583 --> 00:37:00,452 REISS: The technology that we're developing in these projects 800 00:37:00,486 --> 00:37:03,188 will enable us to see better, to hear better, 801 00:37:03,221 --> 00:37:07,593 to understand better, and to empathize more and care more, 802 00:37:07,626 --> 00:37:09,060 once we have that knowledge. 803 00:37:09,094 --> 00:37:12,964 You know, with knowledge comes great responsibility. 804 00:37:12,998 --> 00:37:14,266 NARRATOR: It certainly does. 805 00:37:14,300 --> 00:37:16,402 I couldn't have said it better myself. 806 00:37:16,435 --> 00:37:19,405 THURSTON: There is a potential for a shared connection 807 00:37:19,438 --> 00:37:21,407 and a shared mindset. 808 00:37:21,440 --> 00:37:23,542 FARSAD: With an interspecies Internet, 809 00:37:23,575 --> 00:37:25,043 we're sort of headed 810 00:37:25,076 --> 00:37:29,615 in this kind of universal one-language scenario. 811 00:37:29,648 --> 00:37:33,752 So if you've ever wanted to talk to a walrus, 812 00:37:33,785 --> 00:37:36,755 in the future, you could do that. 813 00:37:36,788 --> 00:37:40,058 THURSTON: That could be pretty magical. 
814 00:37:40,091 --> 00:37:41,593 NARRATOR: Pretty magical indeed. 815 00:37:41,627 --> 00:37:44,129 In the future, we may not agree with dolphins or aliens, 816 00:37:44,162 --> 00:37:46,665 or even walruses on anything. 817 00:37:46,698 --> 00:37:48,667 But as long as we're communicating, 818 00:37:48,700 --> 00:37:52,237 there is always an opportunity for greater understanding, 819 00:37:52,270 --> 00:37:54,673 greater empathy, and greater connection. 820 00:37:54,706 --> 00:37:57,509 And as we travel deeper and deeper into the future, 821 00:37:57,543 --> 00:37:59,411 we'll need to master communication 822 00:37:59,445 --> 00:38:01,079 with all sentient beings 823 00:38:01,112 --> 00:38:06,184 if we ever hope to build our own Tower of Babel in the future. 824 00:38:06,217 --> 00:38:07,686 But we're not there yet. 825 00:38:07,719 --> 00:38:10,021 We're going to connect on a whole other level. 826 00:38:10,055 --> 00:38:12,624 If you thought losing your privacy was a big deal, 827 00:38:12,658 --> 00:38:16,462 the final step is going to be a big pill to swallow. 828 00:38:16,495 --> 00:38:18,664 DVORSKY: You're starting to lose the individual, 829 00:38:18,697 --> 00:38:19,865 and you're starting to now gain 830 00:38:19,898 --> 00:38:22,167 in this kind of massive collectivity, 831 00:38:22,200 --> 00:38:24,870 this, this entity that kind of maybe even thinks 832 00:38:24,903 --> 00:38:28,173 and has impulses and tendencies toward a certain direction. 833 00:38:28,206 --> 00:38:30,442 The sum of its intelligence would hopefully be greater 834 00:38:30,476 --> 00:38:32,811 than the sum of its parts. 835 00:38:32,844 --> 00:38:34,846 THURSTON: That's going to change politics, right? 836 00:38:34,880 --> 00:38:37,182 That's going to change relationships. 837 00:38:37,215 --> 00:38:39,618 Could you merge minds? 838 00:38:39,651 --> 00:38:41,553 NARRATOR: Oh, yes, we can, and we will. 
839 00:38:41,587 --> 00:38:44,022 In the Year Million era we'll take the final plunge 840 00:38:44,055 --> 00:38:47,359 and shed our ego, our sense of self, our individuality, 841 00:38:47,393 --> 00:38:49,795 and join together in a single consciousness. 842 00:38:49,828 --> 00:38:51,730 It's a high-bandwidth blending of our minds 843 00:38:51,763 --> 00:38:53,732 that creates its own super intelligence, 844 00:38:53,765 --> 00:38:57,669 a consciousness of which each person is just one small part. 845 00:38:57,703 --> 00:39:01,272 Say hello to your future in the hive mind. 846 00:39:10,115 --> 00:39:11,883 NARRATOR: We're almost at the end of our journey 847 00:39:11,917 --> 00:39:13,251 to Year Million 848 00:39:13,284 --> 00:39:17,723 and that future version of a Tower of Babel to the heavens. 849 00:39:17,756 --> 00:39:20,225 And the final step in our communication evolution 850 00:39:20,258 --> 00:39:21,893 is a doozy. 851 00:39:21,927 --> 00:39:23,161 Yep, you got it. 852 00:39:23,194 --> 00:39:25,196 In the Year Million we're not just communicating 853 00:39:25,230 --> 00:39:26,998 telepathically mind-to-mind, 854 00:39:27,032 --> 00:39:29,601 using our hyper-connectivity to swarm together 855 00:39:29,635 --> 00:39:32,203 and increase our intelligence a thousand-fold. 856 00:39:32,237 --> 00:39:36,174 The final step will be when we finally shed our sense of self 857 00:39:36,207 --> 00:39:37,743 and individuality 858 00:39:37,776 --> 00:39:42,914 and merge our minds together into a single consciousness. 859 00:39:45,951 --> 00:39:47,753 DVORSKY: As we become progressively interconnected 860 00:39:47,786 --> 00:39:49,054 with each other, 861 00:39:49,087 --> 00:39:51,423 lines that separate one brain from another brain 862 00:39:51,457 --> 00:39:53,425 will become increasingly blurred. 
863 00:39:53,459 --> 00:39:54,893 And if you can imagine, you know, 864 00:39:54,926 --> 00:39:57,362 hundreds or if not even thousands of individuals, 865 00:39:57,395 --> 00:39:59,064 interlinked in this way, 866 00:39:59,097 --> 00:40:02,568 you're going to have what's referred to as the hive mind. 867 00:40:02,601 --> 00:40:04,903 NARRATOR: Whoa, the hive mind. 868 00:40:04,936 --> 00:40:06,371 It sounds scary, 869 00:40:06,404 --> 00:40:09,741 and it just might be, and we'll get to that in a minute. 870 00:40:09,775 --> 00:40:13,244 But it could also be incredibly empowering. 871 00:40:13,278 --> 00:40:14,880 NEWITZ: Basically like what we do 872 00:40:14,913 --> 00:40:16,648 when we create a computer cluster, 873 00:40:16,682 --> 00:40:18,049 putting all the computers together 874 00:40:18,083 --> 00:40:19,417 and having them work together 875 00:40:19,451 --> 00:40:22,488 allows them to do bigger and tougher problems. 876 00:40:22,521 --> 00:40:23,922 NARRATOR: You might say to yourself 877 00:40:23,955 --> 00:40:25,624 that sounds a lot like the swarm intelligence 878 00:40:25,657 --> 00:40:28,059 we talked about, but they are very different. 879 00:40:28,093 --> 00:40:30,629 Swarm intelligence is decentralized. 880 00:40:30,662 --> 00:40:33,632 We will still have our own identity, a sense of self. 881 00:40:33,665 --> 00:40:35,967 Hive mind is a paradigm shift. 882 00:40:36,001 --> 00:40:38,904 We will let go of our egos and come together 883 00:40:38,937 --> 00:40:41,239 to create a centralized consciousness. 884 00:40:41,272 --> 00:40:43,274 When you connect to the hive mind, 885 00:40:43,308 --> 00:40:46,311 you will share everything. 886 00:40:49,981 --> 00:40:52,818 NEWITZ: It could be a really great experience at a concert, 887 00:40:52,851 --> 00:40:54,953 where like all of us are feeling the same thing 888 00:40:54,986 --> 00:40:58,456 as we listen to an awesome guitar solo. 889 00:40:58,490 --> 00:41:01,760 NARRATOR: Exactly, like a concert. 
890 00:41:01,793 --> 00:41:04,596 Speaking of which, remember that concert we saw earlier? 891 00:41:04,630 --> 00:41:07,332 Our future family has become part of the hive mind, 892 00:41:07,365 --> 00:41:11,136 and is experiencing a concert through a single consciousness. 893 00:41:11,169 --> 00:41:12,470 BRIAN GREENE: If all those minds 894 00:41:12,504 --> 00:41:16,041 are in some computer digital environment 895 00:41:16,074 --> 00:41:18,844 that allows them to interface in a more profound way 896 00:41:18,877 --> 00:41:21,947 than the biological means that we have at our disposal now, 897 00:41:21,980 --> 00:41:24,650 I can't help but think that that would be a greater level 898 00:41:24,683 --> 00:41:27,385 of collective communication. 899 00:41:27,418 --> 00:41:28,620 NARRATOR: It would be. 900 00:41:28,654 --> 00:41:30,556 And that's why they can be any part 901 00:41:30,589 --> 00:41:32,524 of this little concert they desire. 902 00:41:32,558 --> 00:41:33,892 Plugged in like they are, 903 00:41:33,925 --> 00:41:36,294 their minds will create a larger intelligence, 904 00:41:36,327 --> 00:41:37,963 a larger consciousness. 905 00:41:37,996 --> 00:41:40,465 Our whole idea of what it means to be human 906 00:41:40,498 --> 00:41:42,133 will change drastically, 907 00:41:42,167 --> 00:41:45,503 because when we let go of the ego, the self-preservation, 908 00:41:45,537 --> 00:41:48,006 the competition involved with having a sense of self 909 00:41:48,039 --> 00:41:51,677 and come together as one, well, then everything is possible. 910 00:41:51,710 --> 00:41:53,478 GREENE: There is something very powerful 911 00:41:53,511 --> 00:41:56,314 of all minds working together. 912 00:41:56,347 --> 00:41:59,084 Maybe we'll find, in this domain, 913 00:41:59,117 --> 00:42:01,720 that the notion of decision 914 00:42:01,753 --> 00:42:05,991 is not where we find our individual footprint. 
915 00:42:06,024 --> 00:42:09,494 Maybe our individual footprint comes with an outlook 916 00:42:09,527 --> 00:42:12,664 or a perspective that we hold dear, 917 00:42:12,698 --> 00:42:16,334 and only we as individuals are aware of it, or know it. 918 00:42:16,367 --> 00:42:19,605 Maybe that will be enough, perhaps. 919 00:42:19,638 --> 00:42:22,173 NEWITZ: I definitely think it will radically impact 920 00:42:22,207 --> 00:42:25,577 what we think of as a self. 921 00:42:25,611 --> 00:42:27,078 NARRATOR: Take a moment, breathe. 922 00:42:27,112 --> 00:42:29,748 I know, it's a big idea to wrap your head around. 923 00:42:29,781 --> 00:42:31,416 Everything we think makes us human 924 00:42:31,449 --> 00:42:33,585 is tied to our sense of self. 925 00:42:33,619 --> 00:42:36,021 Hive mind won't come without sacrifice. 926 00:42:36,054 --> 00:42:38,223 It will take a complete redefinition 927 00:42:38,256 --> 00:42:39,557 of what it means to be human. 928 00:42:39,591 --> 00:42:43,595 The question is, will it be worth it? 929 00:42:43,629 --> 00:42:45,030 FARSAD: What feels dangerous about the hive mind 930 00:42:45,063 --> 00:42:47,132 is that we'll all, like, just know the same things 931 00:42:47,165 --> 00:42:49,601 and then we won't have anything to talk about. 932 00:42:49,635 --> 00:42:53,104 It'll be so sad, because like the whole point of life 933 00:42:53,138 --> 00:42:55,040 is to just like hang out with your friends. 934 00:42:55,073 --> 00:42:57,643 It's like, talk some smack, you know what I mean, 935 00:42:57,676 --> 00:43:00,278 and if you all already know the thing, you know, 936 00:43:00,311 --> 00:43:01,613 there's no smack to talk, 937 00:43:01,647 --> 00:43:03,782 and that would be very frustrating. 938 00:43:03,815 --> 00:43:05,583 NARRATOR: Frustrating indeed. 939 00:43:05,617 --> 00:43:08,654 What's life without some good old-fashioned trash talk? 940 00:43:08,687 --> 00:43:10,622 Well, we just might find out. 
941 00:43:10,656 --> 00:43:12,490 But what about our autonomy? 942 00:43:12,523 --> 00:43:14,559 When we mingle our minds, what happens? 943 00:43:14,592 --> 00:43:16,161 Are we still you and me? 944 00:43:16,194 --> 00:43:18,396 Or do we become the same thing? 945 00:43:18,429 --> 00:43:23,334 NEWITZ: Any technology that we use to mingle our minds 946 00:43:23,368 --> 00:43:26,404 could have good or bad effects. 947 00:43:26,437 --> 00:43:28,740 So you want to be able to step back 948 00:43:28,774 --> 00:43:31,810 and be a little bit skeptical of any kind of groupthink. 949 00:43:31,843 --> 00:43:33,478 MARQUIS-BOIRE: There is the worry 950 00:43:33,511 --> 00:43:38,016 that this connected hive mind can be used in sinister ways, 951 00:43:38,049 --> 00:43:40,185 depending on who's in charge of it. 952 00:43:40,218 --> 00:43:42,187 NARRATOR: That's right, when you're part of the hive mind, 953 00:43:42,220 --> 00:43:44,756 your mind, at least in the traditional sense, 954 00:43:44,790 --> 00:43:46,692 might not be yours alone. 955 00:43:46,725 --> 00:43:49,828 KAKU: You're like a worker bee in a gigantic hive. 956 00:43:49,861 --> 00:43:54,265 You have no individuality whatsoever. 957 00:43:54,299 --> 00:43:56,534 MIRA: You know, if you think about the Borg from Star Trek, 958 00:43:56,567 --> 00:43:57,769 the Borg is the hive mind. 959 00:43:57,803 --> 00:43:59,537 The Borg are a collective, 960 00:43:59,570 --> 00:44:03,208 so they think as a whole and not as an individual. 961 00:44:03,241 --> 00:44:07,312 And in many ways individuality is what makes us feel human. 962 00:44:07,345 --> 00:44:12,317 To have that stripped away and become part of a hive collective 963 00:44:12,350 --> 00:44:15,120 is one of the more terrifying things you could do. 964 00:44:15,153 --> 00:44:16,955 It's like joining a cult. 
965 00:44:16,988 --> 00:44:20,225 NEWITZ: The perfect way to get a zombie army would be 966 00:44:20,258 --> 00:44:21,860 string all their brains together, 967 00:44:21,893 --> 00:44:24,229 hook them up to somebody who really knows what they're doing, 968 00:44:24,262 --> 00:44:25,731 and just blasts their brains 969 00:44:25,764 --> 00:44:28,499 with like whatever information they want to give them. 970 00:44:28,533 --> 00:44:31,302 You know, now you must do this labor 971 00:44:31,336 --> 00:44:34,205 in order to exalt the great one. 972 00:44:34,239 --> 00:44:36,007 NARRATOR: That doesn't sound good at all. 973 00:44:36,041 --> 00:44:37,175 What if I want out? 974 00:44:37,208 --> 00:44:40,578 Is there some sort of hive mind eject button? 975 00:44:40,611 --> 00:44:42,047 DVORSKY: One would hope, for example, 976 00:44:42,080 --> 00:44:44,015 that you could perhaps pull out of the hive mind, 977 00:44:44,049 --> 00:44:46,251 that you could remove yourself from the grid. 978 00:44:46,284 --> 00:44:47,753 We struggle with this today, 979 00:44:47,786 --> 00:44:49,888 we turn off our phones or go into airplane mode, 980 00:44:49,921 --> 00:44:51,656 and we feel like we're naked somehow, 981 00:44:51,689 --> 00:44:53,358 or that somehow we're disconnected from the world. 982 00:44:53,391 --> 00:44:55,393 Imagine how terrifying or disconcerting it would be 983 00:44:55,426 --> 00:44:58,730 in the future, if we suddenly, after engaging in a hive mind, 984 00:44:58,764 --> 00:45:01,366 we pulled our self out of it. 985 00:45:01,399 --> 00:45:02,667 NARRATOR: The hive mind sounds like 986 00:45:02,700 --> 00:45:04,602 it could be a deeply oppressive place, 987 00:45:04,635 --> 00:45:07,405 like North Korea but on steroids. 988 00:45:07,438 --> 00:45:09,240 That's the worst-case scenario. 
989 00:45:09,274 --> 00:45:12,744 KAKU: But another possibility is that it is freedom, 990 00:45:12,778 --> 00:45:14,079 a world of enlightenment, 991 00:45:14,112 --> 00:45:17,182 a world of knowledge and prosperity. 992 00:45:17,215 --> 00:45:19,250 NARRATOR: When we're all joined together as one, 993 00:45:19,284 --> 00:45:23,088 could we finally eliminate conflict, wars and suffering? 994 00:45:23,121 --> 00:45:25,623 THURSTON: And if you and I are the same, 995 00:45:25,656 --> 00:45:28,894 then when I hurt you, I literally hurt myself. 996 00:45:28,927 --> 00:45:33,064 That changes war, that changes anger, that changes love. 997 00:45:33,098 --> 00:45:37,102 'Cause when I love you, I love myself. 998 00:45:37,135 --> 00:45:38,403 ROSENBERG: We could evolve into something 999 00:45:38,436 --> 00:45:39,971 that we can't even conceive, 1000 00:45:40,005 --> 00:45:42,974 into a different type of creature. 1001 00:45:43,008 --> 00:45:44,876 This super-organism. 1002 00:45:44,910 --> 00:45:47,078 NARRATOR: Might the hive mind even be necessary 1003 00:45:47,112 --> 00:45:48,479 for our survival? 1004 00:45:48,513 --> 00:45:50,315 I mean, we've come this far alone. 1005 00:45:50,348 --> 00:45:51,817 But you know the old saying, 1006 00:45:51,850 --> 00:45:54,986 divided we fall and united we stand. 1007 00:45:55,020 --> 00:45:58,089 GREENE: To my mind, the only way that we survive 1008 00:45:58,123 --> 00:45:59,257 into the far future, 1009 00:45:59,290 --> 00:46:02,593 is to bring us all together in some manner 1010 00:46:02,627 --> 00:46:05,931 that leverages the whole collective consciousness 1011 00:46:05,964 --> 00:46:10,335 in a way that's more powerful than the individual minds alone. 1012 00:46:10,368 --> 00:46:11,736 THURSTON: I think that's exciting. 1013 00:46:11,769 --> 00:46:15,240 I think it's weird, though. 1014 00:46:15,273 --> 00:46:18,109 SANDBERG: I imagine it as a vast coral reef. 
1015 00:46:18,143 --> 00:46:21,346 Explosion of new forms, new kinds of minds, 1016 00:46:21,379 --> 00:46:23,614 new kind of consciousness. 1017 00:46:23,648 --> 00:46:25,750 I can't imagine any of the details. 1018 00:46:25,783 --> 00:46:28,019 Because I think most of them would be beyond 1019 00:46:28,053 --> 00:46:29,921 my puny human brain. 1020 00:46:29,955 --> 00:46:32,858 Just like an ant cannot understand a city, 1021 00:46:32,891 --> 00:46:36,127 we cannot understand Year Million. 1022 00:46:36,161 --> 00:46:40,698 But we can see something there, beyond the clouds. 1023 00:46:40,731 --> 00:46:42,467 NARRATOR: That is the future we're barreling toward 1024 00:46:42,500 --> 00:46:43,935 in Year Million. 1025 00:46:43,969 --> 00:46:46,037 When we are one with animals, extraterrestrials, 1026 00:46:46,071 --> 00:46:47,873 and most importantly, each other, 1027 00:46:47,906 --> 00:46:51,376 then beyond the clouds may be exactly where we find ourselves. 1028 00:46:51,409 --> 00:46:55,680 That's right, we'll be building a new future, our own tower, 1029 00:46:55,713 --> 00:46:58,116 perhaps in a far-off distant galaxy, 1030 00:46:58,149 --> 00:46:59,217 a gleaming testament 1031 00:46:59,250 --> 00:47:01,819 to our brilliant ingenuity and creativity. 1032 00:47:01,853 --> 00:47:03,354 We're headed for the stars. 1033 00:47:03,388 --> 00:47:04,722 And that will be possible 1034 00:47:04,755 --> 00:47:06,858 because of the coming communication revolution 1035 00:47:06,892 --> 00:47:08,626 that will take human intelligence 1036 00:47:08,659 --> 00:47:11,162 into the stratosphere.
