Through the Wormhole -- S07E02: Is Privacy Dead? (English transcript)

We live under a billion unblinking eyes... a global surveillance system that solves crime... uncovers terrorist plots... and helps stop abuse of power. But are we ready for a world without secrets... where not even our homes are off-limits and corporations know our every desire? Should we say goodbye to our privacy? Or is it time for the watched to become the watchers?

Space, time, life itself. The secrets of the cosmos lie through the wormhole.

Ever have the feeling you're being watched? Well, you probably are. If you live in a large city, surveillance cameras take your picture hundreds of times per day. Every transaction you make is electronically logged. Scanners at airports can peer through your clothes. The latest models can even detect your emotional state. And in those moments when we're not being tracked, we're busy giving away our personal information on social media. We think of our privacy as a fundamental right. Now it appears to be on the brink of extinction, which sounds like a nightmare. But is it?

This footage was shot by the Hawkeye II surveillance camera flying two miles above Ciudad Juárez in Mexico. Once every second for hours on end, it takes a picture of the entire city. Here it documents the murder of a police officer by members of a drug cartel. But it also captures the movements of the assassins. It tracks their cars as they leave the scene and leads the police to their hideout. Cities around the world are beginning to use these total surveillance systems. One could be watching you right now.
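The core trick in wide-area systems like Hawkeye II -- photograph the whole city once a second, then follow whatever moves between frames -- can be illustrated with a minimal sketch. This is not the actual Hawkeye II pipeline; it is simple background subtraction with OpenCV, and the input file name is hypothetical.

```python
# Minimal sketch of spotting movers in one-frame-per-second aerial imagery:
# flag pixels that changed since earlier frames, then report the moving blobs.
# Not the real Hawkeye II software; just OpenCV background subtraction.
import cv2

cap = cv2.VideoCapture("city_aerial.mp4")        # hypothetical input file
subtractor = cv2.createBackgroundSubtractorMOG2(history=120)

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)               # changed pixels = movers
    mask = cv2.medianBlur(mask, 5)               # suppress speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < 50:              # ignore tiny blobs
            continue
        x, y, w, h = cv2.boundingRect(c)
        print(f"frame {frame_idx}: mover near ({x + w // 2}, {y + h // 2})")
    frame_idx += 1
cap.release()
```

Stitching those per-frame detections into trajectories -- the step that led police from the crime scene to the hideout -- is a data-association problem layered on top of this.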
Nick Bostrom runs the Future of Humanity Institute at Oxford University. He believes constant surveillance will radically reshape our lives, but we won't end up fearing it like Big Brother. Nick believes we'll embrace it.

Surveillance technology might be one of those things that could change social dynamics in some fairly fundamental way. Yes, already in urban environments, there are a lot of cameras looking at us all the time. So, it's a lot of eyeballs, but they are kind of semi-isolated. An obvious next step is one where all these video feeds are stored in perpetuity and coupled with facial-recognition systems, so that you can automatically tag and keep track of where any individual has been, whom they have been talking with, what they have been doing.

That sounds like a bad thing, but it doesn't have to be. Think about contagious diseases. The virus this man is carrying could spread across the city in just a few days. It could start a pandemic that kills tens of thousands.

If you have a new outbreak -- you have SARS or H1N1 or some new disease that pops up -- it gets really important to try to trace who might have been exposed to the germ. And it's painstaking work. You have to interview the people and try to get who they have interacted with, and then you have to go and interview those people.

But constant surveillance could spot the origins of the outbreaks in real time, locating the infected, dispatching medical teams to them, and establishing quarantines.

Please return to your homes. This area is under temporary quarantine.

Those things obviously can become more efficient the more detailed information you have. So, if you nip it in the bud, you potentially save millions of lives.
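The tracing Bostrom describes -- find a case's contacts, then those contacts' contacts -- is a breadth-first search over an interaction graph. A minimal sketch with toy data; a surveillance system would build the graph from camera sightings of who met whom.

```python
# Contact tracing as breadth-first search: starting from a known case,
# walk outward through recorded interactions, one hop at a time.
from collections import deque

contacts = {                      # toy interaction graph (who met whom)
    "patient_zero": ["alice", "bob"],
    "alice": ["carol"],
    "bob": ["dave", "alice"],
    "carol": [],
    "dave": ["erin"],
    "erin": [],
}

def trace(source, max_hops=2):
    """Return everyone within max_hops interactions of the source case."""
    seen = {source: 0}
    queue = deque([source])
    while queue:
        person = queue.popleft()
        if seen[person] == max_hops:
            continue                       # don't expand past the hop limit
        for other in contacts.get(person, []):
            if other not in seen:
                seen[other] = seen[person] + 1
                queue.append(other)
    return seen

print(trace("patient_zero"))
# {'patient_zero': 0, 'alice': 1, 'bob': 1, 'carol': 2, 'dave': 2}
```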
Imagine if every single person you interacted with were tracked 24/7.

You'd actually be able to see what somebody has been up to in the past. Have they kept their promises? There are a lot of jerks and cheaters in the world who get away with it. And by the time people wise up, they have moved on to their next victims. It would be kind of nice to be able to disempower the jerks and cheaters, and it would encourage more people to behave in ways that -- that are good.

And if cameras tracked you every moment of the day, some aspects of your life would become a lot more convenient.

You could go into a shop and just take what you need, and the camera recognizes who you are and automatically rings it up to your bank account, and it's all taken care of.

If you hadn't yet kicked the old habit of carrying a wallet, you'd never have to worry about remembering where you left it.

In some ways, it's actually a return to a more normal human condition. We used to live in small tribes, small bands. You kind of know what everybody's doing, who they are, what they are up to, what they have been doing in the past. In some respects, it's not a complete novelty. It might be more a return to normalcy.

Life under global surveillance might resemble life in a small village. But could we adapt to being constantly surveilled, even inside our own homes? Most people say they don't like being watched when they're eating, washing, or doing anything in the nude. But our homes are already full of cameras, from security cameras and cellphones to laptops and TVs. They're even hidden inside clocks that keep an eye on the nanny. You might think these cameras are harmless, but they all connect to the internet, which means they can be hacked.

Cognitive scientist Antti Oulasvirta lives in Finland, a country known for notoriously shy people. They say you can spot an extroverted Finn because they're looking at your shoes, not their own. Antti realized his countrymen would be perfect guinea pigs for an experiment to see how people react to having no privacy.

I have a motivation to keep the kitchen clean, but I have an extra motivation today -- this camera.

Antti wanted to see how people's behavior changed when their homes were wired for constant surveillance, so he persuaded several households to do what for Finns is unthinkable -- submit to being watched for an entire year.
We wired 10 households in Finland for 12 months, including cameras and microphones and even screen capture.

So, this is BOB, and BOB looks like a regular piece of home electronics, but he's not.

B-O-B, which stands for "behavioral observation system," records all the video and audio around the house. BOB also keeps track of all e-mail, web traffic, online purchases, and television-viewing habits.

So, we had them covered pretty well in all areas of ubiquitous surveillance. We didn't want to bust people for doing anything wrong. We simply wanted to see how they would react to not being able to be alone in their own homes.

In the first weeks of the study, Antti noticed his subjects appeared unsettled by the presence of the cameras.

They had to keep their impulses in check, control their shouting. And if there was a stressful situation playing out in their lives, that could amplify the stress.

And -- no surprise -- they were sensitive about being naked.

Being naked was, of course, an issue, and we left them a few spots -- for example, bathrooms -- where they could be alone without the cameras.

They were like fish in a fishbowl. But as time went on, Antti noticed something surprising.

So, after surveilling people for six months, we asked them to draw us a graph of their stress levels. They were stressed out in the beginning, but after a while, it leveled off.

Eventually, the subjects started to relax. They stopped worrying about being seen naked.

So, the mentality was that, "now you've seen me once walking around the kitchen naked, so what's the point of continuing to hide?"

And when they really needed privacy for a delicate conversation, they figured out how to get it.

They went to cafés to have private conversations, and they avoided the cameras in creative ways.

Antti's study shows we can adapt to almost constant surveillance. He admits that this was a special case.
The subjects knew him and trusted him not to share the data. But to Antti's surprise, some people didn't care who was watching.

We asked the subjects who they would least want to share the data with, and the most striking feature was, some went as far as saying that "it doesn't matter to whom you share the data."

We have an amazing ability to adapt to changing environments. But if we learn to ignore cameras, it won't be long before we stop thinking about who's watching and why they are watching. They say ignorance is bliss. But in this case, what you don't know could hurt you.

In George Orwell's novel "1984," everyone lived under the watchful eye of an authoritarian government -- Big Brother. Today, in real life, there's a different watchful eye we should worry about -- big business. Ask yourself this -- do you know when corporations are watching you or how much they already know about you? The answer will shock you.

Alessandro Acquisti always puts safety first. He knows a helmet will keep him safe on the road, and the same goes for his social-media profile picture.

I do have a Facebook profile. For my profile picture, I wear a motorcycle helmet. Your name and your profile picture are public by default. Therefore, they're searchable. So, my question is, how much can I learn about you starting just from a photo of your face?

Alessandro is a behavioral economist at Carnegie Mellon University in Pittsburgh. He's trying to find out what private corporations might be able to find out about you just by taking a picture of your face.

-Would you like to help us with a study? -Sure.

So, Alessandro and his team developed their own data-mining app.

We took a shot of their faces. And then you wait a few seconds.
In the meanwhile, the shot is being uploaded to a cloud, where we have previously downloaded a few hundred thousand images from Facebook profiles.

The app uses commercially available facial-recognition software to find a matching face online. This information is sent back to the phone and overlaid on the face of the subject in front of you.

To see if we can identify you and see what information we...

Once it matches the photo to a social-media profile, the software can find out someone's name, their birth city, their interests, and much more.

Starting from just one snapshot of a person -- no name and no personal information -- we were able to lock on to the Facebook profiles of these subjects.

Wow. No way.

And once you get to the Facebook profile, a world of information opens up.

That's really eerie.

Most of us post photos of ourselves online, but not everyone realizes that photos are also data. And no one stole the data from us. We are willingly and publicly disclosing it.
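The matching step Acquisti describes -- one snapshot compared against a pile of scraped profile photos -- can be sketched with the open-source face_recognition library. This is an illustration, not his team's actual software; the file names and folder are hypothetical.

```python
# Minimal sketch: compare one street snapshot against a folder of scraped
# profile photos. Uses the open-source face_recognition package, not
# Acquisti's actual app; all paths are hypothetical.
import os
import face_recognition

snapshot = face_recognition.load_image_file("snapshot.jpg")
encodings = face_recognition.face_encodings(snapshot)
if not encodings:
    raise SystemExit("no face found in snapshot")
target = encodings[0]                     # 128-dimensional face embedding

best_name, best_distance = None, 1.0
for filename in os.listdir("profile_photos"):      # e.g. "jane_doe.jpg"
    image = face_recognition.load_image_file(
        os.path.join("profile_photos", filename))
    for encoding in face_recognition.face_encodings(image):
        distance = face_recognition.face_distance([encoding], target)[0]
        if distance < best_distance:
            best_name, best_distance = filename, distance

if best_distance < 0.6:                   # the library's usual threshold
    print(f"probable match: {best_name} (distance {best_distance:.2f})")
else:
    print("no confident match")
```

The library reduces each face to a 128-number embedding; two photos of the same person usually land within a distance of about 0.6 of each other, which is why a few hundred thousand scraped images are enough to put names on strangers.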
With a name and birthplace in hand, deeper corporate data mining can reveal date of birth, criminal record, and can even make a close guess at someone's social security number.

That's one digit away from my actual social security number. How'd you predict that?

Once they have this information, there is virtually no limit to what else they may be able to find out about you. Alessandro says he designed this software demonstration as a warning. It took very little effort to develop it. Imagine what the corporations that rule the internet might already be doing.

On any given day, 2.1 billion people are active on social media. They tweet 500 million times. They share on Facebook one billion times. They upload 1.8 billion photos. And every time you click on "like," a record is made of what you like. Today the internet is essentially a surveillance economy. Companies like Amazon can sell millions of dollars of merchandise an hour. And much of this revenue comes through ads, which are tailored to your preferences. The more a company knows about you, the more it can manipulate you into clicking this link or buying this product.

Alessandro is convinced we'll keep giving up our personal data and our privacy because corporations make it so easy for us.

Marketers entice us into revealing more and more personal information. They work hard to make it a good experience for us. To us, it looks like the Garden of Eden, where everything is free. You get free apps, free content, you get to play Angry Birds, all of this in exchange for, say, having your location tracked 1,000 times per day.

Once corporations have collected enough of your location history, they know you better than you know yourself. They can predict where you will be at a particular time of day with 80% accuracy up to one year into the future. With that kind of information, your phone -- and the companies that control the data in your phone -- will be able quite literally to steer your day. They may buy new shoes for you before you even know you need them. They may influence your decisions, like which job you're going to take.
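Why does location history forecast so well? Human schedules repeat. Even the simplest predictor -- for each weekday-and-hour slot, output the place you were most often seen then -- captures much of the pattern. A minimal sketch; the log format and places are invented.

```python
# Simplest location predictor: for each (weekday, hour) slot, predict the
# most frequently observed place. Toy log format is an assumption.
from collections import Counter, defaultdict
from datetime import datetime

log = [  # (timestamp, place) pairs, as a tracker might collect them
    ("2016-03-07 08:10", "coffee shop"),
    ("2016-03-07 09:05", "office"),
    ("2016-03-14 08:20", "coffee shop"),
    ("2016-03-14 09:10", "office"),
    ("2016-03-21 08:15", "coffee shop"),
]

slots = defaultdict(Counter)
for stamp, place in log:
    t = datetime.strptime(stamp, "%Y-%m-%d %H:%M")
    slots[(t.weekday(), t.hour)][place] += 1

def predict(weekday, hour):
    """Most common place seen in this weekly time slot, if any."""
    counts = slots.get((weekday, hour))
    return counts.most_common(1)[0][0] if counts else "unknown"

print(predict(0, 8))   # Monday, 8 a.m. -> 'coffee shop'
```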
Alessandro believes we're losing the battle for our privacy, and it's a battle we don't even know we're fighting.

The problem is, then, the system is basically built around trying to nudge us into revealing more and more personal information, so that we no longer know whether what's being collected about us will be used in our best interest or in the best interest of another entity.

But even if we wise up, we may have a hard time stopping ourselves from sharing our most intimate likes and needs online, because this scientist believes we may already be hooked on sharing, like a drug.

People like to share. After all, we are social animals. But somehow the age of social media has got us sharing more and more... no matter how uninteresting it might be... even though we know every post gives marketers more and more information about us. So, why do we do it? And could we stop ourselves even if we tried?

Psychologist Diana Tamir knows she's sometimes guilty of over-sharing...

-You gonna try this one? -I'm gonna try this one.

...especially when she tries out her favorite new hobby.

It's super satisfying to be able to do a route that you weren't able to do before. That's part of the joy of rock climbing.

Getting to the top of a wall can feel rewarding. But you don't have to conquer K2 to feel that basic human impulse -- the impulse to talk about yourself.

Check it out.

People talk about themselves all the time. They talk about themselves when they're having a conversation with other people. They talk about themselves when they're sharing information about themselves on social media, or taking pictures of the thing that they ate for breakfast and posting it for the world to see.

That's a really good one.

There was a study that looked at what people tweet about on Twitter, and it found that about 80% of what people are tweeting about is just their own personal experiences.

Why do we enjoy this so much? As a neuroscientist, Diana thinks the answer may be hiding in our brains. So she designed an experiment using an MRI scanner to see how talking about ourselves -- versus others -- changes brain activity.

Picture it like a public-access talk show, with Diana taking the role of the interviewer...

Is it rewarding to talk about yourself? Let's find out!

...and a ficus standing in for the scanner.

-Hey, Adam. -Hey.

Do you get excited to dress up for Halloween?
Diana asks her subjects to respond to questions about themselves or other people while showing them corresponding photographs.

Does your dad like being photographed? Do you enjoy spending time in nature? Do you like being photographed? Do you enjoy having a dog as a pet?

For Diana, the answers weren't important. What mattered was how her subjects' brains responded to the questions. All of them activated the prefrontal cortex, a region associated with higher thought. But something else happened when a subject answered questions about themselves. Diana saw activation in two brain regions -- the ventral tegmental area and the nucleus accumbens. They belong to what neuroscientists call the reward pathway.

So, we have these reward pathways in our brain that motivate our behavior by helping us to learn what things in the world feel rewarding -- things we need or want or desire, like food or sex.

The brain's reward system is powered by a key chemical called dopamine. A surge of dopamine can trigger pleasant feelings, which motivate us to seek further rewards. It's the same system that fires up when people do drugs like cocaine or eat chocolate. So, like drugs, sharing can become addictive. But why does the dopamine system activate when we talk about ourselves?

Humans have a fundamental need to belong or connect with other people. So, social connection and making friends and interacting with people are something that we're highly motivated to get. Being part of a group gets you more resources, food, reproductive options than if you were by yourself.

Self-promotion helps establish us as members of a group. And for hundreds of thousands of years, being part of a group has been essential to our survival. Even when we can't see the other people in our group, we still have the instinctual urge to promote ourselves.
Part of the reason people share so much on social media is because it activates the same sort of neural systems as self-disclosing in person.

Sharing stems from a deep evolutionary drive. That's why it's so easy to get hooked on it. Diana wanted to know how easy it would be for her subjects to kick the habit of over-sharing... so she tried bribing them.

What we were looking at is whether or not people would kind of forego some extra monetary rewards in order to answer a question about themselves.

This time, Diana let her subjects decide -- talk about yourself and earn nothing, or get paid to talk about somebody else.

Can you tell me about whether you or your friend like spending time in nature?

Money activates the dopamine system. In fact, our neural wiring has taught us to chase it. But they say money can't buy happiness -- at least, not as much happiness as you get when you talk about you. While some participants chose the money, most turned it down.

We see that people place significant amounts of value on answering questions about themselves and significantly less value on answering questions about other people. It kind of really brought the point home that sharing information is rewarding.

Our compulsion to share is part of our biological makeup. But our biology could be the next target in the assault on our privacy. Our most sensitive personal information may already have been sold to the highest bidder.

Which would you hate to lose the most -- your phone or your wallet? If either one of these is stolen, it's a total hassle. Your private information is exposed. However, you can cancel bank cards, wipe the data from your phone, and change all of your passwords. But there is something else you leave behind every day that could be far more devastating to your privacy. A single strand of hair contains the most private information you have... your DNA.
Yaniv Erlich is a former hacker who used to break into banks to test their security. Now he's a computational biologist, and he's concerned about the security of a different kind of bank -- a DNA bank, which can store the individual genetic code of hundreds of thousands of people. He believes that hackers will soon be able to break into those biobanks and steal our most valuable and most private asset.

A number of large-scale biobanks offer you the opportunity to contribute your DNA to science. It just takes a simple cheek swab to get the DNA out of your mouth. And then, in a matter of days with the current technology, we can analyze your entire genome.

Companies like 23andMe and Ancestry.com will sequence your DNA and send you back information about your family tree or whether you are at risk for certain inherited diseases. And scientists are using this huge database of genetic information to develop new cures for a wide range of diseases.

So, with all these types of information, scientists can really understand how vulnerability within the population is affected by the DNA material we have.

If the contents of your DNA were stolen and disclosed, the consequences could be disastrous. Imagine being denied health insurance or losing your job because your genes show you're at high risk for a heart attack. So biobanks say they make sure your DNA remains anonymous.

To increase the security, biobanks usually don't store your identifiers together with your genetic material. They will keep your name, telephone number, and address totally separated from this information. This way, no one knows the origin of the genetic material that you gave.
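The separation Erlich describes -- contact details in one store, genomes in another, joined only by an opaque code -- is standard pseudonymization. A minimal sketch with hypothetical fields; note that the research side still carries age and state tags, which matters for the attack described next.

```python
# Minimal sketch of biobank pseudonymization: identifiers and genetic data
# live in separate stores, linked only by a random code. Fields are
# hypothetical.
import secrets

identity_store = {}   # pseudonym -> name, phone, address (kept locked away)
genome_store = {}     # pseudonym -> genome plus research metadata

def enroll(name, phone, address, genome, age, state):
    pseudonym = secrets.token_hex(16)   # opaque random link between stores
    identity_store[pseudonym] = {"name": name, "phone": phone,
                                 "address": address}
    genome_store[pseudonym] = {"genome": genome, "age": age, "state": state}
    return pseudonym

enroll("John Smith", "555-0100", "1 Main St", "ACGT...", 34, "Utah")
# Researchers query only genome_store: no record in it names the donor,
# but the age and state tags remain -- the toehold for re-identification.
```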
But Yaniv has found a serious flaw in biobank security. In fact, he's discovered that even those of us who have never had our DNA sequenced are at risk, too. Our DNA is vulnerable to theft every single day. Just think about what happens when you get a haircut.

Although DNA is something very personal, you shed it everywhere. You go to the barbershop. You get a shave, you leave some of your DNA on the blade. You take a sip from a glass. You have some of your saliva on the glass, you leave behind some of your DNA. Maybe if you're chewing gum or you smoke a cigarette, you leave the cigarette butt behind. You left some of your DNA.

If a gene thief got ahold of your DNA, they could discover which inherited diseases you have, whether you have a tendency towards alcoholism or mental illness, and threaten to reveal that information to employers or insurers unless you pay up.

Aah!

The key to being able to tie a piece of anonymous DNA to a name, whether in a biobank or a barbershop, is in the "Y" chromosome.

If you're a male, we can know more about your paternal ancestry because you inherited a short piece of DNA called the "Y" chromosome that you just get from your father's side. Now, here is the funny thing about your "Y" chromosome. You get your surname from your father. He got it from his own father. And you got your "Y" chromosome from the same path. This creates a correlation between "Y" chromosome and surnames.

In men, the "Y" chromosome contains patterns of repeating letters of DNA, a genetic fingerprint that passes from grandfather to father to son unchanged, just like the surname does. To prove that our genetic privacy is under threat, Yaniv pretends to be a gene thief. He downloads an anonymous DNA sequence from a biobank and zeroes in on its unique "Y" chromosome patterns. Then he logs on to a genealogy database, where people voluntarily upload their "Y" chromosome sequences, along with their names, to locate long-lost family. That allows him to match the anonymous biobank DNA to a specific surname.
And since the anonymous biobank sequences are tagged with the age and state of residence of the person who supplied the DNA, a simple internet search reveals their identity. He has done this successfully 50 times.

I was so shocked by the results that I had to take a walk to think about the implications of our method for genetic privacy. It means that if hackers can get the de-identified genetic information that is allegedly anonymous, it means that we cannot promise, we cannot guarantee full privacy, and we need to seek a different way to engage participants in these large-scale biobanks.

In the wrong hands, a single strand of hair can ruin the life of the person who left it behind.
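The whole chain reduces to two lookups: match the Y-chromosome repeat pattern against a genealogy database to infer a surname, then use the attached age and state to narrow public records down to one person. A toy sketch; the marker names are real Y-STR loci, but every record here is invented, not Erlich's data.

```python
# Toy sketch of the re-identification chain: Y-STR repeat counts -> surname
# (via a public genealogy database), then narrow by age and state.
# All records are stand-ins, not Erlich's actual data.
anonymous_record = {
    "y_str": {"DYS19": 14, "DYS390": 24, "DYS391": 11},  # from the biobank
    "age": 34,
    "state": "Utah",
}

genealogy_db = [  # surnames volunteered alongside Y-STR profiles
    {"surname": "Smith", "y_str": {"DYS19": 14, "DYS390": 24, "DYS391": 11}},
    {"surname": "Jones", "y_str": {"DYS19": 15, "DYS390": 23, "DYS391": 10}},
]

public_records = [  # e.g. people-search sites indexed by name/age/state
    {"name": "John Smith", "age": 34, "state": "Utah"},
    {"name": "Jane Smith", "age": 61, "state": "Ohio"},
]

def infer_surname(y_str):
    """Return the surname whose Y-STR profile matches on every marker."""
    for entry in genealogy_db:
        if all(entry["y_str"].get(m) == v for m, v in y_str.items()):
            return entry["surname"]
    return None

surname = infer_surname(anonymous_record["y_str"])
candidates = [p for p in public_records
              if surname and p["name"].endswith(surname)
              and p["age"] == anonymous_record["age"]
              and p["state"] == anonymous_record["state"]]
print(candidates)   # [{'name': 'John Smith', 'age': 34, 'state': 'Utah'}]
```

With an uncommon surname, three quasi-identifiers are often enough to leave a single candidate -- which is why "we removed your name" is not the same thing as anonymity.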
How can we shield ourselves from this privacy onslaught? Lock ourselves in our homes and never go outside? Sterilize every room we've been in? One scientist thinks there's only one way to save our privacy. For him, the best defense is offense.

Feels like pretty soon, there won't be a minute of the day when we aren't being watched. Any device you own could be hacked into and used to spy on you. So, what's the answer? Go completely off-grid? Maybe there's another way. We could develop technology to know when we're being watched and when we truly have privacy.

Steve Mann has worn a computer every day for the last 38 years. In fact, he's been called the father of wearable computing. His ideas inspired better-known devices like Google Glass. But back when he began, his digital eyeglass was so bulky, he was often the subject of ridicule.

So, 35 years of digital eyeglass, and finally we see how the industry is catching on to some of these concepts. So I feel kind of vindicated after people laughed at me for all the sort of stupid eyeglasses and crazy things.

Steve, a professor at the University of Toronto, has a cult following among his students as the original cyborg. His digital eyewear is bolted to his skull. His interest in using technology to augment what he could see began when he was a kid.

Then in the 1970s, I started to notice these things watching us and sensing us -- microwave motion detectors and burglar alarms and stuff like that. And I was wondering, "Well, why are all these machines spying on us?"

And today he runs an entire research team dedicated to developing technology that can sniff out when we're being surveilled.

So, this device will help you identify what devices are recording your sound. The lights over here move faster and bigger near a microphone. And then it locates the mike, and that's how you can sweep bugs.

They have devices that can pick up radio waves, including those from your cellphone.

So, the radio waves coming from my smartphone here, for example -- if I block that with my hand, the wave is very weak. See how weak that wave is when it's going through my hand, whereas if I hold it like this, the wave is much stronger.

Perhaps his most important invention in this age of near-total surveillance is technology that can detect precisely when you're being watched by a camera.

So, there's a camera inside this dome, and we don't know which way it's pointing because it's shrouded in this dark dome. But the light here, when it comes into the field of view of the camera, glows, and when it goes out of the field of the camera, it goes dim again. And so you can see here it sort of paints out, if you will, the sight field of the camera. If I put my coat in front of it, my jacket, the bulb -- I haven't moved the bulb at all. I've just blocked it with my jacket, and when I unblock it, it glows.
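Geometrically, the region the bulb paints out is just the camera's viewing cone. A minimal 2-D sketch of that idea, assuming you are told the camera's position, heading, field of view, and range -- exactly the parameters Mann's detector has to discover empirically rather than being given.

```python
# Minimal 2-D sketch of a "veilance field": is a point inside a camera's
# viewing cone? Assumes a known camera pose and field of view.
import math

def in_view(camera_xy, heading_deg, fov_deg, max_range, point_xy):
    dx = point_xy[0] - camera_xy[0]
    dy = point_xy[1] - camera_xy[1]
    if math.hypot(dx, dy) > max_range:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    # smallest angle between the camera heading and the bearing to the point
    offset = abs((bearing - heading_deg + 180) % 360 - 180)
    return offset <= fov_deg / 2

# Camera at the origin, facing east, 60-degree lens, 30 m range:
print(in_view((0, 0), 0, 60, 30, (10, 3)))    # True: inside the cone
print(in_view((0, 0), 0, 60, 30, (0, 10)))    # False: off to the side
```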
Most of us are used to seeing cameras everywhere. But Steve believes that if we knew when we were being watched, we'd start asking more questions about who's watching and why.

It could be the police. It could be a computer. It could be artificial intelligence. It could be machine learning. We often don't know. Many times, surveillance embraces hypocrisy, wanting to watch and not be watched, wanting to see and not be seen, wanting to know everything about us but reveal nothing about itself.

To redress that balance, Steve is working to commercialize technology to detect the zones where a camera sees us... what he calls its veilance field. Ryan Jansen is working with Steve on getting a veilance-field detector into a wearable device.

These are some glasses where I can see the veilance fields from this surveillance camera. What it does is poke and prod at the optical field until it figures out how much the camera is seeing. So, you can really take this around and measure a whole veilance field from a surveillance camera. What I'm excited about is to be able to finally see and know how much we're watched, know how much the watchers are watching us.

Veilance fields aren't always places you want to avoid. Sometimes you may want to be watched.

A lot of people say, "You're into sensing cameras, so you must be against cameras." But sometimes I'm walking home late at night in a dark alley, and there's somebody sharpening a knife and somebody loading a gun down there. I might say, "You know what? I think I'd like to be watched" -- sort of say, "There's a veilance flux over there. I think I'm gonna move towards the camera." What I'm really against is the one-sided veillance.

Government and big business use cameras to watch regular people. But regular people rarely turn their cameras on big business and government. Steve believes wearable devices like his digital eyeglass can help us watch the watchers.

"Surveillance" is a French word that means "to watch from above." When we're doing the watching, we call that undersight, or sousveillance.
But Steve has already discovered that sousveillance can invite trouble. Recently, he walked into a fast-food restaurant wearing his digital eyeglass and was confronted by employees enforcing policies that don't allow filming in their buildings.

Hey!

Despite the eyeglass being bolted to his skull, the employees tried to remove it, damaging it in the process.

The cameras want to watch but not be seen. And, in fact, even if you photograph cameras, you find very quickly people come running out to tell you, "No cameras are allowed here." And you say, "Well, aren't those all cameras around here?" "No, but those aren't cameras. They're surveillance."

Steve believes that if we know when we're being watched and if sousveillance becomes widespread, we'll finally have the weapons we need to fight back against the governments and corporations that constantly peer into our private lives.

The goal is to create systems that improve the quality of people's lives, systems in which people are innately aware of what's happening, to create a society in which sousveillance is balanced with surveillance.

One day, widespread digital eyesight will merge sousveillance and surveillance and transform society. Although someone may be watching you, you can now do your own watching. But what will a world where everyone is watched and everyone is a watcher look like? What will life be like in a world with no more secrets?

We stand on the brink of a new era. Governments and corporations are peering into every corner of our lives. And we are developing tools to watch the watchers. So, will life in a world with almost no secrets be a living nightmare? Or will the naked truth set us free?

Futurist and science-fiction author David Brin thinks there is no point in trying to hide, so he's putting it all on display.
764
00:36:48,942 --> 00:36:51,075
in this modern era,

765
00:36:51,077 --> 00:36:54,379
when eyes are proliferating everywhere,

766
00:36:54,381 --> 00:36:57,115
with the cameras getting smaller, faster, cheaper,

767
00:36:57,117 --> 00:36:59,083
more numerous every day,

768
00:36:59,085 --> 00:37:01,352
the human reflex is to say,

769
00:37:01,354 --> 00:37:02,987
"get those things away from me.

770
00:37:02,989 --> 00:37:04,222
ban them."

771
00:37:04,224 --> 00:37:07,625
but over the long run,

772
00:37:07,627 --> 00:37:10,828
that approach is not only futile,

773
00:37:10,830 --> 00:37:14,599
it's also kind of cowardly.

774
00:37:17,337 --> 00:37:19,938
david has gotten used to the idea

775
00:37:19,940 --> 00:37:23,474
that even private spaces aren't so private anymore.

776
00:37:23,476 --> 00:37:25,243
he thinks the key to getting comfortable

777
00:37:25,245 --> 00:37:27,679
is to look to the past.

778
00:37:27,681 --> 00:37:30,081
after all, for most of human history,

779
00:37:30,083 --> 00:37:31,649
we lived without privacy.

780
00:37:31,651 --> 00:37:34,052
-thank you.
-sure.

781
00:37:34,054 --> 00:37:35,653
cheers.

782
00:37:35,655 --> 00:37:38,156
our ancestors didn't have much of a concept of privacy.

783
00:37:38,158 --> 00:37:41,192
families would crowd into single cottages,

784
00:37:41,194 --> 00:37:42,794
knowing each other's business,

785
00:37:42,796 --> 00:37:44,862
seeing everything that was going on.

786
00:37:44,864 --> 00:37:49,033
the advantage was everybody knew your name.

787
00:37:49,035 --> 00:37:50,969
there was some sense of solidarity.

788
00:37:50,971 --> 00:37:54,305
but the olden times were no utopia.

789
00:37:54,307 --> 00:37:56,407
the disadvantages were huge.

790
00:37:56,409 --> 00:37:57,875
you were dominated

791
00:37:57,877 --> 00:38:00,178
by the lord on the hill and his thugs

792
00:38:00,180 --> 00:38:05,817
and by the local busybodies who knew everybody's business.

793
00:38:05,819 --> 00:38:10,355
today, we have our own versions of these watchers.

794
00:38:10,357 --> 00:38:13,591
you can think of the lord of the village

795
00:38:13,593 --> 00:38:15,393
as the nsa or the fbi.

796
00:38:18,131 --> 00:38:21,199
the busybodies are the media and your neighbors,

797
00:38:21,201 --> 00:38:24,502
who can see almost anything you do.

798
00:38:24,504 --> 00:38:27,805
david thinks it's with our fellow citizens,

799
00:38:27,807 --> 00:38:29,173
not the government,

800
00:38:29,175 --> 00:38:32,510
that the battle to reclaim our privacy must begin.

801
00:38:32,512 --> 00:38:34,612
the first step is to make sure

802
00:38:34,614 --> 00:38:38,650
people who are watching us and talking about us can't hide.

803
00:38:38,652 --> 00:38:41,586
we're all so used to personal gossip,

804
00:38:41,588 --> 00:38:47,525
where exchanging stories about other people is so natural

805
00:38:47,527 --> 00:38:49,427
that we put up with

806
00:38:49,429 --> 00:38:53,197
the filthier, more destructive aspects

807
00:38:53,199 --> 00:38:55,299
as just being part of life.

808
00:38:55,301 --> 00:38:57,435
what's going to bring this to a head

809
00:38:57,437 --> 00:38:59,137
is what's happening online.

810
00:39:00,340 --> 00:39:04,042
we all know about horrible crimes of bullying

811
00:39:04,044 --> 00:39:08,546
that have taken place online, empowered by anonymity.

812
00:39:08,548 --> 00:39:11,949
but we can use the tools of surveillance

813
00:39:11,951 --> 00:39:15,386
to expose prying eyes.
814
00:39:15,388 --> 00:39:19,390
the way to deal with the eyes is to spot them.

815
00:39:19,392 --> 00:39:21,092
hey!

816
00:39:21,094 --> 00:39:24,762
to find out who's looking and hold them accountable.

817
00:39:24,764 --> 00:39:28,199
if we all look back at the watchers,

818
00:39:28,201 --> 00:39:31,169
we have the power to change the way they behave.

819
00:39:31,171 --> 00:39:36,774
it's a step towards what david calls the transparent society.

820
00:39:36,776 --> 00:39:39,477
transparency can stamp out bad behavior

821
00:39:39,479 --> 00:39:40,945
from nosy neighbors.

822
00:39:40,947 --> 00:39:42,980
they won't be so quick to talk about you

823
00:39:42,982 --> 00:39:44,982
if they know you could talk about them.

824
00:39:44,984 --> 00:39:47,218
but it doesn't stop there.

825
00:39:47,220 --> 00:39:50,755
it ripples all the way up our society.

826
00:39:50,757 --> 00:39:54,392
2013 was the best year for civil liberties

827
00:39:54,394 --> 00:39:57,361
in the united states of america in a generation.

828
00:39:59,265 --> 00:40:00,732
that was the year

829
00:40:00,734 --> 00:40:02,633
that the administration joined the courts

830
00:40:02,635 --> 00:40:05,703
in declaring a universal right of citizens

831
00:40:05,705 --> 00:40:09,574
to record their encounters with police.

832
00:40:09,576 --> 00:40:11,976
it is empowering the good cops,

833
00:40:11,978 --> 00:40:16,180
but it's also empowering groups like black lives matter to say,

834
00:40:16,182 --> 00:40:19,150
"what you do to us is what matters,

835
00:40:19,152 --> 00:40:21,085
and now we can prove it."

836
00:40:21,087 --> 00:40:23,988
a loss of privacy for those in power

837
00:40:23,990 --> 00:40:26,858
can make society better.

838
00:40:26,860 --> 00:40:30,194
in fact, to make the transparent society work,

839
00:40:30,196 --> 00:40:33,598
david believes the government's right to secrecy

840
00:40:33,600 --> 00:40:35,566
must be massively curtailed.

841
00:40:35,568 --> 00:40:38,369
it should be able to keep secrets for a while,

842
00:40:38,371 --> 00:40:41,906
like plans to arrest criminals or military invasions,

843
00:40:41,908 --> 00:40:44,942
but nothing should stay secret forever.

844
00:40:44,944 --> 00:40:47,645
any practical, tactical value to a secret

845
00:40:47,647 --> 00:40:51,616
is going to decay over time.

846
00:40:51,618 --> 00:40:55,453
let's say government agencies and corporations

847
00:40:55,455 --> 00:40:59,023
can get five years of secrecy for free.

848
00:40:59,025 --> 00:41:00,291
after five years,

849
00:41:00,293 --> 00:41:03,528
you have to cache the secrets in a secure place

850
00:41:03,530 --> 00:41:07,799
and pay money to extend the secrecy another five years.

851
00:41:07,801 --> 00:41:11,169
it's for this reason that david supports

852
00:41:11,171 --> 00:41:13,504
whistle-blowers like edward snowden.

853
00:41:13,506 --> 00:41:17,175
they shine a light in the dark corners of government.

854
00:41:17,177 --> 00:41:19,076
and in a free society,

855
00:41:19,078 --> 00:41:22,513
their leaks ultimately make us stronger.

856
00:41:22,515 --> 00:41:25,082
everything leaks.

857
00:41:25,084 --> 00:41:27,084
not a month goes by

858
00:41:27,086 --> 00:41:30,688
without something hemorrhaging

859
00:41:30,690 --> 00:41:32,123
all over the internet,

860
00:41:32,125 --> 00:41:33,324
getting headlines.

861
00:41:33,326 --> 00:41:35,359
but somehow western governments

862
00:41:35,361 --> 00:41:38,062
and western civilization keep surviving.
863
00:41:38,064 --> 00:41:40,698
in fact, it makes us better.

864
00:41:40,700 --> 00:41:42,567
now think about our enemies --

865
00:41:42,569 --> 00:41:45,903
terrorists, tyrannical governments,

866
00:41:45,905 --> 00:41:47,605
and criminal gangs.

867
00:41:47,607 --> 00:41:49,941
to them, it's lethal.

868
00:41:49,943 --> 00:41:52,743
the world is never going back

869
00:41:52,745 --> 00:41:56,214
to the way it was just two decades ago.

870
00:41:56,216 --> 00:41:58,216
eyes will be everywhere.

871
00:41:58,218 --> 00:42:00,852
there will be no escaping them.

872
00:42:00,854 --> 00:42:06,224
but if we change our behavior, we can keep the privacy we need.

873
00:42:06,226 --> 00:42:08,226
privacy is essential to being human.

874
00:42:09,529 --> 00:42:13,464
we're just going to have to defend it differently

875
00:42:13,466 --> 00:42:14,899
and redefine it.

876
00:42:14,901 --> 00:42:20,771
we will probably look back on the last couple of centuries

877
00:42:20,773 --> 00:42:23,608
as a golden age of privacy,

878
00:42:23,610 --> 00:42:28,646
a time before the age of almost total surveillance.

879
00:42:28,648 --> 00:42:30,848
but there is an upside.

880
00:42:30,850 --> 00:42:34,485
if we accept that we are going to be watched,

881
00:42:34,487 --> 00:42:37,321
then governments and corporations

882
00:42:37,323 --> 00:42:39,323
must accept the same.

883
00:42:39,325 --> 00:42:42,727
we need the privacy of our bedrooms.

884
00:42:42,729 --> 00:42:46,030
government needs the privacy of its war rooms.

885
00:42:46,032 --> 00:42:50,167
beyond that, our society will be transparent.

886
00:42:50,169 --> 00:42:55,006
and this loss of secrecy could herald a new age,

887
00:42:55,008 --> 00:42:57,742
the age of honesty.
