Subtitles for Secrets of Silicon Valley (BBC), Part 2 of 2: The Persuasion Machine

It was the biggest political earthquake of the century.

We will make America great again!

But just how did Donald Trump defy the predictions of political pundits and pollsters?

The secret lies here, in San Antonio, Texas.

This was our Project Alamo. This is where the digital arm of the Trump campaign operation was held.

This is the extraordinary story of two men with two very different views of the world.

We will build the wall...

The path forward is to connect more, not less.

And how Facebook's Mark Zuckerberg inadvertently helped Donald Trump become the most powerful man on the planet.

Without Facebook, we wouldn't have won. I mean, Facebook really and truly put us over the edge.

With their secret algorithms and online tracking, social media companies know more about us than anyone.

So you can predict mine and everybody else's personality based on the things that they've liked?

That's correct. A very accurate prediction of your intimate traits such as religiosity, political views, intelligence, sexual orientation.

Now, this power is transforming politics.
Can you understand though why maybe some people find it a little bit creepy?

No, I can't - quite the opposite. That is the way the world is moving. Whether you like it or not, it's an inevitable fact.

Social media can bring politics closer to the people, but its destructive power is creating a new and unpredictable world.

This is the story of how Silicon Valley's mission to connect all of us is disrupting politics, plunging us into a world of political turbulence that no-one can control.

It might not look like it, but anger is building in Silicon Valley. It's usually pretty quiet around here. But not today.

Every day you wake up and you wonder what's going to be today's grief, brought by certain politicians and leaders in the world.

This is a very unusual demonstration. I've been to loads of demonstrations, but these aren't the people that usually go to demonstrations. In some ways, these are the winners of society. This is the tech community in Silicon Valley. Some of the wealthiest people in the world. And they're here protesting and demonstrating...
..against what they see as the kind of changing world that they don't like. The problem, of course, is the election of Donald Trump.

Every day, I'm sure, you think, "I could be a part of resisting those efforts to mess things up for the rest of us."

Trump came to power promising to control immigration...

We will build the wall, 100%.

..and to disengage from the world.

From this day forward, it's going to be only America first. America first.

Now, Silicon Valley is mobilising against him.

We are seeing this explosion of political activism, you know, all through the US and in Europe.

Before he became an activist, Dex Torricke-Barton was speech writer to the chairman of Google and the founder of Facebook.

It's a moment when people who believe in this global vision, as opposed to the nationalist vision of the world, who believe in a world that isn't about protectionism, whether it's data or whether it's about trade, they're coming to stand up and to mobilise in response to that.

Because it feels like the whole of Silicon Valley has been slightly taken by surprise by what's happening.

- Absolutely.
But these are the smartest minds in the world.

- Yeah. - With the most amazing data models and polling.

The smartest minds in the world often can be very, very ignorant of the things that are going on in the world.

The tech god with the most ambitious global vision is Dex's old boss - Facebook founder Mark Zuckerberg. One word captures the world Zuck, as he's known here, is trying to build.

Connectivity and access. If we connected them... You give people connectivity... That's the mission. Connecting with their friends... You connect people over time... Connect everyone in the world... I'm really optimistic about that. Make the world more open and connected, that's what I care about. This is part of the critical enabling infrastructure for the world. Thank you, guys.

What's Mark Zuckerberg worried about most?

Well, you know, Mark has dedicated his life to connecting, you know, the world. You know, this is something that he really, you know, cares passionately about. And, you know, as I said, you know, the same worldview and set of policies that, you know, we'll build walls here, we'll build walls against the sharing of information and building those kind of, you know, networks.
It's bigger than just the tech.

- It is. - It's about the society that Silicon Valley also wants to create?

Absolutely.

The tech gods believe the election of Donald Trump threatens their vision of a globalised world. But in a cruel twist, is it possible their mission to connect the world actually helped bring him to power?

The question I have is whether the revolution brought about by social media companies like Facebook has actually led to the political changes in the world that these guys are so worried about.

To answer that question, you have to understand how the tech titans of Silicon Valley rose to power. For that, you have to go back 20 years to a time when the online world was still in its infancy.

MUSIC: Rock N Roll Star by Oasis

There were fears the new internet was like the Wild West, anarchic and potentially harmful.

Today our world is being remade, yet again, by an information revolution. Changing the way we work, the way we live, the way we relate to each other.
The Telecommunications Act of 1996 was designed to civilise the internet, including protecting children from pornography.

Today with the stroke of a pen, our laws will catch up with our future.

But buried deep within the act was a secret whose impact no-one foresaw.

- Jeremy? - Jamie. How are you doing? - Nice to meet you.

VOICEOVER: Jeremy Malcolm is an analyst at the Electronic Frontier Foundation, a civil liberties group for the digital age.

Much of Silicon Valley's accelerated growth in the last two decades has been enabled by one clause in the legislation.

Hidden away in the middle of that is this Section 230.

- What's the key line?

- It literally just says, no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. That's it. So what that basically means is, if you're an internet platform, you don't get treated as the publisher or speaker of something that your users say using your platform.
If the user says something online that is, say, defamatory, the platform that they communicate on isn't going to be held responsible for it. And the user, of course, can be held directly responsible.

How important is this line for social media companies today?

I think if we didn't have this, we probably wouldn't have the same kind of social media companies that we have today. They wouldn't be willing to take on the risk of having so much unfettered discussion.

- It's key to the internet's freedom, really?

- We wouldn't have the internet of today without this. And so, if we are going to make any changes to it, we have to be really, really careful.

These 26 words changed the world. They allowed a new kind of business to spring up - online platforms that became the internet giants of today. Facebook, Google, YouTube - they encouraged users to upload content, often things about their lives or moments that mattered to them, onto their sites for free. And in exchange, they got to hoard all of that data but without any real responsibility for the effects of the content that people were posting.

Hundreds of millions of us flocked to these new sites, putting more of our lives online.
At first, the tech firms couldn't figure out how to turn that data into big money. But that changed when a secret within that data was unlocked.

Antonio, Jamie.

'A secret Antonio Garcia Martinez helped reveal at Facebook.'

Tell me a bit about your time at Facebook.

Well, that was interesting. I was what's called a product manager for ads targeting. That means basically taking your data and using it to basically make money on Facebook, to monetise Facebook's data. If you go browse the internet or buy stuff in stores or whatever, and then you see ads related to all that stuff inside Facebook - I created that.

Facebook offers advertisers ways to target individual users of the site with adverts. It can be driven by data about how we use the platform.

Here's some examples of what's data for Facebook that makes money. What you've liked on Facebook, links that you shared, who you happen to know on Facebook, for example. Where you've used Facebook, what devices, your iPad, your work computer, your home computer. In the case of Amazon, it's obviously what you've purchased. In the case of Google, it's what you searched for.

How do they turn me...
..I like something on Facebook, and I share a link on Facebook, how could they turn that into something that another company would care about?

There is what's called a targeting system, and so the advertiser can actually go in and specify, I want people who are within this city and who have liked BMW or Burberry, for example.

So an advertiser pays Facebook and says, I want these sorts of people?

- That's effectively it, that's right.

The innovation that opened up bigger profits was to allow Facebook users to be targeted using data about what they do on the rest of the internet.

The real key thing that marketers want is the unique, immutable, flawless, high fidelity ID for one person on the internet, and Facebook provides that. It is your identity online.

Facebook can tell an advertiser, this is the real Jamie Bartlett, this is what he's like?

- Yeah. A company like Walmart can literally take your data, your e-mail, phone number, whatever you use for their frequent shopper programme, etc, and join that to Facebook and literally target those people based on that data. That's part of what I built.

The tech gods suck in all this data about how we use their technologies to build their vast fortunes.
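[EDITOR'S NOTE: The targeting system Garcia Martinez describes - pick a city, pick some liked pages, get back matching users - can be sketched in miniature. Everything below is invented for illustration: the records, page names, and select_audience function are assumptions, not Facebook's actual data model or advertising API.]

```python
# A toy sketch of like-plus-location audience targeting.
# Hypothetical data model; not Facebook's real system.

from dataclasses import dataclass, field

@dataclass
class UserProfile:
    user_id: str
    city: str
    likes: set = field(default_factory=set)

def select_audience(users, city, likes_any):
    """Return users in the given city who have liked any of the named pages."""
    return [u for u in users if u.city == city and u.likes & likes_any]

# Invented example users.
users = [
    UserProfile("u1", "London", {"BMW", "Oasis"}),
    UserProfile("u2", "London", {"Lady Gaga"}),
    UserProfile("u3", "Leeds", {"Burberry"}),
]

# "I want people who are within this city and who have liked BMW or Burberry."
audience = select_audience(users, city="London", likes_any={"BMW", "Burberry"})
print([u.user_id for u in audience])  # → ['u1']
```

The real system adds off-site signals too - the point of the sketch is only the query shape: a filter over profile attributes plus liked pages, paid for by the advertiser.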
I mean, it sounds like data is like oil, it's keeping the economy going?

Right. I mean, the difference is these companies, instead of drilling for this oil, they generate this oil by getting users to actually use their apps, and then they actually monetise it. Usually via advertising or other mechanisms. But, yeah, it is the new oil.

Data about billions of us is propelling Silicon Valley to the pinnacle of the global economy.

The world's largest hotel company, Airbnb, doesn't own a single piece of real estate. The world's largest taxi company, Uber, doesn't own any cars. The world's largest media company, Facebook, doesn't produce any media, right? So what do they have? Well, they have the data around how you use those resources and how you use those assets. And that's really what they are.

The secret of targeting us with adverts is keeping us online for as long as possible.

I thought I'd just see how much time I spend on here. So I've got an app that counts how often I pick this thing up.

Our time is the Holy Grail of Silicon Valley. Here's what my life looks like on a typical day.

- Yeah, could I have a flat white, please? - Flat white? - Yeah.
Like more and more of us, my phone is my gateway to the online world. It's how I check my social media accounts. On average, Facebook users spend 50 minutes every day on the site.

The longer we spend connected, the more Silicon Valley can learn about us, and the more targeted and effective their advertising can be.

So apparently, today, I... I've checked my phone 117 times... and I've been on this phone for nearly five and a half hours. Well, I mean, that's a lot, that's a lot of hours. I mean, it's kind of nearly half the day, spent on this phone. But it's weird, because it doesn't feel like I spend that long on it. The strange thing about it is that I don't even really know what I'm doing for these five hours that I'm spending on this phone.

What is it that is keeping us hooked to Silicon Valley's global network?

I'm in Seattle to meet someone who saw how the tech gods embraced new psychological insights into how we all make decisions.

I was a post-doc with Stephen Hawking.

Once Chief Technology Officer at Microsoft, Nathan Myhrvold is the most passionate technologist I have ever met.

Some complicated-looking equations in the background.
Well, it turns out if you work with Stephen Hawking, you do work with complicated equations. It's kind of the nature of the beast!

Amazing picture.

A decade ago, Nathan brought together Daniel Kahneman, pioneer of the new science of behavioural economics, and Silicon Valley's leaders, for a series of meetings.

I came and Jeff Bezos came.

That's Jeff Bezos, the founder of Amazon, worth 76 billion.

Sean Parker. Sean was there.

That's Sean Parker, the first president of Facebook.

And the Google founders were there.

The proposition was, come to this very nice resort in Napa, and for several days, just have Kahneman and then also a couple of other behavioural economists explain things. And ask questions and see what happens.

Kahneman had a simple but brilliant theory on how we make decisions. He had found we use one of two different systems of thinking.

In this dichotomy, you have... over here is a hunch... a guess, a gut feeling... and "I just know".

So this is sort of emotional and more like, just instant stuff?

That's the idea.
This set of things is not particularly good at a different set of stuff that involves... analysis... numbers... probability.

The meetings in Napa didn't deal with the basics of behavioural economics, but how might the insights of the new science have helped the tech gods?

A lot of advertising is about trying to hook people in these type-one things to get interested one way or the other. Technology companies undoubtedly use that to one degree or another. You know, the term "clickbait", for things that look exciting to click on. There's billions of dollars change hands because we all get enticed into clicking something. And there's a lot of things that I click on and then you get there, you're like, OK, fine, you were just messing with me. You're playing to the type-one things, you're putting a set of triggers out there that make me want to click on it, and even though, like, I'm aware of that, I still sometimes click!

Tech companies both try to understand our behaviour by having smart humans think about it and increasingly by having machines think about it.
By having machines track us to see, what is the clickbait Nathan falls for? What are the things he really likes to spend time on? Let's show him more of that stuff!

Trying to grab the attention of the consumer is nothing new. That's what advertising is all about. But insights into how we make decisions helped Silicon Valley to shape the online world. And little wonder, their success depends on keeping us engaged. From 1-Click buying on Amazon to the Facebook like, the more they've hooked us, the more the money has rolled in.

As Silicon Valley became more influential, it started attracting powerful friends... in politics. In 2008, Barack Obama had pioneered political campaigning on Facebook. As President, he was drawn to Facebook's founder, Zuck.

Sorry, I'm kind of nervous. We have the President of the United States here!

My name is Barack Obama and I'm the guy who got Mark to wear a jacket and tie!

- How you doing? - Great. - I'll have huevos rancheros, please. And if I could have an egg and cheese sandwich on English muffin?
329 00:19:36,560 --> 00:19:40,760 Aneesh Chopra was Obama's first Chief Technology Officer, 330 00:19:40,760 --> 00:19:42,720 and saw how close the relationship 331 00:19:42,720 --> 00:19:47,080 between the White House and Silicon Valley became. 332 00:19:47,080 --> 00:19:50,840 The President's philosophy and his approach to governing 333 00:19:50,840 --> 00:19:53,600 garnered a great deal of personal interest 334 00:19:53,600 --> 00:19:56,360 among many executives in Silicon Valley. 335 00:19:56,360 --> 00:19:59,960 They were donors to his campaign, volunteers, 336 00:19:59,960 --> 00:20:01,840 active recruiters of engineering talent 337 00:20:01,840 --> 00:20:03,600 to support the campaign apparatus. 338 00:20:03,600 --> 00:20:05,840 He had struck a chord. 339 00:20:05,840 --> 00:20:07,120 - Why? - Because frankly, 340 00:20:07,120 --> 00:20:10,720 it was an inspiring voice that really tapped into 341 00:20:10,720 --> 00:20:13,360 the hopefulness of the country. 342 00:20:13,360 --> 00:20:17,000 And a lot of Silicon Valley shares this sense of hopefulness, 343 00:20:17,000 --> 00:20:19,400 this optimistic view that we can solve problems 344 00:20:19,400 --> 00:20:20,680 if we would work together 345 00:20:20,680 --> 00:20:24,080 and take advantage of these new capabilities that are coming online. 346 00:20:24,080 --> 00:20:27,960 So people in Silicon Valley saw President Obama 347 00:20:27,960 --> 00:20:32,480 - as a bit of a kindred spirit? - Oh, yeah. Oh, yeah. 348 00:20:32,480 --> 00:20:34,960 If you'd like, Mark, we can take our jackets off. 349 00:20:34,960 --> 00:20:36,640 That's good! 350 00:20:36,640 --> 00:20:39,040 Facebook's mission to connect the world 351 00:20:39,040 --> 00:20:44,240 went hand-in-hand with Obama's policies promoting globalisation 352 00:20:44,240 --> 00:20:46,000 and free markets. 353 00:20:46,000 --> 00:20:48,440 And Facebook was seen to be improving 354 00:20:48,440 --> 00:20:50,720 the political process itself. 
Part of what makes for a healthy democracy is when you've got citizens who are informed, who are engaged, and what Facebook allows us to do is make sure this isn't just a one-way conversation.

You have the tech mind-set and governments increasingly share the same view of the world. That's not a natural...

It's a spirit of liberty. It's a spirit of freedom. That is manifest today in these new technologies. It happens to be that freedom means I can tweet something offensive. But it also means that I have a voice.

Please raise your right hand and repeat after me...

By the time Obama won his second term, he was feted for his mastery of social media's persuasive power. But across the political spectrum, the race was on to find new ways to gain a digital edge. The world was about to change for Facebook.

Stanford University, in the heart of Silicon Valley. Home to a psychologist investigating just how revealing Facebook's hoard of information about each of us could really be.

How are you doing? Nice to meet you.

Dr Michal Kosinski specialises in psychometrics - the science of predicting psychological traits like personality.
379 00:22:23,880 --> 00:22:26,800 So in the past, when you wanted to measure someone's personality 380 00:22:26,800 --> 00:22:30,680 or intelligence, you needed to give them a question or a test, 381 00:22:30,680 --> 00:22:33,720 and they would have to answer a bunch of questions. 382 00:22:33,720 --> 00:22:37,320 Now, many of those questions would basically ask you 383 00:22:37,320 --> 00:22:39,320 about whether you like poetry, 384 00:22:39,320 --> 00:22:41,440 or you like hanging out with other people, 385 00:22:41,440 --> 00:22:43,400 or you like the theatre, and so on. 386 00:22:43,400 --> 00:22:47,080 But these days, you don't need to ask these questions any more. 387 00:22:47,080 --> 00:22:51,000 Why? Because while going through our lives, 388 00:22:51,000 --> 00:22:54,480 we are leaving behind a lot of digital footprints 389 00:22:54,480 --> 00:22:57,200 that basically contain the same information. 390 00:22:57,200 --> 00:23:01,440 So instead of asking you whether you like poetry, 391 00:23:01,440 --> 00:23:05,880 I can just look at your reading history on Amazon 392 00:23:05,880 --> 00:23:08,000 or your Facebook likes, 393 00:23:08,000 --> 00:23:10,720 and I would just get exactly the same information. 394 00:23:12,320 --> 00:23:16,720 In 2011, Dr Kosinski and his team at the University of Cambridge 395 00:23:16,720 --> 00:23:18,720 developed an online survey 396 00:23:18,720 --> 00:23:21,760 to measure volunteers' personality traits. 397 00:23:21,760 --> 00:23:24,680 With their permission, he matched their results 398 00:23:24,680 --> 00:23:26,560 with their Facebook data. 399 00:23:26,560 --> 00:23:30,040 More than 6 million people took part. 400 00:23:30,040 --> 00:23:32,880 We have people's Facebook likes, 401 00:23:32,880 --> 00:23:36,240 people's status updates and profile data, 402 00:23:36,240 --> 00:23:40,480 and this allows us to build those... 
to gain better understanding 403 00:23:40,480 --> 00:23:43,320 of how psychological traits are being expressed 404 00:23:43,320 --> 00:23:45,320 in the digital environment. 405 00:23:45,320 --> 00:23:50,000 How you can measure psychological traits using digital footprints. 406 00:23:50,000 --> 00:23:52,680 An algorithm that can look at millions of people 407 00:23:52,680 --> 00:23:55,120 and it can look at hundreds of thousands 408 00:23:55,120 --> 00:23:56,960 or tens of thousands of your likes 409 00:23:56,960 --> 00:24:01,400 can extract and utilise even those little pieces of information 410 00:24:01,400 --> 00:24:04,840 and combine it into a very accurate profile. 411 00:24:04,840 --> 00:24:09,440 You can quite accurately predict mine, and in fact, everybody else's 412 00:24:09,440 --> 00:24:14,160 personality, based on the things that they've liked? 413 00:24:14,160 --> 00:24:15,520 That's correct. 414 00:24:15,520 --> 00:24:18,720 It can also be used to turn your digital footprint 415 00:24:18,720 --> 00:24:21,080 into a very accurate prediction 416 00:24:21,080 --> 00:24:23,520 of your intimate traits, such as religiosity, 417 00:24:23,520 --> 00:24:26,040 political views, personality, intelligence, 418 00:24:26,040 --> 00:24:30,200 sexual orientation and a bunch of other psychological traits. 419 00:24:30,200 --> 00:24:33,680 If I'm logged in, we can maybe see how accurate this actually is. 420 00:24:33,680 --> 00:24:37,520 So I hit "make prediction" and it's going to try and predict 421 00:24:37,520 --> 00:24:39,240 my personality from my Facebook page. 422 00:24:39,240 --> 00:24:41,240 From your Facebook likes. 423 00:24:41,240 --> 00:24:42,920 According to your likes, 424 00:24:42,920 --> 00:24:46,240 you're open-minded, liberal and artistic. 425 00:24:46,240 --> 00:24:49,120 Judges your intelligence to be extremely high. 426 00:24:49,120 --> 00:24:53,400 - Well done. - Yes. I'm extremely intelligent! 
427 00:24:53,400 --> 00:24:56,400 You're not religious, but if you are religious, 428 00:24:56,400 --> 00:24:59,760 - most likely you'd be a Catholic. - I was raised a Catholic! 429 00:24:59,760 --> 00:25:02,160 I can't believe it knows that. 430 00:25:02,160 --> 00:25:05,720 Because I... I don't say anything about being a Catholic anywhere, 431 00:25:05,720 --> 00:25:09,520 but I was raised as a Catholic, but I'm not a practising Catholic. 432 00:25:09,520 --> 00:25:11,320 So it's like... 433 00:25:11,320 --> 00:25:13,360 It's absolutely spot on. 434 00:25:13,360 --> 00:25:17,320 Oh, my God! Journalism, but also what did I study? 435 00:25:17,320 --> 00:25:21,120 Studied history, and I didn't put anything about history in there. 436 00:25:21,120 --> 00:25:22,680 I think this is one of the things 437 00:25:22,680 --> 00:25:26,720 that people don't really get about those predictions, that they think, 438 00:25:26,720 --> 00:25:29,400 look, if I like Lady Gaga on Facebook, 439 00:25:29,400 --> 00:25:31,960 obviously people will know that I like Lady Gaga, 440 00:25:31,960 --> 00:25:34,600 or the Government will know that I like Lady Gaga. 441 00:25:34,600 --> 00:25:36,520 Look, you don't need a rocket scientist 442 00:25:36,520 --> 00:25:39,720 to look at your Spotify playlist or your Facebook likes 443 00:25:39,720 --> 00:25:42,160 to figure out that you like Lady Gaga. 444 00:25:42,160 --> 00:25:47,880 What's really world-changing about those algorithms is that they can 445 00:25:47,880 --> 00:25:51,920 take your music preferences or your book preferences 446 00:25:51,920 --> 00:25:55,560 and extract from this seemingly innocent information 447 00:25:55,560 --> 00:25:59,440 very accurate predictions about your religiosity, leadership potential, 448 00:25:59,440 --> 00:26:02,760 political views, personality and so on. 449 00:26:02,760 --> 00:26:05,560 Can you predict people's political persuasions with this? 
450 00:26:05,560 --> 00:26:08,400 In fact, the first dimension here, openness to experience, 451 00:26:08,400 --> 00:26:11,240 is a very good predictor of political views. 452 00:26:11,240 --> 00:26:14,840 People scoring high on openness, they tend to be liberal, 453 00:26:14,840 --> 00:26:19,080 people who score low on openness, we even call it conservative and 454 00:26:19,080 --> 00:26:22,640 traditional, they tend to vote for conservative candidates. 455 00:26:22,640 --> 00:26:25,520 What about the potential to manipulate people? 456 00:26:25,520 --> 00:26:29,280 So, obviously, if you now can use an algorithm to get to know millions 457 00:26:29,280 --> 00:26:33,240 of people very intimately and then use another algorithm 458 00:26:33,240 --> 00:26:37,760 to adjust the message that you are sending to them, 459 00:26:37,760 --> 00:26:40,280 to make it most persuasive, 460 00:26:40,280 --> 00:26:43,240 obviously gives you a lot of power. 461 00:26:52,040 --> 00:26:55,560 I'm quite surprised at just how accurate this model could be. 462 00:26:55,560 --> 00:27:00,880 I mean, I cannot believe that on the basis of a few things 463 00:27:00,880 --> 00:27:05,480 I just, you know, carelessly clicked that I liked, 464 00:27:05,480 --> 00:27:10,200 the model was able to work out that I could have been Catholic, 465 00:27:10,200 --> 00:27:12,920 or had a Catholic upbringing. 466 00:27:12,920 --> 00:27:18,400 And clearly, this is a very powerful way of understanding people. 467 00:27:19,400 --> 00:27:22,800 Very exciting possibilities, but I can't help 468 00:27:22,800 --> 00:27:28,720 fearing that there is that potential, whoever has that power, 469 00:27:28,720 --> 00:27:31,480 whoever can control that model... 
470 00:27:33,160 --> 00:27:37,920 ..will have sort of unprecedented possibilities of manipulating 471 00:27:37,920 --> 00:27:41,880 what people think, how they behave, what they see, 472 00:27:41,880 --> 00:27:45,960 whether that's selling things to people or how people vote, 473 00:27:45,960 --> 00:27:48,640 and that's pretty scary too. 474 00:27:52,000 --> 00:27:55,520 Our era is defined by political shocks. 475 00:27:59,240 --> 00:28:02,520 None bigger than the rise of Donald Trump, 476 00:28:02,520 --> 00:28:05,040 who defied pollsters and the mainstream media 477 00:28:05,040 --> 00:28:07,520 to win the American presidency. 478 00:28:08,560 --> 00:28:12,240 Now questions swirl around his use of the American affiliate 479 00:28:12,240 --> 00:28:14,440 of a British insights company, 480 00:28:14,440 --> 00:28:17,960 Cambridge Analytica, who use psychographics. 481 00:28:25,640 --> 00:28:27,600 I'm in Texas to uncover how far 482 00:28:27,600 --> 00:28:31,920 Cambridge Analytica's expertise in personality prediction 483 00:28:31,920 --> 00:28:36,320 played a part in Trump's political triumph, 484 00:28:36,320 --> 00:28:38,760 and how his revolutionary campaign 485 00:28:38,760 --> 00:28:42,240 exploited Silicon Valley's social networks. 486 00:28:44,080 --> 00:28:45,960 Everyone seems to agree 487 00:28:45,960 --> 00:28:49,480 that Trump ran an exceptional election campaign 488 00:28:49,480 --> 00:28:52,840 using digital technologies. 489 00:28:52,840 --> 00:28:56,880 But no-one really knows what they did, 490 00:28:56,880 --> 00:28:59,840 who they were working with, who was helping them, 491 00:28:59,840 --> 00:29:03,360 what the techniques they used were. 492 00:29:03,360 --> 00:29:07,920 So I've come here to try to unravel the mystery. 493 00:29:09,280 --> 00:29:13,120 The operation inside this unassuming building in San Antonio 494 00:29:13,120 --> 00:29:18,080 was largely hidden from view, but crucial to Trump's success. 
495 00:29:18,080 --> 00:29:22,600 Since then, I'm the only person to get in here 496 00:29:22,600 --> 00:29:25,000 to find out what really happened. 497 00:29:25,000 --> 00:29:27,040 This was our Project Alamo, 498 00:29:27,040 --> 00:29:30,480 so this was where the digital arm 499 00:29:30,480 --> 00:29:33,680 of the Trump campaign operation was held. 500 00:29:33,680 --> 00:29:36,200 Theresa Hong is speaking publicly 501 00:29:36,200 --> 00:29:40,440 for the first time about her role as digital content director 502 00:29:40,440 --> 00:29:42,320 for the Trump campaign. 503 00:29:43,480 --> 00:29:47,240 So, why is it called Project Alamo? 504 00:29:47,240 --> 00:29:51,560 It was called Project Alamo based on the data, actually, 505 00:29:51,560 --> 00:29:53,960 that was Cambridge Analytica - 506 00:29:53,960 --> 00:29:56,640 they came up with the Alamo data set, right? 507 00:29:56,640 --> 00:30:00,200 So, we just kind of adopted the name Project Alamo. 508 00:30:00,200 --> 00:30:04,920 It does conjure up sort of images of a battle of some sort. 509 00:30:04,920 --> 00:30:06,720 Yeah. It kind of was! 510 00:30:06,720 --> 00:30:09,760 In a sense, you know. Yeah, yeah. 511 00:30:13,640 --> 00:30:16,320 Project Alamo was so important, 512 00:30:16,320 --> 00:30:20,040 Donald Trump visited the hundred or so workers based here 513 00:30:20,040 --> 00:30:22,520 during the campaign. 514 00:30:22,520 --> 00:30:26,000 Ever since he started tweeting in 2009, 515 00:30:26,000 --> 00:30:29,640 Trump had grasped the power of social media. 516 00:30:29,640 --> 00:30:32,640 Now, in the fight of his life, 517 00:30:32,640 --> 00:30:36,200 the campaign manipulated his Facebook presence. 518 00:30:36,200 --> 00:30:40,800 Trump's Twitter account, that's all his, he's the one that ran that, 519 00:30:40,800 --> 00:30:43,280 and I did a lot of his Facebook, 520 00:30:43,280 --> 00:30:48,880 so I wrote a lot for him, you know, I kind of channelled Mr Trump. 
521 00:30:48,880 --> 00:30:52,320 How do you possibly write a post on Facebook like Donald Trump? 522 00:30:52,320 --> 00:30:54,000 "Believe me." 523 00:30:54,000 --> 00:30:58,040 A lot of believe me's, a lot of alsos, a lot of...verys. 524 00:30:58,040 --> 00:31:01,640 Actually, he was really wonderful to write for, just because 525 00:31:01,640 --> 00:31:06,240 it was so refreshing, it was so authentic. 526 00:31:09,760 --> 00:31:11,800 We headed to the heart of the operation. 527 00:31:14,520 --> 00:31:16,320 Cambridge Analytica was here. 528 00:31:16,320 --> 00:31:19,320 It was just a line of computers, right? 529 00:31:19,320 --> 00:31:22,520 This is where their operation was and this was kind of 530 00:31:22,520 --> 00:31:26,360 the brain of the data, this was the data centre. 531 00:31:26,360 --> 00:31:30,000 This was the data centre, this was the centre of the data centre. 532 00:31:30,000 --> 00:31:33,040 Exactly right, yes, it was. Yes, it was. 533 00:31:33,040 --> 00:31:35,040 Cambridge Analytica were using data 534 00:31:35,040 --> 00:31:40,600 on around 220 million Americans to target potential donors and voters. 535 00:31:40,600 --> 00:31:43,120 It was, you know, 536 00:31:43,120 --> 00:31:48,200 a bunch of card tables that sat here and a bunch of computers and people 537 00:31:48,200 --> 00:31:51,680 that were behind the computers, monitoring the data. 538 00:31:51,680 --> 00:31:55,400 We've got to target this state, that state or this universe or whatever. 539 00:31:55,400 --> 00:31:58,280 So that's what they were doing, they were gathering all the data. 540 00:31:58,280 --> 00:32:02,160 A "universe" was the name given to a group of voters 541 00:32:02,160 --> 00:32:05,560 defined by Cambridge Analytica. 542 00:32:05,560 --> 00:32:07,440 What sort of attributes did these people have 543 00:32:07,440 --> 00:32:09,480 that they had been able to work out? 
544 00:32:09,480 --> 00:32:12,880 Some of the attributes would be, when was the last time they voted? 545 00:32:12,880 --> 00:32:14,560 Who did they vote for? 546 00:32:14,560 --> 00:32:17,080 You know, what kind of car do they drive? 547 00:32:17,080 --> 00:32:20,760 What kind of things do they look at on the internet? 548 00:32:20,760 --> 00:32:22,200 What do they stand for? 549 00:32:22,200 --> 00:32:25,200 I mean, one person might really be about job creation 550 00:32:25,200 --> 00:32:27,080 and keeping jobs in America, you know. 551 00:32:27,080 --> 00:32:30,200 Another person, they might resonate with, you know, 552 00:32:30,200 --> 00:32:33,160 Second Amendment and gun rights, so... 553 00:32:33,160 --> 00:32:34,560 How would they know that? 554 00:32:34,560 --> 00:32:38,280 - ..within that... - How would they know that? - That is their secret sauce. 555 00:32:40,360 --> 00:32:42,040 Did the "secret sauce" 556 00:32:42,040 --> 00:32:46,880 contain predictions of personality and political leanings? 557 00:32:46,880 --> 00:32:50,480 Were they able to kind of understand people's personalities? 558 00:32:50,480 --> 00:32:52,200 Yes, I mean, you know, 559 00:32:52,200 --> 00:32:56,200 they do specialise in psychographics, right? 560 00:32:56,200 --> 00:33:01,400 But based on personal interests and based on what a person cares for 561 00:33:01,400 --> 00:33:03,440 and what means something to them, 562 00:33:03,440 --> 00:33:06,760 they were able to extract and then we were able to target. 563 00:33:06,760 --> 00:33:10,680 So, the psychographic stuff, were they using that here? 564 00:33:10,680 --> 00:33:13,480 Was that part of the model you were working on? 565 00:33:13,480 --> 00:33:17,160 Well, I mean, towards the end with the persuasion, absolutely. 566 00:33:17,160 --> 00:33:19,120 I mean, we really were targeting 567 00:33:19,120 --> 00:33:22,120 on these universes that they had collected. 
568 00:33:22,120 --> 00:33:25,440 I mean, did some of the attributes that you were able to use 569 00:33:25,440 --> 00:33:29,240 from Cambridge Analytica have a sort of emotional effect, you know, 570 00:33:29,240 --> 00:33:32,800 happy, sad, anxious, worried - moods, that kind of thing? 571 00:33:32,800 --> 00:33:34,080 Right, yeah. 572 00:33:34,080 --> 00:33:38,960 I do know Cambridge Analytica follows something called Ocean 573 00:33:38,960 --> 00:33:41,880 and it's based on, you know, whether or not, like, 574 00:33:41,880 --> 00:33:43,880 are you an extrovert? 575 00:33:43,880 --> 00:33:45,800 Are you more of an introvert? 576 00:33:45,800 --> 00:33:50,720 Are you fearful? Are you positive? 577 00:33:50,720 --> 00:33:52,840 So, they did use that. 578 00:33:54,880 --> 00:33:58,280 Armed with Cambridge Analytica's revolutionary insights, 579 00:33:58,280 --> 00:34:02,160 the next step in the battle to win over millions of Americans 580 00:34:02,160 --> 00:34:06,040 was to shape the online messages they would see. 581 00:34:06,040 --> 00:34:10,760 Now we're going to go into the big kind of bull pen where a lot of 582 00:34:10,760 --> 00:34:14,920 - the creatives were, and this is where I was as well. - Right. 583 00:34:14,920 --> 00:34:17,440 For today's modern working-class families, 584 00:34:17,440 --> 00:34:19,680 the challenge is very, very real. 585 00:34:19,680 --> 00:34:21,160 With childcare costs... 586 00:34:21,160 --> 00:34:25,120 Adverts were tailored to particular audiences, defined by data. 587 00:34:25,120 --> 00:34:28,360 Donald Trump wants to give families the break they deserve. 
588 00:34:28,360 --> 00:34:31,160 This universe right here that Cambridge Analytica, 589 00:34:31,160 --> 00:34:34,920 they've collected data and they have identified as working mothers 590 00:34:34,920 --> 00:34:37,200 that are concerned about childcare, 591 00:34:37,200 --> 00:34:40,560 and childcare, obviously, that's not going to be, like, 592 00:34:40,560 --> 00:34:43,560 a war-ridden, you know, destructive ad, right? 593 00:34:43,560 --> 00:34:44,880 That's more warm and fuzzy, 594 00:34:44,880 --> 00:34:46,800 and this is what Trump is going to do for you. 595 00:34:46,800 --> 00:34:49,720 But if you notice, Trump's not speaking. 596 00:34:49,720 --> 00:34:51,760 He wasn't speaking, he wasn't in it at all. 597 00:34:51,760 --> 00:34:53,280 Right, Trump wasn't speaking in it. 598 00:34:53,280 --> 00:34:56,800 That audience there, we wanted a softer approach, 599 00:34:56,800 --> 00:34:59,520 so this is the type of approach that we would take. 600 00:34:59,520 --> 00:35:03,760 The campaign made thousands of different versions 601 00:35:03,760 --> 00:35:05,920 of the same fundraising adverts. 602 00:35:05,920 --> 00:35:11,120 The design was constantly tweaked to see which version performed best. 603 00:35:11,120 --> 00:35:15,680 It wasn't uncommon to have about 35-45,000 iterations 604 00:35:15,680 --> 00:35:18,320 of these types of ads every day, right? 605 00:35:18,320 --> 00:35:22,800 So, you know, it could be 606 00:35:22,800 --> 00:35:28,080 as subtle and just, you know, people wouldn't even notice, 607 00:35:28,080 --> 00:35:30,440 where you have the green button, 608 00:35:30,440 --> 00:35:32,800 sometimes a red button would work better 609 00:35:32,800 --> 00:35:34,400 or a blue button would work better. 610 00:35:34,400 --> 00:35:38,200 Now, the voters Cambridge Analytica had targeted 611 00:35:38,200 --> 00:35:40,360 were bombarded with adverts. 612 00:35:46,320 --> 00:35:47,680 I may have short-circuited... 
613 00:35:49,120 --> 00:35:51,560 I'm Donald Trump and I approve this message. 614 00:35:52,760 --> 00:35:57,280 On a typical day, the campaign would run more than 100 different adverts. 615 00:35:57,280 --> 00:36:02,120 The delivery system was Silicon Valley's vast social networks. 616 00:36:02,120 --> 00:36:05,600 We had the Facebook and YouTube and Google people, 617 00:36:05,600 --> 00:36:07,600 they would congregate here. 618 00:36:07,600 --> 00:36:12,800 Almost all of America's voters could now be reached online in an instant. 619 00:36:12,800 --> 00:36:16,600 I mean, what were Facebook and Google and YouTube people actually 620 00:36:16,600 --> 00:36:19,840 - doing here, why were they here? - They were helping us, you know. 621 00:36:19,840 --> 00:36:24,760 They were basically our kind of hands-on partners 622 00:36:24,760 --> 00:36:27,440 as far as being able to utilise the platform 623 00:36:27,440 --> 00:36:29,600 as effectively as possible. 624 00:36:29,600 --> 00:36:34,160 The Trump campaign spent the lion's share of its advertising budget, 625 00:36:34,160 --> 00:36:38,480 around $85 million, on Facebook. 626 00:36:38,480 --> 00:36:42,440 When you're pumping in millions and millions of dollars 627 00:36:42,440 --> 00:36:44,240 to these social platforms, 628 00:36:44,240 --> 00:36:46,320 you're going to get white-glove treatment, 629 00:36:46,320 --> 00:36:48,560 so, they would send people, you know, 630 00:36:48,560 --> 00:36:51,200 representatives to, you know, 631 00:36:51,200 --> 00:36:56,200 Project Alamo to ensure that all of our needs were being met. 632 00:36:56,200 --> 00:36:59,440 The success of Trump's digital strategy was built on 633 00:36:59,440 --> 00:37:03,000 the effectiveness of Facebook as an advertising medium. 634 00:37:03,000 --> 00:37:06,240 It's become a very powerful political tool 635 00:37:06,240 --> 00:37:08,800 that's largely unregulated. 636 00:37:08,800 --> 00:37:12,040 Without Facebook, we wouldn't have won.
637 00:37:12,040 --> 00:37:16,000 I mean, Facebook really and truly put us over the edge, 638 00:37:16,000 --> 00:37:18,560 I mean, Facebook was the medium 639 00:37:18,560 --> 00:37:21,960 that proved most successful for this campaign. 640 00:37:25,680 --> 00:37:28,680 Facebook didn't want to meet me, but made it clear that like all 641 00:37:28,680 --> 00:37:32,560 advertisers on Facebook, political campaigns must ensure 642 00:37:32,560 --> 00:37:36,480 their ads comply with all applicable laws and regulations. 643 00:37:36,480 --> 00:37:40,000 The company also said no personally identifiable 644 00:37:40,000 --> 00:37:42,760 information can be shared with advertising, 645 00:37:42,760 --> 00:37:46,520 measurement or analytics partners unless people give permission. 646 00:37:51,400 --> 00:37:52,960 In London, I'm on the trail 647 00:37:52,960 --> 00:37:56,320 of the data company Cambridge Analytica. 648 00:37:59,760 --> 00:38:03,280 Ever since the American election, the firm has been under pressure 649 00:38:03,280 --> 00:38:08,320 to come clean over its use of personality prediction. 650 00:38:08,320 --> 00:38:12,080 I want to know exactly how the firm used psychographics 651 00:38:12,080 --> 00:38:16,400 to target voters for the Trump campaign. 652 00:38:16,400 --> 00:38:18,000 - Alexander. - Hi. 653 00:38:18,000 --> 00:38:19,960 How do you do? Pleasure. 654 00:38:19,960 --> 00:38:23,720 VOICEOVER: Alexander Nix's firm first used data to target voters 655 00:38:23,720 --> 00:38:27,360 in the American presidential elections while working on 656 00:38:27,360 --> 00:38:31,640 the Ted Cruz campaign for the Republican nomination. 657 00:38:31,640 --> 00:38:35,200 When Cruz lost, they went to work for Trump. 658 00:38:35,200 --> 00:38:37,320 I want to start with the Trump campaign. 659 00:38:37,320 --> 00:38:40,000 Did Cambridge Analytica ever use 660 00:38:40,000 --> 00:38:44,000 psychometric or psychographic methods in this campaign? 
661 00:38:44,000 --> 00:38:47,840 We left the Cruz campaign in April after the nomination was over. 662 00:38:47,840 --> 00:38:50,160 We pivoted right across onto the Trump campaign. 663 00:38:50,160 --> 00:38:53,560 It was about five and a half months before polling. 664 00:38:53,560 --> 00:38:58,560 And whilst on the Cruz campaign, we were able to invest 665 00:38:58,560 --> 00:39:01,280 a lot more time into building psychographic models, 666 00:39:01,280 --> 00:39:04,600 into profiling, using behavioural profiling to understand different 667 00:39:04,600 --> 00:39:07,560 personality groups and different personality drivers, 668 00:39:07,560 --> 00:39:10,760 in order to inform our messaging and our creative. 669 00:39:10,760 --> 00:39:14,520 We simply didn't have the time to employ this level of 670 00:39:14,520 --> 00:39:17,000 rigorous methodology for Trump. 671 00:39:17,000 --> 00:39:21,560 For Cruz, Cambridge Analytica built computer models which could crunch 672 00:39:21,560 --> 00:39:26,600 huge amounts of data on each voter, including psychographic data 673 00:39:26,600 --> 00:39:30,440 predicting voters' personality types. 674 00:39:30,440 --> 00:39:32,600 When the firm transferred to the Trump campaign, 675 00:39:32,600 --> 00:39:34,840 they took data with them. 676 00:39:34,840 --> 00:39:40,480 Now, there is clearly some legacy psychographics in the data, 677 00:39:40,480 --> 00:39:44,080 because the data is, um, model data, 678 00:39:44,080 --> 00:39:46,800 or a lot of it is model data that we had used across 679 00:39:46,800 --> 00:39:50,400 the last 14, 15 months of campaigning through the midterms, 680 00:39:50,400 --> 00:39:52,480 and then through the primaries. 681 00:39:52,480 --> 00:39:57,720 But specifically, did we build specific psychographic models 682 00:39:57,720 --> 00:39:59,800 for the Trump campaign? No, we didn't. 
683 00:39:59,800 --> 00:40:03,200 So you didn't build specific models for this campaign, 684 00:40:03,200 --> 00:40:07,720 but it sounds like you did use some element of psychographic modelling, 685 00:40:07,720 --> 00:40:10,960 as an approach in the Trump campaign? 686 00:40:10,960 --> 00:40:13,960 Only as a result of legacy data models. 687 00:40:13,960 --> 00:40:17,200 So, the answer... The answer you are looking for is no. 688 00:40:17,200 --> 00:40:20,360 The answer I am looking for is the extent to which it was used. 689 00:40:20,360 --> 00:40:23,520 I mean, legacy... I don't know what that means, legacy data modelling. 690 00:40:23,520 --> 00:40:26,000 What does that mean for the Trump campaign? 691 00:40:26,000 --> 00:40:28,560 Well, so we were able to take models we had made previously 692 00:40:28,560 --> 00:40:30,040 over the last two or three years, 693 00:40:30,040 --> 00:40:33,160 and integrate those into some of the work we were doing. 694 00:40:33,160 --> 00:40:34,920 Where did all the information 695 00:40:34,920 --> 00:40:38,880 to predict voters' personalities come from? 696 00:40:38,880 --> 00:40:42,920 Very originally, we used a combination of telephone surveys 697 00:40:42,920 --> 00:40:45,960 and then we used a number of... 698 00:40:46,960 --> 00:40:50,080 ..online platforms... 699 00:40:50,080 --> 00:40:52,600 for gathering questions. 700 00:40:52,600 --> 00:40:55,480 As we started to gather more data, 701 00:40:55,480 --> 00:40:57,920 we started to look at other platforms. 702 00:40:57,920 --> 00:41:00,440 Such as Facebook, for instance. 703 00:41:02,320 --> 00:41:05,480 Predicting personality is just one element in the big data 704 00:41:05,480 --> 00:41:09,600 companies like Cambridge Analytica are using to revolutionise 705 00:41:09,600 --> 00:41:11,680 the way democracy works. 706 00:41:11,680 --> 00:41:13,960 Can you understand, though, why maybe 707 00:41:13,960 --> 00:41:16,080 some people find it a little bit creepy? 
708 00:41:16,080 --> 00:41:17,640 No, I can't. Quite the opposite. 709 00:41:17,640 --> 00:41:21,000 I think that the move away from blanket advertising, 710 00:41:21,000 --> 00:41:24,560 the move towards ever more personalised communication, 711 00:41:24,560 --> 00:41:25,880 is a very natural progression. 712 00:41:25,880 --> 00:41:27,760 I think it is only going to increase. 713 00:41:27,760 --> 00:41:30,960 I find it a little bit weird, if I had a very sort of detailed 714 00:41:30,960 --> 00:41:35,080 personality assessment of me, based on all sorts of different data 715 00:41:35,080 --> 00:41:37,520 that I had put all over the internet, 716 00:41:37,520 --> 00:41:40,840 and that as a result of that, some profile had been made of me 717 00:41:40,840 --> 00:41:43,320 that was then used to target me for adverts, 718 00:41:43,320 --> 00:41:45,920 and I didn't really know that any of that had happened. 719 00:41:45,920 --> 00:41:48,800 That's why some people might find it a little bit sinister. 720 00:41:48,800 --> 00:41:50,240 Well, you have just said yourself, 721 00:41:50,240 --> 00:41:52,680 you are putting this data out into the public domain. 722 00:41:52,680 --> 00:41:56,680 I'm sure that you have a supermarket loyalty card. 723 00:41:56,680 --> 00:42:01,400 I'm sure you understand the reciprocity that is going on there - 724 00:42:01,400 --> 00:42:05,080 you get points, and in return, they gather your data 725 00:42:05,080 --> 00:42:07,280 on your consumer behaviour. 726 00:42:07,280 --> 00:42:08,920 I mean, we are talking about politics 727 00:42:08,920 --> 00:42:12,120 and we're talking about shopping. Are they really the same thing? 728 00:42:12,120 --> 00:42:14,240 The technology is the same. 
729 00:42:14,240 --> 00:42:15,640 In the next ten years, 730 00:42:15,640 --> 00:42:18,560 the sheer volumes of data that are going to be available, 731 00:42:18,560 --> 00:42:20,840 that are going to be driving all sorts of things 732 00:42:20,840 --> 00:42:23,200 including marketing and communications, 733 00:42:23,200 --> 00:42:25,960 is going to be a paradigm shift from where we are now 734 00:42:25,960 --> 00:42:27,520 and it's going to be a revolution, 735 00:42:27,520 --> 00:42:29,520 and that is the way the world is moving. 736 00:42:29,520 --> 00:42:31,400 And, you know, I think, 737 00:42:31,400 --> 00:42:35,000 whether you like it or not, it, it... 738 00:42:35,000 --> 00:42:37,800 it is an inevitable fact. 739 00:42:39,120 --> 00:42:42,840 To the new data barons, this is all just business. 740 00:42:42,840 --> 00:42:46,080 But to the rest of us, it's more than that. 741 00:42:49,160 --> 00:42:52,000 By the time of Donald Trump's inauguration, 742 00:42:52,000 --> 00:42:55,760 it was accepted that his mastery of data and social media 743 00:42:55,760 --> 00:42:59,840 had made him the most powerful man in the world. 744 00:42:59,840 --> 00:43:04,320 We will make America great again. 745 00:43:04,320 --> 00:43:06,400 The election of Donald Trump 746 00:43:06,400 --> 00:43:10,240 was greeted with barely concealed fury in Silicon Valley. 747 00:43:10,240 --> 00:43:13,280 But Facebook and other tech companies had made 748 00:43:13,280 --> 00:43:16,600 millions of dollars by helping to make it happen. 749 00:43:16,600 --> 00:43:19,040 Their power as advertising platforms 750 00:43:19,040 --> 00:43:21,840 had been exploited by a politician 751 00:43:21,840 --> 00:43:24,880 with a very different view of the world. 752 00:43:27,600 --> 00:43:30,360 But Facebook's problems were only just beginning. 753 00:43:32,480 --> 00:43:34,120 Another phenomenon of the election 754 00:43:34,120 --> 00:43:37,600 was plunging the tech titan into crisis. 
755 00:43:43,440 --> 00:43:46,560 Fake news, often targeting Hillary Clinton, 756 00:43:46,560 --> 00:43:49,760 had dominated the election campaign. 757 00:43:52,080 --> 00:43:55,560 Now, the departing President turned on the social media giant 758 00:43:55,560 --> 00:43:58,040 he had once embraced. 759 00:43:59,240 --> 00:44:01,800 In an age where, uh... 760 00:44:01,800 --> 00:44:06,720 there is so much active misinformation, 761 00:44:06,720 --> 00:44:08,480 and it is packaged very well, 762 00:44:08,480 --> 00:44:11,840 and it looks the same when you see it on a Facebook page, 763 00:44:11,840 --> 00:44:13,960 or you turn on your television, 764 00:44:13,960 --> 00:44:17,240 if everything, uh... 765 00:44:17,240 --> 00:44:20,040 seems to be the same, 766 00:44:20,040 --> 00:44:22,280 and no distinctions are made, 767 00:44:22,280 --> 00:44:25,400 then we won't know what to protect. 768 00:44:25,400 --> 00:44:27,840 We won't know what to fight for. 769 00:44:33,440 --> 00:44:36,480 Fake news had provoked a storm of criticism 770 00:44:36,480 --> 00:44:39,280 over Facebook's impact on democracy. 771 00:44:39,280 --> 00:44:42,800 Its founder, Mark Zuckerberg, claimed it was extremely unlikely 772 00:44:42,800 --> 00:44:46,200 fake news had changed the election's outcome. 773 00:44:46,200 --> 00:44:49,120 But he didn't address why it had spread like wildfire 774 00:44:49,120 --> 00:44:51,400 across the platform. 775 00:44:51,400 --> 00:44:53,280 - Jamie. - Hi, Jamie. 776 00:44:53,280 --> 00:44:55,080 VOICEOVER: Meet Jeff Hancock, 777 00:44:55,080 --> 00:44:58,640 a psychologist who has investigated a hidden aspect of Facebook 778 00:44:58,640 --> 00:45:03,440 that helps explain how the platform became weaponised in this way. 779 00:45:06,360 --> 00:45:11,400 It turns out the power of Facebook to affect our emotions is key, 780 00:45:11,400 --> 00:45:13,600 something that had been uncovered 781 00:45:13,600 --> 00:45:18,040 in an experiment the company itself had run in 2012.
782 00:45:18,040 --> 00:45:19,760 It was one of the earlier, you know, 783 00:45:19,760 --> 00:45:22,880 what we would call big data, social science-type studies. 784 00:45:26,200 --> 00:45:30,920 The newsfeeds of nearly 700,000 users were secretly manipulated 785 00:45:30,920 --> 00:45:35,880 so they would see fewer positive or negative posts. 786 00:45:35,880 --> 00:45:38,400 Jeff helped interpret the results. 787 00:45:38,400 --> 00:45:40,320 So, what did you actually find? 788 00:45:40,320 --> 00:45:43,080 We found that if you were one of those people 789 00:45:43,080 --> 00:45:46,280 that were seeing less negative emotion words in their posts, 790 00:45:46,280 --> 00:45:49,880 then you would write with less negative emotion in your own posts, 791 00:45:49,880 --> 00:45:51,840 and more positive emotion. 792 00:45:51,840 --> 00:45:55,600 - This is emotional contagion. - And what about positive posts? 793 00:45:55,600 --> 00:45:57,640 Did that have the same kind of effect? 794 00:45:57,640 --> 00:45:59,840 Yeah, we saw the same effect 795 00:45:59,840 --> 00:46:03,520 when positive emotion worded posts were decreased. 796 00:46:03,520 --> 00:46:06,320 We saw the same thing. So I would produce fewer positive 797 00:46:06,320 --> 00:46:08,880 emotion words, and more negative emotion words. 798 00:46:08,880 --> 00:46:11,560 And that is consistent with emotional contagion theory. 799 00:46:11,560 --> 00:46:15,520 Basically, we were showing that people were writing in a way 800 00:46:15,520 --> 00:46:18,480 that was matching the emotion that they were seeing 801 00:46:18,480 --> 00:46:21,160 in the Facebook news feed. 802 00:46:21,160 --> 00:46:27,520 Emotion draws people to fake news, and then supercharges its spread. 803 00:46:27,520 --> 00:46:29,200 So the more emotional the content, 804 00:46:29,200 --> 00:46:31,760 the more likely it is to spread online. 805 00:46:31,760 --> 00:46:34,600 - Is that true? 
- Yeah, the more intense the emotion in content, 806 00:46:34,600 --> 00:46:38,280 the more likely it is to spread, to go viral. 807 00:46:38,280 --> 00:46:42,040 It doesn't matter whether it is sad or happy, like negative or positive, 808 00:46:42,040 --> 00:46:44,880 the more important thing is how intense the emotion is. 809 00:46:44,880 --> 00:46:47,000 The process of emotional contagion 810 00:46:47,000 --> 00:46:50,040 helps explain why fake news has spread 811 00:46:50,040 --> 00:46:52,960 so far across social media. 812 00:46:55,080 --> 00:46:58,040 Advertisers have been well aware of how emotion can be used 813 00:46:58,040 --> 00:46:59,920 to manipulate people's attention. 814 00:46:59,920 --> 00:47:02,480 But now we are seeing this with a whole host of other actors, 815 00:47:02,480 --> 00:47:04,000 some of them nefarious. 816 00:47:04,000 --> 00:47:06,440 So, other state actors trying to influence your election, 817 00:47:06,440 --> 00:47:08,440 people trying to manipulate the media 818 00:47:08,440 --> 00:47:10,720 are using emotional contagion, 819 00:47:10,720 --> 00:47:14,240 and also using those original platforms like Facebook 820 00:47:14,240 --> 00:47:19,120 to accomplish other objectives, like sowing distrust, 821 00:47:19,120 --> 00:47:21,040 or creating false beliefs. 822 00:47:21,040 --> 00:47:23,640 You see it sort of, maybe weaponised 823 00:47:23,640 --> 00:47:26,320 and being used at scale. 824 00:47:35,680 --> 00:47:37,080 I'm in Germany, 825 00:47:37,080 --> 00:47:39,680 where the presence of more than a million new refugees 826 00:47:39,680 --> 00:47:43,000 has caused tension across the political spectrum. 827 00:47:43,000 --> 00:47:44,880 Earlier this year, 828 00:47:44,880 --> 00:47:49,280 a story appeared on the American alt-right news site Breitbart 829 00:47:49,280 --> 00:47:54,080 about the torching of a church in Dortmund by a North African mob. 
830 00:47:54,080 --> 00:47:57,520 It was widely shared on social media - 831 00:47:57,520 --> 00:47:59,440 but it wasn't true. 832 00:48:04,800 --> 00:48:06,440 The problem with social networks 833 00:48:06,440 --> 00:48:08,600 is that all information is treated equally. 834 00:48:08,600 --> 00:48:12,480 So you have good, honest, accurate information 835 00:48:12,480 --> 00:48:15,840 sitting alongside and treated equally to 836 00:48:15,840 --> 00:48:18,360 lies and propaganda. 837 00:48:18,360 --> 00:48:21,480 And the difficulty for citizens is that it can be very hard 838 00:48:21,480 --> 00:48:24,400 to tell the difference between the two. 839 00:48:24,400 --> 00:48:30,040 But one data scientist here has discovered an even darker side 840 00:48:30,040 --> 00:48:32,600 to the way Facebook is being manipulated. 841 00:48:33,920 --> 00:48:37,080 We created a network of all the likes 842 00:48:37,080 --> 00:48:40,280 in the refugee debate on Facebook. 843 00:48:42,360 --> 00:48:46,120 Professor Simon Hegelich has found evidence the debate 844 00:48:46,120 --> 00:48:47,880 about refugees on Facebook 845 00:48:47,880 --> 00:48:52,720 is being skewed by anonymous political forces. 846 00:48:52,720 --> 00:48:56,800 So you are trying to understand who is liking pages? 847 00:48:56,800 --> 00:48:58,080 Exactly. 848 00:48:58,080 --> 00:49:01,080 One statistic among many used by Facebook to rank stories in 849 00:49:01,080 --> 00:49:04,160 your news feed is the number of likes they get. 850 00:49:04,160 --> 00:49:08,960 The red area of this chart shows a network of people liking 851 00:49:08,960 --> 00:49:12,200 anti-refugee posts on Facebook. 852 00:49:12,200 --> 00:49:18,560 Most of the likes are sent by just a handful of people - 25 people are... 853 00:49:20,000 --> 00:49:24,040 ..responsible for more than 90% of all these likes. 854 00:49:24,040 --> 00:49:26,560 25 Facebook accounts each liked 855 00:49:26,560 --> 00:49:30,240 more than 30,000 comments over six months. 
856 00:49:30,240 --> 00:49:36,480 These hyperactive accounts could be run by real people, or software. 857 00:49:36,480 --> 00:49:38,400 I think the rationale behind this 858 00:49:38,400 --> 00:49:42,400 is that they tried to make their content viral. 859 00:49:42,400 --> 00:49:46,000 They think if we like all this stuff, then Facebook, 860 00:49:46,000 --> 00:49:47,920 the algorithm of Facebook, 861 00:49:47,920 --> 00:49:51,560 will pick up our content and show it to other users, 862 00:49:51,560 --> 00:49:54,920 and then the whole world sees that refugees are bad 863 00:49:54,920 --> 00:49:57,520 and that they shouldn't come to Germany. 864 00:49:57,520 --> 00:50:02,160 This is evidence the number of likes on Facebook can be easily gamed 865 00:50:02,160 --> 00:50:05,240 as part of an effort to try to influence the prominence 866 00:50:05,240 --> 00:50:08,520 of anti-refugee content on the site. 867 00:50:08,520 --> 00:50:10,560 Does this worry you, though? 868 00:50:10,560 --> 00:50:14,680 It's definitely changing the structure of public opinion. 869 00:50:14,680 --> 00:50:18,560 Democracy is built on public opinion, 870 00:50:18,560 --> 00:50:23,640 so such a change definitely has to change the way democracy works. 871 00:50:26,520 --> 00:50:30,080 Facebook told us they are working to disrupt the economic incentives 872 00:50:30,080 --> 00:50:34,480 behind false news, removing tens of thousands of fake accounts, 873 00:50:34,480 --> 00:50:36,240 and building new products 874 00:50:36,240 --> 00:50:39,720 to identify and limit the spread of false news. 875 00:50:39,720 --> 00:50:44,160 Zuck is trying to hold the line that his company is not a publisher, 876 00:50:44,160 --> 00:50:49,560 based on that obscure legal clause from the 1990s. 877 00:50:49,560 --> 00:50:51,680 You know, Facebook is a new kind of platform.
878 00:50:51,680 --> 00:50:54,840 You know, it's not a traditional technology company, 879 00:50:54,840 --> 00:50:57,760 it's not a traditional media company. 880 00:50:57,760 --> 00:50:59,840 Um, we don't write the news that people... 881 00:50:59,840 --> 00:51:01,880 that people read on the platform. 882 00:51:03,920 --> 00:51:08,440 In Germany, one man is taking on the tech gods over their responsibility 883 00:51:08,440 --> 00:51:10,960 for what appears on their sites. 884 00:51:12,720 --> 00:51:17,080 Ulrich Kelber is a minister in the Justice Department. 885 00:51:17,080 --> 00:51:20,480 He was once called a "Jewish pig" in a post on Facebook. 886 00:51:20,480 --> 00:51:24,600 He reported it to the company, who refused to delete it. 887 00:51:24,600 --> 00:51:26,560 IN GERMAN: 888 00:51:43,760 --> 00:51:45,280 Under a new law, 889 00:51:45,280 --> 00:51:47,520 Facebook and other social networking sites 890 00:51:47,520 --> 00:51:50,440 could be fined up to 50 million euros 891 00:51:50,440 --> 00:51:55,360 if they fail to take down hate speech posts that are illegal. 892 00:51:55,360 --> 00:51:56,760 Is this too much? 893 00:51:56,760 --> 00:51:59,360 Is this a little bit Draconian? 894 00:52:20,080 --> 00:52:23,640 This is the first time a government has challenged 895 00:52:23,640 --> 00:52:27,000 the principles underlying a Silicon Valley platform. 896 00:52:27,000 --> 00:52:29,640 Once this thread has been pulled, 897 00:52:29,640 --> 00:52:32,760 their whole world could start to unravel. 898 00:52:32,760 --> 00:52:35,080 They must find you a pain. 899 00:52:35,080 --> 00:52:37,160 I mean, this must be annoying for them. 900 00:52:37,160 --> 00:52:38,760 It must be. 
901 00:52:47,400 --> 00:52:51,360 Facebook told us they share the goal of fighting hate speech, 902 00:52:51,360 --> 00:52:54,720 and they have made substantial progress 903 00:52:54,720 --> 00:52:56,520 in removing illegal content, 904 00:52:56,520 --> 00:53:00,560 adding 3,000 people to their community operations team. 905 00:53:04,280 --> 00:53:08,440 Facebook now connects more than two billion people around the world, 906 00:53:08,440 --> 00:53:12,320 including more and more voters in the West. 907 00:53:12,320 --> 00:53:14,440 When you think of what Facebook has become, 908 00:53:14,440 --> 00:53:18,440 in such a short space of time, it's actually pretty bizarre. 909 00:53:18,440 --> 00:53:20,080 I mean, this was just a platform 910 00:53:20,080 --> 00:53:23,200 for sharing photos or chatting to friends, 911 00:53:23,200 --> 00:53:27,040 but in less than a decade, it has become a platform that has 912 00:53:27,040 --> 00:53:30,720 dramatic implications for how our democracy works. 913 00:53:32,480 --> 00:53:35,400 Old structures of power are falling away. 914 00:53:35,400 --> 00:53:40,080 Social media is giving ordinary people access to huge audiences. 915 00:53:40,080 --> 00:53:43,200 And politics is changing as a result. 916 00:53:46,120 --> 00:53:47,920 Significant numbers were motivated 917 00:53:47,920 --> 00:53:50,480 through social media to vote for Brexit. 918 00:53:52,320 --> 00:53:54,800 Jeremy Corbyn lost the general election, 919 00:53:54,800 --> 00:53:58,000 but enjoyed unexpected gains, 920 00:53:58,000 --> 00:54:03,200 an achievement he put down in part to the power of social media. 921 00:54:06,640 --> 00:54:12,160 One influential force in this new world is found here in Bristol. 922 00:54:12,160 --> 00:54:15,840 The Canary is an online political news outlet. 923 00:54:15,840 --> 00:54:18,240 During the election campaign, 924 00:54:18,240 --> 00:54:22,800 their stories got more than 25 million hits on a tiny budget. 
925 00:54:24,160 --> 00:54:27,800 So, how much of your readership comes through Facebook? 926 00:54:27,800 --> 00:54:30,160 It's really high, it's about 80%. 927 00:54:30,160 --> 00:54:33,560 So it is an enormously important distribution mechanism. 928 00:54:33,560 --> 00:54:35,240 And free. 929 00:54:35,240 --> 00:54:39,720 The Canary's presentation of its pro-Corbyn news 930 00:54:39,720 --> 00:54:43,360 is tailored to social media. 931 00:54:43,360 --> 00:54:45,920 I have noticed that a lot of your headlines, or your pictures, 932 00:54:45,920 --> 00:54:47,680 they are quite emotional. 933 00:54:47,680 --> 00:54:50,360 They are sort of pictures of sad Theresa May, 934 00:54:50,360 --> 00:54:52,040 or delighted Jeremy Corbyn. 935 00:54:52,040 --> 00:54:54,240 Is that part of the purpose of it all? 936 00:54:54,240 --> 00:54:57,640 Yeah, it has to be. We are out there trying to have a conversation 937 00:54:57,640 --> 00:55:00,960 with a lot of people, so it is on us to be compelling. 938 00:55:00,960 --> 00:55:04,240 Human beings work on facts, but they also work on gut instinct, 939 00:55:04,240 --> 00:55:09,080 they work on emotions, feelings, and fidelity and community. 940 00:55:09,080 --> 00:55:10,760 All of these issues. 941 00:55:10,760 --> 00:55:15,560 Social media enables those with few resources to compete 942 00:55:15,560 --> 00:55:20,800 with the mainstream media for the attention of millions of us. 943 00:55:20,800 --> 00:55:23,800 You put up quite sort of clickbait-y stories, you know, 944 00:55:23,800 --> 00:55:26,000 the headlines are there to get clicks. 945 00:55:26,000 --> 00:55:28,480 - Yeah. - Um... Is that a fair criticism? 946 00:55:28,480 --> 00:55:30,080 Of course they're there to get clicks. 947 00:55:30,080 --> 00:55:32,240 We don't want to have a conversation with ten people. 948 00:55:32,240 --> 00:55:34,280 You can't change the world talking to ten people. 
949 00:55:42,880 --> 00:55:48,000 The tech gods are giving all of us the power to influence the world. 950 00:55:48,000 --> 00:55:50,400 Connectivity and access... Connect everyone in the world... 951 00:55:50,400 --> 00:55:51,920 Make the world more open and connected... 952 00:55:51,920 --> 00:55:54,040 You connect people over time... You get people connectivity... 953 00:55:54,040 --> 00:55:57,280 That's the mission. That's what I care about. 954 00:55:57,280 --> 00:56:01,040 Social media's unparalleled power to persuade, 955 00:56:01,040 --> 00:56:03,280 first developed for advertisers, 956 00:56:03,280 --> 00:56:06,920 is now being exploited by political forces of all kinds. 957 00:56:06,920 --> 00:56:11,120 Grassroots movements are regaining their power, 958 00:56:11,120 --> 00:56:13,480 challenging political elites. 959 00:56:13,480 --> 00:56:19,040 Extremists are discovering new ways to stoke hatred and spread lies. 960 00:56:19,040 --> 00:56:21,480 And wealthy political parties are developing the ability 961 00:56:21,480 --> 00:56:24,360 to manipulate our thoughts and feelings 962 00:56:24,360 --> 00:56:27,120 using powerful psychological tools, 963 00:56:27,120 --> 00:56:32,040 which is leading to a world of unexpected political opportunity, 964 00:56:32,040 --> 00:56:33,880 and turbulence. 965 00:56:33,880 --> 00:56:37,720 I think the people that connected the world really believed that 966 00:56:37,720 --> 00:56:42,760 somehow, just by us being connected, our politics would be better. 967 00:56:42,760 --> 00:56:48,880 But the world is changing in ways that they never imagined, 968 00:56:48,880 --> 00:56:51,480 and they are probably not happy about any more. 969 00:56:51,480 --> 00:56:53,080 But in truth, 970 00:56:53,080 --> 00:56:56,840 they are no more in charge of this technology than any of us are now. 971 00:56:56,840 --> 00:57:00,760 Silicon Valley's philosophy is called disruption. 
972 00:57:00,760 --> 00:57:02,760 Breaking down the way we do things 973 00:57:02,760 --> 00:57:06,520 and using technology to improve the world. 974 00:57:06,520 --> 00:57:08,240 In this series, I have seen how 975 00:57:08,240 --> 00:57:11,680 sharing platforms like Uber and Airbnb 976 00:57:11,680 --> 00:57:14,360 are transforming our cities. 977 00:57:18,200 --> 00:57:21,320 And how automation and artificial intelligence 978 00:57:21,320 --> 00:57:24,240 threaten to destroy millions of jobs. 979 00:57:24,240 --> 00:57:27,560 Within 30 years, half of humanity won't have a job. 980 00:57:27,560 --> 00:57:29,960 It could get ugly, there could be revolution. 981 00:57:29,960 --> 00:57:34,800 Now, the technology to connect the world unleashed by a few billionaire 982 00:57:34,800 --> 00:57:39,440 entrepreneurs is having a dramatic influence on our politics. 983 00:57:39,440 --> 00:57:42,960 The people who are responsible for building this technology, 984 00:57:42,960 --> 00:57:47,880 for unleashing this disruption onto all of us, 985 00:57:47,880 --> 00:57:51,520 don't ever feel like they are responsible for the consequences 986 00:57:51,520 --> 00:57:53,400 of any of that. 987 00:57:53,400 --> 00:57:58,680 They retain this absolute religious faith that technology and 988 00:57:58,680 --> 00:58:03,320 connectivity is always going to make things turn out for the best. 989 00:58:03,320 --> 00:58:05,000 And it doesn't matter what happens, 990 00:58:05,000 --> 00:58:08,040 it doesn't matter how much that's proven not to be the case, 991 00:58:08,040 --> 00:58:10,400 they still believe. 992 00:58:22,520 --> 00:58:25,200 How did Silicon Valley become so influential? 993 00:58:25,200 --> 00:58:28,120 The Open University has produced an interactive timeline 994 00:58:28,120 --> 00:58:30,680 exploring the history of this place. 995 00:58:30,680 --> 00:58:32,720 To find out more, visit... 996 00:58:36,880 --> 00:58:40,080 ..and follow the links to the Open University.
