Subtitles for BBC Secrets of Silicon Valley, Part 2 of 2: The Persuasion Machine (720p HDTV x264 AAC, MVGroup)

It was the biggest political earthquake of the century. We will make America great again! But just how did Donald Trump defy the predictions of political pundits and pollsters? The secret lies here, in San Antonio, Texas.

This was our Project Alamo. This is where the digital arm of the Trump campaign operation was held.

This is the extraordinary story of two men with two very different views of the world. We will build the wall... The path forward is to connect more, not less. And how Facebook's Mark Zuckerberg inadvertently helped Donald Trump become the most powerful man on the planet.

Without Facebook, we wouldn't have won. I mean, Facebook really and truly put us over the edge.

With their secret algorithms and online tracking, social media companies know more about us than anyone. So you can predict mine and everybody else's personality based on the things that they've liked? That's correct. A very accurate prediction of your intimate traits such as religiosity, political views, intelligence, sexual orientation.

Now, this power is transforming politics. Can you understand though why maybe some people find it a little bit creepy? No, I can't - quite the opposite. That is the way the world is moving. Whether you like it or not, it's an inevitable fact.

Social media can bring politics closer to the people, but its destructive power is creating a new and unpredictable world. This is the story of how Silicon Valley's mission to connect all of us is disrupting politics, plunging us into a world of political turbulence that no-one can control.

It might not look like it, but anger is building in Silicon Valley. It's usually pretty quiet around here. But not today.

Every day you wake up and you wonder what's going to be today's grief, brought by certain politicians and leaders in the world.

This is a very unusual demonstration. I've been to loads of demonstrations, but these aren't the people that usually go to demonstrations.
In some ways, these are the winners of society. This is the tech community in Silicon Valley. Some of the wealthiest people in the world. And they're here protesting and demonstrating... against what they see as the kind of changing world that they don't like. The problem, of course, is the election of Donald Trump.

Every day, I'm sure, you think, "I could be a part of resisting those efforts to mess things up for the rest of us."

Trump came to power promising to control immigration... We will build the wall, 100%. ...and to disengage from the world. From this day forward, it's going to be only America first. America first.

Now, Silicon Valley is mobilising against him.

We are seeing this explosion of political activism, you know, all through the US and in Europe.

Before he became an activist, Dex Torricke-Barton was speech writer to the chairman of Google and the founder of Facebook.

It's a moment when people who believe in this global vision, as opposed to the nationalist vision of the world, who believe in a world that isn't about protectionism, whether it's data or whether it's about trade, they're coming to stand up and to mobilise in response to that.

Because it feels like the whole of Silicon Valley has been slightly taken by surprise by what's happening. Absolutely. But these are the smartest minds in the world. Yeah. With the most amazing data models and polling. The smartest minds in the world often can be very, very ignorant of the things that are going on in the world.

The tech god with the most ambitious global vision is Dex's old boss - Facebook founder Mark Zuckerberg. One word captures the world Zuck, as he's known here, is trying to build.

Connectivity and access. If we connected them... You give people connectivity... That's the mission. Connecting with their friends... You connect people over time... Connect everyone in the world... I'm really optimistic about that. Make the world more open and connected, that's what I care about. This is part of the critical enabling infrastructure for the world. Thank you, guys.
What's Mark Zuckerberg worried about most?

Well, you know, Mark has dedicated his life to connecting, you know, the world. You know, this is something that he really, you know, cares passionately about. And, you know, as I said, you know, the same worldview and set of policies that, you know, we'll build walls here, we'll build walls against the sharing of information and building those kind of, you know, networks. It's bigger than just the tech. It is. It's about the society that Silicon Valley also wants to create? Absolutely.

The tech gods believe the election of Donald Trump threatens their vision of a globalised world. But in a cruel twist, is it possible their mission to connect the world actually helped bring him to power?

The question I have is whether the revolution brought about by social media companies like Facebook has actually led to the political changes in the world that these guys are so worried about.

To answer that question, you have to understand how the tech titans of Silicon Valley rose to power. For that, you have to go back 20 years to a time when the online world was still in its infancy.

MUSIC: Rock N Roll Star by Oasis

There were fears the new internet was like the Wild West, anarchic and potentially harmful.

Today our world is being remade, yet again, by an information revolution. Changing the way we work, the way we live, the way we relate to each other.

The Telecommunications Act of 1996 was designed to civilise the internet, including protecting children from pornography.

Today with the stroke of a pen, our laws will catch up with our future.

But buried deep within the act was a secret whose impact no-one foresaw.

Jeremy? Jamie. How are you doing? Nice to meet you.

VOICEOVER: Jeremy Malcolm is an analyst at the Electronic Frontier Foundation, a civil liberties group for the digital age.

Much of Silicon Valley's accelerated growth in the last two decades has been enabled by one clause in the legislation.
Hidden away in the middle of that is this Section 230. What's the key line? It literally just says, no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. That's it.

So what that basically means is, if you're an internet platform, you don't get treated as the publisher or speaker of something that your users say using your platform. If the user says something online that is, say, defamatory, the platform that they communicate on isn't going to be held responsible for it. And the user, of course, can be held directly responsible.

How important is this line for social media companies today?

I think if we didn't have this, we probably wouldn't have the same kind of social media companies that we have today. They wouldn't be willing to take on the risk of having so much unfettered discussion.

It's key to the internet's freedom, really?

We wouldn't have the internet of today without this. And so, if we are going to make any changes to it, we have to be really, really careful.

These 26 words changed the world. They allowed a new kind of business to spring up - online platforms that became the internet giants of today. Facebook, Google, YouTube - they encouraged users to upload content, often things about their lives or moments that mattered to them, onto their sites for free. And in exchange, they got to hoard all of that data but without any real responsibility for the effects of the content that people were posting.

Hundreds of millions of us flocked to these new sites, putting more of our lives online. At first, the tech firms couldn't figure out how to turn that data into big money. But that changed when a secret within that data was unlocked.

Antonio, Jamie. 'A secret Antonio Garcia Martinez helped reveal at Facebook.'

Tell me a bit about your time at Facebook. Well, that was interesting. I was what's called a product manager for ads targeting. That means basically taking your data and using it to basically make money on Facebook, to monetise Facebook's data.
If you go browse the internet or buy stuff in stores or whatever, and then you see ads related to all that stuff inside Facebook - I created that.

Facebook offers advertisers ways to target individual users of the site with adverts. It can be driven by data about how we use the platform.

Here's some examples of what's data for Facebook that makes money. What you've liked on Facebook, links that you shared, who you happen to know on Facebook, for example. Where you've used Facebook, what devices, your iPad, your work computer, your home computer. In the case of Amazon, it's obviously what you've purchased. In the case of Google, it's what you searched for.

How do they turn me... I like something on Facebook, and I share a link on Facebook, how could they turn that into something that another company would care about?

There is what's called a targeting system, and so the advertiser can actually go in and specify, I want people who are within this city and who have liked BMW or Burberry, for example. So an advertiser pays Facebook and says, I want these sorts of people? That's effectively it, that's right.

The innovation that opened up bigger profits was to allow Facebook users to be targeted using data about what they do on the rest of the internet.

The real key thing that marketers want is the unique, immutable, flawless, high fidelity ID for one person on the internet, and Facebook provides that. It is your identity online.

Facebook can tell an advertiser, this is the real Jamie Barlow, this is what he's like? Yeah. A company like Walmart can literally take your data, your e-mail, phone number, whatever you use for their frequent shopper programme, etc, and join that to Facebook and literally target those people based on that data. That's part of what I built.

The tech gods suck in all this data about how we use their technologies to build their vast fortunes.

I mean, it sounds like data is like oil, it's keeping the economy going?

Right. I mean, the difference is these companies, instead of drilling for this oil, they generate this oil via, by getting users to actually use their apps and then they actually monetise it.
Usually via advertising or other mechanisms. But, yeah, it is the new oil.

Data about billions of us is propelling Silicon Valley to the pinnacle of the global economy.

The world's largest hotel company, Airbnb, doesn't own a single piece of real estate. The world's largest taxi company, Uber, doesn't own any cars. The world's largest media company, Facebook, doesn't produce any media, right? So what do they have? Well, they have the data around how you use those resources and how you use those assets. And that's really what they are.

The secret of targeting us with adverts is keeping us online for as long as possible.

I thought I'd just see how much time I spend on here. So I've got an app that counts how often I pick this thing up.

Our time is the Holy Grail of Silicon Valley.

Here's what my life looks like on a typical day. Yeah, could I have a flat white, please? Flat white? Yeah.

Like more and more of us, my phone is my gateway to the online world. It's how I check my social media accounts. On average, Facebook users spend 50 minutes every day on the site.

The longer we spend connected, the more Silicon Valley can learn about us, and the more targeted and effective their advertising can be.

So apparently, today, I... I've checked my phone 117 times... and I've been on this phone for nearly five and a half hours. Well, I mean, that's a lot, that's a lot of hours. I mean, it's kind of nearly half the day, spent on this phone. But it's weird, because it doesn't feel like I spend that long on it. The strange thing about it is that I don't even really know what I'm doing for these five hours that I'm spending on this phone.

What is it that is keeping us hooked to Silicon Valley's global network?

I'm in Seattle to meet someone who saw how the tech gods embraced new psychological insights into how we all make decisions.

I was a post-doc with Stephen Hawking.

Once Chief Technology Officer at Microsoft, Nathan Myhrvold is the most passionate technologist I have ever met.

Some complicated-looking equations in the background.
Well, it turns out if you work with Stephen Hawking, you do work with complicated equations. It's kind of the nature of the beast! Amazing picture.

A decade ago, Nathan brought together Daniel Kahneman, pioneer of the new science of behavioural economics, and Silicon Valley's leaders, for a series of meetings.

I came and Jeff Bezos came. That's Jeff Bezos, the founder of Amazon, worth $76 billion. Sean Parker. Sean was there. That's Sean Parker, the first president of Facebook. And the Google founders were there.

The proposition was, come to this very nice resort in Napa, and for several days, just have Kahneman and then also a couple of other behavioural economists explain things. And ask questions and see what happens.

Kahneman had a simple but brilliant theory on how we make decisions. He had found we use one of two different systems of thinking.

In this dichotomy, you have... over here is a hunch... a guess, a gut feeling... and "I just know".

So this is sort of emotional and more like, just instant stuff?

That's the idea. This set of things is not particularly good at a different set of stuff that involves... analysis... numbers... probability.

The meetings in Napa didn't deal with the basics of behavioural economics, but how might the insights of the new science have helped the tech gods?

A lot of advertising is about trying to hook people in these type-one things to get interested one way or the other. Technology companies undoubtedly use that to one degree or another. You know, the term "clickbait", for things that look exciting to click on. There's billions of dollars change hands because we all get enticed into clicking something. And there's a lot of things that I click on and then you get there, you're like, OK, fine, you were just messing with me.
You're playing to the type-one things, you're putting a set of triggers out there that make me want to click on it, and even though, like, I'm aware of that, I still sometimes click!

Tech companies both try to understand our behaviour by having smart humans think about it and increasingly by having machines think about it. By having machines track us to see, what is the clickbait Nathan falls for? What are the things he really likes to spend time on? Let's show him more of that stuff!

Trying to grab the attention of the consumer is nothing new. That's what advertising is all about. But insights into how we make decisions helped Silicon Valley to shape the online world. And little wonder, their success depends on keeping us engaged. From 1-Click buying on Amazon to the Facebook like, the more they've hooked us, the more the money has rolled in.

As Silicon Valley became more influential, it started attracting powerful friends...in politics. In 2008, Barack Obama had pioneered political campaigning on Facebook. As President, he was drawn to Facebook's founder, Zuck.

Sorry, I'm kind of nervous. We have the President of the United States here! My name is Barack Obama and I'm the guy who got Mark to wear a jacket and tie!

How you doing? Great. I'll have huevos rancheros, please. And if I could have an egg and cheese sandwich on English muffin?

Aneesh Chopra was Obama's first Chief Technology Officer, and saw how close the relationship between the White House and Silicon Valley became.

The President's philosophy and his approach to governing garnered a great deal of personal interest among many executives in Silicon Valley. They were donors to his campaign, volunteers, active recruiters of engineering talent to support the campaign apparatus. He had struck a chord. Why? Because frankly, it was an inspiring voice that really tapped into the hopefulness of the country.
And a lot of Silicon Valley shares this sense of hopefulness, this optimistic view that we can solve problems if we would work together and take advantage of these new capabilities that are coming online.

So people in Silicon Valley saw President Obama as a bit of a kindred spirit? Oh, yeah. Oh, yeah.

If you'd like, Mark, we can take our jackets off. That's good!

Facebook's mission to connect the world went hand-in-hand with Obama's policies promoting globalisation and free markets. And Facebook was seen to be improving the political process itself.

Part of what makes for a healthy democracy is when you've got citizens who are informed, who are engaged, and what Facebook allows us to do is make sure this isn't just a one-way conversation.

You have the tech mind-set and governments increasingly share the same view of the world. That's not a natural... It's a spirit of liberty. It's a spirit of freedom. That is manifest today in these new technologies. It happens to be that freedom means I can tweet something offensive. But it also means that I have a voice.

Please raise your right hand and repeat after me...

By the time Obama won his second term, he was feted for his mastery of social media's persuasive power. But across the political spectrum, the race was on to find new ways to gain a digital edge. The world was about to change for Facebook.

Stanford University, in the heart of Silicon Valley. Home to a psychologist investigating just how revealing Facebook's hoard of information about each of us could really be.

How are you doing? Nice to meet you.

Dr Michal Kosinski specialises in psychometrics - the science of predicting psychological traits like personality.

So in the past, when you wanted to measure someone's personality or intelligence, you needed to give them a question or a test, and they would have to answer a bunch of questions. Now, many of those questions would basically ask you about whether you like poetry, or you like hanging out with other people, or you like the theatre, and so on.
But these days, you don't need to ask these questions any more. Why? Because while going through our lives, we are leaving behind a lot of digital footprints that basically contain the same information. So instead of asking you whether you like poetry, I can just look at your reading history on Amazon or your Facebook likes, and I would just get exactly the same information.

In 2011, Dr Kosinski and his team at the University of Cambridge developed an online survey to measure volunteers' personality traits. With their permission, he matched their results with their Facebook data. More than 6 million people took part.

We have people's Facebook likes, people's status updates and profile data, and this allows us to build those... to gain better understanding of how psychological traits are being expressed in the digital environment. How you can measure psychological traits using digital footprints. An algorithm that can look at millions of people and it can look at hundreds of thousands or tens of thousands of your likes can extract and utilise even those little pieces of information and combine it into a very accurate profile.

You can quite accurately predict mine, and in fact, everybody else's personality, based on the things that they've liked?

That's correct. It can also be used to turn your digital footprint into a very accurate prediction of your intimate traits, such as religiosity, political views, personality, intelligence, sexual orientation and a bunch of other psychological traits.

If I'm logged in, we can maybe see how accurate this actually is. So I hit "make prediction" and it's going to try and predict my personality from my Facebook page. From your Facebook likes.

According to your likes, you're open-minded, liberal and artistic. Judges your intelligence to be extremely high. Well done. Yes. I'm extremely intelligent!

You're not religious, but if you are religious, most likely you'd be a Catholic. I was raised a Catholic! I can't believe it knows that. Because I...
I don't say anything about being a Catholic anywhere, but I was raised as a Catholic, but I'm not a practising Catholic. So it's like... It's absolutely spot on. Oh, my God! Journalism, but also what did I study? Studied history, and I didn't put anything about history in there.

I think this is one of the things that people don't really get about those predictions, that they think, look, if I like Lady Gaga on Facebook, obviously people will know that I like Lady Gaga, or the Government will know that I like Lady Gaga. Look, you don't need a rocket scientist to look at your Spotify playlist or your Facebook likes to figure out that you like Lady Gaga. What's really world-changing about those algorithms is that they can take your music preferences or your book preferences and extract from this seemingly innocent information very accurate predictions about your religiosity, leadership potential, political views, personality and so on.

Can you predict people's political persuasions with this?

In fact, the first dimension here, openness to experience, is a very good predictor of political views. People scoring high on openness, they tend to be liberal, people who score low on openness, we even call it conservative and traditional, they tend to vote for conservative candidates.

What about the potential to manipulate people?

So, obviously, if you now can use an algorithm to get to know millions of people very intimately and then use another algorithm to adjust the message that you are sending to them, to make it most persuasive, obviously gives you a lot of power.

I'm quite surprised at just how accurate this model could be. I mean, I cannot believe that on the basis of a few things I just, you know, carelessly clicked that I liked, the model was able to work out that I could have been Catholic, or had a Catholic upbringing. And clearly, this is a very powerful way of understanding people. Very exciting possibilities, but I can't help fearing that there is that potential, whoever has that power, whoever can control that model...
...will have sort of unprecedented possibilities of manipulating what people think, how they behave, what they see, whether that's selling things to people or how people vote, and that's pretty scary too.

Our era is defined by political shocks. None bigger than the rise of Donald Trump, who defied pollsters and the mainstream media to win the American presidency. Now questions swirl around his use of the American affiliate of a British insights company, Cambridge Analytica, who use psychographics.

I'm in Texas to uncover how far Cambridge Analytica's expertise in personality prediction played a part in Trump's political triumph, and how his revolutionary campaign exploited Silicon Valley's social networks.

Everyone seems to agree that Trump ran an exceptional election campaign using digital technologies. But no-one really knows what they did, who they were working with, who was helping them, what the techniques they used were. So I've come here to try to unravel the mystery.

The operation inside this unassuming building in San Antonio was largely hidden from view, but crucial to Trump's success. Since then, I'm the only person to get in here to find out what really happened.

This was our Project Alamo, so this was where the digital arm of the Trump campaign operation was held.

Theresa Hong is speaking publicly for the first time about her role as digital content director for the Trump campaign.

So, why is it called Project Alamo? It was called Project Alamo based on the data, actually, that was Cambridge Analytica - they came up with the Alamo data set, right? So, we just kind of adopted the name Project Alamo. It does conjure up sort of images of a battle of some sort. Yeah. It kind of was! In a sense, you know. Yeah, yeah.

Project Alamo was so important, Donald Trump visited the hundred or so workers based here during the campaign. Ever since he started tweeting in 2009, Trump had grasped the power of social media.
Now, in the fight of his life, the campaign manipulated his Facebook presence.

Trump's Twitter account, that's all his, he's the one that ran that, and I did a lot of his Facebook, so I wrote a lot for him, you know, I kind of channelled Mr Trump.

How do you possibly write a post on Facebook like Donald Trump? "Believe me." A lot of believe me's, a lot of alsos, a lot of...verys. Actually, he was really wonderful to write for, just because it was so refreshing, it was so authentic.

We headed to the heart of the operation.

Cambridge Analytica was here. It was just a line of computers, right? This is where their operation was and this was kind of the brain of the data, this was the data centre. This was the data centre, this was the centre of the data centre. Exactly right, yes, it was. Yes, it was.

Cambridge Analytica were using data on around 220 million Americans to target potential donors and voters.

It was, you know, a bunch of card tables that sat here and a bunch of computers and people that were behind the computers, monitoring the data. We've got to target this state, that state or this universe or whatever. So that's what they were doing, they were gathering all the data.

A "universe" was the name given to a group of voters defined by Cambridge Analytica.

What sort of attributes did these people have that they had been able to work out? Some of the attributes would be, when was the last time they voted? Who did they vote for? You know, what kind of car do they drive? What kind of things do they look at on the internet? What do they stand for? I mean, one person might really be about job creation and keeping jobs in America, you know. Another person, they might resonate with, you know, Second Amendment and gun rights, so... How would they know that? ...within that... How would they know that? That is their secret sauce.

Did the "secret sauce" contain predictions of personality and political leanings? Were they able to kind of understand people's personalities?
Yes, I mean, you know, they do specialise in psychographics, right? But based on personal interests and based on what a person cares for and what means something to them, they were able to extract and then we were able to target.

So, the psychographic stuff, were they using that here? Was that part of the model you were working on?

Well, I mean, towards the end with the persuasion, absolutely. I mean, we really were targeting on these universes that they had collected.

I mean, did some of the attributes that you were able to use from Cambridge Analytica have a sort of emotional effect, you know, happy, sad, anxious, worried - moods, that kind of thing?

Right, yeah. I do know Cambridge Analytica follows something called OCEAN and it's based on, you know, whether or not, like, are you an extrovert? Are you more of an introvert? Are you fearful? Are you positive? So, they did use that.

Armed with Cambridge Analytica's revolutionary insights, the next step in the battle to win over millions of Americans was to shape the online messages they would see.

Now we're going to go into the big kind of bull pen where a lot of the creatives were, and this is where I was as well. Right.

For today's modern working-class families, the challenge is very, very real. With childcare costs...

Adverts were tailored to particular audiences, defined by data.

Donald Trump wants to give families the break they deserve.

This universe right here that Cambridge Analytica, they've collected data and they have identified as working mothers that are concerned about childcare, and childcare, obviously, that's not going to be, like, a war-ridden, you know, destructive ad, right? That's more warm and fuzzy, and this is what Trump is going to do for you. But if you notice, Trump's not speaking. He wasn't speaking, he wasn't in it at all. Right, Trump wasn't speaking in it. That audience there, we wanted a softer approach, so this is the type of approach that we would take.

The campaign made thousands of different versions of the same fundraising adverts.
The design was constantly tweaked to see which version performed best.

It wasn't uncommon to have about 35-45,000 iterations of these types of ads every day, right? So, you know, it could be as subtle and just, you know, people wouldn't even notice, where you have the green button, sometimes a red button would work better or a blue button would work better.

Now, the voters Cambridge Analytica had targeted were bombarded with adverts.

I may have short-circuited... I'm Donald Trump and I approve this message.

On a typical day, the campaign would run more than 100 different adverts. The delivery system was Silicon Valley's vast social networks.

We had the Facebook and YouTube and Google people, they would congregate here.

Almost all of America's voters could now be reached online in an instant.

I mean, what were Facebook and Google and YouTube people actually doing here, why were they here? They were helping us, you know. They were basically our kind of hands-on partners as far as being able to utilise the platform as effectively as possible.

The Trump campaign spent the lion's share of its advertising budget, around $85 million, on Facebook.

When you're pumping in millions and millions of dollars to these social platforms, you're going to get white-glove treatment, so, they would send people, you know, representatives to, you know, Project Alamo to ensure that all of our needs were being met.

The success of Trump's digital strategy was built on the effectiveness of Facebook as an advertising medium. It's become a very powerful political tool that's largely unregulated.

Without Facebook, we wouldn't have won. I mean, Facebook really and truly put us over the edge, I mean, Facebook was the medium that proved most successful for this campaign.

Facebook didn't want to meet me, but made it clear that like all advertisers on Facebook, political campaigns must ensure their ads comply with all applicable laws and regulations.
The company also said no personally identifiable information can be shared with advertising, measurement or analytics partners unless people give permission.

In London, I'm on the trail of the data company Cambridge Analytica. Ever since the American election, the firm has been under pressure to come clean over its use of personality prediction. I want to know exactly how the firm used psychographics to target voters for the Trump campaign.

Alexander. Hi. How do you do? Pleasure.

VOICEOVER: Alexander Nix's firm first used data to target voters in the American presidential elections while working on the Ted Cruz campaign for the Republican nomination. When Cruz lost, they went to work for Trump.

I want to start with the Trump campaign. Did Cambridge Analytica ever use psychometric or psychographic methods in this campaign?

We left the Cruz campaign in April after the nomination was over. We pivoted right across onto the Trump campaign. It was about five and a half months before polling. And whilst on the Cruz campaign, we were able to invest a lot more time into building psychographic models, into profiling, using behavioural profiling to understand different personality groups and different personality drivers, in order to inform our messaging and our creative. We simply didn't have the time to employ this level of rigorous methodology for Trump.

For Cruz, Cambridge Analytica built computer models which could crunch huge amounts of data on each voter, including psychographic data predicting voters' personality types. When the firm transferred to the Trump campaign, they took data with them.

Now, there is clearly some legacy psychographics in the data, because the data is, um, model data, or a lot of it is model data that we had used across the last 14, 15 months of campaigning through the midterms, and then through the primaries. But specifically, did we build specific psychographic models for the Trump campaign? No, we didn't.
So you didn't build specific models for this campaign, but it sounds like you did use some element of psychographic modelling, as an approach in the Trump campaign?

Only as a result of legacy data models.

So, the answer... The answer you are looking for is no. The answer I am looking for is the extent to which it was used. I mean, legacy... I don't know what that means, legacy data modelling. What does that mean for the Trump campaign?

Well, so we were able to take models we had made previously over the last two or three years, and integrate those into some of the work we were doing.

Where did all the information to predict voters' personalities come from?

Very originally, we used a combination of telephone surveys and then we used a number of... online platforms... for gathering questions. As we started to gather more data, we started to look at other platforms. Such as Facebook, for instance.

Predicting personality is just one element in the big data that companies like Cambridge Analytica are using to revolutionise the way democracy works.

Can you understand, though, why maybe some people find it a little bit creepy?

No, I can't. Quite the opposite. I think that the move away from blanket advertising, the move towards ever more personalised communication, is a very natural progression. I think it is only going to increase.

I find it a little bit weird, if I had a very sort of detailed personality assessment of me, based on all sorts of different data that I had put all over the internet, and that as a result of that, some profile had been made of me that was then used to target me for adverts, and I didn't really know that any of that had happened. That's why some people might find it a little bit sinister.

Well, you have just said yourself, you are putting this data out into the public domain. I'm sure that you have a supermarket loyalty card. I'm sure you understand the reciprocity that is going on there - you get points, and in return, they gather your data on your consumer behaviour.
726 00:42:08,060 --> 00:42:09,700 I mean, we are talking about politics 727 00:42:09,700 --> 00:42:12,780 and we're talking about shopping. Are they really the same thing? 728 00:42:12,780 --> 00:42:14,940 The technology is the same. 729 00:42:14,940 --> 00:42:16,380 In the next ten years, 730 00:42:16,380 --> 00:42:19,260 the sheer volumes of data that are going to be available, 731 00:42:19,260 --> 00:42:21,580 that are going to be driving all sorts of things 732 00:42:21,580 --> 00:42:23,900 including marketing and communications, 733 00:42:23,900 --> 00:42:26,700 is going to be a paradigm shift from where we are now 734 00:42:26,700 --> 00:42:28,220 and it's going to be a revolution, 735 00:42:28,220 --> 00:42:30,180 and that is the way the world is moving. 736 00:42:30,180 --> 00:42:32,180 And, you know, I think, 737 00:42:32,180 --> 00:42:35,660 whether you like it or not, it, it... 738 00:42:35,660 --> 00:42:38,460 it is an inevitable fact. 739 00:42:39,820 --> 00:42:43,620 To the new data barons, this is all just business. 740 00:42:43,620 --> 00:42:46,700 But to the rest of us, it's more than that. 741 00:42:50,020 --> 00:42:52,740 By the time of Donald Trump's inauguration, 742 00:42:52,740 --> 00:42:56,500 it was accepted that his mastery of data and social media 743 00:42:56,500 --> 00:43:00,540 had made him the most powerful man in the world. 744 00:43:00,540 --> 00:43:05,060 We will make America great again. 745 00:43:05,060 --> 00:43:07,180 The election of Donald Trump 746 00:43:07,180 --> 00:43:11,020 was greeted with barely concealed fury in Silicon Valley. 747 00:43:11,020 --> 00:43:14,060 But Facebook and other tech companies had made 748 00:43:14,060 --> 00:43:17,340 millions of dollars by helping to make it happen. 749 00:43:17,340 --> 00:43:19,780 Their power as advertising platforms 750 00:43:19,780 --> 00:43:22,620 had been exploited by a politician 751 00:43:22,620 --> 00:43:25,460 with a very different view of the world. 752 00:43:28,340 --> 00:43:31,020 But Facebook's problems were only just beginning. 753 00:43:33,140 --> 00:43:34,820 Another phenomenon of the election 754 00:43:34,820 --> 00:43:38,220 was plunging the tech titan into crisis. 755 00:43:44,140 --> 00:43:47,260 Fake news, often targeting Hillary Clinton, 756 00:43:47,260 --> 00:43:50,420 had dominated the election campaign. 757 00:43:52,820 --> 00:43:56,260 Now, the departing President turned on the social media giant 758 00:43:56,260 --> 00:43:58,700 he had once embraced. 759 00:43:59,980 --> 00:44:02,540 In an age where, uh... 760 00:44:02,540 --> 00:44:07,460 there is so much active misinformation, 761 00:44:07,460 --> 00:44:09,180 and it is packaged very well, 762 00:44:09,180 --> 00:44:12,540 and it looks the same when you see it on a Facebook page, 763 00:44:12,540 --> 00:44:14,660 or you turn on your television, 764 00:44:14,660 --> 00:44:17,940 if everything, uh... 765 00:44:17,940 --> 00:44:20,700 seems to be the same, 766 00:44:20,700 --> 00:44:22,940 and no distinctions are made, 767 00:44:22,940 --> 00:44:26,140 then we won't know what to protect. 768 00:44:26,140 --> 00:44:28,500 We won't know what to fight for. 769 00:44:34,140 --> 00:44:37,140 Fake news had provoked a storm of criticism 770 00:44:37,140 --> 00:44:40,180 over Facebook's impact on democracy. 771 00:44:40,180 --> 00:44:43,500 Its founder, Mark Zuckerberg, claimed it was extremely unlikely 772 00:44:43,500 --> 00:44:46,900 fake news had changed the election's outcome.
773 00:44:46,900 --> 00:44:49,780 But he didn't address why it had spread like wildfire 774 00:44:49,780 --> 00:44:52,140 across the platform. 775 00:44:52,140 --> 00:44:53,980 Jamie. Hi, Jamie. 776 00:44:53,980 --> 00:44:55,860 VOICEOVER: Meet Jeff Hancock, 777 00:44:55,860 --> 00:44:59,380 a psychologist who has investigated a hidden aspect of Facebook 778 00:44:59,380 --> 00:45:04,100 that helps explain how the platform became weaponised in this way. 779 00:45:07,220 --> 00:45:12,060 It turns out the power of Facebook to affect our emotions is key, 780 00:45:12,060 --> 00:45:14,340 something that had been uncovered 781 00:45:14,340 --> 00:45:18,780 in an experiment the company itself had run in 2012. 782 00:45:18,780 --> 00:45:20,460 It was one of the earlier, you know, 783 00:45:20,460 --> 00:45:23,460 what we would call big data, social science-type studies. 784 00:45:26,940 --> 00:45:31,620 The newsfeeds of nearly 700,000 users were secretly manipulated 785 00:45:31,620 --> 00:45:36,620 so they would see fewer positive or negative posts. 786 00:45:36,620 --> 00:45:39,100 Jeff helped interpret the results. 787 00:45:39,100 --> 00:45:41,100 So, what did you actually find? 788 00:45:41,100 --> 00:45:43,780 We found that if you were one of those people 789 00:45:43,780 --> 00:45:47,020 that were seeing less negative emotion words in their posts, 790 00:45:47,020 --> 00:45:50,620 then you would write with less negative emotion in your own posts, 791 00:45:50,620 --> 00:45:52,540 and more positive emotion. 792 00:45:52,540 --> 00:45:56,380 This is emotional contagion. And what about positive posts? 793 00:45:56,380 --> 00:45:58,380 Did that have the same kind of effect? 794 00:45:58,380 --> 00:46:00,620 Yeah, we saw the same effect 795 00:46:00,620 --> 00:46:04,300 when positive emotion worded posts were decreased. 796 00:46:04,300 --> 00:46:07,140 We saw the same thing. So I would produce fewer positive 797 00:46:07,140 --> 00:46:09,580 emotion words, and more negative emotion words. 798 00:46:09,580 --> 00:46:12,260 And that is consistent with emotional contagion theory. 799 00:46:12,260 --> 00:46:16,300 Basically, we were showing that people were writing in a way 800 00:46:16,300 --> 00:46:19,180 that was matching the emotion that they were seeing 801 00:46:19,180 --> 00:46:21,980 in the Facebook news feed. 802 00:46:21,980 --> 00:46:28,260 Emotion draws people to fake news, and then supercharges its spread. 803 00:46:28,260 --> 00:46:29,900 So the more emotional the content, 804 00:46:29,900 --> 00:46:32,540 the more likely it is to spread online. 805 00:46:32,540 --> 00:46:35,380 Is that true? Yeah, the more intense the emotion in content, 806 00:46:35,380 --> 00:46:39,020 the more likely it is to spread, to go viral. 807 00:46:39,020 --> 00:46:42,900 It doesn't matter whether it is sad or happy, like negative or positive, 808 00:46:42,900 --> 00:46:45,580 the more important thing is how intense the emotion is. 809 00:46:45,580 --> 00:46:47,780 The process of emotional contagion 810 00:46:47,780 --> 00:46:50,780 helps explain why fake news has spread 811 00:46:50,780 --> 00:46:53,620 so far across social media. 812 00:46:55,820 --> 00:46:58,700 Advertisers have been well aware of how emotion can be used 813 00:46:58,700 --> 00:47:00,660 to manipulate people's attention. 814 00:47:00,660 --> 00:47:03,180 But now we are seeing this with a whole host of other actors, 815 00:47:03,180 --> 00:47:04,700 some of them nefarious. 
816 00:47:04,700 --> 00:47:07,180 So, other state actors trying to influence your election, 817 00:47:07,180 --> 00:47:09,140 people trying to manipulate the media 818 00:47:09,140 --> 00:47:11,460 are using emotional contagion, 819 00:47:11,460 --> 00:47:14,980 and also using those original platforms like Facebook 820 00:47:14,980 --> 00:47:19,820 to accomplish other objectives, like sowing distrust, 821 00:47:19,820 --> 00:47:21,820 or creating false beliefs. 822 00:47:21,820 --> 00:47:24,380 You see it sort of, maybe weaponised 823 00:47:24,380 --> 00:47:26,940 and being used at scale. 824 00:47:36,380 --> 00:47:37,780 I'm in Germany, 825 00:47:37,780 --> 00:47:40,380 where the presence of more than a million new refugees 826 00:47:40,380 --> 00:47:43,660 has caused tension across the political spectrum. 827 00:47:43,660 --> 00:47:45,660 Earlier this year, 828 00:47:45,660 --> 00:47:49,980 a story appeared on the American alt-right news site Breitbart 829 00:47:49,980 --> 00:47:54,820 about the torching of a church in Dortmund by a North African mob. 830 00:47:54,820 --> 00:47:58,220 It was widely shared on social media - 831 00:47:58,220 --> 00:48:00,060 but it wasn't true. 832 00:48:05,500 --> 00:48:07,180 The problem with social networks 833 00:48:07,180 --> 00:48:09,420 is that all information is treated equally. 834 00:48:09,420 --> 00:48:13,260 So you have good, honest, accurate information 835 00:48:13,260 --> 00:48:16,500 sitting alongside and treated equally to 836 00:48:16,500 --> 00:48:19,100 lies and propaganda. 837 00:48:19,100 --> 00:48:22,220 And the difficulty for citizens is that it can be very hard 838 00:48:22,220 --> 00:48:25,060 to tell the difference between the two. 839 00:48:25,060 --> 00:48:30,980 But one data scientist here has discovered an even darker side 840 00:48:30,980 --> 00:48:33,220 to the way Facebook is being manipulated. 841 00:48:34,620 --> 00:48:37,740 We created a network of all the likes 842 00:48:37,740 --> 00:48:40,860 in the refugee debate on Facebook. 843 00:48:43,100 --> 00:48:46,780 Professor Simon Hegelich has found evidence the debate 844 00:48:46,780 --> 00:48:48,620 about refugees on Facebook 845 00:48:48,620 --> 00:48:53,460 is being skewed by anonymous political forces. 846 00:48:53,460 --> 00:48:57,500 So you are trying to understand who is liking pages? 847 00:48:57,500 --> 00:48:58,780 Exactly. 848 00:48:58,780 --> 00:49:01,900 One statistic among many used by Facebook to rank stories in 849 00:49:01,900 --> 00:49:04,900 your news feed is the number of likes they get. 850 00:49:04,900 --> 00:49:09,660 The red area of this chart shows a network of people liking 851 00:49:09,660 --> 00:49:12,900 anti-refugee posts on Facebook. 852 00:49:12,900 --> 00:49:19,100 Most of the likes are sent by just a handful of people - 25 people are... 853 00:49:20,860 --> 00:49:24,780 ..responsible for more than 90% of all these likes. 854 00:49:24,780 --> 00:49:27,220 25 Facebook accounts each liked 855 00:49:27,220 --> 00:49:31,100 more than 30,000 comments over six months. 856 00:49:31,100 --> 00:49:37,180 These hyperactive accounts could be run by real people, or software. 857 00:49:37,180 --> 00:49:39,100 I think the rationale behind this 858 00:49:39,100 --> 00:49:43,060 is that they tried to make their content viral. 
859 00:49:43,060 --> 00:49:46,660 They think if we like all this stuff, then Facebook, 860 00:49:46,660 --> 00:49:48,540 the algorithm of Facebook, 861 00:49:48,540 --> 00:49:52,220 will pick up our content and show it to other users, 862 00:49:52,220 --> 00:49:55,660 and then the whole world sees that refugees are bad 863 00:49:55,660 --> 00:49:58,220 and that they shouldn't come to Germany. 864 00:49:58,220 --> 00:50:02,860 This is evidence the number of likes on Facebook can be easily gamed 865 00:50:02,860 --> 00:50:05,900 as part of an effort to try to influence the prominence 866 00:50:05,900 --> 00:50:09,140 of anti-refugee content on the site. 867 00:50:09,140 --> 00:50:11,260 Does this worry you, though? 868 00:50:11,260 --> 00:50:15,380 It's definitely changing the structure of public opinion. 869 00:50:15,380 --> 00:50:19,220 Democracy is built on public opinion, 870 00:50:19,220 --> 00:50:24,260 so such a change definitely has to change the way democracy works. 871 00:50:27,340 --> 00:50:30,820 Facebook told us they are working to disrupt the economic incentives 872 00:50:30,820 --> 00:50:35,020 behind false news, removing tens of thousands of fake accounts, 873 00:50:35,020 --> 00:50:36,900 and building new products 874 00:50:36,900 --> 00:50:40,380 to identify and limit the spread of false news. 875 00:50:40,380 --> 00:50:44,820 Mark Zuckerberg is trying to hold the line that his company is not a publisher, 876 00:50:44,820 --> 00:50:50,260 based on that obscure legal clause from the 1990s. 877 00:50:50,260 --> 00:50:52,340 You know, Facebook is a new kind of platform. 878 00:50:52,340 --> 00:50:55,540 You know, it's not a traditional technology company, 879 00:50:55,540 --> 00:50:58,420 it's not a traditional media company. 880 00:50:58,420 --> 00:51:00,540 Um, we don't write the news that people... 881 00:51:00,540 --> 00:51:02,620 that people read on the platform. 882 00:51:04,740 --> 00:51:09,100 In Germany, one man is taking on the tech gods over their responsibility 883 00:51:09,100 --> 00:51:11,580 for what appears on their sites. 884 00:51:13,420 --> 00:51:17,740 Ulrich Kelber is a minister in the Justice Department. 885 00:51:17,740 --> 00:51:21,340 He was once called a "Jewish pig" in a post on Facebook. 886 00:51:21,340 --> 00:51:25,380 He reported it to the company, who refused to delete it. 887 00:51:25,380 --> 00:51:27,140 IN GERMAN: 888 00:51:44,420 --> 00:51:45,980 Under a new law, 889 00:51:45,980 --> 00:51:48,380 Facebook and other social networking sites 890 00:51:48,380 --> 00:51:51,100 could be fined up to 50 million euros 891 00:51:51,100 --> 00:51:56,020 if they fail to take down hate speech posts that are illegal. 892 00:51:56,020 --> 00:51:57,420 Is this too much? 893 00:51:57,420 --> 00:51:59,940 Is this a little bit Draconian? 894 00:52:20,740 --> 00:52:24,340 This is the first time a government has challenged 895 00:52:24,340 --> 00:52:27,780 the principles underlying a Silicon Valley platform. 896 00:52:27,780 --> 00:52:30,300 Once this thread has been pulled, 897 00:52:30,300 --> 00:52:33,460 their whole world could start to unravel. 898 00:52:33,460 --> 00:52:35,740 They must find you a pain. 899 00:52:35,740 --> 00:52:37,820 I mean, this must be annoying for them. 900 00:52:37,820 --> 00:52:39,340 It must be.
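The like-concentration pattern Professor Hegelich describes above, where a small group of hyperactive accounts produces the bulk of likes in a debate, can be illustrated with a short, editor-supplied sketch. The like events and account names below are invented; only the counting approach is shown, not his actual method or data.

```python
from collections import Counter

# Invented stream of (account_id, post_id) like events gathered from
# public pages in a political debate; the data is purely illustrative.
like_events = [
    ("acct_001", "post_a"), ("acct_001", "post_b"), ("acct_001", "post_c"),
    ("acct_001", "post_d"), ("acct_002", "post_a"), ("acct_002", "post_c"),
    ("acct_003", "post_b"), ("acct_004", "post_d"),
]

likes_per_account = Counter(account for account, _ in like_events)
total_likes = sum(likes_per_account.values())

def top_share(counts, total, n):
    """Fraction of all likes contributed by the n most active accounts."""
    return sum(c for _, c in counts.most_common(n)) / total

print(f"Top 2 accounts produce {top_share(likes_per_account, total_likes, 2):.0%} of all likes")
```

On real data, the same counting approach would reveal whether a handful of accounts, human or automated, dominates the like activity that feeds into ranking.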
901 00:52:48,140 --> 00:52:52,020 Facebook told us they share the goal of fighting hate speech, 902 00:52:52,020 --> 00:52:55,380 and they have made substantial progress 903 00:52:55,380 --> 00:52:57,260 in removing illegal content, 904 00:52:57,260 --> 00:53:01,100 adding 3,000 people to their community operations team. 905 00:53:05,180 --> 00:53:09,260 Facebook now connects more than two billion people around the world, 906 00:53:09,260 --> 00:53:12,940 including more and more voters in the West. 907 00:53:12,940 --> 00:53:15,140 When you think of what Facebook has become, 908 00:53:15,140 --> 00:53:19,020 in such a short space of time, it's actually pretty bizarre. 909 00:53:19,020 --> 00:53:20,820 I mean, this was just a platform 910 00:53:20,820 --> 00:53:23,940 for sharing photos or chatting to friends, 911 00:53:23,940 --> 00:53:27,740 but in less than a decade, it has become a platform that has 912 00:53:27,740 --> 00:53:31,340 dramatic implications for how our democracy works. 913 00:53:33,220 --> 00:53:36,060 Old structures of power are falling away. 914 00:53:36,060 --> 00:53:40,740 Social media is giving ordinary people access to huge audiences. 915 00:53:40,740 --> 00:53:43,740 And politics is changing as a result. 916 00:53:46,780 --> 00:53:48,740 Significant numbers were motivated 917 00:53:48,740 --> 00:53:51,060 through social media to vote for Brexit. 918 00:53:53,060 --> 00:53:55,420 Jeremy Corbyn lost the general election, 919 00:53:55,420 --> 00:53:58,700 but enjoyed unexpected gains, 920 00:53:58,700 --> 00:54:03,820 an achievement he put down in part to the power of social media. 921 00:54:07,540 --> 00:54:12,820 One influential force in this new world is found here in Bristol. 922 00:54:12,820 --> 00:54:16,420 The Canary is an online political news outlet. 923 00:54:16,420 --> 00:54:18,900 During the election campaign, 924 00:54:18,900 --> 00:54:23,380 their stories got more than 25 million hits on a tiny budget. 925 00:54:24,820 --> 00:54:28,420 So, how much of your readership comes through Facebook? 926 00:54:28,420 --> 00:54:30,900 It's really high, it's about 80%. 927 00:54:30,900 --> 00:54:34,260 So it is an enormously important distribution mechanism. 928 00:54:34,260 --> 00:54:35,900 And free. 929 00:54:35,900 --> 00:54:40,380 The Canary's presentation of its pro-Corbyn news 930 00:54:40,380 --> 00:54:44,060 is tailored to social media. 931 00:54:44,060 --> 00:54:46,540 I have noticed that a lot of your headlines, or your pictures, 932 00:54:46,540 --> 00:54:48,380 they are quite emotional. 933 00:54:48,380 --> 00:54:51,180 They are sort of pictures of sad Theresa May, 934 00:54:51,180 --> 00:54:52,740 or delighted Jeremy Corbyn. 935 00:54:52,740 --> 00:54:54,860 Is that part of the purpose of it all? 936 00:54:54,860 --> 00:54:58,340 Yeah, it has to be. We are out there trying to have a conversation 937 00:54:58,340 --> 00:55:01,580 with a lot of people, so it is on us to be compelling. 938 00:55:01,580 --> 00:55:04,940 Human beings work on facts, but they also work on gut instinct, 939 00:55:04,940 --> 00:55:09,860 they work on emotions, feelings, and fidelity and community. 940 00:55:09,860 --> 00:55:11,500 All of these issues. 941 00:55:11,500 --> 00:55:16,220 Social media enables those with few resources to compete 942 00:55:16,220 --> 00:55:21,500 with the mainstream media for the attention of millions of us. 
943 00:55:21,500 --> 00:55:24,540 You put up quite sort of clickbait-y stories, you know, 944 00:55:24,540 --> 00:55:26,740 the headlines are there to get clicks. 945 00:55:26,740 --> 00:55:29,220 Yeah. Um... Is that a fair criticism? 946 00:55:29,220 --> 00:55:30,780 Of course they're there to get clicks. 947 00:55:30,780 --> 00:55:32,900 We don't want to have a conversation with ten people. 948 00:55:32,900 --> 00:55:34,860 You can't change the world talking to ten people. 949 00:55:43,540 --> 00:55:48,700 The tech gods are giving all of us the power to influence the world. 950 00:55:48,700 --> 00:55:51,140 Connectivity and access... Connect everyone in the world... 951 00:55:51,140 --> 00:55:52,580 Make the world more open and connected... 952 00:55:52,580 --> 00:55:54,780 You connect people over time... You get people connectivity... 953 00:55:54,780 --> 00:55:57,980 That's the mission. That's what I care about. 954 00:55:57,980 --> 00:56:01,860 Social media's unparalleled power to persuade, 955 00:56:01,860 --> 00:56:03,940 first developed for advertisers, 956 00:56:03,940 --> 00:56:07,620 is now being exploited by political forces of all kinds. 957 00:56:07,620 --> 00:56:11,820 Grassroots movements are regaining their power, 958 00:56:11,820 --> 00:56:14,300 challenging political elites. 959 00:56:14,300 --> 00:56:19,780 Extremists are discovering new ways to stoke hatred and spread lies. 960 00:56:19,780 --> 00:56:22,220 And wealthy political parties are developing the ability 961 00:56:22,220 --> 00:56:25,060 to manipulate our thoughts and feelings 962 00:56:25,060 --> 00:56:27,820 using powerful psychological tools, 963 00:56:27,820 --> 00:56:32,740 which is leading to a world of unexpected political opportunity, 964 00:56:32,740 --> 00:56:34,620 and turbulence. 965 00:56:34,620 --> 00:56:38,460 I think the people that connected the world really believed that 966 00:56:38,460 --> 00:56:43,540 somehow, just by us being connected, our politics would be better. 967 00:56:43,540 --> 00:56:49,660 But the world is changing in ways that they never imagined, 968 00:56:49,660 --> 00:56:52,100 and they are probably not happy about any more. 969 00:56:52,100 --> 00:56:53,860 But in truth, 970 00:56:53,860 --> 00:56:57,580 they are no more in charge of this technology than any of us are now. 971 00:56:57,580 --> 00:57:01,380 Silicon Valley's philosophy is called disruption. 972 00:57:01,380 --> 00:57:03,460 Breaking down the way we do things 973 00:57:03,460 --> 00:57:07,100 and using technology to improve the world. 974 00:57:07,100 --> 00:57:08,940 In this series, I have seen how 975 00:57:08,940 --> 00:57:12,420 sharing platforms like Uber and Airbnb 976 00:57:12,420 --> 00:57:14,980 are transforming our cities. 977 00:57:18,900 --> 00:57:21,980 And how automation and artificial intelligence 978 00:57:21,980 --> 00:57:24,980 threaten to destroy millions of jobs. 979 00:57:24,980 --> 00:57:28,220 Within 30 years, half of humanity won't have a job. 980 00:57:28,220 --> 00:57:30,660 It could get ugly, there could be revolution. 981 00:57:30,660 --> 00:57:35,540 Now, the technology to connect the world unleashed by a few billionaire 982 00:57:35,540 --> 00:57:40,140 entrepreneurs is having a dramatic influence on our politics. 
983 00:57:40,140 --> 00:57:43,700 The people who are responsible for building this technology, 984 00:57:43,700 --> 00:57:48,540 for unleashing this disruption onto all of us, 985 00:57:48,540 --> 00:57:52,180 don't ever feel like they are responsible for the consequences 986 00:57:52,180 --> 00:57:54,060 of any of that. 987 00:57:54,060 --> 00:57:59,340 They retain this absolute religious faith that technology and 988 00:57:59,340 --> 00:58:03,980 connectivity is always going to make things turn out for the best. 989 00:58:03,980 --> 00:58:05,660 And it doesn't matter what happens, 990 00:58:05,660 --> 00:58:08,700 it doesn't matter how much that's proven not to be the case, 991 00:58:08,700 --> 00:58:11,060 they still believe. 992 00:58:23,300 --> 00:58:25,940 How did Silicon Valley become so influential? 993 00:58:25,940 --> 00:58:28,740 The Open University has produced an interactive timeline 994 00:58:28,740 --> 00:58:31,460 exploring the history of this place. 995 00:58:31,460 --> 00:58:33,460 To find out more, visit... 996 00:58:37,700 --> 00:58:40,700 ..and follow the links to the Open University.
