Transcript: The Instagram Effect (2022)

- I have, like, some notes on my leaving speech at Instagram. It says, "I joined three years ago as the first international employee based out of London. I couldn't imagine leaving our community in better hands. I've met so many amazing people and had the chance to change lives. Learning to laugh at yourself, humility, and always thank the little people." Kind of crazy that you have all these things in here, you know, like a trip down memory lane.

My name is Hannah Ray. I worked at Instagram as a community manager for Europe, Middle East and Africa.

I'm Greg Hochmuth. I was one of the first engineers at Instagram.

My name is Andy Warr. I was the Head of Research there.

My name is Chris Messina. And I am best known for inventing the hashtag. It's a little hard to see, but I've got a little hashtag tattoo right here on my wrist. So...yeah.

Instagram was this place where you posted your best self, the digital representation of you that you wanted to portray to your friends.

Now follower counts have, like, a certain social currency. Maybe we weren't quite aware that we were kind of kingmaking in that way.

I think there comes a point when you just aren't in control of it anymore. And then it, like, took on a life of its own.

- Facebook's Instagram app...can be toxic for teenagers...and Facebook knows it.

She's the former Facebook employee turned whistleblower, whose revelations are reverberating around the world.

- "The company's leadership knows how to make Facebook and Instagram safer. They have put their astronomical profits before people."

My name is Frances Haugen.
We're facing one of the fundamental conflicts of our civilization - which is, are we going to let algorithms rule us, or will people rule algorithms?

How do we increase engagement, how do we increase time spent? I didn't feel like that was the best use of people's time.

We don't fully understand what the consequences of this period will be. This is just the beginning.

And the thing that I think young people see, that maybe people a little bit older than them don't, is how intense the problems are, because teenagers are killing themselves because of Instagram.

This challenge of, like, making Instagram feel less pressurised I think is very difficult, if not impossible.

Could we predict? I don't even know. I don't know how to answer that question, really. I'm gonna go with no...?

- My name is Lauren Black and I'm an influencer. I have 123,000 followers. One of my ex-boyfriends' sisters actually bought me a book for Christmas about blogging, and that was when I started my Instagram. I was posting about fashion, every day, like, what I was wearing, like, outfit of the day posts. I went full time about six months after starting. I took a massive risk by, like, leaving uni and not having any savings. And I just threw everything that I had into it.

Hi! Oh!

- She loves beautiful ladies. - Aw, thank you.

I started posting IGTV videos. And that took me from about 30,000 followers to over 100,000 followers. It was quite a jump, it was something that I wasn't really mentally prepared for. I was quite anxious because every day I had to then please thousands more people than usual. I was on Instagram all the time, like my screentime was like 13 hours a day. It was, like, from the moment I woke up until the moment I went to sleep. I'd literally just be scrolling.
And, like, liking other people's pictures, and, like...yeah, just scrolling.

- I'm Manish Raghavan. I worked for the Responsible AI team. We served to try to help other teams within Facebook better understand how to build and deploy AI responsibly. If you want to understand why Instagram is the way it is, you have to understand how the algorithms have shaped it to be that way. They are the result of people's choices.

- The goal of any social media company is to make sure that you spend as much time on that platform as possible, and engage as much with content on that platform as possible. Different algorithms underpin platforms like Instagram. It's a way of ordering things. It's a way of saying yes or no to something.

What people often think about how social media feed systems work is that they're one algorithm, and that somebody sits down and writes that algorithm and says, "If you have two photos, one of them is a person and one of them is a cat, people are more important than cats, so we're gonna rank the person above the cat." But there is no one algorithm that is written down that determines the order in which you see things on your Instagram feed, on your Facebook News Feed or on your Google search. The crucial thing about those algorithms is that they use hundreds of machine learning models. The way that the hundreds of machine learning models interact with each other is determined by the overarching metric that guides the feed ranking system. In Instagram, that overriding metric that orients the machine learning models is meaningful social interactions. The meaningful bit cannot just be about the people that are close to you, like friends and family. It can be about stuff that you're interested in. Instagram figures out who is meaningful to you and who tends to engage often with the content you produce. They predict whether you're likely to like a post, whether you're likely to share a post or whether you're likely to comment underneath the post. And so interactions from the people closer to you tend to be prioritised above interactions from people who are further out in that network from you.
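To make that concrete, here is a minimal sketch of how per-action predictions and a closeness signal could be folded into one ranking score. The model shape, action weights, and closeness boost are invented for illustration; the interviews above say only that many predictions are combined under a "meaningful social interactions" objective.

```python
# Hypothetical sketch of engagement-weighted feed ranking.
# Weights and the closeness boost are illustrative assumptions,
# not Instagram's actual values.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    p_like: float     # predicted probability the viewer likes it
    p_comment: float  # predicted probability the viewer comments
    p_share: float    # predicted probability the viewer shares it

def score(post: Post, closeness: dict[str, float]) -> float:
    """Combine per-action predictions into one score, boosted by
    how 'meaningful' the author is to this particular viewer."""
    engagement = 1.0 * post.p_like + 4.0 * post.p_comment + 2.0 * post.p_share
    return engagement * (1.0 + closeness.get(post.author, 0.0))

# closeness would be learned from past interactions; hard-coded here
closeness = {"close_friend": 0.9, "acquaintance": 0.1}
feed = [
    Post("close_friend", p_like=0.30, p_comment=0.05, p_share=0.01),
    Post("acquaintance", p_like=0.60, p_comment=0.02, p_share=0.02),
]
for post in sorted(feed, key=lambda p: score(p, closeness), reverse=True):
    print(post.author, round(score(post, closeness), 3))
```

In this toy feed the acquaintance's post loses despite a higher predicted like rate: the closeness multiplier is doing the prioritising the interviewees describe.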
- What Instagram is looking to do is keep you engaged in some way. This is slightly different from asking what do you actually want to see?

The algorithms are the most opaque part of this whole system. You can see the user behaviour, you can see sort of like the consequences of what happens as a result of being exposed to certain content, especially over and over again. But we don't necessarily understand, and some of the companies don't even understand, how content is targeted to different individuals, what choices they made to go down certain paths. And that lack of awareness is where we're the most vulnerable.

I've always had problems with food since I was a teenager. I used to collect 'Vogue' and all fashion magazines anyway. So, I already was aspiring to be model-like. When I started this job, I didn't realise, like, the impact that looking and comparing myself to other people had on my mental health. And my eating disorder got worse over the years.

- I think the thing that's really important for people to understand is how Instagram differs from other social media platforms. TikTok is about performance. It's about doing fun things with your friends. Snapchat is about faces and augmented reality. You know, it's not actually you. It's you with a mask on. Facebook's internal research says that Instagram is about bodies and about social comparison. You know, it's about seeing these little windows into other people's lives and comparing, like, what is your life like to theirs?
The human brain is not evolved to deal with social media. It isn't evolved to look at a screen all day and be able to understand, let's say, that the thing that you're seeing on the other side of the screen isn't someone's real life.

- Everyone that I followed on Instagram then would have been probably, like, from sizes four to six. And that was kind of what I was trying to achieve.

- The algorithms are looking for the content that will lure you in. These algorithms are always looking for the rabbit hole that you are most vulnerable to. You can start with quite innocuous interests, you know, you can search for healthy eating. These rabbit holes normalise the idea that you should be extremely thin and that it's your fault if you're not.

- So, I used to see myself as bigger than what I was. Yeah, it was kind of unconsciously...it started getting bad. I basically just ignored when I was hungry. And when I did eat, it would just be the tiniest amount. I woke up and I was so busy with work that I just put it off for as long as possible. And I feel like that's kind of like the sense of self-harm of having an eating disorder. It's just basically starving yourself. When I look back on pictures now, you can clearly see that I was ill. But when I was in the middle of it all I just didn't really know. I think I thought if I changed myself then I would gain more followers and get all these new opportunities. I just created this, like, vision in my mind that if I went and did this then this would be the outcome. And it just...it was a lie.

- There were things that we were aware of that we would feed back. The hashtag 'thinspo' has just popped up and it should probably be banned.
So, those things would come up and then we'd kind of jump on them and quickly figure out, like, the best policy to do that. But you have to remember, that's alongside, like, loads of other kind of firefighting issues that were popping up. I mean, daily, there were, like, new things coming up that I think were just part of how huge the platform was and how quickly it was growing.

At the time I joined, Instagram had celebrated its third birthday. So, I remember that month I was in California training. And the whole sky was pink. And it just felt like being in an Instagram filter. You know, and then the cab was going down places like Valencia and Dogpatch, and those were names of filters, you know, on Instagram. So, I really felt like I'd somehow descended into, you know, the world of Instagram.

- At the time, we honestly didn't really have a clue how big it was, how much it was growing. I would go to many concerts. As people were taking photos of their time at the show, I would look over and I would see on their phone, "Oh, like, that's Instagram, that's the filter screen."

- So, I was friends with the creators of Instagram. I ended up being invited to a beta of this product. We were certainly plenty naive, and I think we had ideas about the world and about how technology could improve the world.

Priorities were basically, like, make a very simple app whose main goal is to help you share a photo, make it look beautiful. It did create this very nice feedback loop people really enjoyed. I post the photo, I get these likes.

- My name is Cole Rise, and I designed the original icon and seven of the filters. Instagram's brand was about covering up how bad the cameras were in the phones of the day. They were very pixely. So how do you make a great photo?
You put a nice, heavy filter on it that makes it something else. It's not supposed to be true to life.

There was probably, like, a certain je ne sais quoi about Instagram that we would call, like, the Instagrammy-ness. The vibe that we had tried to cultivate and create on the community team. I found these postcards. These are, like...the brand values that we had at the time. Simplicity matters, and inspire creativity, and, my favourite, community first. I still say that a lot today, actually.

- I think so much of the essence of how we built the product was around, like, what's going to be good for a person? Like, what's the next most important thing that we can work on that's going to make someone's life better?

The biggest team, actually, at least in my mind, was the community team, who were all focused on looking at what's happening on Instagram, engaging with active users, starting conversations, highlighting interesting users, picking who was featured and things like that.

We had a number of different ways that we would feature people. The main Instagram account, which was just @Instagram. It was, like, the most followed account in the world. I think it still has a bigger circulation than 'The New York Times'. Like, it was a hugely kind of impactful real estate to share someone's story. And whenever we would share someone's story on there, quite often, you would see their kind of follower count and everything blow up for them. It felt kind of worrying, you know, to kind of be making those editorial decisions. Partly because of the huge influence and impact it could have on someone's life. I think we knew we were on, like, the cusp of something big that was happening and powerful and impactful.
I don't think we were aware, on the Community Team, of how big it was gonna get.

- This world that Instagram had created, where you could present your life as more beautiful and interesting than it actually was. Where you could see into all these really interesting lives and careers, and places in the world, from your phone. That was so exciting that it grew so quickly and caught the attention of Mark Zuckerberg.

Zuckerberg wanted to make sure that he got to Instagram before any of his competitors did and that he made sure that they joined Facebook. So he offered them a billion dollars, which is more than anyone had ever paid for a mobile app before.

- Kevin opens the conversation and says, "You know, we've been talking to Facebook over the last few days, and they made an offer to acquire us." At this moment I think what goes through my head, and what goes through everyone else's head, I believe, is the next sentence will be, "And we rejected them." And then the very next thing Kevin says is, "And we accepted and we signed." I remember walking back from that little circle to our desks again next to one of my co-workers, and he just whispers to me, "This is not how I thought it was going to end." And I think that's what everyone felt. This idea that, like, this ride that we're on, this roller coaster of being in control of our own destiny, growing a small little thing that we'd been working on on our own, was, like, suddenly over.

Instagram essentially had to decide, do we want to go up against Facebook, which is this behemoth? Or is this our time to, one, get a pretty significant payout, and be able to build upon Facebook's existing infrastructure, platform, and then cater to all of Facebook's users? But at the time it was shocking.
It definitely gave me pause and a moment to be like, uh...I don't know if this is ultimately going to be the best thing for this community.

Facebook has tried over and over and over to make a product that will appeal to young people. They had these labs that were turning out new kinds of apps to possibly create a new hit. Nothing really worked. Their main avenue into the youth audience is Instagram and whatever Instagram can create, and in order for Facebook to stay relevant, Instagram needs to maintain that...that cachet with youth.

There was this really burning question from folks on the Facebook side, to understand our own growth in a way that we had never really been as eager about or as specific about. You know, previously, success for us meant that we built a good product, that, like, the things we worked on were doing really well, that, like, people using them felt good about them. Shifting from that mindset to something where success is just measured by numbers in a very plain way. Over time, that became more and more part of the way things were done, things were evaluated, things were designed.

I definitely noticed a change when we launched ads, and when we started to onboard brands onto the platform. Now, looking back, what I've seen or what I've realised is, with the launch of ads and brands on the platform, this kind of tension or friction was created between brands that were almost trying to act like individuals, and individuals that then start to act like brands.

- When I first started doing Instagram, there was no such thing as influencing. I come from a modelling background. A few years after that, Instagram came about. And obviously you like to take pictures, it's a picture app. So, from my modelling history, I was like, OK, great, this is for me.
I was very consistent with Instagram and I enjoyed it. Like, you get free meals, you get to go to restaurants. I've even got holidays for promo, as they call it.

There's so many Instagram stylists. I feel like unless girls see you working with a massive influencer... - Don't wanna work with you. - ..they don't wanna work with you, or they don't want to book your services. They just want to go with who's popping because they've gone there.

- I've got some friends that...they will go to all these Instagram people, and then when the people mess up their hair, they will come back to me. I'm like, "I'm not doing your hair!" You better just... You know your friend does hair, why didn't you go and meet your friend before? Why are you coming to me?

OK, so first of all, I would like to say Instagram World is fake. Instagram's kind of like a facade. It's kind of, like, made the high life look so easily accessible. At 21, I'm getting on a private jet. It's not like that at all.

I remember when people started getting paid to post, and that idea quickly makes what you post a bit artificial, where some people are doing, like, full-on BMW ads and it's not genuine to who they are as a person, because I know that person. They don't drive a BMW. The core of Instagram, as I remember it, was just, "Here's what's happening." But if what's happening is an ad, then there's an artificiality to it.

Look at bloody COVID-19, they was paying influencers to... - Yeah, to say that they're doing it. - They was paying influencers to promote the vaccine. - What?

Instagram followers are very important, especially if you want to be paid as an influencer. My long-term goal for how many followers I want is probably a billion. I want a billion followers.
- Now follower counts have, like, a certain social currency, as if it's some kind of golden ticket to unlocking the universe. The more followers we get, the more comments we get, the better. We have also become growth obsessed, and it's thanks to Facebook and Instagram, in the way that they have let us measure ourselves, that we think that way.

- I decided to get fake followers. I can't remember what the package was. It wasn't even that much. I think I paid, like...maybe, like, £100. And it basically was, like, I got 3,000 likes on each picture, and then I think I gained, like, 10,000 followers, because it did take me from, like, 10,000 to 20,000. But then after that, it was like, why are all my followers Russian and why are all these comments in Russian? Like, what's going on?

We need to get cracking, we've got, like, literally 10 minutes - let's go.

- It's not very hard to get fake followers, fake likes, etcetera, on Instagram. You just...you just need an internet connection and a credit card. I'm Sophie Zhang, or Zhang Sui Feng in Mandarin. I was a data scientist at Facebook on what's called the fake engagement team. I also conducted work on Instagram. Back in late 2019 and early 2020, roughly ten percent of follows on Instagram were fake. This is a very rough estimate, but considerably higher than Facebook. It creates an environment of constant doubt and insecurity and uncertainty.

- I had to go and block all these fake Russian accounts. Every time they commented I had to, like, block them. Lucky I did that, because after I did that, Instagram had this thing where they started shutting down accounts that had fake followers. So, you would see, like, celebrities, for instance.
One of my friends, their Instagram account was just, like, gone the next day. I was like, what? I was like, bro, where's your account? Let's just do one more. And he was like, "Oh, I got hacked", and everyone knew, no, Instagram's going around shutting down fake accounts. So that was quite embarrassing. But luckily for me, I'd got rid of my bots by then. Quite funny. Girl, go and get changed. - Go. - Thank you.

I was pretty naive, because I had no clue what working for Facebook would be like. There is often a conflict between the product side of companies, whose goal is to grow the company, and the integrity side of companies, whose goal is to protect the company from harm. Instagram was more aggressively focused on growing than Facebook, because Instagram has more room to grow. This created a dynamic in which there was more resistance to integrity measures on Instagram than I personally saw on Facebook.

Facebook is a public company, Instagram is part of it, and Instagram needs to be the growth engine for Facebook - to bring in as many young people as they can. And it's becoming more and more important, a bigger slice of that advertising every year.

When Instagram introduced advertising, it was obviously better for people to spend more time in the app, because more time in the app means you see more ads. And the single best way to get you to spend more time in the app is to make your feed more interesting.

One of the kind of trademarks of Instagram when I first joined was that your feed would be in chronological order. But as the platform became more saturated, what we were finding was that more and more people weren't able to actually see a lot of the content that they followed. It became apparent that it would make more sense and actually help users more to order their feed. So that was rolled out.

- There was a team of machine learning engineers who would evaluate the algorithm, and they would compare these through A/B tests to see which was performing better. There was, amongst some users, a negative perception of this. However, if you looked at the metrics, like our engagement metrics, like, those were increasing.
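As a rough illustration of what comparing feed variants "through A/B tests" can mean, here is a sketch using a two-proportion z-test on an engagement rate. The metric definition and every number are made up for the example; this is not Facebook's test harness.

```python
# Illustrative A/B comparison of an engagement metric between a
# chronological feed (control) and a ranked feed (treatment).
# All counts are invented for the sake of the example.
import math

def two_proportion_z(hits_a: int, n_a: int, hits_b: int, n_b: int) -> float:
    """z-statistic for the difference between two engagement rates."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# sessions in which the user engaged with at least one post
control = {"engaged": 41_200, "sessions": 100_000}    # chronological
treatment = {"engaged": 43_900, "sessions": 100_000}  # ranked

z = two_proportion_z(control["engaged"], control["sessions"],
                     treatment["engaged"], treatment["sessions"])
print(f"{control['engaged'] / control['sessions']:.1%} -> "
      f"{treatment['engaged'] / treatment['sessions']:.1%}, z = {z:.1f}")
# A large positive z reads as "the ranked feed performs better",
# whatever users say they feel about the change.
```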
Now you can kind of turn on any time and get something different, which I think, for your brain, it's just this, like, candy that it can't help but go back for. It looks like a logical decision now, looking back, as to, like, why Instagram did it as a business or, like, why at the time it looked good for users. But it just completely changed, obviously, like, how it's being used and what people get out of it. And I think that's impossible to reverse.

There's a vast array of content that you may never have discovered otherwise, unless the algorithm recognises something about your behaviour. But at the same time, depending on the nature of your interests, the things that you might get shown may or may not lead you down certain paths. That can lead to some, I think, both, like, scary places and also ones that are not super, I guess, healthy.

- I'm Abby and I'm 17 years old. I live with my mum and my sister, and all my dogs. A good four years, I've been associated with these anti-recovery Instagram communities. I probably wouldn't have known what self-harm was if I didn't look on them accounts. I knew how long I had to stay under water to drown myself. I knew how many paracetamol I'd have to take to kill myself. I knew how many minutes you'd have to have something around your neck to...ligature. You'd get added into groups and people would tell us to go and kill myself, and they'd want to do it together.
And they would want to come on a video call to self-harm together, which is just...corrupt.

The bullying started, like, year 7, year 8. I was bullied all the way through school for five years. So when I was 14, my school advised that I study at home, due to my mental health deteriorating. I had so much free time, like, I was missing out on six hours a day of education. So, sitting in front of your phone for hours and hours a day, I would make friends on Instagram. There'd be these, like, accounts. They'd put a sad quote and they would put it in with, like, sad-toned music. And they have, like, writing on it about, like, how you're feeling. So, then, you can relate to some of that, so you like it and then...if you let them, then people often follow you if you like them. And then you follow them. And then you get sucked into that community.

- The kinds of problematic content I remember being surprised by were actually around things like self-harm. Enough for you to be like, oh, actually, people are using Instagram in ways that, like, aren't, you know, the sort of, like, nice vision that we have for what it could be and should be. I mean, just from the very earliest days there was a way to report offensive content in the app. You could just tap any photo and say, "Hey, I don't like this, this is not...this is not good." And then we would actually have to review it. We were actually a really tiny team. We would have to, like, look at the photos, which was never fun, block accounts. So there was a little bit of hunting through and just, like, being proactive and reactive to the stuff that was happening.

Dealing with problematic content was something that Facebook was able to help quite a bit with, actually.
They've been running, you know, a huge platform for many more years, and have had to learn and deal with those problems in much bigger numbers than we ever had. I remember that was actually one of the big benefits that, like, we saw looking at Facebook. Being, like, "Oh, wow." Like, they can actually really take this off our shoulders.

By offloading all of that community moderation to Facebook, Instagram detached themselves from one of the most important aspects of making sure that their platform was healthy. Facebook's priority is to have as few humans involved in that process as possible, and to prioritise the issues that are getting the most public attention, which are the issues mostly facing Facebook, not the issues facing Instagram.

Content moderation happens in a combination of different ways. Some content moderation is what's called proactive, where, when content gets posted to the platform, it gets flagged by an algorithm that says, hey, this looks suspicious. Either it's going to be removed from the platform or it'll be sent, perhaps, to some human to review it and say, "Is this actually a problem?" But algorithms are not perfect, right? Algorithms often struggle to take into account social nuances that might determine whether something is problematic or not. Even if you know your AI for detecting harmful content is 99 percent accurate, if people are posting a million bad things a day, which is probably a vast underestimate, getting one percent of them wrong means 10,000 pieces of harmful content are coming through the platform every day. And people are going to be exposed to that.
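The proactive flow and the false-negative arithmetic in that passage can be sketched in a few lines. The thresholds and the routing function are illustrative assumptions; the one-percent-of-a-million figure is the speaker's own.

```python
# Sketch of the proactive moderation flow described above, plus the
# false-negative arithmetic. Thresholds are illustrative assumptions,
# not Facebook's actual values.

def route(suspicion_score: float) -> str:
    """Decide what happens to a freshly posted piece of content."""
    if suspicion_score > 0.95:
        return "remove"        # confident enough to act automatically
    if suspicion_score > 0.60:
        return "human_review"  # flagged as suspicious, a person decides
    return "publish"

print(route(0.97), route(0.70), route(0.10))  # remove human_review publish

# "99 percent accurate" still leaks a lot at this volume:
bad_posts_per_day = 1_000_000        # the speaker's rough figure
missed = bad_posts_per_day * (1 - 0.99)
print(f"{missed:,.0f} harmful posts slip through per day")  # 10,000
```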
I'm just packing my things to go back to placement. I stay at home on the weekend. And then I stay there for the rest of the week with, like, two staff, 'cause I've been poorly. So I've got their support through the week.

End of year 10, year 11, I started really wanting to hurt myself, and I would. But I think, apart from that, then, like, before then I didn't really want to, and I was just doing it because on Instagram it's glamourised. And it's meant to be, like, people make it look good. Like everything on Instagram's glamourised, whether it's the positive life and going shopping, having loads of money, or self-harm.

- We found her phone and we got her password and stuff, and then when we looked into it, saw, like, the full extent of it and messages. "Aw, Abby, cut the cake, cut the cake, go on", referencing her self-harm. It was upsetting. Especially as well 'cause they're vulnerable. And they've been through stuff like that, and having to see the effect it has on them as well. 'Cause it's not just Abby who has that. It's quite a lot of other people.

- I was quite shocked. Naively, I was just thinking that Abby was quite innocently on things.

- Later on, I could see, like, she was just always on her phone, and when she had her phone she seemed to be more heightened. So that's when we came to realise, ah, it's her phone that's doing it.

- You might have a low day, a single low day, and you might go and search for something on Instagram that you might not otherwise search for. Suddenly, the algorithm has a little seed it can start to grow. And because that topic is so intense, you know, as they show you more of it, you engage with it more. A little tiny glimmer and Instagram will pull you in that direction.

- A machine learning model predicts stuff, and in particular, they predict behaviour. Like, think about the risk of using prediction to rank things in terms of a physical public square. The whole square is controlled by one company.
There are thousands of possible routes you can take through that square. And when you come into the square, the first thing that happens is you're handed a particular leaflet, and then how you react to that leaflet determines which leaflet you're handed next. Once you start down a particular route through that square, the leaflets that you see are going to be more and more related to each other. And that might be a good thing. It might be that you get shown a cat leaflet and you really like that cat leaflet, so then you get shown more cat leaflets. But it might also be a dangerous thing...that, again, you find intriguing, and then you're shown more of the same leaflet. And that's the route you then take through the square.

What that means is that you have to have the determination to give the ranking model the same signals consistently over time - that you are not in fact interested in that thing that you were interested in before. And it'll test to see if you're still not interested in them. And if you say, actually, I am interested, then it'll start showing that to you again. Fundamentally we're habitual beings, so it takes real will and resolve to break the self-reinforcing dynamic of using prediction to rank things. It's not impossible, but it asks a real, like, test of human psychology.
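A toy version of that self-reinforcing loop: a single per-topic interest estimate, pulled up by each engagement and only slowly pulled back down by sustained non-engagement. The update rule and learning rate are invented for illustration, but they show why the model needs "the same signals consistently over time" before it lets a topic go.

```python
# Toy model of the self-reinforcing loop: engagement raises a topic's
# estimated interest, which raises how often it is shown, which creates
# more chances to engage. The learning rate is an invented assumption.

interest = {"cats": 0.5, "dieting": 0.5}  # model's estimate per topic

def update(topic: str, engaged: bool, lr: float = 0.2) -> None:
    """Nudge the estimate toward 1 on engagement, toward 0 otherwise."""
    target = 1.0 if engaged else 0.0
    interest[topic] += lr * (target - interest[topic])

# An intense topic hooks the user even when they wish it didn't...
for _ in range(10):
    update("dieting", engaged=True)
print(f"after engaging: {interest['dieting']:.2f}")   # ~0.95

# ...and only sustained, consistent non-engagement unwinds it.
for _ in range(10):
    update("dieting", engaged=False)
print(f"after ignoring: {interest['dieting']:.2f}")   # ~0.10
```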
845 00:33:55,448 --> 00:33:58,241 And, you know, we are very lucky Abby is still here. 846 00:33:58,344 --> 00:34:00,000 Give us a hug. 847 00:34:00,103 --> 00:34:01,896 Right. I'll see you through the week, 848 00:34:02,000 --> 00:34:03,448 - darling. - Alright. 849 00:34:04,758 --> 00:34:06,137 - It's just the thought of everything 850 00:34:06,241 --> 00:34:07,965 that she's been through, you know. 851 00:34:09,413 --> 00:34:11,413 A lot of the people on there, Abby doesn't even know. 852 00:34:11,517 --> 00:34:13,172 She's never met them. 853 00:34:15,241 --> 00:34:17,310 Yeah, I mean she is, she's a strong girl, 854 00:34:17,413 --> 00:34:19,482 and she will get there. 855 00:34:24,517 --> 00:34:25,862 - One of the hardest things 856 00:34:25,965 --> 00:34:28,137 when it comes to moderating Instagram 857 00:34:28,241 --> 00:34:30,655 is the visual nature of the product. 858 00:34:30,758 --> 00:34:34,413 The company simply hasn't figured out well 859 00:34:34,517 --> 00:34:38,689 how to understand the dark corners of Instagram, 860 00:34:38,793 --> 00:34:40,896 that are harder to look for 861 00:34:41,000 --> 00:34:42,862 if you don't know what you're looking for. 862 00:34:42,965 --> 00:34:45,655 The hashtag communities that exist. 863 00:34:45,758 --> 00:34:47,103 If you don't know the right hashtag 864 00:34:47,206 --> 00:34:48,689 you might never find them. 865 00:34:48,793 --> 00:34:51,068 So, it's very easy for somebody 866 00:34:51,172 --> 00:34:53,137 to be doing something that is harmful 867 00:34:53,241 --> 00:34:55,137 and not coming to the attention 868 00:34:55,241 --> 00:34:58,310 of anyone in Menlo Park, California. 869 00:34:58,413 --> 00:34:59,655 - The people that were building 870 00:34:59,758 --> 00:35:01,379 a lot of the social technologies were, 871 00:35:01,482 --> 00:35:03,724 I don't want to say similar to me, 872 00:35:03,827 --> 00:35:06,862 but they were probably more similar to me than different. 873 00:35:06,965 --> 00:35:08,482 And what I mean by that is that we were, 874 00:35:08,586 --> 00:35:10,275 one, interested in technology. 875 00:35:10,379 --> 00:35:13,827 Two, we were pretty literate and educated. 876 00:35:13,931 --> 00:35:18,310 And we...I thought that 877 00:35:18,413 --> 00:35:20,206 by bringing this technology to the world... 878 00:35:22,413 --> 00:35:24,379 ..a lot of things would just sort of sort themselves out. 879 00:35:24,482 --> 00:35:27,448 I'm sure that there was some vague awareness, 880 00:35:27,551 --> 00:35:29,551 perhaps, about that risk 881 00:35:29,655 --> 00:35:31,862 in the early days of social media. 882 00:35:31,965 --> 00:35:32,965 In some ways, 883 00:35:33,068 --> 00:35:34,034 I don't want to say 884 00:35:34,137 --> 00:35:35,586 we were, like, blinded by ambition, 885 00:35:35,689 --> 00:35:36,862 but blinded by the desire 886 00:35:36,965 --> 00:35:38,827 to just kind of get this stuff out there 887 00:35:38,931 --> 00:35:39,965 and then we'd sort it out. 888 00:35:40,068 --> 00:35:41,413 Part of this was a selection bias. 889 00:35:41,517 --> 00:35:42,931 We were seeing people having a great time 890 00:35:43,034 --> 00:35:44,517 and using these things for good. 891 00:35:44,620 --> 00:35:46,620 And I think that, you know, at least for me, 892 00:35:46,724 --> 00:35:48,103 propelled me to continue to believe 893 00:35:48,206 --> 00:35:51,482 that bringing more access to more people was a net good 894 00:35:51,586 --> 00:35:53,000 as opposed to just saying, 895 00:35:53,103 --> 00:35:54,724 "OK, you know what?
Humans aren't ready for this. 896 00:35:54,827 --> 00:35:56,241 "Let's give it up." 897 00:36:04,620 --> 00:36:05,862 What was really interesting 898 00:36:05,965 --> 00:36:09,344 about the way Instagram solved its problems 899 00:36:09,448 --> 00:36:12,689 is they focused on the biggest users first. 900 00:36:12,793 --> 00:36:15,896 People like Taylor Swift and Ariana Grande, 901 00:36:16,000 --> 00:36:18,448 they had such big follower counts 902 00:36:18,551 --> 00:36:22,482 that they also got a bigger dose of the issues. 903 00:36:22,586 --> 00:36:26,310 Taylor Swift was embroiled in a controversy 904 00:36:26,413 --> 00:36:28,931 and people were posting a lot of snakes on her page 905 00:36:29,034 --> 00:36:30,241 and she hated it. 906 00:36:30,344 --> 00:36:32,206 She didn't want to have a platform 907 00:36:32,310 --> 00:36:34,310 that was just full of harassment 908 00:36:34,413 --> 00:36:35,827 every time she logged on. 909 00:36:35,931 --> 00:36:37,586 And so that year, 910 00:36:37,689 --> 00:36:42,586 Instagram decided to allow muting of any words or emoji 911 00:36:42,689 --> 00:36:44,137 that you don't want to see. 912 00:36:50,206 --> 00:36:52,517 Hiding the words in your comments 913 00:36:52,620 --> 00:36:53,724 that make you sad. 914 00:36:53,827 --> 00:36:55,965 Blocking certain people from following you. 915 00:36:56,068 --> 00:36:59,068 Making sure that you can't see likes on those posts. 916 00:36:59,172 --> 00:37:02,827 Well, hiding it doesn't make it go away. 917 00:37:02,931 --> 00:37:04,275 You are still in this world 918 00:37:04,379 --> 00:37:06,896 where everyone's being compared to everyone else, 919 00:37:07,000 --> 00:37:10,862 and that is by its very nature anxiety-inducing. 920 00:37:10,965 --> 00:37:12,206 What happens for some adults, 921 00:37:12,310 --> 00:37:13,482 for some kids, 922 00:37:13,586 --> 00:37:15,931 is that they self-soothe 923 00:37:16,034 --> 00:37:18,344 by scrolling on Instagram. 924 00:37:18,448 --> 00:37:21,586 The only problem is if your self-soothing technique, 925 00:37:21,689 --> 00:37:23,896 you know, sitting and scrolling, 926 00:37:24,000 --> 00:37:26,310 is exposing you to more of the content 927 00:37:26,413 --> 00:37:28,862 that is making you sick, 928 00:37:28,965 --> 00:37:31,379 that's not a good coping mechanism. 929 00:37:43,344 --> 00:37:44,724 - Having an eating disorder, 930 00:37:44,827 --> 00:37:46,379 it's really easy for you 931 00:37:46,482 --> 00:37:48,241 to fall into...um, 932 00:37:48,344 --> 00:37:50,482 it's kind of like a trigger hole. 933 00:37:50,586 --> 00:37:52,896 Once I see one thing that's triggering, 934 00:37:53,000 --> 00:37:55,310 it's kind of easy to then go and find things 935 00:37:55,413 --> 00:37:56,482 that are also triggering, 936 00:37:56,586 --> 00:37:59,034 just because that part of your brain likes it. 937 00:38:00,034 --> 00:38:03,413 I get food pictures pop up 938 00:38:03,517 --> 00:38:07,517 and they've got calorie counts on each side. 939 00:38:07,620 --> 00:38:10,172 And it will say, like, which one is more healthy 940 00:38:10,275 --> 00:38:13,793 or how you can swap out different foods 941 00:38:13,896 --> 00:38:16,862 so that your calorie count is lower. 942 00:38:16,965 --> 00:38:18,689 Being in a trigger hole 943 00:38:18,793 --> 00:38:20,310 can have, like, serious effects 944 00:38:20,413 --> 00:38:23,413 on what happens next in your life. 945 00:38:23,517 --> 00:38:25,000 Like, I might see something 946 00:38:25,103 --> 00:38:26,931 and then go and not eat dinner. 
947 00:38:28,172 --> 00:38:29,965 - In a sense, there is a feedback loop. 948 00:38:30,068 --> 00:38:31,448 The algorithm shows you some things 949 00:38:31,551 --> 00:38:33,034 that it thinks you're likely to interact with. 950 00:38:33,137 --> 00:38:34,448 You may interact with those things, 951 00:38:34,551 --> 00:38:36,517 and then it'll try to show you more of those things 952 00:38:36,620 --> 00:38:37,620 as you interact with it more. 953 00:38:37,724 --> 00:38:40,379 That feedback loop is reasonably well understood. 954 00:38:40,482 --> 00:38:41,793 I think the aspect of it 955 00:38:41,896 --> 00:38:43,344 that's a little harder to understand 956 00:38:43,448 --> 00:38:46,965 is they will actually change your own behaviour over time 957 00:38:47,068 --> 00:38:48,827 by showing you certain things. 958 00:38:48,931 --> 00:38:51,448 - If I'm scrolling 13 hours a day, 959 00:38:51,551 --> 00:38:54,137 it kind of...no wonder 960 00:38:54,241 --> 00:38:57,793 that it had a negative effect on my mental health. 961 00:38:57,896 --> 00:39:00,931 All the images would be quite highly edited. 962 00:39:01,034 --> 00:39:02,379 I was, like, comparing myself 963 00:39:02,482 --> 00:39:05,758 to this false person that I was looking at, 964 00:39:05,862 --> 00:39:09,758 and not really realising that that's what I was doing. 965 00:39:09,862 --> 00:39:12,172 I always used to use filters on Instagram, 966 00:39:12,275 --> 00:39:14,827 and I didn't realise how damaging they were. 967 00:39:14,931 --> 00:39:16,758 But I always used to put, like, 968 00:39:16,862 --> 00:39:18,172 my pictures in Facetune 969 00:39:18,275 --> 00:39:20,620 and, like, make my waist smaller 970 00:39:20,724 --> 00:39:22,034 than what it was. 971 00:39:22,137 --> 00:39:24,586 The underlying goal has always been the same. 972 00:39:24,689 --> 00:39:26,000 How do I look best? 973 00:39:26,103 --> 00:39:29,000 And maybe we need fewer, you know, 974 00:39:29,103 --> 00:39:30,689 contrast filters and saturation filters 975 00:39:30,793 --> 00:39:31,689 and all this stuff. 976 00:39:31,793 --> 00:39:33,068 And, oh, take away my freckles 977 00:39:33,172 --> 00:39:35,275 or take away my, like, wrinkles or something. 978 00:39:35,379 --> 00:39:38,275 And how do you guarantee, like, a selfie to look good? 979 00:39:38,379 --> 00:39:41,586 That was a problem we had to try and solve. 980 00:39:41,689 --> 00:39:43,206 Making sure that, yeah, 981 00:39:43,310 --> 00:39:46,896 when you took a photo, your skin looked good, 982 00:39:47,000 --> 00:39:48,862 so all the blemishes go away. 983 00:39:53,931 --> 00:39:56,758 Oh, wow. Um... 984 00:39:56,862 --> 00:39:58,068 So, I have a two-year-old, 985 00:39:58,172 --> 00:40:01,448 in 16 years when she's 18... 986 00:40:01,551 --> 00:40:03,586 As a parent, I want to educate my daughter to say, 987 00:40:03,689 --> 00:40:04,689 "You know what, it's OK. 988 00:40:04,793 --> 00:40:08,172 "Beauty is up to you to define, to decide. 989 00:40:08,275 --> 00:40:12,103 "It's not something that is defined by someone else." 990 00:40:12,206 --> 00:40:14,034 You make people aware that there is a filter. 991 00:40:14,137 --> 00:40:15,448 That's good, 992 00:40:15,551 --> 00:40:18,103 because they know that there's some sense of, 993 00:40:18,206 --> 00:40:19,827 like, this image has been doctored. 994 00:40:19,931 --> 00:40:21,482 I think it's our responsibility 995 00:40:21,586 --> 00:40:24,103 to...to tell people that. 
996 00:40:25,862 --> 00:40:27,103 - During COVID, 997 00:40:27,206 --> 00:40:29,344 I made friends with a neighbour... 998 00:40:30,482 --> 00:40:32,172 ..and she'd been through 999 00:40:32,275 --> 00:40:33,931 eating disorder recovery before, 1000 00:40:34,034 --> 00:40:35,344 she'd been an inpatient. 1001 00:40:35,448 --> 00:40:37,655 And she said, like, 1002 00:40:37,758 --> 00:40:40,379 "You're actually ill," like, 1003 00:40:40,482 --> 00:40:42,724 "You're ill, you need to get help". 1004 00:40:44,413 --> 00:40:46,965 Something just clicked and I was like... 1005 00:40:49,172 --> 00:40:51,517 "Yeah, I think actually, you're right." 1006 00:40:51,620 --> 00:40:55,000 And, yeah, I then got help from there. 1007 00:41:00,586 --> 00:41:03,827 Right, I'm filming now and I'm not stopping it. 1008 00:41:03,931 --> 00:41:06,275 I've stopped this video about 100 times. 1009 00:41:07,517 --> 00:41:10,551 Hi everyone, welcome back to my IGTV. 1010 00:41:10,655 --> 00:41:11,896 So, this week 1011 00:41:12,000 --> 00:41:14,000 is Eating Disorder Awareness Week, 1012 00:41:14,103 --> 00:41:15,448 if you didn't know already. 1013 00:41:15,551 --> 00:41:18,655 The content I post now is a lot more uplifting, 1014 00:41:18,758 --> 00:41:20,310 a lot more positive, 1015 00:41:20,413 --> 00:41:22,931 kind of still having that fashion element, 1016 00:41:23,034 --> 00:41:25,206 but I also now talk about healing. 1017 00:41:28,379 --> 00:41:29,275 When you look back 1018 00:41:29,379 --> 00:41:30,862 at your content from then, like, 1019 00:41:30,965 --> 00:41:33,103 what do you think about that girl? 1020 00:41:34,689 --> 00:41:36,034 - Um... 1021 00:41:36,137 --> 00:41:37,931 I feel like I have, like, 1022 00:41:38,034 --> 00:41:40,000 quite a lot of compassion for that girl. 1023 00:41:40,103 --> 00:41:41,172 Because I feel like she was, like, 1024 00:41:41,275 --> 00:41:45,206 striving to achieve this level of, like, perfection 1025 00:41:45,310 --> 00:41:47,931 that just was never going to happen. 1026 00:41:48,034 --> 00:41:51,586 But she also, like, worked so hard to, like, 1027 00:41:51,689 --> 00:41:53,758 get to where she was. 1028 00:41:53,862 --> 00:41:55,586 I'm getting upset. 1029 00:42:01,724 --> 00:42:03,000 Yeah... 1030 00:42:07,931 --> 00:42:11,000 When I look back at that girl I just think... 1031 00:42:12,655 --> 00:42:16,103 ..I just wish that someone would have been, like, 1032 00:42:16,206 --> 00:42:18,448 "Hey, it's OK," like, 1033 00:42:18,551 --> 00:42:20,413 "there's something wrong here, 1034 00:42:20,517 --> 00:42:22,034 "but it's all going to be OK", 1035 00:42:22,137 --> 00:42:23,137 you know? 1036 00:42:30,379 --> 00:42:31,862 - I don't know if it's an unusual thing 1037 00:42:31,965 --> 00:42:34,275 about the nature of work as an influencer. 1038 00:42:34,379 --> 00:42:37,620 But there are no built-in protections. 1039 00:42:37,724 --> 00:42:39,793 You're competing with literally millions of people 1040 00:42:39,896 --> 00:42:41,413 on a scale that most humans 1041 00:42:41,517 --> 00:42:43,000 have never had to confront before. 1042 00:42:43,103 --> 00:42:45,862 Imagine constructing your world 1043 00:42:45,965 --> 00:42:48,206 such that you create things that happen 1044 00:42:48,310 --> 00:42:50,068 so you can take photos of them 1045 00:42:50,172 --> 00:42:51,758 in order for them to get likes. 1046 00:42:51,862 --> 00:42:53,206 So then you can get the validation 1047 00:42:53,310 --> 00:42:55,896 to stay at a certain rung on a ladder.
1048 00:42:56,000 --> 00:42:57,379 That if you stop doing that, 1049 00:42:57,482 --> 00:42:59,241 then you lose all of your self-worth, 1050 00:42:59,344 --> 00:43:01,068 and you'll start to question your value. 1051 00:43:01,172 --> 00:43:03,482 It's a really tough place to be in. 1052 00:43:03,586 --> 00:43:05,344 It's in the very structure of Instagram 1053 00:43:05,448 --> 00:43:07,379 that these problems have come about. 1054 00:43:07,482 --> 00:43:10,000 And so if you don't change that, 1055 00:43:10,103 --> 00:43:12,655 you really can't change the nature 1056 00:43:12,758 --> 00:43:14,206 of how people feel about being compared 1057 00:43:14,310 --> 00:43:15,793 to others all the time. 1058 00:43:22,793 --> 00:43:23,793 - My name's Callum, 1059 00:43:23,896 --> 00:43:25,827 I work for the Centre for Countering Digital Hate 1060 00:43:25,931 --> 00:43:27,310 as its Head of Research. 1061 00:43:27,413 --> 00:43:30,620 We've investigated a range of social media platforms. 1062 00:43:30,724 --> 00:43:31,896 We look at how their algorithms work 1063 00:43:32,000 --> 00:43:34,310 and what sort of content that they promote, 1064 00:43:34,413 --> 00:43:35,620 and whether they're profiting from things 1065 00:43:35,724 --> 00:43:36,758 that they shouldn't be. 1066 00:43:36,862 --> 00:43:38,551 A big part of Instagram's business model 1067 00:43:38,655 --> 00:43:40,758 is that it has an understanding 1068 00:43:40,862 --> 00:43:42,103 of what users are interested in. 1069 00:43:42,206 --> 00:43:43,482 And what it says to advertisers 1070 00:43:43,586 --> 00:43:45,379 is put your advertising on our platform 1071 00:43:45,482 --> 00:43:47,620 and we'll make sure it's shown to people 1072 00:43:47,724 --> 00:43:50,620 who are most likely to engage with this. 1073 00:43:51,896 --> 00:43:53,344 We set up an account to investigate 1074 00:43:53,448 --> 00:43:55,034 what's been called bigorexia, 1075 00:43:55,137 --> 00:43:56,965 and that's primarily men who become obsessed 1076 00:43:57,068 --> 00:43:59,034 with becoming very big and muscular, 1077 00:43:59,137 --> 00:44:01,310 obsessed with bodybuilding. 1078 00:44:01,413 --> 00:44:03,482 What we found was that the algorithm 1079 00:44:03,586 --> 00:44:06,448 would push bodybuilding content very, very hard 1080 00:44:06,551 --> 00:44:08,896 to accounts that began to follow that content. 1081 00:44:09,000 --> 00:44:11,689 To the extent that the Explore page on Instagram, 1082 00:44:11,793 --> 00:44:15,448 80 percent of the posts on there recommended to you 1083 00:44:15,551 --> 00:44:19,344 would feature pretty extreme bodybuilding physiques. 1084 00:44:20,689 --> 00:44:23,241 This has been likened to a form of body dysmorphia, 1085 00:44:23,344 --> 00:44:25,724 and increasingly is thought to be harmful. 1086 00:44:25,827 --> 00:44:28,896 And so the signs are that the algorithm 1087 00:44:29,000 --> 00:44:30,689 is pushing men towards that content 1088 00:44:30,793 --> 00:44:32,275 and is encouraging them to see 1089 00:44:32,379 --> 00:44:34,655 this pretty extraordinary body shape 1090 00:44:34,758 --> 00:44:38,068 as ordinary, normal and easily attainable. 1091 00:44:38,172 --> 00:44:41,413 It was getting adverts for health supplements, 1092 00:44:41,517 --> 00:44:45,068 for diet plans, for workout plans.
1093 00:44:45,172 --> 00:44:47,448 So none of which is harmful in itself, 1094 00:44:47,551 --> 00:44:49,413 but it is another example of how Instagram 1095 00:44:49,517 --> 00:44:51,517 turns that time spent on the platform 1096 00:44:51,620 --> 00:44:54,620 looking at this content into money through advertising. 1097 00:44:54,724 --> 00:44:56,931 If you're Instagram, strategically, 1098 00:44:57,034 --> 00:44:59,551 the ordinary user doesn't matter 1099 00:44:59,655 --> 00:45:02,413 as much as the advertiser. 1100 00:45:02,517 --> 00:45:06,068 Unless you're thinking about them en masse. 1101 00:45:06,172 --> 00:45:07,827 They need more ordinary users 1102 00:45:07,931 --> 00:45:09,275 spending time on Instagram 1103 00:45:09,379 --> 00:45:12,482 to be the ones buying the products. 1104 00:45:12,586 --> 00:45:14,482 I mean, the most effective ads 1105 00:45:14,586 --> 00:45:15,827 and brand marketing 1106 00:45:15,931 --> 00:45:18,413 is content that makes you feel something, 1107 00:45:18,517 --> 00:45:20,482 that causes a reaction or a response. 1108 00:45:20,586 --> 00:45:22,206 Instagram was one of the first platforms 1109 00:45:22,310 --> 00:45:24,379 where I think that creator class 1110 00:45:24,482 --> 00:45:26,827 really realised that the best ads 1111 00:45:26,931 --> 00:45:28,655 are ads that don't feel like ads at all. 1112 00:45:32,517 --> 00:45:35,241 - So my natural body type is very slim. 1113 00:45:35,344 --> 00:45:38,620 And when I was younger skinny girls were in. 1114 00:45:38,724 --> 00:45:41,448 But when you came to Instagram now 1115 00:45:41,551 --> 00:45:43,344 the curvy girl was cool. 1116 00:45:43,448 --> 00:45:46,724 And then with the whole advertising of, 1117 00:45:46,827 --> 00:45:50,172 oh, Brazilian butt lift, BBL, you can do this, 1118 00:45:50,275 --> 00:45:51,517 you can do that. 1119 00:45:51,620 --> 00:45:53,379 And if you don't know what a BBL is, 1120 00:45:53,482 --> 00:45:56,793 they transfer fat from other areas of your body 1121 00:45:56,896 --> 00:45:58,413 and they pull it into your bum 1122 00:45:58,517 --> 00:45:59,931 and then they sculpt you. 1123 00:46:00,034 --> 00:46:01,344 I was thinking, 1124 00:46:01,448 --> 00:46:03,931 OK, curvy girls are in. 1125 00:46:04,034 --> 00:46:05,517 If I'm curvier I'm going to get more followers. 1126 00:46:05,620 --> 00:46:07,379 If I get more followers I'm going to get more money. 1127 00:46:07,482 --> 00:46:08,344 If I get more money 1128 00:46:08,448 --> 00:46:10,793 that's a better future for my children. 1129 00:46:15,620 --> 00:46:18,275 I'm Chenade Laroy and I'm out here in Istanbul, 1130 00:46:18,379 --> 00:46:21,724 and I've just had my BBL surgery, 1131 00:46:21,827 --> 00:46:23,344 hence why I'm a little bit stiff. 1132 00:46:23,448 --> 00:46:27,896 I got my BBL surgery done by a company. 1133 00:46:28,000 --> 00:46:30,344 They expected from me that I post stories 1134 00:46:30,448 --> 00:46:32,931 and post my results of before and after. 1135 00:46:33,034 --> 00:46:34,448 In return for that, 1136 00:46:34,551 --> 00:46:37,482 they would give me a reduced rate. 1137 00:46:37,586 --> 00:46:42,068 It turned out to be something I wouldn't do again. 1138 00:46:42,172 --> 00:46:44,827 - Hi. How are you, babe? - Hello! 1139 00:46:46,724 --> 00:46:48,206 - How are you? - Great, thanks. 1140 00:46:48,310 --> 00:46:49,862 Get some water if you like, some biscuits. 1141 00:46:49,965 --> 00:46:51,241 Yeah, definitely need some water. 
1142 00:46:51,344 --> 00:46:52,896 After my surgery, when I came back to England, 1143 00:46:53,000 --> 00:46:54,000 my body was perfect. 1144 00:46:54,103 --> 00:46:56,379 And it was only when I was having my massages done, 1145 00:46:56,482 --> 00:46:58,103 one of the masseuses was like, 1146 00:46:58,206 --> 00:47:00,000 "I think you still have fluid left inside you." 1147 00:47:08,793 --> 00:47:09,827 Can you see like here? 1148 00:47:09,931 --> 00:47:12,034 It looks like I've got, like, a small baby. 1149 00:47:14,275 --> 00:47:16,931 And the thing is, like, people knew me on Instagram, 1150 00:47:17,034 --> 00:47:18,655 'cause I've always been into my health and fitness. 1151 00:47:18,758 --> 00:47:20,448 I've always had a six pack. 1152 00:47:20,551 --> 00:47:22,862 I did go onto my Instagram, put up a whole post. 1153 00:47:22,965 --> 00:47:24,931 I had a negative experience, 1154 00:47:25,034 --> 00:47:27,758 so I owe it to my audience to be honest and say, 1155 00:47:27,862 --> 00:47:29,517 "This is what happened to me." 1156 00:47:29,620 --> 00:47:32,275 You need to not fall into the hype like I did. 1157 00:47:32,379 --> 00:47:36,517 You need to actually think about the whole process. 1158 00:47:36,620 --> 00:47:38,379 It's one of the most dangerous procedures, 1159 00:47:38,482 --> 00:47:39,931 and obviously, like, I have two children. 1160 00:47:40,034 --> 00:47:42,034 It's very irresponsible of me, 1161 00:47:42,137 --> 00:47:43,827 especially being a mum. 1162 00:47:43,931 --> 00:47:45,172 But if I'm honest, 1163 00:47:45,275 --> 00:47:47,344 I feel like I probably still would've done it. 1164 00:47:47,448 --> 00:47:49,206 It's the advertising. 1165 00:47:49,310 --> 00:47:50,241 And like, 1166 00:47:50,344 --> 00:47:52,068 "Oh, but he looks good, and her body looks good." 1167 00:47:52,172 --> 00:47:53,724 And this is why, like, for me, 1168 00:47:53,827 --> 00:47:55,241 I had to go through my Instagram 1169 00:47:55,344 --> 00:47:57,517 and filter who I follow. 1170 00:48:20,620 --> 00:48:21,827 When it comes to the algorithm 1171 00:48:21,931 --> 00:48:23,724 that's where safety has to be paramount, 1172 00:48:23,827 --> 00:48:25,034 because that's where the platforms 1173 00:48:25,137 --> 00:48:28,931 take an active role in presenting content to users 1174 00:48:29,034 --> 00:48:30,344 and it can cause harm to them. 1175 00:48:37,896 --> 00:48:40,172 Yes, influencers have a responsibility 1176 00:48:40,275 --> 00:48:41,827 to their public, to their audience, 1177 00:48:41,931 --> 00:48:43,310 especially if they're younger people. 1178 00:48:43,413 --> 00:48:45,137 However, the government 1179 00:48:45,241 --> 00:48:47,241 and the people that actually run Instagram, 1180 00:48:47,344 --> 00:48:49,620 that's where the responsibility really falls. 1181 00:48:49,724 --> 00:48:51,344 If something is that detrimental, 1182 00:48:51,448 --> 00:48:53,241 you have the algorithms, you have the numbers. 1183 00:48:53,344 --> 00:48:55,724 You're seeing everything from the top. 1184 00:48:55,827 --> 00:48:58,689 They have to control it. They have to figure out. 1185 00:49:14,896 --> 00:49:15,827 It's not the case 1186 00:49:15,931 --> 00:49:18,206 that social media companies like Instagram 1187 00:49:18,310 --> 00:49:20,034 are just not thinking about all the ways 1188 00:49:20,137 --> 00:49:22,793 in which their content could harm people - they are. 
1189 00:49:22,896 --> 00:49:25,206 They write rules which are publicly available, 1190 00:49:25,310 --> 00:49:26,655 and on the basis of those rules, 1191 00:49:26,758 --> 00:49:30,310 content is completely removed by people from the site. 1192 00:49:30,413 --> 00:49:31,689 But the other way they're thinking 1193 00:49:31,793 --> 00:49:34,034 about reducing content that's harmful to people 1194 00:49:34,137 --> 00:49:35,965 is by training machine learning models 1195 00:49:36,068 --> 00:49:38,310 to predict what kinds of content 1196 00:49:38,413 --> 00:49:40,241 meet certain definitions, 1197 00:49:40,344 --> 00:49:44,965 and then demote that on people's Instagram feeds. 1198 00:49:45,068 --> 00:49:46,586 But the crucial question about that - 1199 00:49:46,689 --> 00:49:48,965 and this is a question for society - 1200 00:49:49,068 --> 00:49:51,586 is who should get to decide? 1201 00:49:51,689 --> 00:49:52,827 What are the concepts 1202 00:49:52,931 --> 00:49:54,827 that we should train machine learning models 1203 00:49:54,931 --> 00:49:57,275 to demote content on the basis of? 1204 00:49:57,379 --> 00:49:59,379 Should that be the companies themselves? 1205 00:49:59,482 --> 00:50:02,103 Should it be legislators or regulators? 1206 00:50:02,206 --> 00:50:05,034 And how should we know what those concepts are? 1207 00:50:05,137 --> 00:50:06,034 'Cause right now, 1208 00:50:06,137 --> 00:50:08,068 we don't know all that much about them. 1209 00:50:21,482 --> 00:50:22,896 - What's going on, Abby? 1210 00:50:26,034 --> 00:50:28,379 Have a think of 10 different things 1211 00:50:28,482 --> 00:50:31,000 that are important or special about you 1212 00:50:31,103 --> 00:50:34,931 that nobody would know about unless you told them. 1213 00:50:35,034 --> 00:50:36,137 I'm living at home now, 1214 00:50:36,241 --> 00:50:38,482 so I've moved from my placement. 1215 00:50:38,586 --> 00:50:42,068 I didn't have my phone from my first placement. 1216 00:50:42,172 --> 00:50:45,482 I'd say about 14, 15 months, I didn't have my phone for. 1217 00:50:45,586 --> 00:50:47,586 They knew what was going on with social media, 1218 00:50:47,689 --> 00:50:48,827 stuff like that, so it was like, 1219 00:50:48,931 --> 00:50:50,137 "No, we're taking your phone off you, 1220 00:50:50,241 --> 00:50:53,206 "um, because it isn't making you any better, 1221 00:50:53,310 --> 00:50:54,655 "it's just making you worse". 1222 00:50:55,862 --> 00:50:57,379 Not having access to them groups 1223 00:50:57,482 --> 00:50:58,758 as such was very good, 1224 00:50:58,862 --> 00:51:00,241 like, still now I'm very grateful 1225 00:51:00,344 --> 00:51:01,517 that I didn't have my phone. 1226 00:51:01,620 --> 00:51:03,206 I'm not in them groups anymore. 1227 00:51:03,310 --> 00:51:04,551 If I get added, I'll leave. 1228 00:51:04,655 --> 00:51:07,241 And then look at her face when she turns. 1229 00:51:07,344 --> 00:51:10,103 - With Abby's phone and that it is a bit of a concern, 1230 00:51:10,206 --> 00:51:11,448 'cause you don't know who's on there, 1231 00:51:11,551 --> 00:51:12,655 who she's messaging. 1232 00:51:12,758 --> 00:51:14,068 So it makes you quite worried, 1233 00:51:14,172 --> 00:51:17,344 what she's doing by herself and on her phone. 1234 00:51:26,586 --> 00:51:27,862 - As soon as you open Instagram 1235 00:51:27,965 --> 00:51:29,068 it's the feed page you see 1236 00:51:29,172 --> 00:51:31,068 and not the group chats.
1237 00:51:31,172 --> 00:51:33,275 I went on Instagram, like, about two weeks ago 1238 00:51:33,379 --> 00:51:37,379 and there was someone posting, like, 1239 00:51:37,482 --> 00:51:40,172 a few screen recordings of these videos - 1240 00:51:40,275 --> 00:51:43,068 people committing suicide who'd videoed their suicides. 1241 00:51:44,344 --> 00:51:47,068 I don't know how them videos were uploaded. 1242 00:51:47,172 --> 00:51:48,896 I didn't really look into it, 1243 00:51:49,000 --> 00:51:51,206 'cause I was so traumatised. 1244 00:51:51,310 --> 00:51:53,413 And one of them there was this woman 1245 00:51:53,517 --> 00:51:54,827 and she was hanging herself 1246 00:51:54,931 --> 00:51:56,310 and she was crying and all this. 1247 00:51:57,724 --> 00:51:59,103 It's getting out of hand. 1248 00:51:59,206 --> 00:52:00,931 When I first went on the communities 1249 00:52:01,034 --> 00:52:03,103 it was like, yeah, people posting pictures 1250 00:52:03,206 --> 00:52:05,655 of self-harm and that but now it's, like... 1251 00:52:06,724 --> 00:52:08,103 I mean, that is bad in itself, 1252 00:52:08,206 --> 00:52:09,586 but now it's like extremes 1253 00:52:09,689 --> 00:52:12,068 and it's just getting, spiralling out of control. 1254 00:52:12,172 --> 00:52:14,413 And I think people can't control 1255 00:52:14,517 --> 00:52:15,586 what people put on anymore, 1256 00:52:15,689 --> 00:52:18,034 because they didn't stop it in the first place. 1257 00:52:18,137 --> 00:52:21,103 - Obviously, no social media would be the best solution, 1258 00:52:21,206 --> 00:52:24,275 but you can't take somebody's phone off them forever, 1259 00:52:24,379 --> 00:52:25,758 their social media off them forever. 1260 00:52:25,862 --> 00:52:28,310 So I think certain guidelines need to be put in place 1261 00:52:28,413 --> 00:52:31,413 to stop this harm towards people. 1262 00:52:47,931 --> 00:52:49,827 I don't think they really put a lot of safeguards in place, 1263 00:52:49,931 --> 00:52:52,379 so all it would be is a sensitive warning 1264 00:52:52,482 --> 00:52:53,482 on this video. 1265 00:52:53,586 --> 00:52:55,103 All you have to do is click yes, 1266 00:52:55,206 --> 00:52:56,103 do you know what I mean? 1267 00:52:56,206 --> 00:52:58,034 It's not...it just shouldn't be up there, 1268 00:52:58,137 --> 00:53:01,034 that video should not be on social media. 1269 00:53:03,000 --> 00:53:04,724 - I think we're starting to have 1270 00:53:04,827 --> 00:53:06,275 a reckoning as a society 1271 00:53:06,379 --> 00:53:10,586 for the power that Instagram holds 1272 00:53:10,689 --> 00:53:15,241 over young people and over, really, all of us. 1273 00:53:15,344 --> 00:53:17,137 In the last few years, 1274 00:53:17,241 --> 00:53:18,413 employees have realised 1275 00:53:18,517 --> 00:53:20,862 that the power that this company holds 1276 00:53:20,965 --> 00:53:25,000 is simply too large, too important to keep secret. 1277 00:53:33,068 --> 00:53:35,137 - I worked at Facebook for approximately two years, 1278 00:53:35,241 --> 00:53:38,896 starting in June of 2019. 1279 00:53:39,000 --> 00:53:41,068 And it really struck me, 1280 00:53:41,172 --> 00:53:42,275 when I worked at Facebook, 1281 00:53:42,379 --> 00:53:44,068 that there were a number of systemic problems 1282 00:53:44,172 --> 00:53:45,379 that I didn't believe 1283 00:53:45,482 --> 00:53:47,000 Facebook could solve on its own.
1284 00:53:55,034 --> 00:53:56,482 Facebook's research says 1285 00:53:56,586 --> 00:53:58,000 that in the United States 1286 00:53:58,103 --> 00:53:59,517 six percent of teenagers 1287 00:53:59,620 --> 00:54:00,793 who thought about 1288 00:54:00,896 --> 00:54:01,931 suicidal ideation said that 1289 00:54:02,034 --> 00:54:03,517 it was driven by Instagram, 1290 00:54:03,620 --> 00:54:05,482 and 13 percent in the UK. 1291 00:54:07,241 --> 00:54:08,137 Instagram makes 1292 00:54:08,241 --> 00:54:09,517 body image issues worse 1293 00:54:09,620 --> 00:54:11,103 in one in three teen girls. 1294 00:54:33,551 --> 00:54:35,655 The documents describe an addict's narrative. 1295 00:54:35,758 --> 00:54:37,344 No, I don't feel good when I use this. 1296 00:54:37,448 --> 00:54:38,724 I can't stop. 1297 00:54:38,827 --> 00:54:40,344 I want to stop. 1298 00:54:40,448 --> 00:54:42,965 If I leave, I'll be ostracised. 1299 00:54:43,068 --> 00:54:44,586 And when they go to their parents 1300 00:54:44,689 --> 00:54:46,137 and talk about these problems, 1301 00:54:46,241 --> 00:54:50,000 because their parents never lived this experience, 1302 00:54:50,103 --> 00:54:54,344 Facebook's own documents say kids are suffering alone. 1303 00:54:54,448 --> 00:54:55,896 At some point I realised 1304 00:54:56,000 --> 00:54:58,103 that the only way these problems 1305 00:54:58,206 --> 00:54:59,241 were going to be solved 1306 00:54:59,344 --> 00:55:01,413 was if we stepped out of the frame. 1307 00:55:01,517 --> 00:55:03,758 Facebook is trying to solve these problems in private. 1308 00:55:03,862 --> 00:55:05,482 And that's part of why I came forward. 1309 00:55:05,586 --> 00:55:07,655 We have to solve these problems together. 1310 00:55:07,758 --> 00:55:09,862 Facebook and Instagram have a responsibility 1311 00:55:09,965 --> 00:55:12,137 to better understand the impacts 1312 00:55:12,241 --> 00:55:13,551 that they're having on the world. 1313 00:55:13,655 --> 00:55:15,724 But, in my mind, that's not enough. 1314 00:55:15,827 --> 00:55:19,379 What really needs to be part of the solution 1315 00:55:19,482 --> 00:55:21,586 is a much broader conversation with regulators, 1316 00:55:21,689 --> 00:55:22,862 with the public, 1317 00:55:22,965 --> 00:55:26,034 in terms of can we actually envision a world 1318 00:55:26,137 --> 00:55:28,172 where Facebook isn't just left alone 1319 00:55:28,275 --> 00:55:31,034 to do whatever it wants to its platforms? 1320 00:55:31,137 --> 00:55:32,586 But there's actually some way 1321 00:55:32,689 --> 00:55:34,206 that we have some transparency 1322 00:55:34,310 --> 00:55:35,241 into what they're doing. 1323 00:55:35,344 --> 00:55:36,620 And I think people are starting to realise that, 1324 00:55:36,724 --> 00:55:39,517 actually, there's a way that this could be done better. 1325 00:55:47,344 --> 00:55:49,689 - I'm literally a completely different person. 1326 00:55:49,793 --> 00:55:51,862 Every single thing is different about me now 1327 00:55:51,965 --> 00:55:54,655 because I've been through so much therapy. 1328 00:55:54,758 --> 00:55:57,034 My whole life has changed. 1329 00:56:00,655 --> 00:56:05,793 I've decided more recently to not edit any of my content. 1330 00:56:05,896 --> 00:56:07,068 So... 1331 00:56:07,172 --> 00:56:08,827 Oh, this is cute. 1332 00:56:08,931 --> 00:56:11,689 ..I just have it straight from my iPhone camera 1333 00:56:11,793 --> 00:56:13,689 to Instagram, 1334 00:56:13,793 --> 00:56:15,137 not being touched. 
1335 00:56:15,241 --> 00:56:18,172 I don't want to add to the problem, 1336 00:56:18,275 --> 00:56:21,482 that is probably never going to go away, 1337 00:56:21,586 --> 00:56:27,206 by showing people this false image of myself. 1338 00:56:28,655 --> 00:56:30,827 I feel a lot happier and healthier. 1339 00:56:30,931 --> 00:56:33,586 Like, I literally always smile... 1340 00:56:35,448 --> 00:56:38,344 ..which I never really did before. 1341 00:56:38,448 --> 00:56:41,137 - I hope that we will be able to find ways 1342 00:56:41,241 --> 00:56:43,068 to feel interconnected in a positive way 1343 00:56:43,172 --> 00:56:45,379 that benefits society at large. 1344 00:56:45,482 --> 00:56:47,275 And I think that there are ways 1345 00:56:47,379 --> 00:56:49,241 that social media platforms can do that. 1346 00:56:49,344 --> 00:56:51,655 But they do need to change. 1347 00:56:53,448 --> 00:56:54,931 - Now I'm in college. 1348 00:56:55,034 --> 00:56:57,344 I'm doing my level three Health and Social Care, 1349 00:56:57,448 --> 00:56:59,172 so I can become a Mental Health nurse. 1350 00:56:59,275 --> 00:57:01,241 I think people need to take responsibility 1351 00:57:01,344 --> 00:57:02,344 for what they're posting 1352 00:57:02,448 --> 00:57:04,758 and acknowledge how it can impact others. 1353 00:57:04,862 --> 00:57:07,413 Instagram could try and monitor this 1354 00:57:07,517 --> 00:57:08,827 a lot more closely 1355 00:57:08,931 --> 00:57:10,620 and try and tackle this problem 1356 00:57:10,724 --> 00:57:12,379 as soon as possible. 1357 00:57:12,482 --> 00:57:16,241 I don't know, like, who is responsible. 1358 00:57:16,344 --> 00:57:18,655 I do think it will take us a while to kind of look back 1359 00:57:18,758 --> 00:57:19,931 and kind of figure that out. 1360 00:57:20,034 --> 00:57:21,517 Um, am I responsible? 1361 00:57:21,620 --> 00:57:22,586 Of course, I am. 1362 00:57:22,689 --> 00:57:25,310 I was part of designing features 1363 00:57:25,413 --> 00:57:27,344 and building features that are now used 1364 00:57:27,448 --> 00:57:28,620 in the way they are. 1365 00:57:28,724 --> 00:57:32,896 I can't completely disconnect from having worked on it. 1366 00:57:33,000 --> 00:57:34,275 I still love Instagram. 1367 00:57:34,379 --> 00:57:35,241 And I really hope 1368 00:57:35,344 --> 00:57:36,689 that it's able to sort of figure out 1369 00:57:36,793 --> 00:57:37,758 how to navigate this space 1370 00:57:37,862 --> 00:57:39,965 and get back to what it wanted to be. 1371 00:57:40,068 --> 00:57:42,413 But I can't say that I'm holding my breath 1372 00:57:42,517 --> 00:57:43,448 for that to happen. 1373 00:57:43,551 --> 00:57:44,827 The only way Facebook will change 1374 00:57:44,931 --> 00:57:46,275 is if they're forced to change. 1375 00:57:46,379 --> 00:57:48,482 Companies don't change until the cost of changing 1376 00:57:48,586 --> 00:57:51,275 is less than the cost of staying the same. 1377 00:57:51,379 --> 00:57:54,448 And so we need to stand up and demand better things. 1378 00:57:54,551 --> 00:57:56,482 We need to demand the Instagram that we deserve, 1379 00:57:56,586 --> 00:57:58,448 not the one that Facebook is giving us today.