1
00:00:02,000 --> 00:00:05,400
This programme contains some
strong language
2
00:00:05,400 --> 00:00:07,760
TikTok - over one billion of us
use it.
3
00:00:07,760 --> 00:00:10,320
In just a few years,
it's changed the world.
4
00:00:10,320 --> 00:00:12,520
But is it now changing us?
5
00:00:12,520 --> 00:00:15,880
If I was a copper today,
I'd be absolutely furious
6
00:00:15,880 --> 00:00:17,160
with TikTok.
7
00:00:17,160 --> 00:00:20,760
TikTok seems to be driving
some users into a frenzy.
8
00:00:20,760 --> 00:00:22,640
You have blood on your hands!
9
00:00:22,640 --> 00:00:24,960
From witch hunts
by amateur sleuths...
10
00:00:24,960 --> 00:00:27,480
TikTokers have been
playing detective.
11
00:00:27,480 --> 00:00:30,480
It has significantly distracted
the investigation.
12
00:00:30,480 --> 00:00:33,000
..to incidents of
copycat violence...
13
00:00:33,000 --> 00:00:35,920
They stole a bus and burned it.
14
00:00:35,920 --> 00:00:39,760
..these frenzies seem to be fuelled
by posting and sharing
15
00:00:39,760 --> 00:00:41,000
on the platform.
16
00:00:41,000 --> 00:00:42,920
It's like wildfires spreading.
17
00:00:42,920 --> 00:00:45,920
You can put out the main fire,
but, like, all of the little fires
18
00:00:45,920 --> 00:00:48,120
are still there
and people can see it.
19
00:00:48,120 --> 00:00:51,280
I'm the BBC's disinformation
and social media correspondent.
20
00:00:51,280 --> 00:00:54,280
Like so many of us, I'm obsessed
with TikTok.
21
00:00:54,280 --> 00:00:57,400
Sometimes I think that maybe
the TikTok algorithm knows me
22
00:00:57,400 --> 00:00:59,080
better than I know myself.
23
00:00:59,080 --> 00:01:03,320
And I want to know whether it's
really driving antisocial behaviour.
24
00:01:03,320 --> 00:01:06,440
TikTok will send you
encouragement telling you
25
00:01:06,440 --> 00:01:08,680
you're a hit, keep going.
26
00:01:09,760 --> 00:01:13,000
TikTok incentivises creators
very, very clearly
27
00:01:13,000 --> 00:01:14,760
by paying them money.
28
00:01:14,760 --> 00:01:17,800
If so, are the company aware
of what's going on?
29
00:01:17,800 --> 00:01:20,280
Or is the machine out of control?
30
00:01:28,640 --> 00:01:31,240
Am I looking at this one?
Sorry, I just fucked this up.
31
00:01:31,240 --> 00:01:34,200
It's all right. Take two.
32
00:01:34,200 --> 00:01:37,480
I'm Taylor Cohen.
I worked at TikTok in New York
33
00:01:37,480 --> 00:01:40,240
between the years of
2020 and 2022,
34
00:01:40,240 --> 00:01:42,960
as a global creative strategist.
35
00:01:42,960 --> 00:01:45,400
I was a really early adopter
of the platform.
36
00:01:45,400 --> 00:01:48,920
I couldn't put it down,
for like, days, months,
37
00:01:48,920 --> 00:01:51,960
which turned into years. Started
to see more of my friends joining
38
00:01:51,960 --> 00:01:54,640
and just saw it as not going away,
39
00:01:54,640 --> 00:01:56,960
and I just needed to be
a part of it.
40
00:01:56,960 --> 00:02:02,600
But do you have a blue North Face
TikTok hat?
41
00:02:02,600 --> 00:02:04,840
Because I do.
42
00:02:04,840 --> 00:02:08,960
TikTok is different than anything
that ever came before it.
43
00:02:08,960 --> 00:02:12,040
If I met somebody who didn't
understand how the TikTok
44
00:02:12,040 --> 00:02:16,120
algorithm worked, I would
explain it to them in this way.
45
00:02:16,120 --> 00:02:20,520
Do you want to see what you
actually are interested in?
46
00:02:20,520 --> 00:02:24,080
TikTok is a content graph,
meaning that the algorithm
47
00:02:24,080 --> 00:02:26,680
serves you content
based on topics you like.
48
00:02:26,680 --> 00:02:30,640
So, as an example, if you really
like dogs and you start to see
49
00:02:30,640 --> 00:02:33,720
maybe one or two TikToks
in your feed of dogs,
50
00:02:33,720 --> 00:02:36,480
but you're always liking
that dog post,
51
00:02:36,480 --> 00:02:39,800
you're going to start to see a lot
more dog content in your feed.
52
00:02:39,800 --> 00:02:42,840
Now, on Instagram, the biggest
difference there is, you know,
53
00:02:42,840 --> 00:02:46,040
that's a social graph, meaning
you're only going to see content
54
00:02:46,040 --> 00:02:48,840
based upon other people's influence.
55
00:02:48,840 --> 00:02:53,040
So, your other friends liking posts,
how many followers the person has,
56
00:02:53,040 --> 00:02:55,520
how much engagement they're getting.
57
00:02:55,520 --> 00:02:59,360
When I think about TikTok,
it's allowing anybody
58
00:02:59,360 --> 00:03:02,480
to show up at the top of your feed
or in the first couple of like,
59
00:03:02,480 --> 00:03:04,240
swipes, whatever it is.
60
00:03:04,240 --> 00:03:06,640
But it's so relevant
to you and it's so catered
61
00:03:06,640 --> 00:03:09,880
that you feel like you're actually
living in this like, experience,
62
00:03:09,880 --> 00:03:13,200
with other people, where you're
like, "Oh, wow, I feel that same
way."
63
00:03:13,200 --> 00:03:15,880
And I can't say, like, at least
for myself, I've never experienced
64
00:03:15,880 --> 00:03:18,400
that on any other social platform,
ever.
65
00:03:20,440 --> 00:03:24,040
I'm Jeff Allen. I'm the co-founder
and chief research officer
66
00:03:24,040 --> 00:03:25,840
at the Integrity Institute.
67
00:03:25,840 --> 00:03:28,720
Videos uploaded to TikTok
are public, which means that anyone
68
00:03:28,720 --> 00:03:30,560
on Earth can see them.
69
00:03:30,560 --> 00:03:32,480
It's much more about discovery.
70
00:03:32,480 --> 00:03:36,000
It's much more about seeing content
that is new and novel.
71
00:03:36,000 --> 00:03:40,120
It does mean that the algorithm
is much more in control and has much
72
00:03:40,120 --> 00:03:42,840
more influence over what users
are seeing.
73
00:03:42,840 --> 00:03:45,640
You don't quite know what you're
going to see, when you log into it.
74
00:03:53,120 --> 00:03:56,200
Four students stabbed to death
in their off-campus house
75
00:03:56,200 --> 00:03:58,240
while two of their housemates slept.
76
00:03:58,240 --> 00:04:01,720
Those are the bare facts of a crime
committed on the 13th of November
77
00:04:01,720 --> 00:04:04,720
in the small college town
of Moscow, Idaho.
78
00:04:04,720 --> 00:04:06,760
In November 2022,
79
00:04:06,760 --> 00:04:10,040
four students from the University
of Idaho were murdered
80
00:04:10,040 --> 00:04:12,920
in their bedrooms, while two
surviving housemates slept.
81
00:04:12,920 --> 00:04:16,200
Unfounded speculation around
who committed the murders
82
00:04:16,200 --> 00:04:17,720
gripped TikTok.
83
00:04:17,720 --> 00:04:19,800
So, like everybody else
on TikTok,
84
00:04:19,800 --> 00:04:21,400
I'm obsessed with this.
85
00:04:21,400 --> 00:04:23,160
I'd never heard of Moscow, Idaho.
86
00:04:23,160 --> 00:04:26,320
But the murders quickly
flooded my feed before being widely
87
00:04:26,320 --> 00:04:27,840
covered by the media.
88
00:04:27,840 --> 00:04:29,160
I'm fucking over it. The TikTok
89
00:04:29,160 --> 00:04:31,080
detectives are all over this fucking
app.
90
00:04:31,080 --> 00:04:34,000
Within a few days, I was almost
seeing nothing else.
91
00:04:34,000 --> 00:04:36,640
I discovered that TikTok users
were uniquely obsessed
92
00:04:36,640 --> 00:04:38,040
with the case.
93
00:04:38,040 --> 00:04:41,520
I found videos on TikTok
using the Idaho 4 hashtag had racked up
94
00:04:41,520 --> 00:04:44,800
two billion views compared
to just 80,000 views on YouTube
95
00:04:44,800 --> 00:04:48,760
over the same period. The case
seemed to generate more engagement
96
00:04:48,760 --> 00:04:52,560
on TikTok than any of the other
social media sites.
97
00:04:52,560 --> 00:04:55,240
So I was doing some digging
and I went on Maddie, which is one
98
00:04:55,240 --> 00:04:58,920
of the victims' Instagram, and I saw
that she has a boyfriend named Jake.
99
00:04:58,920 --> 00:05:03,000
Sleuths trying to solve mysteries
online - it happens all the time.
100
00:05:03,000 --> 00:05:06,160
But what I noticed was how so many
people were getting involved
101
00:05:06,160 --> 00:05:09,840
in baseless speculation,
who didn't seem to have posted
102
00:05:09,840 --> 00:05:12,960
about true crime before,
getting huge numbers of views,
103
00:05:12,960 --> 00:05:17,520
sharing theories about a murder
that they had limited knowledge of.
104
00:05:19,080 --> 00:05:22,720
What I think happened is that Kaylee
had a stalker.
105
00:05:22,720 --> 00:05:24,760
The more extreme the views,
the more interest
106
00:05:24,760 --> 00:05:25,960
they seemed to attract.
107
00:05:25,960 --> 00:05:28,240
This is Kaylee, and this is her ex.
And everyone's like,
108
00:05:28,240 --> 00:05:30,440
"Kaylee was the target,
Kaylee was the target."
109
00:05:30,440 --> 00:05:33,280
Like many, I found myself
watching video after video.
110
00:05:33,280 --> 00:05:35,840
I was hooked by people throwing
around accusations
111
00:05:35,840 --> 00:05:38,360
about who they believed
to be the murder suspect.
112
00:05:38,360 --> 00:05:42,800
I am starting to believe more
and more that Jack was involved.
113
00:05:42,800 --> 00:05:47,040
Stop thinking that you are
on an episode of Law And Order.
114
00:05:47,040 --> 00:05:49,040
These are actual lives.
115
00:05:49,040 --> 00:05:52,040
Some TikTokers, including
those who lived in Moscow, Idaho,
116
00:05:52,040 --> 00:05:54,480
started to call out
what was happening.
117
00:05:54,480 --> 00:05:57,880
These are real fucking lives
that you are discussing
118
00:05:57,880 --> 00:05:59,360
and ruining.
119
00:05:59,360 --> 00:06:01,600
People died. Fucking stop.
120
00:06:01,600 --> 00:06:04,880
One of the main theories involved
someone called Jack Showalter.
121
00:06:04,880 --> 00:06:06,720
Dubbed "Hoodie Guy" by TikTokers,
122
00:06:06,720 --> 00:06:10,120
he was falsely accused
of being involved.
123
00:06:10,120 --> 00:06:13,560
Jack was caught on CCTV by a food
truck close to two of the victims,
124
00:06:13,560 --> 00:06:15,600
hours before they were killed.
125
00:06:15,600 --> 00:06:18,880
The most suspect is grey Hoodie Guy
from the food truck.
126
00:06:18,880 --> 00:06:21,800
He followed them.
This all just ties up.
127
00:06:21,800 --> 00:06:23,680
He had it planned out.
128
00:06:23,680 --> 00:06:27,640
TikTokers, deciding he was a
suspect, dissected his life online.
129
00:06:27,640 --> 00:06:30,920
The big rumour going around is
that his parents whisked him
130
00:06:30,920 --> 00:06:34,200
off to South Africa,
so they can't get his DNA.
131
00:06:34,200 --> 00:06:36,360
I think that it was premeditated.
132
00:06:36,360 --> 00:06:40,400
That's why the main suspect
is now in Africa.
133
00:06:40,400 --> 00:06:43,920
We do not believe the following
individuals are involved
134
00:06:43,920 --> 00:06:48,800
in this crime. A male specifically
wearing a white hoodie.
135
00:06:48,800 --> 00:06:50,640
It's not Jack Showalter.
136
00:06:50,640 --> 00:06:53,600
It has nothing to do with my family
or the Showalters.
137
00:06:53,600 --> 00:06:57,240
His sister released a TikTok
asking for the witch-hunt to stop.
138
00:06:57,240 --> 00:07:01,520
There were so many victims
that were created
139
00:07:01,520 --> 00:07:04,280
through internet sleuth videos
like this.
140
00:07:04,280 --> 00:07:09,080
We have received threats
and harassment.
141
00:07:09,080 --> 00:07:10,480
Um...
142
00:07:10,480 --> 00:07:14,320
..and...we didn't deserve that.
143
00:07:14,320 --> 00:07:16,560
Jack didn't deserve that.
144
00:07:19,640 --> 00:07:23,440
I'm Jasmine, and I worked
as a TikTok moderator in 2022.
145
00:07:23,440 --> 00:07:26,560
I think it's fair to say
that TikTok has a unique quality
146
00:07:26,560 --> 00:07:31,360
to it that makes it especially
sticky for people to engage
147
00:07:31,360 --> 00:07:34,720
in trends. Because you feel like
everyone else is already doing it,
148
00:07:34,720 --> 00:07:38,200
so what does it matter if I also
engage in this trend?
149
00:07:38,200 --> 00:07:41,480
When really, it's kind of feeding
the algorithm and it's making
150
00:07:41,480 --> 00:07:44,120
this topic more popular
by you posting about it.
151
00:07:44,120 --> 00:07:47,880
You could see it, as a moderator,
that people would completely pivot
152
00:07:47,880 --> 00:07:51,680
their content and kind of try to
jump on whatever's trendy right now.
153
00:07:51,680 --> 00:07:55,960
So right now, we are driving
to Moscow, Idaho.
154
00:07:55,960 --> 00:07:57,440
We just flew out from Florida.
155
00:07:57,440 --> 00:08:00,240
While most joined in from their
homes, I found one TikToker
156
00:08:00,240 --> 00:08:02,640
called Olivia,
who took things a step further.
157
00:08:02,640 --> 00:08:04,840
She decided to visit the scene.
158
00:08:04,840 --> 00:08:08,960
This is the home where four students
were brutally stabbed to death.
159
00:08:08,960 --> 00:08:12,520
One of her posts even racked up
over 20 million views.
160
00:08:12,520 --> 00:08:15,520
A reddish coloured substance,
believed to be blood,
161
00:08:15,520 --> 00:08:17,480
can be seen coming from Xana's room.
162
00:08:17,480 --> 00:08:19,560
Many were gripped by her videos.
163
00:08:19,560 --> 00:08:22,080
Others were more critical
and asked her to stop.
164
00:08:22,080 --> 00:08:23,640
But she kept posting.
165
00:08:23,640 --> 00:08:27,800
It is insanely disrespectful
to go ahead and make
166
00:08:27,800 --> 00:08:31,720
a little mock documentary
for a 15-second TikTok video
167
00:08:31,720 --> 00:08:33,440
of an active crime.
168
00:08:33,440 --> 00:08:36,720
In this leaked screenshot
from surveillance footage, it shows
169
00:08:36,720 --> 00:08:38,880
the girls speaking with Kaylee's ex.
170
00:08:38,880 --> 00:08:41,840
I wanted to understand what was
driving her to speculate
171
00:08:41,840 --> 00:08:43,200
about the case.
172
00:08:46,600 --> 00:08:49,760
She agreed to meet me in Florida,
where she lives.
173
00:08:51,480 --> 00:08:55,120
There was something about it
that really drew me to it,
174
00:08:55,120 --> 00:09:00,240
and I just felt this need to go out
there and dig for answers and see
175
00:09:00,240 --> 00:09:03,120
if I can help out in any way.
176
00:09:03,120 --> 00:09:07,000
Right here is the home where the
four University of Idaho students
177
00:09:07,000 --> 00:09:08,520
were murdered.
178
00:09:08,520 --> 00:09:13,000
I was at the location for a week,
talking to people, to neighbours,
179
00:09:13,000 --> 00:09:16,440
to students, and trying
to get a feel for who could
180
00:09:16,440 --> 00:09:18,800
have possibly done this.
181
00:09:18,800 --> 00:09:22,400
How far is Moscow, Idaho,
from here in Florida?
182
00:09:22,400 --> 00:09:27,600
Moscow, Idaho is about
a six or seven-hour flight,
183
00:09:27,600 --> 00:09:30,080
I think. That is a long way to go?
184
00:09:30,080 --> 00:09:34,720
Yeah, it was very long,
but the content that I make
185
00:09:34,720 --> 00:09:38,600
where I am actually at the scene
of something, does much better
186
00:09:38,600 --> 00:09:42,080
versus if I was just at home
sitting somewhere
187
00:09:42,080 --> 00:09:46,600
and talking about a case, because
I'm showing them - this is exactly
188
00:09:46,600 --> 00:09:48,520
what is happening right now.
189
00:09:48,520 --> 00:09:51,000
What has been the reaction
on the ground when you turn up
190
00:09:51,000 --> 00:09:53,960
at different locations
to cover these cases?
191
00:09:53,960 --> 00:09:58,320
Typically, the reaction is positive,
but sometimes that's not
192
00:09:58,320 --> 00:10:02,800
always the case. I think
one example of a hostile situation
193
00:10:02,800 --> 00:10:05,200
was the Summer Wells case.
194
00:10:05,200 --> 00:10:08,560
Something bad happened here
and her mother won't talk about it.
195
00:10:08,560 --> 00:10:12,040
Five-year-old Summer Wells
went missing in June 2021.
196
00:10:12,040 --> 00:10:13,480
11 acres of forest.
197
00:10:13,480 --> 00:10:16,200
Her toys are still scattered
about in her back yard.
198
00:10:16,200 --> 00:10:19,480
Olivia was one of the people
investigating the case on TikTok.
199
00:10:19,480 --> 00:10:22,080
Summer's parents say
that she was abducted.
200
00:10:22,080 --> 00:10:25,160
At one point, they said that she
just walked out the back door
201
00:10:25,160 --> 00:10:29,160
by herself. A neighbour says they
heard a loud scream around the time
202
00:10:29,160 --> 00:10:30,400
she went missing.
203
00:10:30,400 --> 00:10:32,960
Her parents were not named as
suspects, and there was no evidence
204
00:10:32,960 --> 00:10:35,200
at the time to suggest
they were involved.
205
00:10:35,200 --> 00:10:37,800
Yet they were pursued
by rumours on social media
206
00:10:37,800 --> 00:10:39,640
that they were somehow responsible.
207
00:10:39,640 --> 00:10:42,160
Another rumour is that
one of Summer's siblings says
208
00:10:42,160 --> 00:10:44,760
they saw their father
holding Summer's lifeless body,
209
00:10:44,760 --> 00:10:46,200
running out of the house.
210
00:10:46,200 --> 00:10:48,440
There's a cemetery across the street
from their home.
211
00:10:48,440 --> 00:10:51,520
We spent days searching
for Summer, but found nothing.
212
00:10:51,520 --> 00:10:54,360
Olivia was one of a group
of TikTokers who turned up
213
00:10:54,360 --> 00:10:57,520
outside the parents' house
in Tennessee.
214
00:10:57,520 --> 00:11:00,640
Summer's parents spoke out about
the harm the unfounded allegations
215
00:11:00,640 --> 00:11:02,560
were causing.
216
00:11:02,560 --> 00:11:05,840
This is the hardest time
of me and Candus's life.
217
00:11:05,840 --> 00:11:08,400
You lose your daughter
and then you get blamed for it
218
00:11:08,400 --> 00:11:10,760
and then everybody turns on you.
219
00:11:10,760 --> 00:11:15,640
Her parents didn't want
anyone to really help search
220
00:11:15,640 --> 00:11:17,200
for their missing daughter.
221
00:11:17,200 --> 00:11:20,360
You told me you couldn't
come to church today.
222
00:11:20,360 --> 00:11:25,600
Her parents actually shot
a flare gun at my mother
223
00:11:25,600 --> 00:11:28,680
and I and some friends.
Was that scary? It was,
224
00:11:28,680 --> 00:11:30,320
it was a bit scary.
225
00:11:30,320 --> 00:11:33,680
But I just kind of get in this zone
where I'm like, kind of
226
00:11:33,680 --> 00:11:37,000
in my own little world,
and I don't even really realise.
227
00:11:37,000 --> 00:11:39,560
Does it ever feel like a movie
to you,
228
00:11:39,560 --> 00:11:42,280
almost as a coping mechanism,
like, that distance that you put
229
00:11:42,280 --> 00:11:45,840
between yourself and the case?
I feel a lot of people
230
00:11:45,840 --> 00:11:51,800
will accuse me of, or say that
I sensationalise these stories
231
00:11:51,800 --> 00:11:55,440
and that it's bad.
But I don't see it like that.
232
00:11:55,440 --> 00:11:57,320
I'm a very empathic person.
233
00:11:57,320 --> 00:12:01,440
It's just the way that I edit
that people see it as, like,
234
00:12:01,440 --> 00:12:05,400
a Netflix thing, and that's why they
say I'm sensationalising things.
235
00:12:07,440 --> 00:12:10,080
Two years later,
Summer remains missing.
236
00:12:10,080 --> 00:12:13,480
The police have kept the case open.
237
00:12:13,480 --> 00:12:15,000
OK, here's part six.
238
00:12:15,000 --> 00:12:19,960
And we found two men's ankle socks,
and this was probably a couple
239
00:12:19,960 --> 00:12:23,120
of feet away from where
his remains were found.
240
00:12:23,120 --> 00:12:26,960
I felt it was almost like Olivia
didn't realise the harm this kind
241
00:12:26,960 --> 00:12:30,800
of behaviour could cause the people
living through tragedies.
242
00:12:30,800 --> 00:12:34,520
For people who don't watch this kind
of content, they could find it
243
00:12:34,520 --> 00:12:37,640
quite unusual
that someone would go to the place
244
00:12:37,640 --> 00:12:41,120
where a disappearance has happened,
or murders have happened,
245
00:12:41,120 --> 00:12:43,640
and record it and look
for information.
246
00:12:43,640 --> 00:12:45,440
Do you think it's unusual?
247
00:12:45,440 --> 00:12:50,160
I do see how that could be unusual
for people that aren't on TikTok,
248
00:12:50,160 --> 00:12:53,400
but I think it's a new thing
that is happening
249
00:12:53,400 --> 00:12:56,280
and it will become more normal.
It's the beginning.
250
00:12:56,280 --> 00:12:58,800
It's kind of like the Wild West
out here.
251
00:12:58,800 --> 00:13:03,040
I do worry that I could accidentally
give out wrong information.
252
00:13:03,040 --> 00:13:06,400
I won't go out of my way
to talk about someone,
253
00:13:06,400 --> 00:13:09,200
unless I really believe
that they're really involved.
254
00:13:09,200 --> 00:13:11,400
It's a new type of journalism.
255
00:13:11,400 --> 00:13:13,440
Journalism has changed.
256
00:13:13,440 --> 00:13:15,200
It's not like how it was before.
257
00:13:15,200 --> 00:13:16,760
Now, you don't need a diploma.
258
00:13:16,760 --> 00:13:18,720
You don't need a degree
to really do it.
259
00:13:18,720 --> 00:13:20,760
You just need your phone,
that's all you need.
260
00:13:20,760 --> 00:13:23,680
And sometimes I can even
post controversial things.
261
00:13:23,680 --> 00:13:28,600
I can post about information
that some news companies would need
262
00:13:28,600 --> 00:13:33,760
confirmation about before posting,
but I have just me,
263
00:13:33,760 --> 00:13:36,120
so I could just post it.
I can do that.
264
00:13:36,120 --> 00:13:38,920
I have the power to do that.
265
00:13:38,920 --> 00:13:43,520
So, we finished searching, um,
in the northern part of...
266
00:13:43,520 --> 00:13:46,480
Olivia says she's a journalist,
but what she's doing is not subject
267
00:13:46,480 --> 00:13:49,680
to the codes of practice
that govern traditional media.
268
00:13:49,680 --> 00:13:53,800
I'm obliged not to promote
false or unfounded theories.
269
00:13:53,800 --> 00:13:56,760
We're about half a mile out here.
270
00:13:56,760 --> 00:13:58,080
Over.
271
00:14:00,040 --> 00:14:03,760
At the moment, TikTok regulates
itself and says it removes content
272
00:14:03,760 --> 00:14:06,560
that breaches its community
guidelines, including
273
00:14:06,560 --> 00:14:09,280
misinformation and hate.
274
00:14:09,280 --> 00:14:13,240
The Online Safety Bill is expected
to become law in the UK at the end
275
00:14:13,240 --> 00:14:18,000
of 2023, and says it will force
social media sites to uphold
276
00:14:18,000 --> 00:14:20,120
their commitments to protect users.
277
00:14:20,120 --> 00:14:24,000
If they don't,
they could face hefty fines.
278
00:14:24,000 --> 00:14:29,840
But speculation about the Idaho
murders felt like a free-for-all.
279
00:14:29,840 --> 00:14:34,840
I think that TikTok does
encourage people to participate
280
00:14:34,840 --> 00:14:37,000
more than other apps,
281
00:14:37,000 --> 00:14:40,200
because you can just be sitting
on your couch
282
00:14:40,200 --> 00:14:43,840
and make a video,
and then reach tonnes of people.
283
00:14:43,840 --> 00:14:48,640
One video on TikTok could get
millions of plays,
284
00:14:48,640 --> 00:14:51,280
versus if I post the same video
on Instagram,
285
00:14:51,280 --> 00:14:53,480
it'll get, like, 200 views.
286
00:14:55,000 --> 00:14:58,080
It's much easier to go viral
on TikTok.
287
00:14:58,080 --> 00:15:02,080
And there isn't this barrier
of developing a huge following.
288
00:15:02,080 --> 00:15:05,000
You know, maybe the second video
that you upload, TikTok decides
289
00:15:05,000 --> 00:15:07,240
is a great video to show everyone,
and boom,
290
00:15:07,240 --> 00:15:08,920
you're off to 20 million views.
291
00:15:08,920 --> 00:15:11,720
As something gets more harmful,
things tend to get more engagement.
292
00:15:11,720 --> 00:15:14,840
These algorithmically ranked systems
are all floating bad content
293
00:15:14,840 --> 00:15:17,680
to the top. If you see - oh, this
video, someone's doing something
294
00:15:17,680 --> 00:15:20,280
very dangerous in it and they're
getting a lot of views, cool,
295
00:15:20,280 --> 00:15:23,280
maybe I'll do something dangerous.
You know, someone's doing something
296
00:15:23,280 --> 00:15:26,240
that's very disruptive and damaging,
you know, to their community.
297
00:15:26,240 --> 00:15:28,960
You're like, OK, well, maybe
I will disrupt my community too,
298
00:15:28,960 --> 00:15:31,640
and maybe I will get a lot
of views if I do that.
299
00:15:31,640 --> 00:15:34,160
So how does TikTok deal with this?
300
00:15:34,160 --> 00:15:37,360
Does it actually incentivise users
to take part,
301
00:15:37,360 --> 00:15:38,920
even if it causes harm?
302
00:15:38,920 --> 00:15:40,200
My name is Lara.
303
00:15:40,200 --> 00:15:43,840
I was a content moderator at TikTok
in the Australian market
304
00:15:43,840 --> 00:15:46,280
from 2020 to 2022.
305
00:15:46,280 --> 00:15:51,160
I saw a lot of trends that were
super dangerous, but people
306
00:15:51,160 --> 00:15:54,720
got the thrill out of being
involved in something.
307
00:15:54,720 --> 00:15:58,880
TikTok has definitely impacted
how people act, how people behave,
308
00:15:58,880 --> 00:16:01,080
how people treat others.
309
00:16:01,080 --> 00:16:04,080
TikTok gives you that opportunity
to see something that's happening
310
00:16:04,080 --> 00:16:06,920
and then suddenly feel like,
"Oh, I need to go there.
311
00:16:06,920 --> 00:16:10,800
"It's an hour away, I can film myself
doing that, I can be relevant."
312
00:16:10,800 --> 00:16:14,960
I think that's when people forget
that fine line of what is socially
313
00:16:14,960 --> 00:16:18,440
acceptable and what's not, but also
what's legal and what's not.
314
00:16:18,440 --> 00:16:21,840
And then other people are watching
and think that that person is right.
315
00:16:21,840 --> 00:16:23,360
They follow, they copy.
316
00:16:23,360 --> 00:16:25,320
If you were face-to-face
with that person,
317
00:16:25,320 --> 00:16:28,000
you wouldn't be commenting
the things that you are commenting.
318
00:16:28,000 --> 00:16:30,440
But because it feels like
you're just talking to friends,
319
00:16:30,440 --> 00:16:33,360
and you see hundreds of other people
who have also done the same thing,
320
00:16:33,360 --> 00:16:36,000
it becomes much more
socially acceptable.
321
00:16:36,000 --> 00:16:39,080
If you see someone attacking
a person, you clearly know
322
00:16:39,080 --> 00:16:42,240
that's wrong, but that's always
going to be taken off TikTok.
323
00:16:42,240 --> 00:16:46,320
But if you see someone invading
their privacy or bullying them,
324
00:16:46,320 --> 00:16:49,880
that's a bit more grey in terms
of how people
325
00:16:49,880 --> 00:16:52,360
take it in and accept it.
326
00:16:52,360 --> 00:16:56,680
In December 2022, a man was arrested
and charged with the murders
327
00:16:56,680 --> 00:17:00,480
of the four Idaho students.
He's awaiting trial.
328
00:17:00,480 --> 00:17:03,480
It seems to me that Olivia and
other TikTokers found themselves
329
00:17:03,480 --> 00:17:05,360
caught up in a sort of frenzy.
330
00:17:05,360 --> 00:17:07,880
As more extreme allegations
were made about the case,
331
00:17:07,880 --> 00:17:10,440
they were encouraged to get
involved, forgetting about
332
00:17:10,440 --> 00:17:13,080
the consequences and the real people
targeted.
333
00:17:13,080 --> 00:17:16,160
So why do these frenzies
happen on TikTok?
334
00:17:16,160 --> 00:17:20,000
When you post a video on TikTok,
rather than just being promoted
335
00:17:20,000 --> 00:17:22,480
to your friends and followers,
it will appear on the feeds
336
00:17:22,480 --> 00:17:26,160
of other users who TikTok thinks
could be interested in it.
337
00:17:26,160 --> 00:17:28,840
Depending on how they engage
with that video, the algorithm
338
00:17:28,840 --> 00:17:31,840
might decide this is something
people like and push it to millions
339
00:17:31,840 --> 00:17:35,200
more at a speed and scale
that doesn't seem possible on other
340
00:17:35,200 --> 00:17:37,040
social media platforms.
341
00:17:37,040 --> 00:17:39,960
After seeing the original
video's success, other users
342
00:17:39,960 --> 00:17:43,600
might then decide to create
their own versions, which in turn
343
00:17:43,600 --> 00:17:47,160
will be sent out to more people.
And the cycle starts again.
344
00:17:47,160 --> 00:17:51,600
And so you have a TikTok frenzy,
often with users pushing it
345
00:17:51,600 --> 00:17:55,800
to new extremes to get noticed,
because the likes and views
346
00:17:55,800 --> 00:17:58,000
make them think that it's OK.
347
00:18:01,720 --> 00:18:03,840
These frenzies are different
on TikTok
348
00:18:03,840 --> 00:18:06,320
because of its emphasis
on participation.
349
00:18:06,320 --> 00:18:09,360
From the outset, when people
came to it as a dance app,
350
00:18:09,360 --> 00:18:12,360
getting involved has been
a key part of its DNA.
351
00:18:14,120 --> 00:18:17,360
TikTok inspires participation
more than any other platform.
352
00:18:17,360 --> 00:18:21,680
I truly believe
it is THE participatory platform.
353
00:18:21,680 --> 00:18:24,360
SCREAM
354
00:18:21,680 --> 00:18:24,360
Oh, yeah!
355
00:18:24,360 --> 00:18:28,360
But the really cool thing about
TikTok and TikTok frenzies,
356
00:18:28,360 --> 00:18:30,120
is that people try...
357
00:18:30,120 --> 00:18:33,280
They don't just recreate it.
They put their own spin on it.
358
00:18:33,280 --> 00:18:37,400
And so you start to see what
the original piece of content was
359
00:18:37,400 --> 00:18:40,360
that went viral and led to
this TikTok frenzy,
360
00:18:40,360 --> 00:18:43,640
and then you'll see, like, the last
video that kind of like blew up
361
00:18:43,640 --> 00:18:45,760
and it's nowhere near the first one.
362
00:18:45,760 --> 00:18:48,360
And it's so cool to see,
like, the creativity
363
00:18:48,360 --> 00:18:52,040
that the community brings
to those TikTok frenzies.
364
00:18:52,040 --> 00:18:53,680
My name is Chris Stokel-Walker.
365
00:18:53,680 --> 00:18:56,040
I'm a journalist and the author
of TikTok Boom.
366
00:18:56,040 --> 00:18:59,720
Traditionally, in social media,
you have been passive consumers -
367
00:18:59,720 --> 00:19:04,960
you've watched videos wash over you
through the app's delivery of them.
368
00:19:04,960 --> 00:19:09,160
But with TikTok, you are actively
encouraged to participate,
369
00:19:09,160 --> 00:19:14,440
and it's because that allows them
to essentially ensure
370
00:19:14,440 --> 00:19:18,040
that you go from being someone
who can dip in and out,
371
00:19:18,040 --> 00:19:20,960
to being someone who is
actively invested in this.
372
00:19:23,160 --> 00:19:26,320
What unfolded on TikTok around
this police investigation
373
00:19:26,320 --> 00:19:27,960
felt different to me.
374
00:19:29,000 --> 00:19:31,520
I wondered whether this
was just a US phenomenon.
375
00:19:33,560 --> 00:19:35,720
But then it happened again.
376
00:19:38,400 --> 00:19:40,960
REPORTER: Just what has happened
to Nikki Bulley?
377
00:19:40,960 --> 00:19:44,920
She vanished from this Lancashire
river bank last Friday morning,
378
00:19:44,920 --> 00:19:48,400
a mum of two young girls
who she'd just dropped off at school,
379
00:19:48,400 --> 00:19:51,280
and was taking her dog
for its regular walk.
380
00:19:51,280 --> 00:19:54,240
Nikki's mobile phone was found
on this bench.
381
00:19:54,240 --> 00:19:57,680
It was still connected to
a work conference call.
382
00:19:57,680 --> 00:20:01,040
Her spaniel, a short while later,
was found wandering nearby.
383
00:20:01,040 --> 00:20:03,280
Local people who saw it
say it was dry
384
00:20:03,280 --> 00:20:05,240
and hadn't been in the river.
385
00:20:05,240 --> 00:20:08,800
That leash was meant to be there
and so was that phone.
386
00:20:08,800 --> 00:20:10,160
It was a set-up.
387
00:20:10,160 --> 00:20:13,880
Similar to Idaho, a frenzy began
around her disappearance.
388
00:20:13,880 --> 00:20:18,000
I spotted it on my TikTok feed
before it totally flooded the news.
389
00:20:18,000 --> 00:20:21,040
According to TikTok, it deploys
additional resources
390
00:20:21,040 --> 00:20:24,280
to reduce the potential spread
of conspiratorial content
391
00:20:24,280 --> 00:20:26,200
about unfolding events.
392
00:20:26,200 --> 00:20:29,040
But as the days went on,
the witch-hunts began,
393
00:20:29,040 --> 00:20:31,720
accusing Nicola's friends and family
of being involved.
394
00:20:31,720 --> 00:20:35,000
FEMALE VOICE: He just gives me
bad vibes. What do you think?
395
00:20:35,000 --> 00:20:38,080
One TikTok account was racking up
millions of views,
396
00:20:38,080 --> 00:20:41,720
posting speculation about Nicola's
friend Emma and her partner Paul.
397
00:20:41,720 --> 00:20:43,840
MALE VOICE: I believe
Emma posed as Nicola
398
00:20:43,840 --> 00:20:46,160
the morning she was
reported missing.
399
00:20:46,160 --> 00:20:47,920
Does this support my theory?
400
00:20:47,920 --> 00:20:50,400
Same socks or same person?
401
00:20:50,400 --> 00:20:53,000
FEMALE VOICE: You can clearly see
Paul's reflection right here
402
00:20:53,000 --> 00:20:55,320
and also note how he is in darkness,
403
00:20:55,320 --> 00:20:58,400
while the girls and Nikki
are photographed with flash.
404
00:20:58,400 --> 00:21:01,640
MALE VOICE: Who are these people and
why were they allowed to be present?
405
00:21:01,640 --> 00:21:03,040
Why are you hiding?
406
00:21:03,040 --> 00:21:05,920
Do they know something we don't?
407
00:21:05,920 --> 00:21:08,880
The person behind this TikTok
account has agreed to meet me.
408
00:21:10,600 --> 00:21:12,960
Hello. Hi!
Lovely to meet you. You too.
409
00:21:12,960 --> 00:21:15,320
Heather had never really used
TikTok before,
410
00:21:15,320 --> 00:21:17,840
but this case caught her attention
on the app.
411
00:21:17,840 --> 00:21:21,760
And she says TikTok inspired her
to get involved.
412
00:21:21,760 --> 00:21:25,880
So I went on TikTok to look at
the real person's perspective
413
00:21:25,880 --> 00:21:28,440
on the Nicola Bulley case.
I think the initial posts
414
00:21:28,440 --> 00:21:31,360
were all just pleas,
trying to find her,
415
00:21:31,360 --> 00:21:33,320
little descriptions
of what she was wearing,
416
00:21:33,320 --> 00:21:35,200
retracing her footsteps
from the day.
417
00:21:35,200 --> 00:21:37,600
But there's the dark side,
where the conspiracists
418
00:21:37,600 --> 00:21:40,640
were unpicking her life,
scrutinising the people involved
419
00:21:40,640 --> 00:21:45,040
in her life, and I think that's then
what engages natural curiosity.
420
00:21:45,040 --> 00:21:48,520
When you're being constantly
surrounded by those kinds of videos,
421
00:21:48,520 --> 00:21:50,760
can it become quite difficult
to say,
422
00:21:50,760 --> 00:21:53,720
"Oh, hold up a second -
is that true, is that false?"
423
00:21:53,720 --> 00:21:58,240
I think it's very easy to fall down,
like, a conspiracy hole.
424
00:21:58,240 --> 00:22:00,320
Any little thing
can become suspicious
425
00:22:00,320 --> 00:22:03,080
when you're looking at it
over and over and almost wanting
426
00:22:03,080 --> 00:22:05,840
to find discrepancies
in people's stories.
427
00:22:05,840 --> 00:22:08,200
MALE VOICE: This is so obvious
to me.
428
00:22:08,200 --> 00:22:10,120
Emma posed as Nicola.
429
00:22:10,120 --> 00:22:13,360
This is meant to be doorbell footage
of Nicola.
430
00:22:13,360 --> 00:22:16,120
Heather posted a video
scrutinising doorbell footage
431
00:22:16,120 --> 00:22:19,400
of Nicola leaving the house
on the morning of her disappearance.
432
00:22:19,400 --> 00:22:21,880
She falsely implied the footage
actually showed
433
00:22:21,880 --> 00:22:24,920
Nicola's best friend Emma
pretending to be Nicola.
434
00:22:24,920 --> 00:22:27,760
I can't be the only one seeing this.
435
00:22:27,760 --> 00:22:29,520
Do they share jewellery?
436
00:22:30,520 --> 00:22:32,600
How viral did it go?
437
00:22:32,600 --> 00:22:36,440
So, it got to 3.6 million
within 72 hours.
438
00:22:36,440 --> 00:22:40,760
That level of explosion
was just completely unexpected.
439
00:22:40,760 --> 00:22:43,360
Whenever you post something
and it gains traction,
440
00:22:43,360 --> 00:22:46,160
TikTok will send you little emails
of encouragement,
441
00:22:46,160 --> 00:22:48,680
telling you that you've received
this many views,
442
00:22:48,680 --> 00:22:50,440
"You're a hit. Keep going."
443
00:22:50,440 --> 00:22:53,520
And, you know, it's there
to encourage you to keep posting.
444
00:22:53,520 --> 00:22:58,360
If you post something and you
receive a lot of positivity from it,
445
00:22:58,360 --> 00:23:00,840
I think that can definitely
change your behaviour,
446
00:23:00,840 --> 00:23:04,560
whereas, before, you might not
have had that level of empowerment
447
00:23:04,560 --> 00:23:07,200
or entitlement, and now
all of a sudden, you feel that
448
00:23:07,200 --> 00:23:11,200
you've got this authority to keep
posting whatever you've done before.
449
00:23:11,200 --> 00:23:14,840
And I think it does... Definitely
has the power to alter people's
450
00:23:14,840 --> 00:23:17,120
perspective on reality.
451
00:23:17,120 --> 00:23:19,840
There were, I think,
around 10,000 comments
452
00:23:19,840 --> 00:23:21,520
just on that one video alone.
453
00:23:21,520 --> 00:23:24,760
There are obviously, dotted
in there, quite abusive ones -
454
00:23:24,760 --> 00:23:26,640
you know, "You're immoral,"
455
00:23:26,640 --> 00:23:29,000
"you're making money off the misery
of people,"
456
00:23:29,000 --> 00:23:31,040
and that's just not true.
457
00:23:31,040 --> 00:23:33,560
We've got two beautiful little girls
that need their mummy home,
458
00:23:33,560 --> 00:23:35,240
and that's what's keeping us going.
459
00:23:35,240 --> 00:23:37,120
But, again, the community
is united...
460
00:23:37,120 --> 00:23:38,640
As interest in the case grew,
461
00:23:38,640 --> 00:23:41,320
Emma spoke to the media
about the disappearance.
462
00:23:41,320 --> 00:23:44,120
This led to TikTokers
analysing her behaviour
463
00:23:44,120 --> 00:23:46,120
and claiming she was involved.
464
00:23:46,120 --> 00:23:49,760
Emma White is really beginning to
grind on me for a number of reasons.
465
00:23:49,760 --> 00:23:51,520
What's the need for all the make-up?
466
00:23:51,520 --> 00:23:53,760
MALE VOICE: Her make-up is always
done well.
467
00:23:53,760 --> 00:23:56,440
The negative that sticks in my mind
is Emma White,
468
00:23:56,440 --> 00:23:57,680
who I don't trust.
469
00:23:57,680 --> 00:24:00,600
MALE VOICE: I believe the CCTV
footage to be that of Emma.
470
00:24:00,600 --> 00:24:02,160
And it wasn't just Emma.
471
00:24:02,160 --> 00:24:05,040
Nicola's partner Paul also became
a particular target
472
00:24:05,040 --> 00:24:07,320
of false claims on TikTok.
473
00:24:07,320 --> 00:24:09,360
Can you not see through him?
474
00:24:09,360 --> 00:24:11,360
He's lying through his teeth.
475
00:24:11,360 --> 00:24:14,080
What doesn't sit right with me
with this gentleman
476
00:24:14,080 --> 00:24:15,760
is his body language.
477
00:24:15,760 --> 00:24:18,680
Very relaxed, no emotion.
478
00:24:18,680 --> 00:24:21,560
His face... There's some confusion,
as the sides of his mouth go up
479
00:24:21,560 --> 00:24:22,920
and his brow goes down.
480
00:24:22,920 --> 00:24:26,560
It's all so ridiculous,
his head and his scarf.
481
00:24:26,560 --> 00:24:28,560
England is not particularly cold,
482
00:24:28,560 --> 00:24:31,960
so it makes you wonder what
he's trying to hide.
483
00:24:31,960 --> 00:24:33,760
These theories were wrong.
484
00:24:33,760 --> 00:24:36,280
Emma and Paul were innocent.
485
00:24:36,280 --> 00:24:39,640
Lancashire Police said that they
are continuing to see a huge amount
486
00:24:39,640 --> 00:24:43,040
of commentary from so-called
experts, ill-informed speculation,
487
00:24:43,040 --> 00:24:46,840
and conspiracy theories,
which they said was damaging
488
00:24:46,840 --> 00:24:49,720
to the investigation, the community
here in St Michael's
489
00:24:49,720 --> 00:24:52,200
and, most importantly,
to Nicola's family.
490
00:24:52,200 --> 00:24:54,160
They said it must stop.
491
00:24:54,160 --> 00:24:56,920
"Our girls will get the support
they need
492
00:24:56,920 --> 00:24:59,480
"from the people who love them
the most.
493
00:25:00,720 --> 00:25:04,000
"And it saddens us to think
that one day,
494
00:25:04,000 --> 00:25:09,520
"we will have to explain to them
that the press and members of the
495
00:25:09,520 --> 00:25:13,000
"public accused their dad of
wrongdoing...
496
00:25:14,360 --> 00:25:18,800
"..misquoted and vilified
friends and family.
497
00:25:18,800 --> 00:25:21,320
"This is absolutely appalling.
498
00:25:21,320 --> 00:25:23,280
"They have to be held accountable.
499
00:25:23,280 --> 00:25:26,600
"This cannot happen to
another family."
500
00:25:26,600 --> 00:25:29,640
The police announcements didn't stop
some users from posting about
501
00:25:29,640 --> 00:25:32,520
the case, even though
the speculation was causing
502
00:25:32,520 --> 00:25:35,800
serious harm to a family
already living through the worst.
503
00:25:37,840 --> 00:25:40,600
Do you think that when lots and lots
of other people
504
00:25:40,600 --> 00:25:43,480
are posting about something,
sharing their views about it,
505
00:25:43,480 --> 00:25:45,680
and millions of people
are watching it,
506
00:25:45,680 --> 00:25:48,480
it can make us
a bit desensitised?
507
00:25:48,480 --> 00:25:51,840
When you're seeing video after video
after video of the same content,
508
00:25:51,840 --> 00:25:54,520
on the same topic, it's very easy
to just think,
509
00:25:54,520 --> 00:25:57,480
"Well, I can join in on that.
And I'm just another person."
510
00:25:57,480 --> 00:26:00,720
I've had to remind myself,
"These are other people's lives.
511
00:26:00,720 --> 00:26:03,960
"And it's not just a video
that's going to go nowhere -
512
00:26:03,960 --> 00:26:06,680
"it's potentially going to
blow up in your face,
513
00:26:06,680 --> 00:26:08,400
"and then you are accountable."
514
00:26:08,400 --> 00:26:12,800
I think that's the danger of TikTok,
is that if somebody posts something
515
00:26:12,800 --> 00:26:16,120
that isn't factual, then it gets
a lot of views
516
00:26:16,120 --> 00:26:18,800
and then it sort of becomes
a conspiracy.
517
00:26:21,040 --> 00:26:23,480
Since speaking to me,
Heather has deleted her posts
518
00:26:23,480 --> 00:26:25,720
about Nicola Bulley.
519
00:26:25,720 --> 00:26:29,040
Heather seemed to really regret
becoming caught up in this frenzy,
520
00:26:29,040 --> 00:26:31,440
and she told me that without TikTok,
she just doesn't think
521
00:26:31,440 --> 00:26:33,680
she would have participated
in this way.
522
00:26:33,680 --> 00:26:36,960
The press was also to blame for
what unfolded around this case,
523
00:26:36,960 --> 00:26:39,560
but it was on TikTok
where some of the misinformation
524
00:26:39,560 --> 00:26:41,720
that couldn't be aired by
the traditional media
525
00:26:41,720 --> 00:26:46,000
spread like wildfire.
It all became really out of control.
526
00:26:47,360 --> 00:26:50,360
A study by the Integrity Institute
measured how much engagement
527
00:26:50,360 --> 00:26:53,120
certain creators received
on inaccurate posts
528
00:26:53,120 --> 00:26:55,320
compared to their other posts.
529
00:26:55,320 --> 00:26:59,320
They found that misinformation
was amplified 25 times on TikTok,
530
00:26:59,320 --> 00:27:02,840
compared to 2.4 times on Instagram
531
00:27:02,840 --> 00:27:05,520
and 3.8 times on Facebook.
532
00:27:05,520 --> 00:27:08,440
The platform that did
the least worst is Instagram,
533
00:27:08,440 --> 00:27:10,400
and that is actually a design...
534
00:27:10,400 --> 00:27:13,080
..an implication
of how Instagram is designed.
535
00:27:13,080 --> 00:27:15,280
It's very much dependent
upon who you follow,
536
00:27:15,280 --> 00:27:18,000
there's no reshare
or retweet button,
537
00:27:18,000 --> 00:27:22,160
and most of the views are not coming
from algorithmic recommendations.
538
00:27:22,160 --> 00:27:24,880
Like the other social media sites,
TikTok do have
539
00:27:24,880 --> 00:27:28,600
specialised misinformation
moderators, equipped with tools
540
00:27:28,600 --> 00:27:31,400
to help keep misinformation
off the platform.
541
00:27:31,400 --> 00:27:34,360
The fact-checking tool
is something they brought in,
542
00:27:34,360 --> 00:27:37,240
and it felt very much like a response
to the criticism
543
00:27:37,240 --> 00:27:38,960
TikTok was getting in the media.
544
00:27:38,960 --> 00:27:41,760
So, they wanted something
for moderators,
545
00:27:41,760 --> 00:27:44,800
so they could look up in real time
what was happening.
546
00:27:44,800 --> 00:27:49,240
Actually using the fact-checking
tool was quite complicated.
547
00:27:49,240 --> 00:27:51,880
Sometimes the fact-checkers
weren't really, like, clear
548
00:27:51,880 --> 00:27:53,280
on what they have to fact-check.
549
00:27:53,280 --> 00:27:58,200
So, for normal users who don't have
this privilege of a fact-check tool,
550
00:27:58,200 --> 00:28:01,000
yeah, it's having
a really real effect.
551
00:28:01,000 --> 00:28:04,760
And it is changing people's minds
about things.
552
00:28:13,000 --> 00:28:14,960
It is quite steep down here.
553
00:28:14,960 --> 00:28:18,000
Similar to Idaho, the frenzy on
TikTok around the Nicola Bulley case
554
00:28:18,000 --> 00:28:19,960
grew exponentially.
555
00:28:19,960 --> 00:28:22,280
Within the first three weeks
of her disappearance,
556
00:28:22,280 --> 00:28:26,720
I found videos using the hashtag of
her name had 270 million views
557
00:28:26,720 --> 00:28:31,440
on TikTok, compared to
just 3.3 million on YouTube.
558
00:28:31,440 --> 00:28:33,640
Yet what was different
was the number of TikTokers
559
00:28:33,640 --> 00:28:35,800
turning up at the scene.
560
00:28:35,800 --> 00:28:38,160
This caused the police
to take the unprecedented step
561
00:28:38,160 --> 00:28:40,400
of calling out the interference
by TikTokers
562
00:28:40,400 --> 00:28:43,320
and issuing a dispersal order
around the area.
563
00:28:43,320 --> 00:28:47,280
TikTokers have been playing
their own private detectives.
564
00:28:47,280 --> 00:28:50,880
In 29 years' police service,
I've never seen anything like it.
565
00:28:50,880 --> 00:28:54,680
It has significantly distracted
the investigation.
566
00:28:54,680 --> 00:28:56,640
Hi, guys. My name is
Spencer Sutcliffe
567
00:28:56,640 --> 00:28:58,840
from Spencer Sutcliffe
Security and Training Ltd...
568
00:28:58,840 --> 00:29:01,440
Locals even employed
a private security firm
569
00:29:01,440 --> 00:29:02,960
to keep TikTokers away.
570
00:29:02,960 --> 00:29:05,320
..all these conspiracy theories
that are going on
571
00:29:05,320 --> 00:29:07,040
and people taking it as fact.
572
00:29:07,040 --> 00:29:09,120
So, if you don't need to come
to the village, guys,
573
00:29:09,120 --> 00:29:10,320
please just stay away.
574
00:29:10,320 --> 00:29:13,840
The caravan site close to the bench
where Nicola went missing
575
00:29:13,840 --> 00:29:17,440
soon became a focal point
for the TikTok sleuths.
576
00:29:17,440 --> 00:29:20,400
There is a caravan site
with a CCTV blind spot
577
00:29:20,400 --> 00:29:22,840
that the police haven't bothered
to look into.
578
00:29:22,840 --> 00:29:25,680
That's a map of the caravan park,
and as you can see,
579
00:29:25,680 --> 00:29:29,360
it backs onto the river
where Nicola was last seen.
580
00:29:29,360 --> 00:29:34,440
There has been some mention of CCTV,
specifically at the caravan site,
581
00:29:34,440 --> 00:29:37,520
that hasn't been working,
and that that's suspicious itself.
582
00:29:37,520 --> 00:29:39,080
That is not the case.
583
00:29:39,080 --> 00:29:42,800
We have been helped and assisted
beyond all belief
584
00:29:42,800 --> 00:29:44,400
by the caravan owners.
585
00:29:46,320 --> 00:29:48,440
Hello! Hello. Nice to see you.
How are you doing?
586
00:29:48,440 --> 00:29:51,480
Not bad, thank you. Thanks for
having me. You're welcome!
587
00:29:51,480 --> 00:29:54,800
I've come to meet Oliver,
whose family own the caravan site.
588
00:29:54,800 --> 00:29:57,800
So, tell me a bit
about this caravan park.
589
00:29:57,800 --> 00:29:59,320
When did your family come here?
590
00:29:59,320 --> 00:30:01,840
My grandparents bought this place
in the late '90s.
591
00:30:01,840 --> 00:30:03,960
All these fields here
where the campers stay
592
00:30:03,960 --> 00:30:06,720
and the house is owned by my granny.
593
00:30:06,720 --> 00:30:10,360
And she sort of runs the business
and she lives there as well.
594
00:30:10,360 --> 00:30:12,120
Yeah...
595
00:30:10,360 --> 00:30:12,120
THEY LAUGH
596
00:30:13,600 --> 00:30:16,680
I'd spotted how Oliver's
78-year-old grandmother Penny
597
00:30:16,680 --> 00:30:20,480
found herself at the centre of
some TikTok conspiracy theories.
598
00:30:21,560 --> 00:30:25,840
My grandmother was the one who found
poor Nicola's dog
599
00:30:25,840 --> 00:30:27,720
at the bench by the river.
600
00:30:27,720 --> 00:30:30,440
Things sort of snowballed
from there,
601
00:30:30,440 --> 00:30:34,040
as national papers started
to pick up the story,
602
00:30:34,040 --> 00:30:37,720
and then others unfortunately
started to come up
603
00:30:37,720 --> 00:30:40,600
with their own sort of theories
as to what happened.
604
00:30:40,600 --> 00:30:43,640
People talking about
this tragic case online
605
00:30:43,640 --> 00:30:46,400
found out that my grandmother lives
just mere yards
606
00:30:46,400 --> 00:30:48,960
from where Nicola was last seen.
607
00:30:48,960 --> 00:30:52,760
The theories started,
and the TikTok videos, of course,
608
00:30:52,760 --> 00:30:54,680
began to be produced.
609
00:30:54,680 --> 00:30:57,400
And, of course, the theories just
got wilder and wilder.
610
00:30:57,400 --> 00:31:00,120
Within days, if not hours,
that quickly escalated
611
00:31:00,120 --> 00:31:02,760
to people finding out
what my grandmother's name is,
612
00:31:02,760 --> 00:31:04,880
the address being posted online.
613
00:31:04,880 --> 00:31:07,040
People started to physically
turn up in the village
614
00:31:07,040 --> 00:31:09,640
and to start to take photos
of the house.
615
00:31:09,640 --> 00:31:12,320
And one particular TikToker
came onto the property
616
00:31:12,320 --> 00:31:15,760
and started to speak to some of
our customers at our business.
617
00:31:15,760 --> 00:31:19,640
And then the same TikToker
then approached my grandmother.
618
00:31:24,160 --> 00:31:27,120
MAN: We actually came on
just to look around how big it is.
619
00:31:27,120 --> 00:31:28,440
That's all it was.
620
00:31:28,440 --> 00:31:30,120
What do you think you're doing?
621
00:31:30,120 --> 00:31:33,360
Cos it was open, we walked in.
It's not open. Off you go!
622
00:31:33,360 --> 00:31:34,800
Go on, both of you!
623
00:31:34,800 --> 00:31:36,200
So you got...no words to say, no?
624
00:31:36,200 --> 00:31:38,480
No, nothing to say... No, no.
625
00:31:38,480 --> 00:31:40,440
Do you have anything to say
about Nicola Bulley?
626
00:31:40,440 --> 00:31:42,520
Nothing at all! Nothing?
Go on. Nothing at all.
627
00:31:42,520 --> 00:31:44,520
Off you go. OK.
628
00:31:44,520 --> 00:31:48,280
He stood in front of her,
asking her questions,
629
00:31:48,280 --> 00:31:51,720
trying to provoke her,
antagonise her into a reaction.
630
00:31:51,720 --> 00:31:56,920
To have... You know,
my grandmother's a 78-year-old woman
631
00:31:56,920 --> 00:31:58,360
who lives on her own.
632
00:31:58,360 --> 00:32:03,640
To have people effectively come to
her door, filming her, was...
633
00:32:03,640 --> 00:32:06,960
It was frightening for all of us,
as a family.
634
00:32:06,960 --> 00:32:12,640
And she said that she feels like
someone may come and attack her
635
00:32:12,640 --> 00:32:14,000
and try to kill her.
636
00:32:14,000 --> 00:32:18,200
They've been intrusive,
intimidating, antagonistic -
637
00:32:18,200 --> 00:32:22,920
and all for getting clicks and likes
and to earn money online.
638
00:32:24,160 --> 00:32:27,360
This same TikToker, Curtis Arnold,
achieved notoriety
639
00:32:27,360 --> 00:32:30,520
for his invasive videos
around the case.
640
00:32:30,520 --> 00:32:34,400
A man has been arrested over footage
shot from inside a police cordon
641
00:32:34,400 --> 00:32:36,560
on the day the body
of Nicola Bulley was found
642
00:32:36,560 --> 00:32:38,640
in a river in Lancashire last month.
643
00:32:38,640 --> 00:32:43,720
To try and catch them recovering
a deceased body from the water
644
00:32:43,720 --> 00:32:45,760
is absolutely disgusting.
645
00:32:45,760 --> 00:32:49,000
But even the TikTok community
thought he'd taken it too far
646
00:32:49,000 --> 00:32:51,680
when he shared a video of himself
getting past the cordon
647
00:32:51,680 --> 00:32:53,640
to film the removal
of Nicola's body.
648
00:32:53,640 --> 00:32:57,160
Police say the 34-year-old man
from Kidderminster was detained
649
00:32:57,160 --> 00:32:59,880
on suspicion of malicious
communication offences
650
00:32:59,880 --> 00:33:03,040
and perverting the course
of justice.
651
00:33:03,040 --> 00:33:08,120
TikTok incentivises creators very,
very clearly by paying them money.
652
00:33:08,120 --> 00:33:11,680
If you have a large following
on TikTok and your videos
653
00:33:11,680 --> 00:33:15,400
are getting typically high view
rates, TikTok will pay you,
654
00:33:15,400 --> 00:33:18,200
so there's very, very
clear incentive
655
00:33:18,200 --> 00:33:20,560
to get as many views as possible.
656
00:33:22,720 --> 00:33:26,960
TikTok pays creators with over
10,000 followers between $20 and $40
657
00:33:26,960 --> 00:33:28,560
per million views.
658
00:33:28,560 --> 00:33:32,400
It's unclear how much Curtis Arnold
made from his social media posts.
659
00:33:32,400 --> 00:33:35,680
If you can make this your living,
660
00:33:35,680 --> 00:33:39,480
or if you can build up an audience
and a little bit of credibility
661
00:33:39,480 --> 00:33:42,000
and maybe even some fame
through this,
662
00:33:42,000 --> 00:33:44,720
you're much more likely to log on
every day.
663
00:33:44,720 --> 00:33:49,160
So there's this kind of vicious
circle that you go in,
664
00:33:49,160 --> 00:33:53,840
where you kind of are shown content,
you're encouraged to make content,
665
00:33:53,840 --> 00:33:58,000
you're given the first taste of fame
from that content and, therefore,
666
00:33:58,000 --> 00:34:01,000
you create more and more
and constantly feed that algorithm
667
00:34:01,000 --> 00:34:02,080
for TikTok.
668
00:34:02,080 --> 00:34:05,360
It's almost natural to become
a reverse engineer
669
00:34:05,360 --> 00:34:09,560
of the TikTok algorithm, to do
what the algorithm wants you to do,
670
00:34:09,560 --> 00:34:12,360
do the kind of stuff that gets you
lots of views and impressions.
671
00:34:12,360 --> 00:34:15,720
This will very naturally
sort of influence people's behaviour
672
00:34:15,720 --> 00:34:19,120
and, you know, the videos that
they actually do create.
673
00:34:19,120 --> 00:34:22,600
Do you think that it's changed
how people behave
674
00:34:22,600 --> 00:34:25,000
and how people think it's acceptable
to behave,
675
00:34:25,000 --> 00:34:27,800
these kinds of TikTok videos
and posts online?
676
00:34:27,800 --> 00:34:31,040
I think 20 years ago, this simply
wouldn't have happened.
677
00:34:31,040 --> 00:34:33,640
The village has been turned into
a tourist attraction -
678
00:34:33,640 --> 00:34:38,400
a fairground, almost - where people
think that they can come here
679
00:34:38,400 --> 00:34:45,120
and film our family home as if
it's some sort of macabre exhibit.
680
00:34:45,120 --> 00:34:47,600
TikTok have a responsibility.
681
00:34:47,600 --> 00:34:50,520
And I'd like to ask,
"Do they think they are
682
00:34:50,520 --> 00:34:53,240
"taking that responsibility
seriously enough?"
683
00:34:53,240 --> 00:34:56,480
Cos I certainly don't think
they are.
684
00:34:56,480 --> 00:34:58,920
An inquest has determined
that Nicola's death
685
00:34:58,920 --> 00:35:01,000
was due to accidental drowning.
686
00:35:02,920 --> 00:35:04,520
From investigating these cases,
687
00:35:04,520 --> 00:35:07,320
it seems to me that something new
is happening on TikTok.
688
00:35:07,320 --> 00:35:10,360
The social media site appears to be
recommending this kind of content
689
00:35:10,360 --> 00:35:13,040
to users who just wouldn't
have come across it before,
690
00:35:13,040 --> 00:35:16,120
and then they decide to participate
in a way that they just wouldn't
691
00:35:16,120 --> 00:35:19,320
have done previously. It's almost
like a murder mystery game,
692
00:35:19,320 --> 00:35:21,400
and it gets really out of control.
693
00:35:21,400 --> 00:35:25,200
There are misleading and false
claims targeting innocent people,
694
00:35:25,200 --> 00:35:28,560
and they seem to be enjoyed
by millions.
695
00:35:48,560 --> 00:35:50,640
So, are TikTok aware of
what's going on?
696
00:35:50,640 --> 00:35:52,600
And what are they doing about it?
697
00:35:52,600 --> 00:35:55,600
I've tracked down someone
who used to be on the inside
698
00:35:55,600 --> 00:35:56,840
for some answers.
699
00:35:56,840 --> 00:36:01,040
We've revoiced him and are calling
him Lucas to protect his identity.
700
00:36:01,040 --> 00:36:04,120
I worked in a role in data strategy
and analysis,
701
00:36:04,120 --> 00:36:05,760
particularly focused on revenue
702
00:36:05,760 --> 00:36:08,000
and the profit-driving side
of the business.
703
00:36:08,000 --> 00:36:10,600
Do you think when you were there
that they were equipped
704
00:36:10,600 --> 00:36:12,760
to become more than just
a dancing app,
705
00:36:12,760 --> 00:36:15,440
that they were equipped
to deal with videos about
706
00:36:15,440 --> 00:36:18,880
stuff happening in the news across
a whole range of different issues?
707
00:36:18,880 --> 00:36:22,400
No, because it grew so fast that
they couldn't possibly keep up with
708
00:36:22,400 --> 00:36:25,280
or predict every single way
the app was going to go.
709
00:36:25,280 --> 00:36:28,320
But in terms of dangerous content,
at least, I haven't ever heard
710
00:36:28,320 --> 00:36:31,640
of them trying to proactively
prevent it from getting big.
711
00:36:31,640 --> 00:36:33,360
And in general, they don't want to.
712
00:36:33,360 --> 00:36:36,400
They don't want to stand in the way
of entertainment growing quickly
713
00:36:36,400 --> 00:36:39,360
on their platform. They don't want
to be content moderators
714
00:36:39,360 --> 00:36:41,880
because then they are responsible
for the content.
715
00:36:41,880 --> 00:36:44,200
You said that phrase -
"They don't want to get in the way
716
00:36:44,200 --> 00:36:46,920
"of entertainment and of growth."
717
00:36:46,920 --> 00:36:48,560
Why is that? Money?
718
00:36:48,560 --> 00:36:50,240
Yeah, it's about money.
719
00:36:50,240 --> 00:36:52,720
The more users they have on
the platform,
720
00:36:52,720 --> 00:36:54,680
spending more time watching videos,
721
00:36:54,680 --> 00:36:58,200
the more ads they can sell,
and at a higher price.
722
00:36:58,200 --> 00:37:01,000
It feels like an engine to make
revenue off people's time
723
00:37:01,000 --> 00:37:02,840
in unproductive ways.
724
00:37:02,840 --> 00:37:04,480
It feels potentially dangerous,
725
00:37:04,480 --> 00:37:07,760
and it's hard to get excited about
working towards that goal.
726
00:37:07,760 --> 00:37:11,280
It's probably the most addictive
platform that we've encountered yet,
727
00:37:11,280 --> 00:37:14,720
and I think that's a real danger,
especially because of how young
728
00:37:14,720 --> 00:37:18,520
the audience is
and how impressionable they are.
729
00:37:18,520 --> 00:37:21,880
There was evidence that TikTok's
algorithm was in some way driving
730
00:37:21,880 --> 00:37:26,640
the frenzy of strange behaviour
in both the Bulley and Idaho cases.
731
00:37:26,640 --> 00:37:29,000
But how widespread was the problem?
732
00:37:29,000 --> 00:37:32,360
Was there evidence that TikTok
was connected to other outbreaks
733
00:37:32,360 --> 00:37:34,440
of unusual behaviour?
734
00:37:34,440 --> 00:37:36,480
SHOUTING
735
00:37:39,160 --> 00:37:40,440
Oh, my God!
736
00:37:42,160 --> 00:37:44,840
The so-called "TikTok protests"
have continued to take place
737
00:37:44,840 --> 00:37:48,560
in Britain's schools, as hundreds
of pupils rebelled against teachers
738
00:37:48,560 --> 00:37:52,360
over new rules, with some clips
attracting millions of views.
739
00:37:52,360 --> 00:37:55,840
In February 2023, another frenzy
took hold,
740
00:37:55,840 --> 00:37:58,040
this time in Britain's schools.
741
00:37:59,080 --> 00:38:02,080
A protest at Rainford High
about school uniform checks
742
00:38:02,080 --> 00:38:03,960
was posted on TikTok,
743
00:38:03,960 --> 00:38:08,320
and within three days, students at
over 60 schools filmed protests.
744
00:38:08,320 --> 00:38:12,320
Within a week, there were over
100 schools involved.
745
00:38:12,320 --> 00:38:15,360
Windows were smashed,
trees were set on fire
746
00:38:15,360 --> 00:38:17,720
and teachers were pushed over.
747
00:38:19,080 --> 00:38:20,760
School protests aren't new,
748
00:38:20,760 --> 00:38:23,440
but I noticed that a significant
number of students
749
00:38:23,440 --> 00:38:26,680
seemed to be behaving
in an unusually extreme way.
750
00:38:28,520 --> 00:38:32,200
Some schools were forced
to get police involved.
751
00:38:32,200 --> 00:38:35,560
According to TikTok, most of
the videos showed pupils engaging
752
00:38:35,560 --> 00:38:40,040
in peaceful demonstrations, but
it all felt quite familiar to me -
753
00:38:40,040 --> 00:38:42,960
the sheer speed at which the posts
spread on TikTok
754
00:38:42,960 --> 00:38:46,200
and the way the users
felt emboldened to participate.
755
00:38:48,040 --> 00:38:50,720
Most of the worrying videos
we would see
756
00:38:50,720 --> 00:38:52,680
were by really young users.
757
00:38:52,680 --> 00:38:56,200
These kinds of, like, frenzies of,
like, what is currently cool to do
758
00:38:56,200 --> 00:38:58,120
in a school have always
been the case,
759
00:38:58,120 --> 00:39:01,240
but I feel like what TikTok
is enabling people to do now
760
00:39:01,240 --> 00:39:03,920
is to take one thing that's viral
in one school
761
00:39:03,920 --> 00:39:06,680
and transport it to, like,
the whole region
762
00:39:06,680 --> 00:39:08,840
and making it a competition
763
00:39:08,840 --> 00:39:12,480
over who can one-up the other schools
and make it more extreme.
764
00:39:12,480 --> 00:39:15,800
TikTok itself will not solve
these problems
765
00:39:15,800 --> 00:39:20,040
because these frenzies
are good for the business.
766
00:39:21,840 --> 00:39:23,040
During the school protests,
767
00:39:23,040 --> 00:39:26,320
I decided to see what type of
content the algorithm might offer
768
00:39:26,320 --> 00:39:28,880
a 15-year-old boy's For You feed,
769
00:39:28,880 --> 00:39:31,640
so I set up an undercover account.
770
00:39:31,640 --> 00:39:34,800
Right, so these are all videos
about football
771
00:39:34,800 --> 00:39:37,400
because we've shown an interest
in football.
772
00:39:37,400 --> 00:39:39,720
School uniforms -
what is it?
773
00:39:39,720 --> 00:39:41,720
It's a lovely thing provided
by the school...
774
00:39:41,720 --> 00:39:43,760
It's not even provided
by the schools, by the way.
775
00:39:43,760 --> 00:39:46,440
It looks shit, it feels shit and it
costs way too much for what it is.
776
00:39:46,440 --> 00:39:49,440
The fourth post that came up
was from an influencer called Adrian,
777
00:39:49,440 --> 00:39:52,600
who posts self-improvement videos,
and seemed to be encouraging
778
00:39:52,600 --> 00:39:53,840
anti-school behaviour.
779
00:39:53,840 --> 00:39:56,320
You know school is pointless
and you know it's not helping you
780
00:39:56,320 --> 00:39:58,400
prepare for your future
one little bit.
781
00:39:58,400 --> 00:39:59,560
What are we doing here?!
782
00:39:59,560 --> 00:40:02,280
I'll tell you what - being slaves
to the government, making sure
783
00:40:02,280 --> 00:40:06,360
the institution that is a worker
and slave development plan
784
00:40:06,360 --> 00:40:08,200
continues to function...
785
00:40:08,200 --> 00:40:10,440
Over the next few days,
the algorithm drove
786
00:40:10,440 --> 00:40:12,640
more and more of his content
to my feed.
787
00:40:12,640 --> 00:40:16,680
Some bitch told me I need to sit
here in this shitty plastic chair.
788
00:40:16,680 --> 00:40:19,480
They have the audacity
to tell the youth
789
00:40:19,480 --> 00:40:21,200
what to do with their future!
790
00:40:21,200 --> 00:40:24,680
I teach not to be reliant on
higher-ups and teachers. I teach...
791
00:40:24,680 --> 00:40:28,000
Attached to Adrian's account
is a link to his business.
792
00:40:28,000 --> 00:40:30,720
Thousands of teachers across the UK
are worried about me.
793
00:40:30,720 --> 00:40:32,200
They literally did me a favour.
794
00:40:32,200 --> 00:40:34,080
They gave me free exposure.
795
00:40:34,080 --> 00:40:36,520
Thank you to those of you
who continue to stand up for me
796
00:40:36,520 --> 00:40:38,960
against schools, against teachers,
against the boards.
797
00:40:38,960 --> 00:40:42,000
I started to look at some of
the comments under his videos.
798
00:40:42,000 --> 00:40:45,120
Many seemed to be from teenage boys
in the UK.
799
00:40:45,120 --> 00:40:48,240
They suggested that some of them
were changing their behaviour
800
00:40:48,240 --> 00:40:50,880
after watching Adrian's videos.
801
00:40:50,880 --> 00:40:55,120
Others pushed back after noticing
the behaviour of their friends.
802
00:41:06,120 --> 00:41:09,240
Adrian's agreed to have
a conversation with me.
803
00:41:09,240 --> 00:41:13,160
I want to ask him about his content
and the impact it could be having.
804
00:41:13,160 --> 00:41:15,680
..every single day
without realising...
805
00:41:15,680 --> 00:41:19,160
Not one single piece of content,
not one video that I've put out
806
00:41:19,160 --> 00:41:23,920
is in any way misinformation
or hate speech or anything
807
00:41:23,920 --> 00:41:27,600
with any kind of negative impact
on my audience.
808
00:41:27,600 --> 00:41:31,560
I encourage them to rebel against
ridiculous rules in schools,
809
00:41:31,560 --> 00:41:33,840
where you're discouraged
from simple communication.
810
00:41:33,840 --> 00:41:35,680
I mean, you know, it's crazy.
811
00:41:35,680 --> 00:41:38,120
What do you think about
those kinds of protests?
812
00:41:38,120 --> 00:41:39,760
Do you think they've gone too far?
813
00:41:39,760 --> 00:41:42,720
Do you think that they still are
within the bounds of legitimate,
814
00:41:42,720 --> 00:41:45,360
I don't know, opposition
to school rules?
815
00:41:45,360 --> 00:41:46,600
I think these things are very rare
816
00:41:46,600 --> 00:41:48,440
and they're definitely extreme,
to say the least,
817
00:41:48,440 --> 00:41:50,240
cos if you look at my following -
let's say...
818
00:41:50,240 --> 00:41:52,160
I've got 1.2 million followers
on my account,
819
00:41:52,160 --> 00:41:57,120
of which... let's chop it
down to 800K are schoolchildren.
820
00:41:57,120 --> 00:42:00,320
How many of those children,
realistically, were involved
821
00:42:00,320 --> 00:42:03,040
in those acts? I mean, a tiny,
tiny percentage.
822
00:42:03,040 --> 00:42:04,320
I can't control that.
823
00:42:04,320 --> 00:42:07,640
How do you balance making sure
that you are encouraging people
824
00:42:07,640 --> 00:42:10,360
to express what they think,
but not to take it too far
825
00:42:10,360 --> 00:42:13,720
and actually to, you know, harass
teachers or call them horrible names
826
00:42:13,720 --> 00:42:15,760
or make sure that they're not
behaving in, like,
827
00:42:15,760 --> 00:42:17,640
a violent way towards
other people at school?
828
00:42:17,640 --> 00:42:19,160
I mean, how do you tread that line?
829
00:42:19,160 --> 00:42:21,720
Or is it not possible to,
with your content?
830
00:42:21,720 --> 00:42:24,120
Ultimately, if I'm putting out
a video
831
00:42:24,120 --> 00:42:27,360
to 1.5 million people, I'm not...
832
00:42:27,360 --> 00:42:28,880
I absolutely cannot cover
833
00:42:28,880 --> 00:42:31,480
for the 200 out of
those 1.5 million people
834
00:42:31,480 --> 00:42:34,000
who decide to do something
absolutely ridiculous.
835
00:42:34,000 --> 00:42:36,720
I mean, in none of my videos
will you ever see me condone
836
00:42:36,720 --> 00:42:39,280
anything like the things
that you've listed.
837
00:42:39,280 --> 00:42:42,040
I think to say that social media
and TikTok in particular
838
00:42:42,040 --> 00:42:45,360
or my content is to blame for
these things is completely false.
839
00:43:11,320 --> 00:43:13,360
What was interesting about
the school frenzy
840
00:43:13,360 --> 00:43:15,600
was that it was real world
from the off.
841
00:43:15,600 --> 00:43:17,880
Users could participate immediately.
842
00:43:17,880 --> 00:43:20,720
That was maybe one reason
why it spiked so quickly
843
00:43:20,720 --> 00:43:22,360
and with such intensity.
844
00:43:23,960 --> 00:43:27,040
A few months later, in June,
I spotted some more protests
845
00:43:27,040 --> 00:43:29,760
involving young people
unfolding in France.
846
00:43:29,760 --> 00:43:33,880
Protests and unrest erupted
in the region around Paris overnight
847
00:43:33,880 --> 00:43:38,320
after police shot dead a 17-year-old
who failed to stop when ordered to.
848
00:43:38,320 --> 00:43:41,160
The statement from the police said
the police officers
849
00:43:41,160 --> 00:43:43,400
were feeling threatened
and had to shoot,
850
00:43:43,400 --> 00:43:45,720
but this video posted
on social media
851
00:43:45,720 --> 00:43:48,400
told a completely different story.
852
00:43:48,400 --> 00:43:51,520
This is what created a lot of anger.
853
00:43:51,520 --> 00:43:53,520
The French President
Emmanuel Macron
854
00:43:53,520 --> 00:43:56,480
is holding a crisis meeting
right now.
855
00:43:56,480 --> 00:43:58,200
He has called for calm.
856
00:43:59,320 --> 00:44:01,600
They stole a bus and burned it.
857
00:44:02,880 --> 00:44:06,600
The French President levelled
the blame at TikTok and Snapchat.
858
00:44:06,600 --> 00:44:09,040
TRANSLATION:
859
00:44:31,920 --> 00:44:34,200
Was there another TikTok frenzy
at play?
860
00:44:34,200 --> 00:44:37,160
Or was the French President
just deflecting responsibility
861
00:44:37,160 --> 00:44:40,320
from a shocking incident
that provoked mass unrest?
862
00:44:41,680 --> 00:44:44,120
I looked through the content
spreading on social media
863
00:44:44,120 --> 00:44:45,920
using Nahel's name.
864
00:44:45,920 --> 00:44:50,960
On Snapchat, I found public videos
racking up over 167,000 views.
865
00:44:50,960 --> 00:44:54,520
That doesn't include private chats
that I can't access.
866
00:44:54,520 --> 00:44:57,080
Even so, on TikTok,
videos using this hashtag
867
00:44:57,080 --> 00:45:00,240
racked up over 853 million views.
868
00:45:00,240 --> 00:45:03,360
Quite a few of the TikToks
were Snapchat videos reshared
869
00:45:03,360 --> 00:45:07,080
with a much higher reach than just
friends or those who live locally.
870
00:45:09,240 --> 00:45:11,440
The kind of problem
that we're seeing now is that
871
00:45:11,440 --> 00:45:14,720
bad behaviour in one part
of the world
872
00:45:14,720 --> 00:45:17,600
can very quickly spread
to other parts of the world.
873
00:45:17,600 --> 00:45:21,520
And so, you know, when you log on
to TikTok, in some respects,
874
00:45:21,520 --> 00:45:23,720
you're looking at
all of the bad behaviour
875
00:45:23,720 --> 00:45:27,040
from all of the different locations
in the country, all at once,
876
00:45:27,040 --> 00:45:29,200
and so, you know, it's only natural
to imagine
877
00:45:29,200 --> 00:45:31,560
that's going to inspire
some copycatting.
878
00:45:31,560 --> 00:45:33,640
The French protests
don't surprise me.
879
00:45:33,640 --> 00:45:36,160
TikTok can definitely drive
more eyeballs in a way
880
00:45:36,160 --> 00:45:38,120
that escalates situations.
881
00:45:38,120 --> 00:45:40,560
I don't think it can
validate behaviour.
882
00:45:40,560 --> 00:45:44,440
Groupthink is very real,
especially in places like TikTok.
883
00:45:44,440 --> 00:45:47,160
I do think TikTok wants to
de-amplify content
884
00:45:47,160 --> 00:45:49,080
that's explicitly dangerous,
885
00:45:49,080 --> 00:45:52,200
but it doesn't want to overly censor
or slow things down
886
00:45:52,200 --> 00:45:53,880
which are developing as viral.
887
00:45:53,880 --> 00:45:56,160
And that can be a tricky line
to walk.
888
00:45:56,160 --> 00:45:59,280
But on the other hand, TikTok
can make people aware of protests
889
00:45:59,280 --> 00:46:01,600
in a positive way too.
890
00:46:01,600 --> 00:46:04,520
I spotted several videos
showing violence and devastation
891
00:46:04,520 --> 00:46:07,000
in a town called Viry-Chatillon.
892
00:46:07,000 --> 00:46:10,280
Protests seem to have spread
at an unusually fast rate
893
00:46:10,280 --> 00:46:12,640
and with an unusual intensity.
894
00:46:14,640 --> 00:46:17,360
Bonjour! Bonjour, Marianna!
895
00:46:17,360 --> 00:46:19,800
What part did TikTok play here?
896
00:46:19,800 --> 00:46:21,760
Jean is the town's mayor.
897
00:48:18,720 --> 00:48:21,560
The French EU Commissioner
criticised social media sites
898
00:48:21,560 --> 00:48:23,880
for not doing enough
to deal with the content.
899
00:48:39,280 --> 00:48:44,840
As a moderator, it felt like
you were there as kind of a front
900
00:48:44,840 --> 00:48:47,480
for TikTok to say that
they were trying their best.
901
00:48:47,480 --> 00:48:49,720
But you were very much aware
while you were moderating
902
00:48:49,720 --> 00:48:53,840
that this is an impossible task
because of how quickly
903
00:48:53,840 --> 00:48:58,240
someone can upload something and have
it available for everyone to see.
904
00:48:58,240 --> 00:49:02,040
You're constantly playing catch-up
with, like, an invisible beast,
905
00:49:02,040 --> 00:49:05,440
which the algorithm is. It's
kind of like wildfire spreading.
906
00:49:05,440 --> 00:49:07,240
So you can kind of put out
the main fire,
907
00:49:07,240 --> 00:49:09,520
but, like, all of the little fires
are still there
908
00:49:09,520 --> 00:49:11,600
and people can see it and...
909
00:49:11,600 --> 00:49:15,600
Yeah, it's dangerous.
910
00:49:15,600 --> 00:49:17,360
SCREAMING
911
00:49:18,880 --> 00:49:22,240
Only weeks after my trip to Paris,
something else happened -
912
00:49:22,240 --> 00:49:25,440
disorder in the centre of London,
with hundreds of teenagers
913
00:49:25,440 --> 00:49:29,720
descending on Oxford Circus,
talking about looting the shops.
914
00:49:29,720 --> 00:49:31,560
Police issued a dispersal order.
915
00:49:31,560 --> 00:49:35,120
Sadiq Khan directly blamed
social media.
916
00:49:35,120 --> 00:49:38,040
I'm worried about this nonsense
we've seen on TikTok
917
00:49:38,040 --> 00:49:40,720
encouraging people
to go to Oxford Street.
918
00:49:40,720 --> 00:49:43,560
I'd encourage anybody
who's seen that not to go.
919
00:49:43,560 --> 00:49:48,960
Don't allow yourself to be sucked in
by this sort of criminality.
920
00:49:48,960 --> 00:49:52,120
What seemed to have started
on Snapchat, like in France,
921
00:49:52,120 --> 00:49:55,600
had been picked up and shared
more widely on TikTok.
922
00:49:55,600 --> 00:49:59,000
Several people I've spoken to told me
that
if they hadn't seen those videos
923
00:49:59,000 --> 00:50:01,240
on TikTok, they wouldn't
have turned up.
924
00:50:02,800 --> 00:50:07,320
If I was a copper today, I'd
be absolutely furious with TikTok,
925
00:50:07,320 --> 00:50:10,320
and I want the Government
to throw the book at that lot.
926
00:50:10,320 --> 00:50:13,800
That should never have been
allowed up.
927
00:50:13,800 --> 00:50:16,720
When I first started using TikTok
during the Covid-19 pandemic,
928
00:50:16,720 --> 00:50:18,520
it was all dances and life hacks.
929
00:50:18,520 --> 00:50:20,760
Now almost every major
global news event
930
00:50:20,760 --> 00:50:23,480
unfolds on my For You page
in real time,
931
00:50:23,480 --> 00:50:26,800
including events I just wouldn't
have come across before.
932
00:50:26,800 --> 00:50:28,600
When I first heard about
those Idaho murders,
933
00:50:28,600 --> 00:50:30,560
I had no idea what I'd uncover.
934
00:50:30,560 --> 00:50:34,240
Having spoken to TikTok insiders and
people caught up in these frenzies,
935
00:50:34,240 --> 00:50:37,120
it seems to me that they're really
difficult to get on top of -
936
00:50:37,120 --> 00:50:39,800
and not just that,
but there isn't the incentive to.
937
00:50:39,800 --> 00:50:42,960
After all, a slower
and less participatory TikTok
938
00:50:42,960 --> 00:50:45,920
doesn't seem to be the goal
for the social media company.
939
00:50:45,920 --> 00:50:48,680
All of these frenzies
seem really out of control,
940
00:50:48,680 --> 00:50:51,640
but what happens when someone
works out how to harness them -
941
00:50:51,640 --> 00:50:54,520
and the behaviour connected
to them - for their own aims?
942
00:50:54,520 --> 00:50:57,120
That seems pretty scary.
943
00:50:57,120 --> 00:51:00,680
Do you think TikTok is good or bad
for the world?
944
00:51:00,680 --> 00:51:03,000
I think it's probably
a net negative.
945
00:51:03,000 --> 00:51:06,480
I don't believe that a platform
with short, addictive videos
946
00:51:06,480 --> 00:51:09,960
consuming so much of our time
can be a positive.
947
00:51:11,840 --> 00:51:14,960
If you had children,
would you let them use TikTok?
948
00:51:14,960 --> 00:51:20,360
No, definitely not. I would not
let any of my children use TikTok.
949
00:51:20,360 --> 00:51:23,360
I think it's impossible
to work at TikTok
950
00:51:23,360 --> 00:51:27,840
and to then send your children
on that site as well.
951
00:51:27,840 --> 00:51:30,680
It's very telling that, for all of the
CEOs of big social media companies,
952
00:51:30,680 --> 00:51:34,160
their children aren't on
social media platforms.
953
00:51:34,160 --> 00:51:38,480
So, long term, I do think
it is necessary for regulation
954
00:51:38,480 --> 00:51:41,560
to step in and create
real incentives for platforms
955
00:51:41,560 --> 00:51:43,160
to design responsibly.
956
00:51:43,160 --> 00:51:46,320
If you just want to maximise,
you know, your usage next week,
957
00:51:46,320 --> 00:51:49,040
show people lots of misinformation,
show people hate speech,
958
00:51:49,040 --> 00:51:51,480
you know, show people dangerous,
violent content,
959
00:51:51,480 --> 00:51:55,600
that is what is going to get people
to watch your platform next week.
960
00:51:55,600 --> 00:51:58,480
If you want to maximise
how many people are using your app
961
00:51:58,480 --> 00:52:00,920
a year from now
or five years from now,
962
00:52:00,920 --> 00:52:04,680
you're going to have to think about
something very, very different.