0:00
testimony that's truthful and honest and complete let me ask you this Joe Biden last year said that Xi Jinping was a
0:05
dictator do you agree with Joe Biden is Xi Jinping a dictator Senator I'm not going to comment on any world leaders what why
0:12
won't you answer these very simple questions Senator it's not appropriate for me as a businessman to come in and comment on world leaders are you scared that
0:17
you'll lose your job if you say anything negative about the Chinese Communist party I disagree with that you will find content that is critical of
0:23
China the next time you go on are you scared that you'll be arrested and disappear the next time you go to Mainland China Senator I you will find
0:29
content that is critical of China and any other country freely on TikTok okay okay let's let's turn to what TikTok a
0:36
tool of the Chinese Communist party is doing to America's youth does the uh does the name Mason Edens ring a
0:43
bell uh Senator you may have to give me more specifics if you don't mind yeah he was a 16-year-old Arkansan after a
0:49
breakup in 2022 he went on your platform and searched for things like inspirational quotes and positive
0:55
affirmations instead he was served up numerous videos glamorizing suicide
1:01
until he killed himself by gun what about the name Chase
1:06
Nasca does that ring a bell would you mind giving me more details please he was a 16-year-old who saw more than a thousand
1:13
videos on your platform about violence and suicide until he took his own life by stepping in front of a train
1:19
are you aware that his parents Dean and Michelle are suing TikTok and ByteDance for pushing their son to take his
1:25
own life uh yes I'm aware of that okay
1:31
finally Mr Chew um has the Federal Trade Commission sued TikTok during the Biden
1:37
Administration Senator I cannot talk about whether there's any are you being are you currently being sued by the
1:42
Federal Trade Commission Senator I cannot talk about any potential lawsuit you say potential I mean actual are you being sued
1:48
by the Federal Trade Commission Senator I think I've given you my answer I cannot talk about no Ms Yaccarino's company is
1:55
being sued I believe Mr Zuckerberg's company is being sued I believe yet TikTok the agent of the Chinese Communist
2:02
party is not being sued by the Biden Administration are you familiar with
2:07
the name Christina Caffarra you may have to give me more details Christina Caffarra was a paid
2:14
adviser to ByteDance your Communist-influenced parent company she was then
2:20
hired by the Biden FTC to advise on how to sue Mr Zuckerberg's
2:26
company Senator ByteDance is a global company and not a Chinese
2:31
company public reports indicate that your lobbyists visited the White House more than 40 times in 2022 how many times did
2:38
your company visit did your company's lobbyist visit the White House last year I I don't know that Senator are you are
2:43
you aware that the Biden campaign and the Democratic National Committee is on your platform they have TikTok accounts
2:50
Senator we encourage people to come on they won't let them they won't let their staffers use their personal phones they
2:56
give them separate phones that they only use TikTok on we encourage everyone to join including so all these companies
3:02
are being sued by the FTC you're not the FTC has a former paid advisor to your parent company talking about how they can sue Mr
3:08
Zuckerberg's company Joe Biden's re-election campaign and the Democratic National Committee is on your platform
3:14
let me ask you have you or anyone else at TikTok communicated with or
3:20
coordinated with the Biden Administration the Biden campaign or the Democratic National Committee to
3:25
influence the flow of information on your platform we work with uh anyone any
3:30
creators who want to use our campaign it's it's all the same um process that we have okay so what we have here we
3:35
have a company that's a tool of the Chinese Communist party that is poisoning the minds of America's children in some cases driving them to
3:42
suicide and that at best the Biden Administration is taking a pass on at worst may be in collaboration with thank
3:49
you Mr Chew thank you Senator Cotton so we're going to take a break now we're on the
3:54
second roll call Members can take advantage of that if they wish the break will last about 10 minutes
4:00
please do your best to
4:10
return
23:06
Senate Judiciary Committee will resume we have nine Senators who have not uh asked questions yet in seven minute
23:13
rounds and we'll uh turn first to Senator Padilla thank you Mr chair um colleagues as
23:22
we reconvene uh I'm proud once again to uh uh share that I am one of the few
23:30
senators with younger children uh and I lead with that because as we are having
23:37
this conversation today uh it's not lost on me that uh between my children who are all now in the teen and pre-teen
23:43
category uh and their friends uh I see this issue very up
23:49
close and personal um and in that spirit I want to take a second to just acknowledge and
23:55
thank all the parents who who are in the audience today many of whom have shared
24:00
their stories with our offices and I credit them uh for um finding strength
24:08
through their suffering through their struggle and channeling that into the
24:14
advocacy that is making a difference I thank all of you um now I appreciate
24:21
again personally the challenges that parents and caretakers uh School personnel and
24:28
others face in helping our young people navigate this uh world of social media
24:35
and technology in general now the services our children are growing up with uh provide them unrivaled access to
24:43
information I mean this is beyond what previous generations uh have experienced
24:48
and that includes learning opportunities socialization and much much more but we
24:53
also clearly have a lot of work to do to better protect our children from
25:00
the Predators and predatory behavior that these technologies have
25:06
enabled and yes Mr Zuckerberg that includes exacerbating the Mental Health
25:12
crisis in America nearly all teens we know have
25:17
access to uh smartphones and the internet uh and use the internet daily
25:24
and while guardians do have primary responsibility for caring for our children the old adage says uh it takes
25:32
a village uh and so society as a whole
25:37
including leaders in the tech industry must prioritize the health and
25:43
safety of our children let me dive into my questions now and be specific platform by platform
25:49
Witness by witness on the topic of some of the parental tools you have each made
25:55
reference to Mr Citron how many minors are on Discord and how many of them have
26:00
caretakers that have adopted your family center tool and if you don't have the numbers just say that quickly and uh
26:06
provide that to our office uh we can follow up with you on that how have you ensured that young
26:13
people and their guardians are aware of the tools that you offer um we make it very clear to our users
26:19
to teens on our platform what tools are available that sounds good but what specifically do you do what that what
26:26
may be clear to you is not clear to the general public so what do you do in your opinion to make it very clear uh so our
26:31
teen safety assist which is a feature that um helps uh teens keep themselves
26:36
safe in addition to blocking and blurring images that may be sent to them that is on by default for teen accounts and it cannot be turned off we also have
26:44
we we market it to our uh teen users directly in our platform we launched our family center we created a promotional
26:51
video we put it directly on our product so when every Teen um opened the app in fact every user opened the app they got
26:57
an alert like hey hey Discord has this um they they want you to use it thank you I look forward to the the data that we're requesting for Mr Zuckerberg
27:03
across all of Meta's services from Instagram Facebook Messenger and Horizon uh how many minors use your applications
27:10
and of those minors how many have a caretaker that has adopted the parental supervision tools that you
27:18
offer sorry I can follow up with the specific stats on that okay it would be very helpful not just for us to know but for you to know as a leader of your
27:26
company uh and now same question how are you ensuring that uh young people and their guardians are aware of the tools
27:32
that you offer uh we run pretty extensive ad campaigns both on our platforms and outside we work with
27:38
creators and organizations like Girl Scouts to make sure that this is broadly a that there's broad awareness of the
27:45
tools okay Mr Spiegel how many minors use Snapchat and of those minors how many
27:51
have caretakers that are registered with your family center Senator I believe approximately in the United States there
27:57
are approximately 20 million uh Teenage users of Snapchat I believe approximately 200,000 parents use Family
28:03
Center and about 400,000 teens have linked their account to their parents using Family Center so 200,000 400,000
28:09
sounds like a big number but small in percentage of the minors using Snapchat uh what are you doing to ensure that
28:15
young people and their guardians are aware of the tools you offer Senator we uh create a banner for Family Center on the
28:21
user's profile so that accounts we believe may be of the age that they could be parents can see uh the the entry
28:27
point into Family Center easily okay uh Mr Chew how many minors are on TikTok and how many of them have a caregiver
28:33
that uses your family tools Senator I need to get back to you on the specific numbers um but we were one of the first
28:40
platforms to give what we call Family pairing to parents you go to settings you turn on a QR code your teenager's QR
28:46
code and yours you scan it and what it allows you to do is you can set screen time limits you can um filter out some
28:52
keywords you can turn on the more restricted mode and we're always talking to parents um I'm I met a you know a
28:58
group of parents and teenagers and the high school teachers last week to talk about what more we can provide in the family pairing mode Ms Yaccarino how many
29:06
minors use X and are you planning to implement safety measures or guidance for caretakers like uh your peer
29:13
companies have thank you Senator less than 1% of all US users are between the
29:19
ages of 13 and 17 less than 1% of how many of 90 million US users okay so
29:26
still hundreds of thousands continue yes yes and every single one is very important uh being a 14-month-old
29:33
company we have reprioritized child protection and safety measures and we
29:38
have just begun to talk about and discuss how we can enhance those with parental
29:44
controls let me uh continue with a followup question for um Mr Citron in
29:51
addition to keeping parents informed about the nature of various internet services there's a lot more we obviously need to
29:57
do do for today's purposes while many companies offer a broad range of quote unquote user empowerment tools it's
30:04
helpful to understand whether young people even find these tools helpful so appreciate you sharing your teen safety
30:10
assist on the tools and how you're advertising it but have you conducted any assessments uh of how these features
30:17
are impacting minors' use of your platform Our intention is to is to give
30:24
teens tools capabilities that they can use to keep themselves safe and also so our teams can help keep teens safe um we
30:31
recently launched teen safety assist last year and we I I do not have um a study off the top of my head but we'd be happy to follow up with you on that okay
30:37
uh my time is up I'll have followup questions for uh each of you either in the second round or through statements
30:42
for the record on a a similar assessment of the tools that you've proposed thank you Mr chair thank you Senator Padilla
30:48
Senator Kennedy thank you all for being
30:55
here uh Mr Spiegel I see you hiding down
31:04
there what does yada yada yada
31:09
mean I'm not familiar with the term Senator very
31:16
uncool can we agree that what you do not what you say what you do is what you believe and
31:24
everything else is just cottage cheese
31:30
yes Senator you agree with that speak up don't be
31:36
shy I I I've listened to to you today I've heard a lot of yada yada
31:44
yada-ing and I've heard you talk about the reforms you've made and I appreciate
31:50
them and I've heard you talk about the reforms you're going to
31:55
make but I don't think you're going to solve the problem I think Congress is going to
32:02
have to help you I think the reforms you're talking about to some extent are going to be
32:08
like putting putting paint on rotten wood and I'm not sure you're going to
32:14
support this legislation I'm not um the the fact is that you and some
32:21
of your internet colleagues who are not here are no longer you're you you're not
32:27
companies you're countries you're you're very very
32:33
powerful and you and some of your colleagues who are not here have blocked everything we have
32:41
tried to do in terms of reasonable
32:46
regulation everything from privacy to child
32:52
exploitation and um in fact we we have a new def definition of
32:59
recession um a recession is when we know we're in a recession when Google has to
33:04
lay off 25 members of Congress that's what we're down
33:10
to we're also down to this fact that your platforms are hurting
33:15
children I'm not saying they're not doing some good things but they're hurting children and I know how to count votes
33:23
and if this bill comes to the floor of the United States Senate it will pass
33:28
what we're going to have to do and I say this with all the respect I can muster
33:33
is convince my good friend Senator Schumer to to go to Amazon buy a spine online
33:41
and bring this bill to the senate floor and uh the house will then pass
33:49
it now that's that's one person's opinion I may be wrong but I doubt
33:58
it uh Mr Zuckerberg let me ask you a couple of questions let's I might wax a
34:03
little philosophical here um I have to hand it to
34:11
you uh you you have um you have convinced over two billion
34:19
people to give up all of their personal information every bit of
34:25
it in exchange for getting to see what their high school friends had
34:31
for dinner Saturday night that's pretty much your business model isn't
34:37
it it's not how I would characterize it and we give people the ability to connect with the people they care about
34:43
and um and to engage with the topics that they care about and you and you
34:48
take this information this abundance of personal
34:54
information and then you develop algorithms to punch people's hot
35:02
buttons and send and and steer to them information that punches their hot
35:08
buttons again and again and again to keep them coming back and to keep them
35:13
staying longer and as a result your users see
35:19
only one side of an issue and so to some extent your
35:25
platform has become a killing field for the truth hasn't it I mean Senator I
35:30
disagree with that that characterization um you know we build ranking and recommendations because people have a
35:37
lot of friends and a lot of interests and they want to make sure that they see the content that's relevant to them um
35:43
we're trying to make a product that's useful to people and and make our services um as helpful as possible for
35:48
people to connect with the people they care about and the interests they care about but you don't show them both sides you don't give them balanced information
35:55
you just keep punching in their hot buttons punch in their hot buttons you don't show them balanced information so
36:02
people can discern the truth for themselves and and you rev them up so much that that so often your platform
36:10
and others becomes just cesspools of snark where nobody learns anything don't
36:18
they well Senator I disagree with that I think people can engage in the things that they're interested in um and learn
36:25
quite a bit about those we have done a a handful of different experiments and things in the past around news and
36:32
trying to show content on you know diverse set of of of perspectives I think that there's more that needs to be
36:39
explored there but I don't think that we can solve that by ourselves do you think I'm sorry to cut
36:44
you off Mr Mr President but I'm going to run out of time do do you think your
36:50
users really understand what they're giving to you all their personal
36:55
information and how you how you process it and how you monetize it do you think
37:00
people really understand uh Senator I think people understand the basic terms
37:08
I mean I think that there's I actually think that a lot of people it's been a couple years since we talked
37:14
about this does your user agreement still suck I I'm not sure how to answer that Senator
37:21
can you still have can you still hide a dead body in all that legalese
37:27
where nobody can find it Senator I'm not I'm not quite sure what you're referring to but I think people get the basic deal
37:34
of using these Services it's a free service you're using it to connect with the people you care about if you share
37:39
something with people other people will be able to see your information it's it's inherently you know if you're putting something out there to be shared
37:45
publicly um or with a private set of people it's you know you're inherently putting it out there so I think people
37:51
get that basic part of how Mr Zuckerberg you're in the foothills of creepy you
37:56
you track you track you track people who aren't even Facebook users
38:03
you track your own people your own users who are your product even even when
38:08
they're not on Facebook I mean I'm I'm going to land
38:14
This Plane pretty quickly Mr chairman I I I mean it's creepy and I understand
38:19
you make a lot of money doing it but I just wonder if if our
38:25
technology is greater than our Humanity I mean let
38:31
me ask you this final question Instagram is
38:37
harmful to young people isn't it Senator I disagree with that that's
38:42
not what the research shows on balance that doesn't mean that individual people don't have issues and that there aren't
38:48
things that that we need to do to to help provide the right tools for people but across all the research that we've
38:54
done internally I I mean this this the uh you know survey that uh the senator
39:00
previously cited um you know there are 12 or 15 different categories of harm
39:06
that we asked um teens if they felt that Instagram made it worse or better and
39:12
across all of them except for the one that that that um that Senator Hawley cited um more people said that using
39:18
Instagram made the issues they face either positive or let me we're just going to have to agree to disagree if if you believe
39:25
that Instagram I it's I'm not saying it's intentional but if you agree that
39:30
Instagram if you think that Instagram is not hurting millions of our young people particularly young teens particularly
39:37
young women you shouldn't be driving it
39:43
thanks Senator Butler thank you Mr chair and um thank
39:50
you to um our panelists who've come to uh have an important conversation
39:56
with us most importantly I want to appreciate the families uh who have uh shown up to continue to be remarkable um
40:04
champions of your children and your loved ones for um being here and in particular two California families um
40:12
that I was able to just talk to on on the break the families of Sammy Chapman
40:17
from Los Angeles and Daniel Puerta uh from Santa Clarita uh they are are here
40:23
today and are doing some incredible work uh to not just protect the memory and
40:29
Legacy of their boys um but the work that they're doing is going to protect
40:34
my nine-year-old uh and that is uh indeed why we're here there are a couple
40:39
questions that I want to ask um some individuals let me start with a question
40:44
for each of you uh Mr Citron have you ever sat with a family and talked about
40:50
their experience and what they need from your product yes or no uh yes I have spoken with parents
40:57
about how we can build tools to help them Mr Spiegel have you sat with families and young people to talk about
41:03
your products and what they need from your product yes Senator Mr Chew yes I
41:09
just did it two weeks ago for example I don't want to know what you did for the hearing prep Mr Chew I just wanted to
41:15
know if anything in terms of designing the product that you are
41:25
creating Mr Zuckerberg um have you sat with parents and young people to talk
41:30
about how you design product uh for uh your cons for your uh consumers yes over
41:37
the years I've had a lot of conversations with parents you know that's interesting Mr Zuckerberg because we talked about this last night and you
41:43
gave me a very different answer I asked you this very
41:49
question well I I told you that I wasn't that I didn't know what specific
41:55
processes our company has no Mr Zuckerberg you said to me that you had
42:01
not I I must have misspoke I I want to give you the room to missp misspeak Mr
42:07
Zuckerberg but I asked you this very question I asked all of you this question uh and you told me a very
42:14
different answer when we spoke but I won't belabor it can I um a number of
42:20
you have talked about the I'm sorry X uh Ms Yaccarino have you talked to
42:25
parents directly and young people about designing your product as a new leader of X the answer
42:32
is yes I've spoken to them about the behavioral patterns because less than
42:39
1% of our users are in that age group but yes I have spoken to them thank you
42:44
ma'am Mr Spiegel um there are a number of parents whose uh children have been
42:51
able to access uh illegal drugs on your platform what do you say to those
42:59
parents well Senator we are devastated that we cannot to the parents what do
43:04
you say to those parents Mr Spiegel I'm so sorry that we have not been able to prevent these tragedies we work very
43:11
hard to block all Search terms related to drugs from our platform we proactively look for and detect drug
43:17
related content we remove it from our platform preserve it as evidence we and then we refer it to law enforcement uh
43:23
for Action we've worked together with nonprofits and with families on education campaigns because the scale of
43:29
the fentanyl epidemic is extraordinary over 100,000 people lost their lives last year and we believe people need to know
43:35
that one pill can kill that campaign reached more than 200 was viewed more than 260 million times on Snapchat we
43:42
also there are two fathers in this room who lost their sons they're 16 years
43:48
old their children were able to get those pills from
43:55
Snapchat I know that there are statistics and I know that there are good efforts none of
44:01
those efforts are keeping our kids from getting access to those drugs on your
44:06
platform uh as uh California company all of you I've talked with you about what it means to be a good neighbor and what
44:12
California families and American families should be expecting from you you owe them more than just a a set of
44:20
Statistics uh and I look forward to you showing up on all pieces of this legislation all of you showing up on all
44:27
pieces of legislation to keep our children safe Mr Zuckerberg I want to come back to you I um talked with you
44:34
about being a a parent to a young child um who doesn't
44:40
have a phone doesn't you know is not on social media at all um and one of the
44:47
things that I am deeply concerned with uh as uh a parent to a young black girl
44:55
is the utilization of uh filters on your
45:01
platform that would suggest to young girls utilizing your platform the
45:08
evidence that they are not good enough as they are I want
45:15
to ask more specifically and refer to some unredacted court documents that
45:22
reveal that your own researchers uh concluded that these face filters that mimic plastic
45:30
surgery negatively impact youth mental health indeed uh and well-being why
45:37
should we believe why should we believe that you're going to do more to
45:44
protect young women and young girls when it is that you give them the tools to
45:50
affirm the self-hate that is spewed across your platforms why is it that we
45:55
should believe that you are committed to doing anything more to keep our children
46:01
safe sorry there's a lot to unpack there we give people tools to express themselves
46:06
in different ways and people use face filters and different tools to make
46:12
media and photos and videos that are fun or interesting um across a lot of the
46:18
different products that that that that are plastic surgery pins are good tools to express
46:23
creativity um Senator I'm not speaking to that skin lightening tools are tools
46:30
to express creativity this is the direct thing that I'm asking about not defending any specific one of those I
46:36
think that the ability to kind of filter and um and edit images is generally a
46:44
useful tool for expression for that specifically I'm I'm not familiar with the study that you're referring to but
46:50
we did make it so that we're not recommending this type of content to teens no no I made reference not to a study but to
46:58
court documents that revealed your knowledge of the impact of these types
47:04
of filters on young people generally young girls in particular I disagree
47:09
with that characterization I I think that there's court documents I'm I
47:14
haven't seen any document that says okay M Mr Zuckerberg my my time is up um I
47:20
hope that you hear what is being offered to you and are prepared to step up and
47:25
do better I know this Senate committee uh is going to do our work to hold you to greater account thank you Mr
47:32
chair Senator Tillis thank you Mr chair thank you all
47:38
for being here the um I I don't feel like I'm going to have an opportunity to
47:43
ask a lot of questions so I'm going to reserve the right to submit some for the record but I I have heard we've had
47:51
hearings like this before I've been in the senate for nine years I've heard heard hearings like this before I've
47:57
heard horrible stories about uh people who have died committed suicide uh been
48:04
embarrassed um every year we have an annual flogging every year and what
48:11
materially has occurred over the last nine years um do any of you all do just yes
48:19
or no question do any of y'all participate in an industry Consortium trying to make this fundamentally safe
48:24
across platforms yes or no Mr Zuckerberg there's a variety of
48:32
organizations which organization I should say does anyone here not participate in an
48:37
industry if I I actually think it would be immoral for you all to consider it a
48:42
strategic advantage to keep safe or to keep private something that would secure
48:48
all these platforms to avoid this sort of problem do you all agree with that that anybody that would be saying you want
48:53
ours because ours is the safest and these haven't figured out the secret sauce that you as an industry realize this is an existential threat to you all
49:00
if we don't get it right right I mean you've you've got to secure your platforms you got to deal with this do do you not have an inherent mandate to
49:08
do this because it would seem to me if you don't you're going to cease to exist I mean we could regulate you out of
49:15
business if we wanted to and the reason I'm saying it may sound like a criticism it's not a criticism I think we have to
49:21
understand that there should be an inherent motivation to get this right our
49:26
Congress will make a decision that could potentially put you out of business here's the reason I have a concern with that though I I just went on the
49:33
internet uh while I was listening intently to all the other members speaking and I found a dozen different
49:41
uh platforms outside the United States 10 of which are in China two of which are in in Russia uh their daily average
49:49
subscriber or active membership numbers in the billions well people say you can't get on China's version of TikTok it took
49:58
me one quick search on my favorite search engine to find out exactly how I
50:03
could get a an account on this platform today um and so the other thing that we
50:11
have to keep in mind I come from technology I could figure out ladies and gentlemen I could figure out how to
50:17
influence your kid without them ever being on a social media platform I can randomly send text and get a bite and
50:24
then find out an email address and get compromising information um if we're it is horrible
50:31
to hear some of these stories and I have shared the and I've had these stories occur in my hometown down in North
50:38
Carolina but if we only come here and make a point today and don't start focusing on making a difference which
50:45
requires people to stop shouting and start listening and start passing
50:51
language here the Bad actors are just going to be off our Shores I have another question for you all how much do
50:57
how many people roughly if you don't know the exact number okay roughly how many people do you have looking 24 hours
51:03
a day at these horrible images and just go real quick with an answer down the line and filtering it out um it's it's
51:10
most of the 40,000 or so people who work on safety and again we have 2,300 people
51:16
all over the world okay we have 40,000 trust and safety professionals around the
51:22
world we have approximately 2,000 people dedicated to trust and safety and content
51:27
moderation um our our platform is much much smaller than these folks we have hundreds of people and it's um looking
51:33
at content 50% of our work I've mentioned these people have a horrible job many of them experience um they they
51:41
have to get counseling for all the things they see we have evil people out there and we're not going to fix this by
51:47
shouting past or talking past each other we're going to fix this by every one of y'all being at the table and hopefully
51:52
coming closer to what I heard one person say supporting a lot of the good bills like one that I hope Senator
51:58
Blackburn mentions when she gets a chance to talk but guys if you're not at the table and securing these platforms
52:05
you're going to be on it and and and the reason why I'm not okay with that is
52:10
that if we ultimately destroy your ability to create value and drive you
52:15
out of business the evil people will find another way to get to these children and I do have to admit I don't
52:23
think my mom's watching this one but there is good we we can't look past good that is occurring my mom who lives in
52:30
Nashville Tennessee and I talked yesterday and we talked about a Facebook post that she made a couple of days ago
52:36
we don't let her talk to anybody else that that that connects my 92-year-old mother with uh with her grandchildren
52:43
and great-grandchildren that lets a kid who may feel awkward in school to get into a group of people and relate to
52:49
people let let's not throw out the good because we have to all together focus
52:56
on rooting out the bad now I guarantee you I could go through some of your governance documents and find a reason
53:02
to flog every single one of you because you didn't place the emphasis on it that I think you should but at the end of the
53:09
day I find it hard to believe that any of you people started this business some of you in your college dorm rooms
53:15
for the purposes of creating the evil that is being perpetrated on your platforms but I hope that every single
53:22
waking hour you're doing everything you can to reduce it you're not going to be
53:28
able to eliminate it and I hope that there are some enterprising young tech people out there today that are going to
53:34
go to parents and say ladies and gentlemen your children have a deadly weapon they have a potentially deadly
53:42
weapon whether it's a phone or a tablet you have to secure it you can't assume
53:50
that they're going to be honest and say that they're 16 when they're 12 uh we
53:55
all have to recognize that we have a responsibility to play and you guys are at the tip of the spear so I hope that
54:03
we can get to a point to where we are moving these bills if you got a problem with them State your problem let's fix
54:10
it no is not an answer uh and and know that I want the United States to be the
54:17
beacon for Innovation to be the beacon for safety and to prevent people from using other options that have existed
54:24
since the internet has existed to exploit people and Count Me In as
54:29
somebody that will try and help out thank you Mr chair thank you Senator Tillis next is Senator Ossoff thank you Mr
54:36
chairman and uh thank you to our Witnesses today
54:42
uh Mr Zuckerberg I want to begin by just asking a simple question which is do you want kids to use your platform more or
54:49
less well we don't want people under the age of 13 using you want teenagers 13
54:54
and up to use your platform more or less um well we would like to build a product
55:00
that is useful and that people want to use more my time is is going to be limited so it's just do you want them to
55:05
use it more or less teenagers 13 to 17 years old do you want them using Meta
55:10
products more or less I'd like them to be useful enough that they want to use them more you want them to use it
55:19
more I think herein we have one of the fundamental challenges
55:25
in in fact you have a fiduciary obligation do you not to try to get kids
55:30
to use your platform more it depends on how you define that um we we obviously are a business um but
55:38
it's it I'm sorry Mr Zuckerberg it's just our time is it's it's not it's self-evident that you have a fiduciary
55:44
obligation to get your users including users under 18 to use and engage with
55:49
your platform more rather than less correct over the long term but in
55:55
the near term we often take a lot of steps including we we made a change to show less videos that that on the
56:02
platform that reduced amount of time by more than 50 million hours but if your shareholders ask you
56:08
Mark I wouldn't call you that Mr Zuckerberg here but your shareholders might be on a first-name basis with you Mark are you
56:14
trying to get kids to use Meta products more or less you'd say more right well I
56:19
would say that over the long term we're trying to create the most let's look so the 10K you file with the SEC a few
56:24
things I want to note here are some quotes and this is a a filing that you sign correct yes yeah our financial
56:31
performance has been and will continue to be significantly determined by our success in adding retaining and engaging
56:37
active users here's another quote if our users decrease their level of engagement with our products our revenue financial
56:44
results and business may be significantly harmed here's another quote we believe that some users particularly younger
56:49
users are aware of and actively engaging with other products and services similar to or as a substitute for ours
56:55
continues in the event that users increasingly engage with other products and services we may experience a decline in use and engagement in key
57:01
demographics or more broadly in which case our business would likely be harmed you have an
57:08
obligation as the chief executive to encourage your team to get
57:13
kids to use your platform more Senator this is is that not
57:21
self-evident you have a fiduciary obligation to your shareholders to get kids to use your platform more I I think
57:26
that the thing that's not intuitive is the the direction is to make the
57:32
products more useful so that way people want to use them more we don't give our the teams running the Instagram feed or
57:38
the Facebook feed a goal to increase the amount of time that people spend yeah but you don't dispute and your and your
57:43
10K makes clear you want your users engaging more and using the platform more and I think this gets to the
57:50
root of the challenge because it's the overwhelming view of the the public certainly in my home
57:56
state of Georgia uh and we've had some discussions about the underlying science that this platform is
58:03
harmful for children I mean you are familiar with and not just your platform
58:09
by the way social media in general 2023 report from the Surgeon General about the impact of social media on kids
58:15
mental health which cited evidence that kids who spend more than three hours a day on social media have double the risk
58:20
of poor mental health outcomes including depression and anxiety you're familiar with that Surgeon General report the underlying study I I read the report yes
58:28
do you dispute it no but I think it's important to characterize it correctly I think what he was flagging in the report
58:34
is that there seems to be a correlation and obviously the mental health issue is very important so it's something that
58:40
needs to be the thing is that's that's everyone knows there's a correlation everyone knows that kids who spend a lot
58:47
of time too much time on your platforms are at risk and it's not just the mental
58:53
health issues I mean let let me ask you a question is your platform safe for kids I believe it is but there's a
58:59
important difference between correlation and causation you because we're not going to be able to get anywhere we want
59:04
to work in a productive open honest and collaborative way with the private
59:10
sector to pass legislation that will protect Americans that will protect American children above all and that
59:17
will allow businesses to thrive in this country if we don't start with an open honest candid realistic assessment of
59:22
the issues we can't do that the first point is you want kids to use the platform more in fact you have an
59:28
obligation to but if you're not willing to acknowledge it's a dangerous place for children the internet is a dangerous
59:35
place for children not just your platform isn't it isn't the internet a dangerous place for children I think it can be yeah there's both great things
59:41
that people can do and there are harms that we need to work to yeah it's a dangerous place for children there are families here who have lost their
59:47
children there are families across the country whose children have engaged in self harm who have experienced low
59:52
self-esteem who have been sold deadly pills on the internet the internet's a dangerous place for children and your
59:58
platforms are dangerous places for children do you agree I think that there
1:00:03
are harms that we need to work to mitigate okay I I'm not gonna I think overall why not why not just acknowledge
1:00:09
it why why do we have to do the the very care I just I disagree with the characterization that the internet's a
1:00:15
dangerous place for children um I I think you're you're trying to characterize our products as inherently
1:00:21
dangerous and I think that inherent or not your your products are places where children can experience harm they can
1:00:27
experience harm to their mental health they can be sold drugs they can be preyed upon by predators that you know
1:00:33
they're dangerous places and and and yet you have an obligation to promote the
1:00:40
use of these platforms by children and look all I'm all I'm trying to suggest to you Mr Zuckerberg and my my time is
1:00:47
is running short is that in order for you to
1:00:52
succeed you and your colleagues here we have to acknowledge these basic truths we have to be able to come before the
1:00:58
American people the American public the people in my state of Georgia and acknowledge the internet is
1:01:03
dangerous including your platforms there are predators lurking there are drugs being sold there are harms to mental
1:01:09
health that are taking a huge toll on kids quality of life and yet you have
1:01:16
this incentive not just you Mr Zuckerberg all of you have an incentive to boost maximize use utilization and
1:01:22
engagement and that is where public public policy has to step in to make sure that these platforms are safe for
1:01:29
kids so kids are not dying so kids are not overdosing so kids are not cutting themselves or killing themselves because
1:01:35
they're spending all day scrolling instead of playing outside and I appreciate all of you for your testimony
1:01:41
we will continue to engage as we develop this legislation thank
1:01:48
you senator from Tennessee thank you Mr chairman thank you to each of you for coming and I
1:01:58
know some of you had to be subpoenaed to get here but we do appreciate that you
1:02:03
all are here Mr Chew I want to come to you first uh we've heard that you're looking at putting a headquarters in
1:02:10
Nashville and likewise in Silicon Valley and Seattle and what you're going to find probably is that the welcome mat is
1:02:17
not going to be rolled out for you in Nashville like it would be in California
1:02:23
there are a lot of people in Tennessee that are very concerned about the way Tik
1:02:28
Tok is basically building dossiers on our kids the way they are building those
1:02:35
on their virtual you and also that that information is held in China in Beijing
1:02:42
as you responded to Senator Blumenthal and I last year in reference to that
1:02:49
question and we also know that a major music label yesterday said they were
1:02:54
pulling all of their content off your site because of your issues on payment
1:03:01
on artificial intelligence and because of the negative impact on our kids
1:03:07
mental health so we will see how that progresses uh Mr Zuckerberg I want to
1:03:14
come to you uh we have just had Senator Blumenthal and I of course have had some
1:03:20
internal documents and emails that have come our way one of the things that really concerned me is that you referred
1:03:28
to your young users in terms of their lifetime value of being roughly
1:03:36
$270 per teenager and each of you should be
1:03:41
looking at these kids the t-shirts they're wearing
1:03:47
today say I'm worth more than
1:03:53
270 dollars we've got some standing up in those
1:04:00
t-shirts now and some of the children from our
1:04:06
state some of the children the parents that we have worked with just to think
1:04:13
whether it is Becca Schmidt David mik Sarah flat and Lee
1:04:21
sh would you say that Li is only worth
1:04:28
$270 what could possibly lead you I mean I listen to that I know you're a dad I'm
1:04:34
a mom I'm a grandmom and how could you possibly even
1:04:41
have that thought it is astounding to me and I think this is one of the reasons
1:04:48
that um 42 states are now suing you
1:04:54
because of features that they consider to be addictive that you are pushing forward
1:05:01
and in the emails that we've got from 2021 that go from August to
1:05:07
November there is the staff plan that is being discussed and Antigone Davis Nick Clegg
1:05:13
Sheryl Sandberg Chris Cox Alex Schultz Adam Mosseri are all on this chain of
1:05:19
emails on the well-being plan and then we get to one Nick did email Mark for
1:05:26
emphasis to emphasize his support for the package but it sounds like it lost
1:05:32
out to various other pressures and priorities see this is what bothers
1:05:39
us children are not your priority children are your
1:05:44
product children you see as a way to make
1:05:51
money and children protecting children in this virtual space you made a
1:05:58
conscious decision even though Nick Clegg and others were going through the
1:06:06
process of saying this is what we do these documents are really illuminating
1:06:13
and it just shows me that growing this
1:06:19
business expanding your Revenue what you were going to put on
1:06:25
those quarterly filings that was the priority the children were not it's very
1:06:33
clear um I want to talk with you about the pedophile ring because that came up
1:06:39
earlier and the Wall Street Journal reported on that and one of the things
1:06:45
that we found out was after that became evident then you didn't take that
1:06:51
content down and it was content that showed that teens were for sale and were
1:06:56
offering themselves to older men and you didn't take it down because it didn't
1:07:02
violate your community standards do you know how often a child is bought or sold
1:07:07
for sex in this country every two minutes every two minutes a child is
1:07:16
bought or sold for sex that's not my stat that is a TBI stat now finally
1:07:26
this content was taken down after a congressional staffer went to Meta's
1:07:33
Global head of safety so would you please explain to me and to all these
1:07:38
parents why explicit predatory content does not violate your platform's terms
1:07:45
of service or your community standards sure Senator let me try to
1:07:50
address all the things that you just said it does violate our standards we work very hard to take it down didn't
1:07:56
take it down we've well we've reported I think it's more than 26 million examples of this kind of content didn't take it
1:08:03
down until a congressional staffer brought it up it it may be that in this case we made a mistake and missed
1:08:08
something you make a lot of mistakes leading to that I want to talk with you about
1:08:14
your Instagram creators program and about the push we found out through these documents that you actually are
1:08:22
pushing forward because because you want to bring kids in early you see these
1:08:29
younger tweenagers as valuable but an untapped audience quoting from the
1:08:35
emails and suggesting teens are actually household influencers to bring their
1:08:40
younger siblings into your platform into
1:08:45
Instagram now how can you ensure that Instagram creators your product your
1:08:53
program does not facilitate illegal activities when you fail to remove
1:09:00
content pertaining to the sale of minors and it is happening once every two
1:09:07
minutes in this country um Senator our tools for
1:09:13
identifying that kind of content are industry leading that doesn't mean we're perfect there are definitely issues that
1:09:18
we have but we continue to yes there is a lot that is slipping through it
1:09:24
appears that you're trying to be the premier sex trafficking site not Senator Senator that's
1:09:30
ridiculous it is not ridiculous you want to turn around and we don't want this content on our platforms why don't you take it down
1:09:37
we are here discussing all to work with us no you're
1:09:43
not you are not and the problem is we've been working on this Senator Welch is
1:09:49
over there we've been working on this stuff for a decade you have have an army of lawyers and lobbyists that have
1:09:57
fought us on this every step of the way you work with NetChoice the Cato
1:10:02
Institute Taxpayers Protection Alliance and Chamber of Progress to actually
1:10:08
fight our bipartisan legislation to keep kids safe online so are you going to
1:10:16
stop funding these groups are you going to stop lobbying against this and come to the table and work with us yes or no
1:10:23
Senator we have a yes or no of course we'll work with you on on the
1:10:28
legislation the door is open we've got all these bills you need you need to
1:10:34
come to the table each and every one of you need to come to the table and you need to work with us kids are
1:10:46
dying Senator Welch uh I want to thank my colleague Senator Blackburn for her
1:10:52
decade of work on this I actually have some
1:10:59
optimism there is a consensus today that didn't exist say 10 years ago that there
1:11:06
is a profound threat to children to mental health to safety there's not a
1:11:13
dispute that was in debate before that's a starting
1:11:19
point secondly we're identifying concrete things that can be done in four
1:11:26
different areas one is industry standards two is
1:11:33
legislation three is the courts and then four is a proposal that
1:11:39
Senator Bennet Senator Graham myself and Senator Warren have to
1:11:45
establish an agency a governmental agency whose responsibility would be to
1:11:52
engage in this on a systematic regular basis with proper resources and I just
1:11:57
want to go through those I appreciate the industry standard decisions and
1:12:03
steps that you've had you you've taken in your companies but it's not enough uh and
1:12:09
that's what I think you're hearing from my colleagues like for instance where there are layoffs in it is in the
1:12:15
trusted uh the trust and verify programs uh that's alarming because it looks like
1:12:22
there is a reduction in emphasis on protecting things like you just added Ms
1:12:29
Yaccarino 100 employees in Texas in this category uh and how many did you
1:12:35
have before I the company is just coming through a significant
1:12:41
restructuring so we've increased the number of trust and safety employees and
1:12:46
agents all over the world by at least 10% so far in the last 14 months and we will continue to do so specifically in
1:12:53
Austin Texas all right Mr Zuckerberg my understanding is there have been layoffs in that area as well there's added jobs
1:13:01
there at Twitter but uh at Meta have there been reductions in that there have been across the board not really focused
1:13:07
on that area I think our our investment is is relatively consistent over the last couple of years we we invested
1:13:14
almost five billion dollars in this work last year and I think this year will be on the same order of magnitude all right
1:13:20
and another question that's come up is when to the core of a user of any of
1:13:25
your platforms somebody has an image on there that's very compromising often of
1:13:31
a sexual nature is there any reason in the world why a person who wants to take
1:13:37
that down can't have a very simple same day response to have it taken
1:13:44
down I'll start with Twitter X now I'm sorry Senator I was taking notes
1:13:51
could you repeat the question well it there's a lot of examples of a young
1:13:56
person finding out about an image that is of them and really compromises them
1:14:02
and actually can create suicidal thoughts and they want to call up or they want to send an email and say take
1:14:09
it down I mean why is it not possible for that to be responded to immediately
1:14:15
well we all strive to take down any type of uh violative content or disturbing
1:14:21
content immediately at X we have increased our capabilities with a two-step
1:14:27
reporting process if I'm a parent or I'm a kid and I want this down shouldn't
1:14:32
there be methods in place where it comes down you can see what the image is yes a
1:14:40
an ecosystem-wide standard would improve and actually enhance the
1:14:45
experience for users at all our platforms all right there there actually is an organization I think a number of the companies up here are a part of
1:14:52
called Take It Down it's um some technology that we and and a few others you all are you all are in favor of that
1:14:58
because that is going to give some peace of mind to people all right it really really matters uh I don't have that much
1:15:05
time so we've talked about the legislation and uh Senator uh Whitehouse
1:15:11
had asked you to get back with your position on Section 230 which I'll go to in a minute but I would welcome
1:15:18
each of you responding uh as to your company's position on the bills that are
1:15:24
under consideration in this hearing all right I'm just asking you to do that a
1:15:30
third the courts this big question of Section 230 and today uh I'm pretty inspired by
1:15:38
the presence of the parents who have turned their extraordinary grief into
1:15:44
action and hope that other parents may not have to suffer what for them is a
1:15:49
devastating for everyone a devastating loss Senator Whitehouse asked you all to get back very concretely about
1:15:56
section 230 and your position on that but it's an astonishing benefit that your industry
1:16:04
has that no other industry has they just don't have to worry about being held
1:16:11
accountable in court if they're negligent so you've got some explaining
1:16:16
to do and I'm just reinforcing Senator Whitehouse's request that you get back
1:16:23
specifically about that and then finally I want to ask about this notion this
1:16:28
idea of a of a federal agency that is resourced and whose job is to be
1:16:37
dealing with public interest matters that are really affected by big Tech
1:16:42
it's extraordinary what has happened in our economy uh with technology and your
1:16:47
companies represent Innovation and success uh but just as when the
1:16:53
railroads were ascendant and were in charge and ripping off Farmers because
1:16:58
of practices they were able to get away with just as when Wall Street was flying high but there was no one regulating
1:17:04
Blue Sky laws uh we now have a whole new world in the economy and Mr Zuckerberg I
1:17:09
remember uh you testifying in the Energy and Commerce Committee and I asked you your position on the uh concept of a
1:17:17
federal regulatory agency my recollection is that you were positive about that is that still
1:17:23
case um I I think it it could be a a reasonable solution there are obviously
1:17:30
pros and cons to doing that versus through the normal the the the current structure of having different Regulatory
1:17:35
Agencies focused on specific issues but because a lot of the things trade off against each other like one of the topics that we talked about today is
1:17:41
encryption and that's obviously really important for privacy and security but can we just go down the line I'm at the
1:17:47
end but thank you Ms Yaccarino Senator I think the uh industry initiative to keep those
1:17:53
conversations going would be something X would be very very proactive about if you think about our support of the
1:17:59
REPORT Act the SHIELD Act the STOP CSAM Act our support of the Project Safe Childhood Act I think our intentions are
1:18:06
clear to participate and to be here yeah Senator um we support national privacy
1:18:12
legislation for example so that sounds like a good idea we just need to understand what it means all right uh Mr
1:18:18
Spiegel Senator we'll continue to work with your team and we'd certainly be open to exploring the right regulatory body for big technology but the idea of
1:18:25
a regulatory body is something that you can see has merit yes Senator and Mr Citron
1:18:34
yeah we're very open to to working with with you and our peers and anybody on helping make the internet a safer place
1:18:40
you know I think you mentioned this is not a one platform problem right so we we do look to collaborate with other companies and with nonprofits in the
1:18:47
government thank you Mr chairman I yield back thank you Senator Welch well we're
1:18:53
going to conclude this hearing and thank you all for coming today you probably have your scorecard out there you've met
1:18:59
at least 20 members of this committee and have your own impressions of their questioning and approach and the like
1:19:04
but the one thing I want to make clear as chairman of this committee for the last three years is this was an
1:19:11
extraordinary vote on an extraordinary issue a year ago we passed five bills
1:19:17
unanimously in this committee you heard all the Senators every spot on the political Spectrum was
1:19:23
covered every single Senator voted unanimously in favor of the five pieces of legislation we've discussed today it
1:19:31
ought to tell everyone who follows Capitol Hill in Washington a pretty Stark message we get it and we live it
1:19:39
as parents and grandparents we know what our daughters and sons and others are going through they cannot cope they
1:19:48
cannot handle this issue on their own they're counting on us as much as they are counting on the industry to do the
1:19:54
responsible thing and some will leave with impressions of our Witnesses and the companies they represent that you're
1:20:00
right as an American citizen but you ought to also leave with the determination to keep the spotlight on
1:20:06
us to do something not just to hold a hearing bring out a good strong crowd of
1:20:13
supporters for change but to get something done no excuses no excuses
1:20:19
we've got to bring this to a vote what I found in my time in the House and the Senate is that's the day that's the
1:20:25
moment of Reckoning speeches notwithstanding press releases and the like the moment of Reckoning is when we
1:20:31
call a vote on these measures it's time to do that I don't believe there's ever been a moment in America's wonderful
1:20:37
history when a business or industry has stepped up and said regulate us put some legal limits on us businesses Exist by
1:20:45
and large to be profitable and I think that we got to get behind that and say profitability at what cost Senator
1:20:52
Kennedy a Republican colleague said is our technology greater than our humanity I I
1:20:58
think that is a fundamental question that he asked what I would add to it or politics greater than
1:21:05
technology we're going to find out I want to thank a few people before we close up here I've got several staffers
1:21:12
who worked so hard on this Alexander galber thank you very much Alexander Jeff Hansen Scott
1:21:21
Jord last point I'll make Mr Zuckerberg is is just a little advice to you I think your
1:21:28
opening statement on Mental Health needs to be explained because I don't think it
1:21:33
makes any sense there is not a parent in this room who's had a child that's gone through an emotional experience like
1:21:39
this that wouldn't tell you and me they changed right in front of my eyes they changed they holed themselves up in their
1:21:46
room they no longer reached out to their friends they lost all interest in school these are mental health
1:21:51
consequences that I think come with the abuse of this right to have access to this kind of technology so uh I will
1:21:59
just I see my colleague you want to say a word uh I think it was a good hearing I hope something positive comes from it
1:22:05
thank you all for coming the hearing record is going to remain open for a week for statements and questions may be
1:22:10
sub uh submitted by Senators by 5:00 pm on Wednesday once again thanks to the witnesses for coming the hearing stands adjourned