State of the Race with Kasie Hunt
Capitol Hill Grilling Top Tech CEOs Over Child Safety; FBI Director Testifying On Chinese Hackers' Dangerous Intents. Aired 11a-12p ET
Aired January 31, 2024 - 11:00 ET
THIS IS A RUSH TRANSCRIPT. THIS COPY MAY NOT BE IN ITS FINAL FORM AND MAY BE UPDATED.
KASIE HUNT, CNN HOST: Good day. I'm Kasie Hunt live in Washington. We're going to take you out to Capitol Hill because Big Tech executives are
testifying before the Senate Judiciary Committee about social media's impact on our kids. Let's listen.
SEN. LINDSEY GRAHAM (R-SC): -- enterprise. But, I also believe that every American who has been wronged has to have somebody to go to, to complain.
There is no commission to go to that can punish you. There is not one law in the book because you oppose everything we do. And you can't be sued.
That has to stop, folks. How do you expect the people in the audience to believe that we're going to help their families if we don't have some
system or a combination of systems to hold these people accountable? Because for all the upside, the dark side is too great to live with. We do
not need to live this way as Americans.
SEN. DICK DURBIN (D-IL): Thank you, Senator Graham. Senator Klobuchar is next. She has been quite a leader on the subject for quite a long time on
the SHIELD Act, and with Senator Cornyn, on the revenge porn legislation. Senator Klobuchar.
SEN. AMY KLOBUCHAR (D-MN): Thank you very much, Chairman Durbin, and thank you, Ranking Member Graham, for those words. I couldn't agree more. For too
long, we have been seeing the social media companies turn a blind eye when kids have joined these platforms in record numbers. They have used
algorithms that push harmful content because that content got popular. They provided a venue maybe not knowingly at first, but for dealers to sell
deadly drugs like fentanyl. The head of our own Drug Enforcement Administration has said they have basically been captured by the cartels in
Mexico and in China.
So, I strongly support, first of all, the STOP CSAM bill. I agree with Senator Graham that nothing is going to change unless we open up the
courtroom doors. I think the time for all of this immunity is done because I think money talks even stronger than we talk up here. Two of the five
bills as noted are my bills with Senator Cornyn. One has actually passed through the Senate, but is awaiting action in the House. But, the other one
is the SHIELD Act, and I do appreciate those supportive words about that bill. This is about revenge porn. The FBI Director testified before
this Committee. There have been over 20 suicides of kids attributed to online revenge porn in just the last year.
But, for those parents out there and those families, this is for them. It's not only about their own child, but it's also about making sure this doesn't happen
to other children. I know because I've talked to these parents. Parents like Bridgette Norring from Hastings, Minnesota, who is out there today.
Bridgette lost her teenage son after he took a fentanyl-laced pill that he purchased on the internet. Amy Neville is also here. Her son Alexander was
only 14 when he died after taking a pill he didn't know was actually fentanyl.
We're starting a law enforcement campaign, One Pill Kills, in Minnesota, going to the schools with the sheriffs and law enforcement. But, the way to
stop it is, yes, at the border and at the point of entry. But, we know that some 30 percent of the people that are getting the fentanyl are getting it
off the platforms. Meanwhile, social media platforms generated $11 billion in revenue in 2022 from advertising directed at children and teenagers,
including nearly $2 billion in ad profits derived from users age 12 and under.
When a Boeing plane lost a door in mid-flight several weeks ago, nobody questioned the decision to ground a fleet of over 700 planes. So, why
aren't we taking the same type of decisive action on the danger of these platforms when we know these kids are dying? We have bills that have passed
through this incredibly diverse Committee, when it comes to our political views, that have passed through this Committee, and they should go to the floor.
We should do something finally about liability, and then we should turn to some of the other issues that a number of us have worked on when it comes
to the charges for app stores and when it comes to some of the monopoly behavior and this self-preferencing. But, I'm going to stick with this
today. Facts: one third of fentanyl cases investigated over five months had direct ties to social media. That's from the DEA. Facts: between 2012 and
2022, CyberTipline reports of online child sexual exploitation increased from 415,000 to more than 32 million. And as I noted, at least 20 victims
committed suicide in sexual exploitation (ph) cases.
So, I'm going to start with that -- with you, Mr. Citron. My bill with Senator Cornyn, the SHIELD Act, includes a threat provision that would help
ensure protection and accountability for those that are threatened by these predators. Young kids get a picture, send it in, think they got a new
girlfriend or a new boyfriend who ruins their life or they think it's going to be ruined, and they kill themselves. So, could you tell me why you're
not supporting the SHIELD Act?
JASON CITRON, CEO, DISCORD: Senator, we think it's very important that teens have a safe experience on our platforms. I think the portion that would
strengthen law enforcement's ability to investigate crimes against children and hold bad actors accountable is incredibly important.
KLOBUCHAR: So, you're holding open that you may support it?
CITRON: We very much would like to have conversations with you. We're open to discussing further. We do welcome legislation and regulation. This is a
very important issue for our country. And we've been prioritizing safety for teens.
KLOBUCHAR: Thank you. I just -- I'm much more interested in if you support it, because there has been so much talk at these hearings and popcorn
throwing and the like, and I just want to get this stuff done. I'm so tired of this. It's been 28 years, what, since the internet. We haven't passed
any of these bills, because everyone's double talk, double talk. It's time to actually pass them. And the reason they haven't passed is because of the
power of your company. So, let's be really, really clear about that. So, what you say matters. Your words matter.
Mr. Chew, I'm a co-sponsor of Chair Durbin's STOP CSAM Act of 2023, along with Senator Hawley, the lead Republican, I believe, which among other
things empowers victims by making it easier for them to ask tech companies to remove the material and related imagery from their platforms. Why would
you not support this bill?
SHOU CHEW, CEO, TIKTOK: Senator, we largely support it. I think the spirit of it is very aligned with what we want to do. There are questions about
implementation that I think companies like us and some other groups have, and we look forward to asking those, and of course, if this legislation is
law, we will comply.
KLOBUCHAR: Mr. Spiegel, I know we talked ahead of time. I do appreciate your company's support for the Cooper Davis Act, which will finally -- it's
a bill with Senators Shaheen and Marshall, which will allow law enforcement to do more when it comes to fentanyl. I think you know what a problem this
is. Devin Norring, a teenager from Hastings -- I mentioned his mom here -- suffered dental pain and migraines. So, he bought what he thought was a
Percocet over Snap. But, instead, he bought a counterfeit drug laced with a lethal dose of fentanyl. As his mom, who is here with us today, said, all
of the hopes and dreams we as parents had for Devin were erased in the blink of an eye, and no mom should have to bury their kid. Talk about why
you support the Cooper Davis Act.
EVAN SPIEGEL, CEO, SNAP: Senator, thank you. We strongly support the Cooper Davis Act. We believe it will help DEA go after the cartels and get more
dealers off the streets to save more lives.
KLOBUCHAR: OK. Are there others that support that bill? No. OK. Last, Mr. Zuckerberg. In 2021, The Wall Street Journal reported on internal Meta
research documents asking, why do we care about tweens? These were internal documents. I'm quoting the documents. And answering its own question by
citing Meta internal emails. They are a valuable but untapped audience. At a Commerce hearing, I'm also on that Committee, I asked Meta's Head of
Global Safety, why children aged 10 to 12 are so valuable to Meta? She responded, we do not knowingly attempt to recruit people who aren't old
enough to use our apps.
Well, when the 42 state attorneys general, Democrat and Republican, brought their case, they said this statement was inaccurate. A few examples: in 2021,
she, Ms. Davis, received an email from Instagram's Research Director saying that Instagram is investing in experiences targeting young ages,
roughly 10 to 12.
In a February 2021 instant message, one of your employees wrote that Meta is working to recruit Gen Alpha before they reach teenage years. A 2018
email that circulated inside Meta says that you were briefed that children under 13 will be critical for increasing the rate of acquisition when users
turn 13. Explain that with what I heard at that testimony at the Commerce hearing that they weren't being targeted. And I just ask again, as the
other witnesses were asked, why your company does not support the STOP CSAM Act or the SHIELD Act?
MARK ZUCKERBERG, CEO, META: Sure, Senator. I'm happy to talk to both of those. We had discussions internally about whether we should build a kids
version of Instagram, like the --
KLOBUCHAR: I remember that.
ZUCKERBERG: -- kids versions of YouTube and other services. We haven't actually moved forward with that. And we currently have no plans to do so. So, I can't speak directly
to the exact emails that you cited. But, it sounds to me like they were deliberations around a project that people internally thought was
important, and we didn't end up moving forward with.
KLOBUCHAR: OK. And the bills -- what are you going to say --
ZUCKERBERG: So --
KLOBUCHAR: -- about the two bills?
ZUCKERBERG: Sure. So, overall, I mean, my position on the bills is, I agree with the goal of all of them. There are most things that I agree with
within them. There are specific things that I would probably do differently. We also have our own legislative proposal for what we think
would be most effective in terms of helping the internet and the various companies give parents control over the experience. So, I'm happy to go
into the detail on any one of them. But, ultimately, I think that this is --
KLOBUCHAR: Again, well, I think these parents will tell you that this stuff hasn't worked to just give parents control. They don't know what to do.
It's very, very hard. And that's why we are coming up with other solutions that we think are much more helpful to law enforcement, but also this idea
of finally getting something going on liability, because I just believe with all the resources you have that you actually would be able to do more
than you're doing, or these parents wouldn't be sitting behind you right now in this Senate hearing room.
ZUCKERBERG: Senator, can I speak to that?
DURBIN: Thank you, Senator Klobuchar. Senator --
ZUCKERBERG: Or do you want me to come back later?
DURBIN: Please go ahead.
ZUCKERBERG: I don't think that parents should have to upload an ID or prove that they're the parent of a child in every single app that their children
use. I think the right place to do this, and a place where it'd be actually very easy for it to work, is within the app stores themselves, where my
understanding is Apple and Google already or at least Apple already requires parental consent when a child does a payment with an app. So, it
should be pretty trivial to pass a law that requires them to make it so that parents have control anytime a child downloads an app, and offers consent to
that. And the research that we've done shows that the vast majority of parents want that. And I think that that's the type of legislation, in
addition to some of the other ideas that you all have, that would make this a lot easier for parents.
KLOBUCHAR: Just to be clear, I remember one mom telling me, with all these things she could maybe do that she can't figure out, it's like a faucet
overflowing in a sink, and she is out there with a mop while her kids are getting addicted to more and more different apps and being exposed to
material. We've got to make this simpler for parents so they can protect their kids. And I just don't think this is going to be the way to do it. I
think the answer is what Senator Graham has been talking about, which is opening up the halls of the courtroom. So, that puts it on you, guys, to
protect these parents and protect these kids, and then also to pass some of these laws that makes it easier for law enforcement.
DURBIN: Thank you, Senator Klobuchar. We're going to try to stick to the seven-minute rule. It hasn't worked very well. But, we're going to -- I
will try to give additional time on the other side as well. Senator Cornyn.
SEN. JOHN CORNYN (R-TX): There is no question that your platforms are very popular. But, we know that while here in the United States we have an open society and free
exchange of information, there are authoritarian governments, there are criminals who will use your platforms for the sale of drugs, for sex, for
extortion, and the like.
And, Mr. Chew, I think your company is unique among --
HUNT: All right. We're going to keep our ears on what's going on with the social media hearing and those tech CEOs. But, we have another critical
congressional hearing also happening right now. The FBI Director Christopher Wray, he is testifying in the House this hour about the threats
posed by Chinese hackers and this could touch on possible threats to the 2024 election. Let's listen to that.
CHRISTOPHER WRAY, FBI DIRECTOR: When China decides the time has come to strike, they're not focused just on political and military targets.
We can see from where they position themselves, across civilian infrastructure, that low blows aren't just a possibility in the event of a
conflict. Low blows against civilians are part of China's plan. But, the PRC's cyber onslaught goes way beyond pre-positioning for future conflict.
Today and literally every day, they're actively attacking our economic security, engaging in wholesale theft of our innovation and our personal
and corporate data. Nor is cyber the only PRC threat we face. The PRC cyber threat is made vastly more dangerous by the way they knit cyber into a
whole-of-government campaign against us.
They recruit human sources to target our businesses, using insiders to steal the same kinds of innovation and data that their hackers are
targeting, while also engaging in corporate deception, hiding Beijing's hand in transactions, joint ventures and investments to do the same. And
they don't just hit our security and economy. They target our freedoms, reaching inside our borders across America to silence, coerce, and threaten
some of our citizens and residents.
Well, I can assure you, the FBI is laser focused on the threat posed by Beijing. We've got cyber, counterintelligence, criminal and WMD experts,
just to name a few, defending against it. And we're working in partnership, partnership with the private sector, partnership with our allies abroad,
and partnership at all levels of the U.S. government, especially the NSA, Cyber Command, CISA and ONCD, whose leaders I'm honored to be here with today.
In fact, just this morning, we announced an operation where we and our partners identified hundreds of routers that had been taken over by the PRC
state-sponsored hacking group known as Volt Typhoon. The Volt Typhoon malware enabled China to hide, among other things, preoperational
reconnaissance and network exploitation against critical infrastructure like our communications, energy, transportation, and water sectors, steps
China was taking, in other words, to find and prepare to destroy or degrade the civilian critical infrastructure that keeps us safe and prosperous.
And let's be clear, cyber threats to our critical infrastructure represent real-world threats to our physical safety. So, working with our partners,
the FBI ran a court-authorized, on-network operation to shut down Volt Typhoon and the access it enabled. Now, this operation was an important
step, but there is a whole lot more to do, and we need your help to do it.
To quantify what we're up against, the PRC has a bigger hacking program than that of every other major nation combined. In fact, if you took every single
one of the FBI's cyber agents and intelligence analysts and focused them exclusively on the China threat, China's hackers would still outnumber FBI
cyber personnel by at least 50 to one. So, as we sit here, while important budget discussions are underway, I will note that this is a time to be
keeping ahead of the threat by investing in our capabilities rather than cutting them. We need to ensure that we sustain and build on the gains that
we've made that have enabled us to take actions like the Volt Typhoon operation I just mentioned.
The budgets that emerge from discussions underway now will dictate what kind of resources we have ready in 2027, a year that as this Committee
knows all too well, the CCP has circled on its calendar. And that year will be on us before you know it. As I've described, the PRC is already today
putting their pieces in place. I do not want those watching today to think we can't protect ourselves. But, I do want the American people to know that
we cannot afford to sleep on this danger. As a government and a society, we've got to remain vigilant and actively defend against the threat that
Otherwise, China has shown it will make us pay. Thank you and look forward to today's discussion.
REP. MIKE GALLAGHER (R-WI): Thank you, sir. Director Coker, you're recognized for your opening statement.
HARRY COKER JR., NATIONAL CYBER DIRECTOR: Chairman Gallagher, Ranking Member Krishnamoorthi, and distinguished members of the Select Committee,
thank you for the opportunity to testify. I have been honored to lead the Office of the National Cyber Director (ONCD) in the White House for a
little over a month now, and I am grateful to Congress and your leadership, Mr. Chairman, for creating this office. And Mr. Ranking Member, I
appreciate our conversation yesterday and your abiding interest in our workforce challenges.
ONCD was established by --
HUNT: All right. We're going to keep our ears on these two hearings that are unfolding. You just heard the FBI Director Christopher Wray talking
about Chinese hackers and how they far outnumber FBI cybersecurity personnel and threats that that poses. We're also keeping our ears on the
tech CEOs that are under pressure from Congress about how they're going to protect kids on their platforms.
That said, we also have so much political news to get to today. So, we're going to take a quick break. We'll be right back.
HUNT: All right. We have been listening to this hearing with Big Tech CEOs from all the major companies you know of, Meta, X, Snap, Discord, TikTok.
We want to talk about this with today's panel as we continue to monitor this, because it's clearly an issue that's really resonating at home with
everyone figuring out how to protect their kids online.
With us today, Paul Begala, he is a CNN Political Commentator and worked for former President Bill Clinton. Alex Conant was Communications Director
for Marco Rubio's 2016 Presidential Campaign. We have Molly Ball. She is Senior Political Correspondent for The Wall Street Journal. And Alayna
Treene is a CNN Reporter. So, this is personal.
This is political. This touches on all aspects of our lives.
Molly, we were -- I think part of the reason why we wanted to make sure to show all of this and why we're talking about it, and we're sitting around
this table talking about what do we do? You, I, Alex, have kids at the same ages. Paul has grown kids. Alayna is going to have to grapple with this someday,
maybe. But -- and we can talk about the politics of Trump in particular and how that relates to this. But, I mean, Molly, this has been an issue that
Congress -- you saw Amy Klobuchar getting emotional talking about this. Right? This is something that's in everybody's lives. It's in our pockets.
Everyone is worried about their children. And yet, Congress hasn't been able to actually do anything to get something done. Why is that, and where
do we go from here?
MOLLY BALL, SENIOR POLITICAL CORRESPONDENT, WALL STREET JOURNAL: Yeah. I couldn't help thinking. And you heard Senator Klobuchar speak to this as
well. Like, how many times have we had these CEOs hauled in front of some kind of panel and yelled at? And what are they supposed to do? They're not
the ones making the laws. They can sit there and answer questions. But, it's the lawmakers who have really failed to do anything. And you heard the
Senator expressing some of that frustration. And there is a lot of bipartisan agreement that something should be done. But, there is not
agreement on what exactly they should do. I certainly couldn't tell you what and how could be done given the vagaries of the First Amendment and
the other trip wires that are involved here.
So, I think everyone wants to protect children in some sort of vague way. But, it is a hard problem, number one, and number two, given how divided
the Congress is and given all of the different interests at stake that you have not -- the Committee has passed things, but it just has not been a
high enough priority for either the full Senate or the entire Congress to get something done.
HUNT: Right. Well, Paul, we were also talking about this. There is something called Section 230.
PAUL BEGALA, CNN POLITICAL COMMENTATOR: Right.
HUNT: Right? And so, for people who aren't as down in the congressional weeds, as some of us at this table, it basically is a law that treats
internet companies differently from other places where speech might unfold. And it says, well, if somebody does something on Twitter or Facebook or
some other platform, the platform is not liable for the consequences of that speech, the person who posted it, maybe, but the platform is not.
That's not true for those of us sitting at this table right now, for example. Is it time for that to go, and is there any hope that that might happen?
BEGALA: Yes and no. Yes, it's time for it to go. No, I don't have very much hope it will happen, although you did see remarkable bipartisan support for
that. My boss, President Clinton, signed that law. It was part -- it was 28 words, one sentence in a very long bill that was called the Communications
Decency Act. And unwittingly, those 28 words probably created more indecency in public discourse than anything else. It wasn't the intent. The
idea was, the internet is just like a bulletin board. And you don't want to sue the YMCA because somebody puts up a card saying, I want to sell my car,
and it's a lemon. So, that was the idea. And it was probably a good idea at the time. It created enormous companies and enormous wealth. But now, it is
wealth -- power without accountability.
And so, when Lindsey Graham, very conservative lawyer, Senator, and now Amy Klobuchar, a very liberal lawyer, Senator, are both saying, yeah, you ought
to hold these companies accountable in a court of law. They just -- it would be the same standard for these colossal companies that it is for
every weekly newspaper in America.
HUNT: Yeah. Well, I mean, and we have this example, honestly, this morning, this news of this man who allegedly -- well, he claims -- beheaded his father
over something political. Puts a video up on YouTube, and it's up for hours and thousands of people see it. It's -- I don't know. It's kind of --
ALEX CONANT, PARTNER, FIREHOUSE STRATEGIES, & FMR. COMMUNICATIONS DIR. FOR MARCO RUBIO'S 2016 PRES. BID: Tech companies want to get that stuff off the
internet, and are working really hard to take it down. YouTube doesn't want people beheading other people on their platform. That doesn't do anybody
any good. And it's worth mentioning that Section 230, those 28 words, created the Internet as we know it today, which creates tremendous wealth
and connects people across the country. And obviously, I do work in the tech sector.
CONANT: But, I think when you start to make these changes, you have to be very careful of unintended consequences. And the reason that Congress has
not acted is because of the fear of unintended consequences when you start tweaking with policies that have over the last generation been wildly successful.
HUNT: Yeah. Yeah. And I want to tap that expertise for a second, but I think we want to listen for a second to what Chris Coons is pushing the
CEOs on. Let's listen.
SEN. CHRIS COONS (D-DE): -- the content. In closing, Mr. Chairman, I have a bipartisan bill, the Platform Accountability and Transparency Act, co-
sponsored by senators Cornyn, Klobuchar, Blumenthal, on this Committee, and Senator Cassidy and others. It's in front of the Commerce Committee, not
this committee. But, it would set reasonable standards for disclosure and transparency to make sure that we're doing our jobs based on data. Yes,
there is a lot of emotion in this field, understandably. But, if we're going to legislate responsibly about the management of the content on your
platforms, we need to have better data. Is there any one of you willing to say now that you support this bill?
Mr. Chairman, let the record reflect a yawning silence from the leaders of the social media platforms. Thank you.
DURBIN: Thanks Senator Coons. We're on one of two -- the first of two roll calls. And so, please understand, if some of the members leave and come
back, it's no disrespect. They're doing their job. Senator Lee.
SEN. MIKE LEE (R-UT): Thank you, Mr. Chairman. Tragically, survivals -- survivors of sexual abuse are often repeatedly victimized and re-victimized
over and over and over again by having nonconsensual images of themselves on social media platforms. There was an NCMEC study that pointed out there
was one instance of CSAM that reappeared more than 490,000 times after it had been reported, after it had been reported. So, we need tools in order
to deal with this. We need, frankly, laws in order to mandate standards so that this doesn't happen, so that we have a systematic way of getting rid
of this stuff, because there is literally no plausible justification, no way of defending this.
One tool, one that I think would be particularly effective, is a bill that I'll be introducing later today, and I invite all my committee members to
join me. It's called the PROTECT Act. The PROTECT Act would, in pertinent part, require websites to verify age and verify that they've received
consent of any and all individuals appearing on their site in pornographic images, and it would also require platforms to have meaningful processes for an
individual seeking to have images of him or herself removed in a timely manner.
Ms. Yaccarino, based on your understanding of existing law, what might it take for a person to have those images removed, say from X?
LINDA YACCARINO, CEO, X: Senator Lee, thank you. What you're going to introduce, in terms of ecosystem-wide and user consent,
sounds exactly like part of the philosophy of why we're supporting the SHIELD Act, and no one should have to endure non-consensual images being shared.
LEE: Yep. And without that, without laws in place, and it's fantastic anytime a company, as you've described with yours, wants to take those
steps. It's very helpful. It can take a lot longer than it should. And sometimes, it does to the point where somebody had images shared 490,000
times after it was reported to the authorities. And that's deeply concerning. But, yes, the PROTECT Act would work in tandem with it. It's a
good complement to the SHIELD Act.
Mr. Zuckerberg, let's turn to you next. As you know, I feel strongly about privacy and believe that one of the best protections for an individual's
privacy online involves end-to-end encryption. We also know that a great deal of grooming and sharing of CSAM happens to occur on end-to-end
encrypted systems. Does Meta allow juvenile accounts on its platforms to use encrypted messaging services within those apps?
ZUCKERBERG: Sorry, Senator. What do you mean juvenile?
LEE: Underaged. People under 18.
ZUCKERBERG: Under 18. We allow people under the age of 18 to use WhatsApp, and we do allow that to be encrypted. Yes.
LEE: Do you have a bottom level age at which they're not allowed to use it?
ZUCKERBERG: I -- yeah. I don't think we allow people under the age of 13.
LEE: What about you, Mr. Citron? On Discord, do you have -- do you allow kids to have accounts to access encrypted messaging?
CITRON: Discord is not allowed to be used by children under the age of 13. And we do not use end-to-end encryption for text messages. We believe that
it's very important to be able to respond to welfare and law enforcement requests to -- and we're also working on proactively building technology.
We're working with a nonprofit called Thorn to build a grooming classifier so that our teen safety features can actually identify these
conversations, if they might be happening, so we can intervene and give those teens tools to get out of that situation or potentially even report
those conversations and those people to law enforcement.
LEE: And encryption, as much as it can prove useful elsewhere, can be harmful, especially if you're on a site where you know children are being
groomed and exploited. If you allow children onto an app with end-to-end encryption enabled, that can prove problematic.
Now, let's go back to you for a moment, Mr. Zuckerberg. Instagram recently announced that it's going to restrict all teenagers from access to eating
disorder material, suicidal ideation themed material, self-harm content, and that's fantastic. That's great. What's odd -- what I'm trying to
understand -- is why Instagram is restricting access to sexually explicit content only for teens ages 13
to 15. Why not restrict it for 16 and 17-year-olds as well?
ZUCKERBERG: Senator, my understanding is that we don't allow sexually explicit content on the service for people of any age. The --
LEE: How is that going?
ZUCKERBERG: Our prevalence metrics suggest that, I think, it's 99 percent or so of the content that we remove, we're able to identify automatically
using AI systems. So, I think that our efforts in this, while they're not perfect, I think are industry-leading. The other thing that you asked about
was self-harm content, which is what we recently restricted. And we made that shift of the -- I think the state of the science is shifting a bit.
Previously, we believed that when people were thinking about self-harm, it was important for them to be able to express that and get support. And now,
more of the thinking in the field is that it's just better to not show that content at all, which is why we recently moved to restrict that from
showing up for those teens at all.
LEE: OK. Is there a way for parents to make a request on what their kid can see or not see on your sites?
ZUCKERBERG: There are a lot of parental controls. I'm not sure if they're -- I don't think that we currently have a control around topics. But, we do allow parents to control the time that the children are on the site. And
also, a lot of it is based on kind of monitoring and understanding what the teens' experience is and what they are interacting with.
LEE: Mr. Citron, Discord allows pornography on its site. Now, reportedly, 17 percent of minors who use Discord has -- have had online sexual
interactions on your platform, 17 percent, and 10 percent have had those interactions with someone that the minor believed to be an adult. Do you
restrict minors from accessing Discord servers that host pornographic material on them?
CITRON: Senator, yes, we do restrict minors from accessing content that is marked for adults. Discord also does not recommend content to people.
Discord is a chat app. We do not have a feed or an algorithm that boosts content. So, we allow adults to share content with other adults in adult-
labeled spaces, and we do not allow teens to access that content.
LEE: OK. I see my time has expired. Thank you.
SEN. SHELDON WHITEHOUSE (D-RI): Welcome everyone. We are here in this hearing, because as a collective, your platforms really suck at policing
themselves. We hear about it here in Congress with fentanyl and other drug dealing facilitated across platforms. We see it and hear about it here in
Congress with harassment and bullying that takes place across your platforms. We see it and hear about it here in Congress with respect to
child pornography, sexploitation and blackmail. And we are sick of it.
It seems to me that there is a problem with accountability because these conditions continue to persist. In my view, Section 230, which provides
immunity from lawsuit, is a very significant part of that problem. If you look at where bullies have been brought to heel recently, whether it's
Dominion finally getting justice against Fox News after a long campaign to try to discredit the election equipment manufacturer, or whether it's the
moms and dads of the Sandy Hook victims finally getting justice against InfoWars and its campaign of trying to get people to believe that the
massacre of their children was a fake put on by them.
Or even now, more recently, with a writer getting a very significant judgment against Donald Trump after years of bullying and defamation. An
honest courtroom has proven to be the place where these things get sorted out. And I'll just describe one case, if I may. It's called Doe v. Twitter.
The plaintiff in that case was blackmailed in 2017 for sexually explicit photos and videos of himself, then aged 13 to 14. A compilation video of
multiple CSAM videos surfaced on Twitter in 2019. A concerned citizen reported that video on December 25, 2019, Christmas Day. Twitter took no
action. The plaintiff, then a minor in high school in 2019, became aware of this video from his classmates in January of 2020. You're a high school
kid. And suddenly, there is that. That's a day that's hard to recover from. Ultimately, he became suicidal.
He and his parents contacted law enforcement and Twitter to have these videos removed on January 21 and again on January 22, 2020, and Twitter
ultimately took down the video on January 30, 2020, once federal law enforcement got involved. That's a pretty foul set of facts. And when the
family sued Twitter for all those months of refusing to take down the explicit video of this child, Twitter invoked Section 230. And the district
court ruled that the claim was barred. There is nothing about that set of facts that tells me that Section 230 performed any public service in that case.
I would like to see very substantial adjustments to Section 230 so that the honest courtroom which brought relief and justice to E. Jean Carroll after
months of defamation which brought silence, peace and justice to the parents of the Sandy Hook children after months of defamation and bullying
by InfoWars and Alex Jones, and which brought significant justice and an end to the campaign of defamation by Fox News to a little company that was busy
just making election machines.
So, I would -- my time is running out. I'll turn to -- I guess, Senator Cruz is next. But, I would like to have each of your companies put in
writing what exemptions from the protection of Section 230 you would be willing to accept bearing in mind the fact situation in Doe v. Twitter,
bearing in mind the enormous harm that was done to that young person and that family by the non-responsiveness of this enormous --
HUNT: All right. We're going to keep our eye -- our eyes and ears on this hearing for you and bring you back to any other explosive or interesting moments.
But, I want to bring our panel back.
And Alayna Treene is with us. She covers Donald Trump for us. Alayna, we've been kind of talking about the contours of this. I mean, Senator Whitehouse
just there, obviously shared a very devastating story about somebody who had a horrible time on Twitter that Section 230 allowed. Now, of course,
the people who would be responsible for changing it are, of course, members of Congress, not these CEOs.
But, Donald Trump, obviously has spoken out about this very loudly. And some of how we've seen Twitter change is in no small part in a direction
that he and other supporters of him wanted to see. And that platform has changed in meaningful ways. With him on track to be the nominee, what is
your sense? I mean, what has he said before about what these platforms should or shouldn't be doing, and what would we expect if he were to
actually become President?
ALAYNA TREENE, CNN REPORTER: Right. So, a couple of things. One and different from the issue of child safety, but just in the broader context
of Section 230 and how to handle tech companies and the content on their sites and platforms, is a big focus for Donald Trump and a lot of
conservatives is censorship. They very much think that these tech platforms have been unjustly targeting conservatives and limiting free speech. And
so, in a different sense, he actually said many times that he wants to change laws, revise Section 230, to make it harder for content to be
removed from these sites that maybe are labeled misinformation that Donald Trump has argued in the past, is not misinformation or disinformation.
But also, I think something that actually Democrats may agree with on this is making it easier for sometimes the tech companies to be held liable.
Those are some things that Donald Trump has talked about. And I mentioned this. But, he also, I remember, last year, when he was laying out a
platform for potential changes, if he were to be President, he laid out what was called a digital Bill of Rights. So, I just wanted to pull it up,
so I got this right, which would basically force government officials to get a court order before taking down online content and inform users if
their content is being restricted or removed. So, that's just one of the ways that Donald Trump has said, or something that he has vowed to do if --
HUNT: And that's basically shadow banning, right, is the word that you might hear when they say -- and that has to do with -- I mean, Alex, there
is a difference, it seems, in these companies between those who make decisions about what contents users are going to see and what they're not.
And we saw a little bit in that exchange where they were pressing the CEO of Discord, for example, who said, like, we don't have an algorithm, like, we don't decide, we're going to show you more of this and less of that, we just allow people to share. The algorithm itself, and dealing with it in those contexts, is also extremely complicated and hard to legislate.
CONANT: It is, and it's hard to explain to people why some content is promoted and other content is not. In some cases, I'm not sure these companies fully understand themselves, because these algorithms are so complicated. But, a big part of what the algorithms are doing is taking down sexually abusive content, explicit content, content that people don't want their kids seeing, that the companies don't want to be hosting on their
platforms because they've got advertisers that they're worried about. Section 230 is what allows them to do that, to moderate their own content
without being held liable for it.
So, yes, Donald Trump talks about wanting to reform Section 230, but that's because he wants to go sue Facebook when they take him off its platform.
So, these are very complicated issues that I don't think, with all due respect, some of the senators fully understand.
HUNT: So, Paul, I absolutely take Alex's point on that. I mean, having covered Congress extensively, I mean, it is very clear that they are not
able to keep up with the pace at which technology is changing, and that is a big part of what makes it impossible to legislate. We're now -- I mean,
we haven't even dealt with this, and we're looking at AI on the horizon, like many more consequences that, again, this -- they, as a body, have just
been too slow to deal with. Now, that said, you think about this -- about politics and how people operate in this kind of a space. What kind of
pressure is there on the tech companies to actually do something about this from like a political outside pressure perspective? Is there any? Do they need to respond to it?
BEGALA: Very little. That's why they have to have laws. Right? People use those platforms because they love those platforms. By the way, they love
those platforms because those platforms weaponize our biology against us. They have neuroscientists. They understand that our amygdala is fired by
particular things. And it's not nice things that fire our amygdala. Right? We really -- this is why negative ads work. I mean, thousands of negative
ads I've run because we're wired to process negative information. And it's why -- I think Senator Whitehouse made a really good point. We have worked
in politics in five continents. We have the strongest free speech protections of any democracy I know of. I love that. I've made my living
off of that.
And yet, we still have the right to sue someone who defamed us. And he talked about Mr. Jones who defamed the families of Sandy Hook or Ms.
Carroll who was defamed by Mr. Trump. This -- that will change. It won't end social media if they're held accountable in the very same way that a
newspaper or a radio station or a television station is.
To me, it's a very simple solution. Let them defend themselves in court rather than put it all on them. Why does Mr. Zuckerberg have to spend all
this money and worry all the time and be attacked by Donald Trump? As Sheldon Whitehouse said, a competent, honest court has been handling this for hundreds of years in this country, and it can handle it in this case too.
CONANT: Which would fundamentally change the nature of the internet.
BEGALA: Thank God.
CONANT: So -- well, I mean, it's created the -- I would just say --
BEGALA: It's created wealth for a few of these knuckleheads. Look at suicide -- what it has created for our teenagers.
CONANT: I think it's created a lot of wealth for just about every business in the country that uses the internet in one capacity or the other to be successful.
HUNT: But, I will say, Alex, just to -- I mean, I think when this law was created, to Paul's earlier point, I mean, it was a Wild West. These were
small startups that needed some space and time to figure out what was going to be the most effective. I mean, you could argue now that these companies
are more powerful than most governments on the face of the planet. Right? So, why shouldn't they take more responsibility?
CONANT: Well, I mean, they don't have their own militaries. Do they? But, look, to your point, I think --
HUNT: But, they can -- military operations.
CONANT: I think every company up there has been arguing for some sorts of reforms. They all want some changes to the law. My point is, these are very
complicated policies that are hard to get right. In many cases, the tech advances faster than the policies can, possibly --
HUNT: Right. That's fair. Yeah.
CONANT: And so, when we rush to do -- undo things like Section 230, there will be real unintended consequences, and I just think it's important that
policymakers think that through.
HUNT: That's also -- I just want to cut to the heart of this matter too. We kind of had it up in a chyron earlier. But, the main message, and
Alex, I know, again, just for our viewers, Alex works a lot with these tech companies who honestly are writing their own legislation and trying to
decide what the best policies are. So, to that end, I do think it's helpful to have you kind of explain this to us. But, basically, what Zuckerberg and
company, the social media CEOs are saying is that they don't want to do it. They want Google and Apple to deal with this as a problem. Is that right?
CONANT: Look, I think that there is a lot of Big Tech versus Big Tech, little tech versus big tech, Big Tech versus little tech. It's a very -- we
talk about Big Tech as though it's all one big viewpoint and there is a lot that's driven.
HUNT: So -- but, like Zuckerberg basically said, we want Apple and Google in their Android interface to deal with this child porn.
BEGALA: They want to put the onus on -- right. Facebook clearly wants to put the onus on the app stores so that they don't have to regulate
themselves as much. Vice versa, I think the app stores would probably say, look, once the apps are downloaded, it's on these companies to make sure
that kids aren't accessing the bad stuff. I think that is something that clearly is a choice before the Congress, and a lot of states are dealing
with this as well. And they're even, I think, less equipped to deal with some of these complicated issues than Congress is.
But, clearly, look, there is a -- there is bipartisan consensus to do something. This was (inaudible) consensus about what exactly we should do,
because every policy that you go after is going to have some unintended consequence, that's either going to make it harder to get bad stuff off the
internet, or it's going to make it easier to put bad stuff up on the internet.
HUNT: Molly, big picture here. What are parents kind of supposed to do in these situations? Like what -- it's -- to a certain extent, the Congress is
clearly trying to create an environment that will make it easier for parents to protect their kids. But, it's -- I guess, when I think about this, like, that's what I grapple with. When do I give my kids, thank God they are too little for this right now, but when do I give them a phone?
When do I allow them access to this? Can you prohibit it from them at all? I mean, some school districts are starting to ban phones in classrooms.
Like, what are people supposed to do?
BALL: I think parents have to parent. Right? I mean, in some sense, it's no different than it's ever been. You got to -- you have to supervise your
kids within reason and trust them where you can, and there are various safeguards in place that the companies make available, whether it's
limiting the age for certain apps, or content restrictions on the web, or on different websites, and you control what technology your kids have in the home.
But, I think the big picture, for me as a parent, and this is not a political comment, is just that all you can ever do is equip your kid to go
out in a world that you don't control. And so, on some level, you have to believe that you've given your kids the values that they need and the tools
that they need to deal with whatever they're going to encounter, because you can't control everything that goes into and out of their brains and
everything that they're going to encounter. You have to know that they're grounded enough that when they do come across something, they can deal with it.
HUNT: Is there any argument to be made that these companies should be treated like Big Tobacco was in the 90s, where basically we said no kid under 18 can buy it at a store?
BALL: That's certainly an argument some have made. I do think, to Alex's point, we have to remember there is also an upside to these technologies.
They do connect people. When I was a kid, I had a landline phone in my room and I fell asleep with it in my hand every night, talking to my best friend
for five hours. Kids are socialized and grow up and learn to be human beings by communicating with one another. And some of it is toxic and some
of it is horrible. And perhaps there should be regulation. But, we have to remember that kids having friendships and relationships with each other is
also a hugely important part of their social development. And I'm in touch with people on Facebook that I don't think I would ever bother to send a letter to, and I value that as well. So --
HUNT: Fair enough.
BALL: -- and we as journalists are able to distribute our work to a worldwide audience, which, when I had my first newspaper gig and we didn't have a website, was not available to me. So, the dissemination of information, people's ability to form relationships with each other, all of
that is also good and important, and in the interest of society.
HUNT: Alayna, I'll give you the last word here, because the effects of this obviously go way beyond -- I mean, we're talking about kids and sex and all the ramifications there. But, our politics have been dramatically changed, I honestly would argue probably for the worse, although I guess we'll see.
TREENE: Yeah. I feel like social media. I mean, when I think about it with journalism, it's funny. It's what I tell a lot of young journalists when I
go and maybe talk to students or whatnot that the one thing I always warn them about is be careful on social media. Be careful using Twitter, because
that's something that I think can be so dangerous to careers, can be so dangerous to people, young people, I think exactly what they're talking
about. But, I also agree with Molly that there are a lot of benefits to it, that you can get something read far more widely than you would otherwise.
HUNT: Well, parts I've learned. I'm a Luddite at the table. Maybe call a Formula club over here.
All right. Thank you guys so much for spending this hour with us.
I am Kasie Hunt. CNN's coverage of all of this is going to continue, the tech leaders hearing, and of course FBI Director Christopher Wray also
testifying on the Hill. One World up next.