
CNN News Central

Big Tech CEOs Testify at Online Child Safety Hearing; Fed Will Make Its Most Recent Rate Policy Announcement; Musk's Multibillion-Dollar Compensation Deal Overturned by a Judge; "You have blood on your hands", Says Sen. Graham to Tech CEOs. Aired 10:30-11a ET

Aired January 31, 2024 - 10:30   ET



JOHN BERMAN, CNN NEWS CENTRAL CO-ANCHOR: All right. Mark Zuckerberg, the CEO of Meta, delivering his opening statement at this hearing. Tech CEOs answering questions from senators. And again, there is bipartisan pressure on these companies to do more to protect kids on social media. And we really should explain how tense this is expected to get. The ranking minority member on this committee, Senator Lindsey Graham, has said that these companies have blood on their hands. So, this hearing could get very tense.

In the meantime, you're looking to buy a house or maybe finance a car or make major financial decisions. You better pay attention to a big announcement coming today.

Newly released body cam video shows the moments before the violent beating that cost Tyre Nichols his life.



KATE BOLDUAN, CNN NEWS CENTRAL CO-ANCHOR: I'm showing you live pictures, once again, of a hearing on Capitol Hill. The leaders of some of the world's top tech companies and social media platforms now testifying before Senators. Talking about what they are and are not doing to better protect children on their platforms and online.

Senator Graham at the top of the hearing -- just so you know what is coming at these CEOs and top leaders once they are done with their opening remarks -- Senator Graham said to them that you have blood on your hands. You have a product that is killing people. You can't be sued; you should be, said Senator Graham. There is bipartisan agreement and anger that has been coming toward these social media platforms for quite some time.

The question has long been and remains today, what is Congress going to do about it? And what are the social media companies going to accept or do ahead of that? We're going to continue to listen and bring you some of those top moments from that hearing.

Also, this morning, all eyes are on the Federal Reserve as everyone waits to learn their next decision on interest rates. The general expectation is that the Fed is going to hold rates steady for the fourth consecutive meeting. A big question is, when will they then make the turn to start lowering rates after the aggressive moves taken to fight inflation?

CNN's Matt Egan has much more on this for us. Talk to us -- remind everyone of the significance of this first meeting of the year.

MATT EGAN, CNN BUSINESS AND ECONOMY REPORTER: Well, Kate, the Fed is being -- it's preparing right now to make a significant shift in policy, right? And this is a shift that everyone is going to feel.


EGAN: Remember, the Fed has been hiking interest rates at the fastest pace since the 1980s, right? All of that designed to try to get inflation under control. This of course has lifted borrowing costs, right? As a lot of our viewers out there can attest to, it has gotten a lot more expensive to get a car loan, to carry credit card debt, and of course, to get a mortgage. But all this tough medicine from the Fed, it has actually worked, right? Inflation has cooled off. It's cooled off so much that the Fed can now start thinking about finally lowering interest rates, right? Giving borrowers a bit of a break for the first time since 2020.

As you mentioned, though, there's some debate over when this is going to happen.


EGAN: It's not going to happen today, right? There's almost no chance of a rate cut today, but the market is pricing in about a 60 percent chance of a rate cut at the next meeting in March. Still, though, there's a lot of uncertainty there, kind of an unusual amount of uncertainty. So, we're looking to see whether or not Fed Chair Jerome Powell can drop some hints about whether borrowers can start to get relief in March, or if this is more of a summer issue for when people will get relief.

Kate, as you know, though, the implications here are massive for consumers, for the economy, the market, and maybe even the election.

BOLDUAN: Yes. I mean -- and everyone always listens and kind of, I guess, tries to decipher and translate what the Fed chairman says in his press conference. So, we'll be listening to that.

There's also -- we're going to turn here to a different, very interesting story as well. A judge in Delaware striking down a multibillion-dollar payout package for Elon Musk. It had boosted him to be one of the richest people in the world, and now -- I mean, what happened? What happened here?

EGAN: Kate, this was a stunning decision out of Delaware. A judge there ruling that Musk and Tesla failed to prove that this 2018 compensation package was fair. Now, remember, this monster pay package, valued at $51 billion, billion with a B, at today's prices, did help make Elon Musk one of the richest people on the planet. And Tesla shareholders, they don't really have a lot to complain about in the sense that he's gotten some amazing results. When this pay package was approved back in 2018, Tesla was worth $54 billion; today, more than $600 billion. That's a return of more than 1,000 percent.

BOLDUAN: But a judge says, sorry, sorry, I'm taking it away?

EGAN: Yes, and listen, Elon Musk is, of course, firing back as only he can, right? He's pushing back at Delaware where this play -- this court case played out. Where Tesla and other companies are incorporated. Musk said on his social media platform, X, never incorporate your company in the state of Delaware. He also asked his millions of followers on social media whether Tesla should change its state of incorporation to Texas, where its physical headquarters are.

Now the question is, how does Tesla and its board respond? They could appeal or they could use this as an opportunity to try to get a new pay package installed for Elon Musk, one that could really lock him up for the long term.


Because, you know, as one analyst this morning put it, at the end of the day, Elon Musk is Tesla, and Tesla is Elon Musk, and the company can't afford any sort of uncertainty over whether -- where he'll be in the long term.

BOLDUAN: I think one certainty is Elon Musk is not going to convince people to not incorporate in Delaware, since that is what a lot of companies do.

EGAN: Yes.

BOLDUAN: It's good to see you, Matt. Thank you.

EGAN: Thank you, Kate.


BERMAN: All right. A member of Congress, under federal investigation, accused of paying her husband with federal money.



BERMAN: You have a product that is killing people, that is what tech CEOs were just told moments ago on Capitol Hill, part of this contentious testimony as they answer questions from senators who want answers about what these companies are doing to protect children.

You're listening right now to Linda Yaccarino who runs X, everyone knows it as Twitter, for Elon Musk. Let's listen.

LINDA YACCARINO, CEO, X CORP: The Kids Online Safety Act should continue to progress and we will support the continuation to engage with it and ensure the protections of the freedom of speech.

There are two additional areas that require everyone's attention. First, as the daughter of a police officer, law enforcement must have the critical resources to bring these bad offenders to justice. Second, with artificial intelligence, offenders' tactics will continue to sophisticate and evolve. Industry collaboration is imperative here. X believes that the freedom of speech and platform safety can and must coexist. We agree that now is the time to act with urgency. Thank you. I look forward to answering your questions.

SEN. DICK DURBIN (D-IL), CHAIRMAN, JUDICIARY COMMITTEE: Thank you very much, Ms. Yaccarino. Now, we're going to go to rounds of questions, seven minutes each for the members as well. I would like to make note of your testimony, Ms. Yaccarino. I believe you are the first social media company to publicly endorse the Stop CSAM Act.

YACCARINO: It is our honor, Chairman.

DURBIN: That is progress, my friends. Thank you for doing that. I'm still going to be asking some probing questions, but let me get down to the bottom line here. I'm going to focus on my legislation on CSAM. What it says is civil liability, if you intentionally or knowingly host or store child sexual abuse materials or make child sex abuse materials available. Secondly, intentionally or knowingly promote or aid and abet a violation of child sexual exploitation laws. Is there anyone here who believes you should not be held civilly liable for that type of conduct? Mr. Citron?

JASON CITRON, CEO, DISCORD: Good morning, Chair. You know, we very much believe that this content is disgusting and that there are many things about the Stop CSAM Bill that I think are very encouraging, and we very much support adding more resources for the CyberTipline and modernizing that along with giving more resources to NCMEC. And we're -- be very open to having conversations with you and your team to talk through the details of the bill some more.

DURBIN: I sure would like to do that, because if you intentionally or knowingly host or store CSAM, I think you ought to at least be civilly liable. I can't imagine anyone who would disagree with that.

CITRON: Yes, it's disgusting content.

DURBIN: It certainly is. That's why we need you supporting this legislation.

Mr. Spiegel, I want to tell you, I listened closely to your testimony here, and it's never been a secret that Snapchat is used to send sexually explicit images. In 2013, early in your company's history, you admitted this in an interview. Do you remember that interview?

EVAN SPIEGEL, CO-FOUNDER AND CEO, SNAP INC.: Senator, I don't recall the specific interview.

DURBIN: You said that when you were first trying to get people on the app, you would, "Go up to the people and be like, hey, you should try this application. You can send disappearing photos." And they would say, oh, for sexting? Do you remember that interview?

SPIEGEL: Senator, when we first created the application, it was actually called Picaboo, and it -- the idea was around disappearing images. We -- the feedback we received from people using the app is that they were actually using it to communicate. So, we changed the name of the application to Snapchat, and we found that people were using it to talk visually.

DURBIN: As early as 2017, law enforcement identified Snapchat as the pedophile's go-to sexual exploitation tool. The case of a 12-year-old girl, identified in court only as LW, shows the danger. Over two and a half years, a predator sexually groomed her, sending her sexually explicit images and videos over Snapchat. The man admitted that he only used Snapchat with LW and not any other platforms because he, "Knew that chats would go away." Did you or everyone else at Snap really fail to see that the platform was the perfect tool for predators?


SPIEGEL: Senator, that behavior is disgusting and reprehensible. We provide in app reporting tools so that people who are being harassed or who are, you know, have been shared inappropriate sexual content can report it. In the case of harassment or sexual content, we typically respond to those reports within 15 minutes so that we can provide help.

DURBIN: When LW, the victim, sued Snapchat, her case was dismissed under Section 230 of the Communications Decency Act. Do you have any doubt that had Snap faced the prospect of civil liability for facilitating sexual exploitation, the company would have implemented even better safeguards?

SPIEGEL: Senator, we already work extensively to proactively detect this type of behavior. We make it very difficult for predators to find teens on Snapchat. There are no public friends list. No public profile photos. When we recommend friends for teens, we make sure that they have several mutual friends in common before making that recommendation. We believe those safeguards are important to preventing predators from misusing our platform.

DURBIN: Mr. Citron, according to Discord's website, it takes a, "Proactive and automated approach to safety only on servers with more than 200 members. Smaller servers rely on server owners and community moderators to define and enforce behavior." So, how do you defend an approach to safety that relies on groups of fewer than 200 sexual predators to report themselves for things like grooming, trading CSAM, or sextortion?

CITRON: Chair, our goal is to get all of that content off of our platform and ideally prevent it from showing up in the first place or from people engaging in these kinds of horrific activities. We deploy a wide array of techniques that work across every surface on our -- on Discord. I mentioned we recently launched something called Teen Safety Assist, which works everywhere and it's on by default for teen users. That kind of acts like a buddy that lets them know if they are in a situation or talking with someone that may be inappropriate. So, they can report that to us and block that user.

So, we --

DURBIN: Mr. Citron, if that were working, we wouldn't be here today.

CITRON: Senator, this is an ongoing challenge for all of us; that is why we're here today. But we do have 15 percent of our company focused on trust and safety, of which this is one of our top issues. That's more people than we have working on marketing and promoting the company. So, we take these issues very seriously, but we know it's an ongoing challenge, and I look forward to working with you and collaborating with our tech peers and the nonprofits to improve our approach.

DURBIN: Certainly, hope so.

Mr. Chew, your business is one of the more popular ones among children. Can you explain to us what you are doing particularly, and whether you have seen any evidence of CSAM in your business?

SHOU ZI CHEW, CEO, TIKTOK: Yes, Senator. We have a strong commitment to invest in trust and safety. And as I said in my opening statement, I intend to invest more than $2 billion in trust and safety this year alone. We have 40,000 safety professionals, you know, working on this topic. We have built a specialized child safety team to help us identify specialized issues, horrific issues like material like the ones you have mentioned.

If we identify any on our platform and we proactively do detection, we will remove it and we will report them to NCMEC and other authorities.

DURBIN: Why is TikTok allowing children to be exploited into performing commercialized sex acts?

CHEW: Senator, I respectfully disagree with that characterization. Our live streaming product is not for anyone below the age of 18. We have taken action to identify anyone who violates that and we remove them from the -- from using that service.

DURBIN: At this point, I'm going to turn to my Ranking Member Senator Graham.


SEN. LINDSEY GRAHAM (R-SC), RANKING MEMBER, JUDICIARY COMMITTEE: Mr. Citron, you said we need to start a discussion. To be honest with you, we've been having this discussion for a very long time. We need to get a result, not a discussion. Do you agree with that?

CITRON: Ranking Member, I agree this is an issue that we have also been very focused on since we started our company in 2015.


CITRON: But this is the first time we've been --

GRAHAM: Are you familiar with the EARN IT Act, authored by myself and Senator Blumenthal?

CITRON: A little bit, yes.

GRAHAM: OK. Do you support that?


GRAHAM: Like yes or no?

CITRON: We're not prepared to support it today, but we believe the section --

GRAHAM: OK. Do you support the Stop CSAM Act?

CITRON: The Stop CSAM Act, we are not prepared to support today, but we think they're --

GRAHAM: OK. Do you support the Shield Act?

CITRON: We believe that the CyberTipline and --


GRAHAM: Do you support it, yes or no?

CITRON: We believe that the CyberTipline and NCMEC --

GRAHAM: I'll take that to be no. The Project Safe Childhood Act, do you support it?

CITRON: We believe that --

GRAHAM: I'll take that to be no. The Report Act, do you support it?

CITRON: Ranking Member Graham, we very much look forward to having conversations with you and your team. We want to be part of a --

GRAHAM: Thank you. Well, I look forward to passing a bill that will solve the problem. Do you support removing Section 230 liability protections for social media companies?

CITRON: I believe that Section 230 is -- needs to be updated. It's a very old law.

GRAHAM: Do you support repealing it so people can sue if they believe they are harmed?

CITRON: I think that Section 230 as written, while it has many downsides --

GRAHAM: Thank you. Thank you.

CITRON: -- has enabled innovation on the internet --

GRAHAM: Thank you.

CITRON: -- which I think has largely been --

GRAHAM: Thank you very much. So, here you are. You got -- if you're waiting on these guys to solve the problem, we're going to die waiting.

Mr. Zuckerberg -- mister -- try to be respectful here. The representative from South Carolina, Mr. Guffey's son, got caught up in a sextortion ring in Nigeria using Instagram. He was shaken down, paid money, that wasn't enough, and he killed himself using Instagram. What would you like to say to him?

MARK ZUCKERBERG, FOUNDER AND CEO, META: It's terrible. I mean, no one should have to go through something like that.

GRAHAM: Do you think he should be allowed to sue you?

ZUCKERBERG: I think that they can sue us.

GRAHAM: Well, I think he should be able to, and he can't. So, bottom line here, folks, is that this committee is done with talking. We passed five bills unanimously that, in their different ways -- and look at who did this. Graham, Blumenthal. Durbin, Hawley. Klobuchar, Cornyn. Cornyn and Klobuchar. Blackburn and Ossoff. I mean, we've found common ground here that just is astonishing. And we've had hearing after hearing, Mr. Chairman. And the bottom line is, I've come to conclude, gentlemen, that you're not going to support any of this.

Linda, how do you say your last name?

YACCARINO: Yaccarino.

GRAHAM: Do you support the EARN IT Act?

YACCARINO: We strongly support the collaboration to raise industry practices to --

GRAHAM: No, no, no. Do you support the EARN IT Act?

YACCARINO: -- prevent CSAM --

GRAHAM: Do you support the EARN IT Act? Do you support the EARN IT -- in English, do you support the EARN IT Act, yes or no? We don't need double-speak here.

YACCARINO: We look forward to supporting and continue our conversations.

GRAHAM: OK. So, I'll take that as a no.

YACCARINO: As you can see --

GRAHAM: But you have -- you have taken -- the reason the EARN IT Act is important, you can actually lose your liability protections when children are exploited and you didn't use best business practices. See, the EARN IT Act means you have to earn liability protection. You're given that no matter what you do. So, to the members of this committee, it is now time to make sure that the people who are holding up the signs can sue on behalf of their loved ones. Nothing will change until the courtroom door is open to victims of social media.

$2 billion, Mr. Chew. How much -- what percentage is that of what you made last year?

CHEW: Senator, it's a significant and increasing investment. As a private company --

GRAHAM: But, I mean, like --

CHEW: -- we're not sharing our financials.

GRAHAM: -- you pay taxes. I mean, $2 billion is what percent of your revenue?

CHEW: Senator, we are not ready to share our financials in public.

GRAHAM: Well, I just think $2 billion sounds like a lot, unless you make $100 billion. So, the point is, you know, when you tell us you're going to spend $2 billion, great. But how much do you make? You know, it's all about eyeballs. Well, our goal is to get eyeballs on you. And it's not just about children. I mean, the damage being done. Do you -- are you -- do you realize, Mr. Chew, that your TikTok representative in Israel resigned yesterday?

CHEW: Yes, I am aware.

GRAHAM: OK. And he said, I resigned from TikTok. We're living in a time in which our existence as Jews in Israel and Israel is under attack and in danger. Multiple screenshots taken from TikTok's internal employee chat platform known as Lark (ph) show how TikTok's trust and safety officers celebrate the barbaric acts of Hamas and other Iranian-backed terror groups, including the Houthis in Yemen.

CHEW: Senator, I need to make it very clear that pro-Hamas content and hate speech is not allowed --

GRAHAM: Why did you --

CHEW: -- on our platform --

GRAHAM: Why did he resign?

CHEW: -- and within our country.

GRAHAM: Why did he resign? Why did he quit?

CHEW: Senator, we also do not allow any hateful behavior --

GRAHAM: Do you know why he quit? Do you know why he quit?

CHEW: We do not allow this. We will investigate such claims.

GRAHAM: But my question is, he quit, but I'm sure he had a good job.