Facebook Whistleblower Testifies Before Senate. Aired 10-10:30a ET
Aired October 05, 2021 - 10:00 ET
THIS IS A RUSH TRANSCRIPT. THIS COPY MAY NOT BE IN ITS FINAL FORM AND MAY BE UPDATED.
ERICA HILL, CNN NEWSROOM: Good morning. I'm Erica Hill.
JIM SCIUTTO, CNN NEWSROOM: And I'm Jim Sciutto. Welcome to our viewers here in the U.S. and around the world.
Breaking news right now, critical pushback against the powers of social media today, in a matter of minutes, Facebook Whistleblower Frances Haugen will testify before a Senate panel. She claims that Facebook is prioritizing profit over public good by knowingly peddling disinformation, in fact, hiding the results of its own internal studies.
We're going to go live to that hearing once it begins just moments from now.
HILL: In prepared testimony obtained by CNN, Haugen, who released tens of thousands of pages of internal research and documents, plans to say, quote, I believe what I did was right and necessary for the common good. But I know Facebook has infinite resources, which it could use to destroy me. As long as Facebook is operating in the dark, it is accountable to no one and it will continue to make choices that go against the common good.
Let's begin our coverage this morning with CNN Correspondent Donie O'Sullivan and Chief Media Correspondent Brian Stelter. Good to have both of you with us.
Donie, I just want to start with you. As we are waiting, we did, as we mentioned, get that opening statement. And she also talks about how what she saw at Facebook was so different from her experience at other social media companies. That stands out too.
DONIE O'SULLIVAN, CNN CORRESPONDENT: Yes, and I think we got a reminder of that yesterday, guys, right? I mean, today is all about power, the power of Facebook. And we saw just how powerful it was yesterday with that outage. One company controls three of the world's biggest platforms, WhatsApp, Instagram and Facebook. So, when one of them goes down, oftentimes, they'll all go down, as we saw yesterday.
But you're right, she said she has experience at other platforms, and what she saw at Facebook concerned her so much that she's now shortly going to begin testimony in the room behind me.
SCIUTTO: We see her walking in there now. I'm just going to quote from her opening statement because the language is pointed, to say the least. Facebook, she says, became a $1 trillion company by paying for its profits with our safety, including the safety of our children, and that is unacceptable.
Brian Stelter, as we wait for her to begin her testimony here, listen, this is not the first bout of bad P.R. for Facebook here and yet the profit spigot keeps on spitting out money, right? The clicks keep coming. And, by the way, the disinformation continues to get shared. I just wonder, do you see this as a moment of change?
BRIAN STELTER, CNN CHIEF MEDIA CORRESPONDENT: Well, every time there has been a Facebook scandal, think about Cambridge Analytica as an example, Facebook takes on a few weeks of bad press then they try to lay low, and they come back stronger than ever. Advertisers always come back to Facebook, and most users come back, although they have lost in some ways around the edges among younger users who might like Instagram, but not Facebook. But since Instagram is owned by Facebook, the house always wins. Mark Zuckerberg always wins.
If this time is different, it's because it involves children. The title of this hearing is protecting kids online. So, this is about a vulnerable population that gets addicted to these products at a very early age. Any parent knows that to be true, and we're only just beginning as a society to reckon with the consequences. So, that is why her testimony, her decision to come forward -- by the way, she is one of tens of thousands of Facebook employees, and she decided she had to quit and speak this truth -- could make a difference, because it's about kids.
SCIUTTO: Yes, we had that father on last hour, right, who placed some blame on Instagram for his daughter's suicide.
HILL: Absolutely, Ian Russell, who joined New Day earlier, and he talked about how his daughter didn't use social media a lot, but after her death, they recognized some of the places that she was visiting and what it was pointing her towards, right? And when you talk about maybe kids aren't using Facebook as much, Brian, but they are going to Instagram, that's the concern, right? I mean, look, we both have teenagers and we see this. The concern is what it is being engineered to do and what it is feeding these kids in terms of body image issues. Showing them that suicide is an answer, as we heard from that father.
STELTER: And when I interviewed Facebook executive Nick Clegg over the weekend, he said, well, this has always been a challenge for teenagers. It's always been a complicated environment for young women. Look at women's magazines, fashion magazines being a bad influence. The obvious difference is that fashion magazines are not as influential and addictive. They don't pull you back every minute of every hour of every day.
And what Haugen is going to share, I think, this morning is that other staffers have these same concerns. That's partly why she's shared these documents. She is not the only one inside Facebook with these fears.
SCIUTTO: Yes. People do not check fashion magazines 27 times a day, like social media feeds.
This is the chairman, the Senate Subcommittee on Consumer Protection, Richard Blumenthal.
SEN. RICHARD BLUMENTHAL (D-CT): -- ranking member, Senator Blackburn, for her cooperation and collaboration.
We've been working very closely. And the ranking member who is here, Senator Wicker, as well as our chairwoman, Maria Cantwell, Senator Cantwell, I'm sure will be here shortly.
Most important, I'd like to thank our witness, Frances Haugen, for being here and the two counsel who are representing her today. And I want to give you my heartfelt gratitude for your courage and strength in coming forward as you have done standing up to one of the most powerful, implacable corporate giants in the history of the world without any exaggeration.
You have a compelling, credible voice, which we've heard already, but you are not here alone. You're armed with documents and evidence. And you speak volumes as they do about how Facebook has put profits ahead of people.
Among other revelations, the information you have provided to Congress is powerful proof that Facebook knew its products were harming teenagers. Facebook exploited teens using powerful algorithms that amplified their insecurities and abuses through what it found was an addict's narrative.
There is a question, which I hope you will discuss, as to whether there is such a thing as a safe algorithm. Facebook saw teens creating secret accounts that are often hidden from their parents as a unique value proposition, in their words, a unique value proposition, a way to drive up numbers for advertisers and shareholders at the expense of safety. And it doubled down on targeting children, pushing products on preteens, not just teens, but preteens, that it knows are harmful to our kids' mental health and well-being.
Instead of telling parents, Facebook concealed the facts. It sought to stonewall and block this information from becoming public, including to this committee when Senator Blackburn and I specifically asked the company. And still, even now, as of just last Thursday when a Facebook witness came before this committee, it has refused disclosure or even to tell us when it might decide whether to disclose additional documents.
And they've continued their tactics even after they knew the destruction it caused. It isn't just that they made money from these practices but they continued to profit from them. Their profit was more important than the pain that they caused. Last Thursday, a message from Ms. Antigone Davis, Facebook's Global Head of Safety, was simple. Quote, this research is not a bombshell, end quote. And she repeated the line, not a bombshell. Well, this research is the very definition of a bombshell.
Facebook and big tech are facing a big tobacco moment, a moment of reckoning. The parallel is striking. I sued big tobacco as Connecticut's attorney general. I helped to lead the states in that legal action. And I remember very, very well the moment in the course of our litigation when we learned of those files that showed not only that big tobacco knew that its product caused cancer, but that they had done the research, they concealed the files, and now we knew and the world knew.
And big tech now faces that big tobacco, jaw-dropping moment of truth. It is documented proof that Facebook knows its products can be addictive and toxic to children. And it's not just that they made money. Again, it's that they valued their profit more than the pain that they caused to children and their families.
The damage to self-interest and self-worth inflicted by Facebook today will haunt a generation. Feelings of inadequacy and insecurity, rejection and self-hatred will impact this generation for years to come.
Our children are the ones who are victims.
Teens today looking at themselves in the mirror feel doubt and insecurity. Mark Zuckerberg ought to be looking at himself in the mirror today, and yet rather than taking responsibility and showing leadership, Mr. Zuckerberg is going sailing. His new modus operandi: no apologies, no admission, no action, nothing to see here. Mark Zuckerberg, you need to come before this committee. You need to explain to Frances Haugen, to us, to the world, and to the parents of America what you were doing and why you did it.
Instagram's business model is pretty straightforward, more eyeballs, more dollars. Everything Facebook does is to add more users and keep them on their apps for longer. In order to hook us, Instagram uses our private information to precisely target us with content and recommendations, assessing that what will provoke a reaction will keep us scrolling.
Far too often, these recommendations encourage our most destructive and dangerous behaviors. As we showed on Thursday, we created a fake account, my office and I did, as a teen interested in extreme dieting and eating disorders. Instagram latched on to that teenager's initial insecurities and then pushed more content and recommendations glorifying eating disorders.
That's how Instagram's algorithms can push teens into darker and darker places. Facebook's own researchers called it Instagram's, quote, perfect storm, exacerbating downward spirals. Facebook, as you have put it, so powerfully, maximizes profits and ignores pain. Facebook's failure to acknowledge and to act makes it morally bankrupt. Again and again, Facebook rejected reforms recommended by its own researchers. Last week, Ms. Davis said, quote, we're looking at, end quote, no specific plans, no commitments, only vague platitudes.
These documents that you have revealed provided this company with a blueprint. They provide specific recommendations that could have made Facebook and Instagram safer. But the company repeatedly ignored those recommendations from its own researchers that would have made Facebook and Instagram safer.
Facebook researchers have suggested changing their recommendations to stop promoting accounts known to encourage dangerous body comparison. Instead of making meaningful changes, Facebook simply pays lip service. And if they won't act, if big tech won't act, Congress has to intervene.
Privacy protection is long overdue. Senator Markey and I have introduced the KIDS Act, which would ban addictive tactics that Facebook uses to exploit children. Parents deserve better tools to protect their children.
I'm also a firm supporter of reforming Section 230. We should consider narrowing this sweeping immunity when platforms' algorithms amplify illegal conduct. You've commented on this in your testimony, and perhaps you'll expand on it.
We've also heard compelling recommendations about requiring disclosures of research and independent reviews of these platforms' algorithms, and I plan to pursue these ideas. The Securities and Exchange Commission should investigate your contentions and claims, Ms. Haugen, and so should the Federal Trade Commission. Facebook appears to have misled the public and investors. And if that's correct, it ought to face real penalties as a result of that misleading and deceptive misrepresentation.
I want to thank all my colleagues who are here today because what we have is a bipartisan congressional roadmap for reform that will safeguard and protect children from big tech. That will be a focus of our subcommittee moving forward and it will continue to be bipartisan.
And, finally, I'll just end on this note. In the past weeks and days, parents have contacted me with their stories, heartbreaking and spine chilling stories about children pushed into eating disorders, bullying online, self-injury of the most disturbing kind and sometimes even taking their lives because of social media. Parents are holding Facebook accountable because of your bravery, Ms. Haugen. And we need to hold accountable Facebook and all big tech as well.
Again, my thanks to you. I am going to enter into the record a letter from 52 state attorneys general and from two members of the youth advisory board of Sandy Hook Promise, as long as there's no objection. And now I'll turn to the ranking member, Senator Blackburn.

SEN. MARSHA BLACKBURN (R-TN): Thank you, Mr. Chairman, and thank you for entering that letter in the record that we have from our state attorneys general. Good morning to everyone. It is nice to see people in this hearing room and to be here for the hearing today, Ms. Haugen. We thank you for your appearance before us today and for giving the opportunity not only for Congress but for the American people to hear from you in this setting, and we appreciate that.
Mr. Chairman, thanks also to you and your staff for working with our team to make certain that we had this hearing and this opportunity today, so that we can get more insight into what Facebook is actually doing as it invades the privacy not only of adults but of children, and look at the ways that it is in violation of the Children's Online Privacy Protection Act, which is federal law, and how it is evading that law and working around it.
And, as the chairman said, passing a federal online privacy standard has been long in the works. I filed my first privacy bill when I was in the House back in 2012, and I think that it will be this Congress and this subcommittee that is going to lead the way on online privacy, data security, and Section 230 reforms. And, of course, Senator Klobuchar always wants to talk about antitrust.
And I have to give a nod. Senator Markey is down there. When we were in the House, we were probably two of the only ones who were talking about the need to have a federal privacy standard.
Now, as the chairman mentioned, last week we heard from Ms. Davis, who heads global safety for Facebook. And it was surprising to us that what she tried to do was minimize the information that was in these documents, to minimize the research and to minimize the knowledge that Facebook had. At one point, I even reminded her the research was not third-party research. The research was their own, Facebook's, internal research. So, they knew what they were doing. They knew where the violations were, and they know they are guilty. They know this. Their research tells them this.
Last week, in advance of our hearing, Facebook released two studies and said that The Wall Street Journal was all wrong, that they had just gotten it wrong. As if The Wall Street Journal did not know how to read these documents and how to work through this research.
Having seen the data that you've presented and the other studies that Facebook did not publicly share, I feel pretty confident that it's Facebook who has done the misrepresenting to this committee.
Here are some of the numbers that Facebook chose not to share. And, Mr. Chairman, I think it's important that we look at these as we talk about the setting for this hearing, what we learned last week, what you and I have been learning over the past three years about big tech and Facebook. And here you go. 66 percent of teen girls on Instagram and 40 percent of teen boys experience negative social comparisons. This is Facebook's research. 52 percent of teen girls who experience negative social comparison on Instagram said it was caused by images related to beauty. Social comparison is worse on Instagram because it is perceived as real life, but based on celebrity standards. Social comparison mimics the grief cycle and includes a downward emotional spiral encompassing the range of emotions from jealousy to self-proclaimed body dysmorphia.
Facebook addiction, which Facebook conveniently calls problematic use, is most severe in teens, peaking at age 14. Facebook is not interested in making significant changes to improve kids' safety on their platforms, at least not when that would result in losing eyeballs on posts or decreasing their ad revenues. In fact, Facebook is running scared, as they know that, in their own words, young adults are less active and less engaged on Facebook and that they are running out of teens to add to Instagram. So, teens are looking at other platforms, like TikTok, and Facebook is only making those changes that add to its user numbers and ultimately its profits. Follow the money.
So what are these changes? Allowing users to create multiple accounts that Facebook does not delete and encouraging teens to create second accounts they can hide from their parents. They are also studying younger and younger children, as young as eight, so that they can market to them. And while Ms. Davis says that kids below 13 are not allowed on Facebook or Instagram, we know that they are, because she told us that they recently had deleted 600,000 accounts from children under the age of 13.
So, how do you get that many underage accounts if you aren't turning a blind eye to them in the first place? And then, in order to try to clean it up, you go to delete them, and then you say, oh, by the way, just in the last month, we deleted 600,000 underage accounts.
And speaking of turning a blind eye, Facebook turns a blind eye to user privacy. News broke yesterday that the private data of over 1.5 billion, that's right, 1.5 billion Facebook users is being sold on a hacking forum. That's its biggest data breach to date. Examples like this underscore my strong concerns about Facebook collecting the data of kids and teens and what they are doing with it.
Facebook also turns a blind eye toward blatant human exploitation, taking place on its platform, trafficking, forced labor, cartels, the worst possible things one can imagine. Big tech companies have gotten away with abusing consumers for too long. It is clear that Facebook prioritizes profit over the well-being of children and all users. So, as a mother and a grandmother, this is an issue that is of particular concern to me.
So, we thank you for being here today, Ms. Haugen, and we look forward to getting to the truth about what Facebook is doing with users data and how they are abusing their privacy and how they show a lack of respect for the individuals that are on their network. We look forward to the testimony. Thank you, Mr. Chairman.
BLUMENTHAL: Thanks, Senator Blackburn. I don't know whether ranking member would like to make a --
SEN. ROGER WICKER (R-MS): If you don't mind. Thank you, Chairman Blumenthal, and I will just take a moment or two. And I do appreciate being able to speak as ranking member of the full committee. This -- Ms. Haugen, this is a subcommittee hearing. You see some vacant seats. And this is pretty good attendance for a subcommittee. There are also a lot of things going on so people will be coming and going. But I'm willing to predict this will have almost 100 percent attendance by members of the subcommittee because of the importance of this subject matter.
So, thanks for coming forward to share concerns about Facebook's business practices, particularly with respect to children and teens. And, of course, that is the main topic of our -- it's the title of our hearing today, protecting kids online.
The recent revelations about Facebook's mental health effects on children and its plan to target younger audiences are indeed disturbing. And I think you're going to see a lot of bipartisan concern about this today and in future hearings. They just -- they show how urgent it is for Congress to act against powerful tech companies on behalf of children and the broader public. And I say powerful tech companies, they are possessive of immense, immense power. Their product is addictive and people on both sides of this dais are concerned about this.
I talked to an opinion-maker just down the hall a few moments before the hearing. This person said the tech gods have been demystified now. And I think this hearing today, Mr. Chair, is part of the process of demystifying big tech.
The children of America are hooked on their product. It is often destructive and harmful and there's a cynical knowledge on behalf of the leadership of these big tech companies that that is true. Ms. Haugen, I hope you will have a chance to talk about your work experience at Facebook and perhaps compare it to other social media companies. I also look forward to hearing your thoughts on how this committee and how this Congress can ensure greater accountability and transparency, especially with regard to children. So, thank you, Mr. Chairman, and thank you, Ms. Haugen, for being here today.
BLUMENTHAL: Thanks, Senator Wicker.
Our witness this morning is Frances Haugen. She was the lead product manager on the Facebook civic misinformation team. She holds a degree in electrical and computer engineering from Olin College and an MBA from Harvard. She made the courageous decision, as all of us here and many others around the world know, to leave Facebook and reveal the terrible truths about the company she learned during her tenure there. And I think we are all in agreement here in expressing our gratitude and our admiration for your bravery in coming forward. Thank you, Ms. Haugen. Please proceed.
FRANCES HAUGEN, FACEBOOK WHISTLEBLOWER: Good afternoon, Chairman Blumenthal, Ranking Member Blackburn and members of the subcommittee. Thank you for the opportunity to appear before you. My name is Frances Haugen. I used to work at Facebook. I joined Facebook because I think Facebook has the potential to bring out the best in us, but I'm here today because I believe Facebook's products harm children, stoke division and weaken our democracy.
The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They won't solve this crisis without your help.
Yesterday, we saw Facebook get taken off the internet. I don't know why it went down, but I know that for more than five hours, Facebook wasn't used to deepen divides, destabilize democracies and make young girls and women feel bad about their bodies. It also means that millions of small businesses weren't able to reach potential customers and countless photos of new babies weren't joyously celebrated by family and friends around the world.
I believe in the potential of Facebook.