
CNN Newsroom

Facebook Whistleblower Testifies Before Senate. Aired 10:30-11a ET

Aired October 05, 2021 - 10:30   ET

THIS IS A RUSH TRANSCRIPT. THIS COPY MAY NOT BE IN ITS FINAL FORM AND MAY BE UPDATED.


(COMMERCIAL BREAK)

[10:30:00]

FRANCES HAUGEN, FACEBOOK WHISTLEBLOWER: We can have social media we enjoy that connects us without tearing apart our democracy, putting our children in danger and sowing ethnic violence around the world. We can do better.

I have worked as a product manager at large tech companies since 2006, including Google, Pinterest, Yelp and Facebook. My job has largely focused on algorithmic products, like Google Plus Search, and recommendation systems, like the one that powers the Facebook news feed.

Having worked on four different types of social networks, I understand how complex and nuanced these problems are. However, the choices being made inside of Facebook are disastrous for our children, our public safety, for our privacy and for our democracy, and that is why we must demand Facebook make changes.

During my time at Facebook, first working as the lead product manager for civic misinformation and later on counterespionage, I saw Facebook repeatedly encounter conflicts between its own profits and our safety. Facebook consistently resolved these conflicts in favor of its own profits. The result has been more division, more harm, more lies, more threats and more combat.

In some cases, this dangerous online talk has led to actual violence that harms and even kills people. This is not simply a matter of certain social media users being angry or unstable or about one side being radicalized against the other. It is about Facebook choosing to grow at all costs, becoming an almost trillion dollar company by buying its profits with our safety.

During my time at Facebook, I came to realize the devastating truth. Almost no one outside of Facebook knows what happens inside of Facebook. The company intentionally hides vital information from the public, from the U.S. government and from governments around the world.

The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems and its role in spreading divisive and extreme messages. I came forward because I believe that every human being deserves the dignity of truth.

The severity of this crisis demands that we break out of our previous regulatory frames. Facebook wants to trick you into thinking that privacy protections or changes to section 230 alone will be sufficient. While important, these will not get to the core of the issue, which is that no one truly understands the destructive choices made by Facebook except Facebook.

We can afford nothing less than full transparency. As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable. Until the incentives change, Facebook will not change. Left alone, Facebook will continue to make choices that go against the common good, our common good.

When we realized big tobacco was hiding the harms it caused, the government took action. When we figured out cars were safer with seat belts, the government took action. And when we learned that opioids were taking lives, the government took action. I implore you to do the same here.

Today, Facebook shapes our perception of the world by choosing the information we see. Even those who don't use Facebook are impacted by the majority who do. A company with such frightening influence over so many people, over their deepest thoughts, feelings and behavior, needs real oversight.

But Facebook's closed design means it has no real oversight. Only Facebook knows how it personalizes your feed for you. At other large tech companies, like Google, any independent researcher can download the company's search results from the internet and write papers about what they find, and they do. But Facebook hides behind walls that keep researchers and regulators from understanding the true dynamics of their system.

Facebook will tell you privacy means they can't give you data. This is not true. When tobacco companies claimed that filtered cigarettes were safer for consumers, scientists could independently invalidate these marketing messages and confirm that, in fact, they posed a greater threat to human health.

The public cannot do the same with Facebook. We are given no other option than to take their marketing messages on blind faith. Not only does the company hide most of its own data, my disclosures have proved that when Facebook is directly asked questions as important as, how do you impact the health and safety of our children, they choose to mislead and misdirect.

[10:35:03]

Facebook has not earned our blind faith. This inability to see into Facebook's actual systems and confirm that they work as communicated is like the Department of Transportation regulating cars by only watching them drive down the highway.

Today, no regulator has a menu of solutions for how to fix Facebook because Facebook didn't want them to know about what's causing the problems. Otherwise there wouldn't have been a need for a whistleblower. How is the public supposed to assess if Facebook is resolving conflicts of interest in a way that is aligned with the public good if the public has no visibility into how Facebook operates? This must change.

Facebook wants you to believe that the problems we're talking about are unsolvable. They want you to believe in false choices. They want you to believe you must choose between a Facebook full of divisive and extreme content or losing one of the most important values our country was founded upon, free speech; that you must choose between public oversight of Facebook's choices and your personal privacy; that to be able to share fun photos of your kids with old friends, you must also be inundated with anger-driven virality.

They want you to believe that this is just part of the deal. I am here today to tell you that's not true. These problems are solvable. A safer, free-speech-respecting, more enjoyable social media is possible. But there's one thing that I hope everyone takes away from these disclosures: Facebook can change, but it's clearly not going to do so on its own. My fear is that without action, the divisive and extremist behaviors we see today are only the beginning. What we saw in Myanmar and are now seeing in Ethiopia are only the opening chapters of a story so terrifying no one wants to read the end of it.

Congress can change the rules that Facebook plays by and stop the many harms it is now causing. We now know the truth about Facebook's destructive impact. I really appreciate the seriousness with which the members of Congress and the Securities and Exchange Commission are approaching these issues. I came forward at great personal risk because I believe we still have time to act, but we must act now. I'm asking you, our elected representatives, to act. Thank you.

SEN. RICHARD BLUMENTHAL (D-CT): Thank you, Ms. Haugen. Thank you for taking that personal risk, and we will do anything and everything to protect and stop any retaliation against you and any legal action that the company, or anyone else, may take. I want to make that very clear in the course of these proceedings.

I want to ask you about this idea of disclosure. You've talked about, in effect, looking at a car going down the road. We're going to have five-minute rounds of questions, maybe a second round if you're willing to do it. We're here today to look under the hood, and that's what we need to do more of.

In August, Senator Blackburn and I wrote to Mark Zuckerberg and we asked him pretty straightforward questions about how the company works and safeguards children and teens on Instagram. Facebook dodged, sidetracked and, in effect, misled us. So I'm going to ask you a few straightforward questions to break down some of what you have said. And if you can answer them yes or no, that would be great.

Has Facebook's research, its own research, ever found that its platforms can have a negative effect on children's and teens' mental health or well-being?

HAUGEN: Many of Facebook's internal research reports indicate that Facebook has a serious negative harm on a significant portion of teenagers and younger children.

BLUMENTHAL: And has Facebook ever offered features that it knew had negative effect on children's and teens' mental health?

HAUGEN: Facebook knows that its amplification algorithms, things like engagement based ranking on Instagram, can lead children from very innocuous topics, like healthy recipes, I think all of us could eat a little more healthy, all the way from just something innocent, like healthy recipes to anorexia-promoting content over a very short period of time.

BLUMENTHAL: And has Facebook ever found, again, in its research, that kids show signs of addiction on Instagram?

[10:40:05]

HAUGEN: Facebook has studied a pattern that they call problematic use, what we might more commonly call addiction. It has a very high bar for what it believes it is: you self-identify that you don't have control over your usage and that it is materially harming your health, your schoolwork or your physical health.

5 to 6 percent of 14-year-olds have the self-awareness to admit to both of those questions. It is likely that far more than 5 to 6 percent of 14-year-olds are addicted to Instagram.

BLUMENTHAL: Last Thursday, my colleagues and I asked Ms. Davis, who was representing Facebook, about how the decision had been made whether to permanently pause Instagram for kids, and she said, quote, there's no one person who makes a decision like that, we think about it collaboratively. It's as though she couldn't mention Mark Zuckerberg's name. Isn't he the one who will be making this decision, from your experience in the company?

HAUGEN: Mark holds a very unique role in the tech industry in that he holds over 55 percent of all the voting shares for Facebook. There are no similarly powerful companies that are as unilaterally controlled. And in the end, the buck stops with Mark. There's no one currently holding him accountable but himself.

BLUMENTHAL: And Mark Zuckerberg, in fact, is the algorithm designer in chief, correct?

HAUGEN: I received an MBA from Harvard, and they emphasized to us that we are responsible for the organizations that we build. Mark has built an organization that is very metrics-driven and that is intended to be flat. There is no unilateral responsibility. The metrics make the decision. Unfortunately, that itself is a decision. And in the end, if he's the CEO and the chairman of Facebook, he is responsible for those decisions.

BLUMENTHAL: The buck stops with him?

HAUGEN: The buck stops with him.

BLUMENTHAL: And speaking of the buck stopping, you have said that Facebook should declare moral bankruptcy. I agree. I think its actions and its failure to acknowledge its responsibility indicate moral bankruptcy.

HAUGEN: There is a cycle occurring inside the company where Facebook has struggled for a long time to recruit and retain the number of employees it needs to tackle the large scope of projects that it has chosen to take on. Facebook is stuck in a cycle where it struggles to hire. That causes it to understaff projects, which causes scandals, which then makes it harder to hire.

Part of why Facebook needs to come out and say, we did something wrong, we made some choices that we regret, is that the only way we can move forward and heal Facebook is to first admit the truth. The way we'll have reconciliation and be able to move forward is by first being honest and declaring moral bankruptcy.

BLUMENTHAL: Being honest and acknowledging that Facebook has caused and aggravated a lot of pain simply to make more money.

HAUGEN: Yes.

BLUMENTHAL: And it has profited off spreading disinformation and misinformation and sowing hate. Facebook's answer to Facebook's destructive impact always seems to be more Facebook. We need more Facebook, which means more pain and more money for Facebook. Would you agree?

HAUGEN: I don't think at any point Facebook set out to make a destructive platform. I think the challenge is that Facebook has set up an organization where the parts of the organization responsible for growing and expanding the company are separate from, and not regularly cross-pollinated with, the parts of the company that focus on the harms the company is causing. And as a result, integrity actions, projects that were hard fought by the teams trying to keep us safe, are regularly undone by new growth projects that counteract those same remedies.

So I do think it's a thing of there are organizational problems that need oversight and Facebook needs help in order to move forward to a more healthy place.

BLUMENTHAL: And whether it's teens bullied into suicidal thoughts or the genocide of ethnic minorities in Myanmar or fanning the flames of division within our own country or Europe, they are ultimately responsible for the immorality of the pain it's caused.

HAUGEN: Facebook needs to take responsibility for the consequences of its choices. It needs to be willing to accept small tradeoffs on profit.

[10:45:00]

And I think just that act of being able to admit that it's a mixed bag is important. And I think what we saw from Antigone Davis last week is an example of the kind of behavior we need to support Facebook in growing out of, which, instead of just focusing on all the good they do, is admitting they have responsibilities to also remedy the harm.

BLUMENTHAL: But Mark Zuckerberg's new policy is no apologies, no admissions, no acknowledgment, nothing to see here, we're going to deflect it and go sailing. I turn to the ranking member.

SEN. MARSHA BLACKBURN (R-TN): Thank you, Mr. Chairman. Thank you for your testimony.

I want to stay with Ms. Davis and some of her comments, because I had asked her last week about the underage users. And she had made the comment, I'm going to quote from her testimony, if we find an account of someone who's under 13, we remove them. In the last three months, we removed 600,000 accounts of under-13-year-olds, end quote. And I have to tell you, it seems to me that there's a problem if you have 600,000 accounts from children who ought not to be there in the first place.

So what did Mark Zuckerberg know about Facebook's plans to bring kids on as new users and advertise to them?

HAUGEN: There are reports within Facebook that show cohort analyses where they examine at what ages people join Facebook and Instagram. And based on those cohort analyses -- so, Facebook likes to say children lie about their ages to get onto the platform. The reality is enough kids tell the truth that you can work backwards to figure out approximately the real ages of anyone who's on the platform.

When Facebook does cohort analyses and looks back retrospectively, it discovers things like up to 10 to 15 percent of even ten-year-olds in a given cohort may be on Facebook or Instagram.

BLACKBURN: Okay. So, this is why Adam Mosseri, who is the CEO of Instagram, when JoJo Siwa said to him, oh, I've been on Instagram since I was eight, would have replied that he didn't want to know that. So, it would be for this reason, correct?

HAUGEN: A pattern of behavior that I saw at Facebook was that often problems were so understaffed that there was kind of an implicit discouragement from having better detection systems. So, for example, my last team at Facebook was the counterespionage team within the threat intelligence org. And at any given time, our team could only handle a third of the cases that we knew about. We knew that if we built even a basic detector, we would likely have many more cases.

BLACKBURN: Yes, let me ask you this. So, you look at the way they have the data, but they're choosing to keep it and advertise from it, right, sell it to third parties. So what does Facebook do? You've got these 600,000 accounts that ought not to be on there.

HAUGEN: Probably more.

BLACKBURN: Right. But then you delete those accounts. But what happens to that data? Does Facebook keep that data? Do they keep it until those children go to age 13? Since, as you're saying, they can work backward and figure out the true age of a user. So what do they do with it? Do they delete it? Store it? Do they keep it? How do they process that?

HAUGEN: My understanding of Facebook's data retention policies, and I want to be clear, I didn't work directly on that, is that when they delete an account, they delete all the data within, I believe, 90 days, in compliance with GDPR. With regard to children underage on the platform, Facebook could do substantially more to detect those children, and they should have to publish for Congress those processes, because there are lots of subtleties in those things and they could be much more effective than what they're probably doing today.

BLACKBURN: Got it. Now, staying with these underage children, since this hearing is all about kids and about online privacy, I want you to tell me how Facebook is able to do market research on these children who are under age 13, because Ms. Davis really didn't deny this last week. So, how are they doing this? Do they bring kids into focus groups with their parents?

[10:50:03]

How do they get that permission? She said they got permission from parents. Is there a permission slip or a form that gets signed? And then how do they know which kids to target?

HAUGEN: There's a bunch to unpack there. We'll start with maybe how they recruit children, or teenagers, for focus groups. Most tech companies have systems where they can analyze the data that is on their servers. So, most of the focus groups I read about or saw analysis of were around Messenger Kids, which has children on it, and the focus groups appear to be children interacting in person.

Often, large tech companies use either sourcing agencies that will go and identify people who meet certain demographic criteria, or they will reach out directly based on data on the platform. So, for example, in the case of Messenger Kids, maybe you would want to study a child that was an active user and one that was a less active user. You might reach out to some that came from each population.

BLACKBURN: And so these are children under age 13?

HAUGEN: Yes.

BLACKBURN: And they know it?

HAUGEN: For some of these studies. And I assume they get permission, but I don't work on that.

BLACKBURN: Well, we're still waiting to get a copy of that parental consent form that would involve children.

My time has expired. Mr. Chairman, I'll save my other questions for our second round if we're able to get those. Thank you.

BLUMENTHAL: Great, thank you, Senator Blackburn.

Senator Klobuchar?

SEN. AMY KLOBUCHAR (D-MN): Thank you very much, Mr. Chairman. Thank you so much, Ms. Haugen, for shedding light on how Facebook, time and time again, has put profit over people.

When their own research found that more than 13 percent of teen girls say that Instagram made their thoughts of suicide worse, what did they do? They proposed Instagram for kids, which has now been put on pause because of public pressure. When they found out that their algorithms are fostering polarization, misinformation and hate, and that they allowed 99 percent of their violent content to remain unchecked on their platform, including in the lead-up to the January 6th insurrection, what did they do? Now, as we know, Mark Zuckerberg is going sailing and saying no apologies.

I think the time has come for action, and I think you are the catalyst for that action. You have said privacy legislation is not enough. I completely agree with you. But, you know, we have not done anything to update our privacy laws in this country, our federal privacy laws, nothing, zilch, in any major way. Why? Because there are lobbyists around every corner of this building who have been hired by the tech industry.

We have done nothing when it comes to making the algorithms more transparent, allowing for the university research that you referred to. Why? Because Facebook and the other tech companies are throwing a bunch of money around this town and people are listening to them. We have not passed anything significant, although we are working on a bipartisan basis in the antitrust subcommittee to get something done on consolidation, which, you understand, allows the dominant platforms to control all of this like the bullies in the neighborhood, buying out the companies that maybe could have competed with them and added the bells and whistles. So, the time for action is now.

So, I'll start with something that I asked Facebook's head of safety when she testified before us last week. I asked her how they estimate the lifetime value of a user for kids who start using their products before they turn 13. She evaded the question and said that's not the way we think about it. Is that right or is it your experience that Facebook estimates and puts a value on how much money they get from users, in general? I'll get to kids in a second. Is that a motivating force for them?

HAUGEN: Based on what I saw in terms of allocation of integrity spending, so, one of the things disclosed in The Wall Street Journal was that, I believe, it's like 87 percent of all the misinformation spending is spent on English, but only about like 9 percent of the users are English speakers. It seems that Facebook invests more in users who make them more money, even though the danger may not be evenly distributed based on profitability.

KLOBUCHAR: Does it make sense that having a younger person get hooked on social media at a young age makes them more profitable over the long term, as they have a life ahead of them?

HAUGEN: Facebook's internal documents talk about the importance of getting younger users, for example, tweens, onto Instagram, like Instagram Kids, because they know that children bring their parents online and things like that. And so they understand the value of younger users for the long-term success of Facebook.

[10:55:00]

KLOBUCHAR: Facebook reported advertising revenue to be $51.58 per user last quarter in the U.S. and Canada. When I asked Ms. Davis how much of that came from Instagram users under 18, she wouldn't say. Do you think that teens are profitable for their company?

HAUGEN: I would assume so, based on advertising for things like television; you get substantially higher advertising rates for customers who don't yet have preferences or habits. And so I'm sure they're some of the more profitable users on Facebook, but I do not work directly on that.

KLOBUCHAR: Another major issue that's come out of this is eating disorders. Studies have found that eating disorders actually have the highest mortality rate of any mental illness for women. And I led a bill on this with Senators Capito and Baldwin that we passed into law. And I'm concerned that these algorithms that they have push outrageous content promoting anorexia and the like.

I know it's personal to you. Do you think that their algorithms push some of this content to young girls?

HAUGEN: Facebook knows that engagement-based ranking, the way they pick the content in Instagram for young users, for all users, amplifies preferences. And they've done something called a proactive incident response, where they take things that they've heard, for example, can you be led by the algorithms to anorexia content, and they have literally recreated this experiment themselves and confirmed, yes, this happens to people. So, Facebook knows that they are leading young users to anorexia content.

KLOBUCHAR: Do you think they are deliberately designing their product to be addictive, beyond even the content?

HAUGEN: Facebook has a long history of having a successful and very effective growth division where they take little tiny tweaks and they constantly, constantly, constantly are trying to optimize it to grow. Those kinds of stickiness could be construed as things that facilitate addiction.

KLOBUCHAR: Right. Last thing I'll ask you. As we've seen this same kind of content in the political world, you brought up other countries and what's been happening there. On 60 Minutes, you said Facebook implemented safeguards to reduce misinformation ahead of the 2020 election but turned off those safeguards right after the election. And you know that the insurrection occurred January 6th. Do you think that Facebook turned off the safeguards because they were costing the company money, because it was reducing profits?

HAUGEN: Facebook has been emphasizing a false choice. They've said the safeguards that were in place before the election implicated free speech. The choices that were happening on the platform were really about how reactive and how twitchy the platform was, right, how viral. And Facebook changed those safety defaults in the run-up to the election because they knew they were dangerous. And because they wanted that growth back, they wanted the acceleration of the platform back after the election, they returned to their original defaults.

And the fact that they had to break the glass on January 6th and turn them back on, I think that's deeply problematic.

KLOBUCHAR: Agree. Thank you very much for your bravery in coming forward.

BLUMENTHAL: Senator Thune?

SEN. JOHN THUNE (R-SD): Thank you, Mr. Chair, and Ranking Member Blackburn.

I've been arguing for some time that it is time for Congress to act, and I think the question is always what is the correct way to do it, the right way to do it, consistent with our First Amendment right to free speech. This committee doesn't have jurisdiction over the antitrust issue; that's the Judiciary Committee. And I'm not averse to looking at the monopolistic nature of Facebook. Honestly, I think that's a real issue that needs to be examined and perhaps addressed as well.

But at least under this committee's jurisdiction, there are a couple of things I think we can do. And I have a piece of legislation, and Senators Blackburn and Blumenthal are both co-sponsors, called the Filter Bubble Transparency Act. And, essentially, what it would do is give users the option to engage with social media platforms without being manipulated by the secret formulas that essentially dictate the content that you see when you open up an app or log on to a website.

We also, I think, need to hold big tech accountable by reforming section 230. And one of the best opportunities to do that, at least in a bipartisan way, is the Platform Accountability and Consumer Transparency Act, or the PACT Act. That's legislation that I've co-sponsored with Senator Schatz, which, in addition to stripping section 230 protections for content that a court determines to be illegal, would also increase transparency and due process for users around the content moderation process.

And, importantly, in the context we're talking about today with this hearing, with a major big tech whistleblower, the PACT Act would explore the viability of a federal program for big tech employees to blow the whistle on wrongdoing inside the companies where they work.

[11:00:00]