
First Move with Julia Chatterley

Whistleblower Frances Haugen Testifies in the U.K. as Criticism against Facebook Mounts; HSBC Plans a $2 Billion Splurge as Profits Jump; President Erdogan Threatens to Expel Western Diplomats, Currency Plunges. Aired 9-10a ET

Aired October 25, 2021 - 09:00   ET

THIS IS A RUSH TRANSCRIPT. THIS COPY MAY NOT BE IN ITS FINAL FORM AND MAY BE UPDATED.


[09:00:03]

JULIA CHATTERLEY, CNN BUSINESS ANCHOR, FIRST MOVE: Live from New York, I'm Julia Chatterley. This is FIRST MOVE and here is your need to know.

Facebook fury. Whistleblower Frances Haugen testifies in the U.K. as criticism against the tech giant mounts.

Bumper buyback. HSBC plans a $2 billion splurge as profits jump.

And lira losses. President Erdogan threatens to expel Western diplomats, the currency plunges.

It's Monday, let's make a move.

A warm welcome once again to a new week here on FIRST MOVE, and it is a face-the-painful-truth moment for Facebook and its nearly three billion users.

The tech giant faces accusations that its new users were being targeted by dangerous, divisive, and untruthful content on its platforms and that, most importantly, it failed to stop it.

CNN is just one of a group of 17 news outlets with access to internal documents that paint a damning picture of Facebook's impact on our society.

We'll be speaking with renowned journalist Carole Cadwalladr, who rose to international prominence when she exposed the Cambridge Analytica scandal back in 2018. Remember that?

She says this is the beginning of the end for the House of Zuckerberg. Really? Well, we'll find out how and why she believes that.

And as always, I'll suggest we follow the money and ask whether this scandal breaks the Facebook business model. Remember, it relies on advertising by firms both big and small, and we spoke to the startup Clean Culture a few weeks ago, which said it wouldn't exist without being able to advertise on Facebook.

We'll find out what recent scandals have meant for Facebook's bottom line when they report earnings after the bell tonight. For now, it is the

opening bell on Wall Street that's our focus.

Futures are relatively unchanged, and consolidation, I think, is what we're seeing on the cards after Friday's record close for the Dow. Also an uneven start to the week in Europe and Asia, though the Shanghai Composite, as you can see there, closed firmly in the green.

Global stocks are shrugging off warning signals like inflation gauges in markets all around the world hitting multi-year highs -- the so-called inflation break-evens.

And U.S. Treasury Secretary Janet Yellen telling CNN on Sunday that inflation, at least as far as America is concerned, will not fall back to that two percent target level until the back half of next year. There is nothing temporary about that -- transitory about that.

Okay, let's get to the drivers and Facebook's face the music moment.

The social media giant facing fresh claims that it failed to act despite knowing its platform was being used to incite civil violence in Ethiopia, to trade domestic workers, and to spread QAnon conspiracy theories. This damning picture painted by a new trove of internal documents referred to as the Facebook Papers.

The documents were among the disclosures to the U.S. S.E.C. by Facebook whistleblower, Frances Haugen. Today, as I mentioned, she will be

testifying before lawmakers in the United Kingdom. And Donie O'Sullivan joins us now.

Donie, let's talk about the international angle, because I think the revelations that we've seen this morning from the Facebook Papers -- I've mentioned Ethiopia -- also point to India, and to test cases that were actually done there showing that within days or weeks of new users joining, they were bombarded by misinformation, graphic pictures, details -- everything that we've suggested is wrong about Facebook -- and there's no way really to control it.

DONIE O'SULLIVAN, CNN CORRESPONDENT: Yes, Julia, I mean, I think that's a really important point. You know, as bad as Facebook is here in the United States when it comes to election misinformation, fueling the January 6th insurrection, we know that outside of the United States, things look far, far worse.

And we even heard that from another Facebook whistleblower a few weeks ago, Sophie Zhang, who left the company last year, and she said the company essentially doesn't spend enough time on or care about the rest of the world the way it does about America. Facebook, of course, denies that.

But what we've seen in these internal documents is many sort of simple experiments, really, that Facebook staff were doing. They wanted to see what it would be like for users on the platform. So in India, they just set up a new account and started following the recommendations, the groups that Facebook's own algorithms were pushing. And they said -- and "New York Times" pulled this out over the weekend -- after three weeks, the page, the feed, was totally covered in hate, sectarianism, even violence.

And so it's very, very hard for Facebook to back away from this stuff when simple experiments are showing how flawed their platform is.

[09:05:00]

CHATTERLEY: Yes, and the mismatch in resources, Donie, I think, is an important thing -- what's spent on dealing with some of these issues. Eighty-seven percent of the global budget for time spent on misinformation is spent in the United States, 13 percent for the rest of the world, even though North America is just a tiny fraction -- it's around 10 percent, I think -- of the social network's daily active users.
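To put those on-air figures in perspective, here is a rough back-of-the-envelope calculation -- using only the percentages just cited, which, as noted below, Facebook disputes -- of the implied per-user gap:

    # Rough arithmetic from the figures cited on air (illustrative only;
    # Facebook disputes the underlying numbers).
    us_budget_share, us_user_share = 0.87, 0.10    # 87% of budget, ~10% of users
    row_budget_share, row_user_share = 0.13, 0.90  # rest of world

    us_per_user = us_budget_share / us_user_share      # 8.7
    row_per_user = row_budget_share / row_user_share   # ~0.14

    print(round(us_per_user / row_per_user))  # ~60

By that arithmetic, each North American user is covered by roughly 60 times the per-user misinformation budget of a user elsewhere.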

And I know Facebook has responded on this and said, look, there are third parties involved in analyzing this information and helping them with this, so those numbers aren't quite accurate. But the resources spent on trying to tackle this simply aren't enough.

And I know our colleague Brian Stelter spoke to Senator Richard Blumenthal over the weekend, and he basically said that they are utterly incapable of

policing themselves, and he has a point. What does it take to get action from regulators, Donie, in the face of all of this?

O'SULLIVAN: I mean, that's a great question. You know, people like Blumenthal on the Hill, they are talking a lot, and they do want to get --

they say they want to get regulation, but we're still waiting for them. I mean, Facebook is even saying, well, you know, regulate us if you care so

much about this.

I mean, I think we have seen over the past few weeks, especially when it comes to -- we learned about the harms that Instagram specifically can

cause to children. You know, I think that is something that has become a bipartisan issue here, obviously, in the U.S. and something that might inch

towards regulation.

But, you know, I mean, really what these papers show -- and I can show you, I've been walking around all day with this -- these are just some of the

leaked Facebook documents. These are just a small sample of documents that are related, actually, to January 6th and to hate on the platform. There

are tens of thousands of documents.

We, along with 16 other news organizations here in the U.S., are going through them. This is going to be a real challenge for Facebook. You

mentioned at the top, though, you know, is this the nail in the coffin for Facebook? I'm not so sure. We've seen them weather so many storms before.

This is an extremely important conversation to have, but will it eventually fizzle out after a few weeks and we'll forget about it? It's very possible,

and I think that's probably what Facebook is hoping for. And of course, we've heard reports that Zuckerberg is even thinking of coming up with a

new name for the company.

CHATTERLEY: Yes, I mean, really, and that's why I brought it back to the business model, Donie, because it always has to come back to this and to advertisers. And it's not the biggest advertisers that you can put pressure on, it is the two million small companies, one of whom told us in the last couple of weeks they simply wouldn't exist without Facebook. Their business would not exist if they couldn't advertise for customers on Facebook. So it's that battle of sort of public utility -- WhatsApp as well, public utilities, we found two weeks ago when we had this outage.

You and I have long been skeptical. I guess one of the other questions for me, Donie -- and you can perhaps give us the best sense of this -- is, to what extent, when you've got employees inside the company saying, look, we're not going far enough, we're worried about what's going on, and the e-mail traffic related to that that the whistleblower brought out, does this tie it enough to senior executives -- to Mark Zuckerberg himself, to Sheryl Sandberg -- and not only to the public face that they put on this, but to the decision making inside, to make change or not make change, or to reverse change after the election, as an example, in the U.S.?

O'SULLIVAN: Yes, I mean, I think what's clear from both these files and also from Facebook's public actions is that the buck stops with Zuck. You know, there are so many times we've seen over the past few years where Trump -- just taking Trump as an example -- violated their policies, right, in terms of glorification of violence when it came to looting and shooting and things like that, and Facebook bent its rules. And this is in these files as well, where they say, we bent our rules to placate these politicians and powerful people.

And also a person -- a Facebook employee -- made a point in the documents saying, look, we have all these policies. We have all these engineers who are building systems to detect when these policies are broken, but then, with the stroke of a pen or a click of a mouse, Zuckerberg can say, actually, you know what, forget about that policy. This person is powerful and we need to let them do what they do.

So there is frustration inside the company. People are sounding the alarm. Clearly, though, it doesn't seem to be making its way up to the C-level

suites in a way that we are seeing actual impact on the platform.

I should, of course, mention, Julia -- because I'll probably get a call from Facebook after this hit -- that Facebook denies the broad assertions being made by Frances Haugen. They said that they are a corporation that doesn't put profit over safety.

CHATTERLEY: Yes, and that context is important as well -- the voice of what Facebook is saying at this moment. And Donie, thank you for that. People have to make their own minds up.

Donie O'Sullivan, thank you.

Okay, moving on. The Turkish lira hitting a fresh record low today after President Erdogan ordered the expulsion of ambassadors from 10 Western nations, including the United States. The nations had signed a letter demanding the release of the jailed businessman and philanthropist, Osman Kavala.

Arwa Damon joins us now from Istanbul. Arwa, we will bring it back to the economics. I was looking down the list of these nations whose diplomats have been threatened with expulsion, and I believe they make up half of the top 10 trade partners for Istanbul -- sorry, for Turkey. It is adding a sort of economic crisis on top of a political crisis, in a nation already in crisis. What do we make of this?

ARWA DAMON, CNN SENIOR INTERNATIONAL CORRESPONDENT: Quite potentially, yes, especially if those countries choose to try to squeeze the economic

screws when it comes to Turkey's position on Osman Kavala.

A little bit of background on that story. Kavala was detained back in 2017, accused first of being part of the 2013 Gezi Park protests. A few years

later, he was acquitted of those charges, ordered released only to be rearrested a few hours later, this time on charges in connection to the

2016 failed coup.

Now, a number of human rights organizations and watchdog groups have said that his detention is politically motivated. These 10 countries, Western nations, have for quite some time now been a bit more quietly calling for Kavala's release, now doing it in that very public statement jointly signed by all 10 of them, this aggravating President Erdogan and the Turkish government over the weekend, which views this as the West once again meddling in Turkey's internal affairs.

The government here continues to maintain that the judiciary is independent and that, basically, the West should mind its own business, saying that these 10 ambassadors will be declared persona non grata, that order going to the Ministry of Foreign Affairs. There is a ministerial Cabinet meeting that is happening.

As far as we're aware at this stage, none of these embassies have been officially notified of this step just yet, but we most certainly are already seeing the impact of this on the Turkish lira, which has for quite some time now been on a decline due to a number of other factors that have nothing to do with Kavala. But the bottom line for the vast majority of Turks and of foreign investors is that, right now, Turkey is not looking like -- if you're a foreigner -- a stable economic investment, and if you're a Turk, you're just watching the value of your money dwindle away and the cost of living rise significantly.

CHATTERLEY: Yes, and we've got the Central Bank cutting rates in the face of rising and already high inflation. Arwa, you make some great points

there. I'm sure some advisers around him have said, do we really want to do this? We'll see.

Arwa Damon, thank you so much for that.

Problems? What problems? The Asia-focused British bank, HSBC, is shrugging off exposure to China's property crisis with surprise profits, and a share

buyback on the way. Anna Stewart joins me on this.

$2 billion worth of buybacks. That sends a huge signal of confidence, I think from the bank. How are they achieving it? Walk us through these

numbers.

ANNA STEWART, CNN REPORTER: Yes. I think that is the top line for the investors, right? A $2 billion share buyback -- I think they're pretty pleased with that. And the comments from the CEO, Noel Quinn, today, saying "the lows of recent quarters are behind us" -- a really bullish outlook here. They've been able to release $700 million worth of cash they'd built up for loan loss provisions.

All regions were profitable, of course, the lion's share coming from Asia and particularly China. And they're looking at interest rate rises, and I thought some of the comments around that have been really interesting, actually, particularly with the U.K. They point out in their call that a 25-basis-point rise would increase their income by half a billion dollars in one year. And the consensus, they say, is now for two or three rate rises between now and the end of next year.

So, all of that very bullish. This does, of course, follow on through from strong results from Barclays last week. Lots of strong results from the

Wall Street banks. It's not exactly an outlier. But I think we're particularly interested here in HSBC, given its reliance on Asia, given its

exposure to China, given that that is somewhere where we're seeing continued COVID lockdowns and of course, the fallout from Evergrande.

CHATTERLEY: Oh, yes. You raised the perfect point there for me. What did they have to say about potential exposure to the property market in China and to Evergrande, specifically? And one of the other things that caught my attention as well was that they said that they weren't getting involved in SPAC deals, i.e. these special purpose acquisition vehicles that allow companies to IPO very quickly, and that suggests to me that they're worried about trouble brewing in that sector, too.

STEWART: It certainly sounded like it, didn't it? And it's quite interesting when an earnings call and all the interviews that follow are more concerned, really, with what you're not experiencing than with what you are.

But of course, they had to tell us about their exposure to Evergrande, given their share price really did track down during that crisis that unfolded in recent weeks. They had lots of caution there. They say they've had an intensive review and currently have no direct exposure to the major property developer stocks. They are happy with the overall exposure but, Julia, the CEO also said he would be foolish to say there is no second-order risk.

[09:15:02]

STEWART: I think the headline, really, is the fact that they're willing to offer a $2 billion share buyback. Hopefully, it's in the actions -- they're putting their money where their mouth is, quite literally, there. They're clearly quite happy with their exposure currently when it comes to China and property.

CHATTERLEY: One hundred percent agree. Anna Stewart, thank you for that.

Okay, let's move on. Here are some of the other stories making headlines around the world. Sudan's military says it has dissolved the country's

transitional government and declared a state of emergency. A government official tells CNN, the Prime Minister and his wife have been arrested and

taken to an undisclosed location in this military takeover.

CNN's Larry Madowo joins us now. Larry, good to have you with us. What more do we know about this? Clearly, it's been weeks, months of tension with

this transitional government, and now it seems to be over.

LARRY MADOWO, CNN CORRESPONDENT: It seems to have succeeded, because there was an attempted coup just last month, which was blamed on forces loyal to the ousted President Omar Al Bashir.

But this morning, a complete military takeover in Sudan. The general in charge of the country -- who was the military figurehead, in partnership with the civilian leader, Prime Minister Abdalla Hamdok -- went on state television and announced the dissolution of the government: governors had been removed from office, the Cabinet was no longer in place, and sections of the Constitution were suspended. And people have taken to the streets, because they have been told by Prime Minister Abdalla Hamdok to defend the revolution, and by the Sudan Professionals Association to go out there and protect the gains that have been made since 2019, when Omar Al Bashir was removed from power following popular protests.

I spoke to the Prime Minister just last month, a day after that failed coup attempt. This is what he said about the possibility of another coup.

(BEGIN VIDEO CLIP)

ABDALLA HAMDOK, SUDANESE PRIME MINISTER: We can only guard against another coup attempt by relying on the forces that are part of this transition and are guarding it. And on the hardships on the economic front, we think the most serious part of it is behind us.

(END VIDEO CLIP)

MADOWO: The Prime Minister has had to put in place some tough economic measures, which he himself described as such, because he is trying to get some debt relief from the I.M.F. But this looks really bad for the U.S., because the U.S. envoy for the Horn of Africa, Jeffrey Feltman, was literally in Sudan until yesterday, meeting with the Prime Minister, meeting the military leadership, and telling them that the U.S. supported the democratic aspirations of the Sudanese people.

And as soon as he left, the military have taken over, going against everything they told the U.S. they would be committing to, even though they're still promising an election in July 2023. But after this power grab, people in the country just don't think that's a possibility.

CHATTERLEY: And how are the people reacting, Larry?

MADOWO: There are people who have taken to the streets all this morning. They have been lighting fires and they have been setting up barricades. They've blocked some bridges leading to the military -- the General Command of the military.

There have been weeks of protests, but especially this past week, both sides -- people who support the military leadership, but also those backing civilian leadership -- have held public protests. But today, this was not surprising for many of them, because one official, an adviser to the Prime Minister, has told CNN that the Prime Minister was under pressure to dissolve the government. And the people saw that coming, which might explain some of the reaction this morning, seeing people on the streets trying to say, listen, we want to return to civilian rule.

CHATTERLEY: Yes. Prime Minister under pressure and then given no choice. Larry Madowo, great to have you with us. Thank you for that.

Okay, let's move on. Japan says it will monitor the Russian and Chinese navies in the Pacific Ocean after the two countries held joint naval exercises. Japan's Defense Ministry says at least 10 Russian and Chinese vessels made what it calls, quote, "unusual moves," passing close to an island in Southern Japan.

Parts of China are tightening restrictions on travel and social gatherings as the government battles a new COVID-19 outbreak, this ahead of the Winter

Olympics in Beijing. Meanwhile, Singapore reported more than 2,700 new cases of the virus on Sunday. Authorities there say employees must be fully

vaccinated before returning to work in January.

Okay, still to come here on FIRST MOVE: honey, I shrank the IPO. Volvo Cars is set to go public, but it is raising less than it intended to. And running out of friends: new revelations and whistleblower testimony deepen the crisis at Facebook. We'll discuss. Stay with us.

(COMMERCIAL BREAK)

[09:22:17]

CHATTERLEY: Welcome back to FIRST MOVE. U.S. futures are a touch higher ahead of a fiercely busy week for tech earnings -- in particular, Alphabet, Amazon, and Apple, among others, reporting later this week. And of course, as we've mentioned, Facebook after the closing bell today, and they're not the only ones to watch.

Consumer staples and blue chips are also reporting in droves. 3M, Visa, Boeing, Coca-Cola, Kraft Heinz, Starbucks, and Caterpillar will all give us some fresh indication on economic recovery, price pressures, and supply chain snags, too.

And this could finally, finally be the week when the U.S. Congress acts on both the bipartisan infrastructure package and the Democrats' spending bill. We will believe it when we see it.

Let's move on. Volvos have a reputation for being spacious and safe. Not sure investors feel quite as comfortable with the automaker's shrunken-down IPO.

Paul La Monica joins me now. Interesting, Paul, scaling back the size of the IPO, delaying it by a day, and pricing at the bottom end of a range,

and it comes down, I think, to concerns and fears surrounding Chinese ownership.

PAUL LA MONICA, CNN BUSINESS REPORTER: Yes, I think, Julia, that there are clear worries about the control that Geely in China has over Volvo, and that is a primary reason why the company seems to be scaling back the plans for the IPO, selling, you know, not as many shares.

I think, you know, macro factors are at play as well -- worries about the delta variant of COVID and what that's going to do to the global economy. But we'd be remiss if we didn't point out that Volvo, while, like every traditional car company, it is amping up its electric vehicle plans, still pales in comparison to companies like Tesla that dominate this market right now.

CHATTERLEY: This is such a great point. I was looking at some of the numbers on this, and if this is where they price, they are actually going to be worth less than Polestar, which we've interviewed on the show a number of times, and which, of course, is the electric carmaker that's also been spun out of the Swedish carmaker.

And to your point, if you look at the comparison, Polestar is obviously pure electric, or pure play electric cars. At Volvo, just three percent of their cars today are actually electric vehicles, and obviously, they want to be fully electric by 2030.

So, in the CEO's words to "The Financial Times," quote, "Polestar is 10 years ahead of us." So perhaps it is a China thing here, or is there something else going on?

or is there something else going on?

LA MONICA: Yes, definitely. Investors realize that Polestar is itself going public through a merger with a SPAC, and I think, for investors, you know, it's not that hard to look at the environment right now and do the math.

[09:25:08]

LA MONICA: You could own Volvo, which has, you know, exposure to Polestar, or you can just wait and own Polestar, which gives you pure play electric vehicle exposure and much more exciting growth prospects without all the baggage that comes with Volvo. So, I think that is potentially a problem for the Swedish auto giant as well.

CHATTERLEY: Yes. I couldn't agree more. Okay, let's move on, because we also had reservations about a potential deal that was rumored last week as well, and it seems there's no pin in Pinterest for now, from PayPal's perspective.

LA MONICA: Yes -- very well played. PayPal stock was, you know, hit on the speculation that this deal was going to happen. And I think that, you know, this is a recognition of a couple of things. I mean, one, Pinterest, you know, potentially wants more money in a deal. I mean, I did find it interesting that PayPal said that the talks are not happening "for now" or "right now," which doesn't preclude some sort of transaction taking place at a future date.

But I think for PayPal as well, this might have been a bit of a trial balloon -- leaking some of the reports of this interest out there to kind of see -- because this isn't, on the face of it, a deal that makes entirely strategic sense for PayPal. It's a bit of a reach to get more into social media and some of the problems that we discussed on the show last week that come with having social media exposure.

So I think, well, for PayPal, you know, it's a company that's really competing with Square. Square just did a big deal for a buy now, pay later company, and I think that's something that makes more strategic sense. So PayPal might have just decided, okay, this is interesting, we may still come back to it at a later date. But right now, it doesn't look like Pinterest wants to do it, and Wall Street isn't too happy with the notion of an acquisition this large either.

CHATTERLEY: Yes, the timing, I think, with the Facebook revelations as well -- to your point about getting into social media at this moment in time -- a little bit uncomfortable. The trial balloon became a bit of a lead balloon, but the balloon is not yet popped, to your point. They may come back.

LA MONICA: We'll be talking about this again, I suspect.

CHATTERLEY: I am sure, we will Paul La Monica, great to see you. Thank you.

You're watching FIRST MOVE. Stay with us. There is more to come.

(COMMERCIAL BREAK)

[09:31:37]

CHATTERLEY: Welcome back to FIRST MOVE, and I want to get you straight over to London now in the U.K. Parliament where British lawmakers are

hearing testimony from the Facebook whistleblower, Frances Haugen. Let's listen in.

FRANCES HAUGEN, FORMER FACEBOOK PRODUCT MANAGER: ... ninety seven percent or something of what they take down happens because of their robots. But

that's not actually the question we want answered. The question we want answered is, did you take the hate speech down? And the number I've seen is

like three to five percent, but I wouldn't be surprised if there was some variation within the documents.

UNIDENTIFIED MALE: I think it's a really important point, as essentially what we're looking at here is the creation of an independent regulator for the tech sector -- that not only do we need answers, but we need to know what the right questions are as well, because the official statistics are so misleading.

HAUGEN: And part of why I came forward is, I know that I have a specific set of expertise, like I've worked at four social networks. I am an

algorithmic specialist. So, I worked on search quality at Google, I ran ranking for the home feed on Pinterest. I have an understanding of how AI

can unintentionally behave.

Facebook never set out to prioritize polarizing, divisive content; it just happened to be a side effect of the choices they did make.

Part of why I came forward is that I am extremely, extremely worried about the condition of our societies, the condition of the world itself, and the interaction of the choices Facebook has made and how it plays out more broadly.

So, things I'm specifically worried about are engagement-based ranking, which Facebook has said before -- Mark Zuckerberg put out a white paper in 2018 -- is dangerous unless the AI can take out the bad things. And as you saw, they're getting three to five percent of things like hate speech. They are getting 0.8 percent of violence-inciting content. Engagement-based ranking prioritizes that kind of extreme content.

I'm deeply concerned about their underinvestment in non-English languages and how they mislead the public that they are supporting them. So Facebook

says things like we support 50 languages when in reality, most of those languages get a tiny fraction of the safety systems that English gets.

Also, I don't think this is widely known: U.K. English is sufficiently different that I would be unsurprised if the safety systems that they developed primarily for American English were actually under-enforcing in the U.K. I wouldn't be surprised by that, and Facebook should have to disclose dialectical differences.

I'm deeply concerned about the false choices that Facebook presents. They routinely try to reduce the discussion to things like, you can either have transparency or privacy -- which do you want to have? Or, if you want safety, you have to have censorship. When in reality, they have lots of non-content-based choices that would sliver off a half percentage point of growth -- a percentage point of growth -- and Facebook is unwilling to give up those slivers for our safety.

And I came forward now because now is the most critical time to act. When we see something like an oil spill, that oil spill doesn't make it harder for society to regulate oil companies. But right now, the failures of Facebook are making it harder for us to regulate Facebook.

UNIDENTIFIED MALE: On those failures, looking at the way the platform is moderated today, do you think -- unless there is change -- do you think it

makes it more likely that we will see events like the insurrection in Washington on the 6th of January this year? More violent acts that have

been driven by Facebook systems. Do you think it's more likely we will see more of those events as things stand today?

[09:35:07]

HAUGEN: I have no doubt that the events we're seeing around the world, things like Myanmar and Ethiopia, those are the opening chapters because

engagement based ranking does two things. One, it prioritizes and amplifies divisive, polarizing, extreme content, and two, it concentrates it.

And so Facebook comes back and says, only a tiny sliver of content on our platform is hate, or only a tiny sliver is violence. One, they can't detect

it very well. So, I don't know if I trust those numbers. But two, it gets hyper concentrated in, you know, five percent of the population, and you

only need three percent of the population on the streets to have a revolution and that's dangerous.

UNIDENTIFIED MALE: I want to ask you a bit about that hyper concentration, particularly in an area that you worked on in particular, and that's Facebook Groups.

I remember being told several years ago by a Facebook executive that the only way you could drive content through the platform was advertising. Now we see that is not true, and groups are increasingly used to shape that experience.

We talk a lot about the impact of algorithmic based recommendation tools like newsfeed. To what extent do you think Groups are shaping the

experience for many people on Facebook?

HAUGEN: Those play a huge and critical role in driving the experience on Facebook. When I worked on civic misinformation -- this is based on recollection, I don't have a document -- but I believe it was something like 60 percent of the content in the Newsfeed was from Groups.

I think the thing that is important for this group to know is that Facebook has been trying to extend sessions -- like, you consume longer sessions, more content -- and the only way they can do that is by multiplying the content that already exists on the platform. And the way they do that is with things like Groups and re-shares.

So if I put one post into a half million person group, that can go out to half a million people, and when combined with engagement based ranking,

that group might produce 500, a thousand pieces of content a day, but only three get delivered.

And if your algorithm is biased towards extreme polarizing, divisive content, it's like viral variants. Those giant groups are producing lots

and lots of pieces of content, and only the ones most likely to spread are the ones that go out.
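To make the mechanics Haugen describes concrete, here is a minimal illustrative sketch -- the post counts, the share of provocative posts, and the scoring weights are all assumptions for demonstration, not Facebook's actual ranking system -- of how delivering only a large group's top few posts by predicted engagement skews the delivered slice toward the most provocative content:

    import random

    # Toy model of the dynamic described above: a large group produces many
    # posts a day, but only the top few by predicted engagement are delivered.
    # All numbers and weights are illustrative assumptions.
    random.seed(0)

    POSTS_PER_DAY = 500  # a big group's daily output
    DELIVERED = 3        # posts that actually reach a member's feed

    posts = [{"id": i, "provocative": random.random() < 0.10}
             for i in range(POSTS_PER_DAY)]

    def predicted_engagement(post):
        # Assume provocative posts draw roughly 3x the reactions and re-shares.
        return random.uniform(0, 1) * (3.0 if post["provocative"] else 1.0)

    delivered = sorted(posts, key=predicted_engagement, reverse=True)[:DELIVERED]
    print([p["provocative"] for p in delivered])
    # Only ~10% of the group's posts are provocative, but the delivered slice
    # is dominated by them -- the selection effect described in the testimony.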

UNIDENTIFIED MALE: It was reported last year by "The Wall Street Journal" that 60 percent of people who joined Facebook Groups that shared or promoted extremist content did so at Facebook's active recommendation. So this is clearly something Facebook is researching. What action is Facebook taking about Groups who share extremist content?

HAUGEN: I don't know the exact actions that have been taken in the last, you know, six months, a year. Actions regarding extremist groups (INAUDIBLE) promoted to users is a thing where Facebook shouldn't be able to just say, this is a hard problem, we are working on it. They should have to articulate, here's our five-point plan, and here is the data that would allow you to hold us accountable, because Facebook acting in a non-transparent and unaccountable way will just lead to more tragedies.

UNIDENTIFIED MALE: You think that five-point plan exists?

HAUGEN: I don't know if they have a five-point plan.

UNIDENTIFIED MALE: Or any plan, do they --

HAUGEN: Yes, I don't know. I don't work on them.

UNIDENTIFIED MALE: Okay. To what extent should we be considering -- or should a regulator be asking -- these questions about Facebook Groups? I mean, from what you're saying, they are a significant driver of engagement, and if engagement is part of the problem the way Facebook designed it, then Groups must be a big part of that, too.

HAUGEN: Part of what is dangerous about Groups is that, you know, we talk about sometimes this idea of, is this an individual problem or is this a societal problem? One of the things that happens in aggregate is the algorithms take people who have very mainstream interests and they push them towards extreme interests. You can be center left, and you'll be pushed to the radical left. You can be center right, and you'll be pushed to the radical right. You can be looking for healthy recipes, you'll get pushed to anorexia content.

There are examples in Facebook's research of all of this. One of the things that happens with groups and networks of groups is that they become echo chambers; they create social norms. So, if I'm in a group that has lots of COVID misinformation, and I see over and over again that if someone posts about the COVID vaccine -- like, things that encourage people to get vaccinated -- they get completely pounced upon, they're torn apart,

I learn that certain ideas are acceptable and unacceptable. When that context is around hate, now, you see a normalization of hate, a

normalization of dehumanizing others, and that's what leads to violent incidents.

UNIDENTIFIED MALE: I mean, many people would say that Groups, particularly large Groups, and some of these groups have hundreds of thousands of

members in them.

HAUGEN: Millions.

UNIDENTIFIED MALE: Yes, millions, they should be much easier for the platform to moderate because people are gathering in a common place.

HAUGEN: I strongly recommend that, above a certain size group, they should be required to provide their own moderators and moderate every post. This would naturally, in a content-agnostic way, regulate the impact of those large groups, because if that group is actually valuable enough, they will have no trouble recruiting volunteers.

But if that group is just an amplification point -- like we see information operations using groups like this, and virality hacking, that is, the practice of borrowing viral content from other places to build a group -- we see these places as being -- if you want to launch an advertising campaign with misinformation in it, we at least have a credit card to track you back.

[09:40:15]

HAUGEN: If you want to start a group and invite a thousand people every day, like the limit is, I think, 2,200 people you can invite every day, you

can build out that group and your content will land in their newsfeed for a month, and if they engage with any of it, it'll be considered a follow.

And so things like that make them very, very dangerous and they drive outsized impact on the platform.

UNIDENTIFIED MALE: And from what you say, if a bad actor or agency wanted to influence what a group of people on Facebook would see, you'd probably set up Facebook Groups to do that more than you would set up Facebook Pages and run advertising.

HAUGEN: And that is definitely a strategy that is currently used by information operations. Another one that's used, which I think is quite

dangerous is you can create a new account, and within five minutes, go post into a million person group, right? There is no accountability, there's no

trace, right?

You can find a group to target any interest you want to -- very, very fine grained. Even if you removed micro targeting from ads, people will micro

target via Groups.

UNIDENTIFIED MALE: And, again, I mean, what -- you know, what do you think the company's strategy is for dealing with this? Because again, there were changes made to Facebook Groups, I think in 2017-2018, to create more of a community experience, I think Mark Zuckerberg said, which is good for engagement.

But it seems similar to changes to the way Newsfeed works in terms of the content that it prefers and favors -- these are reforms the company has put in place that have been good for engagement, but have been terrible for harm.

HAUGEN: I think we need to move away from having binary choices. There's a huge continuum of options that exists. Coming in and saying, hey, Groups that are under a thousand people are wonderful -- they create community, they create solidarity, they help people connect.

If you get above a certain size, maybe 10,000 people, you need to start monitoring that group, because that alone naturally rate-limits it. And the thing that we need to think about is, where do we add selective friction to these systems so that they are safe in every language, where you don't need the AIs to find the bad content?

UNIDENTIFIED MALE: In your experience, is Facebook testing its systems all the time? Does Facebook experiment with the way its systems work, around how you can increase engagement? And obviously, you know, in terms of content on the Newsfeed, we know it experimented around election time with the sort of news that should be favored. So, how does Facebook work in experimenting with its tools?

HAUGEN: Facebook is continuously running many experiments in parallel on little slices of the data that they have. I'm a strong proponent that Facebook should have to publish a feed of all the experiments they're running. They don't have to tell us what each experiment is, just some ID, and even just seeing the results data would allow us to establish patterns of behavior, because the real thing we're seeing here is Facebook accepting little tiny additions of harm when they weigh how much harm is worth how much growth for us.

Right now, we can't benchmark and say, oh, you're running all of these experiments -- are you acting in the public good? But if we had that data, we could see patterns of behavior and see whether or not trends are occurring.

UNIDENTIFIED MALE: You worked in Civic Integrity at Facebook. So, if you saw something that was concerning you, who would you report to?

HAUGEN: This is a huge, huge weak spot. If I drove a bus in the United States, there would be a phone number in my break room that I could call that would say, did you see something that endangered public safety? Call this number. Someone will take you seriously and listen to you, like at the Department of Transportation.

When I worked on counter-espionage, I saw things where I was concerned about national security, and I had no idea how to escalate those, because I didn't have faith in my chain of command at that point -- like, they had dissolved Civic Integrity. I didn't see that they would take it seriously, and we were told just to accept under-resourcing.

UNIDENTIFIED MALE: I mean, like you would report to your line manager. Would it be then up to them whether they chose to escalate that?

HAUGEN: I flagged repeatedly when I worked on Civic Integrity that I felt that critical teams were understaffed, and I was told, at Facebook, we accomplish unimaginable things with far fewer resources than anyone would think possible.

There is a culture that lionizes kind of a startup ethic that is, in my opinion, irresponsible, right? The idea that the person who can figure out how to move the metric by cutting the most corners is good. And the reality is, it doesn't matter if Facebook is spending $14 billion on safety a year; if they should be spending $25 billion or $35 billion, that's the real question.

And right now, there are no incentives internally -- if you make noise saying, we need more help, people will not rally around to help you, because everyone is underwater.

UNIDENTIFIED MALE: In many organizations that ultimately fail, I think that sort of culture exists -- a culture where there is no external audit and people inside the organization don't share problems with the people at the top.

What do you think people like Mark Zuckerberg know about these things?

[09:45:03]

HAUGEN: I think it's important that all facts are viewed through a lens of interpretation, and there is a pattern across a lot of the people who run the company, senior leaders, which is that this may be the only job they've ever had. Right? Like, Mark came in when he was 19 and he is still CEO. There are a lot of other people who are VPs or Directors for whom this is the only job they've ever had.

And so there is a lack of -- you know, the people who have been promoted were the people who could focus on the goals they were given, and not necessarily the ones that asked questions around public safety.

And I think there's a real thing where people are exposed to that, and then they say, look at all the good we're doing. Like, yes, that's true, but we didn't invent hate. We didn't invent ethnic violence. And that's not the question. The question is, what is Facebook doing to amplify or expand hate? What is it doing to amplify or expand ethnic violence?

UNIDENTIFIED MALE: You're right, I mean, Facebook didn't invent hate. But do you think it is making hate worse?

HAUGEN: Unquestionably, it is making hate worse.

UNIDENTIFIED MALE: Thank you. Joining us, Jim Knight.

JIM KNIGHT, MEMBER OF HOUSE OF LORDS OF THE UNITED KINGDOM: Thank you very much, Chairman. Thank you, Frances, for coming and talking to us.

First of all, just on some of that last fascinating discussion that you were having: you talked about how, if you were calling out for help, you wouldn't necessarily get the resources. Would the same be true if you're working in PR or legal within Facebook?

HAUGEN: I have never worked in PR or Communications, so I'm not -- I'm not sure.

I do know that -- I was shocked to hear recently that Facebook wants to double down on the Metaverse and that they're going to hire 10,000 engineers in Europe to work on the Metaverse, because I was like, wow, do you know what we could have done with safety if we had 10,000 more engineers? It would have been amazing.

I think there is a view inside the company that safety is a cost center, not a growth center, which I think is very short-term thinking, because Facebook's own research has shown that when people have worse integrity experiences on the site, they're less likely to retain.

I think regulation could actually be good for Facebook's long-term success, because it would force Facebook back into a place where it was more pleasant to be on Facebook, and that could be good for the long-term growth of the company.

KNIGHT: Thank you. And then let me go back also to the discussion about Facebook Groups, by which we're essentially talking about private groups, clearly. If you were asked to be the regulator of a platform like Facebook, how do you get transparency about what's going on in private groups, given that they're private?

HAUGEN: I think there is a real bar -- like, we need to have a conversation as a society around, after a certain number of people have seen something, is it truly private, right? Is that number 10,000? Is it 25,000? Is it really private at that point?

Because I think there's an argument that Facebook will make, which is that, you know, there might be a sensitive group which someone might post into, and, you know, we wouldn't want to share that even if 25,000 people saw it. Which I think is actually more dangerous, right -- people are lulled into a sense of safety, that, you know, no one is going to see their hate speech, or no one is going to see maybe a more sensitive thing, like maybe they haven't come out yet, right?

That is dangerous, because those spaces are not safe, right? When 100,000 people see something, you don't know who saw it and what they might do. So, I'm a big proponent of transparency -- both Google and Twitter are radically more transparent than Facebook, right? People every day download the search results on Google and analyze them, and people publish papers.

And because Google knows this happens, they staff software engineers who work on search quality to write blog posts. Twitter knows that 10 percent of all the public tweets end up going out on their firehose, and people analyze those and do things like find information operation networks. And because Twitter knows someone is watching, they behave better.

I think in the case of Facebook, even with private groups, there should be some bar above which we say, enough people have seen this, it's not private, and we should have a firehose, just like Twitter. Because if we want to catch national security threats, like information operations, we need to have not just the people at Facebook looking at it, we need to have 10,000 researchers looking at it.

And I think in addition to that, we'd have accountability on things like algorithmic bias or understanding whether or not our children are safe.

KNIGHT: That's really helpful. And just on Twitter and algorithmic bias: they published a report on Friday suggesting that there's an algorithmic bias politically. Do you think that is unique to Twitter, or would you say that would also be the case at Facebook? Is there something implicit in the way these platforms, with all of their algorithms, are designed to optimize clicks -- and therefore something about certain types of political content that makes it more extreme -- that is endemic to all of these social media companies?

HAUGEN: I am not aware of any research that demonstrates a political bias on Facebook. I am familiar with lots of research about the way engagement-based ranking was designed -- so, Facebook calls it meaningful social interaction, though "meaningful" could have been hate speech or bullying up until November 2020; it would still be considered meaningful.

[09:50:16]

HAUGEN: So, let's call it social interaction ranking. I've seen lots of research that says that kind of ranking -- engagement-based ranking -- prioritizes polarizing, extreme, divisive content. It doesn't matter if you're on the left or on the right, it pushes you to the extremes, and it fans hate, right? Anger and hate is the easiest way to grow on Facebook.

There's something called virality hacking, where you figure out all the tricks on how to optimize Facebook. Good actors, good publishers, are already publishing all the content they can, but bad actors have an incentive to play the algorithm, and they figure out all the ways to optimize Facebook. And so the current system is biased towards bad actors and biased towards people who push people to the extremes.

KNIGHT: Thank you. And then currently, we have a draft bill, which is focusing on individual harm rather than societal harm. Given the work that

you've done around democracy, as part of your work at Facebook, do you think that it is a mistake to omit societal harm?

HAUGEN: I think it is a grave danger to democracy and societies around the world to omit societal harm. Like, a core part of why I came forward was that I looked at the consequences of the choices Facebook was making, and I looked at things like the Global South, and I believe situations like Ethiopia are just part of the opening chapters of a novel that is going to be horrific to read, right?

We have to care about societal harm, not just for the Global South but for our own societies. Because, like I said before, when an oil spill happens, it doesn't make it harder for us to regulate oil companies. But right now, Facebook is closing the door on us being able to act. Like, we have a slight window of time to regain people's control over AI. We have to take advantage of this moment.

KNIGHT: And my final question -- and thank you -- is: undoubtedly, just because you're a digital company, you'll have looked at user journeys and analyzed in a lot of detail the data around how different user journeys work. Is there any relationship between paid-for advertising and people moving into some of these dangerous private groups, possibly then being moved into messaging services, into encrypted messaging?

Are there user journeys like that that we should also be concerned about, particularly given that paid-for advertising is currently excluded from this bill?

HAUGEN: I am extremely concerned about paid-for advertising being excluded, because engagement-based ranking impacts ads as much as it impacts organic content. I'll give an example. Ads are priced partially based on the likelihood that people like them, re-share them, do other things to interact with them, click through on a link.

An ad that gets more engagement is a cheaper ad. We have seen it over and over again in Facebook's research, it is easier to provoke people to anger

than to empathy or compassion. And so we are literally subsidizing hate on these platforms.

It is cheaper substantially to run an angry, hateful, divisive ad than it is to run a compassionate, empathetic ad. I think there is a need for

things even discussing disclosures of what we're going --
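As a simple illustration of the pricing dynamic Haugen describes -- the formula and the numbers below are simplifying assumptions for demonstration, not Facebook's actual ad auction -- an engagement-weighted auction effectively discounts ads that are predicted to provoke more reactions:

    # Sketch of an engagement-weighted ad auction (assumed formula, for
    # illustration only; real ad pricing is far more complex).
    def effective_cost(bid, predicted_engagement_rate):
        # If ads are ranked by bid times expected engagement, a higher
        # engagement prediction buys the same slot at a lower effective price.
        return bid / (1.0 + predicted_engagement_rate)

    calm = effective_cost(bid=1.00, predicted_engagement_rate=0.02)
    angry = effective_cost(bid=1.00, predicted_engagement_rate=0.20)

    print(f"calm ad:  ${calm:.3f} per impression")   # $0.980
    print(f"angry ad: ${angry:.3f} per impression")  # $0.833
    # The ad predicted to provoke ~10x the engagement runs meaningfully
    # cheaper -- the subsidy effect described in the testimony.

Under these assumptions, the angrier ad is about 15 percent cheaper per impression for the same bid.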

CHATTERLEY: We are going to leave Frances Haugen there, testifying before the U.K. Parliament, and bring you some context.

Joining us now is Carole Cadwalladr. She's a journalist at "The Guardian" and co-founder of the Real Facebook Oversight Board, an alternative to Facebook's own appointed board. She also broke the 2018 Cambridge Analytica story. Carole, great to have you with us.

I want to get your views here. Facebook themselves have said, look, Frances is cherry-picking, she didn't have full understanding and insight into what was going on at Facebook. But she's brilliant, and she's very punchy, and she is very detailed about explaining what's going on. What do you make of it?

CAROLE CADWALLADR, JOURNALIST, "THE GUARDIAN": Well, I mean, I think Facebook tried to claim that, didn't they, when she had given evidence to Congress. You know, subsequently, we've now got 17 news organizations who have collaborated to look through these documents.

You know, this morning, we've seen an absolute deluge of news articles. So, I mean, I think this idea that Frances Haugen is cherry-picking, you know, alongside these massive news organizations, CNN included, is a complete busted flush. I mean, there is absolutely -- you know, I don't think that could be substantiated in any way, shape, or form anymore.

CHATTERLEY: No, but she said some things that, I think, for most of us that have been looking at this -- you yourself since before 2018 -- and other news organizations have been reporting: engagement-based ranking prioritizing hate and extremist views; the idea that they don't have enough moderation of content in other languages, never mind English, and the difference between United States English and British English; the idea that they promote hate and, actually, that's part of the business model; that safety is a cost, not a growth tactic. Much of this we've heard before. Does this time make a difference? Will it be different here in terms of tackling Facebook from the perspective of regulators? Because what more do they need?

[09:55:25]

CADWALLADR: Well, I mean, I think the thing is that -- look, if Facebook was a normal company, I think today there would be real questions about Mark Zuckerberg's position, and I think he would have to seriously consider stepping down.

The problem is that Facebook is not a normal company, and that's the sort of scale and nature of the problem that we are seeing here.

But I think what is really important today is that we're taking this now to the rest of the world, as you sort of say -- it is actually Facebook's impact across the rest of the world which is even more damaging than in the United States.

And so having Frances Haugen in Britain today, where this, you know, this sort of landmark piece of legislation is currently being debated around how do we hold these tech giants to account -- I mean, it is a big day for Facebook in so many ways, but they are running out of road. I think they are running out of road, you know, and it is just really a question of what happens next.

CHATTERLEY: I know. I separated the damage from the business model and the use of it, particularly by small businesses for advertising purposes -- these are two separate things. Is the best way that we actually tackle this, as one of the nearly three billion users of Facebook or Facebook products, to understand that we are in these echo chambers and that we have to be careful what we're reading or viewing or seeing or sharing?

CADWALLADR: I don't think it can be on individuals. I mean, I think this platform is so embedded into our lives, in the way we use WhatsApp to communicate, especially around the rest of the world. It is just -- it is not enough.

You know, as individuals, there is only so much we can do to protect ourselves, and this is why, you know, we desperately need legislation. And I do think that's why it's very important today that Frances Haugen is giving this evidence to the British Parliament, and the British Parliament is currently debating ways -- looking at ways of potentially imposing really big fines upon the tech giants if they fail to deal with these problems.

I mean, this is -- there are some really serious outcomes from this. This could potentially be replicated around the world.

CHATTERLEY: Yes. They don't create hate, but they help promote it, and societies all around the world are quaking. Carole, thank you for joining us and thank you for waiting around while we listened to that, too.

Carole Cadwalladr there, the journalist at "The Guardian" and co-founder of the Real Facebook Oversight Board.

That's it for the show. Stay safe.

"Connect the World" with Becky Anderson is next and we'll see you tomorrow.

(COMMERCIAL BREAK)

[10:00:00]

END