
CNN 10

Part Four and Final Part of Series on Facebook

Aired February 25, 2019 - 04:00:00   ET


CARL AZUZ, CNN 10 ANCHOR: As the new week begins, CNN 10 is concluding its four-part series reviewing the successes and controversies of Facebook. Last Wednesday through Friday, we examined the background and leadership of the company's founder, the software development that brought both anger and popularity to the social network, and an international controversy that jolted public trust in Facebook. You can find all of these shows in our archive section, and meantime there is coverage of breaking news from around the world.

We'll be resuming our daily news reporting tomorrow. But as we bring you installment four of four on Facebook today, Laurie Segall looks at how the company manages some of the problems that come up: issues that can go viral, controversies both on the platform and among the people who run it, through the words of both current and former Facebook employees.


LAURIE SEGALL, CNN SENIOR TECHNOLOGY CORRESPONDENT: Did making money get in the way of Facebook's mission to ultimately connect the world?

UNIDENTIFIED FEMALE: I don't think so. The fundamental business model we offer I think is a really good one. It protects people's privacy, and it takes a very powerful product that changes people's lives and makes it available for free. So if we didn't have an ad business and had to charge for Facebook, a very small fraction of the people who use it would be able to use it.

TRISTAN HARRIS, DIRECTOR OF THE CENTER FOR HUMANE TECHNOLOGY: They would say, how else are we going to connect the whole world if it's not free? But I would say, if the thing you're connecting the whole world to isn't safe, it shouldn't be free.

SEGALL: Unsafe, critics like Tristan Harris say, not just because of personal data breaches but also because of the content that seems unchecked on the platform. Something members of Congress worried about on day two of Zuckerberg's testimony.

MARK ZUCKERBERG, CEO AND FOUNDER OF FACEBOOK: Do we have a responsibility for the content that people share on Facebook? And I believe the answer to that question is yes.

SEGALL: Facebook is now investing millions in policing content, adding thousands of content reviewers to the payroll and developing artificial intelligence they hope will help identify troublesome material. But the answers aren't always clear.

UNIDENTIFIED FEMALE: Judging hate speech imagery - -

SEGALL: I saw it in this room: a biweekly meeting of Facebook employees devoted to content policies. They often discuss which posts stay up and which ones are taken down.


SEGALL: It was striking to hear conversations we'd have in a newsroom happen at a tech company. And if unanswered here, content decisions go all the way up to the top of the Facebook food chain, to Mark and Sheryl. This is like an editor-in-chief role. I mean, should you be making that - -


ZUCKERBERG: Well, I think as little as possible, which is why I - - I focus on designing the systems, because there are going to be billions of pieces of content that people post every day. And I think getting it right at a systems level is more important.

SEGALL: But with billions of pieces of content daily, problems at this scale go viral, and decisions made behind the scenes are increasingly scrutinized. Now I've heard you guys talk a lot about transparency, but then you have these reports coming out that suggest otherwise. So how - - and I guess I ask it again, how do you ensure that you do win that public trust?

ZUCKERBERG: Yes. I don't think that the right expectation is that there aren't going to be issues. I - - I think the question is, how do we address them?

SEGALL: A question the company's struggling with, while executives have promised to focus on security and transparency - -

UNIDENTIFIED FEMALE: We have more than doubled the number of people we have working in safety and security.

SEGALL: According to the report, they not only ignored warning signs but sought to conceal them. So set the record straight. I mean, it says that Facebook attempted to downplay the significance of Russian activity in the run-up to the 2016 election. Did they?

ALEX STAMOS, FORMER CHIEF SECURITY OFFICER: Right. I mean, in - - after we found these things, I - - I am not totally happy with what the communications strategy was.

SEGALL: Former Chief Security Officer Alex Stamos found himself in the news again, and so we went back to interview him.

STAMOS: I think we could have been much more aggressive about talking about what we knew right after the election, probably even before the election. Obviously, there are all kinds of internal miscommunications. One of the problems about having a really tight-knit set of people making all these decisions: if you keep the - - the same people in the same places, it's just very difficult to admit you were wrong. Right?

SEGALL: The company's powerful, and after spending time behind Facebook's walls, there was another theme that emerged: folks who had something to say but were afraid to say it. Ironically, in a place that's connected billions, this former employee cites a disconnect within.

UNIDENTIFIED MALE: (Inaudible) because generally (inaudible).

SEGALL: Facebook is in transition. Many executives have left over rumored disputes about the company's direction, including the founders of Instagram and WhatsApp. Amidst all the controversy, there's been speculation: should Zuckerberg, who is CEO, chairman and the majority shareholder in Facebook, step aside?

ZUCKERBERG: That's not the plan.

SEGALL: That's not the plan. Would anything change that?

ZUCKERBERG: I - - I mean, like eventually over time, I'm - - I'm - - I'm not going to - - I'm not going to be doing this forever.

SEGALL: Many of the employees you've met in the show have left.

STAMOS: Hopefully it will give you kind of a pretty good understanding - -

SEGALL: Including Alex Stamos, who's now teaching future entrepreneurs at Stanford University.

STAMOS: Doing better means doing things like this. Like bringing people together to think about these problems early and - - and not just be reactive. You know, having a real diverse set of people work on these - - work in tech and work in Silicon Valley is going to be critical for that.

SEGALL: Randi Zuckerberg is building a media company that focuses on getting more young women into the tech world. You've always viewed your brother as a creator. How do you feel, as his sister, about people genuinely questioning the impact of his creation and whether it's good for - -

RANDI ZUCKERBERG: He has always just been an incredible trailblazer of pushing the boundaries, you know, of what he sees, where he sees the world should go. And you know, he has pushed a lot of us to think of the world in a bigger, more connected way than we ever thought it would be. So, you know, for better or worse, Mark is not the kind of person who is swayed by what you or I or anyone thinks of him. And because of that, I think we're going to get a lot more amazing things out of him in the years to come.

SEGALL: At 34, Zuckerberg is one of the most powerful people in the world. He's pledged his fortune to charity, and there's no denying people have used Facebook to raise over $1 billion for charitable causes throughout the years. By every metric, Facebook is a success. Its revenue in 2018: $55.8 billion. It was just $272 million in 2008. But Facebook at 15 faces complicated questions, a ripple effect of Zuckerberg's mission. Is what's good for business good for society? And what is the cost of connecting the world?

UNIDENTIFIED MALE: Right now the tech industry has been telling themselves this narrative that if you connect people, if you have the platform for free speech, it's just automatically good. I think the - - the deeper upgrade we have to make is a philosophical upgrade, which is, what does it mean for these things to be good?

UNIDENTIFIED MALE: Facebook is a living, breathing map of society, and it's literally a map of every single person and all of the relationships and all of the interactions between all of those relationships. And so in a way, it's as messy and human as we all are.

UNIDENTIFIED MALE: It is hard to say what any individual can do in the face of these massive, kind of, historical forces that - - that we're at the intersection of, right? What happens when you give voice to billions of people for the first time in history?

SEGALL: Do you think you'll be on the right side of history when we look back on this period?

ZUCKERBERG: Yes, I do. I think it's hard to imagine a future where giving people more power and helping people connect and share more ideas isn't going to end up being a very positive thing.

SEGALL: Although we're at a moment where a lot of people are - - are watching Facebook and wondering, can it do all those things?

ZUCKERBERG: The principles of how do you balance giving people a voice with keeping people safe. How do you make sure that you can protect people's privacy and use information in a way that people are comfortable with, while at the same time being able to build a system that can go stop bad guys from doing things and can provide a service that is free for people around the world?

These are really big historical questions, and they're not simple things that have, you know, one-sentence answers. And over time, I really believe that being on the side of giving people power and giving individuals a voice and giving people the ability to connect to the people that they want to is going to be the - - the thing that wins out.

SEGALL: With a digital world of more than 2 billion people, Facebook has become part of the fabric of society, revealing both the best and the worst of humanity. For a tech company that has extraordinary human impact, what the next 15 years look like is unknown. We do know one thing: as we head into uncharted territory, Zuckerberg's mission to connect the world will only amplify, as will the issues that come along with it.