CNN Special Reports

Facebook at 15: It's Complicated. Aired 9-10:30p ET

Aired February 10, 2019 - 21:00   ET


[21:00:19] (BEGIN VIDEO CLIP)

MARK ZUCKERBERG, CHIEF EXECUTIVE OFFICER, FACEBOOK, INC.: Hi, I'm Mark Zuckerberg, founder of Facebook, an online social directory.

UNIDENTIFIED MALE: Let's do it and then smile at the end.


ZUCKERBERG: I don't smile. You're asking a lot of me, guys.

LAURIE SEGALL, CNN SENIOR TECHNOLOGY CORRESPONDENT (voice-over): Built in a college dorm room.

ZUCKERBERG: When we first launched, we were hoping for, you know, maybe 400, 500 people.

SEGALL (voice-over): A digital world of 2 billion plus.

RICHARD QUEST, CNN ANCHOR: It's Facebook's world. We're just living in it.


SEGALL (voice-over): We blinked and Facebook became a part of the fabric of society.

UNIDENTIFIED FEMALE: The Arab Spring online, on Facebook.

SEGALL (voice-over): But things got complicated.

NICHOLAS CARLSON, GLOBAL EDITOR-IN-CHIEF, BUSINESS INSIDER: It was one of these first privacy scares on Facebook.

SEGALL (voice-over): The site now under fire.

SEN. MARK WARNER (D), VIRGINIA: The era of the Wild West in social media is coming to an end.

SEGALL (voice-over): Sharing private data.

CHRISTINE ROMANS, CNN ANCHOR: The largest security breach in Facebook history.

SEGALL (voice-over): Manipulated by foreign governments.

JAKE TAPPER, CNN ANCHOR: New details about the extent of Facebook's role in Russia's election interference campaign.

SEGALL (voice-over): Used to spread hate, conspiracy, and lies.

UNIDENTIFIED FEMALE: Russian-generated fake news.

SEGALL (voice-over): Leading to questions about the platform.

BRIAN STELTER, CNN CHIEF MEDIA CORRESPONDENT: This is another black eye for Facebook.

UNIDENTIFIED FEMALE: You have users asking, maybe I should delete it.

SEGALL (voice-over): What does it truly mean to connect the world?

ZUCKERBERG: Our mission has really always been to connect the world.



We can connect the whole world.

RANDI ZUCKERBERG, SISTER OF MARK ZUCKERBERG: There's always, you know, a catch-22 whenever you're giving a voice to people who didn't have a voice before.

SEGALL (on camera): Did making money get in the way of Facebook's mission to ultimately connect the world?

We searched for answers, rare interviews with Facebook insiders.


SEGALL (voice-over): Ex-employees who feel powerless to tell the truth.

UNIDENTIFIED MALE: Working at Facebook can feel like being part of a cult.

SEGALL (voice-over): Politicians. Critics. And exclusive interviews with the creator of it all.

SEGALL (on camera): Did you ever question yourself?


SEGALL (voice-over): We go behind the graffitied walls of the sprawling campus in Menlo Park during the company's most pivotal moments.

BROOKE BALDWIN, CNN ANCHOR: Facebook on the defensive today.

BRIANNA KEILAR, CNN ANCHOR: Senior tech correspondent Laurie Segall got this exclusive interview.

SEGALL (on camera): Mark, what happened? What went wrong?



SEGALL (on camera): It's October 19th, 2018, and we are heading to Facebook. This is a really big deal. We're going to sit down with Mark Zuckerberg who rarely sits down for interviews.

SEGALL (voice-over): Facebook years are like dog years. A lot happens in a little time. In the months since I first walked through these doors --

FREDRICKA WHITFIELD, CNN ANCHOR: Nearly 50 million Facebook users have been targeted by hackers.

ROMANS: The largest security breach in Facebook history.

BALDWIN: Facebook on the defensive today after this damning "New York Times" report of how Facebook has handled its bad P.R., Russia meddling --

SEGALL (voice-over): And we'll get to all that later. But for now, back to Facebook and what you need to know about an interview with Mark Zuckerberg.


SEGALL (on camera): It's so cold.

SEGALL (voice-over): First, he likes a room cold. Very cold.

Turn the cameras around and you'll see his people on the other side. They're taking notes, scribbling furiously, keeping time. They know that the stakes are high these days. The whole world seems to be watching.

And that's Facebook in this current moment. Massively influential.

POPPY HARLOW, CNN ANCHOR: Facebook CEO Mark Zuckerberg under fire.

SEGALL (voice-over): In flux.

ANA CABRERA, CNN ANCHOR: The biggest security breach in the history of Facebook.

SEGALL (voice-over): And controversial. But to fully understand Facebook of today, you have to go back to the beginning.

ZUCKERBERG: Hi, I'm Mark Zuckerberg, founder of Facebook, an online social directory.

UNIDENTIFIED MALE: That was really good. Just stare in the camera and just give a quirky little smile.

ZUCKERBERG: Quirky little -- thanks.

The early days of Facebook were a very scrappy time. There have been some constants through the years here. The mission of the company of connecting people and bringing people together, that really has informed some of the most important decisions that we've made.

TEXT: The Mission.

SEGALL (voice-over): You're going to hear this phrase, the mission, quite a bit. Mark's mission is the mantra at Facebook.

ZUCKERBERG: Our mission has really always been to connect the world.

We are a mission-driven company.

Our mission is to connect everyone in the world.



We can connect the whole world.

SEGALL (voice-over): It was in the company's DNA from the beginning.

ZUCKERBERG: And I think the people who have been drawn to this company over the years are people who really care about that.

SEGALL (voice-over): If you're an employee, you hear it the moment you walk through the door. You're almost indoctrinated into it.

NAOMI GLEIT, VICE PRESIDENT OF SOCIAL GOOD AND SENIOR DIRECTOR OF THE GROWTH, ENGAGEMENT, AND MOBILE TEAM, FACEBOOK, INC.: He believed in the mission of Facebook to help people share and be more connected, and I wanted to follow him.

[21:05:06] BARKER: It just felt so different from anything I'd done before.

SHERYL SANDBERG, CHIEF OPERATING OFFICER, FACEBOOK, INC.: Mark had this vision that everyone could be connected. And that was pretty exciting.

SEGALL (on camera): What's on your mind?

SEGALL (voice-over): Inside Facebook headquarters, that message to build out the mission is everywhere. Inspirational quote posters line the walls with delicate phrases fine-tuned to Facebook's current state.

SEGALL (on camera): Some call it almost even like a cult. Is this the cult of Mark Zuckerberg?

MIKE HOEFFLINGER, FORMER DIRECTOR OF GLOBAL BUSINESS MARKETING, FACEBOOK, INC.: I think a cult of personality is a little bit more kind of a Steve Jobs, Oprah Winfrey, Bill Clinton territory.

SEGALL (voice-over): Mike Hoefflinger worked at Facebook for 6-1/2 years.

HOEFFLINGER: I think cult of mission is what Facebook is, and that is still very much around and that's why people look to join Facebook.

ZUCKERBERG: The full formal mission statement is going to be --

SEGALL (voice-over): The mission is clear. As is one other thing. To understand Facebook at 15, you have to understand its DNA. And its DNA is Mark Zuckerberg.

RANDI ZUCKERBERG, SISTER OF MARK ZUCKERBERG: After just spending his entire childhood growing up with three sisters, he's like, I just got to just connect with other people.


SEGALL (voice-over): Mark's older sister Randi rarely gives interviews about her brother and Facebook.

R. ZUCKERBERG: We were always inventing. We were always collaborating. We're always, like, looking for any technology we could find and using it to just, you know, create something bigger.



ZUCKERBERG: I got accepted.

UNIDENTIFIED MALE: Are you serious?



SEGALL (voice-over): Creating solutions to problems yet to be seen was in the Zuckerberg blood and would eventually morph into the mission when Mark left home.

UNIDENTIFIED MALE: We are now focusing on one of the newest members of Harvard's Class of 2006.

SEGALL (voice-over): Its roots were actually quite trivial and controversial. Mark's first project at Harvard was Facemash. It was a hot-or-not-style site he created by hacking pictures of classmates from Harvard's dormitory I.D. files.

CARLSON: People could rank -- you know, they see two pictures of people who are students at Harvard and vote for which one was more attractive. And then this site then produced a list of the most attractive people in -- at Harvard ranked.

SEGALL (voice-over): Business Insider Global Editor-in-Chief Nicholas Carlson.

CARLSON: Obviously, very offensive and people got very upset about it. And Zuckerberg was hauled in front of the disciplinary board at Harvard and admonished for this. But at the same time, that project revealed that Zuckerberg completely understood what people wanted to do in social media.

SEGALL (on camera): What do you mean?

CARLSON: What I mean is, is that people, when they voted, they ended up voting, on average, 44 times. Which just means they were, like, addicted to the site.

SEGALL (voice-over): Programming a platform that played into the best and worst of human impulses, that became familiar later. But in the meantime, it didn't take long for Harvard to shut down the site. Mark actually became a celebrity of sorts on campus.

R. ZUCKERBERG: I was like, oh, Mark. Like, you know, you kind of probably put that out there and didn't really think it through that well. But, I mean, he always saw a need for something and his gut instinct was always, like, let's get this out there and then make it perfect.

SEGALL (voice-over): This idea of creating and breaking in the name of connecting would be a theme that would only amplify a decade later as the stakes got higher. But 15 years ago at Harvard, it motivated Zuckerberg's next life-changing creation.

HARRY LEWIS, GORDON MCKAY PROFESSOR OF COMPUTER SCIENCE, HARVARD UNIVERSITY: Somewhere along one of these pathways might have been where the idea for Facebook started.

SEGALL (voice-over): Harry Lewis was Mark's computer science professor and a dean at Harvard. When Mark and every other student arrived on campus every fall, they were handed a face book.

LEWIS: It was literally a book of faces and names and you know -- and hometowns, basically. That's all it is. There were some computer science students who were eager to put, you know, the face book online. This was not necessarily a simple thing to do. And then somebody did it anyway, right? Without our --


LEWIS: Without our involvement.

SEGALL (voice-over): That somebody? Mark Zuckerberg.

R. ZUCKERBERG: He walked up to the registrar's office and said, can I help you and volunteer as a student to digitize this?

SEGALL (on camera): Huh!

R. ZUCKERBERG: And they just said no. And I think it was almost a little bit of, like, well, maybe you just don't get what I mean, so let me just go home and do it and show you.

I want to meet that person who said no and give them a hug because if they had, you know, had the business foresight to say yes, like, none of this would have ever been created.

SEGALL (voice-over): February 4th, 2004. Facebook went live. Within 24 hours, an estimated 1,200 students had signed up. Within months, they had a hundred thousand users. A pretty meteoric rise.

ZUCKERBERG: Who knows where we're going next.

[21:10:00] SEGALL (voice-over): Mark's mission was born. He was just 19 years old.

ZUCKERBERG: We're hoping to have many more universities by fall, hopefully over a hundred or 200.

SEGALL (voice-over): And doing that would require a move to the epicenter of tech -- Silicon Valley. Where Mark found more space, more money, and more controversy.

CARLSON: The News Feed controversy. People hated it. They were really upset.

SEGALL (voice-over): That, when we come back.


UNIDENTIFIED MALE: What was the deciding factor for leaving Harvard?

ZUCKERBERG: When I made Facebook.


SEGALL (voice-over): Summer 2004. Mark and his co-founder, Dustin Moskovitz, left Cambridge and moved to Palo Alto to what's now become the legendary Facebook house.

CARLSON: He went to California and got a house, and it was a crazy house. They had a zip line that went over the pool and --

SEGALL (on camera): Shall we?

SEGALL (voice-over): It was here a lot of the earliest decisions were made. I took the tour years ago and, yes, there was a tour.

SEGALL (on camera): Life is just coding and --

ZUCKERBERG: Yes, it is basically coding.

SEGALL (voice-over): At the time, the site was exploding, a million users, and the company had little money in the bank. So the crew was making the Silicon Valley venture capital circuit in their own way.

CARLSON: You hear these famous stories of Zuckerberg and his team going to VCs in their pajamas and showing up late and being kind of rude. He was very, like, I'm young, rules are for adults. Get to work late, just do your thing, code it, hack it.

And over time, you know, they used to have a sign in the lobby which is, like, move fast and break things. And it's like -- now, it says, like, move carefully and don't break things.


SEGALL (on camera): Now, they're in such a delicate point, like --

CARLSON: Don't break things. If you break things, sometimes you break the country.

[21:15:03] SEGALL (voice-over): But this was long before the Facebook backlash. In 2005, the group moved to their first office, 471 Emerson.

GLEIT: It was just a bunch of kids who were kind of living out a continuation of their college experience.

SEGALL (voice-over): Naomi Gleit was one of those early employees.

GLEIT: It was above a Chinese restaurant called Jing Jing (ph). So we would order Chinese food, come into work at like 2:00, had college hours. Work all night until, you know, 8:00 in the morning and then go home and crash.


SEGALL (voice-over): Mark recruited his oldest sister, Randi, who was skeptical.

R. ZUCKERBERG: Facebook was so early.

SEGALL (on camera): Yes.

R. ZUCKERBERG: Who was I to think that that was going to be the thing that stuck?

He was just like, Randi, I think if you just come out and see what we're working on, you'll change your mind.

SEGALL (voice-over): And she did. At 471 Emerson, she negotiated her salary with her brother.

R. ZUCKERBERG: His initial proposal that he wrote out was, like, this tiny salary with, like, a good amount of stock options. I didn't know anything about stock options.

And so I remember I crossed out the stock options and, like, doubled the salary, and I was like, no, that's what I want. And he crossed it out again, wrote his initial offer again, and just said trust me and pushed the napkin back across the table. And I was like, all right, I'm going to trust you.

SEGALL (voice-over): Back then, cash was tight. They barely had enough to keep Facebook online.

ZUCKERBERG: A router typically costs tens of thousands of dollars, and we didn't have that. So I remember we went on eBay and bought a used router. I think that cost about $3,000 or $4,000.

And with the rest, we're like, all right, well, we need a car that we can get around that can lug servers. And we went on Craigslist and bought this 1994 Ford Explorer that, in retrospect, there was something wrong with it, right? Because you didn't need a key to turn it on. You just turn the ignition and it turned on?

MORIN: I have, ironically, so many memories of sitting in the parking lot of McDonald's with Mark and eating Chicken McNuggets.

SEGALL (voice-over): Fast food was all they could afford in those days. Dave Morin was an early Facebook employee and the architect of many of Facebook's most influential features.

MORIN: We did it a lot, and we did it a lot at 12:00 at night, 2:00 in the morning. Because the ideas were just so interesting and so powerful that the conversations just went late into the night almost every single day.

SEGALL (voice-over): In the evenings, Morin would also play board games like Risk with Zuckerberg.

SEGALL (on camera): How was Mark as a game player?

MORIN: You know, they say this about great chess masters, that they're able to see, you know, three steps ahead or the best chess masters can see, like, five games ahead of you, right?

SEGALL (voice-over): That long-term strategy helped Zuckerberg turn down what became a historic offer to buy Facebook. It was June 2006. Yahoo! offered a billion dollars to buy Facebook. At the time, it seemed almost incomprehensible.

ZUCKERBERG: Most of the management team really thought that we should sell. I remember I had one late-night conversation with one of my closest advisers where he sat me down -- it was probably 11:00 p.m. -- and he said, Mark, if you don't sell the company, you are going to regret this decision for the rest of your life. I mean, it was just really intense.

SEGALL (on camera): What did you think when he said that?

ZUCKERBERG: When Dustin and I made the decision to not sell the company, within 18 months, every single person on the management team left.

SEGALL (on camera): Did you ever question yourself that you're making the right decision?

ZUCKERBERG: Oh, yes. Because, I mean, I was 22. I didn't have an exact plan of what was going to happen. It was incredibly scary.

SEGALL (on camera): Was it crazy, arrogant, confident?

HOEFFLINGER: It's all of those things. And that is the curse of the autocrat. It is this wild cocktail of vision, will power, the ability to get up in the morning and build the damn thing. A little bit of arrogance, a little bit of confidence and saying, you know, thank you, but I believe something else.

SEGALL (voice-over): An arrogance, or perhaps hubris, that many have said led Facebook into some of the serious troubles it's facing now. But then, it was the vision for what would come next that played out three months later. September 2006. The birth of News Feed.

HOEFFLINGER: People would simply go to somebody's profile then they would go to the next person profile and the next person's profile. And Facebook is like, aha, we need to actually bring this all together and just show you what's going on with your friends, especially those friends that are most important to you. And that was, of course, the dawn of the now-famous algorithm.

SEGALL (voice-over): News Feed would overhaul the site, giving a real-time feed of your friends and your family.

BARKER: Here I am gearing up for this fantastic launch and all the engineers are so excited.

SEGALL (voice-over): Facebook's former P.R. director remembers how everything changed in a minute.

BARKER: We saw this group that became a million people protesting against News Feed using News Feed.


BARKER: Because the way the product was working, as we all know now, was circulating into everybody's feed, and they were clicking on it and joining the group.

[21:20:03] R. ZUCKERBERG: So we were like, oh, my gosh, they are using our product to protest our product.

SEGALL (voice-over): The backlash was extreme.

BARKER: The phones started ringing off the hook. Thousands of e- mails of people saying, what have you done? What have you done? What you have done to my Facebook?

CARLSON: I think they were sort of alarmed by Facebook taking their activity and publishing it. You know, it was one of these first privacy scares on Facebook.

SEGALL (voice-over): But News Feed survived and it thrived.

CARLSON: Ultimately, it became the thing that is the core of Facebook, the News Feed. It's not only the core of Facebook, but it's the core of every social media application we use.

SEGALL (voice-over): It was Facebook's first hint of privacy issues. Down the line, the stakes only got higher as the platform connected the world.

But the next innovation would fundamentally change Facebook and society. It had great potential to both connect the world and tear it apart. That, when we come back.


SEGALL (voice-over): Welcome to December 2007. Also known as that time Facebook ruined Christmas.

JENNIFER WESTHOVEN, CNN CORRESPONDENT: There was a guy who bought a diamond ring for his wife. And all of his friends, it flashed up on their screen.

ALI VELSHI, ANCHOR: Including his wife.

WESTHOVEN: And his wife. Christmas ruined, he said.

SEGALL (voice-over): It was Facebook's first big privacy scandal.

WESTHOVEN: Thousands were outraged.

[21:24:58] ROMANS: People were really ticked off.

SEGALL (voice-over): Ticked off by Facebook's first real attempt at making money. It was an ad product called Beacon.

CARLSON: A Facebook user can log into e-commerce sites using your Facebook identification. And then when you buy something, all your friends are going to find out about it. And Facebook was like this is such a cool, innovative way to get, like, involved in commerce and not be doing boring old advertising.

SEGALL (voice-over): To say they got it wrong was an understatement.

CARLSON: It blew up in their faces immediately. Because when you think about it, you don't want, like, people to know what kind of underwear you're buying and so on.

SEGALL (voice-over): This soon after the News Feed outrage. But like many bets Zuckerberg made that paid off, this was different.

CARLSON: The story of Facebook Beacon is the time where Facebook got really burned.

BARKER: It was one of the darkest moments of my time there.

SEGALL (voice-over): Facebook's head of public relations, Brandee Barker, was dealing with the backlash.

BARKER: I think we pushed the boundaries and we pushed them too far.

UNIDENTIFIED FEMALE: Good afternoon, Facebook. How may I help you?

SEGALL (voice-over): The company debated how they'd handle the outrage.

BARKER: There were just different points of view within the company, and they ranged from the engineers who felt very strongly in it, to the sales reps who had something new and unique to sell, to the privacy advocates.

SEGALL (on camera): What was Mark saying at the time?

BARKER: I think he was among the people, initially, that were advocating to keep it with the belief that there might be a way to keep it and create it so that some people could use it if they wanted to, some people couldn't.

VELSHI: There were 67,000 people who had signed an online petition with Facebook to complain about this.

BARKER: We didn't move fast enough, and we broke some things in the process. We broke user trust in a big way that time around.

SEGALL (voice-over): It was a privacy and P.R. disaster. It took the company a month to react.

VELSHI: Facebook yesterday apologized and said that now you can entirely opt out of this program.

SEGALL (voice-over): But the loss of trust was damaging. It was officially time to bring in the operator that many believe Zuckerberg needed to help run Facebook. That person was Sheryl Sandberg.

But getting her to come to Facebook wouldn't be easy. She still had a job, a big one, at Google, so they couldn't be seen together. We spoke to her before a recent report called her leadership into question.

SANDBERG: We can't go to his house because he literally had an apartment that was like one room with a futon on the floor. And I don't think he even had a chair, so there's nowhere to sit. So he came to my house for dinner.

I mean, the joke was he would, like, show up after my kids went to bed. I don't know, at 7:30, right? And then I would literally have to kick him out and say, like, OK, it's 11:30, I need to go to bed now. Like, leave.

SEGALL (on camera): At the time, you're 38 years old, you're managing 4,000 employees at Google, leaving for a company that barely had any revenue. And you felt like this was the opportunity?

SANDBERG: I felt like this was a great opportunity. A lot of people said to me, what are you doing? Facebook was really small. It didn't seem to be growing that quickly. And I told them I'm going to work with and for someone I really believe in, who I think is trying to do something really important.

GLEIT: She immediately became my hero and the hero of all the women at Facebook.

SEGALL (on camera): Why?

GLEIT: Because there were so few of us back then. And on her first day, she came around and introduced herself to everyone.

SEGALL (on camera): What did you say to her, do you remember?

GLEIT: I'm so glad you're here. We were like, thank god.


SEGALL (voice-over): Those early years with Sheryl at the helm would lead to tremendous growth. As would Facebook's next move.

BARKER: I remember sitting in that room with him and him drawing on the board these sort of circles and then like these connected lines. And it was kind of like a map at some point. And there was sort of this visual. He was trying to capture what it was. And then we were having a conversation about, well, should we call it a social map?

SEGALL (voice-over): It would eventually be called Platform.

BARKER: It was the first time that Facebook was opening the site to allow outside technology companies and/or, in certain cases, individual developers to build something that would work with the site.

SEGALL (voice-over): It would prove to be one of the most important moves the company made. And down the line, would lead to fundamental questions and concerns about how the company handled user data, giving third-party developers the ability to create their own applications by accessing Facebook user data.

SANDBERG: Facebook Platform is why you can follow your friends' playlists. It's why you can see your friends' birthdays on your calendar and remember. I think the early form of Platform was sharing more data.

SEGALL (voice-over): The architects were focused on the good as it went live in 2007.

[21:30:03] MORIN: We were talking about improving the world of education. We were talking about improving healthcare. We were talking about enabling people to discover their friends, and more importantly to discover people that were like them no matter where they went.

And then we launched and things went much faster than we expected.

SEGALL: In what sense?

MORIN: Within two days, there were applications that had over a million users, which at the time had never been seen before on the internet.

SEGALL: It was a moneymaker the moment it went live. And would lead to tremendous innovation and growth, but years later it would become the root of one of the company's biggest scandals. That, later. But first, Mark Zuckerberg gets the Hollywood treatment.


JESSE EISENBERG, PORTRAYING MARK ZUCKERBERG IN "THE SOCIAL NETWORK": If you guys were the inventors of Facebook, you would have invented Facebook.



RICHARD QUEST, CNN ANCHOR: Facebook is now considered to be the big one.

UNIDENTIFIED FEMALE: It's the hottest web address there is.


SEGALL (voice over): Facebook was creating history, hitting half a billion users by summer 2010. But that history came with a controversial founding story.


CAMERON WINKLEVOSS, FORMER COLLEGE CLASSMATE OF ZUCKERBERG: We certainly see a person who is where he is today absolutely because we approached him with our idea, our business plan and two years' worth of work.


SEGALL (voice over): It began a week after the Facebook launched in 2004, back in Cambridge, Massachusetts with two of Zuckerberg's college classmates.

TYLER WINKLEVOSS, FORMER COLLEGE CLASSMATE OF ZUCKERBERG: We first learned about Facebook and Mark Zuckerberg launching it by reading the Harvard student newspaper. We were pretty shocked. I mean, we were just totally blindsided.



SEGALL (voice over): Blindsided, Tyler and Cameron Winklevoss say, because they initially hired Mark months earlier to build their social network. They claimed Mark agreed to work with them but stalled their project while he built his.


C. WINKLEVOSS: We were in a partnership and then he used his skill set to basically self-deal in his own interests and take the entire project from us.


SEGALL (voice over): So they decided to sue. The case would wind its way through the court for years.


BRANDEE BARKER, FORMER FACEBOOK P.R. HEAD: The lawsuit, for me, was like a little gnat flying around my head for years. You're just like, ugh.


SEGALL (voice over): Once again, Brandee Barker was playing defense.


BARKER: We're in Silicon Valley, like people are inventing things left and right. Why weren't the MySpace founders upset? Or suing us at the time? They had 100 million users, when we had 10 million. He did not steal the idea from the Winklevosses.


SEGALL (voice over): Facebook's strategy was to ignore and keep building. And when asked, deny.


MARK ZUCKERBERG, CEO AND FOUNDER, FACEBOOK: We know that we didn't steal any ideas or code. So we're just kind of waiting until that comes out in court.


SEGALL (voice over): In 2007, a judge called their claims "tissue thin" and called the agreement "dorm room chitchat," which the judge ruled does not make a contract.

By 2008, the twins settled for $65 million.


T. WINKLEVOSS: What Facebook did throughout the entire litigation was suppress and withhold all of the smoking gun electronic communications of Mark Zuckerberg.


SEGALL (voice over): Communications that eventually leaked because of "Business Insider's" Nicholas Carlson.


SEGALL (on camera): Journalist to journalist, so you pressed someone, someone wanted you to --

CARLSON: Several people.

SEGALL: Someone wanted you to have these instant messages, someone wanted you to have these e-mails.


SEGALL (voice over): These e-mails, plus texts, were never-before-seen communications from Mark Zuckerberg during his Harvard days.


CARLSON: "I hate working under other people. I feel like the right thing to do is finish the Facebook and wait until the last day before I'm supposed to have their thing ready and then be like, 'Look, yours isn't as good, so if you want to join mine, you can.'"

His friend says, "So have you decided what you're going to do about the websites?" And Zuckerberg says, "Yeah, I'm going to [bleep] them, probably in the ear."

SEGALL: What went through your head when you first saw these instant messages and these e-mails that were an incredibly intimate peek into Mark Zuckerberg?

CARLSON: He is like a "Game of Thrones" character who is going to like cackle for the camera about how he's throwing these people under the bus, who have trusted him with this project.

To me, like, it's like fantasy business. Here he is, a 19-year-old talking about something that is kind of like maybe a hobby, but ended up being really huge.


SEGALL (voice over): And while these are the words of a 19-year-old college kid, what do they say about Mark Zuckerberg?


CARLSON: I think that there's a killer instinct there. He's willing to go for it and he's willing to go through people to get what he wants. And that was obvious from the second he started the site. And that became something as evident in how he ran his company over years. He was unsentimental about moving out executives who were not performing.

In the end, it served him because what he did is he churned his way through people until he got to Sheryl Sandberg, someone who really helped take that company from a very successful startup to the global behemoth we know it now.


SEGALL (voice over): Hollywood took a different perspective.


ANDREW GARFIELD, PORTRAYING EDUARDO SAVERIN IN "THE SOCIAL NETWORK": -- are saying that we stole The Facebook from Divya Narendra and the Winklevoss --

EISENBERG: I know what it says.

GARFIELD: Did we?


SEGALL (voice over): Facebook's founding drama was about to become a major motion picture. Before the movie came out, Facebook was determined to show the public a different leader than the one about to be portrayed.


ZUCKERBERG: We're really just focused on our mission.


SEGALL (voice over): Zuckerberg went on a media tour and donated $100 million to help schools in New Jersey.


ZUCKERBERG: A $100 million challenge grant.

OPRAH WINFREY, TALK SHOW HOST: One hundred million dollars --

EISENBERG: You have part of my attention.


SEGALL (voice over): But when the movie came out, there was nothing charitable about Mark's portrayal.


EISENBERG: My colleagues and I are doing things that no one in this room, including and especially your clients, are intellectually or creatively capable of doing. Did I adequately answer your condescending question?


SEGALL (on camera): Let's wade into "The Social Network" the movie when it came out. How did you feel watching yourself, Hollywood-ized and being portrayed as a bit of a jerk?

(Phone rings)

SEGALL: Oh come on. Saved by the bell. Okay, so you get to take a peek, but how did you feel?


EISENBERG: You know, you really don't need a forensics team to get to the bottom of this. If you guys were the inventors of Facebook, you would have invented Facebook.


ZUCKERBERG: I wish that no one made a movie about me while I was so young. That was just so early in the journey and it mischaracterized so much of what we were doing.

But one of the things that's really tough is that it set the first impression for how a lot of people think about me and the company.


ZUCKERBERG: That's really unfortunate because the movie made up a lot of stuff. The whole idea that we started the company because we're trying to make money or that I was trying to find a girlfriend or something like that, and I was dating Priscilla at the time.


UNIDENTIFIED MALE: Have you two not met?





SEGALL (voice over): And while he tried to laugh off the movie --


UNIDENTIFIED MALE: Facebook's privacy policy is the issue here.

QUEST: You might have found you were connecting with more people than you bargained for.


SEGALL (voice over): There was nothing funny about something else that happened in 2010.


UNIDENTIFIED FEMALE: Do you feel like you're violating people's privacy?

ZUCKERBERG: You know, there are real learning points and turning points along the way.


SEGALL (voice over): Mark was in the hot seat again.


UNIDENTIFIED FEMALE: You want to take off the hoodie?

ZUCKERBERG: No, I never take off the hoodie.


SEGALL (voice over): After the company changed its privacy settings without users' knowledge.


UNIDENTIFIED MALE: I like to say the devil is in the defaults, and the default changed from private to public. So that meant that a large amount of the content that users had on their profiles, their likes, and these types of things were no longer private and were now public.


SEGALL (voice over): The move was in line with Mark's mission to connect the world. It nudged users toward sharing more information. The subtext: more user data, more potential to make money.


UNIDENTIFIED FEMALE: People running from Facebook over privacy concerns.


SEGALL (voice over): The backlash was tremendous.


CHUCK SCHUMER, U.S. SENATOR, NEW YORK, DEMOCRAT: We are here today to urge Facebook's creator and CEO, Mark Zuckerberg to revisit this new policy as soon as possible.

UNIDENTIFIED FEMALE: This morning an about face by Facebook.


SEGALL (voice over): So they simplified their privacy settings putting users in control of how much data they shared.

At this point, Facebook was a rocket ship.


UNIDENTIFIED MALE: It could be one of the biggest initial public offerings ever.

(END VIDEO CLIP)

SEGALL (voice over): These were the golden years and the most anticipated IPO of the decade was coming. That, when we come back.



SEGALL (voice over): Many credited Facebook for its role in the Arab Spring in 2011. But as with everything that seems to happen with Facebook, it's never black and white.


RANDI ZUCKERBERG, SISTER OF MARK ZUCKERBERG: You don't empower that without empowering other people. And so there's always a catch-22 whenever you're giving a voice to people who didn't have a voice before.


SEGALL (voice over): It was winter 2011. Mark's sister, Randi, was at the World Economic Forum in Davos, Switzerland.


R. ZUCKERBERG: I was hosting a whole Facebook Live there, and I think the Minister of Tunisia rejected the BBC, rejected all of these news outlets because he wanted to talk to people directly on Facebook.

And that was this moment that I was like, "Oh, my gosh. Everything that I've been working for is happening." And later that day, I sat in a room with all of these religious leaders from around the world and they called on me in the back of the room and said, "Randi, can you come to the front for a second? We'd like to talk to you about all these pages we found on Facebook where 'F Christianity. F Judaism.'" And I just, I sat there and I was like, "Wow, this is going to be the issue of our time." Because it is impossible to provide that megaphone for the Minister of Tunisia without also providing the megaphone for people to say things that are upsetting these religious leaders.


SEGALL (voice over): They soon discovered policing content was complicated, an issue that would only grow as Facebook grew. And moving into 2012, Zuckerberg was close to a billion users and gearing up for the company's massively anticipated IPO.


UNIDENTIFIED FEMALE: Questions are coming out -- is Facebook really living up to the hype?


SEGALL (voice over): But there was a problem. Smartphones were on the rise and Facebook wasn't a mobile-first app. It was easier to access it from a computer. So Zuckerberg did something atypical. He bought a company during what was known as the quiet period.


SEGALL (on camera): Why the need to go on a buying spree?

M. ZUCKERBERG: It was really connected to this whole transition to mobile phones being the main way that we use technology.


SEGALL (voice over): Instagram was one of the most downloaded applications on the iPhone.


CARLSON: Zuckerberg went out and just put like a humongous amount of money on the table and bought Instagram and everyone said "You're crazy. What are you doing? That's a bad idea." And actually, it turns out to be a brilliant idea.

SEGALL (on camera): A billion dollars, they got it cheap.

CARLSON: Yes, they did get it cheap.


SEGALL (on camera): This was Facebook's biggest acquisition to date.


SEGALL (voice over): As a startup reporter at the time, I had never covered a deal so big, so quick and so important.


SEGALL (on camera): I spoke to an investor yesterday who said Facebook is going to have to find a way to help monetize this.

NAOMI GLEIT, VP SOCIAL GOOD AND SENIOR DIRECTOR OF GROWTH, ENGAGEMENT AND MOBILE TEAM, FACEBOOK: I think there were a lot of people that doubted whether or not it was really a good acquisition and if you look back now, it makes so much sense.


SEGALL (voice over): It was an investment in Facebook's mobile strategy and it helped the company bolster its portfolio ahead of the IPO.


CHRISTINE ROMANS, ANCHOR, CNN: Facebook is set to go public this week in what could be the biggest IPO in history.

ZUCKERBERG: Going public is an important milestone.


SEGALL (voice over): With the world watching, everything that could go wrong, went wrong.


ERIN BURNETT, ANCHOR, CNN: Stock that maybe picked a bad week to go public. Facebook, faceplant.


SEGALL (voice over): And when the actual day came, the NASDAQ system broke.



ZUCKERBERG: Let's do this.


MIKE HOEFFLINGER, FORMER FACEBOOK EXECUTIVE: The NASDAQ button was all set up on campus. We're doing a kind of a remote to kind of ring the bell on Facebook going public. The bell rings. NASDAQ doesn't clear for at least three hours. It's massive uncertainty. This is just like the beginning of the wildest period.

SEGALL (voice over): The stock tanked and continued to for 109 days.

HOEFFLINGER: I mean, it was the definition of roller coaster in every way.

SEGALL (voice over): And during all of this, the one person we didn't hear from was Mark Zuckerberg.

HOEFFLINGER: It was echoed a little bit when Mark and Sheryl were quiet for three plus days after the Cambridge Analytica crisis. I think Mark at the time just very firmly believed that we're going to focus on building our stuff instead of talking about our stuff.

SEGALL (voice over): Until Zuckerberg spoke for the first time. At this point, it was a highly anticipated interview.


UNIDENTIFIED MALE: Welcome to Tech Crunch Disrupt and thanks for coming.

ZUCKERBERG: Thanks for having me.

(END VIDEO CLIP)

SEGALL (voice over): I'll never forget looking off stage and seeing Mark Zuckerberg and seeing him take this deep breath and like pump himself up. There was something so youthful and nervous about what he was about to go do.


UNIDENTIFIED MALE: Okay, so are you ready?


HOEFFLINGER: This is a defining moment. You're an ice skater stepping on for the long skate and you know, you've got to nail it.



ZUCKERBERG: Performance of the stock has obviously been disappointing. We already see that mobile users are more likely to be daily active users of Facebook.


GLEIT: He said, you know, we're going to be focused on mobile and anyone who doesn't bring to my office mocks -- prototypes -- that are based on mobile rather than desktop will be kicked out.


UNIDENTIFIED MALE: We have almost 500 million mobile users.


SEGALL (voice over): They made the shift and it would pay off. Eventually the stock price went up. Zuckerberg's bet on Instagram also paid off in a big way. But his next move made a billion dollars look like a steal. He believed the messaging app WhatsApp would be a key player in the future of Facebook. So he offered to buy it for $22 billion.

Even then, WhatsApp founder, Jan Koum had doubts.


ZUCKERBERG: So, my dog Beast, I think, is actually the secret weapon here. It was a tense moment and Jan was like, "All right, I need to think about this." And just went silent for a few minutes. And Beast walked into the room, kind of confused, he was like, "What's going on? These two guys are just sitting here silent." And he just like -- kind of looks quizzically and then walks up to Jan and jumps in his lap. And then Jan starts petting him and a second later, he is like, "Okay, I think we're good."

SEGALL (on camera): And so a $22 billion deal was done.


SEGALL (voice over): But what Facebook didn't see was about to shape its future.


UNIDENTIFIED MALE: Welcome to the CNN-Facebook Republican presidential debate.


SEGALL (voice over): That, when we come back.




HILLARY CLINTON, FORMER FIRST LADY OF THE UNITED STATES: You could put half of Trump's supporters into what I call the basket of deplorables.

DONALD TRUMP, PRESIDENT OF THE UNITED STATES: We have got to beat Hillary Clinton. Crooked Hillary Clinton.


SEGALL (voice over): It was 2016.


TRUMP: They just announced I have over 22 million between Twitter and Facebook -- 22 million people.


SEGALL (voice over): And politicians had realized the reach of Facebook.


UNIDENTIFIED MALE: Today, you wrote a Facebook post in which you likened Donald Trump to Gollum.


SEGALL (voice over): The platform was integral to politics and had been for years.


BARACK OBAMA, FORMER PRESIDENT OF THE UNITED STATES: I'm the guy who got Mark to wear a jacket and tie.

(END VIDEO CLIP)

SEGALL (voice over): But unnoticed amid all that hype --


JAKE TAPPER, ANCHOR, CNN: New details about the extent of Facebook's role in Russia's election interference campaign in 2016.


SEGALL (voice over): A campaign on Facebook to disrupt the U.S. Election and divide America was well under way.




SEGALL (voice over): Meet Alex Stamos. He's a respected figure in security.

STAMOS: We survived the last day.

SEGALL (voice over): Today is August 17th, 2018.

STAMOS: Tough to say good-bye to folks but I'm glad for what comes afterwards.

SEGALL (voice over): It's the last day he'll call himself Chief Security Officer at Facebook. He's been there three years.

STAMOS: It feels weird to live through history from the inside. I think, naturally, I want to question whether I did everything I could.

SEGALL (voice over): It was Stamos' team who discovered Russians were weaponizing Facebook to influence the election, a pivotal moment for democracy and a turning point in Facebook history.

STAMOS: When I was hired, the job was to protect the platform from attack, keep people's data safe. And now, I'm reading biographies of Putin and taking seminars on disinformation tactics of the Russian intelligence services.

SEGALL (voice over): It started in the spring and summer of 2016. The race for President was in full gear. The Facebook Security Team discovered suspicious activity connected to Russian intelligence and reported their findings to U.S. law enforcement.

SEGALL (on camera): Why was Facebook not transparent as people went to vote in the fall?

STAMOS: It just wasn't seen as our position to kind of get involved publicly in these massive political issues. And in this situation, you know, you do not want to be seen as putting your thumb on the scale one way or the other.

SEGALL (voice over): Facebook stayed silent. Voters went to the polls in record numbers.


UNIDENTIFIED MALE: The astounding upset victory of Donald J. Trump, who will become the 45th President of the United States.


SEGALL (voice over): But soon after, there were concerns that the platform had been used to spread propaganda and fake news, all of it in an effort to influence the election.


UNIDENTIFIED FEMALE: Facebook cracking down on fake news sites after backlash that the made-up stories may have influenced the election.


SEGALL (voice over): At the time, Zuckerberg downplayed it.


ZUCKERBERG: The idea that fake news on Facebook, of which, you know, it is a very small amount of the content influenced the election in any way I think is a pretty crazy idea.


ZUCKERBERG: I just think at the time I was way too dismissive. I think I just reacted and had a negative visceral reaction to the idea that people were somehow tricked.

SEGALL (voice over): By January 2017, a U.S. intelligence report linked the spread of hyper-partisan fake news to the Internet Research Agency, a Russian company with close ties to Putin and Russian intelligence.

STAMOS: What they were trying to do was to take the most radical positions in our society and almost act as a parody of what the other side would think supporters of that ideology would believe.

Their entire goal was to reduce the quality of discourse and to increase anger and divisiveness in the country.

SEGALL (voice over): The Russians took advantage of Facebook's advertising tools which allow advertisers to target people specifically based on their interests and even their political biases.

On top of that, the more polarizing and divisive their posts were, the more likely Facebook's algorithms would show them to more people, it's why fake news, conspiracies and divisive content got a boost.

TRISTAN HARRIS, DIRECTOR, CO-FOUNDER OF THE CENTER FOR HUMANE TECHNOLOGY: Fake news is just a bunch of car crashes.

SEGALL (voice over): Tristan Harris started the Center for Humane Technology. Its aim: help tech companies understand the impact of their algorithms.

HARRIS: If you're driving on a road and you see a car crash, you have got a lot of evolutionary instincts that tell your mind, "I have got to like look at the car crash." But the way Facebook sees people looking at the car crash is, "That's what you want," so it starts feeding the whole world car crashes over and over again metaphorically. And that's a product of their business model.

SEGALL (voice over): The longer people stay on their Facebook feeds the more money Facebook makes.

HARRIS: Their business model is about two things -- how do I keep people on the site and then how do I make those people as easy to influence as possible for the advertiser. And the problem is, if you're not checking what's happening when you have six million advertisers cycling through the system, how do you know if some of the advertisers are Iran, Russia, China, Saudi Arabia?

SEGALL: It turns out Facebook's internal investigation did find that Russia's Internet Research Agency spent money on ads to target voters.

ANDERSON COOPER, CNN ANCHOR, "AC 360": The Twitter and Facebook accounts, both disguised to look like they were run by the same black activists, were actually the work of Russians.

SEGALL: After months of inaction and denials by the company, Stamos and his colleagues were allowed to go public with some of their findings.

ALEX STAMOS, FORMER CHIEF SECURITY OFFICER, FACEBOOK: Certainly we did not have the problem totally solved by then. But I hope that it started changing the conversation a bit -- that, you know, Facebook and the rest of the tech companies started to believe that we should just be honest about the bad things that happen on our platform.

SEGALL: Publicly, the company was limited in its transparency. There'd be even more revelations later. Internally, Facebook execs treated the foreign interference like the shift to mobile: all hands on deck. They started to increase the security team; by 2018, it would number more than 30,000 people.

But Facebook's failure to anticipate and quickly address foreign government influence on the platform put the company on thin ice.

UNIDENTIFIED MALE: Facebook battling a massive data misuse scandal.

SEGALL: The straw that broke the camel's back would come early 2018.

CHRIS CUOMO, CNN ANCHOR: Millions of users had their information improperly obtained by a data firm.

JAKE TAPPER, CNN ANCHOR: From 2016 campaign data firm, Cambridge Analytica, harvested data from Facebook users.

SEGALL: Data of up to 87 million Facebook users, to be precise, collected before the election. That one research firm went on to help the Trump campaign better target voters online.

BRIAN STELTER, CNN SENIOR MEDIA CORRESPONDENT: Personality quizzes, we've all seen them on Facebook.

SEGALL: The way they got the data was simple, through a personality quiz app taken by about 300,000 Facebook users.

UNIDENTIFIED MALE: We were able to form a model to predict the personality of every single adult in the United States of America.

SEGALL (on camera): So data like your music preference, gender, demographics, your marital status, your likes.

(Voice-over): What few of us knew was that taking the quiz gave the researcher access to not only our data but also our friends' data. This methodology was completely permissible at the time, up until Facebook restricted access in 2015. Looser privacy settings allowed developers more access to our information.

(On camera): These are ethical debates that will impact two billion people about user data, about the spread of fake news, about the weaponization of the platform.

STELTER: This is another black eye for Facebook.

SEGALL (voice-over): People were outraged.

UNIDENTIFIED FEMALE: You have users asking, should I be spending so much time on Facebook. Maybe I should delete it.

NICHOLAS CARLSON, GLOBAL EDITOR-IN-CHIEF, INSIDER/BUSINESS INSIDER: I think people are upset about the election and they feel like this scandal is the thing that caused the election to happen the way it did and that's why a lot of people were angry about it.


SEGALL: It all dated back to when Facebook opened up the platform to developers.

CARLSON: I think they built something that was beyond their wildest dreams and it was extremely powerful in ways that they had no idea it could be powerful.

SEGALL (on camera): Did you think, oh, no, because this is -- you know, this was based off of the platform that you were an early architect of?

DAVE MORIN, EARLY FACEBOOK EMPLOYEE: Yes is the answer. I don't think it was something you could have predicted or even thought was a risk. We're talking about a level of nation states and nation state actors. These are countries, right, that have military budgets that are beyond what any of us can ever imagine.

TAPPER: The social media powerhouse has been reeling, seeing its worst day in four years.

SEGALL (voice-over): It was a turning point for the company, user trust eroded and Facebook's stock price plunged, and the anger only amplified as days went by and there was silence from the upper ranks. Until Mark finally decided to talk.

BRIANNA KEILAR, CNN ANCHOR: Senior tech correspondent Laurie Segall got this exclusive interview.

SEGALL: To me.

(On camera): Mark, what happened? What went wrong?

(Voice-over): When we come back.


WOLF BLITZER, CNN ANCHOR: Cambridge Analytica is under investigation for its use of Facebook users' information.

BROOKE BALDWIN, CNN ANCHOR: A lot of people have been asking where is Mark Zuckerberg, the head of Facebook, on all of this.

SHERYL SANDBERG, CHIEF OPERATING OFFICER, FACEBOOK: We were slow to respond. It took us a couple of extra days. That was like a big deal that we didn't respond that quickly. But we were really trying to figure out what happened and we weren't able to do it quickly enough.

TAPPER: Mark Zuckerberg is breaking his silence finally to CNN talking about the election data breach debacle.

SEGALL: It was March 2018 and I just remember sitting across from Mark Zuckerberg. Inside that room it just felt incredibly tense.

ZUCKERBERG: There is an element of accountability where as uncomfortable as it is for me to do, you know, a TV interview, I think this is an important thing that as a discipline for what we're doing, I should be out there and being asked hard questions by journalists.

SEGALL (on camera): What happened? What went wrong?

ZUCKERBERG: This was a major breach of trust. And I'm really sorry that this happened.

SEN. JOHN THUNE (R), SOUTH DAKOTA: There were previous instances where they had to issue apologies for breaches in the past. This was different.

SEGALL (voice-over): Senator John Thune and much of Washington were watching closely.

THUNE: We're already sort of contemplating how to proceed in terms of our oversight role and we want to make sure we're holding them accountable.

SEGALL (on camera): Everybody wants you to show up. Will you testify before Congress?

ZUCKERBERG: I'm happy to if it's the right thing to do.

SEGALL: You are the brand of Facebook, you are the name of Facebook. People want to hear from you.

ZUCKERBERG: We just want to make sure that we send whoever is best informed to do that.

THUNE: We took into consideration the statement that he made when you had asked him those questions and then began to more aggressively pursue the idea of having him come in.

SEGALL (voice-over): Zuckerberg agreed to testify on April 10th, 2018. And the prep started. Inside Facebook they built a mock hearing room.


SANDBERG: I mean, that was as high stakes as it gets.

JOHN BERMAN, CNN ANCHOR: Facebook's CEO, Mark Zuckerberg, just hours away from testifying.

SEGALL: It's all historic, the way it was for the tobacco execs in 1994. I remember someone that day told me they had to bring in an extra row of chairs for the senators because there was so much interest.

THUNE: We're listening. America is listening, and quite possibly, the world is listening, too.

SEGALL: And wondering, could Zuckerberg, who historically had a hard time with high pressure public moments, deliver?

ZUCKERBERG: We didn't take a broad enough view of our responsibility. And that was a big mistake. And it was my mistake, and I'm sorry. I started Facebook. I run it, and I'm responsible for what happens here.

CARLSON: His performance in front of Congress was poised. He had the right answers. I think he did laps around the people there.

SEGALL: But part of that success was due to the fact that many of the questions asked by the senators demonstrated a lack of understanding about how Facebook and tech in general works.

UNIDENTIFIED MALE: If I'm e-mailing within WhatsApp, does that ever inform your advertisers?

SEN. ORRIN HATCH (R), UTAH: How do you sustain a business model in which users don't pay for your service?

ZUCKERBERG: Senator, we run ads.

HATCH: I see.

SEN. MARK WARNER (D), VIRGINIA: It was an embarrassment to the members of Congress. It reconfirmed what most of the Valley and frankly probably many of the viewers thought that most of these men and women don't even understand the basic business model let alone the amount of data that's being collected and the nature of these new communities.

SEN. ANNA ESHOO (D), CALIFORNIA: Was your data included in the data sold to the malicious third parties? Your personal data?


ESHOO: It was. Are you willing to change your business model in the interest of protecting individual privacy?

ZUCKERBERG: Congresswoman, we have made and are continuing to make changes to reduce the amount --

ESHOO: No. Are you willing to change your business model in the interest of protecting individual privacy?

ZUCKERBERG: Congresswoman, I'm not sure what that means.

ESHOO: Well, I'll follow up with you on it.

WARNER: A whole lot of folks didn't know that the -- some of the data that Facebook was collecting through your contacts back and forth with friends and others, and the news feed you go to, was actually being used for political purposes by a political consulting firm that had strong ties to the Russians.

THUNE: There's going to have to be a different approach taken in the future.

SEGALL: An approach many thought would be regulation. But after two days of testimony, what kind was unclear.

THUNE: You don't want to put a heavy burden on the goose that laid the golden egg.

WARNER: Social media is not going away, technology is not going away. I would hope they would work with us.

SEGALL: Facebook would try to get in front of pending regulation by promising to give users more control over their data and by investigating tens of thousands of other apps that were allowed to collect user data. But it still came down to one fundamental question.

(On camera): Did making money get in the way of Facebook's mission to ultimately connect the world?

SANDBERG: I don't think so. The fundamental business model we offer I think is a really good one. It protects people's privacy and it takes a very powerful product that changes people's lives and makes it available for free. So if we were to have to charge for Facebook, like the ad business, a very small fraction of the people who use it would be able to use it.

HARRIS: They would say, how else are we going to connect the whole world if it's not free? But I would say if the thing you're connecting the whole world to isn't safe it shouldn't be free.

SEGALL (voice-over): Unsafe, critics like Tristan Harris say, not just because of personal data breaches but also because of the content that goes unchecked on the platform, something Congress worried about on day two of Zuckerberg's testimony.

ZUCKERBERG: Do we have a responsibility for the content that people share on Facebook? And I believe the answer to that question is yes.

SEGALL: It's a fine line. As hate speech is ignited, fake news spreads and conspiracy theorists like Alex Jones thrive, Facebook must decide what content stays and what goes.

STAMOS: If there's anything I've learned over the last three years is that you can't win. Right? Any content decision we make, any step we make to try to do something that sounds not controversial, like protect an election, will be portrayed by somebody else as being a partisan decision.

SANDBERG: We believe deeply in free expression. We believe that you should be able to express your voice, your opinion. We also believe deeply in a safe community and a protected community.

SEGALL: But that also means giving a platform to fake news, holocaust deniers, conspiracy theorists, racists, something that could be deadly.

UNIDENTIFIED FEMALE: Genocide and crimes against humanity, the horrors faced by the Rohingya Muslims living in Myanmar.

[22:15:07] JOHN VAUSE, CNN INTERNATIONAL ANCHOR: The U.N. investigation also blamed Facebook and de facto leader Aung San Suu Kyi.

SEGALL: One instance began in 2013. Facebook was warned that the platform was being used to promote hatred of Muslims among the millions of users in Myanmar. By 2018, more than 10,000 Muslims had been killed. And according to a United Nations report, the role of social media is significant: Facebook has been a useful instrument for those seeking to spread hate.

SANDBERG: In places like Myanmar now we are actually working with the local civil rights groups on the ground identifying those things that might lead to real world harm, and we're just taking them down.

SEGALL: But critics say the company moved too slowly when they were warned. Facebook is now investing millions in policing content, adding thousands of content reviewers to the payroll and developing artificial intelligence that they hope will help identify troublesome material. But the answers aren't always clear.

UNIDENTIFIED FEMALE: Judging hate speech imagery --

SEGALL: I saw it in this room, a bi-weekly meeting of Facebook employees devoted to content policies. They often discuss which posts stay up and which ones are taken down. The post in question: a woman naming a man, accusing him of assault in a Facebook post. The group decided to leave it up.

UNIDENTIFIED FEMALE: We had people who were posting allegations and we had other people coming forward saying this is harassment, it's not true.

UNIDENTIFIED FEMALE: So striking a balance of how do we frame the policy so that people can tell their story and name the person who attacked them but never cross the line into bullying or harassment.

SEGALL: It was striking to hear conversations we'd have in a newsroom happen at a tech company. And if unanswered here, content decisions go all the way up to the top of the Facebook food chain to Mark and Sheryl.

(On camera): This is like an editor-in-chief role. I mean, should you be making that decision?

ZUCKERBERG: Well, I think as little as possible, which is why I focus on designing these systems because there are going to be billions of pieces of content that people post every day and I think getting it right at a systems level is more important.

SEGALL (voice-over): But with billions of pieces of content posted daily, problems at this scale go viral and decisions made behind the scenes are increasingly scrutinized.

In November 2018, I'd be back on campus after a blockbuster report called into question the company's business tactics. That, when we come back.


[22:21:05] SEGALL: It's November 2018. And unexpectedly, I'm back on campus.

ZUCKERBERG: The challenges we have --

SEGALL: After a blockbuster report raises questions about Facebook's business tactics.

BERMAN: New details: the company hired a Republican opposition research firm to discredit activist protesters by linking them to liberal financier George Soros.

SEGALL (on camera): You know, the PR firm was founded by Republican political strategists and it launched a campaign linking Facebook critics to George Soros. This is a common tactic used by anti-Semitic and alt-right groups. That's why I think people were so shocked when they found out about this.

Does that strike you as stooping low?

ZUCKERBERG: Yes. I wasn't particularly happy about that piece of it. That's certainly a big part of what made me want to look into this more deeply.

SEGALL: The way they went after George Soros, do you approve of that methodology?

ZUCKERBERG: I don't think that this is the type of thing our company should be engaging with.

SEGALL (voice-over): The report also painted a critical picture of how the company handled the revelations of Russian influence and its growing critics.

(On camera): Now I've heard you guys talk a lot about transparency but then you have these reports coming out that say something otherwise. So how -- and I guess I ask it again, how do you ensure you do win back public trust?

ZUCKERBERG: Yes. I don't think that the right expectation is that there aren't going to be issues. I think the question is how do we address them?

SEGALL (voice-over): A question the company is struggling with. While executives have promised to focus on security and transparency --

SANDBERG: We have more than doubled the number of people we have working in safety and security.

SEGALL: According to the report they not only ignored warning signs, but sought to conceal them.

(On camera): So set the record straight. I mean, it says that Facebook attempted to downplay the significance of Russian activity in the run-up to the 2016 election. Did they?

STAMOS: Right. I mean, after we found these things I'm not totally happy with what the communication strategy was.

SEGALL (voice-over): Former chief security officer, Alex Stamos, found himself in the news again. And so we went back to interview him.

STAMOS: I think we could have been much more aggressive about talking about what we knew right after the election, probably even before the election. Obviously, there were all kinds of internal miscommunications.

SEGALL: Miscommunications that seemingly came to a head in a board meeting in the fall of 2017, where Stamos revealed the company didn't yet have a grasp on Russian influence.

STAMOS: That came as a surprise to Sheryl. I had not briefed her on what I was going to tell the board. You know, she got mad at me and in the end the real root responsibility for why these things happened was not in Sheryl's control.

SEGALL (on camera): Whose control was it in?

STAMOS: Facebook wasn't measuring the bigger impact and thinking about the ways people could twist it to be misused. And in the end, that is Mark's responsibility.

SEGALL: What was your feeling kind of walking away from Facebook?

STAMOS: The truth is there is a bit of a "Game of Thrones" culture among the executives. One of the problems about having a really tight-knit set of people making all these decisions, if you keep the same people in the same places, it's just very difficult to admit you are wrong. Right?

SEGALL (voice-over): The company is powerful. And after spending time behind Facebook's walls, there was another thing that emerged. Folks who had something to say but were afraid to say it.

UNIDENTIFIED MALE: Working at Facebook can feel a little bit like being part of a cult.

SEGALL: This former employee asked us to protect their identity.

UNIDENTIFIED MALE: Speaking out against the company is not welcome. There is a career impact where you might get blacklisted and you're not going to get hired.

SEGALL: Ironically in a place that's connected billions, this former employee cites a disconnect within.

UNIDENTIFIED MALE: People aren't really encouraged to bring bad news to Mark because generally Mark doesn't handle bad news well. In a public setting he politely argues against it. In a private setting he's more likely to really aggressively go against it or challenge the source, to challenge the assumptions, to honestly not believe the bad news.

[22:25:06] SEGALL: Facebook is in transition. Many executives have left over rumored disputes about the company's direction including the founders of Instagram and WhatsApp.

Amidst all the controversy, there's been speculation. Should Zuckerberg, who is CEO, chairman and the majority shareholder in Facebook, step aside?

ZUCKERBERG: That's not the plan.

SEGALL (on camera): It's not the plan. Would anything change that?

ZUCKERBERG: I mean, eventually over time. I'm not going to be doing this forever.

SEGALL (voice-over): Many of the employees you've met on the show have left.

STAMOS: Hopefully he'll give you kind of a pretty good understanding --

SEGALL: Including Alex Stamos who's now teaching future entrepreneurs at Stanford University.

STAMOS: Doing better means doing things like this, like bringing people together to think about these problems early and not just be reactive. You know, having a real diverse set of people work on these -- work in tech and work in Silicon Valley is going to be critical for that.

SEGALL: Randi Zuckerberg is building a media company that focuses on getting more young women into the tech and media world.

(On camera): You've always viewed your brother as a creator. How do you, as his sister, feel about people genuinely questioning the impact of his creation and whether it's good for humanity?

RANDI ZUCKERBERG, MARK ZUCKERBERG'S SISTER: He has always just been an incredible trailblazer of pushing the boundaries. You know, of what he sees, where he sees this world should go. And, you know, he has pushed a lot of us to think of the world in a bigger, more connected way than we ever thought it would be, so, you know, for better or worse, Mark is not the kind of person who is swayed by what you or I or anyone thinks of him. And because of that, I think we're going to get a lot more amazing things out of him in the years to come.

SEGALL (voice-over): At 34, Zuckerberg is one of the most powerful people in the world. He's pledged his fortune to charity, and there's no denying people have used Facebook to raise over a billion dollars for charitable causes throughout the years.

By every metric, Facebook is a success. Its revenue in 2018, $55.8 billion. It was just $272 million in 2008. But Facebook at 15 faces complicated questions, a ripple effect of Zuckerberg's mission. Is what's good for business good for society? And what is the cost of connecting the world?

HARRIS: Right now, the tech industry has been telling themselves this narrative that to connect people, you know, the platform for free speech, it's automatically good. I think the deeper upgrade we have to make is a philosophical upgrade, which is, what does it mean for these things to be good?

MORIN: Facebook is a living breathing map of society and it's literally a map of every single person in all the relationships and all of the interactions between all those relationships, and so in a way, it's as messy and human as we all are.

STAMOS: It is hard to say what any individual can do in the face of these massive kind of historical forces that we're at the intersection of. Right? What happens when you give voice to billions of people for the first time in history?

SEGALL (on camera): Do you think you'll be on the right side of history when you look back on this period?

ZUCKERBERG: Yes, I do. I think it's hard to imagine a future where giving people more power and helping people connect and share more ideas isn't going to end up being a very positive thing.

SEGALL: Although we're at a moment where a lot of people are watching Facebook and wondering, can you do all those things?

ZUCKERBERG: The principles of how do you balance giving people a voice with keeping people safe? How do you make sure that you can protect people's privacy and use information the way people are comfortable with while at the same time being able to build a system that can go to stop bad guys from doing things and can provide a service that is free for people around the world?

These are really big historical questions. And they're not simple things that have, you know, one-sentence answers. And over time, I really believe that being on the side of giving people power and giving individuals a voice and giving people the ability to connect to the people that they want to is going to be the thing that wins out.

SEGALL (voice-over): With a digital world of more than two billion people, Facebook has become part of the fabric of society, revealing both the best and the worst of humanity. For a tech company that has extraordinary human impact, what the next 15 years look like is unknown.

We do know one thing. As we head into uncharted territory, Zuckerberg's mission to connect the world will only amplify, as will the issues that come along with it.