New Day

Biden Defends Abortion Reversal; Researchers Combat Deepfake Videos; Estranged Husband's DNA Found in Missing Mom's Home; USA Beats Thailand in Women's World Cup; Radicalized by Alt-Right Videos. Aired 6:30-7a ET

Aired June 12, 2019 - 06:30   ET

THIS IS A RUSH TRANSCRIPT. THIS COPY MAY NOT BE IN ITS FINAL FORM AND MAY BE UPDATED.


[06:30:00] JOE BIDEN (D), PRESIDENTIAL CANDIDATE: I didn't consult with anybody but me in that decision because I was sitting on the way down finalizing the plan and what hit me was we're in a situation where when you saw what was going on in Georgia, what's going on in Alabama, Missouri, it's just outrageous.

(END VIDEO CLIP)

JOHN BERMAN, CNN ANCHOR: Don't be a wise guy.

JOHN AVLON, CNN SENIOR POLITICAL ANALYST: Yes.

ALISYN CAMEROTA, CNN ANCHOR: And also he said, I didn't consult with anybody but me. I think that that's interesting -- interesting to say from the get-go because we know that he did have conversations with other people about it.

MJ LEE, CNN POLITICAL CORRESPONDENT: Well, and, regardless, you have to wonder why we didn't see this answer a couple of days ago. I mean initially, when this was bubbling up, people were criticizing him for having switched positions and really putting out the question, why did he make the switch, right? Was it because he believed it was politically expedient? Because for a lot of his critics, that is exactly how it seemed. That moment there -- Joe Biden gaggling with reporters and telling them, the media, directly, this was the thought process that went into why I made such a stark reversal -- we do need to see more of that. At least if you are a Biden ally or a campaign aide, you have to look at that and think, these are more moments that we need to create on the campaign trail for him, because if we continue sort of protecting him and not allowing him to have those organic moments with reporters, then, yes, the narrative gets set by somebody other than Joe Biden.

AVLON: And it will -- it will increase the sense that you're keeping him hermetically sealed for fear of a gaffe because he's maybe past his sell-by date. And the age is the highest hurdle. It's not experience. It's not that he's too radical for the American people. It's not that he can't connect with swing voters and middle class voters and address economic concerns, as he did in the speech. It's going to be his energy, his accessibility. And if he's too contained by his team, that's going to backfire.

DAVID GREGORY, CNN POLITICAL ANALYST: Well, and before Donald Trump told you everything he was thinking at every moment in public, Joe Biden was pretty good at that kind of stuff.

AVLON: Yes.

GREGORY: And --

BERMAN: He doesn't use Twitter as much.

GREGORY: Right. Right. And he should. I mean I don't actually think this flip flop is that big of a deal. I mean Donald Trump is president. I mean let's stop. This is not old rules here (INAUDIBLE).

BERMAN: They're telling us we're out of time, but Bernie Sanders released excerpts of the speech that he's giving today on Democratic socialism. It's fascinating to me that Bernie Sanders is doing this, period. You know, he's saying, among other things, they, Republicans, may hate Democratic socialism because it benefits working people, but they absolutely love corporate socialism that enriches Trump and other billionaires.

Smart move?

GREGORY: Yes. Well, I mean it's -- it's speaking to a lot of the Democratic Party that looks up at our political systems and our economic system and says it's grossly unfair. And -- and there's a real audience for that. There's also real danger for Republicans generally and Donald Trump in particular to say, really, we're going to go down the road of socialism?

AVLON: But smart to play offense on it. And, I mean, just from a messaging standpoint to say --

GREGORY: Yes, to try to define the terms of it. I agree with that.

AVLON: Corporate socialism is worse than Democratic socialism.

GREGORY: Yes.

AVLON: That's a smart, strategic --

CAMEROTA: But don't they need to rebrand? I mean we have to go, but socialism does conjure Orwellian thoughts in lots of voters.

AVLON: Yes, it --

CAMEROTA: Don't they need to come up with a different name for it?

BERMAN: What have you got?

AVLON: That train seems to have sailed.

CAMEROTA: Think about it.

AVLON: Yes.

CAMEROTA: Just ponder that, all right, John.

AVLON: I will.

CAMEROTA: Thank you all very much.

OK, take a look at this.

(BEGIN VIDEO CLIP)

ALEC BALDWIN, ACTOR: She's been sniffing all night. Testing. Testing. Gyna (ph). Gyna (ph).

(END VIDEO CLIP)

CAMEROTA: OK, only, actually, no one is laughing here because it is hard to tell what is real from what is fake online. And it poses a major risk in the 2020 election. We'll show you why, next.

(COMMERCIAL BREAK)

[06:37:21] CAMEROTA: The race is on to fight deepfake videos ahead of the presidential election. Researchers working with the Defense Department fear the fake videos are setting the stage for a very dangerous 2020.

Donie O'Sullivan joins us now with more.

This is really disturbing stuff, Donie.

DONIE O'SULLIVAN, CNN REPORTER: That's right, Alisyn. The number of these fake videos online is increasing, and now lawmakers and big tech are trying to get ahead of it.

(BEGIN VIDEOTAPE)

DONALD TRUMP, PRESIDENT OF THE UNITED STATES: They're going to Mexico. They're going to many other countries.

O'SULLIVAN (voice over): Here's President Trump.

ALEC BALDWIN, ACTOR: They took my microphone to Kenya and they broke it and now it's broken.

O'SULLIVAN: And Alec Baldwin's impression of him from "Saturday Night Live."

But now take a look and listen at this.

UNIDENTIFIED MALE: Can you hear that? It's picking up somebody's sniffing here.

O'SULLIVAN: That's not really President Trump. It's just his face mapped on top of Baldwin's. Researchers at USC created this clip and many others of prominent politicians, showing just how easily viewers could be tricked.

UNIDENTIFIED FEMALE: Listen, America, Donald Trump cannot be president.

O'SULLIVAN: Videos like this are known as deepfakes, a new, sophisticated way to create fake videos using artificial intelligence. And their potential damage is catching attention on Capitol Hill. While some technology experts say the threat is exaggerated, it's very real for lawmakers like House Intelligence Committee Chairman Adam Schiff.

REP. ADAM SCHIFF (D-CA): It's a race between the AI to create them and the AI to detect them.

O'SULLIVAN: His committee is scheduled to hold a hearing tomorrow discussing the national security challenges of artificial intelligence, manipulated media and deepfakes.

REP. NANCY PELOSI (D-CA): Sort of visuals that obviously were planned.

O'SULLIVAN: Highlighting this altered video of House Speaker Nancy Pelosi. It's not an example of a deepfake, but it was edited to make it look like she's slurring her words.

SCHIFF: It would be very easy to introduce a doctored video that could have a very sizable impact anonymously at various places around the globe at one time. And whoever introduced it would always have some level of plausible deniability.

O'SULLIVAN: Other technology experts agree.

HANY FARID, PROFESSOR, UNIVERSITY OF CALIFORNIA BERKELEY: We are going to get to a point at some point in the near future where you are not going to visually be able to distinguish between the two. And we sort of want to get out ahead of this before we get to that point.

O'SULLIVAN: Hany Farid's team at UC Berkeley studied hours of footage of political figures' and 2020 presidential candidates' movements when they talk, constructing a system called fingerprinting, aiming to help the government and news organizations separate the real from the unreal.

FARID: By the end of '19 and the lead-up to the Iowa caucuses and the New Hampshire primaries is that we will have most, if not all of the candidates fingerprinted.

(END VIDEOTAPE)
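
The report describes the fingerprinting approach only at a high level. For readers curious about the general idea, here is a minimal, hypothetical sketch: build a statistical profile of a speaker's movement features from genuine footage, then flag clips whose features fall far outside that profile. This is not the UC Berkeley system; the feature counts, threshold, and data below are invented purely for illustration.

```python
# Toy illustration of a "fingerprinting" check for manipulated video.
# NOT the UC Berkeley system: features, threshold, and data are invented.
import numpy as np

def build_fingerprint(genuine_clips: np.ndarray):
    """genuine_clips: shape (n_clips, n_features) of movement features
    measured from footage known to be real. Returns a per-feature profile."""
    return genuine_clips.mean(axis=0), genuine_clips.std(axis=0) + 1e-8

def looks_authentic(clip_features: np.ndarray, fingerprint, z_threshold: float = 3.0) -> bool:
    """Treat a clip as suspicious if any feature is a large outlier
    relative to the speaker's profile."""
    mean, std = fingerprint
    z_scores = np.abs((clip_features - mean) / std)
    return bool(np.all(z_scores < z_threshold))

# Hypothetical usage: 200 real clips, 12 movement features each.
rng = np.random.default_rng(0)
real = rng.normal(loc=0.5, scale=0.1, size=(200, 12))
fp = build_fingerprint(real)
print(looks_authentic(real[0], fp))        # a genuine clip: likely passes
print(looks_authentic(real[0] + 1.5, fp))  # out-of-profile motion: likely flagged
```
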

[06:40:01] CAMEROTA: And, Donie, there's also this video, this deepfake video, of Mark Zuckerberg that purports to show him talking about controlling users' stolen data. So let's watch it for a moment.

(BEGIN VIDEO CLIP)

UNIDENTIFIED MALE: Imagine this for a second. One man, with total control of billions of people's stolen data, all their secrets, their lives, their futures. I owe it all to Specter. Specter has showed me that whoever controls the data controls the future.

(END VIDEO CLIP)

CAMEROTA: So that's not real. But you would never know it. I mean you don't see any edits. You don't see anything weird happening. So who did that, and is Facebook going to take that down?

O'SULLIVAN: So that deepfake video was posted online in the last few days; we learned about it last night. It was created by some artists in Europe who really want to challenge Facebook's position and make them think hard about, you know, what they do with this type of video.

We saw just last month how Facebook refused to remove that video of Nancy Pelosi. And Facebook has been asked, you know, if a video like this were to emerge of Mark Zuckerberg or one of your employees, would you treat it the same way? Facebook, as of last night, is telling us that they are going to keep that video up. But I guess it's easier for them to say that in this instance, in that this video was created to be a fake. If it were a video that was truly tricking and fooling people, they might have a different approach.

BERMAN: They almost have to keep it up at this point, but it's got to burn. It's got to burn a little bit.

What does this all tell us, Donie, about what's going to happen between now and Election Day?

O'SULLIVAN: I mean I think we're just living in a perfect storm of misinformation online. If you think about 2016, there were the fake Facebook posts, there were the Russian trolls. You know, we've always operated on the idea that seeing is believing. If you see a video or hear audio, you rarely question that it's real. If you think about October 2016, when the "Access Hollywood" tape emerged, it was one of the few times that then candidate Trump ever had to apologize. His campaign didn't think to say it's fake. He later called it fake, but at the time he didn't. So these fake videos could not only trick people as they watch them, but they also give plausible deniability. They allow people to question, you know, what is real.

CAMEROTA: This is why people need real news organizations and need to rely on real established news organizations because the Internet is the wild west and you don't know what you're getting unless you come to someplace with a track record, like, for instance, CNN.

Thank you very much, Donie.

BERMAN: All right, a big start for the defending champions of the Women's World Cup, but was it too much by team USA? My answer is, no. I dare you to disagree with me. The "Bleacher Report," next.

(COMMERCIAL BREAK)

[06:46:39] CAMEROTA: There are new developments in the case of Jennifer Dulos, the Connecticut mother of five who disappeared nearly three weeks ago. Investigators have discovered DNA from her estranged husband, Fotis Dulos, inside her home. Dulos was just released on bail and he made quite an exit from the courthouse.

CNN's Jean Casarez joins us now with more.

What happened there?

JEAN CASAREZ, CNN CORRESPONDENT: Well, Alisyn, Fotis Dulos, he posted bond late yesterday, but not before prosecutors tried to get his bail increased from $500,000 to $850,000. They justified this increase by disclosing for the very first time that after Jennifer's disappearance, her blood was found on her kitchen faucet mixed with the DNA of her estranged husband. And he has never lived in that home. They're in the middle of a contentious divorce.

Once he posted bond, Dulos walked out of the courthouse yesterday afternoon in his orange jail jumpsuit very quickly, and he was driven away. Dulos and his girlfriend, Michelle Troconis, have been charged with tampering with or fabricating evidence and hindering prosecution after someone who looks like Dulos was seen on surveillance video, the evening of the day Jennifer went missing, depositing garbage bags into trash receptacles. Law enforcement says inside those bags were bloody clothes and at least one bloody sponge, with blood confirmed by police to be his estranged wife's. Both Dulos and Troconis have pled not guilty.

The defense says, obviously, we are happy to see him head home. Mr. Dulos had no motive to kill Jennifer.

Alisyn.

BERMAN: All right, Jean, we'll take it from here. Thanks so much. Thanks for being on top of this story.

All right, team USA off to a record-setting start at the Women's World Cup. Andy Scholes has more in the "Bleacher Report."

Andy, this was something.

ANDY SCHOLES, CNN SPORTS CORRESPONDENT: It certainly was, John. And, you know, the United States showing absolutely no mercy in their opening game against Thailand. The team setting a World Cup record, men's or women's, scoring 13 goals. It was only 3-0 at halftime, but team USA just pouring it on in the second half, scoring a record ten goals. Alex Morgan etching her name into the record books as well, scoring five goals herself. Seven different players for team USA finding the back of the net.

Next up for team USA, another easy one. They're going to play Chile at noon Eastern on Sunday.

All right, tonight we get one of the best things in all of sports, game seven of the Stanley Cup final. Blues and Bruins going to play in Boston. Now, the Blues have an 11-year-old super fan named Laila Anderson. Now, Laila's battling a rare, life-threatening immune disease. Well, ahead of game seven, Laila got a big surprise.

(BEGIN VIDEO CLIP)

UNIDENTIFIED FEMALE: What if I told you the Blues called and they want you at the game?

LAILA ANDERSON: What? How?

UNIDENTIFIED FEMALE: Doctor said it's OK.

ANDERSON: No, he didn't. Mommy, no, he didn't. Oh, my God! I'm going to Boston?

UNIDENTIFIED FEMALE: Uh-huh.

ANDERSON: Oh, my God. I love you.

(END VIDEO CLIP)

SCHOLES: That right there is pretty awesome. And if you didn't have someone to root for in game seven of the Stanley Cup final before, I know you do, John, but, I'll tell you what, pulling for Laila and the Blues tonight now.

BERMAN: I hear you. Laila, I'm so glad she's getting that experience. Thanks so much, Andy, for that.

All right.

[06:50:01] CAMEROTA: Andy has to warn us before he shows this (INAUDIBLE).

BERMAN: I know. That came out of nowhere. I had no idea.

CAMEROTA: I couldn't get to the tissues fast enough. That was beautiful.

OK, you have to stick around for this story because he says he became a right-wing extremist after being radicalized on YouTube. And he's here next to tell us how this happened and how to stop it.

(COMMERCIAL BREAK)

CAMEROTA: How does a liberal young man become a follower of the alt-right? You're about to meet a man who says he was radicalized by alt-right figures via their persuasive YouTube videos. He believed their extremist conspiracies for years. But then he somehow de-radicalized himself and he's now working to help others get out.

Joining us now is Caleb Cain.

Caleb, thank you so much for being here to tell your personal story. I think it is so fascinating.

[06:55:03] So just -- just to bring people up to speed, somewhere around, as I understand it, 2015, you were sort of at loose ends. You had dropped out of college. You were looking for direction. I think -- correct me if I'm wrong -- at the time you would describe yourself as liberal. And somehow, online, you found your way to these alt-right videos and websites. And then describe what happened.

CALEB CAIN, RADICALIZED WATCHING ALT-RIGHT VIDEOS: Yes.

Thank you for having me on.

Basically I was depressed during that period of time and looking for an outlet for that depression. And I turned to YouTube. And I found a man by the name of Stefan Molyneux. And his content really helped me in a lot of ways. But what I didn't realize was, I was being given a political philosophy. And over time my political ideas started to change quite drastically, which was pretty, you know, pretty clear to anybody in my life at the time. And really it just took me into a place where I became further isolated from social groups and just started to ostracize a lot of people in my life very unfairly.

CAMEROTA: Let's talk about some of the things that you came to believe because of these videos. You came to believe -- I mean, again, jump in if this is wrong, that other races were inferior to whites, that women, I think -- or at least feminists were overly aggressive and women, I think, were inferior to men. You believed that, I guess, Muslims were trying to take over western civilization or immigrants were.

And how do you explain how you came to absorb those extreme views so much?

CAIN: Well, it was mostly due to the people that I was listening to. The people that I was listening to were selling me a narrative that, you know, cultural Marxists and, you know, immigrants and Muslims and basically liberals were trying to destroy western civilization and install some sort of socialist regime. And it's the type of rhetoric you hear from a lot of these people online the whole way across the political spectrum.

And, really, what it boils down to is it's digital hate politics. And it leads people to radicalization. And, you know, for me, in my opinion, radicalization is a public health crisis, and we really need to fix this public health crisis through education, through fixing people's communities and through providing mental health support -- support I didn't have growing up, and the lack of it is what led me down this path.

CAMEROTA: And I'm going to get to exactly how you pulled yourself out in one second. But, also, when you say it's a public health crisis, obviously we've seen hate crimes spike in the past years. We've also seen the rise of neo-Nazism. We've seen more violence. Were you ever tempted -- were -- did you feel you were veering in that direction towards violence?

CAIN: I don't know if I was ever veering towards violence. I never felt that way. But from what I see, the people that turn to violence are people that feel at the end of their line. They feel that their back is completely against the wall. And these were the people that we saw in the New Zealand shooting and in the Poway (ph) shooting. And these people were deeply ingrained within social groups online. And I think that giving people -- relieving those anxieties that people have, those systemic anxieties that they have, through, you know, their economic situations or their personal lives, it's fixing those situations that's going to keep us from seeing more violence.

CAMEROTA: By the way, we also need to talk about the role that YouTube plays. Once you found one of these videos and YouTube could tell through its algorithms that you liked it, that you were watching it, that you were engaged, it directed you to more and more extreme videos. And they need to take some responsibility for that as well.

CAIN: I think what YouTube needs to do is they need to have very clear terms of service on their website about what's acceptable and what isn't. You know, I am a free speech advocate, but what I saw on the platform was people taking advantage of this algorithm. And the algorithm does not care about what your politics are. It cares about watch time and keeping you on the platform. The AI that they use for this is called Reinforce. And the whole idea is to keep you watching more and more. And extremists come online and take advantage of that.
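
To make the point about the objective concrete, here is a toy sketch of an engagement-only ranker of the kind Cain describes: it scores candidate videos purely by predicted watch time for a given user and recommends the top ones. This is not YouTube's actual Reinforce system; the data, fields, and scoring are invented for illustration.

```python
# Toy engagement-only recommender: ranks solely by expected watch time.
# Invented data and scoring; NOT YouTube's real system.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    avg_watch_minutes: float   # how long similar users kept watching
    extremeness: float         # 0.0 mild .. 1.0 extreme (never consulted by the ranker)

def recommend(candidates: list, user_affinity: dict, k: int = 3) -> list:
    """Score each video by affinity * average watch minutes and return the top k.
    Content (the extremeness field) plays no role in the ranking."""
    def expected_watch_time(v: Video) -> float:
        return user_affinity.get(v.title, 0.1) * v.avg_watch_minutes
    return sorted(candidates, key=expected_watch_time, reverse=True)[:k]

videos = [
    Video("news recap", 4.0, 0.1),
    Video("two-hour polemic", 80.0, 0.9),
    Video("cooking clip", 6.0, 0.0),
]

# Even mild interest in the long polemic puts it on top, because watch time,
# not content, is the only signal being optimized.
print([v.title for v in recommend(videos, {"two-hour polemic": 0.4, "news recap": 0.9})])
```
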

CAMEROTA: So, Caleb, we're almost out of time. What's your message to other people who, as you say, have fallen down the alt-right rabbit hole? How did you pull yourself out?

CAIN: I pulled myself out because I started getting exposed to other ideas. I basically started to educate myself on the problems and the issues. And I also started to reach out and get emotional support from others. And that is what my team and I are trying to do: I have a bunch of volunteers, and we've basically set up digital platforms to try to de-radicalize young people online. And we're doing that through social intervention and compassion-based conversations. And it's really to combat, once again, this whole thing of digital hate politics, which leads to radicalization. And this radicalization is a huge public health crisis that we really need to solve.

[07:00:06] CAMEROTA: Yes. Good for you, Caleb. Good for you.

END