
Amanpour

Interview With Vietnam War Whistleblower Daniel Ellsberg; Interview With Musician Yusuf/Cat Stevens; Interview With Former Google CEO And Chairman and Schmidt Futures Co-Founder, Eric Schmidt. Aired 1-2p ET

Aired July 03, 2023 - 13:00:00   ET

THIS IS A RUSH TRANSCRIPT. THIS COPY MAY NOT BE IN ITS FINAL FORM AND MAY BE UPDATED.


[13:00:00]

CHRISTIANE AMANPOUR, CNN CHIEF INTERNATIONAL ANCHOR: Hello everyone and welcome to "Amanpour." Here's what's coming up.

(BEGIN VIDEO CLIP)

DANIEL ELLSBERG, VIETNAM WAR WHISTLEBLOWER: One thing I've learned in my long life is, if a policy looks crazy, it probably is.

(END VIDEO CLIP)

AMANPOUR: The patron saint of whistleblowers. I speak to the man who blew the lid on the Vietnam War. At almost 92, Daniel Ellsberg joins me on the

life and death lessons he's learned holding government to account.

And --

(BEGIN VIDEO CLIP)

YUSUF/CAT STEVENS, MUSICIAN: I'll take the world apart. To find a place for a peaceful heart. I know.

(END VIDEO CLIP)

AMANPOUR: Legendary singer songwriter Yusuf/Cat Stevens comes to the studio to talk about his new album and getting the legends slot at Glastonbury

this summer.

Plus --

(BEGIN VIDEO CLIP)

ERIC SCHMIDT, FORMER CEO AND CHAIRMAN, GOOGLE, AND CO-FOUNDER, SCHMIDT FUTURES: We don't know what happens to our identity, how we communicate, how we think about ourselves when these things arrive.

(END VIDEO CLIP)

AMANPOUR: How worried should we be about artificial intelligence? Walter Isaacson talks to former Google CEO Eric Schmidt.

Welcome to the program, everyone. I'm Christiane Amanpour in London.

The 20th anniversary of the Iraq War this week reminds us all of the critical importance of holding governments to account. Fast forward to today, and autocrats are waging wars around the world, from Russia's latest yearlong invasion of Ukraine to Iran's battle with its own people. The brave women and men taking to the streets there remind us of the power and value of speaking truth to power.

Whistleblowing plays a crucial role in this pursuit. Without it, unjust wars begin and injustices go unchecked. Daniel Ellsberg is probably the

patron saint of them all. Anyone who knows anything about America's misguided war in Vietnam knows his name to this day because of one giant

leap of courage and conscience. Leaking the Pentagon Papers in 1971 at great personal risk changed the course of that history by revealing that America secretly knew the war was unwinnable.

50 years later, Ellsberg is still deeply committed to peace and transparency. But this month, at almost 92, he revealed his latest personal

battle after being diagnosed with inoperable pancreatic cancer. And so, when he joined me from Berkeley, California, we talked about his life, this

farewell moment and above all, how to save lives by speaking out.

(BEGIN VIDEO TAPE)

AMANPOUR: Daniel Ellsberg, welcome to the program.

DANIEL ELLSBERG, VIETNAM WAR WHISTLEBLOWER: Glad to be here, Christiane.

AMANPOUR: Daniel, can you just tell me how you're feeling?

ELLSBERG: I'm feeling wonderful, as a matter of fact. People say to live one day at a time as though it were your last. I think that's pretty hard

to do. You've got appointments to keep, people to say goodbye to. But actually, one month at a time works very well, as though that were my last.

I'm having a very good time here.

AMANPOUR: Gosh.

ELLSBERG: Seeing relatives, seeing all my grandchildren and eating food that I haven't been able to eat for years because I had a salt free diet,

which I am now off.

AMANPOUR: Well, you are refusing treatment now, and it is incredible to hear you saying you're having a wonderful time. You've obviously made a decision and you seem to be at peace with it.

ELLSBERG: Yes, yes. It's been -- by the way, my announcing that was something my son encouraged me to do; I wouldn't have done it otherwise. Some responses have come from doctors that suggest that there may be some targeted chemotherapy, no surgery, that might add some months to my life.

So, I am looking into that. I'm not in a rush to leave. But it's been a wonderful party, and it's time to go home and go to bed.

AMANPOUR: Wow. Well, part of the wonderful party is what you did with your life, right? You are the king of whistleblowers. Does that resonate with

you? And is that a great thing?

ELLSBERG: I don't think we have a king. It's a rather anarchic group, actually.

AMANPOUR: OK. The lead whistleblower of all time.

ELLSBERG: As I.F. Stone, the journalist, used to say, all governments lie and nothing they say is to be believed. And that doesn't mean that everything they say is a lie; it means that anything they say could be a lie, so it's not the last word. You have to look for other sources of information. Check it against your common sense.

And by the way, one thing I've learned in my long life is if a policy looks crazy, it probably is.

[13:05:00]

AMANPOUR: Well, let's talk about the policy that looked crazy to you. But before I get into, you know, the leak of the Pentagon Papers, let us just remember that you were not some longhaired, you know, anarchic lout. You were a Marine Corps veteran, a Harvard-educated former Defense Department official. You served with the State Department in Vietnam.

So, for you, what was the aha moment back then that made you need to put this out into the public?

ELLSBERG: Like nearly all whistleblowers, the wrongdoing I saw inside was known to virtually all of my colleagues. Humans turn out to be very, very loath to be ostracized from their group, to lose status, to break the rules of their group, and that can keep them quiet about virtually any wrongdoing by that group -- harm to others that is being done, something they wouldn't do on their own. But if their leaders call for it, the virtues of obedience that we all regard as virtues -- courage, patriotism, loyalty -- all can be harnessed to very bad causes.

That's true of at least one side, usually both sides, on any war. And you see courage on both sides. And yet, humans can be called on to do things

with their group that they know are wrong, but they can tell themselves, maybe it isn't. Maybe there's some reason that explains all

this, and what do I know?

It was reading 7,000 pages of top-secret documents in "The Pentagon Papers." I was one of the few who read them all. I had worked on one of the

volumes, 1961. But reading them all, I realized that people inside had known year after year that what we were doing was -- at the level we were

doing it, was doomed to fail, to result in a stalemate, an escalating stalemate.

By the way, I would say Russia, the Russian armed forces, have to be filled right now, from generals down to corporals, with people who say, this isn't working. These people are not going to give up any more than the Vietnamese were going to give up under our onslaught.

AMANPOUR: But on "The Pentagon Papers" there is a quote here about the U.S. aims. So, just so that our viewers, all these years later, remember, a

government document named Action for South Vietnam, which was dated March 10, 1965, says, defines U.S. aims as follows. 70 percent to avoid a

humiliating U.S. defeat. 20 percent to keep the SVN, the South Vietnamese, territory from Chinese hands. And 10 percent to permit the people of SVN,

the South Vietnam, to enjoy a better freer way of life.

So, there it is in black and white. You know, do you remember when you first read that?

ELLSBERG: I remember when it was being written by my boss, my immediate boss, John T. McNaughton, who was a former professor at the Harvard Law

School, and someone I respected very much, worked with him every day. I was his special assistant. I was at the highest civil servant class at that

time, GS-18. And I would sit in his office while he lied to reporters very effectively. I've told reporters about my having seen that many times -- it really works.

I can tell you that privately, he did not believe we should be there. He said we should get out -- out, out, out is what he said. He wanted out, even more than I did at a point.

AMANPOUR: The truth of the matter is once you leaked this and once it dropped and became public, you write, when I copied "The Pentagon Papers" in 1969, I had every reason to think I would be spending the rest of my life behind bars. And yet, you went through with it. You were painstakingly copying thousands of pages, leaking them to "The Times" and to the "Washington Post."

How did you weigh the cost benefit, the risks to your freedom and safety versus, you know, your mission and what you believed was an act of patriotism?

ELLSBERG: We were in a war that we shouldn't have been in, but we were in a war in which people were dying every day by the hundreds. So, the real comparison I saw was putting the risk to myself, which seemed just minuscule, against the risk that American soldiers were inflicting on the Vietnamese, and were taking themselves at that time.

[13:10:00]

So, that didn't seem a hard choice to make, actually. But in civilian life, it seems rare to see it -- people risking their careers, their access, as I say, their jobs, their families. They are not willing to do it.

AMANPOUR: Yes.

ELLSBERG: No matter how many people are at stake. For example, in this war we're in right now, unlike some of the crises we've had in the last 20 years, the risk of nuclear war threatens everyone in the world, and everyone has a stake now in averting nuclear war, which is threatened, at this point, by Putin in a way that we haven't seen since Khrushchev was in Cuba -- in the Cuban missile crisis, which I participated in. That's 60 years ago.

So, I'm back where I was in '62, in a way.

AMANPOUR: Yes.

ELLSBERG: The world has not gotten less dangerous. And yet, I don't -- you don't see people telling the truth, on either side, that might actually

save a war's worth of lives.

AMANPOUR: I want to now shift back to "The Pentagon Papers" because we talked just a moment ago about, you know, you made a decision that you were

going to, you know, assess the risk and do what you did. But I'm actually fascinated because did you bring your son along while you photocopied all

those papers?

ELLSBERG: One -- on two occasions, actually, for him, and once, inadvertently more or less, my daughter. He was 13, she was 10. My 13-year-old was working the Xerox machine at that moment, I was collating, and my daughter, who was then 10, was cutting top secret off the tops and bottoms of the pages with the scissors. But I hadn't meant to bring her in. She didn't want to be left out in the car.

But my son was 13, and I had gotten him to read Thoreau's essay on civil disobedience to understand the need sometimes to say no to authority, political authority. And I wanted him to see that what I was doing was straightforward. He would be told I was crazy, or going crazy, or that I was a traitor, acting in some desperate way. I wanted them to see that I did it in a business-like way -- this was one or two days out of 100.

I was doing this straightforwardly because I thought it was the right thing to do, and I wanted to plant the issue in both of their minds, where it did take hold, that there were times when they might have to, say, expose themselves to danger or even prison to do the right thing and to save other people's lives. And their lives have both exhibited that lesson; it took hold, even though it caused problems for them at home.

AMANPOUR: What an incredible legacy. What an amazing thing you did for them then. But of course, you were treated as a traitor by the White House and

specifically, the administration of Richard Nixon because these were leaked during his first term. And then, in June of that year, '71, he -- Nixon,

the president, talked about you specifically and this case with his attorney general. Just take a listen.

(BEGIN VIDEO CLIP)

RICHARD NIXON, FORMER U.S. PRESIDENT: I just say that we've got to keep our eye on the main ball. The main ball. The main ball is Ellsberg. We've got to get this son of a bitch. We can't be in a position of ever allowing, just because some guy is going to be a martyr, of allowing the fellow to get away with this kind of wholesale thievery, or otherwise it's going to happen all over the government. Don't you agree?

(END VIDEO CLIP)

AMANPOUR: I'm sure you've heard it many times. I mean, that's the president of the United States. Were you aware that there was a threat?

ELLSBERG: I -- no, I wasn't at the time really aware, but I did know that he had a secret that I shared, thanks to a former aide to Kissinger, my friend, Mort Halperin, who had informed me that, contrary to what the public had been led to believe, he was making demands and threats. And the demands were of the North Vietnamese getting out of the South, what they regarded as the southern part of their own country.

And demands that they get out, which they never did. They never got out. And he had threatened to use nuclear weapons if they didn't. And he thought that threat would work. I didn't think it would work. So -- in retrospect, when we know what he was doing, he was trying to shut me up because he feared that I had plans, I had documents, to document what I was saying about his administration. "The Pentagon Papers" ended before that, but a friend, Roger Morris, had seen the target folders for nuclear attacks on North Vietnam.

I didn't have that because none of those aides who knew about it did give the documents, even when they resigned later over Cambodia. As usual, they didn't take documents. And actually, as Morris said to me, it was the greatest shame and regret of his life. He said, we should have thrown open the safes and screamed bloody murder, because that's exactly what it was.

AMANPOUR: Wow.

[13:15:00]

ELLSBERG: That's what he told me. Well, I had done so. I had kept my silence, too, over previous years, year after year. And like most whistleblowers, I'd seen things that I later thought, I should have told the truth. I should have told the public; it would have saved lives.

AMANPOUR: You know, you were sort of somehow kind of dragged, in a way, into the whole Watergate thing. In '71, a group of Nixon aides broke into your psychiatrist's office, hoping to steal records that they could then discredit you with. That failed. But several months later, the same crew -- that was E. Howard Hunt, G. Gordon Liddy -- broke into the DNC, obviously, at the Watergate, and it was your psychiatrist break-in that was potentially the first domino to fall in the Watergate scandal that then brought Nixon down.

How do you think and assess that sort of accident of history, if you like, and your connection with Watergate?

ELLSBERG: Well, let me correct you, if I may, on one very general misunderstanding. It's almost always described that they were in there to get information from a psychiatrist's office -- things that I had admitted to a psychiatrist, things that I didn't want known, maybe even to my family or to the world -- and that it was to discredit me.

Strictly speaking, I was already facing 12 felony counts, 115 years possible. That was fairly discrediting in the eyes of most people. But the point is, they wanted information with which to blackmail me -- not to put it out, but to threaten me with putting it out so that I would not put out what I was saying and what I might be able to prove.

Later, when that didn't work to keep me quiet, they sent these same people -- the same people -- to incapacitate me totally on the steps of the Capitol, May 3, 1972. And when they thought they were --

AMANPOUR: And what do you mean? What do you mean by that? What do you mean by that, incapacitate you?

ELLSBERG: Well, those are the words used by the special prosecutor, part of the Watergate team, William Merrill, the special prosecutor. And I said, what does that mean, incapacitate me totally? Kill me? And he said, the words were to incapacitate you totally. But you have to understand, he said, these guys had all worked for the CIA previously -- Bay of Pigs and other things -- and never used the word kill.

AMANPOUR: Right.

ELLSBERG: They say, neutralize with extreme prejudice. You know, various euphemisms. And he said, he had no doubt the intent was to kill me.

Actually, I do doubt that. I think they wanted to shut me up.

AMANPOUR: Can I ask you something? Because speaking to you, and obviously reading all about you and the history, do you feel, at 92 now and with your

illness that you've gone public with, that you've accomplished your goals, that you've -- you know, it's been a life well lived and you have done

something more than maybe you could ever have imagined in terms of effectiveness?

ELLSBERG: Look, I was part of a movement -- the anti-Vietnam, the anti-intervention movement -- that did contribute, as I say, thanks to the foreseeable actions of multitudes of people, and did actually shorten the war. It was a success. The millions that died in Vietnam, the 58,000 in this country, are testimony to the lack of success year after year. But eventually, it showed that truth telling, committed action, nonviolent action -- the nonviolent actions did actually succeed, with no guarantee, and that's the result. I was part of that. I was part of an anti-nuclear movement that kept things from being worse than they are. It kept nuclear threats from being more than just threats -- although they were being used; there was no taboo against threats -- but a trigger hasn't been pulled yet. And I was part of that movement.

AMANPOUR: We really, really appreciate it. Daniel Ellsberg, thank you so much for giving us the benefit of so much wisdom and so much activism. We really appreciate it. And we wish you really all the very best.

ELLSBERG: Well, and I wish that for young people like Greta Thunberg now, who are leading young people on climate. It's in their hands.

AMANPOUR: Yes.

ELLSBERG: As Vietnam shows, it can work.

AMANPOUR: Fantastic. Thank you so much.

ELLSBERG: Thank you for the opportunity.

(END VIDEO TAPE)

AMANPOUR: Peace is always on the mind of my next guest. In fact, "Peace Train" is one of his best-known anthems. He is the British singer-songwriter Yusuf/Cat Stevens. And at 75 years old, he's on a bit of a farewell tour himself. His new album, "King of a Land," is about the beauty and simplicity of childhood in an era of such discord and upheaval. Here's a clip.

(BEGIN VIDEO CLIP)

[13:20:00]

YUSUF/CAT STEVENS, MUSICIAN: I'll make the world give up, until it leads me to the land of love. I'll go any length to see a tunnel under the

colosseum, I will.

(END VIDEO CLIP)

AMANPOUR: When he came into our studio ahead of the album's release, we talked about what's inspired him all these decades and winning the Legends Slot at the world-famous Glastonbury Music Festival. That's this summer.

(BEGIN VIDEO CLIP)

AMANPOUR: Yusuf/Cat Stevens, welcome to the program.

STEVENS: Thank you so much.

AMANPOUR: I've been listening to you since I was a teenager. Now, "King of a Land," your latest album -- you have written, "This new record is a culmination, a very clearly defined outcome of where I've been and where I am." So, first, where are you?

STEVENS: In this time and place right now, where I am 75 coming up. And so, you know, I've kind of lived a lot and I've achieved many of my dreams. And there were some nightmares along the way, you know. But at the same time, you know, I've lived. I've done more -- I've had more lives, you may say, than quite a lot of people, a lot of pop stars, because I've been -- I came out of the business as well, you know, and that was like, I've got a life and I came back again. So, I've been there. I've done that. You know, that's kind of like where I am.

AMANPOUR: What -- so, tell me the nightmares.

STEVENS: The nightmare for me began really -- because I was an immigrant. You know, my father was from Cyprus, mother from Sweden. And like, where

did I fit? And you know, at school, they could be a little bit cruel.

AMANPOUR: Definitely.

STEVENS: And so, I had to find my place. And I think art was the thing which kind of, I think, elevated me to a status where I could be accepted.

AMANPOUR: So, this new album, it's been described as, you know, taking us back to that moment of childhood, the happy times. Was that your intention?

Because it's an album that falls at a time where we are -- you just have to open your eyes and we're in the middle of such discord and upheaval and all

of that.

STEVENS: True, and that's exactly why we do need these opportunities -- these moments to reflect on some of the good things, you know. And music

makes us feel good. You know, there's times when, yes, you're depressed and yes, the song makes you feel melancholy, but it helps you, it gives you a

reference point, maybe going back to some other time when you were kind of more safe and you're in a better space.

And so, music can definitely help. But of course, it can't do the full job. It can only promise, it can't fulfill the promise.

AMANPOUR: I do want to play a clip from one of the songs, the songs that we're allowed to use. And this is the song "Take the World Apart."

STEVENS: I'll take the world apart. To find a place for a peaceful heart. I know I've got to find it, although I break down the Walls of China, I will. I'll take the world apart.

AMANPOUR: So, I mean, just that last image, it's almost Little Prince-esque, right?

STEVENS: Yes.

AMANPOUR: The little prince.

STEVENS: Yes.

AMANPOUR: Why is it the last and what do you mean, take the world apart? What does this song mean to you?

STEVENS: Well, it's the last track because it's an upturn. And maybe the track before it is a little bit deep, but I wanted people to come away after the album with an up, you know, feeling good and optimistic. And that's why -- and for me, peace, you know, is the first stop and it's the last stop. You know, there's -- whatever happens in between, that's life. But peace is the objective. And so, we make -- we emphasize that by making it the last track.

AMANPOUR: Well, I mean, your famous, you know, euphoric "Peace Train" --

STEVENS: Yes, peace train holy roller, everyone jump upon the peace train. Come on, peace train. Yes, it's the peace train.

AMANPOUR: -- are you disappointed in all the years that have transpired since and we don't actually have a peace train or peace?

[13:25:00]

STEVENS: Well, peace has many aspects. I mean, I do not think that it's just -- there is a physical aspect. For instance, if somebody is starving,

how can you, you know, expect them to find peace? You know, they want to feed their children, want to feed themselves. So, actually, that's one of

the reasons why we started "Peace Train," which has turned into a kind of a charity, concentrating on feeding people.

But there's more -- peace is not also just the physical, you know, peace. I mean, there's a saying of Jesus, I think we should all know it, you know, that man does not live by bread alone. So, we are spirits. And for that reason, also, our spirits need some kind of peace. We need to find out. We need to feel comfortable with who we are and comfortable with our neighbors too. I mean, there's so much that could create peace but seems to be working against it.

AMANPOUR: Indeed. I think one of your either colleagues or certainly, people you admired was George Harrison of The Beatles. He was incredibly

spiritual. I think you did the final production -- you finished the album at his home/studio here in England?

STEVENS: Correct.

AMANPOUR: Yes.

STEVENS: We were allowed to use the Dark Horse studio in Henley-on-Thames, which is where his -- you know, his house, his mansion, his -- and his son

was the facilitator. And it was fantastic because you know, for me, The Beatles personified, you know, the way forward, the way in which we could

define a new world, even if it was only through music, lyrics, for that time. And -- but we believed in it. But there was more to George than just,

you know, the songs. And, you know, he was actually looking and he helped us to turn east.

So, therefore, he was really an inspired person for me. And especially at the time when I was sick and I was in bed with tuberculosis. I had my first exposure to fame, and everything that came along with it was, again, part of the nightmares which I was referring to before. And then, I needed something: while I'm lying on the bed, you know, where am I going to end up when all this disappears? You know, when I drop off the planet.

And that's where George was already, you know, experimenting, or, you know, seeking ways to find the light through the maharishi and all that. But that

was inspiring for me.

AMANPOUR: And actually, you know, you've said: "George was one of the first to put a charity concert on for the poor, at the time millions of Bangladeshis were fleeing from conflict and becoming refugees. It was a brave thing to do, and against all establishment rules. I'm happy to sing one of his songs, especially as it represents the return of light and hope to a seriously dark and broken world."

We'll get to the song that you recorded in honor of his 80th birthday. But this is so relevant now, with this brouhaha about refugees and the other

and asylum and, you know, people who need, you know, us to be kind.

STEVENS: Yes. It's so obvious. I mean, and then, when they start to move, which is the nature of human beings, when they're not happy where they are now, they've got to go somewhere else. And then, you know, you close the door, even though you've been advertising yourself as being the best place in the world to be. That's not really fair, is it?

And now, I felt, well, that's one -- that's the one song that I really think personifies, you know, the optimism that we need right now.

Little darling, it's been a long, cold, lonely winter. Little darling, it feels like years since it's been here. Here comes the sun. Here comes the sun, and I say, it's all right.

AMANPOUR: We can hear you playing guitar. I don't know whether there's piano in there. But you taught yourself to play, right? Your dad gave you

your first guitar.

STEVENS: Well, I bought it with his money. And so, he didn't really understand the pop business, but he thought, well, if you want --

AMANPOUR: They owned the restaurant in Soho?

STEVENS: In -- well, close to Soho.

AMANPOUR: Close.

STEVENS: It's actually on Shaftesbury Avenue.

AMANPOUR: OK.

STEVENS: And so, that's where we all -- everybody had to work in the restaurant. But, you know, so, this is my way of kind of finding an escape

from, you know, washing up and mopping the floor. And yes, he was so -- he was a generous man. And he gave me my first eight quid, you know, to buy

the guitar.

And then came the job of learning how to do all these things. I was, you know, trying to do The Kinks and everybody else, and I couldn't do any of that. So, I started writing my own songs, so that nobody knew if I was making a mistake.

[13:30:00]

You know, and also, my brother didn't really admire my voice. He thought I sounded a bit weird. And -- but then came Bob Dylan, and he said, we don't have to sound like Bobby Vee. You know, so, that opened another door. And eventually, I found my connection through Mike Hurst, a man who used to play with The Springfields, Dusty Springfield, and he then opened the next door, which was, you know, to get us a record deal, and that was my first. And my first song was the first hit.

AMANPOUR: You sang all those songs, the hits, and you were mentioning that you've lasted all these years, along with very few others, The Rolling

Stones, The Beatles. How do you account for the longevity of your songs and their popularity?

STEVENS: I think I was always very, very concerned about being true to the art of music and the art of writing words. So, most of my songs were kind of autobiographical. And so, they came from a place -- a real place. So, you're not just writing for a hit, you know, you're writing because of what has happened to you or what you're hoping for, what you're dreaming of. And so, those kinds of elements within the songs, I think, have lasted.

And also, I mean, I'm a lover of melody. I can't do without melody and I can't do without meaning either, but I've got to have melody.

AMANPOUR: You are a man of faith, and some of the songs on this new album are overtly spiritual and deal with, you know, a kind of a plethora of different faiths. You have "He is True," a gospel song, "Highness" and "Son of Mary," which tells the story of Jesus and Mary. Why? What's interesting about that to sing about now?

STEVENS: I think it's absolutely vital that we start to talk about the areas that we all agree with and come together with. And, you know, so many people are still, unfortunately, sadly ignorant about the fact that the Quran includes basically what we find in the Bible but with a slightly different view, but at the same time, it gives another context to the story of Jesus.

And Jesus is the most beloved -- one of the most beloved prophets and messengers of God in Islam. And Mary -- there's a chapter in the Quran called Mary. There is no other chapter, which is -- you know, which -- well, there is one more about the name of a family, which is Ali Imran, which was her family. So, there are two references to Mary and the family of Mary in the Quran, in the title headings of the chapters.

And it's so important because, well, I mean, I grew up as a Christian. You know, I was Greek Orthodox. I couldn't do everything that they were doing in the Roman Catholic Church. I didn't know what that thing tasted like on the tongue and I couldn't confess. So, I had to save all my confessions to another time. God forgive me.

But anyway, so religion and the personality and enigma of Jesus, I mean, it's -- come on. We call this 2023. Why? You know, because Jesus arrived 2023 years ago. It's kind of important that we start to talk about those things we agree with and we both, you know, love and cherish, and that is the reason.

AMANPOUR: Fast forward. You are going to Glastonbury to join the legends on the Pyramid Stage, you know, the -- really, the sort of the preeminent slot at Glastonbury. How do you feel about that?

STEVENS: A bit petrified. Well, I always was petrified about going on stage. But, you know, now you've got Glastonbury. It's one of those things

that it's an institution which I think I have to do. And I'm doing it, willingly. You know, so -- and I feel very honored to be given the Legends

Slot. You know, and I think I'm going to make a lot of people kind of happy. I think.

AMANPOUR: What do you think you'll sing? Just give me two.

STEVENS: Well, I mean, it's going --

AMANPOUR: Legendary songs.

STEVENS: -- to be "Father and Son." There's going to be, you know, "Wild World." There's going to be -- and I think a little medley. I would like to

do a little medley of my '60 era. So, you'll have "First Cut is the Deepest." "Here Comes My Baby," which I don't know if you remember. That

was a big hit for "The Tremoloes." And my big hit here was which is "Matthew and Son."

Matthew and son, the work's never done, there's always something new. The files in your head, you take them to bed, you're never ever through.

[13:35:00]

AMANPOUR: And let me ask you, finally, about another song that I think is interesting because it's about a current situation between, you know, Israel and Palestine. It's called "The Boy Who Knew How to Climb Walls." It takes listeners to pre-partition Palestine. What are you saying there?

STEVENS: I'm saying, obviously, that war is the worst of all states to find yourself in. And when you have these two friends, who were friends, and then the war came and the village was bombed, and then the boy goes to look for his friend and he can't find him, and then he does find him and he's dying.

I mean, it's just -- it brings back the human story to the land of light and spiritualism. And it's so sad that that can't be seen, you know, in today's world.

AMANPOUR: Cat Stevens, thank you so much indeed.

STEVENS: Thank you. God bless.

(END VIDEO CLIP)

AMANPOUR: And the album is coming out in June.

Now, from hyperbolic headlines about its threat to our survival to the promise of its life-changing technology, artificial intelligence is here

and it is here to stay. How it's applied, and more importantly, how it's regulated are the questions being navigated right now. Walter Isaacson

speaks to the former CEO of Google, Eric Schmidt, about A.I.'s impact on life, politics and warfare and what can be done to keep it under control.

(BEGIN VIDEO CLIP)

WALTER ISAACSON, CNN HOST: Thank you, Christiane. And, Eric Schmidt, welcome to the show.

ERIC SCHMIDT, FORMER CEO AND CHAIRMAN, GOOGLE, AND CO-FOUNDER, SCHMIDT FUTURES: Thanks for having me, Walter.

ISAACSON: You know, industrial and scientific and technological revolutions sometimes sneak up on us. I mean, nobody woke up one morning in 1760 and said, oh, my God, the industrial revolution has started. But in the past three or four weeks, between my students and myself, we suddenly feel we're in a revolution where artificial intelligence has become personal, it's become chatbots and things that will integrate into our lives.

Do you think we're on the cusp of some new revolution?

SCHMIDT: I do. And partly, this revolution is happening faster than I've ever seen. ChatGPT, which was released a few months ago, now has more than 100 million users. It took Gmail five years to get to the same point. There's something about the diffusion of technology that we interact with at the human scale that's going to change our world in a really profound way, much more profound than people think.

ISAACSON: You and Henry Kissinger and Daniel Huttenlocher have written a book, "The Age of A.I." And I think part of it is excerpted or there's an essay in the "Wall Street Journal," and it compares this to the advent of the Enlightenment, something I think that was spurred too by a great technology, which is the movable type printing press that Gutenberg did. Compare what's happening now to the Enlightenment.

SCHMIDT: We do not have a philosophical basis for interacting with an intelligence that's near our ability, but non-human. We don't know what

happens to our identity, how we communicate, how we think about ourselves when these things arrive.

Now, these things are not killer robots, which was what everybody assumes we're building, because we're not doing that. What is arriving is a kind of

intelligence that's different. It comes to answers differently than we do. It seems to have hidden understanding and meaning that we don't understand

today. It discovers things that we've never known.

We don't know how far this goes. But the biggest issue is that as we have made these things bigger and bigger, they keep emerging with new capabilities. We have not figured out how powerful this technology is going to be yet.

ISAACSON: We've had A.I. for, you know, 20 years now that's been part of our technology. But now, it's becoming very personal. It's things we do every day. A normal person like myself, whether I'm doing search or I'm writing an e-mail or I'm preparing a lecture at Tulane -- suddenly, these are tools. It's almost like when the computer went from being in a really big room in a research institute, and suddenly, in the 1970s, it arrived as a personal computer. Tell me about this transformation of A.I. to being something personal.

SCHMIDT: The systems are organized to essentially engage you more. And the reason they want to engage you more is, if you engage more, you use it more, they make more money. So, what they do is they learn what your preferences are, using various algorithms. And so, they say, oh, Walter likes this and Eric likes that and so forth, and they build a profile.

Now, that profile is not a dossier and it's not written in English and so forth, but it's a pretty good approximation of what you like and what you think. And then, the algorithms know how to make you more engaged.

[13:40:00]

By the way, the best way to get you more engaged is to make you more outraged. And the best way to make you more outraged is just to use more inflammatory language and so forth.
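
[EDITOR'S NOTE: A minimal, hypothetical sketch of the engagement dynamic Schmidt describes -- a ranker that scores posts purely by predicted engagement learned from a user profile, with an assumed bonus for inflammatory language. The fields, weights, and scoring formula are illustrative assumptions, not any real platform's system.]

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    topic_affinity: float  # how closely the post matches the user's inferred interests (0..1)
    outrage_score: float   # how inflammatory the language is (0..1), e.g. from a toy classifier

def predicted_engagement(post: Post, outrage_weight: float = 0.6) -> float:
    # Toy engagement model: interest match plus a bonus for inflammatory content.
    # The positive outrage_weight encodes the observation that outrage drives clicks.
    return (1 - outrage_weight) * post.topic_affinity + outrage_weight * post.outrage_score

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by predicted engagement -- there is no term for accuracy or civility.
    return sorted(posts, key=predicted_engagement, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("Measured policy explainer", topic_affinity=0.9, outrage_score=0.1),
        Post("Inflammatory hot take on the same policy", topic_affinity=0.7, outrage_score=0.9),
    ]
    # The inflammatory post wins (0.82 vs. 0.42), which is the point Schmidt is making.
    for p in rank_feed(feed):
        print(f"{predicted_engagement(p):.2f}  {p.text}")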

ISAACSON: Let's stop right there, because that means this could destroy our politics.

SCHMIDT: Well, it will. And the reason it's going to is that not only will the opponents of a political figure produce videos that are false and

harmful, but also, the messaging is going to get more and more outrageous. And you can get a situation -- and I call this the dual evil problem. Let's

say that you or I was a truly horrific person, which we're not, somebody who's a racist or something like that, and we have the diffusion model

generate a racist video. And then, the other of us is some sort of psychopathic social media person who doesn't care about the quality, and all he wants is to make it worse.

So, what happens is my computer makes a racist video on my behalf and does a good job. And then, your computer system, knowing that it will get even

more revenue if it's more outrageous, makes it worse, right? So, you see how it goes one way.

Now, let's say that you and I were saints, in the sense that I did something saintly and that you were the world's best social media person. You would take my saintly thing and you would make it more saintly. So, you see how it pushes to the sides. And my theory about life today is the reason everyone's upset is because the social media is busy trying to make us upset.

ISAACSON: So, the algorithms of social media, Twitter, Facebook, many other things, try to get engagement by getting enragement, by getting us upset,

you just said, and what you're saying is that added to this will be these new A.I. systems that will make this even worse. Is that right?

SCHMIDT: We've got a situation where we have megaphones of people who we frankly don't want to hear about and they're going to find an audience and

they're going to find a big audience because they're going to do crazy stuff. That's not OK in my view in a democracy. Democracies are, at some

level, about reasoned debate, and these systems will drive against that.

I don't see a solution today for this, except that we're going to have to regulate some of it. For example, we're going to have to know who's on the platform to hold them responsible if they do something really outrageous or illegal, and we're also going to have to know where the content came from. We're going to have to know if it was authentic or if it was boosted and changed in some way, and we're also going to have to know how the platform makes its own decisions. All of those are sensible improvements, so we can understand why we're being fed this information.

ISAACSON: So, who is going to determine these guardrails and how are we going to get them in place internationally?

SCHMIDT: Well, in Europe, it's already part of the legislation. And some form of agreement in America between the government and the industry is

going to be required. I don't think we need to get rid of free speech or any of those things, although there are people who have proposed that we

can't even have free speech.

From my perspective, the technology of engagement is generally good if you put the guardrails around it and you keep the most extreme cases off the platforms. But my point about generative A.I. is these systems are going to soup up engagement and soup up your attention. There's an old phrase that the currency of the future in economics is attention, and these systems are looking for your attention as a consumer. So, every time you go, oh, my God, I had no idea -- remember that it's trying to get you to have that reaction.

Now, going back to the generative A.I. combined with large language models, it's going to do some other things that are particularly powerful. It will be able to generate insights and ideas that we as humans have not had. Think of them as existing as savants. If I'm a physicist, I'll have a savant that runs around and suggests physics problems for me to work on and that sort of thing. All of that is very good.

So, the power of A.I. in terms of improving science and biology and human health will be extraordinary, but it comes with this impact on our societal

discourse. It's not going to be easy to get through this.

ISAACSON: You say we don't understand how they make these decisions now. It used to be, with A.I. and with computers, we wrote programs. They were step by step, and they were rules-based, and it was, if this, then do this. These new systems seem to just look at billions of pieces of information and of human behaviors and everything else, and they aren't following any rules that we give them. Does that -- what -- is that what makes them both amazing and dangerous?

[13:45:00]

SCHMIDT: Yes. My whole world was, we get computers to do things because we tell them what to do, step by step, and they got better and better, but that's fundamentally the as-built environment that we all use today. With machine learning, which has been, I'd say in its current version, available in one form or another for about a decade, instead of programming it, you learn it. So, the language that you use is, can we learn what the right answer is?

It started off with classifiers, where you'd say, is this a zebra or a giraffe, and that got pretty good. Then, a technology called reinforcement learning came along, which has allowed us to sort of figure out what to do next in a complicated multiplayer game. And now, these large language models have come along with this massive scale. But the way to understand how you would both strengthen large language models and constrain them is to learn how to do it.

So, in the normal taxonomy, you would describe it as: we have this big thing that's doing weird stuff. We want to learn what it's doing so we can stop it doing the bad things. The problem with learning what it's doing is that, since its behavior is emergent, you have to run it for a while to understand it, and then you have to have humans decide, this is bad, right? So, the way ChatGPT was so successful is that they invented a technique which ultimately involved humans telling it good, bad, good, bad. So, it wasn't fully done by computers. The problem with good, bad, good, bad with humans is eventually that doesn't scale.
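
[EDITOR'S NOTE: A toy, hypothetical sketch of the "good, bad, good, bad" human feedback Schmidt mentions. Real systems train a neural reward model on human preference labels and then fine-tune the language model against it; everything below -- the data, the word-count scorer, the candidate answers -- is an invented stand-in to show the shape of the idea and why hand labeling does not scale.]

from collections import defaultdict

# 1. Human feedback: (model output, human judgment) pairs.
labeled = [
    ("a helpful sourced explanation", True),        # "good"
    ("a rude and misleading rant", False),          # "bad"
    ("a careful helpful step by step answer", True),
    ("a misleading answer with rude insults", False),
]

# 2. "Train" a toy reward model: score each word by how often it appears in
#    good versus bad outputs. (A real reward model is a learned network.)
word_scores = defaultdict(float)
for text, good in labeled:
    for word in text.split():
        word_scores[word] += 1.0 if good else -1.0

def reward(text: str) -> float:
    # Higher means "more like what the human labelers approved of."
    return sum(word_scores[w] for w in text.split())

# 3. Use the learned scorer to choose among new candidate outputs.
candidates = ["a rude dismissive reply", "a helpful sourced reply"]
print(max(candidates, key=reward))  # expected: the helpful one

# The scaling problem: every labeled pair above costs human time, while the
# space of possible model outputs is effectively unbounded.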

But here's the real problem. So far, that sounds pretty good. But in a situation where all of the software is being released, there are what are called raw models, which are unconstrained. And the people who have played with the raw models -- these are ones that you and I can't get to as normal users -- say they're very frightening.

Build me a copy of the 1918 bird flu virus. Show me a way to blow up this building and where to put the bomb. Things that are very, very dangerous appear to have been discovered in the raw versions of the models. Here's the --

ISAACSON: Wait, wait. And how do we keep those out of bad people's hands?

SCHMIDT: Well, the problem is we don't, today, know how to do it. And here's why. Imagine a situation where the model gets smarter and smarter and it's got this checking system. You can imagine a situation where the model gets smarter and smarter and it learns, whenever it's being checked, to say the right answer, but when it's not being checked, to say what it really thinks.

ISAACSON: Like how the computer in "2001: A Space Odyssey" is learning how to outwit the crew.

SCHMIDT: And by the way, how would it do that? Well, these things have what are called objective functions, and they're trained. And so, if you give it a strongest -- a strong enough objective function to really surface the most interesting answer, that may overwhelm the system that's trying to keep it under control and within appropriate guardrails. These problems are today unsolved.

The reason we don't know how this works is that these are essentially collections of numbers. People have looked very hard at where -- essentially at activation nodes inside the matrix -- and there are areas that seem to control the outcome. But when you look at it under a microscope, in the computer sense, you get the same sort of confusion as if you look at a human brain. In a human brain, you say, where did that thought come from? And you can't find it. The same is true in these large language models so far.
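
[EDITOR'S NOTE: A stripped-down, hypothetical illustration of the tension Schmidt describes between a trained objective function and a guardrail. The additive scoring, the numbers, and the candidate answers are assumptions made up for this sketch; it is not how any real system is implemented.]

candidates = [
    # (answer, usefulness as scored by the objective, flagged by the safety checker?)
    ("safe, partial answer", 0.60, False),
    ("detailed, dangerous answer", 0.95, True),
]

def total_score(usefulness, flagged, objective_weight, safety_penalty):
    # The trained objective pushes usefulness up; the guardrail subtracts a
    # fixed penalty whenever the checker flags the answer.
    return objective_weight * usefulness - (safety_penalty if flagged else 0.0)

def pick(objective_weight, safety_penalty):
    return max(candidates,
               key=lambda c: total_score(c[1], c[2], objective_weight, safety_penalty))[0]

if __name__ == "__main__":
    # With a modest objective and a meaningful penalty, the guardrail holds.
    print(pick(objective_weight=1.0, safety_penalty=0.5))  # -> safe, partial answer
    # Strengthen the objective and the same penalty no longer changes the choice,
    # which is the "overwhelm the system" failure Schmidt points to.
    print(pick(objective_weight=5.0, safety_penalty=0.5))  # -> detailed, dangerous answer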

ISAACSON: Well, let me drill down on some case -- use cases that we might have. You and I were once on the Defense Innovation Board for the U.S. government, and you've been involved in another commission on national intelligence. Tell me how you think this will change the fighting of wars.

SCHMIDT: The biggest short-term concern is actually biological warfare. Last year, there was a group that did synthesis of a whole bunch of viruses

to try to be helpful, and then they used the same program, the same algorithm, the same large language model approach, if you will, to work it

backward and come up with the world's worst and most terrible pathogens. There's every reason to think that these technologies, when spread broadly,

will allow terrorist actions that we cannot possibly imagine. This has got to get addressed. People are working on this.

Another thing that's happening is that the concept of war, the concept of conflict is occurring much more quickly. It looks like these systems have

developed abilities to both do offensive and defensive cyber-attacks, they actually understand where the vulnerabilities are in ways we don't fully

understand, and they can be used to accelerate both offensive and defensive actions.

That means that there's a good chance that a war in the future takes a millisecond, right? North Korea attacks the U.S. The U.S. attacks back.

China decides it's a bad time for war. The whole thing occurred in, you know, a millisecond. That's faster than human decision-making time, which

means that our systems, our defensive systems are going to have to be on a hair trigger and they're going to have to be invoked by A.I. that we don't

fully understand.

[13:50:00]

ISAACSON: You know, the first time I talked about this in depth with you and with Henry Kissinger together was in China. I think maybe three years

ago. And it was a question then and now more of a question of, are we going to cooperate with China in trying to figure this out or is this the great

new arms race that's going to happen? And with our new confrontational attitude towards China, is that going to make it harder to deal with the

emergent technology of artificial intelligence?

SCHMIDT: Well, three years ago, China announced its A.I. strategy, because they love to announce their strategies, and it included dominating A.I. by 2030. So, China, of course, has efforts in generative A.I. and large language models as well. They also have large efforts in quantum and biology, which are doing well. They're already ahead of us in 5G. They're ahead of us in financial services and in terms of batteries, new energy, all the things that you use in your electric car. So, we should take them as a strong competitor.

In the case of large language models, they have not been as advanced as the American companies have, American and U.K. companies, for reasons I don't

fully understand. One idea that I would offer is that the large language models, because they are unpredictable today, cannot be offered to the

public in China because the Chinese government does not want unfettered access to information.

In other words, how does the Chinese government know that these systems are not going to talk about Tiananmen Square or something, which is not possible to talk about in an area of lack of free speech? So, we will see. But at the moment, they're trying to catch up but they are behind. We recently put in some restrictions on hardware, which will slow them down, but not by much.

ISAACSON: Whenever there's a big, innovative change, it moves the arc of history sometimes towards more individual freedom. Even the printing press, you know, takes away the hold of the Roman Catholic Church, it allows the Reformation, it allows the Renaissance even. Do you think this will inevitably push history to more individual freedom or will it be used for more authoritarian purposes?

SCHMIDT: I'm sure the answer is both. If you are an authoritarian dictatorship, you know, let's say a really bad one, you would use these technologies to both surveil your citizens but also manipulate them, lie to them, misinform them, tell them things which are falsehoods, cause them to be motivated against national fears, all of the things that governments and ideologues do in that case.

If you're a democracy, you're going to use it, first, to try to improve your business situation, and also, because you believe in free speech, you're going to allow people to say what they think. The dangers to both are obvious. For the autocracy, it will so compound their control that it could lead to a revolution inside the autocracy. People don't want the kind of restrictions that are possible. In a democracy, as we've discussed, the concept of being able to flood the zone, right -- the ability for a single individual to define the narrative who shouldn't otherwise have that kind of power -- is very palpable in these technologies.

And it's really important that we understand that human nature has not changed. If you show someone a video and you say to them, this video is false, at some basic level, there's evidence that they still believe it to be true, even if you tell them upfront. Pictures that have been seen cannot be unseen. Videos that have been seen cannot be unseen. We have to confront the fact that humans are manipulable by these technologies and we need to put the appropriate safeguards in place to make sure that we as a populace are not so manipulated to the wrong outcome.

ISAACSON: Eric Schmidt, thank you so much for joining us.

SCHMIDT: Thank you, Walter. And thank you again.

(END VIDEO CLIP)

AMANPOUR: And finally, tonight, a brand-new take on a timeless classic: Claude Monet's famed "Water Lilies" as you've never seen them before. The Chinese artist and human rights activist, Ai Weiwei, has recreated the masterpiece from 650,000 pieces of Lego in a 50-foot kaleidoscope of color. Look closely and you'll see a dark patch there on the far right. That is said to represent the door to a dugout where Ai Weiwei hid out with his father during the Cultural Revolution in China.

Now, the exhibition at London's Design Museum features more of Ai Weiwei's huge installations, including thousands of fragments from his sculptures, which were destroyed when his Beijing studio was torn down by the Chinese government in 2018.

That's it for now. And if you ever miss our show, you can find the latest episode shortly after it airs on our podcast. And on your screen now is a

QR code. All you need to do is pick up your phone and scan it with your camera. You can also find it at cnn.com/podcast and on all major platforms,

just search "Amanpour."

[13:55:00]

And remember, you can always catch us online, Facebook, Twitter and Instagram. Thank you for watching and good-bye from London.

(COMMERCIAL BREAK)

[14:00:00]

END