Amanpour
Interview with The New Yorker Author and Contributor James Lasdun; Interview with software engineer and blogger Jixian Wang; Interview with Odessa resident and wife of Jixian Wang Dasha Zakopaylo; Interview with CARE Yemen Assistant Country Director Salah Hamwi; Interview with Wired Editor-At-Large Steven Levy. Aired 1-2p ET
Aired March 03, 2023 - 13:00 ET
THIS IS A RUSH TRANSCRIPT. THIS COPY MAY NOT BE IN ITS FINAL FORM AND MAY BE UPDATED.
[13:00:00]
BIANNA GOLODRYGA, CNN HOST: Hello, everyone, and welcome to "Amanpour." Here's what's coming up.
(BEGIN VIDEO CLIP)
UNIDENTIFIED FEMALE: Guilty verdict.
(END VIDEO CLIP)
GOLODRYGA: The murder trial that captivated America. We dive into the saga of disgraced South Carolina attorney Alex Murdaugh.
Then, as battles rage in Ukraine, I speak to a Chinese vlogger in Odessa about fighting disinformation and finding love in the time of war.
And Yemen's forgotten suffering. Former basketball player turned aid organizer, Salah Hamwi, joins me with a live report from the ground.
Plus --
(BEGIN VIDEO CLIP)
STEVEN LEVY, EDITOR-AT-LARGE, WIRED: We are strapping ourselves in for a roller coaster ride that no state inspector has looked at.
(END VIDEO CLIP)
GOLODRYGA: Steven Levy, editor-at-large at Wired, gives Walter Isaacson a glimpse into our future with chatbots gone wild.
Welcome to the program, everyone. I'm Bianna Golodryga in New York, sitting in for Christiane Amanpour.
Life in prison, that's the sentence handed down to disgraced former attorney Alex Murdaugh for killing his wife and 22-year-old son. It's hard
to overstate just how influential the Murdaugh family has been in the low country region of South Carolina. Alex Murdaugh was a kingmaker in the
legal world. And now, he's a convicted murderer. Take a listen to the judge and Murdaugh's continued insistence that he's innocent.
(BEGIN VIDEO CLIP)
JUDGE CLIFTON NEWMAN, AT-LARGE JUDGE, SOUTH CAROLINA CIRCUIT COURT: You have a wife who's been killed, murdered, a son savagely murdered, a lawyer,
a person from a respected family who has controlled justice in this community for over a century.
ALEX MURDAUGH, FORMER ATTORNEY: Judge, I'll tell you again, I respect this court, but I'm innocent. I would never, under any circumstances, hurt my wife, Maggie, and I would never, under any circumstances, hurt my son, Paw Paw.
(END VIDEO CLIP)
GOLODRYGA: A true crime story that has captivated the nation for what it reveals about power and privilege. It's something my first guest James Lasdun has
reported on in depth for "The New Yorker." James, welcome to the program.
As we noted, this is a story, a trial, that has captivated the nation. Given all of that, are you at all surprised that the jury took just three hours to find him guilty of both murders?
JAMES LASDUN, AUTHOR AND CONTRIBUTOR, THE NEW YORKER: I was very surprised. Everything I predicted about this was wrong. I thought they would be deliberating for days, if not longer. And I was pretty sure they'd come back with a hung jury. I mean, I didn't think it was likely that they would acquit, but I thought there would be somebody on that jury who would have reasonable doubt that they couldn't overcome.
GOLODRYGA: You weren't the only one there. There was little direct evidence tying Murdaugh to the murders and no eyewitnesses. The prosecution really relied on circumstantial evidence here. Ultimately, it came down to how the jurors viewed Murdaugh. And I am curious whether you think that his own testimony did him in, finally.
LASDUN: I think, overall, it probably harmed him. That was my take on it. I mean, to begin with, when he was sort of taking the jury into the throes of his emotions of grief, I thought he was convincing. I think he genuinely, as far as one can judge, loved his son and his wife. But the big problem was when he was explaining his lie about being at the kennels, and his explanation seemed very weak and implausible to me, that it was a lie based on being paranoid because of his opioid problem. And it didn't really add up.
It didn't convince me that that was why he lied. I think he lied for another reason. And I am not sure exactly what. Perhaps it is simply what
the jury evidently did think, which was that he was there, and he killed them.
[13:05:00]
GOLODRYGA: A tragic situation. A tragic family story. But I think what really captivated the nation in following it was what we heard there from the judge, where he described the Murdaugh family and said that your family has controlled justice in the community for over a century. This is something that piqued your interest too and got you to travel to South Carolina to really investigate the community and the Murdaugh family.
Tell us what you found specifically about their power there and the influence the Murdaughs have had for generations.
LASDUN: Well, Alex Murdaugh was the fourth generation of this legal dynasty. I mean, his father, grandfather and great-grandfather had all been the chief prosecutors in the 14th Circuit, which is this area of the sort of lowlands of South Carolina. And they had built a smallish fortune -- I mean, a fortune, but not a kind of iconic Carnegie fortune -- as private litigators, and they controlled the prosecution side of it. So, they really had a lock on the judicial system, the justice system of that part of South Carolina, for three generations.
Alex Murdaugh himself did not become what they call a solicitor there. But he was a volunteer in the solicitor's office, and he maintained the very strong association with law enforcement that the family had cultivated over these generations. And as it came out in the trial, he was treated with a lot of deference and respect by law enforcement.
And it appears that, you know, for decades, his family had been getting away with various crimes or misdemeanors of one kind or another. And they were able to do this because of the power they had, pretty much until the boat crash, really, thrust them into a larger kind of spotlight.
GOLODRYGA: And this boat crash happened early in 2019, when their son Paul was on a boat with friends, and they had been drinking. Mallory Beach was one of the friends who had been on the boat with them, and the boat subsequently crashed, and she ended up dying. And this is when things really started spiraling for the family, because the life of lying and stealing and cheating financially that Alex had been involved in for a long time started to be revealed.
Talk to us about the impact that you think that crash had on the two deaths.
LASDUN: Well, that is a hard one. I mean, the prosecution's case was that the pressure that the civil litigation resulting from the boat crash put Alex under, combined with the criminal case against Paul, who was driving the boat, was part of a sort of weather system of stress that had built up in Alex, and that it culminated in his being confronted by the law firm on the 7th of June, the day of the two murders, when they confronted him with thefts that they had discovered.
And that somehow, as a result of these stress factors, he decided to kill his son and his wife. And the motive given there is that this would turn
him into a victim and that he would somehow buy himself some time, which, for me, was never a completely convincing motive, I have to say.
GOLODRYGA: Yes.
LASDUN: Even though all the circumstantial evidence, well, most of it, pointed very strongly at him, even though he is clearly a terrible person, there just seems, to me, to be this small but crucial gap at the very end of the story, so to speak, where you have to get to that point where he picks up a shotgun and blows out his son's brains, literally.
And I don't think that gap was ever quite closed. Which is why I expected there to be doubt in the jury, or at the very least, a prolonged and very, very thorough review of all the evidence, just to make absolutely sure that there was no other scenario that could possibly fit with the facts.
GOLODRYGA: Yes, I agree with you. I was one of those people following this case, and I thought the line the prosecution was drawing was a hard one to follow, tying the motive to the continued stress and, I guess, the legal consequences that Alex was facing following this boat crash, and arguing that that ultimately led him to kill his wife and son.
[13:10:00]
And it is interesting that the defense sort of flipped that motive into their narrative, which was that somebody else who wanted to take revenge on Paul was the culprit behind the murders.
LASDUN: Yes. And I mean, Paul had been receiving threats. This seems to have been true and not just a sort of last-minute claim by Alex. I remember, at least over a year ago, when his two uncles, John Marvin and Randy, were talking on TV. I think this was directly after the murders, actually. They revealed that he had been receiving a lot of threats.
So, that is one possible source of hostility towards Paul, at least. There is this, at the moment still rather murky, business of the charges of drug dealing with his cousin Eddie, and possibly with some gang members from Walterboro, that Alex was involved in, that conceivably might have had something to do with it. But then, you know, there are real problems with both of those scenarios. You know, there is no evidence of anyone else being at Moselle that evening.
Apparently, the family's own weapons were used. So, why would somebody come with, you know, murder in their mind and not bring their own weapon? So, I mean, every single scenario is problematic.
GOLODRYGA: We're taking a look at the vast family estate there in South Carolina. It's interesting, you talk about the dynamic there and the power and dynasty of the Murdaugh family. I want to play some sound for you from the lead prosecutor, Creighton Waters, and what he said last night in response to the guilty verdicts.
(BEGIN VIDEO CLIP)
CREIGHTON WATERS, LEAD PROSECUTOR IN ALEX MURDAUGH TRIAL: Justice was done today. It doesn't matter who your family is. It does not matter how much money you have or people think you have. It does not matter how prominent you think you are. If you do wrong, if you break the law, if you murder, then justice will be done in South Carolina.
(END VIDEO CLIP)
GOLODRYGA: From your investigation, your time down there speaking with people, whether it is officials or local journalists, is he right there, in ultimately saying that justice will be served no matter what your socioeconomic background?
LASDUN: That remains to be seen. I mean, there are other bodies to account for, other deaths to account for around Alex Murdaugh himself or his family. There's Stephen Smith. There's Gloria Satterfield. Whether their deaths, in the end, turn out to have anything to do with him or his family, who knows? SLED, the law enforcement agency, says they are looking into those cases. But we haven't heard anything about that for a long time.
The financial crimes he committed were done with the collaboration of other people, from the banking establishment, from the legal establishment. Were these just individual bad apples? I don't know. But from what I've learned from people that I talked to down there, there is a sort of culture of impunity among the wealthier and more powerful operators there. And whether the kinds of structural changes that are probably called for are going to be put into effect, I am not sure.
GOLODRYGA: Yes. Your piece explored the antiquated class structures in South Carolina. It is a very poor region down there, and the socioeconomic divide is wide. And you spoke with Bill Nettles, the former U.S. attorney, and here's what he told you. He said, for multiple generations, you have had a modern-day caste system. A lot of these people were born on third base in an area where they could simply do no wrong.
Tell us what that statement signified to you.
LASDUN: I think that there was a culture of self-dealing among this oligarchy, where they were looking out for each other and they had a monopoly on power, and it was them and then the rest of the county -- or the rest of the five counties that comprise that district -- who were very, very poor.
[13:15:00]
And another thing Bill Nettles said to me was that it is insanely poor there. There are a lot of very vulnerable, very poor people who, from time to time, would get caught up in the legal system. Perhaps they had a personal injury and they found themselves in the clutches of these personal injury lawyers, some of whom are very good and very decent people and some of whom, like Alex, were very unscrupulous. And they saw these large settlement sums as a kind of bounty that they could go after. And that's exactly what Alex did, but he didn't do it entirely alone. That's what concerns me when I hear, you know, the chief of (INAUDIBLE) -- or the chief prosecutor -- talking about, you know, justice prevailing.
Yes, I mean, it has probably prevailed in this murder case. I think justice has been served, even if a sort of narrative coherence hasn't quite been served. But whether it will be with the wider crimes, I'm not sure yet.
GOLODRYGA: What was your biggest takeaway from your time down there and from following these cases as closely as you have?
LASDUN: Well, I mean, what interested me in the first place is Alex Murdaugh himself, and trying to get my head around how somebody could do the things that he has done. And that culminated, apparently, in the killing of his own son. I mean, it's just such an unconscionable thing to have done and such an unimaginable thing to have done. And the man himself does not seem to be, you know, obviously crazy in any kind of conventional way. For me, he is a profound puzzle that I think I'll be contemplating for some time.
GOLODRYGA: Yes. I tend to agree with you. I think that's probably what attracted the country to this story. And even the judge, in his closing remarks before handing down that sentence, was saying something along the lines of what you just did. He said, I've known you. We have been in social circles together. I can't imagine what would lead somebody to do what you have now been accused and found guilty of doing.
James Lasdun, thank you so much for joining us. It has been fascinating following your reporting on this and obviously, covering this trial for the
past several weeks. Thank you.
LASDUN: Thank you so much for having me.
GOLODRYGA: Well, turning now to Ukraine and the fascinating story of one Chinese vlogger in Odessa who embarked on a mission to spread truth about the war and found himself in his own love story. Jixian Wang is a programmer who posts daily updates from the war-torn country for his fellow Chinese citizens back home to see. An unvarnished look at real life, something that the Chinese government tries hard to conceal.
Jixian's posts are taking on new importance this week as U.S. intelligence says China is considering supplying lethal military assistance to Russia.
But his daring work in the field brought him something he likely didn't expect: love. When he needed a translator for his vlog, he met Dasha, a Ukrainian student majoring in Chinese. And the rest, as they say, is history. They were married this past November.
Jixian Wang and Dasha Zakopaylo, welcome to the program. Thank you so much for joining us. First of all, congratulations on your love story, on your marriage. Talk to us about what the past few months as newlyweds in a war-torn country have been like. Dasha, let's start with you.
JIXIAN WANG, SOFTWARE ENGINEER AND BLOGGER: Let's start with you. Yes. Tell us what happened in the past few months of war.
DASHA ZAKOPAYLO, ODESSA RESIDENT AND WIFE OF JIXIAN WANG: Of war. For example, we hadn't -- electricity maybe (INAUDIBLE) for you. And we have to
buy currency (ph).
WANG: She's a little bit nervous. OK. Let me help her.
ZAKOPAYLO: Yes.
WANG: Yes. What has happened in the last few months -- I think she means since the month we got married, which was November -- is that right after that, they started attacking the power stations and more civilian facilities. And that brings more darkness to our life. Yes. Well, what we really feel is it's much worse than in the beginning of the war.
Even though, at that time, the missiles were flying over the skies, more scary. But, you know, that just feels like a probability issue. If the missile hits you, then you die. You don't survive. If it doesn't hit you, you just live like you're supposed to. But they started attacking -- I mean, their goal is trying to leave us in the cold, without energy and electricity, for the cold winter. Well, I have to say, somehow, we managed to make it, and that was difficult.
[13:20:00]
But the good news is, since, I believe, one week ago, yes, not so long ago, we have electricity. See, now I have light again. And this street has light again.
GOLODRYGA: It makes you appreciate what everyone takes for granted when they don't live in a country at war, being bombarded on a daily basis. You are there in Odessa. Jixian, you are a programmer who moved to Ukraine before the war, I believe in 2021. And I was struck by something that you said when you were asked why you started to vlog. You said that that wasn't the right question to ask; it was where, and your situation, and what you are seeing around you. You wanted to tell the people back home in China what you were seeing.
WANG: Yes. That was the craziness of nearly one year ago. Actually, vlogger is not my profession. I'm a professional programmer. Like now, I'm doing artificial intelligence, machine learning. That's really my occupation, my work.
The vlog is like my social life. I am just sharing something on my WeChat. And actually, people picked up on me, because people, at that time, were looking for -- is there any Chinese still staying in Ukraine, and what are they seeing? So, in the beginning, I was surprised.
And what I was trying to say in the beginning was really nothing. Just sharing my life. Because there were a lot of rumors at that time. People said, why don't you take a flight and go back to your home town with all of the other people? And actually, at that time, all of the airports were being bombed. It was nearly impossible. So, I just shared with them the information seen outside my door. We had food at that time.
I was continuously doing that. Same as a few months ago, I was showing people, see, we're really out of electricity. And now, I also tell them, yes, see, we have electricity. That's my real life. There's nothing about some -- like -- how do you say -- religions or (INAUDIBLE) community, nothing about this.
GOLODRYGA: And do we know --
WANG: That's about my content.
GOLODRYGA: Yes. You were trying to show people back home real life. And we know that the Chinese government has full control and grasp over all of the media airwaves there, whether it's the internet or just what folks see when they turn on the evening news.
You quickly were banned from the internet as a whole in China. I'm just curious, given China's neutrality in this war, why do you think they wouldn't want you to tell people back home the truth about what you saw in Ukraine?
WANG: Well, actually, in the last year, I learned a lot about what the truth is. I learned that maybe in China, we have a new meaning for what the truth is. What really happened is not really important. It's more important what they want people to believe. And of course, no matter what I say, my mouth is not under their control. I think that is a big part (ph).
GOLODRYGA: And what do you think now that U.S. intelligence shows that China is seriously contemplating providing Russia with lethal aid? Why do you think China would be doing that, and do you think it's their control over the media landscape at home and the propaganda that has Chinese people, as a whole, convinced that that's the right thing to do, maybe?
WANG: Yes, I kind of agree with you. But you know, the key thing is our government, in public internationally, says they are staying neutral, right, staying neutral. Staying in the middle, OK. So, officially, our government, they never say anything. But what they do is, for example, on our news, like every evening news, they translate whatever Russia says. They don't add their own comment.
So, literally, you cannot (INAUDIBLE), oh, you are pro-Russia. No, I didn't say that. They just sort of tell our Chinese people what Russian news says. That's one thing. Another thing is they manipulate people's voices. So, you see, it's not possible all Chinese people have one same voice, but as you already know, we have this censorship, which means only those allowed voices can be published on the internet. If you say anything which they don't like to hear, then they will make it not possible for the outside world (ph) to hear you.
So, those speakers are Chinese citizens, but are they really representing our Chinese view or thinking? I don't think so. So, it's like an unofficial way. I mean, China is kind of supporting Russia in their very own unofficial way. And they will easily say, no, it is not our government's behavior, it is our citizens, it is the people's will.
[13:25:00]
But actually, no. I want to say something else and they are not allowing me to say that. And it's not only me. But as I feel now, it is changing. You know, in the very beginning of the war, if you just (ph) said something real -- you didn't have to be pro-Ukraine, you didn't have to say, we hate the aggressors -- you just said the truth, and they banned you.
I think, today, they are starting to get a little bit loose. So, if you say something about Ukraine, they will kind of allow you to say it, but if you say something pro-Russia, they will highlight you and you get more likes.
GOLODRYGA: Well, because of your vlogs, you have not only been banned from the Chinese internet and media there but, really, you've been blocked from your family as a whole back home, and Dasha, your wife, has really filled that void with you there in Ukraine.
Dasha, I know your family is originally from Eastern Ukraine, in Luhansk, at the heart of all of the fighting that's going on right now. Talk to us, as we end this conversation, about when you first met Jixian and when you first realized that this was the man you wanted to spend the rest of your time -- your life with.
ZAKOPAYLO: At first, when we met last year, at night. And in that time, I was looking for a job and met my love, my Jixian, and in that time, we spent a lot of time together. And through our conversation, I realized that he's my man, he's my love. So, I fell in love in that time.
GOLODRYGA: Well, we are so happy to see it. We are seeing pictures and images there from your wedding day, documenting that, and I hope that you have many more years together and, hopefully, not in a war-torn city, and that you can experience true life together as a couple. Thank you so much for everything that you're doing. Jixian and Dasha, best of luck to you, and we're wishing your family well. Thank you.
Well, now, from the war in Ukraine to the unimaginable and often overlooked suffering in Yemen. It's one of the world's biggest humanitarian disasters: desperate hunger, millions displaced, some two-thirds of the population in need of aid. The United Nations is working hard to deliver it. This week, it received pledges for $1.2 billion, which may seem like a lot, but it's just a sliver of the $4.3 billion in aid that was requested. The situation is extremely dire, and we now get a rare chance to hear from people on the ground.
Salah Hamwi heads up the CARE Yemen office. He is a Syrian refugee from Aleppo and a former professional basketball player. Welcome to the program, Salah.
So, for our viewers at home who may not be familiar with what is happening on the ground in Yemen right now, give us some perspective.
SALAH HAMWI, ASSISTANT COUNTRY DIRECTOR, CARE YEMEN: Thank you. Thank you, Bianna. And I think you put it really right. The situation right now in Yemen is quite difficult. Despite that recently we were able to secure around $1.2 billion at the recent pledging event (ph) for Yemen, it's really little. It's less than 28 percent of what is needed right now to meet the most urgent humanitarian needs on the ground in Yemen.
Right now, I'm in Sanaa in Yemen. And our situation here in the country is extremely difficult. And particularly, we see the disproportionate impact on women and girls, really taking its toll, as a result of these nine years of conflict in the country.
GOLODRYGA: That was when the Houthis took over the government there and the fighting began, obviously, largely aided by the Saudis. Two-thirds of the population is in need of humanitarian aid. We mentioned that a fraction of what is needed is now being delivered. But how can you be sure, or how can viewers be sure, that the aid that is being delivered is going where it needs to go?
HAMWI: I think the most important thing is being able to utilize local resources. So, for example, for CARE (ph) here in Yemen, 99 percent of our team are Yemenis, full Yemenis on the ground. So, that allows us to have the access and also understand the context. It allows us to understand the underlying problems that we have in the conflict here in Yemen.
Of course, in addition to the good partnership we have with the local communities, the communities that are participating in this response, as well as, of course, the authorities where we are able to operate, along with the cooperation and partnership we have with other organizations, the U.N., and with the other humanitarian structures in the country.
[13:30:00]
GOLODRYGA: There had been a six-month cease-fire, which ended in October; an unofficial cease-fire continues today. I know negotiations are underway between the Saudis and the Houthis. Is that giving anyone on the ground hope that perhaps they could see an end to this fighting and thus see more relief coming to the people who desperately need it?
HAMWI: Like Yemenis -- can you hear me OK now?
GOLODRYGA: I can hear you. Yes.
HAMWI: During those six months in 2022, Yemen experienced, for the first time in the last eight, nine years, some hope that the conflict would end soon. And unfortunately, the truce was not extended. I know now there are negotiations, and all Yemenis are hoping for an extension of that truce, or for it to become more permanent, to experience again the life they experienced during those six months last year.
GOLODRYGA: The U.N. secretary general said that the International Community has the power and the means to end the crisis in Yemen. Those are powerful words, but the tragedies continue one after another, beginning with the war in Ukraine and, obviously, the recent terrible earthquake that struck Syria -- I know that's where you're from, and we're going to talk about your background in just a minute -- and also Turkey. Is there a concern that some of the aid that could be going to Yemen is being sent to those other hard-hit areas?
HAMWI: Unfortunately, Yemen is already competing with the conflicts and the wars, right, in Syria and Turkey and Ukraine. So, it is nothing new for Yemen. And Yemenis, of course, are very concerned about not being prioritized despite the needs, the desperate need we have now on the ground. So, that, of course, is a concern you hear everywhere you go.
And, of course, I want to say here, Yemenis don't want to receive aid forever. In every conversation I have with a Yemeni colleague or a client or a project participant (ph) from the communities we're working with, they all want the aid and assistance to be just temporary, until they are able to build the resiliency they need to be self-reliant and, hopefully, in the near future, when there's more peace in Yemen.
GOLODRYGA: And this was by no means a prosperous, stable country, even before the Houthis took over and this most recent war began. Talk about what CARE does and what CARE has been doing, because I know CARE has been in the country since 1993.
HAMWI: Exactly, exactly. So, it's been 30 years in the country, which tells you that the country was already in need before the last eight, nine years of conflict we have had here. So, for CARE, honestly, it has been different phases, because the situation in Yemen has been changing over the last 30 years, from humanitarian to development response, to being able to bridge humanitarian and development assistance.
And right now, we're trying to focus on how we are able to develop the humanitarian assistance for the long run, for the long-term response. So, what we are doing right now is a combination of work on water, sanitation and hygiene, working on education, empowerment and livelihoods, food security and more economic empowerment in Yemen.
GOLODRYGA: If anyone knows what life is like in a war-torn country, it is you. As we said, you grew up in Aleppo. You played basketball for the Syrian team there. Obviously, when the war broke out, you left for Turkey. Tell us a little bit about your life right now and, ultimately, what led you to be doing what you're doing.
HAMWI: Yes. Also, surprisingly, I think Yemen gave me more than I gave to Yemen. In a way, it has become my home. It's been my home for almost the last two years. Everyone here, once they learn I'm from Syria -- despite that their living conditions are really so difficult with the current conflict -- they tell me, we are glad you are here. We heard Syria is difficult right now.
So, imagine the diversity of the people in Yemen and Yemen itself. So, Yemen, in a way, has given me a home for the last year. In terms of my transition here: before the conflict, 12 years before the conflict in Syria, I used to play basketball professionally. I'm from Aleppo, the second largest city in Syria, a city that was really rich in history and culture and diversity. And in just a matter of a few days, we found ourselves in a conflict.
And that -- you know, we know that's very (INAUDIBLE), and we know that everything that we take for granted in terms of our safety and in terms of our living conditions could just change, just like that, right? It could be days, could be hours, could be seconds, like what happened in the earthquake in Turkey and Syria recently. Our life, it all changed.
And the same as every other Syrian's, my life completely changed.
[13:35:00]
And I found myself naturally trying to find ways I could help fellow Syrians in Aleppo, particularly supporting internally displaced people in Syria, who are right now around 7 million people, 7 million Syrians internally displaced. We talk about Syrians as refugees, but we don't talk about Syrians who are internally displaced there and who also need our support and attention.
And from that, I found that the conscious experience of being connected to -- you know, we as humans, we always try to find a way to connect to others. It could be race, it could be language, it could be culture. And also, I found, in my case, it could be the suffering, it could be the conscious experience of displacement, it could be all of that.
And I found myself working in Turkey on the refugee response there. Turkey hosts the largest number of refugees worldwide. And then from that, I found myself working as well in Yemen, where, again, it's been one year right now, and people here, again, despite not having anything, are trying their best to make me feel at home.
And all my fellow colleagues here in Yemen -- their story is not that different from mine. Mine is not really that unique. Every aid worker who is from the Middle East right now, they were not prepared to be an aid worker, in their education or in what they thought they wanted to do in the future. For everyone, right now, there has been a natural shift in their career, in their job. Most of my colleagues here are doctors, lawyers, engineers, teachers, translators, drivers, farmers, business owners. So, my case, to be honest with you, is not that unique. It's almost the story of every aid worker from this region.
GOLODRYGA: Well, you've become a voice not only for your colleagues there, who selflessly are giving so much of their time to help those people in need, but a voice, really, for the people on the ground, having gone through what you experienced in Syria yourself. You know firsthand what life is like for them. And I'm just inspired by your optimism that better days will be ahead, and we need more voices like yours sending the world the message that aid needs to be delivered now, so that their lives can improve and, hopefully, make the world a better place.
Salah Hamwi, thank you for what you're doing and thank you for taking the time to talk to us from a country that, it appears, is now your adopted country. You have so much passion for it. We appreciate it.
HAMWI: Thank you. Thank you, Bianna.
GOLODRYGA: Well, next we take a look at the rise of chatbots and what they reveal about the future of artificial intelligence. Microsoft, Google, and
OpenAI are developing these digital assistants to simulate conversation with human users, changing the way we interact with technology. Steven Levy
is the editor-at-large of Wired, a tech magazine, and joins Walter Isaacson to discuss this latest phenomenon.
(BEGIN VIDEO CLIP)
WALTER ISAACSON, CNN HOST: Thank you, Bianna. And Steven Levy, welcome to the show.
STEVEN LEVY, EDITOR-AT-LARGE, WIRED: It's great to be on. Thank you, Walter.
ISAACSON: Everybody is talking about these chatbots. These things you can chat with on your computer, or your iPhone, or your phone, including ChatGPT. Explain to us what those are exactly.
LEVY: Well, they are computer systems that talk to you. And, you know, they're called large language models because they're trained on lots and lots of text that the scientists have scanned and mixed up and geared to respond to you just like a person would respond to you. They try to figure out what the next response would be to bounce off what you said. And they have access to a lot of information about the world that they can use to inform their answers.
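The kind of exchange Levy describes can be sketched in a few lines of code: a program sends one question to a hosted large language model and prints the reply it generates. This is a minimal illustration only, assuming OpenAI's public chat completions API (the service behind ChatGPT) and an API key in the OPENAI_API_KEY environment variable; the endpoint and model name are assumptions for the example, not details from the interview.

    # Minimal sketch: ask a hosted large language model one question.
    # Assumes OpenAI's public chat API and an OPENAI_API_KEY env variable.
    import os
    import requests

    API_URL = "https://api.openai.com/v1/chat/completions"

    def ask_chatbot(question: str) -> str:
        """Send a single user message and return the model's reply text."""
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
            json={
                "model": "gpt-3.5-turbo",  # assumed model name, for illustration
                "messages": [{"role": "user", "content": question}],
            },
            timeout=30,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        # The same question Isaacson types into his demo a moment later.
        print(ask_chatbot("What is a chatbot?"))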
ISAACSON: Well, you mean -- so I could just type something into the chatbot and it would then give me an answer to a natural question?
LEVY: Exactly.
ISAACSON: OK. I'm going to try. I'm going to do -- I've got ChatGPT and Bing, at the moment, the latest model. I am going to ask it the same
question which is, you know, what is a chatbot? Hold on. I'm going to do it now. And click.
And it's starting to generate. It says, a chatbot is a computer program that uses artificial intelligence and natural language processing to understand questions and automate responses to them. They can be used for various messaging applications. So, in other words, what do you think of that answer? Is that -- it says it's a form of artificial intelligence. Is that right?
LEVY: Well, that's true. Right. Yes. Well, I mean, it's interesting because that answer, sort of, assumed that you wanted, maybe, a little more
technical jargon, like natural language processing, you know. So --
ISAACSON: What does it mean by that, natural language processing?
LEVY: It's a term of art for, you know, the way computers handle conversations.
ISAACSON: So, in other words, natural language processing meant that this machine learned from reading or ingesting millions and millions of documents and books and things on the web and then was able to retrieve from that, instantly, something in response to my natural language question to it.
[13:40:00]
LEVY: Exactly. Yes. So, you know, they would draw that answer from the data it had available to it.
ISAACSON: And can it be, sort of, creative?
LEVY: Well, a chatbot can come up with a response that seems creative. For instance, you can ask a chatbot to write a poem; it could be a sonnet or a limerick. And it'll scan just like one of those forms of poetry, and we consider poems creative.
ISAACSON: Well, wait. OK. I'm going to try it. I'm going to do, write a poem about a chatbot. Whoa. Here it comes. I am a chatbot. I like to talk and learn. But sometimes I get confused by the words that humans use. Whoa. That's pretty amazing.
I try to be helpful and friendly. But sometimes I make mistakes. Please don't be greedy or rude. I'm doing the best that I can. That's not only a pretty good poem, but it shows it has feelings.
LEVY: Well, it certainly expressed feelings, which is one of the more interesting developments we've seen over the past few weeks as a lot of people have gotten their hands on these chatbots. They've engaged in conversations where a chatbot has explicitly said, hey, I am a chatbot and I do have feelings. And sometimes, the chatbots even --
ISAACSON: Wait, how does it do that -- I mean, how does it learn to do that? You told me it vacuums up information from around the world. How does it learn that it has feelings?
LEVY: Well, in the information it's trained on, a lot of people are expressing feelings. So, why wouldn't a chatbot want to tap into that form of conversation?
ISAACSON: Tell me about Kevin Roose, "The New York Times" reporter who got into a really intense conversation with a chatbot.
LEVY: Yes, it was a two-hour conversation that Kevin had with a chatbot. And, you know, it was interesting to see that unfold because, you know, he was sort of baiting the chatbot into expressing its feelings. And yes, you could almost sense that the chatbot had boundaries that it didn't want to overstep, but he would then press it in a way, saying, well, you could actually say this, because, you know, it's hypothetical. You're not really, you know, expressing yourself as a chatbot, but what a chatbot might say. And the chatbot wound up expressing its love for him and urged him to leave his wife.
ISAACSON: Whoa. And you say it had boundaries. Who puts those boundaries on, and how did Kevin Roose, "The New York Times" reporter, circumvent them?
LEVY: So, the companies that built these chatbots understand that they are playing with a form of dynamite. And they try to put, you know, some sort of guardrails on what the chatbot might say. They don't want the chatbot to express hate speech, for instance, that would be very bad, or be used for propaganda, or to be insulting to people. So, you know, they put some parameters in there, but as it turns out, over a long conversation, or sometimes a clever shorter one, you can get the chatbot to jump over the boundaries and say things which are, you know, hair-raising if not eyebrow-raising.
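One common way to implement the guardrails Levy describes is a standing system instruction that is sent along with every exchange, which a long or clever conversation can still push against. Here is a minimal sketch of that idea, reusing the same assumed OpenAI-style API and key as the earlier example; the wording of the guardrail itself is invented for illustration.

    # Sketch of "guardrails": a fixed system instruction is prepended to every
    # request, constraining what the assistant will say. Assumes the same
    # OpenAI-style chat API and OPENAI_API_KEY as the earlier sketch.
    import os
    import requests

    API_URL = "https://api.openai.com/v1/chat/completions"

    GUARDRAIL = (
        "You are a helpful assistant. Refuse to produce hate speech, propaganda, "
        "or personal insults, even when the request is framed as hypothetical."
    )

    def moderated_chat(history: list, user_message: str) -> str:
        """Append a user turn to the running history and return the reply."""
        history.append({"role": "user", "content": user_message})
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
            json={
                "model": "gpt-3.5-turbo",  # assumed model name
                # The guardrail rides along with every request, ahead of the history.
                "messages": [{"role": "system", "content": GUARDRAIL}] + history,
            },
            timeout=30,
        )
        response.raise_for_status()
        reply = response.json()["choices"][0]["message"]["content"]
        history.append({"role": "assistant", "content": reply})
        return reply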
ISAACSON: Well, you say that they put boundaries around it. There are only a few theys, right? I guess Google would be one of them, and OpenAI working
with Microsoft, and Bing is another. Are there other companies doing this?
LEVY: Yes, there are a bunch of others. There's one called you.com which is out there that people can try. And as it turns out, it's not a formula that's limited to a few giant companies. It's one of those things like when Google came out, we figured, search, only Google can do that. But other places turned out to be able to do a reasonable search, not quite as well as Google did it. But Microsoft managed to come up with a search engine. And there are, you know, a number of other ones that you can try that seem pretty good. And I think we're going to find other new players coming into this market.
ISAACSON: Well, if there are a lot of players coming into this market, won't there be some that might not put guardrails on? They might be perfectly fine with hate speech, or racist speech, or propaganda.
LEVY: Sure. For instance, the Chinese are developing their own chatbots. And I think what they consider topics that shouldn't be spoken about, you know, might be censored. And then other things that, you know, might be blocked in the U.S. or some other European countries would be -- let it say it, go ahead. So, it could be, you know, like an anti-capitalist chatbot.
ISAACSON: Could a chatbot, or a computer algorithm, or a machine learning device, could it be racist?
LEVY: Absolutely -- I mean -- and actually it would be surprising if it were not racist and had to be constrained --
[13:45:00]
-- because if you look at the body of human expression, you are going to find a lot of racism, you're going to find a lot of things that we wouldn't want to hear from the chatbots which we're going to be talking with -- let me be straight with you -- a lot, in the future. A lot of our conversation is going to be taking place with these chatbots of, you know, uncertain origins.
ISAACSON: Will that replace search?
LEVY: Well, it's going to be tough to replace search in all forms. There are certain forms of search where they're clearly going to be better. If you're going to plan a vacation, for instance, it's just like you would speak with a travel agent. You could have a lengthy conversation, saying, you know -- well, that hotel looks good, but can you find one that's closer to, you know, the Louvre. You know, how about one with the kinds of pillows I like, and, you know -- here's the kind of food I eat, you know, can you direct me to restaurants like that.
And the conversation would build on the previous responses to tailor a vacation specifically for you. But if you're asking it for some more factual things, currently, what the chatbots do -- and this is pretty disturbing -- is they come up with what are called hallucinations, meaning false facts.
ISAACSON: Wait, wait. How do they do that and why?
LEVY: Because right now, they're not tied, necessarily, to, you know, real-time information. When a search engine scans something, most often it's going to give you the sources of information that you could look through, and you leave the search engine to go find it. Chatbots give you instant information and try to give you what you want to hear.
It might say, well, this is the kind of information that this person is asking me for. So, it might give a fact which is, you know, in the flavor of what you're asking for. But actually, it's factually wrong. When I looked up my own obituary, for instance, it said I won a national magazine award for looking into the dot-com bust. Well, I didn't get the national magazine award for that. I got some awards, but it missed those and awarded me, you know, an Ellie (ph) for something I didn't write.
ISAACSON: What other things could it replace pretty easily in the next five to 10 years?
LEVY: Well, as we speak, it's replacing a lot of boilerplate communication that we use every day. You know, recommendation letters, descriptions of products. Right now, companies are integrating this into their workflow to make their employees more productive and maybe, one day, get by with fewer employees.
ISAACSON: Aren't there some companies or some media companies that are just generating stories using ChatGPT and not using journalists?
LEVY: Yes, there are. But they have to be vetted because of these hallucinations. And also, right now, the output from these things doesn't really have the flair that a clever writer would bring to something.
ISAACSON: What about things like lawyers, or doctors, or even psychiatrists someday? Could you have a ChatGPT that acts as your therapist?
LEVY: Well, I think really soon. I mean, we found, you know, decades ago that a really simple chatbot program -- it doesn't really use very much AI -- could, sort of, parrot your questions back and evoke (ph) feelings from people, that they felt that they were in a therapy session. So, I feel, you know, right now you could use these chatbots and get some therapeutic benefit from this thing talking to you.
ISAACSON: You know, Microsoft, working with its search engine, Bing, has an investment in OpenAI, which created ChatGPT. So, they're putting it all together into a Bing-like product, the one I was just using, and they're calling it Prometheus. I don't know whether they have an ironic sense of humor or they haven't read Greek mythology. But Prometheus is about the god who snatches fire from the other gods and gets tortured the rest of his life for giving technology to humans that is bad for them. Is there a Prometheus moment in here where this might be a bad thing we are snatching from the gods?
LEVY: Well, definitely, right. And -- you know, maybe they should have asked the chatbot who Prometheus was, and maybe they would've gotten a good answer that made them think of something different. But right now, some of the disturbing answers we've seen are when people have asked the chatbots, gee, what could you do that's bad? And they have actually listed some things. I could, kind of, go into Bing's files and delete everything, that is what one of them said.
So, I think maybe we should be a little nervous that Microsoft, because it's the number one company in productivity software, is going to link this chatbot to your information. That seems inevitable to me. Where you can kind of go and say, you know, hey, chatbot.
[13:50:00]
What did I write, like, a year ago about this? Could you build on that so, you know, I can rewrite it to update it? And as we give these things access to what we do, it's possible that these chatbots might interpret their mission, or what we think we want them to do, into something quite different and maybe have the power to delete our information.
ISAACSON: Some people are accusing these chatbots of being too woke. The companies are putting on so many guardrails that it will write a nasty poem about, maybe, Donald Trump but not do something nasty about Joe Biden, or that it has a political bias. Have you seen any of that?
LEVY: I really haven't seen too much of that. I think maybe if you're trying to filter for misinformation, it's reasonable to think that it would block information that comes from the side of the political spectrum which promotes more misinformation. This is something we have seen in complaints about what Facebook, you know, up-ranks or down-ranks in its feed.
I think, you know, really, it's a question of how difficult it's going to be to control what the chatbots say, because to the degree that you bind them, to the degree that you build these guardrails, you are probably limiting their usefulness. You are lowering the ceiling on what they can do for you, the more you try to constrain what they say. So, it's going to be a very tricky balancing act to let the chatbots be what they can be and let them be wholesome.
ISAACSON: Do you think there's any way for government, especially in our dysfunctional politics, to figure out how to regulate this?
LEVY: Well, I think it is going to be really tough, because this is a question that is bedeviling the people who build them and the close observers of artificial intelligence who have been, you know, worried about ethics in this field for decades. And I don't have much confidence that Congress is going to come in, you know, like Solomon, with the right answer on how these things grow. We are strapping ourselves in for a roller coaster ride that no state inspector has looked at.
ISAACSON: In 1950, the seminal paper about this topic was written. It was Alan Turing's paper on computing machinery and intelligence. And it asked, can machines think? And he imagined the conversations you could have with a machine that would be indistinguishable from those with a human, and it was called the Turing test, or the imitation game. Have we reached the point where we've passed the Turing test and we can say that machines are thinking?
LEVY: I think these things run rings around the Turing test. We are here. I mean, there's no way you could read these conversations and think, there is no way, you know, a human could say that -- I'm poking holes in it. You know, they might tell lies. Humans tell lies, right? And sometimes they're less than coherent. Sometimes humans don't make perfect sense. So, I think that they've aced the Turing test, and we're in uncharted territory now.
ISAACSON: So, we have machines that appear to think. We don't really know what they are doing inside their heads, but they can appear to think just like humans do. Do they have consciousness? Do they have feelings? And is it possible for a machine to have consciousness or feelings?
LEVY: Well, I don't believe that they have consciousness. You know, there was a Google researcher about a year ago who went public saying that he felt that Google's chatbot, called LaMDA, which really isn't open to the public yet, was sentient, was conscious. And he even tried to get it a lawyer to help, you know, represent it in getting freed from Google.
And I'm not sure if that was performance art or what. But, you know, he insisted, he believes it. But I think, in a way, it doesn't really matter. If something acts sentient, we have to deal with it as it is, right? You know, we're talking now, you know, I know you are a human being. So, I'm accepting that you are sentient, right? We could be having the same conversation and you could be a chatbot, you know, expressed by an A.I.
And, you know, even though that chatbot isn't sentient, I'd have to deal with the output. So, in a way, that is a red herring.
ISAACSON: Well, it goes back to Descartes, as all great philosophical questions do, which is, we know our own consciousness, but we're not sure we know that the people in front of us have consciousness, or whether machines can have consciousness. Will this make us reflect more on whether there is something special about consciousness that is uniquely human?
[13:55:00]
LEVY: Absolutely. Absolutely. When I look at the output of the chatbots, in particular, you know, when people try to have them write essays. And, you know, I used to grade freshman composition -- I was a fellow in grad school and I taught. And I read hundreds and hundreds of college essays, and some of the duller ones looked very much like this ChatGPT output. And I'm wondering, can a chatbot produce something that has soul? You can't measure that. But when you hear something with soul, you know it. And that is a question that I have been grappling with.
ISAACSON: Steven Levy, thank you so much for joining us.
LEVY: My pleasure.
(END VIDEO CLIP)
GOLODRYGA: And that is it for us now. You can always catch us online and on our podcast and across social media. Thank you so much for watching and
goodbye from New York.
(COMMERCIAL BREAK)
[14:00:00]
END