Amanpour
Interview with Scientist and Environmental Activist David Suzuki; Interview with Ocean Climate Activist and InnerLight Founder Bodhi Patil; Interview with "The Substance" Director, Writer, and Producer Coralie Fargeat; Interview with U.N. Artificial Intelligence Advisory Body Co-Chair and Spanish Secretary of State for Digitalization and A.I. Carme Artigas. Aired 1-2p ET
Aired September 20, 2024 - 13:00 ET
THIS IS A RUSH TRANSCRIPT. THIS COPY MAY NOT BE IN ITS FINAL FORM AND MAY BE UPDATED.
[13:00:00]
PAULA NEWTON, CNN ANCHOR: Hello everyone, and a very warm welcome to "Amanpour." Here's what's coming up.
More death in Lebanon as Israel destroys a multi-story building in Beirut. Correspondent Jeremy Diamond reports on rising tension in the region.
Then --
(BEGIN VIDEO CLIP)
DAVID SUZUKI, SCIENTIST AND ENVIRONMENTAL ACTIVIST: Human beings and the natural world are on a collision course.
(END VIDEO CLIP)
NEWTON: -- a call across generations to restore climate health, I speak with Canada's leading environmentalist, David Suzuki, and young climate
activist, Bodhi Patil.
And --
(BEGIN VIDEO CLIP)
UNIDENTIFIED MALE: Have you ever dreamt of a better version of yourself? Younger, more beautiful, more perfect.
(END VIDEO CLIP)
NEWTON: -- "The Substance," a gruesome takedown of impossible beauty standards. Director Coralie Fargeat, on her body horror hit.
Also --
(BEGIN VIDEO CLIP)
CARME ARTIGAS, CO-CHAIR, U.N. ARTIFICIAL INTELLIGENCE ADVISORY BODY AND SPANISH SECRETARY OF STATE FOR DIGITALIZATION AND A.I.: But what we cannot
do is to do nothing, to wait until the harm is done.
(END VIDEO CLIP)
NEWTON: -- a global framework for reining in artificial intelligence. Hari Sreenivasan speaks with the U.N.'s A.I. Advisory Body co-chair, Carme Artigas.
And welcome to the program, everyone. I'm Paula Newton in New York, sitting in for Christiane Amanpour.
An Israeli strike targeting high ranking Hezbollah officials destroyed a building in a residential neighborhood in Southern Beirut. At this moment,
the Lebanese health ministry reports more than 10 people dead, more than 60 others injured, and those numbers, they warn us, are likely to rise.
Today's attack follows a night of strikes right across Israel's northern border. The IDF claims it struck about 100 Hezbollah rocket launchers in
Lebanon. Hezbollah reports at least 17 attacks on military sites in Northern Israel. All of this comes after those devastating pager and
walkie-talkie explosions earlier this week.
We want to get right to it, to get some insight into what's behind this apparent escalation and where it could lead. Correspondent Jeremy Diamond
joins us now live from Tel Aviv. And, Jeremy, good to have you weigh in on this as we continue to get more breaking news. I see the death toll has now
risen to 12, at least 66 injured.
I mean, as we were just saying, this is quite a tumultuous week given the pager and walkie-talkie attacks. What does this represent as far as the
Israeli military is concerned, as they did give a briefing in the last hour?
JEREMY DIAMOND, CNN JERUSALEM CORRESPONDENT: Well, it is very clear that the Israeli military and its government are willing to climb up that
ladder of escalation. I mean, the pace and scale of the attacks that we have witnessed over the course of just the last four days is really
remarkable, and it is certainly pushing Israel and Hezbollah very close to the brink of all-out war.
The attack today, we should note, beyond the prospect of escalation, is certainly dealing a very heavy blow to Hezbollah. If indeed the Israeli military's claims are accurate -- that it killed this man, the operations chief of Hezbollah, alongside at least 10 other Hezbollah commanders of the elite Radwan Force in this one strike in the Lebanese capital -- it does represent a significant blow, one that will certainly destabilize and limit Hezbollah's abilities going forward.
To what extent, of course, remains to be seen, but there's no question that earlier this week, the attacks that have -- that killed several dozen
Hezbollah members and likely wounded and maimed several hundred more, including blinding some and amputating some as a result of those walkie-
talkie and pager attacks, this has all dealt a significant blow to Hezbollah. But what it has also clearly done is it has also -- and it is
also backing Hezbollah into a corner. And the question now is how will Hezbollah react?
We saw, of course, in the wake of the pager and walkie-talkie attacks that Hassan Nasrallah, Hezbollah's leader, vowed retaliation. But at the time, he did not seem inclined to go for something much more significant that could plunge Israel and Hezbollah into that all-out war that has been feared for so long. But will today's attack change his calculus? And what will come next, as Israel seems increasingly willing to escalate the situation further and to potentially risk that all-out war?
[13:05:00]
NEWTON: Yes, Jeremy, certainly a lot at stake here, as you said, as Hezbollah really tries to determine its next move. I do note that the
caretaker cabinet in Lebanon says that this targeting of a populated residential area amounts to, in its words, genocide. And yet, the Israeli military is really striking quite a tone here on this Friday. As I said, given the tumultuous week, what more are you learning from them about where they may now take this conflict, especially since, right now, there seems to be no hope of an actual ceasefire taking hold in Gaza?
DIAMOND: That's right. And so, one of the questions here is, has the Israeli government decided that there is not going to be a ceasefire in
Gaza, which therefore means that there will not be a diplomatic resolution to the conflict with Hezbollah, which has vowed that it will continue to
fire rockets into Northern Israel so long as the Israeli military continues its campaign in Gaza.
And so, has the decision been made that, all right, the only other avenue is to therefore try and cow Hezbollah into submission through this ramped-up pace of attacks, or to simply risk entering into perhaps all-out war with Hezbollah in order to bring about a conclusion to this nearly yearlong conflict once and for all?
And if indeed that is the decision, we are in for something potentially very, very dangerous in this region, because not only will Israel -- the
Israeli military, in the event of an all-out war, rain down significant destruction in Lebanon, including in the Lebanese capital of Beirut. But
Hezbollah is also very capable of causing significant damage and death in Israel as well. There are -- it has a significant rocket arsenal and
missile arsenal that it has yet to deploy, some of which can reach Tel Aviv, where I'm standing right now.
And I have spoken with Israeli military commanders who understand very well that a war with Hezbollah is not the same as war with Hamas in terms of the
difficulty, but also in terms of the damage that Israel itself will incur.
NEWTON: Yes, and as you pointed out many times before, some 60,000 Israeli citizens still unable to return home because of the escalation along that
border. Jeremy Diamond, thanks for being with us. And here at CNN, we'll continue to follow that developing news.
Turning now to the climate crisis, which is, of course, another escalating threat. The European Union says devastating floods in Central Europe and
deadly wildfires in Portugal are proof of a, quote, "climate breakdown." And in Latin America, Peru declared a state of emergency for regions also
ravaged by wildfires.
Meantime, the U.S. government reports this August was, incredibly, the hottest ever recorded. Look at that. The dark red on this map represents
record breaking temperatures.
Now, while extreme weather impacts all of us, of course, it's often the most vulnerable who fight back, with older and younger activists winning landmark climate lawsuits across the world. So, let's hear from two climate activists who, we're happy to note, span the generations.
David Suzuki is one of Canada's best-known scientists and environmental educators. Here he is hosting "The Nature of Things," certainly something that was part of my childhood, called TV's longest-running science series. And Bodhi Patil is a leading Gen Z climate activist and founder of the group InnerLight. He's one of a generation of young activists who say they have been inspired by David Suzuki's work. David and Bodhi, both of you, welcome. And I'm glad to see you side by side there in Montreal.
We need to begin right with the climate news, which has not been good for several years now, but even in recent weeks, when we cut across all of these disasters on four continents, we'll remind everyone: extreme rainfall, flooding that's killed thousands of people. We've seen major fires in Europe and North America as well. What goes through your minds, both of you, when you see these kinds of floods and fires now unfolding and seemingly only getting worse?
And, David, I do turn to you because you have been such a pioneer and really have been trying to educate about this -- us about this for decades.
DAVID SUZUKI, SCIENTIST AND ENVIRONMENTAL ACTIVIST: Well, you know, I went into television in 1962 because I thought this is a powerful way to inform
people and educate them about issues of science and nature. And, you know, I have to say now that we live in a world where this kind of information
doesn't seem to matter.
[13:10:00]
Things like the economy seem to be more important than the avenues of communication. You know, with the internet -- I don't look at it, but TikTok apparently is a way a lot of young people get their information. And, you know, QAnon now has far more -- as much credibility as a leading scientist of the planet.
So, I go, what kind of an animal have we become? Where the people with the best information about the state of the world that we live in are now
marginalized and we have to deal with people that are saying climate change is a hoax and this will destroy the economy if we want to act on climate
change. I mean, it's a crazy world.
BODHI PATIL, OCEAN CLIMATE ACTIVIST AND FOUNDER, INNERLIGHT: The reality is that we --
NEWTON: Go ahead, Bodhi.
PATIL: -- have six of nine planetary boundaries already surpassed, and we're going to surpass the seventh. And that's what leading climate scientists
say. And we know that the ocean has some of the powers to be the largest carbon sink and to protect us. And if we're not working on actively
restoring nature and restoring our relationship to it, we won't have a healthy future, and we won't have a chance of creating a just future and a
just world. And so, we need all hands-on deck.
And when crisis can turn into an opportunity, that's where we all see the role of intergenerational collaboration coming in to solve some of our
greatest challenges.
SUZUKI: But you know, I see young people like this with the energy and the knowledge, and they're really -- they're ready to take on that role of
being planet protectors and guardians. But the very structures we've created to manage our interactions and the way we live, namely legal, economic, and political, are in fact constraining us so that they can't really do the things that they want to do.
Why is it the economy is always in the argument as a reason why we can't do what we have to do if we're going to save a future?
NEWTON: Bodhi, I wonder how much, though, that sentiment resonates with your generation, right? I mean, certainly many in your generation want to
do as much as they can, and yet, with David sitting next to you, what do you say to him when you must hear that from Gen Z, right? The fact that the
economy is important to them in terms of what their future might hold.
PATIL: The reality is now that Gen Z and millennials have the largest purchasing power in history. And what we want to do is make sure we're
investing and creating a world that we can live in. And a world that we can live in is dependent on the natural ecosystems that David and the Suzuki
Foundation and young people and indigenous people have been working to protect for millennia.
And if we're not protecting our natural ecosystems, the systems that store the largest amount of carbon, the systems that provide us food and
livelihood, the systems that heal us, then our economies have no chance of surviving nor thriving. So, if we want to create a just economy, if we want
a blue and green economy, then we have to protect nature. There's nothing else to it, because the economy is dependent on the health of our natural
ecosystems, the health of our generation. And we need to support the next generation in reaching spaces where we can work in nature, where we can work in the just transition, away from fossil fuels and towards clean energy.
So, I stand 100 percent in solidarity, and so does my generation, in saying that if we want a thriving economy around the world in the U.S. and Canada,
then we have to place nature-based solutions, then we have to place indigenous stewardship at the center of that, because that's what will
sustain us. That's what will create a new way of going forward.
SUZUKI: The problem is that we're trapped within that economic system. We don't want to protect the economy as it now operates. And Partha Dasgupta, an economist from Britain, showed us that if you look at all of the services that nature performs to keep the planet habitable for us, beginning with all of the plants that created the oxygen-rich atmosphere that we depend on, that take carbon out of the atmosphere, there are all these services, pollination of flowering plants and so on, none of that is in the economy.
So, as Mark Carney points out in "Values," his book, a really important book, Amazon, Jeff Bezos' company, is valued by the economy at over a hundred billion dollars. But the Amazon, the greatest ecosystem on the planet, has no economic worth until it is logged, mined, dammed, or it grows soybeans or cattle or cities. If that isn't a screwed-up, crazy economy. So, when we talk about a just economy, we want a just, very different kind.
NEWTON: OK. David, though, but I'm -- David, I'm of an age that I have heard this from you for literally decades. And yet, I do sense a certain frustration in your voice that somehow this kind of thinking still has not taken hold. I want to show our viewers as well what went on in your home country of Canada last year. Many of us went up to see the destruction ourselves as those forest fires absolutely burned through boreal forest in Canada.
[13:15:00]
I mean, the stats are significant. A new study says four times more carbon was released into the atmosphere just from those wildfires in 2023 than was emitted by Canada's fossil fuel industry for the entire year. David, I have to ask you, what now? Where do you go from here when it seems
that not much that has been done in the last few decades has worked or at least not worked quickly enough?
SUZUKI: I think young people have everything at stake. And as you can hear from Bodhi, they're informed, they're committed, and they're ready to act
and make that difference. But they don't have the muscle power within our economic and political systems. A lot of you guys don't vote, and that
seems to matter more.
So, I really think the point now is elders have to be the ones to step out. You know, we had retired admirals and generals against nuclear war. A very powerful group who came through the system. And once they were retired, they could then talk about nuclear weapons from the standpoint of: it's a crazy way to protect yourself.
I'm calling on retired executives of forest companies -- fossil fuel companies, you know, fisheries, whatever, now, you've made your money.
You're no longer beholden to the organizations, come out and speak the truth. And I think elders like me, we've lived an entire life. The world
that we're leaving to them is a messed-up world. We've got to be ready now to put our bodies on the line.
I'm calling for elders to come up, put our bodies on the line, get arrested. We've got to begin to start to use civil disobedience as a way of
making their priorities our highest priorities.
NEWTON: But, Bodhi, how do you make that happen within your generation? Do you think things like litigation help? There's certainly been some
landmark litigation over the last few years. What do you think will actually work beyond the significant work that you are doing even in the
ocean?
PATIL: We're here at the One Young World Summit, where thousands of young people are gathered to solve the world's most pressing challenges. And we
know climate change is the largest challenge, and on top of our agenda is climate solutions. And so, we're now learning that there are all sorts of
innovative new solutions, that use both ancient intelligence, the knowledge we've learned from past generations and the new modern sciences and
technologies that can help innovate and protect our planet in the best way possible.
And we know the ocean has some of those solutions, climate litigation, the rights of the ocean, the rights of whales, there's all sorts of new
movements, new momentum being built, and this is an entire new workforce that's being created for a blue economy. The blue economy is one of the
fastest growing economies. And if we're going to really protect our planet, our ocean and ensure that we have a planet to live on, then we need to make
sure we're sharing generational knowledge with the next generation, with Gen Z, with your children, your grandchildren, with future ancestors.
Because the only way that we get to a fair world is when kids grow up playing in the tide pools, when kids grow up in nature, learning from the rivers, so they feel that same sense of love. Regardless of whether you're ExxonMobil and you're a supervillain with the largest profit share, or if you're a simple kid living by a river and stream in a village in the Amazon, you share the same air produced by the ocean, you share the same oxygen, you share the same water systems, and we need to stand up for what's right. And that's what our generation is calling for. That's why we're here.
NEWTON: And your enthusiasm comes through loud and clear in the most intelligent of ways. But also, what you do hear from your generation is some resignation, some anxiety, some cynicism even, about whether they can do anything to really change the path of the climate crisis right now.
SUZUKI: Well, despair is, I think, a luxury we can't afford now. I mean, whether or not we'll survive is not the issue, it's what can we do to at
least try? And we're not listening to the message Greta said. Greta said, I learned to take science seriously. Why don't we elevate the science that we
have to tell us what the state of the planet is and to set our priorities?
And I want to point out one thing. I was in my last year of college at Amherst College in the United States in 1957. And on October 4th, we were shocked to hear that the Soviet Union had launched Sputnik. I didn't even know there was a space program. And anyone old enough to remember, in the months that followed, the United States launched their satellites and every one blew up.
And meanwhile, the Russians launched the first animal, the dog Laika, the first man, Yuri Gagarin, the first team of cosmonauts, the first woman,
Valentina Tereshkova. Americans just went, oh, my god, they didn't say, oh, god, they're too far ahead, we can't do this. They've beaten us. They just
said, we got to catch up to these guys.
[13:20:00]
They poured money, they set up NASA, poured money into the universities. Here I am, a Canadian studying in the States. If I said, I like science, they threw money at us. I mean, it was a glorious time.
And look at what happened, in 1962, John F. Kennedy said, we choose to go to the moon. We're going to get astronauts to the moon and back in 10
years. When he said that, we didn't have a clue how the hell to do it, but they made the commitment. And look what happened as a result, not only are
Americans the only ones to get to the moon and back in less than 10 years, but look at all the unexpected spin offs, you know, from laptop computers
to GPS, the 24-hour satellite television programs, space blankets, look at the benefits.
NEWTON: David Suzuki, we do have to leave the conversation there, but your enthusiasm does not surprise. And, Bodhi, really a delight to speak to both of you across generations, really fighting for the planet. Thank you to both.
SUZUKI: Thank you.
PATIL: Thank you.
NEWTON: And now, we turn to beauty standards placed on women in society that are exceedingly high. But just how far would you go in trying to attain them? That's the focus of the new film, "The Substance," starring Demi Moore, who plays an actress clinging onto the remnants of fame and relevance. Her character, Elisabeth, turns to a black-market treatment known as the substance, bringing to life her younger alter ego, Sue.
(BEGIN VIDEO CLIP)
UNIDENTIFIED MALE: Have you ever dreamt of a better version of yourself? Younger, more beautiful, more perfect. One single injection unlocks your
DNA. Starting a new cellular division that will release another version of yourself. This is the substance.
(END VIDEO CLIP)
NEWTON: OK. So, pacey and grotesque. French filmmaker Coralie Fargeat spoke with Christiane about the pressures imposed on women and the violence they inflict on themselves.
(BEGIN VIDEOTAPE)
CHRISTIANE AMANPOUR, CNN CHIEF INTERNATIONAL ANCHOR: Coralie Fargeat, welcome to the program.
CORALIE FARGEAT, DIRECTOR, WRITER, AND PRODUCER, "THE SUBSTANCE": Thank you.
AMANPOUR: Listen, this is probably one of the most anticipated films of recent times. And after a 13-minute standing ovation at the Cannes festival this summer, you've now screened it in L.A. And I just wondered, what was the reaction? Were you surprised by the L.A. reaction?
FARGEAT: I mean, I was, like, super positively surprised because, like, the crowds -- different crowds that we had for the different screenings here were just amazing. Like, the audience really went for it, like, they were not shy at all to react, and that was, like, super enthusiastic for me to witness this.
AMANPOUR: What is it that inspired you to make this film? You know, it is body horror, it's a genre. And you call it a feminist film.
FARGEAT: Basically, my very own life. You know, like, about how as a woman when I had passed my 40s, I started to have those crazy violent thoughts
that my life was going to be over, that, you know, no one would be interested in me anymore, that I wouldn't have any value. And I really
realized how violent and powerful those thoughts were. And I said, OK, I decided I really want to do something with that.
AMANPOUR: You know, you have said that instead of doing this in a sort of reality way or realism, you decided to go the genre route. Why?
FARGEAT: So, I really wanted, like, this story to be able to symbolize, you know, what every woman can live and not be restricted to a specific moment and time and place. So, that's why the genre, for me, felt, you know, the perfect way to build that.
AMANPOUR: Yes. But I mean, you know, some people could be quite worried about watching it. I mean, look, I know when I went to watch it, I'd heard
so much about it that for a while, I had my hand in front of my face, like squinting through my fingers.
It's very bloody. It's very, you know, monstrous. It's very visceral, squelching and body parts and injections into the veins. It's obviously
filmed, you know, beautifully and it's colorful and dramatic to watch. Why did you choose that kind of genre?
FARGEAT: So, in fact, I really wanted, like, the movie to express the violence that I feel about, you know, the themes. I think, like, the way society treats women regarding this is extremely violent. And I wanted, really, to put it on the screen the same way that I feel it inside. So, that's why I really wanted to push the limits and not to be shy to say, this is the reality. This is how, you know, we are led to internalize so much violence. This is how gross it is.
[13:25:00]
This is how extreme it is because, to me, I really -- so that we need this to be discussed in the public space, to have a change for this. And I wanted to take it out in a way that would be striking but, to me, at the same level that I really live it every day, honestly.
AMANPOUR: I'm going to get to you yourself in a moment. But first, let's just recap the story, of course. An aging actress, if you can call 50
aging, which I guess they do. Demi Moore plays Elisabeth, and she was an actress, then a very popular fitness aerobics instructor on television. And
all of a sudden, her boss, the boss of the television station, played incredibly repulsively by Dennis Quaid, announces to her that the gig is
up. Let's just play this clip.
(BEGIN VIDEO CLIP)
DENNIS QUAID, ACTOR, "THE SUBSTANCE": People always ask for something new. It's inevitable. At 50, well, it stops.
(END VIDEO CLIP)
AMANPOUR: I mean, yes, that is very, very dramatic and horrible for any woman to hear. Just minutes before in the film, we hear him, his character, who you've named, I don't think coincidentally, Harvey, say: We need her young. We need her hot. We need her now.
So, that leads into the fact that then this substance she gets, it's on the black market, in order to create and bear a younger, beautiful, hotter,
blonder version of herself. And that's played by Margaret Qualley. Why did you choose Demi Moore and Margaret Qualley?
FARGEAT: I knew that to really tell that story, I needed very strong performances because it was going to be at the same time very emotional and
very physical. And I wanted to, you know, work with an actress that would be an icon in herself. And Demi Moore, you know, she kind of already
represents the many ideas of the film. And also, seeing her, you know, in such a movie, which is very different than what she has been doing in
the past, I think, was very striking and interesting as well.
And I discovered someone who was really willing to -- you know, to take risk, to kind of also make a statement, I think, to the world with this
story, and pairing her with another actress, which has -- who has this kind of same raw, visceral, and very instinctive energy was the best way, for
me, to kind of build the two sides of this character.
AMANPOUR: And then, of course, in the film, there becomes an existential struggle between the two of them because the story goes that the two cannot
exist at the same time. When Elisabeth, the older one, is out and about, the other one has to be on ice and vice versa. But then the younger one,
Elisabeth's younger self, does not want to go back to being in hibernation. What is that saying to the audience?
FARGEAT: So, the idea of those two characters was really to kind of represent for real, I think, all the voices that we have in ourselves that can be super violent and conflictual and kind of whispering all the time that you're not good enough, you're not beautiful enough, you're not sexy enough, because that's kind of what we hear all day and see all day, I think, all around in society.
And, you know, the younger version of Elisabeth kind of reproduces the same pattern, like she falls into the same trap, thinking that she's going to be valued because she's young, sexy, hyper sexy, which is going to put her, you know, at the center of the attention. She immediately kind of lives in the fear of losing that and of, in her turn, being erased.
So, that was really the way to kind of express that -- I think the strong injunctions that are around us kind of lead us to not take our place in the world for real, but kind of limit ourselves to this fear of being judged, this fear of being not good enough. And it's so hard to escape this. And I
think on an individual level, it's almost impossible.
AMANPOUR: So, look, there is a lot of violence that's done by the individual to herself and by society to the individual, and it's very gross
in a lot of places. It's also, though, that your camera caresses the bodies, whether it's Elisabeth, who's older -- and Demi Moore will be called bold and courageous for allowing her 61-year-old body to be examined in technicolor like that -- next to Margaret Qualley's very much younger, tauter -- you know, taller, more perfect body.
[13:30:00]
But what do you say to those who've said, you know, you, too, are panning these bodies in close up? You also are looking with a gaze, maybe not the
male gaze, but society's gaze at these figures?
FARGEAT: So, that was truly the intention of the film for me. Like, you know, I'm a feminist and I have my own way of, you know, kind of saying my
world into the world about those issues. And my way of doing it was to represent what the reality around me makes me feel like.
And basically, I think there are two worlds, the world of Elisabeth, where the body is facing her with herself, you know, in her bathroom where no one
looks at her. And it's where, I think, she has the more judgmental look on her body. And I think that's a reality for many women who, when they look
at themselves, are led to not love themselves because there are, you know, so many things around that that make us believe our body is not perfect. We
don't deserve, you know, to feel good with ourself.
And on the other hand, it's the Sue character, who is totally representing
young age, I think that also make us believe and think that if we are not, you know, hyper sexy we're not going to be the center of attention. And,
you know, and I grew up as a young woman thinking that a lot in my own life. You know, so for me, it was super important to kind of show that.
AMANPOUR: Have you yourself recovered from that affliction?
FARGEAT: Well, I mean, the film has certainly been very liberating for that, but as I'm also saying, I think when you grow up with this your whole life,
it's something that is extremely tough to, you know, totally erase from your brain.
For me, it's really about to say, like, I understand why you don't feel good about yourself. I hope it's going to change, but I think it's a lot of
a journey that cannot be done just on an individual level. And for me, I have a lot of relapses. Sometimes I feel better, but sometimes I still, you know, have to navigate with all these little voices inside my head. And that's also something important I want to communicate.
AMANPOUR: Well, given what you just said, I heard Demi was quoted as saying either this film will be a disaster or it will have a major cultural impact. And I'm asking you, and I want -- I'm saying that because, as you know, there's so much on social media, so much in the anti-aging, you know, space. Even young children are asking for, I don't know, butt lifts and Botox and the whole lot. I mean, we're talking, you know, sub-pubescent in some cases. This is really happening out there.
How do you think your film could have a cultural impact and do you have a target audience?
FARGEAT: I mean, like, obviously, I have grown up with the, you know, beginning of social media, and I think it has a huge impact, of course, on
the younger generations, the same way that, you know, all the magazines had a huge impact on me when I grew up as a teenager.
So, hopefully, I really hope that the younger generation, you know, will respond to the film and it will also help building some kind of distance
with all the pressure that, of course, social media totally increase and it's a never-ending circle. As you say, we have -- we are always exposed to
something that kind of give us that wrong message that we have to transform ourself to confront to some ideal beauty crazy standards.
So, even if I think the movie can really be heard by a super wide audience, because I think we've all been confronted to this issue, I really wish that
the younger generation, in particular, will, you know, be sensitive to its message.
AMANPOUR: Coralie, one of the most compassionate and moving parts of the film, for me anyway, was the way you address the issue of loneliness.
Again, the older woman, lonely. She's been asked on a date -- Demi Moore in her Elisabeth reality -- and she's made herself up. She's dressed up. She
looks great. And then here's the clip before she goes out.
[13:35:00]
(BEGIN VIDEO CLIP)
(MUSIC PLAYING)
(END VIDEO CLIP)
AMANPOUR: So, she's violently scrubbing her face and being really harsh with herself. And not only does she do that to herself, but she misses
what appears to be a genuine opportunity at a real-life connection. What was going through your mind in that scene?
FARGEAT: Yes, I think to me that's one of the most emotionally powerful scenes of the movie. It's really related to the self-hatred that we are led
to develop, and, as you said, that kind of cuts us off from the real world and from living our lives. I think what happens after that scene is Elisabeth
is not going to go out of her apartment anymore. She kind of decides that she doesn't deserve to be seen in the outside world.
And I think that's what happens, you know, when we develop all those ideas around, like, beauty standards. We're building a jail around us that kind of
locks us in and really prevents us from taking our place in the world. And again, I wanted to show the violence and the reality of it to hopefully
help women, you know, break this jail. Because I think we need to go out. We need to step out of it and try and really take our place in
the world in a very different way, an equal way to anyone else.
AMANPOUR: Well, your film is going to go into the world on the 20th of September, and it'll probably knock a lot of people's socks off. Coralie
Fargeat, thank you so much indeed.
FARGEAT: Thank you so much.
(END VIDEOTAPE)
NEWTON: So, now to artificial intelligence, which, of course, is the next frontier of the technological revolution. But as it continues to evolve at
breakneck speed, we ask, how can we ensure it's safe, ethical, and accessible for all to use? To answer this, Hari Sreenivasan spoke with
Carme Artigas, co-chair of the U.N. Artificial Intelligence Advisory Body.
(BEGIN VIDEOTAPE)
HARI SREENIVASAN, CNN INTERNATIONAL CORRESPONDENT: Paula, thanks. Carme Artigas, thank you so much for joining us. You are a co-chair of this U.N.
A.I. advisory board and you've published this final report. What's the top line? What are the findings that you're most interested in making sure that
people are aware of?
CARME ARTIGAS, CO-CHAIR, U.N. ARTIFICIAL INTELLIGENCE ADVISORY BODY AND SPANISH SECRETARY OF STATE FOR DIGITALIZATION AND A.I.: Yes. First of all,
I think we are all aware of the great possibilities that artificial intelligence is going to bring humanity in terms of efficiency in
productive processes, of course, the opportunity to spread public health or education, and, of course, in scientific research. And we are all aware that
these are great possibilities, but at the same time, there are a lot of risks -- in the short term in terms of fundamental values, but also in the
long term in terms of safety.
What we need to ensure is that all these opportunities are absolutely developed. And if we leave all this technology, which is very
transformative, ungoverned, not only are we not going to be able to capture all these opportunities, but we are probably going to exacerbate some of the
problems we have today, especially talking about inclusiveness.
SREENIVASAN: You know, when you talk about lack of inclusiveness, the report points out, in several different ways, the giant gaps in how
unequally artificial intelligence is distributed today. One of the things that you pointed out is that seven countries are
party to all the different A.I. governance efforts that are happening around the planet, and 118 countries are part of none of them.
So, is there a risk here that the rest of the world, meaning the majority of the world, gets left behind?
ARTIGAS: Yes, there is. I mean, in fact, there is a great risk of increasing the current digital divide with a new A.I. divide. And I think that we must
ensure that the benefits and the costs of any technology revolution are equally distributed among different social classes, among different
countries.
The reality is that even though there are a lot of very important international efforts on governance in terms of ethics guidelines, even
regulations in some parts of the world, we cannot leave all these countries without being seated at the table, without being part not only of the
development, but also of the discussion.
So, we want equality in the benefits. We need to ensure equality in the access. And to ensure equality in the access, we need to make them
participate in all these new instruments we are proposing to ensure that A.I. is governed at the global level.
[13:40:00]
And also to provide these less developed countries with the right tools they need to develop their own solutions, especially when we think that
A.I. is going to be fundamental in the achievement of the sustainable development goals.
And I mean, there are three main, I would say, elements that are needed, which are data, computing capabilities, and talent. And that is why one of the
proposals we have is a capacity development network and capacity building funded by a global (INAUDIBLE).
SREENIVASAN: You know, some of this comes down to the kind of computing power and where that computing power is located. And right now, what you point
out is that of the 100 biggest computing clusters in the world, none of them are in developing countries
at all, right? So, if the physical horsepower that's necessary to enable the talent in a smaller country to try to build applications on A.I., et
cetera, just doesn't exist there, how do we even begin that?
ARTIGAS: Exactly. That's the right question. They don't have access to those computing capabilities. And this is why, in the capacity building network
initiative, we propose a global fund that can be funded by private and public entities, not only in money, but also in kind.
We need to provide the capacities these countries need to build their own entrepreneurial ecosystems. That's why we also proposed a data
framework, because most of the problem is that all the, especially, large language models, general purpose A.I. systems and models that are being
developed by the Global North are developed only with data from the Global North.
So, there is a lack of representation, and therefore, we cannot pretend that this is a universally adopted technology that can benefit all, which is
(INAUDIBLE).
SREENIVASAN: You know, just the other day we saw that there was an investment between BlackRock and Microsoft, and they want to put $30 billion
down to co-invest in data centers, right? They even have NVIDIA as a partner. But most of that is America-centric. And I wonder, how do the
suggestions that you're making here work -- do you pick up the phone and call Satya Nadella and say, hey, listen, how about a couple of
those data centers in a couple of other countries that could use them?
ARTIGAS: Well, we're talking about a problem that has a lot to do with geopolitics, and of course, we're not messing with that. What we see is
what the gaps are, what instruments need to be set up, and where the place is that all these conversations need to take place.
The point is that we don't yet have a multilateral platform for collaboration, for example, on safety. Safety is very important for A.I. --
to design the safeguards and guardrails so that we can really trust the technology and, therefore, adopt it. Because I think we are all interested
in getting all these benefits and all these opportunities, and, therefore, in adopting it with trust.
Trust for the consumers and trust for the citizens. We are not saying here that we are giving all the answers to all the problems; what we are proposing
here is the instruments that are not yet in place and are necessary because they cover the gaps.
And I think the other important thing for me -- one of the most important recommendations -- is the scientific panel. We need transparency on
the risks and on the opportunities. Because without data and scientific evidence, not even policymakers can set sensible rules to guide it properly.
SREENIVASAN: How do you create that incentive for transparency, right? Like right now, for example, when it comes to intellectual property,
there's a lot of concern that a lot of the large language models have been trained on copyrighted material. So, I wonder if you have this ability to
convene different countries, whose law do you agree on? Whose intellectual property law are you going to go by? Whose human rights law are you going
to go by? What is freedom of speech in one country versus another? How do you get through those kinds of thorny issues?
ARTIGAS: Well, I think there's a difference that so many people mix up, which is: one thing is ethics, another thing is regulation, another thing is
governance. When we're talking about ethics -- and this affects also the use of A.I. by governments -- it's how companies or governments
should behave in a morally acceptable way, or in the way we expect them to behave. That's ethics guidelines.
But then comes governance. And governance means: which are the instruments I need to put in place to ensure that these companies and these
governments are behaving ethically? And regulation is one of these tools, but it's not the only one. I come from Europe, and I've been an active
negotiator on the European A.I. Act, and we solved this in our European way, but it doesn't need to work for everyone. What we expect here is that when
we talk about governance, it can be through regulation, but it can also be through market incentives. It can be with oversight boards. It can be with
treaties. It can be with many other ways. We are proposing some instruments to make this happen.
[13:45:00]
And in terms of regulation, we cannot expect that all the parts of the world will have the same regulation. But what we can expect
is a convergence on a very important minimum, which is that anything done with A.I. is for the common good. It's based on the U.N. Charter, on
international law, and on human rights, and I think that's the bare minimum we should ask of any country in the world and any company in
the world.
SREENIVASAN: Yes. You are, for our audience that doesn't know you, the Spanish secretary of state for digitalization and artificial intelligence.
So, you've had these conversations across Europe. And I wonder, how did you balance the need to make sure that it's comprehensive, that you
understand the technology, with, at the same time, the sort of need for speed? Because so often we find, at least in the United States, regulation is,
I don't know, five to eight years behind where the technology already is. So, by the time it gets litigated in the court system,
the technology has evolved so fast, right?
ARTIGAS: Well, exactly, that's the big challenge. How can we regulate a technology which is in continuous evolution? How can we make these laws,
these regulations, or these best practices future-proof? And that must be embedded in the law mechanism itself.
So, in particular, the EU AI Act has its own renewal mechanisms, and a lot of the things that are proposed have been designed together with the
industry. We all have the same principle. Everything that we are proposing here as the global governance consists of very agile instruments that can evolve
according to the needs.
But what we cannot do is to do nothing, to wait until the harm is done. Because governance must not be seen as an inhibitor of
innovation. It must be seen as an enabler. If we give trust to consumers and users, people will adopt A.I. massively. I think that's what
we are not seeing.
And on the risks, we don't need to wait five more years to know what the potential risks are. I think we are on time now to make things happen
and to ensure that everybody is doing things right the first time, because probably we are not going to be able to revert the potential
harm we can create.
SREENIVASAN: You know, I looked at the report and the amazing kind of confluence of experts who are very concerned about some of
the negative risks of A.I. -- you know, when it comes to information integrity, how people are able to tell fact from fiction. I mean, that's
something that we here, now, right before an election in the United States, are thinking about much more closely.
But I wonder, you know, what are the conversations that are necessary to try to figure out some sort of baseline for ensuring that, you
know, a surveillance state doesn't take over in a harmful way, or that information integrity is not kind of destroyed across different societies?
ARTIGAS: Exactly. This is where we think there must be this consensus. I will say that we can compete for market share, we can compete for talent, but we
cannot compete on safety. We cannot compete on human rights. I think that individual countries will have to put their own regulations in place to limit the
power of governments or companies.
Again, in the EU AI Act, we set out five cases of what we consider forbidden uses of A.I. -- things that, even though they are technically feasible, we don't want
to happen in Europe. For example, social scoring, which we know in other parts of the world is something widely accepted. So, we don't
pretend, through the U.N., to replace the role that governments and leaders need to play in their countries.
What we are saying is that whatever is done at the national level must be encompassed by a consensus on very important things: on what the risks are,
on how we prevent unintended misuses of the technology, on how we set up guardrails so that a risk in one country is also a risk in another country.
Also, how do we align with each other on technical standards? How do we set up scientific panels so that all these risks you are
mentioning are not just fears with no scientific evidence? Because we are focusing a lot on the risks and we're not focusing, therefore, on the
opportunities, which we firmly believe (ph) are huge.
And I think we are all in the same boat -- companies, citizens, and governments -- in using A.I. for the good of humanity. And I think that is
a great opportunity.
SREENIVASAN: Even if you wanted to focus on the potential benefits of A.I., there are significant concerns about the amount of energy that is necessary
to power some of the data centers where all of this computing power would be working, right? So, here we are, on the one hand, in a climate crisis
that is a significant, you know, kind of cost for the world. Are we making things worse when we look at how A.I. is developing today without
really any environmental guardrails?
[13:50:00]
ARTIGAS: Absolutely. The level of development of A.I., with its level of consumption not only of energy but especially of water, is not
sustainable. And because we think that A.I. can be very positive for the achievement of the sustainable development goals, we need to ask for
sustainability requirements from the software industry too, as they are required of any other industry.
I think one of the recommendations is that we expect the scientific panel will shed some light on how to do that in a better way. How can we be more
efficient in software development so that we don't have this excessive consumption? Which is absolutely contradictory: using A.I. to
improve energy consumption or be more efficient while, at the same time, the technology is not sustainable by itself. I think that's the way we want to
go. That's what this international consensus must stipulate.
SREENIVASAN: Most of what we're thinking about A.I. as consumers might be chatbots, but there are, you know, much darker uses of artificial
intelligence that we are slowly starting to understand. One is the use of autonomous weapons. And this is a completely discrete conversation that has
mostly military stakeholders and, you know, heads of state involved.
So, I wonder, if in this kind of an advisory model, whether you've come up with anything to suggest or alter the course of how A.I. could be used in
defense.
ARTIGAS: Well, in this particular matter, we see that we don't need to provide a different instrument, because we already have the Geneva
Convention. And what we are recommending in the report is, of course -- we, I would say, expect or call for a treaty in 2026 to ban these
autonomous weapons. But this is a proposal that we make; really, the place where this must be discussed is the Geneva Convention.
So, we don't need to create a different instrument for that. There is already a multilateral platform to discuss this topic. Of course, when
we say that A.I. must be for the good of humanity, we, of course, consider that it cannot be harmful to people.
SREENIVASAN: We've also already seen, here in the United States, horrible cases where the models that the artificial intelligence was
trained on, especially for visual recognition, end up creating biases and exacerbating biases from the people who might have been programming it. And
some of that might be conscious, some of that might be unconscious.
So, how do you figure out how to create any kind of a conversation, much less a standard, so that companies in Europe and companies in the United
States, and maybe even companies in China, can say, here's what to avoid to make sure that your data, and the response and the output, is better?
ARTIGAS: Yes. I would say that companies are very, very responsible. I see that all the software sector is very responsible, but they need to do
things better. They need to improve the products. And when we are talking about, for example, the high-level policy dialogue, we include the companies
there. It's not only the countries talking among themselves; we need to include academia, we need to include the companies, we need to include the
governments, civil society. And that's where we need there to be a conversation.
I think all this is going to progress from the technical point of view. But it is true that all these models are general purpose, and then they are going
to be refined also with private data for specific use cases in the industries.
So, I think one thing is the normal evolution of the products, and the other is how we can assess the risk that this can have, for example, for
discrimination and fundamental rights or values. And that's where, again, we need this conversation. We need to put together the developers, the
users, the governments, the policymakers and come to this consensus and these standards.
SREENIVASAN: What do you think right now is the biggest obstacle to trying to establish something like this? Even if it's not sort of a hard agency,
but even the kind of softer steps that you're suggesting -- what does success look like in five years?
ARTIGAS: Well, I think now the first immediate step is to gain the support of the member states in the vote they have on the global digital compact,
that's going to take place on Sunday, within the discussions of the common agenda at the United Nations.
So, the first thing is this proposal from a group of independent experts that, as you mentioned, is not only 30 -- the 39 members of our
board; it's more than 2,000 experts in the world that have participated in different consultations, more than 70 consultations all
over the world.
So, we are proud and we are, say, quite confident that these recommendations make sense and that these recommendations have gathered
absolutely all the sensitivities. So, the next step is that we really gain the support of the permanent representatives at the United Nations and
push forward these initiatives.
But even if all these recommendations are not adopted, I think we have started the discussion from the civil society point of view. We need to put
all these challenges on the table, and we expect that we create a conversation around these topics from now on.
[13:55:00]
SREENIVASAN: Carme Artigas, thanks so much for joining us.
ARTIGAS: Thank you so much.
(END VIDEOTAPE)
NEWTON: And finally, for us, through the cracks of hate and transphobia, art shines through. More than 700 plaster casts now adorn London's
Trafalgar Square, each made from the faces of trans, non-binary, and gender non-conforming people.
Artist Teresa Margolles hopes her work will raise awareness of the violence directed towards the trans community in her native Mexico. Now, the piece
marks 25 years of art being shown on the square's fourth plinth.
And that's it for us now. I want to thank you for watching, and goodbye from New York.
(COMMERCIAL BREAK)
[14:00:00]
END