The Elephant in the Algorithm: Live from ZEG Fest in Tbilisi
Alix: [00:00:00] Hey there. Welcome to Computer Says Maybe. This is your host, Alix Dunn. And, um, I'm coming at you live from the Tbilisi Airport, where I am headed home from the ZEG Festival hosted by Coda Story. And it was really fantastic. It was three or four days, hard to recall given how intense they were, um, with, uh, people from all over the world interested in journalism and media and technology and politics.
Alix: Um, and we had the opportunity to co-host a live show there, um, with Armando Iannucci, Chris Wylie and Adam Pincus, all pretty influential, um, storytellers. Armando Iannucci, the famous satirist, um, who created Veep, um, and is extremely funny. Um, Chris Wylie, who, um, was the whistleblower that, uh, helped make Cambridge Analytica headlines, um, many years ago, but has recently launched an audio series called Captured, um, focused partly on AI as religion, which we'll [00:01:00] get into.
Alix: Um, and Adam Pincus, who most recently released a project called What Could Go Wrong with Scott Z. Burns, the screenwriter of Contagion, who tried to make the sequel of Contagion using AI. Uh, and uh, I presume from the title What Could Go Wrong, you have a sense that things probably didn't go as planned.
Alix: This week we'll air the live show from ZEG. Enjoy.
Alix: My question, just to kick us off, is why are we so bad at telling stories about technology? Like, what makes it so difficult? And I feel like maybe I can start with you, Adam, and then we can go this way.
Adam: Yeah, I mean, I, I think it is really difficult to tell stories about technology. It's burdened with a lot of jargon.
Adam: The ideas are a little bit complicated, and they don't become personal and they don't [00:02:00] become emotional very easily. So I think the challenge is how do you, you know, kind of reframe a lot of those buzzwords and those concepts that are pretty hard to penetrate, and ground a story in somebody's experience. And I'll just give you an example: I'm working on a documentary right now that's really about surveillance capitalism, a term that we've talked about, that's really tough, like what does that even mean?
Adam: And I could give you a terrible and probably very long explanation of it, and probably get half of it wrong. What I've been saying to the filmmakers is, I think what we have to ask our subjects, and what we need to explain to the audience, is: what are you afraid of, and what did you do? Because those are grounded more in things that we can all hang onto.
Adam: But I think it's phrases like surveillance capitalism where you're like, I don't even quite get it, and I don't understand the stakes. I don't know why I should care.
Alix: Chris, you wanna jump in on this?
Christopher: I think in this sort of AI tech space, whatever we wanna call it, even just the fact that [00:03:00] I struggle with naming what it is, the space that we're in, highlights the issue.
Christopher: I think we have not collectively come to what I think of as the Silent Spring moment for tech. So if you look at what popularized, or really mainstreamed, the environmental movement back in the late fifties, early sixties, there was a book called Silent Spring by Rachel Carson. And the majority of the book is like data and facts about essentially how we're destroying ecosystems.
Christopher: But the reason why it's called Silent Spring, and I think the reason why it resonated with so many people at the time, was the prologue, which is not that long. It's just a story about a town where all the birds died and spring is silent. And her contribution to what then later spurred on the popularized environmental movement was that prologue, because it really captured what it means to destroy the environment: that your spring will be silent, it won't be spring anymore.
Christopher: I think with [00:04:00] AI and technology, we have yet to find that story that captures the attention of people, really grips them emotionally, to understand what it means for the future of our society and what world your children will grow up in.
Armando: I think that's right. You know, this ZEG calls itself a storytelling festival. Story doesn't mean it's all false,
Armando: it's all made up, it's all pretend. It's just a way of outlining a narrative of facts. It's a way of allowing us to process otherwise difficult and unfathomable ideas. For me, the tech, and specifically the AI, problem is we have mythologized about it in the past, and we've got this myth of AI turning human, turning sentient.
Armando: And we've told ourselves, you know, until it does that, it'll be fine. Now, as it slowly progresses, we realize that that becoming human is not the story. It's the fact that something can be so complicatedly programmed and [00:05:00] pre-programmed that it behaves like it's human
Armando: but never is human. And trying to find a way to tell that story, I think, is the way into that particular subject.
Alix: Interesting. I love that you brought up mythology, but Adam, I wanna talk a little bit about a project that you've been working on that just went live yesterday. Uh,
Adam: Two days ago.
Alix: Two days ago. Um, called What Could Go Wrong, which immediately makes me wanna know the answer to that question. Um, this is a project with Scott Z.
Alix: Burns, who's the screenwriter for Contagion. Do you wanna say a little bit about that project, and sort of what you have learned about constructing stories that help people understand how this stuff works, the implications and mistakes?
Adam: This was interesting because I've worked with Scott in the past, and during the writers' strike, when they were not able to write TV and movies, I had a conversation with him where he said, you know, look, there's a lot of anxiety about AI amongst my peers.
Adam: And during COVID, the film Contagion had a real second life, in that a lot of people watched it and were struck by how much it reflected what we were all going [00:06:00] through. And those two things combined led him to this idea that maybe he could do something he'd been unable to do prior to this by himself, which was to come up with a sequel to Contagion using AI.
Adam: And I think in his mind there were two lines of questioning. One was, could AI predict the next pandemic scientifically, which I think we pretty quickly abandoned. And the other was, could it be a creative partner? Which is really where the show goes. And I think that the core of this is, it's a show founded on anxiety more than anything else.
Adam: He's just anxious. He's, you know, he's anxious about the future. A lot of people are. And he wants to try this in a kind of a genuinely curious way. And going back to, I think, one of the questions, how do you tell a story about something really abstract: we basically created characters out of a bunch of different AIs, and they recur over and over again, and they flounder and they get things wrong and they hallucinate and they are full of hyperbole and they flatter and they flirt and [00:07:00] they do all these things, which I think shows you a lot about how the technology operates,
Adam: I would say, to get its hooks into you. But you know, for story purposes, it works pretty well, because they become recurring characters in a narrative about Scott's attempt to understand something, and Scott's essentially a proxy for the audience.
Alix: I mean, it's a new way of telling a new story, but I feel like you're also working on some things that are maybe old stories, where we're seeing technology politics manifest in sort of traditional power struggles.
Alix: Do you wanna talk a little bit about the work you've been doing around Sidewalk Labs or kind of your exploration there?
Adam: Yeah. I mean, this is a story that actually came to me through a mutual friend of ours named Roger McNamee, who said to me, when I met him, something that stayed with me.
Adam: He said, I wanna make the Philadelphia for technology. I wanna make a story that is emotional and works on an emotional level, so that people understand what's at stake here, and it changes their minds by affecting them. [00:08:00] And we kind of cast around for a little while, and then he sent me a piece about an effort that a subsidiary of Google called Sidewalk Labs had abandoned in Toronto.
Adam: This is right as the pandemic was starting, and it is really a story about them trying to build a smart city "from the internet up," this is their language, in Toronto, on a 12-acre waterfront site, which was met first with great enthusiasm. You know, they use all the best words: it's gonna be sustainable, and it's gonna be innovative, and it's going to, you know, improve all of our lives.
Adam: It's a word cloud of wonderful things. And then, when they were met with some questions, and they answered those very occasionally and opaquely, they got some skepticism and then ultimately resistance. And some tech companies respond to resistance with disregard. You know, they're very tough to embarrass, and I think Chris knows one in particular, but Google hates it.
Adam: So they just packed up and went home, and that's that story.
Alix: Yeah. We actually [00:09:00] interviewed Bianca Wylie, um, in the fall, who was one of the organizers. I think that resistance period took a little bit longer, um, than initial resistance sort of leading to Google leaving. It was actually city meeting after city meeting of, I think, Bianca and other people showing up and saying, you keep talking about innovative building projects.
Alix: What is innovative about these projects? And one of the conclusions, after asking that question a lot of times, was that they were using timber. So basically her process was: let's take in good faith the public engagement consultation that the company is saying they'll participate in. We'll take it in good faith and we'll go and just ask questions.
Alix: Eventually they got, I think, so irritated. And maybe also just, they don't like pushback at all. They like people to get excited about their products. Yeah, it's amazing pushback. Yeah. Armando, I wanted to talk to you a little bit about, when I heard you were gonna be a guest, um, on this panel, I was thinking about Selina Meyer from Veep, um, making AI legislation. And I just kept picturing that and thinking about, uh, the public conception of what the government can and can't [00:10:00] do when it considers these new technologies, and how in the public imagination, um, there might now be a sense that these people can't regulate these technologies.
Alix: Um, and I wonder a little bit, one, about what you think about Selina Meyer making AI regulation, but also how you see satire being part of the process of mobilizing and communicating the implications.
Armando: Well, first, on Selina Meyer, the, I mean, maybe I'm so close to all this, but I just know that most politicians don't understand it.
Armando: You know, you've seen the Zuckerbergs of this world sitting in front of a panel of senators, and one rather elderly senator saying, so how do you make money, and, um, can I click on this? So politicians are easily bamboozled by people with money who talk business talk, or who talk tech confidently.
Armando: And they see it as something that they can, for a large amount of money, get taken off their desk, that they personally don't then have to deal with, and that they can then get the credit for. And I think [00:11:00] part of the stories I'm telling are, you know, we have to do something about that, because these people
Armando: are not as good as they say they are. You know, these are people who at the age of 20 or 21 started something, and then by their mid-twenties have a company that is recognized by, you know, half the people on the planet. They've never had to experience conflict or pushback or competition, and have also grown up with the fact that they can hoover up anything of ours.
Armando: As long as we get their service free, they can take anything that belongs to us, somehow not pay any taxes, and get huge profits from it. But I see it in the UK. The UK's just done a big deal with Palantir. Palantir now has access to all our medical data. That's not good. So politicians are just easily swayed by these individuals, and the more we can show that, then hopefully the more we can try.
Armando: The other thing I try to avoid is anything that's too bleak, 'cause people will switch off if it's too bleak. So [00:12:00] try and make it funny, and try and make it possibly hopeful. Try and show a way out if you can.
Alix: Um, you're working on a project that maybe is taking on some of these questions directly. Do you wanna...
Armando: Well, I work on a movie that's set in that world of social media.
Armando: But, uh, you know, I, I found over the last couple of years I was making the mistake of trying to say too much. You know, somehow the themes are endless. The data, the information, the people you speak to: you can go on researching it. You can research it for years and you still haven't got to the essence of it.
Armando: So I'm gonna temporarily park it while I step back and try and, you know, boil it down to the key storylines. I think with big, cumbersome stories like this, you have to try and identify the one or two interesting themes, go in through the human perspective, find one or two interesting characters, and just rely on them to take you into the wider picture.
Armando: Don't feel you have to force-feed them all the information at the start, because that's too much. As a viewer, that's too much to take. So it's [00:13:00] now about trying to tell the story properly and carefully, and not in a sudden rush of information.
Alix: Chris, I feel like you maybe also struggled with this making Captured, this audio...
Alix: Is it a book? Do we call it an audio book or an audio series?
Christopher: I think it's an audio... I'm not sure, actually. I think it's an audio series.
Alix: Okay. So we'll call it that. Sounds better. So this audio series, part of that exploration was around AI as a religion, um, which I think is a really interesting frame, especially since you brought up mythology.
Alix: Do you wanna talk a little bit about the kind of juxtaposition that you explored, in terms of the people that are so excited about these technologies, and the people making them trying to construct it as a myth and a religion? And then maybe a little bit about the realities, and how you told stories that made some of the dissonance between those two things clear.
Christopher: Sure. So when you listen to all these sort of big-name tech bros talk about the future, one of the things that they claim is a definitive [00:14:00] knowledge of where society's going to be, what it's going to be shaped like. They speak with an absolute certainty about where we're going and what they're building and what that will do.
Christopher: I found that really interesting, especially when they start talking about how powerful everything will be. So when you listen to people like Sam Altman about this grand power, or Elon Musk about this grand power that AI will become, that it will sort of subsume all human knowledge and reach what they call the singularity point, which is the point where AI not only encompasses all human knowledge but surpasses our capabilities,
Christopher: to create superintelligence. And then when you look at people like Elon Musk, where he not only is experimenting in spaces like AI but also with neurotechnologies, so Neuralink, creating human-computer interfaces, brain-computer interfaces, with the goal, as he puts it, of creating both a read and write function with people's minds. [00:15:00]
Christopher: Eventually, as he puts it, being able to take someone and just plop them into a computer. And if you think about sort of what that means: you create a superintelligence that transcends our capabilities and knowledge, that also transcends our physical and mortal constraints, that can exist forever as long as there's energy, and then that we can put ourselves into it and live forever and transcend death.
Christopher: To me, that feels like a religious prophecy. And when you look at a lot of religions throughout history, one of the things that they promise is, in some way, transcendence: in some way being able to exist beyond our current realm or our current existence. And this is the thing that they all seem to be working towards.
Christopher: It's interesting because they all talk about it very openly, but when you look at sort of how the media reports on it, almost all the time, they're making these [00:16:00] comments and they're sort of discarded as offhand. You know, Elon Musk literally said, I think God will emerge in my servers. The reason I find that troubling is, if you sort of extend that and think, okay, so this AI is some kind of supreme being, and then we're putting this supreme being all throughout our homes and our societies. Speaking of, like, storytelling, if you think about
Christopher: the genre of, like, a haunted house: what makes ghosts scary? Ghosts and spirits are scary because they're these sort of invisible things that can see us, that can watch us, that can subtly change our environment, but we can't see them, we can't touch them, we can't interfere with them, and we are very helpless in that situation, right?
Christopher: Like, a haunted house is scary because we're helpless. We have no ability to fight back. But that is the house of the future. We are [00:17:00] creating a haunted house; it's just haunted by algorithms. And so for me, when I look at religion and spirituality and all of these things, I think it makes AI make a lot more sense, particularly when the leaders in Silicon Valley see themselves almost as messianic prophets that are going to be the ones that conjure up God.
Christopher: Their actions make a lot more sense if you, if you understand that they see themselves as prophets.
Armando: There's something also about, um, these people who rely on that label of genius, because if we call them a genius, it means that only they can understand what it is they're doing, and it's actually pointless to question them.
Armando: Pointless to say, will this lead to bad things
Christopher: in the same way that we don't question the prophets.
Armando: Yeah. Yes. You know, it's like the 1950s, where the priests were, you know, senior figures within the community that we wouldn't question, the same way we wouldn't question what the doctor says. So we've been conditioned [00:18:00] to not question what Sam Altman says or what Elon Musk says.
Armando: I mean, that's breaking down with Elon Musk. But, uh, they still maintain that belief that they're the only ones who can see it. Therefore, why can't they have all our resources at their command to make it happen? And if the world doesn't understand us, then we'll just bring the world down. You know, we'll drink the Kool-Aid and we'll all sleep forever.
Alix: I mean, I feel like what's interesting, also, since we're recalling religion, is that religions are often global in nature. Um, and something I think a lot about is: is it a religion, or is it a cult? And I feel like cults are a little bit more localized. So maybe there's, like, the cult of the Bay Area, and then we have these religions. And I kinda, I don't know.
Alix: I wonder if you could unpick a little bit the scale that these guys are operating on, in terms of the global imagination, and how you can make stories that resonate with so many different people around the world in the way that they have.
Christopher: When you look at religions, you can have cult-like groups [00:19:00] of extremists, and then you can have a wider group of passive followers.
Christopher: I think it is at least quasi-religious in nature, or at least it's developing the characteristics of it. Because even when you look at people like Bryan Johnson, who also talks about living forever and integrating with the AI, but in the meantime creates these really strict lifestyle codes of how you need to behave, how you eat, how you sleep, right?
Christopher: It's sort of like the kosher for tech bros. I look at all of that, and it just feels like some kind of new, at least quasi-religious movement, and it's not just a cult, because you have a wider periphery of people that are slowly, whether they realize it or not, adopting these ideas and internalizing them as well.
Armando: And if you look at any, you know, any study of religion, it'll show you there is always a need. We have a need to find some belief that can take us beyond ourselves. And if we can join a movement where there are many people who believe this, then we've [00:20:00] found a community, and so on. And maybe as traditional religion has died away, you then get the New Age, the religion of wellbeing and wellness and so on.
Armando: The thing with what's happening now is, I think people are attracted to it because it feels like a religion, and they feel it's founded in fact, because there's a science to it. It's to do with technology, so it can't be silly, it can't be false. You know, there's a reality to it, and I think that's what gives them the confidence
Armando: to see the Musks of this world as their saviors.
Adam: I think that there has been some movement with a lot of people, at least away from seeing these things as benign or benevolent. I do think it goes back to about 2016, where you first start to see technology companies look not like the do-no-evil source of innovation and progress, and instead look a little bit more like, I think, many of us see them today, which is
Adam: incredible [00:21:00] concentrations of power that project an inevitability about the products that they're creating, which are businesses that they want us to consume, right? So that narrative has shifted, and there's more skepticism today than there was, you know, three, four, five years ago. I mean, I even think about the Sidewalk story as emblematic of this, because when they come to Toronto, Toronto can't believe their luck.
Adam: Little Toronto, that I think has, you know, as you would say, a little bit of an inferiority complex about the United States, and can't believe that they've been, you know, picked at the highest level. They can't believe their luck. They go on a journey themselves and realize that they've signed up for something that they don't understand, where the motives are not clear,
Adam: where the deal's a little too good to be true, and maybe that's because they don't understand really what they're giving up. One of the things that I think you hear all the time, so, another problem with telling stories about tech, is you'll say, well, privacy, and people will say, I don't care about privacy because I [00:22:00] don't have anything to hide.
Adam: Yeah. I'm like, I don't think that's the whole story, although my own, even my own ability to sort of tell you why that's problematic is really limited. Like, I think you could explain that much better than I.
Christopher: Um, I think when people talk about privacy, they often frame it in the position of hiding, right?
Christopher: And people will think about Snowden and the NSA, and, okay, well, I haven't done anything wrong, so if it's spying on me, so what? They can go look through my dick pics for all I care. But the problem with that is that privacy is the foundation of your ability to craft who you are and how you want to exist in the world when you go and live your life.
Christopher: You go to work, you have constructed an identity and a persona about who you are at work, which is different than who you are with your children, versus who you are with your lover, versus who you are with whomever. And it's only privacy that enables you the ability and power to determine how you want [00:23:00] to exist, how you want to show up in different realms of your life. If you remove that power of being able to decide and determine who you are,
Christopher: and you hand that over to something else, you lose your ability to be you. And that's what privacy is. It's our ability to construct who we are.
Alix: I think of it as an instrument of pluralism: if we don't have privacy, you can't actually be the specific person you are in the fullest way possible, because there's a system of control around you that makes that impossible.
Alix: But I think, this point that there's been a pivot, everyone has seen it. We went from being like, Google, great; Gmail, wonderful; Facebook, wonderful; Arab Spring, wonderful. Um, to: these guys are really creepy. They do all kinds of things that would be crisis communication disasters for any other company.
Alix: Like, most of these companies should have been embarrassed into bankruptcy a long time ago. And yet they've accumulated more and more and more power, sometimes through constructing these mythologies and these religions. So let's [00:24:00] now turn to them and sort of think about the power of story to counter the stories that they're telling.
Alix: 'Cause I feel like that's the name of the game now: figuring out how to construct, to world-build essentially, where we can imagine a world where they're not the architects of all of the infrastructure around us. And I wanna draw on something Timothy Snyder said yesterday when talking about the role of narrative in imperialism.
Alix: And he made this really interesting point that when you're thinking tactically about how to counter imperial narratives, you can't go head to head with them, because if you go head to head, you end up reinforcing the dominant narrative. And then he said, rather, you should cover the rock, this hard narrative of imperialism, with so much interesting ivy that the rock disappears, and there's something new sprouting from it that just distracts people from that core message.
Alix: And I wonder a little bit, like, how do you all think about creating stories that don't react to the dominant narrative and paradigm, um, and instead help audiences really engage with new possibilities [00:25:00] entirely, sort of new worlds that we could be living in?
Adam: I mean, I, I think that humor is a big piece of this.
Adam: And one of the things that we tried to do with the Scott Burns show was to make it funny at turns, right? So you might have someone like Meredith Whittaker, who's, uh, you know, we've talked about, a very, very smart speaker and very good at communicating ideas, and she comes into the show to basically say, how credulous do you have to be to believe this story?
Adam: Like, what a chump, basically, for buying their narrative, and even thinking that this thing could collaborate. What are you even talking about? You're gonna collaborate with it? And then Scott, who's our, you know, our character, does it anyway, has these sort of surreal encounters with these AIs where he's
Adam: completely flummoxed and embarrassed and doesn't know what to say or do. But I think juxtaposing those two things, somebody extremely sober saying, let me tell you what is real, and then a character who is, like I said, kind of a stand-in for all of us saying, okay, but I'm [00:26:00] still curious.
Adam: And I'm just gonna continue to flounder along and move forward, which I feel like is what we're all doing anyway. It's like, yeah, no, I think I understand the risks. I'm just gonna keep going.
Armando: Yeah. I mean, comedy is important. I think its role is much more like, you know, an insurgency rather than an equal but opposite force, like you were saying. You can chip away, you can be funny about the humorlessness of these people,
Armando: the fact that, uh, Elon Musk's jokes are some of the worst jokes ever devised by man. Or machine. I was told a story about Nigel Farage, who is a right-wing politician in the UK. And, uh, there was a comedian who was about to do an event with him, and so she introduced herself as a comedian. He said, I don't find comedy funny. Which just said everything about what we're dealing with here.
Armando: Um, and they don't find jokes about themselves funny. They get very irritated by it, as do most dictators. So you can chip [00:27:00] away there, but I think that's just minor, minor work, really. In the end, it's: what is the alternative? I don't know what it is. What's that story, that hopeful story? I do remember once being at an event sponsored by Google, and it was like a TED weekend, lots of 20-minute talks and whatever, and AI this and, you know, charity that, whatever.
Armando: And, uh, across the weekend, the theme was meant to be optimism, except the keynote speaker at the end was Stephen Hawking, who came on and said: I beg you, be careful with AI. It will kill us all. Everyone was going, he's meant to be optimistic, what do we do? It's Stephen Hawking. There was just something inherently comical about the fact that he breached the kind of religious codes that we were all meant to be following for those two days.
Christopher: Or, or maybe that was his version of optimism, that we'll all just be ended quickly by AI. Or maybe he was kidding.
Alix: Maybe he was kidding. Yeah, he was a jerk. Um, but I feel like even using that frame of AI's gonna kill us [00:28:00] all is in many ways reinforcing the very narrative that they want us to be thinking about, which is that these systems are so all-powerful.
Alix: Yeah. And I find that, like, this pretend spectrum of the optimists, who say this is gonna solve all the world's problems, and the pessimists, who say it's gonna kill us all: they both think the same thing, which is that this thing's gonna be really powerful. And I feel like most thinking people don't feel that way, or at least are skeptical.
Armando: That, or highlighting the mistakes it makes, and that they're hallucinating, and yeah.
Armando: I remember you last year saying, when that was at its height, you know, they were looking for more money, and you were saying that they have to project this thing as, this is gonna be great, because they need the money, not because they know. You were saying they didn't know. They don't know where it's going. They still don't.
Adam: A lot of it's still bullshit.
Alix: Yeah. There's serious Ponzi scheme vibes happening. Yeah.
Adam: Yeah. No, I mean, so Roger would point that out. I think when we first started this project, he said, listen, they're gonna be out of business by Christmas. And when I pointed out to him [00:29:00] that Christmas had come and gone, he was irritated by that fact.
Adam: But I don't think it changed his point of view, which is, you know, that it is an investment story that is, you know, very much to our detriment in terms of its use of energy and water and all these other things. And this myth of competence is, uh, you know, in his view and many others', exactly that: a myth. Meredith Whittaker being one, Osama being another, who we talked to, who's like, no, this road doesn't go there.
Adam: Like, maybe somebody someday will come up with something that will be powerful enough to kill us all, or any of the other sort of narratives of omnipotence, you know, that it will replace us all, et cetera. Um, we're buying into something that, you know, this product doesn't go there. That's his point of view.
Alix: I also think the social decline because of social networks, that leads to countries dissolving their public health functions, is probably much more likely to kill us than some technology system. But so, how do we... I [00:30:00] mean, you mentioned, um, we haven't had the story, in using Silent Spring, which I think is a really good example. And there's obviously, you know, we're in Georgia, which has a different set of problems related to technology.
Alix: This idea that you can have global stories help people understand enough, in a contextualized enough sense, to sort of articulate what it is that they want, I think is really tricky. Like, I don't know if there is one story. It feels like we need lots and lots and lots of stories. So how do you all, do you see that happening?
Alix: Like, how do you feel about the overall kind of direction of, um, storytellers being able to produce the kinds of stories that help people understand the implications of all of this?
Armando: I mean, I think of the example of Black Mirror, which is, yeah, just a set of very different stories, but around a certain theme.
Armando: Because I think that's right. I think it's impossible to tell this one story because we dunno how it ends. We don't even know if we're still at the start of it or in the middle of it. There's nothing tangible. It's like global warming and climate. It's very hard to show a drama about climate change [00:31:00] because it's not a singular event, and it's set sometime in the future, but we don't know when.
Armando: And that's a hard thing to dramatize 'cause it's hard to make personal.
Alix: Yeah. A plug for Ministry for the Future, which I think does a really, really good job of that. But it is really tricky.
Adam: Yeah, yeah. I think trying to talk about it, these are kind of abstract ideas, and they don't land, I think, very easily with us on a personal level, and they make you feel powerless.
Adam: And I think that it's easier to tell a story that demonstrates something about an individual, where you can sort of understand the experience and relate it to your own life. I think it's a lot easier to try to tackle the problem that way. I would say it's easier to be specific than to try to tell sort of a big story.
Adam: 'Cause the big ideas are really complex and they're really abstract, and I'd rather see something in miniature that sort of speaks to a larger idea. I just think it'll be easier to do and probably more effective too.
Alix: I think you get at the [00:32:00] big ideas by doing that also. Like, there is a singular truth, I think, that comes from individual stories.
Armando: But yeah, I think people are capable of moving from the specific and individual to the national and international and the global. That's the power of story. It's about story, but also the way we watch stories now: we can access dramas and comedies from all around the world. Yeah. So we feel less detached from a drama, even if it's from Scandinavia, because we connect with the people in it.
Armando: You know, so that's the way in. And we feel we're being given something that's false if it feels like it's been dressed up to appeal to an international audience. It's almost like the more specific it is, somehow the more it becomes a symbol for everyone.
Christopher: I think about, not to bang a drum about religion, but when you look at the amounts of conflict, the amounts of debate, the [00:33:00] amounts of activation of people that the stories of religion have prompted throughout history, I don't know if I fully agree with what you're saying, that we need to distill it down to a specific character in a, in a tiny environment and exemplify something through that small thing. Because that's not what, like, the Bible does, per se.
Christopher: I mean, it's a series of stories, for sure, but there's wider ideas that people go to battle for. I mean, you had entire Crusades. And so I think if people start to clock that what's happening with AI is not simply technology or simply the future, but it's actually ideology, that's the thing.
Christopher: Once people internalize that, that it is ideology that is being imposed on them, not AI per se, then you might sidestep this issue of storytelling, because people will engage and, you [00:34:00] know, defend their livelihoods against a threatening ideology once they realize that it is a threat.
Alix: And an ideology. Yeah.
Armando: Maybe that's a task for the journalist or the documentary maker. That coming together of, you know, a group of people who don't want the government to tell them how to run their businesses, allied now with politicians who believe in government being as small as possible. That's a story that should be told, but it doesn't have to be done fictionally.
Alix: I mean, your example of Palantir is a really good one. Um, well, I'm mindful of time. We started a bit late, um, no shade on the last session, but they were late. I was thinking we could turn to Q&A for like [00:35:00] five minutes. I see hands, but there are lights in my face, so could someone else find a microphone and find those people?
Alix: Yeah. There's two up front.
Audience: Thank you. Um, Mark Mullen. I wanted to ask about, in storytelling about sort of new, complex, weird things, the process of naming those things. How do you come up with those names? Is that a part of the process? Is it not? Should we let that happen organically?
Audience: If you do feel that naming is an important part of the storytelling, is there a process by which you do that? What role does it play? Thanks.
Alix: That's super interesting. I think the name Palantir is a really interesting choice, 'cause it comes from Lord of the Rings and is an evil, all-seeing thing, and we decided to have the NHS partner with them as their primary digital partner.
Alix: Kind of creepy, you know. But, um, what are your thoughts on naming things? You've all spent some time doing that.
Armando: We were talking outside about having to come up with the science, the phrases, about climate early on.
Christopher: Yeah. So I think back to, so we talk about climate change now, and everyone knows sort of roughly what that means.
Christopher: Even though none of us are climate scientists and we're not technical in that regard. I mean, I couldn't explain exactly what ozone is and how, you know, [00:36:00] sun rays affect all the different chemicals. Like, I don't know, but I understand the principle of climate change. And these words all had, at some point, an origin, right?
Christopher: So if you go back, I think it was to the seventies, when NASA wanted to brief Congress on shifts in meteorology that it was noticing and was concerned about, it literally had to decide what to call it. And so global warming, this idea of global warming, and greenhouse gases to explain what all these chemicals are doing: it's like a greenhouse.
Christopher: These are terms that emerged and that we then adopted as a public, as a way to quickly talk about these complex things. And at least in the AI space, we've done a kind of shit job at coming up with words and terms to describe these, like, complex things. Um, even on panels like this, we all have moments of struggling to kind of encapsulate what it is exactly that we're talking about.
Christopher: Mm-hmm. I think about, like, all the things that AI is doing. It's like, we're displacing [00:37:00] humanity. What do we call that? Yeah, the repla...
Armando: "The Replacement" has been taken, hasn't it? Yeah. Right.
Christopher: Oh, oh gosh. Yeah. No. Yeah. Um, but I would love for, you know, people in storytelling or science fiction or whomever to, like, come and say, here's some ideas.
Christopher: Like, if you were writing a story about the future of a political debate about AI, what would they call it?
Alix: Yeah, what we call it. Yeah. Another really powerful one, I find: Amnesty International did a report about what's happening in Palestine, and they titled the report Automated Apartheid.
Alix: Um, which I found another really powerful, like, boom boom. And I think it's those sorts of phrases that transfer so much. They pack in so much information, and they're kind of catchy. Yeah. Even if you don't really know what you're saying, and you repeat that.
Armando: But, but then of course the phrase then becomes everything.
Armando: So yeah, it was global warming, and then the likes of Trump would say, well, I see snow, where's this warming I've been told about? So you then have to change it to global emergency. So, you know, the phrases are there as a way in, but they're not then freezing the concept forever. They're not defining [00:38:00] it forever. We have to keep refreshing how we describe these things.
Alix: Yeah, interesting. Names have a half-life. Okay, next question. Yeah. Is somebody here?
Audience: Hello? Um, wonderful insights from a wonderful panel. I have a question for Mr. Iannucci. Uh, roughly three years ago or so, we had an interview, and something you said kind of stayed with me.
Audience: I asked you whether you felt threatened in your line of work because of the rise of AI, and you said you didn't, because you still could not find a single good joke told by the AI. Uh, and it has come a long way since. Has it learned, uh, to tell a good joke, and what will happen when it finally pulls it off?
Armando: Well, it's still, uh, you know, I'm blessed with having the initials AI anyway, so I can always say that I'm... um. But it still hasn't. I mean, it may get there. Somebody I knew asked AI to write a column as if written by me. I read it, and it was terrible. I mean, I could see what [00:39:00] it was trying to say, but the examples it was using were just the wrong examples.
Armando: You know, like a machine that still doesn't know whether this is a glass or a nose. It still hasn't worked it out. There was the shape of humor without the actual meaning of humor. But who knows? I don't know. I'll never say never. What I have noticed is the difference in just those two or three years, you know; it's got a lot more sophisticated. But I'm still clinging to this thing: it's generating ideas and thoughts based on what it knows.
Armando: Things like comedy and poetry and so on, I suppose, are all about new ways of describing things. So it's when it gets to that stage, when it's found a new and interesting way of describing something, that it'll have made a significant leap.
Christopher: To your point, I would agree that we're nowhere near the point where you could be replaced, if, if ever.
Armando: I'm close to retirement anyway, so.
Christopher: Uh, but, [00:40:00] but just to add to that, it's because when you write, you draw on lived experience. You have experience, right? You're an embodied being that goes through the world. You're not a database, right? And so when you look at LLMs, they don't have embodied experience. We haven't given them that yet.
Christopher: This is also why, by the way, within AI, the big conversation right now is how do we give AI eyes, ears, noses, and also bodies to actually experience the world. The reason why it's so intuitive that this is a cup and not an ocean or whatever is because we've experienced it. AI doesn't experience things at the moment, and so I think, at least for now, you're safe. In the future, I don't know.
Alix: There was someone, yeah.
Audience: Hi. Thank you for a very engaging conversation. I'm really also taken by the title, the elephant in the algorithm, or the elephant in the room, and likewise the reflections [00:41:00] on religion. I think most of the people in the room would agree that any structure of power that is unaccountable becomes a problem.
Audience: And one of the things that we have with most forms of technology or social media is a massive asymmetry of power. And in a strange way, one of the elephants in the algorithm is us constantly handing over, whether it's our data, or handing over the fact that we can't leave Twitter when it's become X, because it's become the commons.
Audience: And so I wonder if you could reflect on what that reformation or revolution in power structure might look like, or where the opportunities are. That's my question.
Armando: And it's interesting. I think, again, I dunno what the discussion is here; in the UK there's this debate now about whether, you know, phones should be allowed in schools, and when children should be allowed to [00:42:00] use social media and not.
Armando: So there's beginning to be a recognition that it's not totally a good thing, and that we should start talking about how we disengage from it to a certain degree, so that we're not, from the moment we wake up to the moment we go to sleep, switched into it. So that's the start. But we know, of course, that any kind of CEO of any tech company will be lobbying elected representatives to stop that from happening.
Armando: Even though they'll never allow their own children to grow up being on social media until a certain age. We've got to be braver about not giving it so much money so readily, you know, the tax exemptions it gets for taking all our intellectual property. Again, I see those discussions happening, but when are those debates actually gonna turn into votes?
Armando: I don't know.
Alix: I would also say the frame of individuals handing things over implies consent. Yeah. And I think that the structural power that has been accumulated by, like, eight [00:43:00] billionaires, um, is such that I think it's the responsibility of nation states, who have stepped back, um, yielded so much space, um, to these eight dudes.
Alix: And I think until you have Mark Zuckerberg in jail for what happened in Myanmar, and you actually have political action taken at the level of breaking up these companies, it's not fair to say, you know, that a random school teacher in Extown using her phone and connecting to Facebook is somehow ceding power.
Alix: I think the people that ceded power are the ones that should have been representing us and fighting against this structural power being built around us. That would be my take. Yeah.
Adam: Yeah, I completely agree with that. And I just wanted to add, because there is this internalized feeling of powerlessness.
Adam: So, I teach at NYU, and, uh, I was telling a colleague two things that had been recommended to me. One is, I gave them their midterms in a blue book with a pen. And I was like, this is about finding out what you know, right? I don't care what [00:44:00] ChatGPT knows. ChatGPT is not in my class, at least not that I'm aware of right now.
Adam: But the first thing is, I want to know what you know, so you're just gonna write it down in this old-fashioned book. And I'm sorry that we have to do it that way, and I'm sorry I have to read your handwriting, but them's the breaks. But this colleague said to me, I actually require them to use ChatGPT, because I think it's inevitable, and so I wanna see how well they can use it.
Adam: It blew a fuse in my brain. I mean, I don't even know how to react to that, because I don't agree with that statement. I don't think it's inevitable. I think if they use it, they're shortchanging themselves. Technically they're also cheating, but they're shortchanging themselves more than anything else.
Adam: And I want to find ways to steer them away from thinking like that more than anything else.
Alix: Yeah. Okay, I think we're at time, I just got a little hand wave. Um, I know there are several other questions. I presume you guys will be hanging around, um, so maybe the conversation will continue. Um, thank you everyone for joining us, and thanks to you all.
Alix: This was great. Thank you.
Alix: All right. I [00:45:00] hope you enjoyed that conversation. And thanks again to Coda Story and Natalia Antelava for helping organize, and for even coming up with the idea. Um, and thanks to Georgia Iacovou and Sarah Myles for producing this episode. And next week the team will be at FAccT, um, so if you're around, uh, please reach out.
Alix: Um, look around for Georgia Iacovou, Soizic Pénicaud, and Hanna Barakat, um, who is also a project officer with the CRAFT program, which should be great. Um, even though I co-chaired it, I will not be there. Um, but that's fine. My team's better than me, so go hang out with them. They'll also be doing interviews and producing some content off the back of that.
Alix: So if you're jealous you're not at FAccT, um, there will be some goodies for you, to, you know, learn some of the findings from the research that's presented there, and some of the cool research and researchers that are coming out of that space now. So if you're at FAccT, um, go say hi, and if you're not, look forward to some good content coming outta that.
Alix: Um, and, uh, for those [00:46:00] of you who are traveling to Athens, enjoy. It is a wonderful city. Um, for those of you who didn't know, we lived there before Mexico City, and it's very fun. So have a lovely time, and we'll see you next time.
