AI in Gaza: Live from Mexico City

Alix: [00:00:00] Hey there. Welcome to Computer Says Maybe. This is your host, Alix Dunn, and in this episode we have a live show for you. So last week in Mexico City, where I live, for those of you who didn't know that, we had the opportunity to host a live show on the heels of REAL ML, which we'll drop a link to in the show notes, but it's one of the coolest communities of researchers and activists from around the world working on algorithmic accountability.

Alix: We were able to host a conversation that you would think would be encouraged in every single space about human rights, about digital rights, about technology, politics, but really it's been pushed out of a lot of those spaces, and that is how AI is being used in Gaza and in warfare, and what the implications are for human rights and what we can do about it, and how we should conceptualize big tech accountability when tech companies are knowingly enabling war crimes.

Alix: I personally think [00:01:00] that if you're not working directly on this issue, it might be overwhelming content, but the least we can do is know what's happening. The least we can do is hear directly from the people who are affected, whose families are affected, and who are working hard round the clock to stop what's happening and make sure that the people who have helped enable it are held accountable.

Alix: So with that, here's our live show in Mexico City.

Matt: My name is Matt Mahmoudi. I am an assistant professor in digital humanities at the University of Cambridge, and a researcher and advisor on artificial intelligence and human rights at Amnesty. Largely what I do is look at the uses of AI in law enforcement, military, and migration contexts, but I'm also interested broadly in the ways in which the AI industry

Matt: uses pretenses of racialization and oppression as a way of [00:02:00] circulating capital between some of the bad actors that we're gonna be talking about today.

Marwa: Hi everyone. My name is Marwa. So I am a Palestinian and I work on digital rights. I work for a digital rights organization called Access Now, and I lead their work, uh, in the Middle East and North Africa region.

Marwa: I work on many issues, not just AI warfare: surveillance, uh, censorship, platform accountability, among others. Now, of course, working on tech accountability and AI warfare has been the reality for the past year and a half. I learned a lot, and that is basically what brought me to this world: the digital occupation of Palestine and the different manifestations in which technology is used to oppress, uh, and brutalize people.

Wanda: I'm Wanda Muñez. I am Mexican. I'm really excited that all of you are here today and traveled so many hours to come to our country, so I hope that you are enjoying it. I work as a consultant for different organizations and I'm a member of the Feminist AI Research Network in the Latin American chapter. And, uh, [00:03:00] for many years I worked as a humanitarian worker and human rights activist.

Wanda: And now for a few years I've been working on, on AI from a human rights and gender equality perspective. And a lot of my time and effort has been involved in working on the processes where we are advocating for the adoption of new rules on the use of AI in the military domain.

Karen: My name is Karen Palacio.

Karen: I'm also known as Scar. I'm an interdisciplinary artist and an AI developer from Córdoba, Argentina. I really focus on maintaining, uh, research-production loops that will help me uncover what it means to weave, uh, multi-dimensional texts surrounding what it means to enunciate techno-artistic discourses.

Karen: This implies relating to the histories of [00:04:00] computer science in Argentina, the histories of women in tech, and technological sovereignty.

Alix: Wonderful. I think we're already starting to see some interesting connections between your work, but Karen, your work looks at the history of the use of technology under the Plan Cóndor

Alix: dictatorships. Do you wanna start there? I feel like it contextualizes a lot of the other conversations we might have. And then I'd love to talk a little more about your concept and work around cognitive warfare.

Karen: Yes, sure. So let's talk a little bit about Plan Cóndor, also known as Operation Condor. So, Plan Cóndor was an intelligence and repression military operation

Karen: that happened between 1975 and 1983, and that was implemented by the Southern Cone countries, but was led, coordinated, and financed by the USA. This military [00:05:00] operation, of course, is very relevant still today for us all in Latin America. It reinstated neocolonialism in a very particular way that was later exported to other regions.

Karen: All of these dictatorships were right wing, were very deeply interconnected, both on an informational level and on the type of techniques that they used to fragment the social fabric and the social sphere of these countries and implement new ideological concepts and categories. These, uh, narratives that were implemented were later used to justify state terror.

Karen: They would reconfigure the subjectivity of these entire countries. On a more, not so abstract level, so you understand the [00:06:00] kind of genocide that is meant: this implies the extermination of three generations of people that were not politically aligned with the USA. It also involved extreme experimentation on psychological operations and cognitive operations in regards to torture and in regards to social engineering.

Karen: So the main objective of Plan Cóndor was for these countries' populations, uh, to accept economic and ideological dependence on the USA. So a lot of these concepts that were implemented through terror, state terror, and genocide are things that these countries are kind of in a loop with. These themes still come back time and time again, specifically when there are waves of colonialism and the geopolitical [00:07:00] sphere changes and, and shifts.

Karen: And so all these concepts act as a sort of open wound that elicits a very specific fear state, a mass fear state: a fear of politics, a fear of ideology, of being involved socially, of being involved honestly with their own ideologies.

Alix: And I know you've been sort of using some of the techniques of the past to sort of understand some of the techniques of the present.

Alix: Um, do you wanna talk a little bit about the work you've been doing that kind of ties those two things together?

Karen: Yes. So for the last six years, I've been kind of digging through the CIA's archives, trying to find mentions of Plan Cóndor in order to uncover knowledge that is contained there that may not be easily accessible.

Karen: And in [00:08:00] that search, that meant, uh, doing a lot of artistic works. I got into contact recently, last year, with a new concept in the military sphere, which is cognitive warfare. So once I got into this concept, I felt like it re-signified my whole approach to, to reading and collecting these unclassified documents.

Karen: I went through again my own personal archive, in order to handpick and find all of this that I was learning about, that is supposed to be newly formalized, and is supposed to be something that is even being advanced thanks to AI and neuroscience. But I see all the same tactics happening during the seventies and nineties in Argentina and South America,

Karen: thanks to [00:09:00] this genocidal, US-led military operation. What this means is that Plan Cóndor was effectively a laboratory of extensive cognitive warfare experiments. What that would look like, in case you don't know the specifics: it would look like ideological conditioning. So military people were trained by the US military, and the content of their ideologies was used during torture sessions.

Karen: They were used in songs, they were talked about in the songs played while people were heavily tortured. This also involves, uh, a manipulation of meaning, because it re-signifies, for these people and for the social fabric of these countries, what ideology means. Ideology puts you in danger. Now, ideology means cruelty. It means to be vulnerable in this very [00:10:00] extreme way.

Karen: It means to be vulnerable in this very [00:10:00] extreme way. There were also heavy experiments of what I call cognitive fragmentation. So there was a lot of experimentation that I have seen being recorded in these fights of using overstimulation and shock techniques in order to produce a constant fear state that will, uh, make people talk and point fingers at other people.

Karen: So the torture would continue onward. So of course, this all means, uh, a systematic terror state, which is an administration and coordination of the emotional landscape of entire countries. Yeah. That would, uh, later be formalized into the concept of cognitive warfare.

Marwa: To build on what Karen mentioned, Palestine has been a site of [00:11:00] experimentation for a very long time.

Marwa: As a matter of fact, even before the creation of the State of Israel, Zionist squads would come to Palestine to map and survey everything related to Palestine and Palestinians, including who owns cafes and who goes there. And I remember, in one of the archives, whether a Muslim cafe owner would have liquor in the back room, and who is cheating on whom, in order to understand how society functions, and that is an important entry point for social engineering and fragmentation, which is our current reality now.

Marwa: I am a Palestinian from Hebron, from a village near the city of Hebron. So I come from the West Bank. My reality as a Palestinian is different from the reality of Palestinians living in Gaza right now, who have been subject to 17 years of illegal blockade and a year and a half of brutal nonstop genocide, mass [00:12:00] starvation, without exaggeration experiencing every flavor and way of torture and sadism and suffering.

Marwa: And of course, it's different from Palestinians who have been refugees since 1948, some of whom, millions of whom, live without documentation in really terrible conditions and are denied the right to return while their homes and villages are demolished. And so, as I said, we are subject to, um, a site of experimentation.

Marwa: If you want to control a population for such a long time, of course you need surveillance and you need to create this omnipresence as a controlling power, in which people feel that they're watched all the time, that the occupying power knows everything about them, the very small details of their lives, and that's enough for you to always

Marwa: be afraid and watch your footsteps and watch who you talk to and what you say. [00:13:00] And of course, with the deployment, with the development of technologies, and Matt is gonna speak about that, we live in a place where there are no protections, no laws, nor rights. And of course, forget about international norms.

Marwa: They don't apply. And that's also something that the Israeli state has explicitly contested: how international law and obligations to respect human rights apply to the population they occupy. Something that the UN Human Rights Committee and the ICJ, the International Court of Justice, have said: well, no, you do have to provide and protect people's rights,

Marwa: the people you occupy. But long story short, we are in a situation where the Israeli army can do whatever the hell they want. They can deploy whatever technologies they want, they can contract their own companies. It's a closely-knit military-industrial complex. It's hard to draw the line between an active soldier or the Ministry of Defense and the private sector, which is mostly started by [00:14:00] veterans or Israeli intelligence officers.

Marwa: And so they can test and experiment. And the latest experiment has been the deployment of AI technologies in Gaza, the first of which was deployed in 2021, which, um, the Israeli occupation forces publicly hailed as the first AI war in the world. They have deployed, um, AI systems that use the mass surveillance, uh, or the data that they've extracted about the entire population in Gaza, in order to generate targets, whether it be buildings that they need to bomb, or, in the case of the so-called Lavender, a target-generating system that they're using now, that identifies the likelihood of a Palestinian being a member of Hamas from zero to 100 based on different data points.

Marwa: It generates kill lists. And tied to that kill list is another system with a fucked up name called Where's Daddy, which tracks the targets in real [00:15:00] time. And once they enter their family home, then the airstrike or the bomb drops, killing not only the person but entire families that are sheltering there.

Marwa: Those technologies, of course, are deployed by the IDF, but they are supported by big tech companies who have been providing cloud computing services and also a host of other AI capabilities, which, of course, they deny. They deny that their technologies are being used for military purposes. Microsoft, a couple of weeks ago, even published a statement saying, we've conducted an internal audit

Marwa: to see if our systems are being used to target or harm people or civilians in Gaza, and we've determined that no harm has occurred, but we don't have insight into how our technology is used on site or on air-gapped military bases and whatnot. But what we know, regardless of how exactly those services are being used, is that the demand has been insane during the first few months of the war.

Marwa: We know, for example, that [00:16:00] on top of an existing cloud, uh, project called Project Nimbus, I'm sure many people in the room have heard of it, it's the 1.2 billion, uh, US dollar contract won by Amazon and Google to build Israel's first national cloud infrastructure, which also provides services to the Ministry of Defense and the intelligence apparatus.

Marwa: But in addition to that, Google had another contract during the war to, uh, provide special landing zones for the Ministry of Defense. We know that Amazon Web Services had also provided servers to the Israeli army with essentially endless storage, because they've collected so much information that their own servers wouldn't be able to, uh, handle it.

Marwa: So they had to go to Amazon and say, hey, we want more support from your side. We also know that the use of Microsoft Azure cloud and AI capabilities had risen by 60% in the first six months of the war compared to [00:17:00] September 2023, a month before the war. And not only have they provided the services, they've also provided technical support.

Marwa: For instance, it's been reported that Google had created a classified team that includes Israeli nationals who have security clearance, in order to have access to information that the company is not cleared to have. And not only that, they can also conduct joint drills and trainings with the IDF for special threat scenarios, which is an unprecedented service that Google has not provided to any country or government.

Marwa: We know that Microsoft had also, or Microsoft engineers had, provided onsite support, both on military bases and remotely, to the IDF, including notorious military intelligence units like Unit 8200 and Unit 9900, which are notorious for surveilling Palestinians. And lastly, it's also been reported that Amazon Web Services, in some cases, had helped the Israeli [00:18:00] Army to confirm airstrikes.

Marwa: What I want to emphasize here is we're not talking about a situation of accidental misuse or an anomaly, or companies, you know, just providing services for profit. I would argue that profit is not the only driver, but political affinity too. And you can see that case, uh, more clearly with Palantir, whose CEO publicly went, uh, during the first few weeks of the war, to openly express support for Israel, forging a formal partnership with the Israeli army to help them win the war.

Marwa: Those are scary developments, because to make a full circle, you know, Palestine is, I mean, I suck at math, I don't know how many miles away it is, but it is in a different continent, it's seven, eight hours away from here. But Palestine is much closer to all of us in this room than we think. Many of the technologies that [00:19:00] surveil our borders and airports, and attack our defenders and journalists, have been incubated and sold by the Israeli army

Marwa: after being tested on Palestinian bodies, again, in a place where there are no protections and no regulations and no ethical standards whatsoever. And that is why I think what is happening in Gaza now is a forecast, or a foreshadowing, of what will happen in the future in terms of AI warfare, and the extent to which these companies can just supply and, and provide these technologies to any customer who's happy to pay the, the price.

Marwa: And that requires us to take a collective stand to dismantle those structures that maintain and, uh, support and even profit from violence and genocide.

Alix: I feel like, uh, it kind of bridges with some of Matt's work looking at, I don't know, we were, Marwa and I were talking about how great a two-word expression can be to sort of convey an idea, and that, um, your report from, is it five years [00:20:00] ago?

Alix: Three? What is time? I don't know. Um, uh, but, but a report called Automated Apartheid. So conceptualizing the occupation of Palestine as a process of automation of oppression. And we were just saying how that's such a powerful two-word phrase that says so much, I think, about this forecasting, about the construction of, of this system.

Alix: But it also connects to some of your work on migration and thinking about domestic populations and domestic control, not just in the context of warfare. So I don't know where you wanna take us from there. Maybe share a little bit about automated apartheid as a, as a concept, um, and kind of what, what you've been, you've been working on and exploring.

Matt: So actually the term automated apartheid, it's not mine. It's not yours. It's not mine. It's from Keith Breckenridge, um, who wrote on the emergence of a biometric state in the context of South Africa and South African apartheid, of course. Okay. Yes. 'Cause we have to understand that these things have far longer lives than the immediate invocations of these terms.

Matt: So it actually goes back to thinking about the attempts at creating biometric [00:21:00] records in apartheid South Africa, as well as the ways in which IBM and other larger computational systems, you know, innovative for their times, were being used to do much of the classification that enabled the apartheid regime, uh, to, uh, operate.

Matt: And it's interesting 'cause at the time IBM would come out and say, well, you know, we're not doing anything particular, we're just doing file management. But bureaucracy has such a way of, of, of veiling itself as something boring, but profoundly violent. Um, and it's those file management systems, and the ways in which they allow you to do file management faster

Matt: and at a scale that is allowing for things like the international crime of apartheid, named after the system and after the crime that was committed in South Africa, to take place in the now modern slash newer iteration of it in the context of the occupied Palestinian territory. Marwa was talking about some of the ways in which more analog forms of surveillance were taking place.

Matt: The village files, which were put together by British [00:22:00] intelligence and Zionists at the time working together with British intelligence. You know, we zoom right over to the sixties and seventies, and we see, uh, these infrastructures starting to take the form of, of databases. Coming into the nineties, we see the emergence

Matt: of a system known as Wolf Pack and the permit regime, which is a way of really keeping in one place information on all Palestinians, including the, uh, property that they own, relatives, any sort of political affiliation, affiliations to human rights groups, et cetera. And what we then see in the early two thousands is a way of trying to create instantaneity between those databases and violent actions that are taken by Israeli occupation forces in the streets, on the ground.

Matt: So for example, you would see an Israeli soldier picking out a Palestinian from a crowd during a protest, holding them up to a CCTV camera, and then calling into an operations room in which all the information on the Palestinian individual would be given to the soldier in [00:23:00] question, so they could then decide what to do with this individual

Matt: who, by the way, doesn't really know what they have on them. This is the early two thousands, and at this time we start seeing real massive deployment of large CCTV infrastructures that are increasingly networked. Uh, we see the emergence of experimentation that hasn't really been done before, with the support of a newly established, innovative tech economy.

Matt: In 2018, we see a large networked CCTV infrastructure in East Jerusalem, in the Old City, gain the capabilities of carrying out facial recognition. So suddenly we have Palestinians in Silwan who are being stopped on the streets and taken away on the basis that their cousin or their family member was involved in political activism.

Matt: We have a scenario in which increasing militarization and securitization of neighborhoods that are Palestinian, but where you have illegal Israeli settlers, is helping Israeli settlers continue to act [00:24:00] in illegal ways, to build settlements and to demolish Palestinian homes. Right? I mean, one of the projects that they have going on, which has been some of the most horrifying stuff that we've seen, has been the biblical excavation projects.

Matt: So, under the pretense of digging out sites that are of biblical significance, illegal settlers basically dig out tunnels from under Palestinian homes. And what ends up happening is that the Palestinian homes on top then collapse. And instead you then have illegal Israeli settlers building homes on top of those demolished sites, erecting infrastructure of surveillance that plugs into those networks, forms of facial recognition.

Matt: So suddenly you have a situation in which surveillance begets illegal settlement activity, illegal settlement activity begets surveillance, and you start to see the emergence of a coercive strategy that's intended to force Palestinians out of areas that have strategic interest to Israeli [00:25:00] authorities and Israeli settlers.

Matt: And this is not far from what we know as forcible transfer, and what we've been talking about a lot in the context of, of, of Gaza. In Hebron, we've seen the experimental and profoundly perverse usage of, you know, gamified forms of capture of Palestinians. Blue Wolf was a system that Elizabeth Dwoskin helped expose, which basically

Matt: incentivized soldiers, uh, Israeli soldiers, to go around and capture as many faces of Palestinians as possible, to curate a database in which you could just look up a Palestinian, all of their information, by just opening up a smartphone app and holding it to their face. So what you had is intelligence raids being carried out at 3:00 AM in which Israeli soldiers would go into a family home and line up kids down to eight years old, snapping their pictures, asking them to say cheese, and they would be incentivized through rewards like gift cards and paid holidays.

Matt: The unit with the most pictures taken [00:26:00] of Palestinian faces would be rewarded with gift cards. Fast forward to 2022, and we start to see how the Blue Wolf system and the Wolf Pack system start to come together into a new system known as Red Wolf, which is a system of facial recognition that is deployed at checkpoints, in places like H2 in Hebron, by the neighborhood of Tel Rumeida, where Palestinians who want to access medical services, work, education, you name it, would have to go through these checkpoints in order to access those services.

Matt: They can't do so if they're not recognized by this red, yellow, green light system in which it might go yellow because it simply doesn't recognize you, because algorithms are fucked and we know this. Or it might go red because you're affiliated with a political organization or a human rights defense organization that you, you know, have every right to be a part of.

Matt: When you're in the context of an unlawful occupation, that you have every right to be a part of when that unlawful occupation is turning into a [00:27:00] de facto annexation. And yet you're being flagged, and it doesn't matter that the Israeli soldier behind the computer screen on which the light indicator turns red or yellow or green knows you, and has seen you move out of this area, out of your home, time and time again, every day.

Matt: It doesn't matter that you may have a pregnant relative who's trying to access the ambulance on the other side of the checkpoint, of which we have testimonies. You will be denied if the algorithm says no. And we've seen soldiers testifying that they are engaging in system deference: they don't want to make that judgment.

Matt: They let the system make the judgment. So in this way, we see how the algorithm is a part of reinforcing arbitrary restrictions on the freedom of movement of Palestinians, which is core to being able to exercise any basic rights. You know, whether you want family life, associational life, the ability to resist an occupation of which you have a right again under international law.

Matt: And that's using facial recognition. It's using anomaly detection, it's using [00:28:00] predictive algorithms, the same kinds of systems that are being weaponized in different ways to plug into the target acquisition systems that Marwa is referring to in the context of Gaza. And the same kinds of systems that, by the way, we also see being used outside of Palestine.

Matt: We have Elbit surveillance towers on the US southern border surveilling migrants. We have Oosto, formerly known as AnyVision, an Israeli facial recognition developer now selling facial recognition software to US law enforcement. And this, by the way, was after Microsoft pulled out of its 40% stake, claiming that there was no evidence that tied them to facial recognition usage.

Matt: Yet they actually, again, divested from it, which is all you need to know. And we see time and time again that other companies similarly, whose products have been combat-proven and tested on Palestinian bodies, are exported and used in the context of people who also experience a liminal state or a suspension of their rights.

Matt: These are often migrant [00:29:00] communities. They're trans communities, they're communities that are politically deviant. They're communities that are subject to the most atrocious forms of suspension of rights and dehumanization. And unfortunately, those two things, as Marwa was saying, reflect the fact that what's happening in Palestine is never so far from where we are.

Matt: That in order to stop the economies of violence that allow for these systems to continue to take root and unfold and reinforce apartheid and genocide in Palestine, we also have to stop them wherever we're experiencing those same carceral logics. So you can fight facial recognition here. You can fight predictive policing elsewhere.

Matt: You can fight these systems and the companies wherever they show up. It doesn't have to be in Palestine.

Alix: Thank you for that. I feel like, going back to Marwa's point, that a lot of these spaces are rights-free, uh, I wanna go to Wanda to talk a little bit about why that is. I know you've been working a lot.

Alix: I mean, um, I know that's a giant question, but I think specifically thinking about weapons of war, autonomous weaponry, we [00:30:00] had an interesting conversation about what autonomous weaponry is and how it's considered in international law. Um, and you've also worked on, uh, issues of sort of what is responsible AI in a, in a warfare context.

Alix: Um, so feel free to take it where you want, but I'd love to hear thoughts on, on those things.

Wanda: Thank you. Thank you so much. Um, and it's really an honor to be here with all of you and to, and to listen to you. I want to start by addressing the issue of autonomous weapon systems. And first, this is a concept that has been

Wanda: discussed in international forums formally since around 2014. We are in 2025 and we see these systems using AI to commit genocide in Gaza, and still nothing happens at the, the international level. Particularly, what we would like to see is the adoption of a new treaty that, uh, prohibits autonomous weapons and regulates the use of AI in weapon systems.

Wanda: So I just want to first explain what is generally understood by autonomous weapon systems. There are still different definitions, particularly, I mean, from [00:31:00] countries that do not want to, um, ban autonomous weapons. But there's a general understanding, including by, um, the International Committee of the Red Cross, civil society, academia, and frankly the large majority of countries that participate in these discussions, that autonomous weapons should be defined as those that identify, select, and engage targets without human control.

Wanda: So here it's already interesting to, to listen to these diplomatic terms: what, what specifically do we mean by engage targets? No? So it's actually attacking and destroying infrastructure and killing or maiming people. And this is done based, like Marwa mentioned, on data points, on individual characteristics, on algorithms.

Wanda: But basically the main characteristic is that there is no human taking the decision of whether an attack should be carried out. And, and that comes down to whether a person should be killed or not. So as you can already imagine, this raises a lot of concerns from many different perspectives. So I'll just go quickly through some of them.

Wanda: First of [00:32:00] all, in the context of international humanitarian law, this would lead to many violations, including, for example, of the principle of distinction between civilians and non-civilians, because this is something that, no matter how much data you have or what algorithm you have, this decision has to be based also on the context.

Wanda: And as you can imagine, the context in a situation of conflict is completely changing all the time. So it would be impossible to use these weapons and comply with existing humanitarian law. Another aspect, for example, is if we go to the ethical questions: the question is raised, should we as human beings make the collective decision that we want to allow machines to take decisions of life and death over humans?

Wanda: There's also human rights concerns. So here we speak about all the evidence that exists, including research that has been made by many of you here in the room, about the existing bias, uh, based on factors such as, uh, [00:33:00] race, disability, gender, et cetera. And it's extremely well documented, the impact that this has

Wanda: in, in the civilian sector, such as in education, in employment, et cetera. So we make the argument that if this happens in these sectors of civilian life, which are relatively controlled as compared to conflict settings, imagine the impact that this has in conflict settings. So if, within the people who work in responsible AI, for example, there is a worry that bias in AI would lead to exclusion

Wanda: from the right to education, why is it that we think that it is correct to put at risk the right to life with the same systems? Right. And I just want to mention another, um, area of, of, of risk, that is all the technical complications. Because time and again, we see a lot of technical problems happening with these systems, them not working as they were supposed to, and then no one being able to explain why they did not work, because of the black box.

Wanda: So as you can see, there are so many arguments that we have put forward, [00:34:00] and still the states do not want to move forward to negotiations. So at this point, what I think is that it's not that they lack arguments, it's that they lack the political will. They, they, they have, I think, economic interests, geopolitical interests, but I also, I am convinced, as you were saying, Marwa, that the ones that hold the power are aligned

Wanda: with wanting the genocide to happen and with wanting these weapons to be used on the battlefield. And basically they don't care about the people on which these weapons, you know, will be used. Because I was also at some point quite involved in some of the international discussions on responsible AI and ethical, ethical AI.

Wanda: I started asking this question to people from, from UNESCO, from USAID, from the EU, like, oh, you have these really nice frameworks, but why is it that here it says this does not apply to the military domain? And so basically they told me, oh, it's because it's, it's not part of our mandate. I said, yeah, I can read the letters, but, and I want to say something in Spanish [00:35:00] like, like what?

Wanda: What is the, what is the logic behind it, and who benefits and who is harmed if you exclude this? And people, they're like, ah, no, yes. So, coming to the conclusion, no one wants to take responsibility for this. So I think we need to question why the military is always being excluded from these discussions.

Wanda: Why are the big tech companies doing all, all of what we just heard in this panel with no accountability? No? So I think these are the, the questions that we should be asking. I just want to say that there is a difference between autonomous weapons, which are weapons that will directly decide whether to attack or not and launch the attack, and, uh, decision support systems, which are the ones that are more documented in use now, like the Lavender system.

Wanda: And the difference, and in saying this I always get angry, and I think that there's some people that are just mean and just criminal, is that, I think, the point of developing these systems, [00:36:00] the way in which they work, is that you have an AI-powered system that gives recommendations of targets. This recommendation of a target is received by a soldier, and the soldier decides whether to launch the attack or not.

Wanda: So in a way, I think they use this as cover, in the sense that it's not an autonomous weapon, there's a person that is making the decision. But, uh, what has been documented is that the average decision time between getting the recommendation and launching the attack, I think it was 15 to 20 seconds, right?

Wanda: 20 seconds. So 20 seconds. So it's like, uh, you get the recommendation, and how much can you really analyze whether this complies with IHL in 20 seconds? So it's just having the person push the button and follow exactly the recommendation that was made, with all the problems that come with that.

Alix: You all probably have questions for each other, hearing all of that laid out like that.

Alix: I don't know if anybody has any thoughts, but I have another question for you all that maybe is slightly more positive [00:37:00] also. Any thoughts hearing all of that together? Anybody wanna jump in? Yeah,

Marwa: I think you and Matt, uh, made a good point that I want to emphasize, and that is: these systems are used under the pretext that they're helping the army select targets in a targeted manner.

Marwa: Or even some proponents of those systems would argue that they are helping armies conduct military operations in a more IHL-compliant manner, you know. But when it comes to Palestine, what they do in fact is accelerate the mass slaughter and the mass destruction with as little bureaucracy as possible.

Marwa: So you kind of cut the red tape for the army. And as a matter of fact, I think the AI systems that Israel uses in the Gaza Strip, and also to an extent in the West Bank, are translating what Israeli officials have stated multiple times, that there are no civilians in Gaza, and accelerating, accelerating and [00:38:00] operationalizing that vision, in which those systems can consider anyone to be a, a fair target.

Marwa: And as a matter of fact, as put by one of the soldiers, as garbage targets. So entire, you know, Palestinian families, their homes, for an IDF soldier looking at the screen, these are just garbage targets.

Wanda: I also want to comment on that, uh, Marwa, something that we were discussing in the past days about the fact that in these diplomatic conferences, the diplomats come with a straight face from Russia, from Israel, from the US, from South Korea, and they say, what civil society is missing, uh, from the interventions is that

Wanda: these technologies will be used to protect civilians. You want to protect civilians, right? So we should use AI because it's going to be for the benefit, right, of humanity, et cetera. I think the first argument that we should say to answer this now is: if you really care about the protection of civilians, protect the civilians of [00:39:00] Gaza.

Wanda: If Israel wanted to protect civilians, they would be protecting civilians now. It is not because you incorporate AI in your weapons that you're going to protect civilians; if you want to protect civilians, you don't need AI at all. So I think anyone that comes and says this, eh, yeah, I don't want to, I don't want to say it.

Alix: They're full of shit

Wanda: Because it's an insult, first of all, to Palestinians. And it's an insult, yeah, it's an insult just to, you know, to general logic, to intelligence, to intelligence. So I think we really need to do what we were discussing, uh, earlier, a lot of us, which is to say we need to, to have our presence in these forums where people are saying these things.

Wanda: Not because we believe in the forums or in the rules of the forums, but because we want to disrupt this and we want to say: these things that you are saying are false, are not based on evidence, and we're not going to [00:40:00] let them go unchallenged.

Matt: I think it's so important to highlight exactly the reason why this AI-washing of military targets is happening in the first place: because there is no claim to legitimate targets or any legitimate action in Gaza in this case, right?

Matt: What we're looking at is the deliberate usage of proxy variables, proxy data, to create probabilistic decisions that relationally tie you to particular spaces that, you know, could be Hamas-affiliated, potentially by virtue of the fact that Hamas is a political party that runs much of the infrastructure in Gaza.

Matt: So just by virtue of being there, you are, relationally, Hamas. Um, and so you have to create this alibi by which the actions, the genocidal actions, are justified, and the alibi is AI, 'cause the AI says these are combatants. Yeah. But we have to also consider what happens when you say something or someone, or a group of people, are combatants, because again, under international law, [00:41:00]

Matt: We know that Palestinians have a right to resist. They have a right to resist an unlawful occupation. They have the right to resist an unlawful apartheid. I mean, there's no such thing as a lawful apartheid. Um, they have the right to resist genocide, right? These are things that you have the right to resist, and not only do you have the right to resist that, you have the right for armed resistance in this context, and we don't talk about that, right?

Matt: When we talk about, oh, there are combatants versus civilians. So many civilians were killed. 50,000 civilians have been killed. Oh, and they're civilians. So that's why it matters. People are engaged in resisting an unlawful occupation. Mm-hmm. That is something that we should entertain under international law as well, rather than only paying attention to the civilian dimension of this.

Matt: How else is Palestine going to be free?

Karen: When I hear you all talking, what I see is a very clear tapestry of how technologies of evil are [00:42:00] always kind of the same, uh, in a very frightening way, and what that means for us in the Global South, uh, to have these kinds of spaces where we can listen to each other and learn from each other and exchange experiences.

Karen: Because while we are in historical loops, interconnected, and geopolitical trajectories change directions, we find ourselves in these repeated cycles that are very particular in a way, and systematic. Mm-hmm.

Alix: I was also gonna say the point about probabilistic systems determining and distinguishing between civilians and um, uh, quote unquote combatants.

Alix: And that was what the Obama administration set up with drone warfare, where essentially they used whether someone had a SIM card in their possession, so a phone attached to a particular SIM, and the probability that that SIM card belonged to that person, 'cause people share phones. Um, if the person holding that phone [00:43:00] had a certain probability of being a person that they thought was a terrorist, there was something like a threshold, I think, of 80%.

Alix: So 80% likelihood that that SIM card belongs to that person, 80% likely that that person is a terrorist: that's a sufficient level of probability for them to kill that person with a drone attack. Um, and I think that some of these, these ways of thinking about people, and these ways of devaluing human life and being comfortable with decision-making happening based on probability, um, are

Alix: not just, um, starting here with these AI systems; it's been, uh, happening and sort of rolling out. Um, and of course we, we innovate, which, I'm sorry that we're, we're such an innovative country. Um, but I just wanted to draw attention to the fact that it's not just the Israeli government, it's also the US government, um, doing these things.

Alix: But Wanda, did you wanna jump in?

Wanda: Yes, uh, I wanted to share, um, also building on some of the discussions that we've been having in the past days about strategies: how can we try to, to protest and to make things change, or at least to, to challenge people so that they are uncomfortable saying these things. [00:44:00]

Wanda: What I have seen from the diplomatic forums is that the countries, the main ones that always block progress in international humanitarian law, are Israel, of course, Russia, the US, right? So generally we don't expect that they will change. And I think they don't care; they know, and they even own, that they are these guys, right.

Wanda: Uh, but there are other countries that I think we should put much more pressure on, countries like France, Canada, Germany, South Korea, Japan, Australia, that are trying to play it like: we could do so much more, but Russia doesn't allow us to move forward. And so those of us that have been following these forums for many years, we realize that they have been blocking progress in different ways.

Wanda: So for example, they say, oh, we really want to start thinking about that treaty, but we have to first create an expert panel with, like, ethical experts, uh, uh, the tech experts, and then we'll see. And then there's the panel. And this [00:45:00] takes three years. And then we need a report, a report; we don't have, uh, enough evidence.

Wanda: And you're just demonizing technology, you don't want progress. And so these are just delaying tactics. But then they go out of the forum and they say, ah, Russia, if only Russia would allow it. And everybody's like... and we, we see this, but this is not said in the forums. But you have been there. Even civil society doesn't say: this is the dynamic and this is unacceptable.

Wanda: The second, second area of things that we could do is, for example, France and Canada and other countries, they are happy to be like, we are the good guys of responsible AI. You know, I am such a nice guy and I have these forums, and then they go into the disarmament forums and they block progress.

Wanda: Yeah. So I think we should be calling it out: Canada, you cannot be the champion of responsible AI and not speak about autonomous weapons and decision support systems, and block the progress, because we are linking these dots. And, and my wish is that we could [00:46:00] come together and say, every time one of us goes to a conference and there's Canada speaking about this, we say, let me quote you at the CCW; and there's France, let me quote the ambassador. Because otherwise they go unchallenged.

Wanda: Just quickly, similarly with the big tech companies: there's been a lot of people, including you and many others here, trying to, to make them change their practices, trying to hold them accountable, and nothing happens. And do I expect anything from the big tech companies? No. But they have their headquarters in places where these countries should be able to put rules for these companies.

Wanda: And these countries are supposedly committed to human rights, and obviously they are not. So we should also be holding accountable our countries, because I don't expect anything from the big tech companies. And another layer of it, and this one is the last one, is the role of the UN, and we've been discussing that too.

Wanda: For example, about one month ago there was a, a conference [00:47:00] organized by the United Nations Institute for Disarmament Research, a one-day conference on AI in the military domain. And who is funding the conference? Microsoft. Microsoft! It's like, what, how, what, what discussions are supposed to happen there?

Wanda: How are you supposed to question this? I, I was not there because I, I did not want to be there. Uh, maybe I should, maybe I should have. Maybe I should have, but there's also this power dynamic, where I am this one person that is a consultant and part of some networks, and I'm really scared to go and raise my voice against Microsoft.

Wanda: I think that maybe they'd throw me out of the conference and never let me in again, but that's why we need to work together, no? As a network, and not, not one person going it alone. But just to, to finish my point, to summarize: we need to be holding accountable the UN, UNESCO. It's committed to the Recommendation on the Ethics of AI, and I don't see them talking about this either.

Alix: [00:48:00] Um, I feel like that's actually a really good segue to maybe hearing from each of you about an action you want us to take, or sort of something that gives you hope in this moment in terms of turning the tide on some of these issues. So I'll give you a second to think, 'cause I know we've just had a quite heavy conversation, um, but I'm really glad we had it.

Alix: Um, uh, but I wanna sort of get us into a head space. What do we do? You know, what do we do about it? What can we do about it?

Marwa: Personally, I do, I've been fluctuating between extreme despair, watching the world order collapsing before our eyes and the extensive dehumanization of Palestinians, because those double standards, they basically tell you that Palestinian lives are worth nothing.

Marwa: Because if this happened in another context, mm-hmm, mm-hmm, you would see a completely different reaction

Wanda: with a white person.

Marwa: Yes. If, if it's a white person, then of course, but then I, I am a stubborn person. I do consider myself that I have a fiery spirit, and out of stubbornness, I think, no, I [00:49:00] can't fall into despair.

Marwa: My life goal now has become holding big tech accountable. And I also have other goals in my mind, which I will not, uh, openly discuss, but it really pains me when I see the children of Gaza, who could be, could be my children, could be my nieces, could be my nephews. It just happened that they were there and I'm here.

Marwa: To see them starving, to see them crying over their mothers and fathers, being turned into orphans with no one left from their families, being subjected to all forms of torture, while tech CEOs are enjoying so much money and wealth and platforms to talk about AI for good and progress and innovation and doing all this good for humanity.

Marwa: They need to be held accountable. That's my compass, that's my goal. Uh, and to make that happen, [00:50:00] we have to pave the road because to date, and it's not only about Palestine, these companies have facilitated harms and crimes in different contexts around the world, in Myanmar and in Ethiopia, in Sudan. And the list is really, really long.

Marwa: I think it's about time that we create those precedents, whether it be through investigating and prosecuting them for participating or contributing to atrocity crimes, maybe at the ICC. Um, the prosecutor's office is interested in looking into these cyber-enabled crimes, so I think, okay, maybe that's one avenue, even though, and this is something that resonated and came up during these three days of discussions, yes, we've lost faith in international law and it is still like a colonial mechanism, but we need to use all tools available to us.

Marwa: And that also means, as Matt said, um, it's not just fighting the fight there in Palestine, but [00:51:00] also in the different jurisdictions in which we have to hold our governments accountable, not only for genocide, but also, the International Court of Justice made it very clear last summer that Israel's occupation is illegal and all states must not recognize the occupation as legal or provide any assistance or support that can help this illegal occupation to continue.

Marwa: Which means that those governments have an obligation towards the companies domiciled within their jurisdiction: not to provide services or tools or technologies or software that enable apartheid, that enable the crime of persecution, that enable genocide, war crimes, and crimes against humanity.

Marwa: So there are ways, um, that we can use, and it does require this collective effort from whistleblowers, tech workers, investigative journalists, um, international law experts, criminal lawyers, civil society [00:52:00] activists, grassroots activists, even fighting those technologies being used in their own contexts. The very drones that are surveilling the US-Mexico border have been deployed in Gaza.

Marwa: Those are provided by Elbit Systems. Gaza was the first testing site. So I think it's important for us to fight the fights here and there, you know, and try to use every avenue available to us to challenge those systems of, uh, oppression and, and violence.

Karen: So I think that the way I go about it is to give ourselves a chance to try and see what it'll mean for us to take these technologies and use them for our needs, for our problems, for our desires, and experience that radical difference of what technology could be.

Karen: That means to be creative and to be quick [00:53:00] on our feet in the Global South, because, uh, we can fall into many traps when trying to do that, right? We have to remain critical and rooted and never take our eyes away from, uh, the material realities behind these technologies. But we also need to not take our eyes off the historical processes that we are part of, and the historical processes that we can spark too.

Karen: So if we understand ourselves beyond the most immediate effects that we could have, and we take responsibility for our part in these historical processes, then something new can come about, that may not be the best, uh, by scientific standards or even artistic standards, uh, but it will be something new, or something [00:54:00] free.

Karen: And we owe that to the people that have died for us to be here, and to the new generations that, uh, can take that and make something better with it thanks to our efforts, so that they don't have to start from scratch, so that they can take our very humble, uh, tryouts and experiments. And if we did it honestly, then those efforts will have been valuable, because it is my hope that better people will come, better than us, with, uh, even more freedom of doubt and creativity.

Karen: And they'll take these little seeds and be able to continue the historical processes that we are part of.

Matt: Marwa mentioned the Genocide Convention and the kinds of strategic litigation that we can all engage in, in terms of making sure that the states within which companies [00:55:00] are domiciled, that are contributing to economic activity that enables genocide, are actually prosecuting or holding their companies to account.

Matt: And that's actually instructive also for us. It's instructive for us as academics, activists, as sometimes tech workers, especially as tech workers, because it tells us that knowledge, undeniable knowledge, is something we can be a part of producing, which would make it difficult, if not impossible, for the tech companies themselves to claim that they did not know that there was a risk that they could be involved in genocide.

Matt: And let's be clear, and Marwa always says this, the Genocide Convention is about prevention, right? It's not just about stopping a genocide after it's happened, it's about prevention. So if we know that a tech company is contributing to particular risks, and the tech workers have been shouting, directly to CEOs and loudly through working with other activists, making it clear that we know what you've done and what you're involved with, what are you doing to stop it?

Matt: [00:56:00] And tech executives do nothing. Then that is knowledge, and knowledge is a prerequisite for being able to pursue these forms of legal action. So we need to keep shouting loudly and quietly in these spaces. There are also two other avenues that are really worth considering and thinking about putting our efforts into. We're seeing more and more

Matt: We're seeing more and more. The city councils are involved in passing bills in which they are refusing to work with or deploy Israeli products that are involved in apartheid in genocide. These are in particular security products that are refusing to work together around whether it's law enforcement training or what have you, uh, with the state of Israel.

Matt: And these are efforts that, again, we can, as constituents in cities, help effect at the, at the smaller level. We're also seeing university encampments, right? We're seeing students together with academics calling for greater scrutiny of the kinds of contracts and research projects that have been taking place between academia, uh, the state of Israel, the Israeli defense forces in [00:57:00] particular.

Matt: Uh, whether it's surveillance, whether it's the development of AI infrastructure, uh, whether it's developing new data centers and new thinking around policing and securitization, what have you, and being able to call to account those universities, because they also are implicated under international law, is another way in which, in the spaces we walk, we can interrupt some of these dynamics that we're seeing unfold.

Wanda: Mm-hmm. Thank you. So, um, I already addressed your question in my previous comment. Uh, for me, it's really important that we start doing more work to try to hold accountable those that are enabling the actual perpetrators. But my last, uh, comment and call to action would be to encourage everyone to continue doing anything you can to support people like Marwa and other Palestinians and Palestinian organizations, in any way or shape or form that you can, to end the genocide as soon as possible.

Alix: Okay. Um, we, uh, I think went somewhere really important, and it was hard to resist, having four people as amazing as this, um, and [00:58:00] not having this conversation. I also wanna quickly thank Shazeda Ahmed for having the tenacity to tell me that this was an essential conversation that we should focus on entirely in this, um, in this conversation.

Alix: So thank you, Shazeda. Um, but can we give a big round of applause for what we just, uh, heard?

Alix: So I hope that was interesting to you. Thank you for listening to it. I think the least we can do, with what's going on in Gaza, is know about it, learn about it, listen to the people who are doing something about it, um, and better understand how it's entangled in all kinds of struggles around the world.

Alix: A quick announcement: tomorrow morning for those of you in North America, tomorrow afternoon for those of you in Europe, and this is all assuming that you are listening to this on the day that the episode came out, uh, we will be hosting another live show in Tbilisi, Georgia at the ZEG Festival, co-hosted with our friends over at [00:59:00] Coda Story.

Alix: And in this conversation I'll be moderating a discussion with three amazing storytellers, Armando Iannucci, Chris Wylie, and Adam Pincus, who in different ways have worked on how to help the public understand the systemic political problems that technology presents. Each of them has a very different vantage point on this opportunity of telling amazing stories to kind of electrify public attention and interest in these topics.

Alix: And we're gonna be talking about what kinds of responsibilities storytellers have, and how to use storytelling as an intentional tactic to engage publics. So if you wanna join live, you can; we will have a link in the show notes. If that time has passed and it has already happened, if you sign up to our newsletter, we will send around a recording of that conversation.

Alix: And we'll also post the episode on our stream in two weeks, I think. So be on the lookout for that. But with that, uh, thank you to Georgia Iacovou and Sarah Myles for producing this, and we will see you next [01:00:00] week.
