Consciously Uncoupling from Silicon Valley w/ Cori Crider

Alix: [00:00:00] Oh, hi there. Welcome back. Uh, you might not have even noticed that we were gone for two weeks, which for those of you who I know, listen, every week, I'm so sorry that we had to do that, but it was good for us. I hope it was good for you. This week we have Cori Crider on the show who's a longtime friend and collaborator who.

Alix: You probably, if you know her, know her for her work co-founding Foxglove. But before Foxglove, she did a lot of amazing human rights litigation that informed how she approaches going after companies. Um, and now she's moving into other work, and I was excited to hear from her what she's moving on to from Foxglove and also how she's approaching her strategy in that space.

Alix: Because one of the things I most admire about her is that when she sort of takes on chunks of multi-year strategies, she really takes seriously the process of figuring out what she's trying to do and how she's gonna do it. And I always [00:01:00] find her thinking to be interesting, inspiring, tactical, strategic, um, and it just really informs my own worldview.

Alix: So I figured it'd be great to have Cori on to talk a little bit about what she's taking on now.

Cori: I am Cori Crider. I am a senior fellow at Future of Tech Institute and Open Markets, and I am thinking about strategies for deconcentrating our tech economy to better serve people and the planet. I'm originally a lawyer, and you might say I'm a recovering litigator.

Alix: I wanna take us back to that funny lunch we had.

Alix: In, I think it was maybe, like, marito, like that little, like, Spanish, whatever, that little Spanish restaurant, the small version of the bigger Spanish restaurant. And you were like, we have the crispy chickpeas. I can't remember. I think we definitely had the crispy chickpeas. We set up this lunch because you basically were like... I dunno, say a little bit about what you were doing before [00:02:00] you had this, like, giant career strategy shift.

Cori: We've had so many of these strategic lunches, and now I'm worried I'm not gonna get this timeline right. Okay, I'm, like, just gonna confect a lunch that we may have had. But, like, I feel like my conversations with you meld into this permanent great lunch. Um, but anyway, if this is the one when we were kind of thinking about the first idea for Foxglove, is that

Alix: That's the one, yeah.

Alix: And you were like, basically, tech is the new frontier of a lot of the stuff I'm working on. It's a little intimidating, I don't feel like I can rock it quite yet. But yeah, what were you doing? Let's say, what were you doing

Cori: before? Okay, so I come from a law background, and quite a traditional human rights background.

Cori: So I did Bush administration human rights abuses, so detainees and torture, like real Team America, chest-beating stuff. You know, the nature of the violations in 2006, when I was starting out, was: we take the person, we put them in the box, we beat them up, we waterboard them, we kidnap them and their family, we send them to a secret [00:03:00] prison.

Cori: All pretty analog, if I'm honest. I'm not saying surveillance wasn't involved, but ultimately the nature of the abuse is very physical, very personal, and that was the level on which I worked. It was really me with a person in a box who is chained to the floor, and the exercise is how do you get this person out of the box?

Cori: And we took some other cases over time at the organization I was then directing, called Reprieve. And what we saw, as the national security state transitioned from George Bush to Obama, is that the kinds of abuses would change. So suddenly we start hearing from people who are losing loved ones in these drone attacks.

Cori: Not in an official war zone, no, no, no, but in places like a village in Yemen. You know, like, a client had a wedding and his loved ones are killed. Or in an area in Pakistan. Again, the United States: are we officially at war? No. Are we bombing, are we killing people anyway? Hell, yes. And they turned out to be the town's only [00:04:00] policemen, for example, or other, you know, innocent people.

Cori: And we started to investigate why this was happening. We even took Faisal, my Yemeni client, over to the United States, met with the National Security Council, the White House. They gave him a so-called condolence payment, although no official apology. But we started to piece together: how is this happening?

Cori: And it turned out, we learned through our investigation, through the Edward Snowden leaks, other investigative journalists' work, that they were killed by big data, basically. So what the Obama administration at the time called a signature strike, which is just bureaucratese for bombing somebody whose name you don't know based on the pattern of activity from their cell phone.

Cori: You know, they're tracing a mobile phone. They know whose it is, and it's moving around, and it's associated with such and such a person. And with our analysis of that phone's network, not that person, that phone, we think that they may be affiliated with a militant group, so boom, we're gonna kill 'em.

Cori: Can you imagine whether that leads to any mistakes? [00:05:00] Yes, I can. I mean, look, you know, we can't even get other, much less consequential, kinds of automated decision-making about people right. But in a way, that was the first automated decision system I ever met, and it was bombing people, the most consequential one of all.

Cori: Basically, yeah, man. I mean, Michael Hayden, the head of the CIA and NSA, said at the time: we kill people based on metadata. He said it point-blank, you know. I wrote about that back in the day. AI wasn't this inescapable branding term at that point; we were still calling it big data. I feel like I'm talking to you about dial-up.

Cori: But anyway, like, we were still calling it big data back then. I wrote this thing in some piece back then, I was like, God, what's going on here? Killing in the name of an algorithm. You know, you draw a box around a problem like that and you think: okay, we used to just kidnap and torture, and now we're engaging in mass surveillance and a semi-automated targeting process to make these life-or-death decisions about people.

Cori: That seems bad. Yeah, it seems like it's gonna be rife with error. And also, I wonder where [00:06:00] else this is happening. And so you just start to look. And actually you realize that this logic, this logic of treating people in a probabilistic or actuarial way and using this predictive process to make every consequential decision about them, was where we were going.

Cori: We first saw it in governments, so, you know, credit scores, prisons. You know, Julia Angwin did some amazing investigations early on around prison scores; it was what everybody was talking about back in the day. But I also guess I started to think differently about public power and private power. So back then, we were really setting our face against first the Bush administration, then the Obama administration. The overrun national security state was, as I saw it then, the biggest threat to peace and stability and human rights that there was.

Cori: But then I thought, hang on a minute: we've all got really mad after Edward Snowden about mass surveillance and the NSA and GCHQ, the amount of data that these folks have, got really pissed off about all of this, and fair enough, [00:07:00] but at the same time, we, these groups, especially some of these digital rights groups, have slightly slept at the wheel.

Cori: While a handful of California companies have got a level of information on us and power over us that would potentially be the envy of any state. And I thought, gosh, that seems, that seems bad too. And so I think that was around the time, I can't even remember who put us in touch, Alix, but around the time I reached out to you,

Alix: you came with, like, a theory of change around technology, and then very shortly thereafter were basically like, there needs to be a muscular, tactical series of, like, litigation,

Alix: maybe by a new organization that can basically take not just a human rights lens, but, like, a legal lens and a really sharp tactical approach to this power that was accumulating. And then you started, with Martha and Rosa, Foxglove. That's what happened, right?

Cori: Yeah. So first, just on the tactical point there. I talked about human rights there, but [00:08:00] actually, like, we almost never used, like, human rights law, because it tended not to be what would win you your case, right?

Cori: Yeah. If you run a national security case and you're like, but human rights! According to the human rights framework... yeah: lose, lose, lose, lose, lose. It just doesn't work. It never works. Very often, judges wouldn't respond to international human rights standards, to be honest. Neither would the courts where we took other cases. I mean, you know, in the UK, for example, you would get much

Cori: further. There was a wonderful speech about this made by a woman called Dinah Rose, called Beef and Liberty. And she's like, we don't need any of these sort of Frenchy human rights things; we've got Beef and Liberty, we've got the majesty of the common law and Magna Carta and habeas corpus, and that is all we need.

Cori: And I tell you what, that is brilliant advocacy, right? Because if you are rolling up in front of an extremely conservative set of judges, you don't wanna go on about human rights; it's had a bad run in the papers, and that is bad advocacy. So anyway, so yes. So Martha and Rosa also came from that background.

Cori: Martha had worked with me at Reprieve. Rosa had taken some cases about drones, actually; we'd taken some together, and some on kidnapping. We started talking about these [00:09:00] issues, and we go to a couple of these, like, AI conferences or tech conferences, and there was just this attitude on stage from these dudes, of a kind I know well now but had not previously encountered.

Cori: That was just like: I have a hoodie and a lot of money, and actually there are three of us and we can just kind of co-manage the rest of the world. And I did just think at that time, somebody...

Alix: "Co-manage the rest of the world" is such a great

Cori: way of putting it? Yeah. Yeah. Like, everywhere. You know, all they need is my products, right?

Cori: The world needs my cryptocurrency, the world needs my eye-scanning orb, or, I mean, whatever it then was. And we just, we kind of looked at one another and were like, who is going to sue these people's ass? So then we started Foxglove. It was like a concept note on a piece of paper in 2019. There was no money, et cetera.

Cori: But yeah, so we built it out with the basic idea that we would take legal action against governments or companies when they use technology to oppress or exclude.

Alix: You've left Foxglove, but before we move on to what you're [00:10:00] working on now, 'cause I think I really wanna get into these questions that you're grappling with:

Alix: five years, six years at Foxglove?

Cori: Five, yeah. Five. Do you wanna... you're still

Alix: on the advisory council, disclosure. I'm on the board, also disclosure. Is there a story or, like, an arc of a case that you feel like kind of exemplifies what you got up to and, like, what the promise of that type of work is?

Cori: I think the work I feel proudest of is the work that we did with and for social media content moderators. So probably your listeners know this, 'cause you've got quite a hip audience, but, you know, very briefly: the class of underpaid, outsourced, exploited workers who, if they're still given a job, which obviously at X they don't have, but anyway, this class of workers who swim through the filth and violence and hate on social media to try and make the places actually survivable for ordinary people, and as a result are just exposed to

Cori: the worst of what humans do to one another. [00:11:00] Sometimes live, certainly in video, in just the most disgusting detail, hour after hour, day after day after day. And there had been a little bit of investigative reporting about this in some American papers, there's been little bits and pieces, but there would be a little exposé and a couple of days of online discourse about it, and then it would fade away, and there may be even one class action in the States, but nothing, nothing

Cori: that said, okay, these folks actually have a right to labor organizing, Facebook is actually responsible for these folks. Absolutely classic Big Tech move, right? And whether it's Uber or Facebook, lots of these platform monopolies do not want to own or take responsibility for this mass class of workers who make their wealth possible.

Cori: And content moderators are the absolute paradigmatic example of this. So they don't get the bean bags and the ping pong tables. They don't even get an employment contract at Facebook, right? They're outsourced to these other firms like Sama. So we worked with some investigative [00:12:00] journalists, and then we would hear stories, stories would come to us.

Cori: So this wonderful reporter called Billy Perrigo, at TIME; we got contacted for comment, basically, as the story was breaking. He found this horrible union-busting case out of Nairobi, where a young South African man had tried to start a union of content moderators in basically Facebook's content moderation hub for all of Eastern and Southern Africa.

Cori: I mean, hundreds of millions of users affected by what goes on in this office. This guy, by the way, Daniel Motaung, the first video he recalls seeing after his training, when he gets onto the moderation floor, is literally a live beheading. To give you a sense of how horrific this stuff is.

Alix: And also I would say, given the avaricious nature of the collection and accumulation of training data for AI models, the same type of content moderation is happening in the context of labeling data that feeds AI systems.

Alix: So if you think that this is, like, just a social network problem, or like the last iteration of the big tech companies working on social networks, it's not. It's a through-and-through required piece of human [00:13:00] labor and infrastructure that's necessary to basically create the models that they're all shilling around the world.

Alix: So just wanted to say that.

Cori: No, a hundred percent. And in fact, some of the same outsourcing firms that Meta is using are the very same ones being used to sort of draw the poison out of some of the training sets used for large language models. Because the model, at least at the moment, is not: let's take a structured, special, curated data set and train our model on that.

Cori: It's like: let's ingest everything, including all of the horrors, and then, as I've just said, let's make it some poor person's job to go and try and extract that. I'll be honest, when I first was thinking about this, I was surprised and maybe slightly resistant to the idea that you could get PTSD just by sitting in front of a computer.

Cori: You absolutely can, 100%. But there was... yeah, it's completely decontextualized. The stuff just comes at you left, right, and center. It'll be a beheading, it'll be child abuse, it'll be a terrorist attack. It'll be, you know, somebody, I'm sorry to say it, burned up in a cage alive. [00:14:00] These are things that people actually said to me that they saw, and

Cori: they would all say that something, one of these things, plays over and over and over in their head. Classic PTSD symptomology. They would say that they had stopped trusting their family members with their daughter, or say that they didn't really enjoy or couldn't relax into sex with their partners. I mean, just, absolutely, you know: take really promising, bright young people from these countries, dangle the idea of tech advancement in front of them,

Cori: then just put their brains through a sausage grinder. That's what this work was. So Daniel had the temerity to try and organize his workers and say: we know better, we know our worth, we demand actual decent pay and psychological healthcare. And for that he was summarily fired. And then there were a couple of other cases later where people were laid off.

Cori: So let me just talk about the cases, it's a kind of block of cases, 'cause then there was this later, uh, organizing effort that we did with our [00:15:00] amazing co-counsel, Mercy Mutemi, in Kenya. Anyhow, basically we took a series of cases with our co-counsel saying: enough with this outsourcing lie. Facebook sets the terms and conditions of the employment and the

Cori: workplace. They decide whether you get actual psychiatric care or some coach who tells you to do yoga and deep breathing, which is generally what people get. They set the pay, they set the rate of work, and so forth and so on. So it seems to us that they ought to be on the hook as their employer. We won that in the Nairobi employment courts.

Cori: The judge saw through the kind of contracting guise, and that, I feel, still to this day... it's still up on appeal, Facebook will drag it out. I think that area of work, where we won the idea, I think for the first time anywhere, I don't think there's any other judgment that says this, that says Facebook is the true employer of these people.

Cori: Not just, like, you get, you know, you gotta shove them if you quit, but actually, no, you are responsible in law for maintaining a safe workplace. So, in other words, the way that [00:16:00] you are doing it is not gonna be lawful going forward. You're gonna have to, you know, develop some other way of managing this labor system.

Cori: So of course now they're gonna kind of try and move some of the work over to West Africa and so forth and so on. There's lots of other standard monopoly plays in which they're now engaging. But that principle, that puncturing of the lie that Facebook's workers are not Facebook's problem, I do feel very proud about that.

Alix: What are your thoughts on, like, how to turn Big Tech into General Electric or the defense industries? Well, they are now the defense industry in some ways.

Cori: Yeah, I mean, they absolutely are.

Alix: Like, how do we, in the public consciousness and in the global public consciousness, both in terms of constituencies and also politicians, like, what's it gonna take for them?

Alix: And maybe the inauguration did a lot of that work for them, but, like, what do you think it's gonna take, from a messaging perspective, from like a political strategy perspective, to actually change the vibe, so that when people think, Big Tech's come into my [00:17:00] country, instead of being like, oh great, this is gonna look great for us, they're like, oh God, the jobs are horrible.

Alix: They're constantly evading any jurisdiction they're in. Like, we can't keep them in control with fines. Like, this is terrible. Like, what do you think that's gonna look like? How long do you think it's gonna take? What do you think will work?

Cori: Basically, I think what happened in January, with the tech oligarchs lined up at the inauguration, and then also the idea that, oh no, actually Elon Musk is a literal Nazi, not just a kind of

Cori: "maybe you're overstating it" Nazi, but an actual sieg-heiling, white-genocide-baked-into-your-Grok, actual Nazi. I'm afraid I think that that does help us. It's a bad fact, but sometimes I think power, monopoly power, has to wear its ugliest face for us to stir ourselves to do something about it.

Cori: And that, sad to say, is what has happened since the start of the Trump administration, and you see this in lots of various [00:18:00] indicators of public opinion. I mean, it depends on where you're talking about. I mean, one of the earliest countries to give Musk a bloody nose was of course Brazil. So there are, you know, there are democracies in various places that are engaged in this struggle to say: you don't tell us what the law is.

Cori: You don't say how our democracy should behave. You do not set the rules of our economy and our society. We do. And, you know, Musk blinked when Brazil confronted him. So it's not a simple story. But going back to Kenya for a minute: Kenya's got 40% youth unemployment, or did last time I checked, and Facebook, when we were doing these labor cases, just rolled out its absolutely standard lobbying playbook.

Cori: So they've got Ruto coming over to Silicon Valley. Jobs, jobs, jobs, exactly as you say. Uh, Nick Clegg, 'cause he was still there, goes down and talks about creators getting paid, and meanwhile you suddenly see the sector start trying to get a law passed in Kenya that would basically try and undo our court wins by statute.

Cori: At that point, I was like, oh hell no. But this is, this is a monopoly. This [00:19:00] is exactly the problem that has been diagnosed about monopoly power, in tech or anything else, since the beginning of the anti-monopoly movement, which is: it's not just bad 'cause you have a bad job, it's not just bad because they will raise your prices.

Cori: Monopoly is intolerable because it will become a power bigger than your democracy itself. You will follow their laws rather than the reverse. So I got to the point where I was like, well, if we ultimately want to ever be able to meaningfully hold these companies to account Facebook or whomever else, we have to break them up.

Cori: We have to take the choke points that they have now created in all parts of our essential infrastructure. 'Cause, you know, it's not like online is some other part of us. We all now get that this is our economy and our society. It's not some other magical world, like it felt in 1997 when I was dialing up. And they are at these choke points and they're just siphoning out the value from it and also steering us in all kinds of these horrible directions.

Cori: I think people see that now, and now the task is to take that horror and say, actually, there's something you can [00:20:00] do about it. Having bloodied Facebook's nose in court a few times and watched them roll out the full monopoly lobby playbook, I just thought, this is intolerable. Like, this level of concentrated political power cannot be left to stand.

Cori: It is anti-democratic. It is a threat to all democracies everywhere. And if we wanna recover our ability to run our economy and society in ways that we choose and we vote for, then we have to dethrone them. It's just that simple.

Alix: Yeah. I'm glad that you work both on corporate accountability, but also government accountability.

Alix: And I feel like the corporate accountability space for, like, years was focused on convincing companies to use the power they have to make different decisions, rather than challenging the sort of fundamental power structures that allow them to be in that decision-making chair, which they shouldn't be.

Alix: And I feel like that's changing. But partly I think it's because there's more coordinated action of people [00:21:00] that historically have tried to hold governments to account and now are turning their eye towards companies, 'cause the balance of power between companies and governments is changing. And I feel like the people that have always done government accountability work in the human rights space are much more aggressive.

Alix: And basically they reject the accumulation of power, not fixating on how that power is exercised, which I feel like the corporate space kind of always has.

Cori: Yeah, I mean, let's take something like, you know, some of these old business and human rights principles. What are they even called? The principles... the Ruggie Principles.

Cori: Yeah. Yeah, man. Oh man. I did a little bit of consulting in the Amnesty International office for a while, and I used to kind of ruffle a lot of feathers, 'cause people would talk about some of these principles and I would say to them: sweetie, I like laws with cops. Yeah. If we can't enforce it, if we are just gonna go and wag our finger in a meeting and say, dear sir, you are not complying with your voluntary code,

Cori: what are we talking about? Yeah. Like, what would that show? Like, what incentive do they have to comply with this voluntary code? Zero.

Alix: The point of the force of law is basically that eventually it will be backed up by state violence.

Cori: Exactly. Like, that's... correct. [00:22:00] Correct. You know, you withdraw your license to operate.

Cori: I mean, that is how Brazil beat X. So in a way, I've been on a funny journey about state power. I was really averse to state power for the first ten or a dozen years of my working life. And now that I see a level of concentrated power that seems supra-state, yeah, in some ways... I'm not, I don't wanna say I'm like,

Cori: come back, state, all is forgiven. If there is another lever, I don't know what it is. It seems to me that we as people, through our governments and through the laws that we as democracies have passed, have to use that tool to say: you, corporation, are a creature of law, made by law, and can be unmade or separated by law.

Cori: Just as we have beaten the monopolies of the past. And it's tricky, because, like, I would love to be able to say that a small campaigning group could by itself kind of break up one of these companies, but ultimately the structure of the [00:23:00] market is set by market actors and by the state, by the state's intervention in those markets.

Cori: You know, I think it's kind of incumbent on all of us to make a demand for how the state exercises that power. There is a pending European Commission investigation that I think is so important. It's about, basically, all of the money out of the internet: Google's monopoly over advertising technology. I know all of your listeners hate ads.

Cori: I hate ads too. Nothing worse than a pop-up, but, like, it's either that or a paywall for news and for art and journalism. And if we want there to be news at all, then some people have to have a way to get paid. And Google has just siphoned out all of that multi-hundred-billion-dollar market, and news is absolutely on the gurney.

Cori: And so there's a, an American breakup case and the Europeans are kind of sitting on the decision at the moment, like they said about two years ago, that they had the preliminary view that probably Google's ad tech monopoly needed to be broken up, but because of this geopolitical moment, you've got the sense that they're like, Hmm, do we have to poke there?

Cori: Do we have to go there? [00:24:00] Or should in fact, we just wait for the Americans to do it. And I would say that's foolhardy. If Google's gonna appeal it right the way up, God knows how it will govern in the Supreme Court. And also, you need to enforce your own law. That's what being a sovereign means. You were right for the reasons that you gave in 2023.

Cori: So go on, finish the job and break the monopoly up. You are aligned with even the Trump government's position on this. There's never gonna be more political headroom to do it than that, frankly.

Alix: I struggle a little bit with like this balance between corporate power, state power, like obviously there are gonna be these opportunities.

Alix: I don't know, like, I'm thinking about the, like, public AI conversation of, hey, we should have democratic states build alternative infrastructure, or we should, like, have states invest in startups that aren't Big Tech, like kind of some of the small tech stuff, or, like, Little Tech or whatever, the pretend antitrust people that act like Little Tech isn't the same as Big Tech because it's all funded by people that made all their money in Big Tech, but whatever.

Alix: Because I feel like, with a human rights background, like, I've always thought about civil and [00:25:00] political rights and the excesses of the state as a really important thing to curb. Any institution, if you give them supra power, they will abuse it, and it's really important to have checks and balances so that you can prevent that power from being abused.

Alix: I really struggle now, like, reorienting towards states being the creators of technology that they are abusing. Like, a lot of them are still perpetuating human rights violations using technical infrastructure. I think the FISA court still operates in the US. And I just think it's so hard to, like, see,

Alix: to your point, these entities, these companies, getting more power than the state. Then you turn to the state to try and, like, combat it, but sometimes it feels... I just don't trust them to do it well, and I also don't trust what would happen. I don't know what the answer is, 'cause I definitely don't want, you know, as you said, three dudes co-managing global digital infrastructure. But, like, I don't know. How do you feel about that?

Cori: Completely. I mean, I've been on a bit of a journey about this, but the first thing I would say is, like, to [00:26:00] believe in anti-monopoly is not to say let's go and turn to a command economy. Far from it. It's not saying that Ursula von der Leyen should, like, choose the winner in each of these parts of the market.

Cori: It's instead saying that if you actually wanna start something better, that is public-spirited, at this point you do not have a snowball's chance in hell, you know. And so the role of the government here is not to choose the corporate winner, you know, and maybe sometimes we do need, like, a tech equivalent of a library.

Cori: There's kind of a question about when you want a piece of public infrastructure or not. But most of the time I accept, not least because of decades of hollowing out of state capacity, that we may need to just put in some incentives, like a thumb on the scale, that say: hey, when you, local authority, are getting a cloud service, actually, you know, there's a preference for a local provider, because otherwise they have no hope of competing against the Microsofts and the Amazon Web Services of this world.

Cori: Right? So I definitely, I'm not one of those people who says, let's go [00:27:00] and have the state build, like, um, data centers, because we have loads and loads and loads of those already. That's definitely not the position. But, like, you know, I will defend only so far folks from the Little Tech agenda, because what they are... I mean, you're absolutely right that ultimately the motive is

Cori: to get rich, or to be acquired, which means you play nice, but the positioning is hostile. I mean, look, like, I was reading earlier this week the amicus brief filed in the Google search case by one Y Combinator, and it's, like, literally Y Combinator, and they're like: this company, Google, has monopolized and held the search market stagnant for 15 years.

Cori: We would be inclined to fund or intervene in other places that could bust this up at some point. But actually it's very difficult to fund in the space, because Google has a kill zone; we acknowledge that Google has a kill zone. And, you know, they're sliding in front of the court research that shows... 'cause one of the things that the companies, the monopolies, always say,

Cori: you know, Herb Hovenkamp and some of these other [00:28:00] eminences of antitrust say: oh, well, you don't need to break them up, because look at all of their spending on R&D. But there's, you know, there's R&D spending and there's R&D spending. There's, like, would you like a surveillance Ray-Ban? Or there's something that a user is genuinely looking for.

Cori: And they cite research showing, well, actually, most proper innovation, and let me not use that term without scare quotes, but, you know, most real, let's say useful, new ideas actually come from outsiders. They don't come from R&D spent by some giant, bloated incumbent, generally speaking. I mean, I actually take them at their word on that.

Cori: The other thing that happens, just thinking about what businesses we are talking about here, is that there will be other important sectors that you don't think of as having a stake in this. So, like, how about the entire media industry, who will privately say to you: yes, it is absolutely essential for our continued survival that Google's monopoly is taken out of this part of the ecosystem.

Cori: Let's take some of the European stats, like, Europe has shed [00:29:00] 30% of its media jobs since 2008. Most of your listeners will be aware of the news desertification in the United States: lots of papers closing, death of local news. Trump, by the way, won 91% of the vote in what you would call American news deserts, places without reliable and consistent access to local news.

Cori: So these questions of monopoly power and mistrust in systems go right hand in hand, because the counterweight to some of the lies and the hate and the, you know, the mess that you see on social media would ultimately and historically have been maybe your local newspaper. And that stuff was just dying because of Google's monopoly.

Cori: So

Alix: basically we did it on accident: we started talking about what you're working on now. And I can tell by the way you're talking about it that you're so excited about it, like, this question of market concentration and antitrust, which is a departure from what you were doing at Foxglove. You wanna talk a little bit about, I don't know, what questions you're working on now, or, like, how you're approaching this stuff?

Alix: 'Cause I presume you're not doing the same kind of strategic litigation as you were doing when you were at Foxglove.

Cori: No. So after five years at Foxglove, I put the litigator's [00:30:00] sword down and I thought, all right, what is it gonna take to persuade states to use the tools that they themselves have to restructure these markets and to deconcentrate them, to go in at the choke points and bust them up?

Cori: And, you know, as I say, essentially to break up Big Tech: break and build, like, to sever the choke points, but also to create room and incentives so that a good public-spirited alternative can emerge and thrive and actually be used by people, and not just be lasered into submission or acquired or eaten by one of these monopolies.

Cori: And so I am now a fellow, one of those wonderfully elastic terms, at Open Markets Institute, which is one of the big, uh, American anti-monopoly organizations, and at a new outfit outta Brussels called Future of Tech Institute. They work on slightly different things, but they share a common vision for a future of technology, and an economy and a society, where we set the rules and where they benefit, they work by and for us:

Cori: people, society, small businesses and the planet, instead of a handful of C-suites outta California and Washington, I guess, if we're talking about Microsoft. So there are a lot of things that are different about this from a litigator's mindset. I mean, one is I'm asking states and authorities to use their powers rather than asking a judge.

Cori: So that's different. I also am talking a lot more to smaller businesses. Or indeed people in the open source community technologists, people who are trying to imagine or clear themselves a little bit of space for an alternative. So let's say you're a, a smaller European cloud provider who would like to construct an alternative to 3, 6 5, or you're somebody who's trying to build a cool service on top of the AT protocol, or maybe you're a.

Cori: publisher, maybe you're a news outlet whose lunch has been absolutely eaten by Google. These are the kinds of people that I have been speaking with, as well as, you know, obviously people in think tanks and, you know, adjacent to policymakers, and talking to policymakers about it. Just trying to gauge the mood.

Cori: I guess. I've only been doing it since September of last year, [00:32:00] and I guess what I would say is that that moment in January, the inauguration, the picture of the oligarchs lined up in the second row of the inauguration, you know, the Musk stuff, that has really raised alarm bells right across Europe. Now, there isn't a unity of purpose yet about what to do about it, but the sense that Europe's dependencies on US tech monopolies are a threat that they can no longer afford to maintain

Cori: is, I think, much deeper and more widespread than you can tell from the headlines. And I also think that it is permanent. I think it's like that moment in the divorce when the unretractable thing has been said. It's very sad for me. I mean, I'm American and I want this kind of alliance to persist, but they've heard it now and they see this adverse behavior, and so they think these are critical infrastructure dependencies

Cori: that we just can't afford. And so it's a kind of smile, [00:33:00] smile, smile, smile, smile, while furiously pedaling behind the scenes, thinking about what can we do to reduce those dependencies, to be more, sorry, as chest-beating as this sounds, to be more sovereign: to be able to make and enforce our own laws, to set the rules of our own market.

Cori: The question is now just how fast and in what ways to uncouple from these US monopolies. And, you know, you see a different response from the different companies, don't you? Like, X is still very chest-beating Elon Musk: let me fund Alternative für Deutschland, and let me choose the leader of Reform and then put Reform into power. I mean, just absolutely comical levels of arrogance. Whereas Microsoft is playing a much smarter game.

Cori: I mean, just. Absolutely comical levels of arrogance, whereas Microsoft is playing a much smarter game. You know, Brad Smith goes over and does a charm offensive tour of Brussels and says, no, no, we will fight any order by the Trump administration to hand over your user's data and all of your stuff can be housed within the European Union and La Viva GDPR.

Cori: You know, he said all of the right [00:34:00] things. But at the same time, but at the same time: the prosecutor of the Palestine cases in the ICC, the International Criminal Court, because, yeah, they froze his

Alix: email account and his bank account. Oh, Mike in the uk. Hundred percent.

Cori: So, and the cloud services, I mean the whole thing.

Cori: Yeah. Microsoft,

Alix: Microsoft cut off the email account. What's... is it the chief prosecutor? Who's the... Yeah,

Cori: exactly. He's the lead guy in these Palestine cases. Exactly. But it's not just that. I mean, the whole thing, the court is basically... there's a terrifying article in the Associated Press where the court has basically ground to a standstill because of these executive orders and sanctions.

Cori: And so the firms who provide the cloud computing and the hosting and the emails are like, sorry, no email for you. Like, that is a court... I mean, it's not just a court, it's the literal ICC, and it just cannot operate because of its own critical tech dependencies. So if you're a European state and you're then looking at that, you're thinking, hang on a minute.

Cori: All right, well, all of my government business is on 365, or my hospitals are on Amazon [00:35:00] Web Services; that is maybe a little bit foolhardy. So there's a genuine conversation happening in Europe about how we break free of this dependency. And this stuff is infrastructure, and people don't care about infrastructure until it breaks.

Cori: See the ICC. But how we move it over and how we resolve these dependencies... I'm not saying that there is a final solution, not least because, you know, they don't have the level of budget. The money isn't quite there. There's gonna have to be some creative, slightly shoestring alternatives put together.

Cori: It's a bit Dam Busters, but, you know, there's the will there. I was thinking about the Clean Network initiative,

Alix: like, you used the word decouple, and how, basically five, six years ago, when Huawei started offering firmware and hardware for 5G networks, basically that felt really threatening to a lot of Western democracies, who both have a xenophobic view and maybe a clear-eyed view on some of the implications of having China run European infrastructure, and maybe a little bit of embarrassment that they haven't

Alix: been [00:36:00] able to innovate at the speed that China has. So there was this initiative, and last I read there were like 55 countries. So just to be

Cori: clear, does Clean mean clean as a climate point or clean as a, like as in decoupled from China? Ooh, that's not a, oh, it's, uh,

Alix: clean.

Cori: Yeah. Sorry for any great friend of mine who set that up, but, you know, I might not have put it that way.

Alix: The Clean, the Clean Network. Jesus. And according to the Trump administration, which is who started it, the Clean Network is intended to implement internationally accepted digital trust standards across a coalition of trusted partners. And basically a lot of it was, like, decouple from Chinese hardware and infrastructure if you want to be

Alix: If you want to be. Clean. Yeah. No, it's really gross to use Zeta Ahmed's term techno jingoist, but it, it made me think. Basically this mean that like countries are gonna start decoupling from us. Technology. Like that's what it sounds like you're saying.

Cori: Yes. And it won't be everything. And it won't be everywhere, and it won't be overnight.

Cori: But there [00:37:00] was a search for sovereign alternatives. And it's not kind of seeking to imitate China at all, but just to say: actually, they were able to set the rules of their tech and their market, in a way that we now are not; we are subject to the rules. I mean, you hear people in Europe describe themselves, which is a bit rich given their colonial history, but they describe themselves as a digital colony.

Cori: I'm like, hmm. But the point they're making is: we do not have agency over our infrastructure, over the rails on which our economy and our society run. And that is shocking. And you're absolutely right, I mean, it has been like that for a while, but, you know, everybody sees it now in a way that wasn't previously the case.

Cori: Sometimes there's some negative Nellies over here being like, oh, well, we don't make anything. But that too is false. They have good hosting companies, talking about that layer, you know, and they've got ASML. I mean, there are certain parts of the tech stack where they actually have a meaningful presence.

Cori: And this is about shoring up the gap. And by the way, again, this is not about taking [00:38:00] other companies' market share down to zero. Google will still be out there, but it's about saying: there needs to be an alternative in these spaces.

Alix: Did you hear that Lidl now, like, the grocery store, I think they're a German grocery store company, they now have a data center business that has outstripped their grocery business? Um, well, it's just like Amazon. I mean, you know, I mean, Amazon

Cori: didn't... yeah, it got into it because of Black Friday. That's just kind of, you know. Yeah. But it's funny that, like, Lidl, yeah. Also, I think there's a values thing here.

Cori: Like, every now and again, people don't just buy the $3 t-shirt, whatever it is; they will occasionally sacrifice a little bit of convenience. You talked about the smoothness of the services, although we all can see that there's a big asterisk over that now. People, like, if they think that these companies are attacking their way of life,

Cori: then there is some patriotism there. I think there is a possibility that people would love to go for a [00:39:00] European solution. I mean, in a way, I think it's kind of galvanized, not completely, let me not overstate this, but it's galvanized a European sensibility in a way that kind of almost nothing else has.

Cori: It's like, okay, oh my God, our closest ally since 1945 has just rug-pulled us. We have a war on our eastern front. We have a huge manufacturing crush from China. Like, we need to band together. So people and companies and parties who might once have squabbled are starting to coalesce behind this idea of reducing our dependencies and constructing alternatives.

Cori: You know, the break piece and the build piece, and they go hand in hand. But look, I really don't see this as a left- or right-wing issue anymore. I think it's, you know, it's a kind of: how do we save and defend the European project?

Alix: Yeah, I think that's right. I think there's also the Titanic sinking in the US.

Alix: But also this is a galvanizing issue, possibly, 'cause a lot of people don't like a small number of companies having so much control, so much [00:40:00] visibility. Like, I think, um, in a conversation with David Seligman a few weeks back, he shared work by Towards Justice and AI Now and a couple other organizations, I'm sorry if you're listening to this and you were involved and I didn't name you in the shout-out,

Alix: um, but basically how surveillance from Big Tech is being leveraged for wage suppression and price fixing. Americans hate that shit. Um, like, I think there's also so much political momentum that's possible in this arena that the left in the US just isn't crushing, and they should be. And I know that, like, we can't make Elizabeth Warren 15 years younger or a man, um, which is unfortunate 'cause those would probably be required for her to be able to run for office again.

Alix: But I feel like that type of argument now would probably be so much more resonant because people have seen the consequences. They've seen that picture at the inauguration. They hate Musk.

Alix: I haven't lived in the States in a while.

Cori: Me neither. I love David Seligman, by the way.

Alix: I love David Seligman. You know, he's running for Attorney General of Colorado. He just announced, yes, he just announced.

Cori: Do it. I love this. Sorry, that's timing. He probably does not benefit from my [00:41:00] endorsement. Um, but anyway. Yeah. Amazing. Donate to his campaign. Yeah, do, definitely. So it seems to me like in these big parties, both the Republican Party and the Democratic Party, there is a war going on, a genuine schism in both about the approach to

Cori: concentrated corporate power, in the way that we are describing. So, you know, in the Democrats, obviously you've got the Chuck Schumer wing, who are like, this is all fine, these people give us a lot of money and pay for our, you know, and pay for our kitchen. He's like the dog,

Alix: that dog meme that's sitting at the kitchen table and everything's on fire.

Cori: It's just like, this is fine. He's like, when it is a constitutional crisis, then don't worry, I'll be there. And it's like, yeah, totally, we don't believe you. And also, we're already in one. Crab, dear crab, the pot is boiling, you are gonna be gumbo any moment. Anyhow. So there's the kind of corporatist wing of the [00:42:00] Democratic Party, who I think really dominated the last campaign.

Cori: And then there was a wing that actually did have anti-monopolists there, and, you know, under Biden, they actually managed to get their people into really important enforcement positions. So in the FTC, in DOJ antitrust, in trade, in the Consumer Financial Protection Bureau. And it wasn't, by the way, the entirety of economic enforcement under Biden, they couldn't do everything.

Cori: But you had this cadre of officials, Lina Khan, Jonathan Kanter, Rohit Chopra, Katherine Tai, who were able to face the state, again, in this very different direction, to reinvigorate certain kinds of economic regulation in ways that started to make a difference. Did it save the Democrats in an election? No, because, as I say, of the schism in these parties, but they, you know, they started some of these breakup cases.

Cori: Actually, some started under Trump one, although actually I think the FTC now is rescinding it. They did a click-to-cancel rule. I dunno if anybody watched Lina Khan on 60 Minutes, but from her consumer protection side, she [00:43:00] dealt with patents on stupid parts of inhalers that drove the price up a bazillion percent, et cetera, et cetera.

Cori: They were really economic, and this becomes a controversial term in different contexts, but economic populists. They were looking at the bread-and-butter issues that were causing ordinary Americans to suffer, and seeking both to address them and to be seen to address them. And, I have to say it, hand on heart,

Cori: I think that there is still a thread of that in the second Trump administration, fighting for the soul of some of this enforcement now. Do they frame it in a way that I would frame it? No. Do I agree with all of their priors? No. So they say things like, we need to stop concentrations of power so that you don't need to regulate at all.

Cori: I'm like, I mean, okay, that's not gonna deal with food safety, but okay. Um, but if you're an antitrust enforcer trying to preserve some part of that agenda, maybe that's a necessary argument. Maybe that's what keeps it going. Will they actually be able to stay the course? I dunno. Mark Meador, [00:44:00] for example, wrote a piece recently called Antitrust Policy for the Conservative.

Cori: I'm not saying that I share all of its precepts, but it is worth reading, it's worth engaging with the way that they are approaching some of this. Wholly apart from whether it's a good idea or a bad idea, if the Democratic Party loses ground among the working class to Republicans, if Republicans are suddenly perceived as the party of the working class, like, when are we gonna win an election again?

Cori: When is that gonna happen? I don't wish to put lipstick on what is obviously an overwhelming move, basically, in the direction of fascism, which is what we're talking about. Right. Well, we can

Alix: link it in the show notes. It's Antitrust Policy for the Conservative, 'cause I hadn't seen it and I just pulled it up and it looks super interesting. Um, it's

Alix: Um, it's

Cori: definitely interesting. You know, I mean, I'm not saying it's great news overall, we can all see that it isn't. But there are people... like, let's say the best thing you can say is that there were people fighting for Trump's ear, and that some of them are driving outcomes that would have some positive effects for the economy and for society.[00:45:00]

Cori: That is not an endorsement of the administration. It's definitely not an endorsement of everything that this DOJ or FTC would do. But on this narrow subset of issues, some people are, to some extent, trying to stay the course. This is why I think that the Europeans and other allied sister enforcers in the UK and Canada and Brazil and South Africa, I mean, you name it, actually have a bit of political room to do some allied and aligned enforcement

Cori: against Big Tech, because some of that stuff is still going, and I think it gives room to actually do the same thing. And you might ask, well, why would you bother if the States is taking care of it? And the answer is, well, one, there will be appeals; two, you know, look after your own democracy; and three, you know, the risk of a potential settlement.

Alix: What happens next for you? Like, what are you gonna be working on in the next year? Like, what do you wanna see? How do we know if this is working?

Cori: We'll see some major decisions out of not just the United States but European authorities, that go way beyond these parking-ticket fines and actually go to market structure.

Cori: So that could be a breakup in the ad tech case. It could be a [00:46:00] sudden change in the procurement regulations that says, actually, we're gonna favor European suppliers for some of our critical tech infrastructure. It could be a fund. It could be Free Our Feeds soaring and thriving, and suddenly we see that, you know, social media could be fun again. There's lots of different ways that it could go, but in my head, the frame I've got for it is: the breaking of choke points and the building of alternatives are really two interlaced threads, and we have to

Cori: do them both at the same time. We have to walk and chew gum at the same time. It won't be enough... I mean, I really appreciate the public infrastructure EuroStack stuff, I appreciate the effort, but it won't be enough to just chuck a few mill or even a couple of bill at these issues if we don't also intercede to open up the choke points.

Cori: You have to do both of them at the same time. There are such monocultures in key parts of our infrastructure: search, browsers, social, cloud. We've gotta crack those open and make [00:47:00] space. And in terms of the kind of institutional relationship, I'm really enjoying being a free-floating person. I mean, maybe it would be good to

Cori: have a team at some point, but at the moment I'm just, I'm really enjoying having a perch and observing and doing a bit of advocacy in these coalitions with people, because ultimately politics is about coalitions. It's about winning and making the case and persuading people, not because you're their boss or they're in your NGO, but because you

Cori: see eye to eye and you have common cause. So it's a challenge, uh, and, you know, Europe is linguistically and culturally and historically quite politically fragmented. But this crisis has also provided a unity of purpose that, if we can all just get our act together, could be the soil for something really quite exciting to grow in.

Cori: I'd like to see, in Europe at least, 'cause people actually do care about some of this stuff, but people don't always connect it all together, I think... people like your representatives need to hear from you on this stuff, because [00:48:00] too often I think tech issues are left to the kind of technocratic Brussels cadre.

Cori: So this has gotta go, ultimately, politically... this has gotta be buzzy in the capitals, this has gotta be buzzy not just in the capitals but in the countries and the member states. So this needs to not just be a Brussels bubble conversation. And the equivalent is true for your listeners in other countries, right?

Cori: It's gotta be a German conversation, a French conversation, a British conversation, a Brazilian conversation, and some of it's there. But I'd like to see the conversation graduate from harms on social media, about which I care, upstream to: God, we can't do anything about this because of the sheer power and size of these companies.

Cori: So what are we gonna do about that? The truth is that in most of these places there are levers, there are already legal levers, to do things about it. The antitrust laws, as I said, that were used by the American enforcers are a hundred-plus years old. They didn't, like, pass a new law, 'cause Americans can't do that right now. They used the laws that they had and they changed their posture, and that can happen again.

Cori: They used the laws that they had and they changed their posture and that can happen again. [00:49:00] People who say, oh, well Europe doesn't do structure. I completely disagree with that. They've done it in other, you know, in energy markets and others. It's just about creating the political will to say, this is now the rail that which our economy runs, and we will no longer tolerate it being in a few people's hands.

Cori: Yeah. Make your views heard, I mean, is what I would say. Yeah. I mean, sure, you know, talk to me, but actually talk to them. Talk to the people who can get it done.

Alix: Yeah. And, like, start, you know, reading stuff. That's also a good place to go. And I think, yeah, just generally knowing that it does not have to be this way, I think, is the other big thing. Because I think we have grown so... I mean, it's like

Alix: me when I was talking about, like, that Linux machine: it was, like, shitty experience or, uh, corporatized mega-company experience that pretends that they're, like, innovative and interesting. Um, and I feel like those are not the only two options. There can be more...

Cori: stuff can be better. It has been better before.

Cori: It can be better again. And actually, we have taken on concentrated powers like this before. I mean, actually, you [00:50:00] get a Google, when it was good, because of earlier action against Microsoft; they arguably would not have emerged but for those cases in the 1990s. Similarly, you arguably don't get the computer age at all without repeated waves of antitrust action against AT&T.

Cori: In 1956, Bell Labs, their kind of skunkworks, was sitting on thousands of patents, including for, like, the transistor, that they didn't care about, 'cause actually you make a lot of money if you have a telephone monopoly. And so they go and they force them to license all of those patents for free to other companies.

Cori: Boom. You get the computer age. And so when we talk about the potential for licensing out the search index, it's something very, very similar here. Anyhow, so, so yeah, this idea that we're stuck with what we got? Uh-uh. I mean, you only have to zoom out and take a little bit of a historical view to realize what we have

Cori: is not inevitable. We didn't have it before and we don't have to have it in the future, but we have to take the field, including maybe making some uncomfortable allies and going to places... I mean, this is the journey I've been on about, let's say, government and state power. You [00:51:00] know, going into places where we might ordinarily have felt uncomfortable or have felt suspicious, and saying: if we cede the field here, it will be filled by private power.

Cori: We can't leave it to them. We have to shape this future ourselves, 'cause if not, it's gonna be shaped for us.

Alix: Thank you. This was awesome. I feel like every time we talk, I have, like, 400 connections in my head that I didn't have before. This is lovely, so thank you.

Alix: I hope you enjoyed that conversation with Cori. I find hearing how people end up where they are is oftentimes really informative and interesting, and her career is so inspiring to me. So hopefully you learned something new if you already knew Cori, and if you didn't know Cori's work, you can, uh, connect with her.

Alix: We will share her contact information in the show notes. And up next, uh, we have a conversation with Paris Marx, who, if you don't know him... if you listen to our show, I'm pretty sure you listen to his. Um, but he is a self-described left-wing tech critic. Um, so he [00:52:00] thinks a lot about technology as part of progressive politics.

Alix: Probably the most surprising part of our conversation is we went kind of deep on the Chinese electric car manufacturing industry. And then, following that, we are also doing two mini-series: there are kind of clusters of episodes of interviews with people where we kind of spotted a theme that we were interested in and then decided to do a couple of episodes about each.

Alix: Um, the first is gonna be focused on decentralized technologies, which, you know, maybe sounds pretty boring when you first think about it, but when you hear Cori talking about centralized technologies, it becomes ever more important to think about: well, if we don't want centralization, what does decentralization give us?

Alix: So we have a set of conversations with different people that look at that issue. And then the second mini-series we're putting together now is one on scams. Not just scams, but sort of how our economy is kind of turning into a massive casino, and how [00:53:00] that is affected by technology, but also how it's affecting the kind of politics around technology.

Alix: And if either of those topics are things that you work on and you want us to know about your work, please do reach out. Our email as ever, is in the show notes. Thank you to Georgia Iacovou and Sarah Myles for producing this episode and also for producing a weekly show for 51 weeks straight. Um, we took two weeks off partly 'cause we realized we'd just been doing this every week for a year and figured we should, you know, give ourselves some August time.

Alix: So thanks to Sarah and Georgia for making that possible, and thanks to you all for listening, and we'll see you next week.
