Worker Power & Big Tech Bossmen w/ David Seligman (replay)

Alix: [00:00:00] Okay, so today's replay is David Seligman. This is a conversation from April of 2025, now last year. It's an excellent overview of the ways that corporations reduce worker bargaining power, and it's just this swirl of really interesting ideas related to things that are only becoming more and more relevant, like price fixing for consumers, and this perceived tension between worker power and consumer power, and how that's just a ludicrous

attempt by people with power to pit people who don't have it against each other, because consumers and workers are in the same boat. So let's jump into it with David Seligman.

Welcome to Computer Says Maybe. This is your host, Alix Dunn. This week I sat down with David Seligman, who is a lawyer and founder of Towards Justice, a nonprofit based in Colorado that's focused on protecting workers' rights and power. We get into some of the, I think, core questions about inequality in the US, especially inequality that's been made worse [00:01:00] by big tech's business model and also its legal innovations.

We get into two big concepts that I found really interesting and I keep thinking about. The first is forced arbitration, which you might know about, but basically it's when a company requires that if an employee or a contractor has a dispute, that dispute is resolved in a legal setting that is private. That has all kinds of weird effects.

One, it decreases the chances that people can actually access justice. And two, it means that the formal legal system and legal branch of government in the United States is not actually involved in making rulings that accumulate into jurisprudence around big tech regulation. The second big thing we talk about is the work that Towards Justice and other organizations like AI Now and Tech Equity have been doing around price fixing and wage suppression.

So given big tech's vantage point, where they can basically see so much about every driver [00:02:00] in their network, in the case of Uber, or every shopper in the case of Instacart, they have a tremendous amount of power and insight into what people will accept in terms of wages. David gives some really compelling, disturbing examples of how companies might be abusing

surveillance of workers to be able to offer them less money because they know they can, because a person is in a desperate situation. And then also using that surveillance to fix prices. So I think the classic example of this is Uber surge pricing, which we've kind of come to accept: if the time period is particularly busy, Uber will charge more because

it can. But thinking about that across the entire economy: what can companies do to set prices given their increasingly monopolistic position in labor markets and in economies, um, and in marketplaces? Those are the two big topics we cover. But more broadly, what I found super interesting is just the way David talks about these issues.

I feel like oftentimes when we talk about technology politics, we get so [00:03:00] focused on the technology. And I think what David manages to do in the stories he tells, and in the way he talks about these topics, is he just centers people, which I think has an amazing effect of making it clear what's at stake and how gross some of the injustices within big tech are.

With that, my conversation with David Seligman.

David: My name is David Seligman and I'm the executive director of Towards Justice. We're a nonprofit legal organization. We're based in Colorado, but we do litigation and advocacy around the country to help people in communities hold corporate power accountable. We often bring cases at the intersection of consumer protection, worker protection, and antitrust because we recognize what so many of our clients have known all along.

Right? Which is that often the distinction between those legal silos is irrelevant, that people, working people, communities are all the time subject to outrageous threats from the rise of corporate power and greed and government [00:04:00] corruption. And you know, we need to do everything we can with all the tools we have in our toolbox to take it on.

So we fight for rideshare drivers and Amazon drivers and teachers and childcare workers and farm workers, and consumers buried in medical debt, and tenants across the country hit with hidden junk fees. And of course, at the end of the day, all of these folks are just people, right? So we help people fight back against corporate power and government corruption.

I think at a time when that's more important than ever. And very often that means helping people to take on big tech and to take on abuses, both through litigation and through advocacy, including our recent advocacy and litigation against surveillance wages and prices. A lot of what, what we're really interested in is how we can break down this artificial divide between workers and consumers.

Right. Because like that's often...

Alix: I was gonna ask you about that. Let's just start there. Yeah. Because like when I think about you and Towards Justice's work, I think about what legal vehicles we use to create [00:05:00] protections for people in a system of inequality and a system of increased digitization.

One of the things I think about a lot is the vector of consumer and the vector of citizen and the vector of worker, and like all of these different ways that we relate to power. It feels like it kind of complicates the legal strategy behind it, but I don't know how you, like, how do you think about the law as it relates to different people who are affected by power in different contexts?

David: So law and politics fuck this up all the time. You know, you'll see in political conversation we'll talk about consumers, right? We'll talk about workers, and you know, that maps onto, like, our law very often, right? We'll talk about consumer protections, we'll talk about worker protections. We even have siloing within the law, like I'm a workers' rights lawyer,

I'm a consumer rights lawyer, right? I'm an anti-monopoly lawyer. And those silos very often do a really bad job of mapping onto how people actually experience the world. And sometimes those silos, especially the political silos, can create real dangers, right? And create opportunities for corporate power to divide us, right, and sort of [00:06:00] pull the wool over our eyes about what may be happening.

There's this moment that I always like to refer to, which is not directly tied to the types of technologies that we're concerned about, that we'll be talking about today, but, but I think is illustrative. Someone came in the doors of Towards Justice. At the time we were doing a lot of wage theft work, you know, fighting for workers whose employers hadn't paid them their full wages, had stolen their wages from them.

This is many years ago. She came in the door at Towards Justice and she said, I've got a wage theft problem. I took a look at her pay stubs. It seemed like what had happened is that her wages were being garnished because of an unpaid medical debt. So I tried to explain to her that she didn't actually have a wage theft problem.

She didn't have an employment problem, she had a consumer problem, and she looked at me like I was a freaking alien because I am a freaking alien when I talk like that. Right? Like, what are we talking about here? Right? Her wages weren't getting paid because of a BS medical debt.

Alix: So rather than it being her employer, it was another company.

David: Yeah, her employer had complied with this garnishment order, right, and was withholding, garnishing her wages, right, because of an unpaid medical debt. That, like, sounds a lot like wage theft to the average person. Her [00:07:00] wages are getting stolen because of a BS medical debt to a hospital that might be making billions of dollars a year.

The silos that we often set up between consumers and workers don't actually map onto how people experience the world, right? The question is, are you being screwed over? Right? Do you feel like as you're trying to push a boulder up a hill to get through your life, that there's corporate interests on the other side pushing the boulder back down the hill at you, right?

It doesn't matter if we classify those issues as consumer issues or worker issues or what have you. The point is that we need to be there to fight for people. These are people and that's what matters. It also creates lots of complicated political dynamics that really, like, undermine our ability to hold corporate power accountable, because corporate power always wants to pit consumers and workers and small businesses against each other.

Right. And I think our goal consistently has to be to break through those divides, right? To cut through the BS and to recognize that we have a common goal and very often a common enemy. That can be hard. [00:08:00] It can be not totally intuitive to people, especially when a lot of the technologies that we see, I think, are designed, right, to disappear into the background and put people into conflict with each other.

Right? So it was a couple of weeks ago, I was at the grocery store. I saw, it was this really tragic moment, right? I saw a grocery store worker having a fight with an Instacart delivery driver about the fact that there weren't any more eggs on the shelf. And if you take a second to think about that, you're like, what's happening here?

We've got, like, Instacart, who the hell knows where they are? They claim that they're just some, like, nameless, faceless...

Alix: An exchange marketplace. Yes. Yeah.

David: Platform, right, in the background. Right. But they're exercising all of this control over this worker without even giving him sick leave, paying him minimum wage, giving him any of the protections of the discrimination or labor laws, even while exercising all kinds of control over him.

We've got this grocery store clerk who's just trying to do the best that they can at the very end of the supply chain, which is dominated by a couple of really [00:09:00] powerful egg distribution, uh...

Alix: The ag lobby. Yeah.

David: And monopolies, right, that have had a stranglehold on our egg production and distribution system for years, right?

And have made us incredibly vulnerable to avian flu and other market shocks. And even as we've seen inflation on eggs go up, you know, these companies have made substantially more than that in profits. There was one year that, you know, these companies made 40% in profits, even, way above what egg inflation was.

And then of course you've got the customer at home who, you know, is, I'm sure, just trying to do the best that they can and get through the day and, like, get a little help from somebody, you know, from Instacart, as opposed to, like, maybe taking their kids out and going to the grocery store. Like, you've got these three people that are all put in tension with each other.

And you've got Instacart and then egg monopolies that have like vanished into the background. So this comes up and it affects the actual like day-to-day of our communities and how they operate. Right. And how they put people into conflict with each other. I think in really concerning ways that erode our democracy.

And then, you know, that of course shows up in policymaking, right? When you've got Uber and other big tech [00:10:00] firms arguing that we can't provide worker protections to people because those, those would increase costs to consumers. Right. Without acknowledging, of course, that...

Alix: consumers are workers.

David: Yeah. That we're talking about the same people, right?

And that all of us are in this together. And also these companies make outrageous profits and have all of this like power in our economy and our society, right? So breaking down the divide between workers and consumers is absolutely essential from a movement perspective, from a legal perspective, and from a political perspective.

And we found various ways to do it. One that I would love to just flag, which has been really interesting to see come into effect, is we got a law passed last year in Colorado that requires rideshare companies like Uber and Lyft to disclose to customers how much of what they pay goes to drivers, right?

And to disclose the same to drivers, how much a customer actually paid for a fare, right? So now all of a sudden we have situations where customers are able to see, well, crap, I paid 80 bucks for a ride from the airport. Only 23 of it went to the driver. Transparency isn't going to, you know, get us out of this mess.

Right. But transparency can be a [00:11:00] really important step in helping to expose the sort of manipulation that these companies often engage in to pit customers against workers, which can be really dangerous to the marketplace and to our democracy.

Alix: That's not new as a, as a strategy. And I wonder, I mean, it seems to me that technology as a product, like the whole idea of it is that you can scale quite cheaply once you have a product that has product-market fit.

Now it's very clear that Uber can't be profitable because of that overhead. So if you talk about an $80 ride, only $23 going to the driver, what are those other costs slash profit that they've essentially, like, taken out as this giant chunk? I think it makes more visible the way that these companies are not adding value, they're extracting value.

And I wonder, like, given this is a narrative that's happened for a very long time, that, like, companies have power, they sort of recede into the background, they create tension between different parts of the population to prevent them from exercising power collectively to make these companies accountable.

How do you see that changing with technology companies? Like, how do you feel, like, yeah, is it different? [00:12:00]

David: Yes, a hundred percent. One of the most dangerous developments we've seen as a consequence of big tech and purported innovations of these new technologies is that they have seized an opportunity and helped to create a fiction that they are, in the case of Uber and Lyft, for example, just a platform, right?

When in actuality they're exercising all kinds of hidden control over people, right? So I think it's the hiddenness of that control, whether it's, you know, control via algorithm, right? A hidden algorithm that allows them also to perpetuate a fiction of, in this case, independence, right? The suggestion that these drivers are truly independent small businesses, when in actuality they're subject to minute-by-minute

control by a boss that has, in many ways, more control over them than most human bosses ever could. Right? I think the technologies allow that because we often can't see their mechanisms of control and manipulation, right? [00:13:00] We can think about this as a, an illustration of control without responsibility. Control without accountability, right?

Because the laws of the New Deal, right? What they say is: if you are operating within an employment relationship where a company has a lot of control over you, in exchange for exercising that amount of control, they have to give you certain rights, right? They have gotta give you the right to minimum wage.

They gotta give you the right to overtime. They give you the right to be free from discrimination. They have to give you the right, most importantly, perhaps, to exercise the countervailing power of collective action, right? Of workers coming together and forming unions. The gig economy is just one example of this, but when you have a situation where you've got

outrageous amounts of control, right, of minute-by-minute algorithmic control, without providing people with any of their rights, right? Then that's a system of control without accountability. That is, I think, what can be different about these new technologies and how they pit workers against consumers, right?

They create a fiction, which I think is easier for them to create because of the hiddenness of their involvement in the transaction and their control over both sides of the [00:14:00] transaction. They can create a fiction of independence, that this really is an arm's-length negotiation in the world between two people.

When in fact the transaction is heavily structured and controlled by an entity that is, fundamentally, right, operating for its own profit. That's, I think, what makes these technologies dangerous in a new kind of way.

Alix: I really like that analysis of control without accountability, 'cause I feel like it fits the bill.

Um, yeah. I'm wondering though, just kind of going deeper into this, thinking about forced arbitration or different contractual mechanisms that have prevented employees from being able to find this accountability, aside from the attempts to prevent unionization, et cetera. Do you wanna talk a little bit about that element of things and kind of what you all have been working on in terms of forced arbitration?

David: Yeah. It's hard to overstate how dangerous forced arbitration has been to our economy and also the functioning of our democracy. Right? If you came down, if you were an alien and you came down from outer space, right, and you looked at our marketplace, I think one of [00:15:00] the things that would stand out to you is you'd say, wow, these guys, they, they seem to care a lot about law enforcement in all kinds of ways.

People are constantly talking about law enforcement, and yet there is a shocking amount of lawlessness. An outrageous amount of lawlessness. Very often, the actual laws are nothing more than words on a page. That is especially true when it comes to corporate lawlessness, right? When it comes to lawlessness committed by algorithms and people wearing suits, right?

There is an outrageous amount of fraud, of shadiness, of wage theft, of misclassification, of antitrust abuses that go unpoliced, and one of the core reasons they go unpoliced is because of forced arbitration, right? These terms that exist in almost all of the fine print that we have to enter into all the time,

that say that we cannot hold corporate power accountable in court, no matter how much they steal from us, no matter how much they violate the law. I also just wanna be clear about the history of forced arbitration and, and how it arose. This is not like an inherent thing about our law that has been around for a century.

The proliferation of forced arbitration has [00:16:00] happened over the last couple of decades because of concerted effort from the Federalist Society and the Chamber of Commerce to develop a legal theory that a 1920s law called the Federal Arbitration Act, which was never meant to apply outside of the context of, like, business-to-business negotiations, that that law allows corporations to put arbitration clauses in their fine print and allows corporations to say, as part of those arbitration clauses, that people can't sue us in court and people can never sue us as part of a class action.

Without being able to sue as a class action, you can't hold major companies accountable for their systematic fraud and abuse. The gig economy is one example of this. We have one from one of our cases, which I think is on point here, right? So we helped some drivers in California, uh, a couple of years ago, sue under California antitrust law and argue that if in fact Uber,

as Uber said and Lyft said, if in fact these drivers were independent [00:17:00] contractors, right? If they were truly not employees, like, and not entitled to the protections of employment law, then Uber and Lyft were violating the antitrust laws, because Uber and Lyft were fixing the amount that those drivers could charge in prices, right?

It's like, one or the other: if we're independent, if we're truly small businesses, give us the independence that small businesses have to have. And among the most important categories of independence under our anti-monopoly laws that small businesses have to have is the opportunity to set prices for themselves.

Right? That's core. Without that, you've got price fixing. You've got, like, a marketplace that's, like, manipulated by a powerful and dominant corporate actor that sets prices across purportedly independent entities. One or the other. This is a really important case under California law. This is a case we brought a couple of years ago.

In some ways, it's a, a core attack on a business model of control without responsibility. The court concluded that we couldn't proceed in court and, you know, compelled our claims to arbitration, because even though our clients had [00:18:00] opted out of Uber and Lyft's arbitration agreement like a dozen times, each of them had apparently, according to the court, not opted out on at least one occasion.

So you could opt outta that arbitration clause a dozen times and still not be able to establish your case in court. That's obviously, like, really harmful, and there's all kinds of evidence about how arbitration has actually served as a wealth transfer from working people to the corporations that use arbitration clauses to conceal their misconduct.

But talk about also how this undermines our law, right? It undermines the rule of law, how it serves as an attack on our democracy. We should have a court deciding whether Uber and Lyft can in fact have it both ways, right? Can they, in fact, continue to fix prices even while denying employment rights to their drivers?

As of now, no court has actually had to grapple with that issue because of Uber and Lyft's get-outta-jail-free card in the fine print of their contracts. It's an incredibly dangerous situation, and it's not just Uber and Lyft. Arbitration clauses aren't just, like, a legal strategy. They are part of a business model for big tech across the board [00:19:00] to evade accountability for misconduct.

That is really dangerous.

Alix: Yeah, I'm not a lawyer, but it feels like in a common law system that relies on precedent to get specific, then having huge swaths of decisions being made in ways that don't help accumulate jurisprudence and, like, understanding of these emerging issues, that it would, like, hobble the whole system.

David: Uh, a hundred percent. Well put. We should always be talking to non-lawyers about forced arbitration. By the way, we should talk to non-lawyers about all this stuff, right? Because non-lawyers all the time recognize how absurd this is, right? The conversations that we have to have with our clients. You know, we talked to a, a woman several years ago, I remember.

She worked from home for one of these like online shady companies that very often will target moms. They, they actually often will target military wives with like work from home jobs, right? Where you can work anywhere around the world and they involve, it's sometimes like call center work or things like that.

And many of them involve minute-by-minute control over people [00:20:00] within their own homes. So in this case, this is a person who described that she was monitored every second of the day by a company across the world, right? A company she barely interacted with, a company that called her an independent contractor, and the amount of their control was so extraordinary that at one point

she peed herself at her desk in her own office, right? Because she couldn't get up to use the bathroom. Right? So take a second to think about that. Like, what is happening in a society where people are peeing themselves at their own desks in their home office because of some corporate entity halfway around the world that disclaims responsibility to them?

But you know what? We couldn't help her pursue it in court. Right. Because that same fine print that purported to allow them to exercise all this control without responsibility, it included an arbitration clause. That's so fucked up. Yeah. Like, it's unbelievable how fucked up it is. Right? We're talking about a world where you can get away with just about anything and use the fine print to shield yourself from accountability.

It's outrageous and it's really dangerous, and [00:21:00] we shouldn't be talking about it in abstract terms. When we talked to that woman about her rights, she was astonished. Like, how could they do that? How could they freaking do that? And you know what we have to say? I mean, I don't know, take it up with John Roberts, take it up with the five-four Supreme Court that ruled companies could get away with this stuff.

Alix: I was gonna ask, so basically there have been attempts to make forced arbitration either illegal or constrained, and it was not successful.

David: Yeah, so, so basically this 1920s law called the Federal Arbitration Act recognizes that companies, especially companies bargaining at arm's length, will very often enter into arbitration provisions that allow them, as opposed to going to court, to use third parties to resolve disputes, and that those agreements,

the Federal Arbitration Act says, should generally be recognized as enforceable. That law from the 1920s was identified in the 1990s, really, and beyond as providing an opportunity for corporate America, the Chamber of Commerce, and then, supported by the Federalist Society and others, was seen as an opportunity, a potential path to obtaining get-outta-jail-free cards through the fine print.

And so over the course of several decades, various legal theories were developed about the Federal Arbitration Act. What it meant in practice, according to courts and ultimately to the Supreme Court, was that as a matter of federal law, in consumer contracts, in worker contracts, in contracts that no one ever has any opportunity to negotiate, corporations could say:

You can't hold us accountable in court. You have to go to private arbitration. You have to go to a private arbitrator of our choosing, right? And in addition to that, you can't sue us in class actions. There are all kinds of ways in which we're gonna rig arbitration against you, make it harder for you to actually obtain relief.

You know, especially because in so many of these cases, class actions are the only actual way to hold somebody accountable. Those arbitration agreements can be, you know, exceptionally harmful. All part of the design of the Roberts Court and the Federalist Society and the Chamber of Commerce.

Alix: I'd never really thought about the effect that this must have on refining our understanding of what is legal and not legal in the context of algorithmic management and sort of [00:23:00] technology systems more broadly.

'cause if this system has largely been in place since the inception of scaled technology companies that have this type of power over people, that means that essentially almost all of the cases that would've been brought would've been prevented from being aired in court. So then you end up having, like, I imagine, plaintiffs identified in very particular contexts to be able to bring suits that could actually potentially get legal remedy that sets precedent.

David: Absolutely. Most of the precedent that we see happening when it comes to private enforcement, just about all the precedent that we see happening when it comes to private enforcement, is only happening in cases where the plaintiff and the defendant don't have a contractual relationship. But as soon as you've got a contractual relationship, as soon as you've actually transacted, as soon as you've clicked the box that says, I accept the terms and conditions,

then, as you were saying, you are never gonna be able to hold those business models to account in court. It doesn't mean that those business models can't actually be held accountable right now, because public enforcers, as a general matter, right, can still bring enforcement actions. Right?

So the Department of Justice, the Federal Trade Commission, the Consumer Financial Protection [00:24:00] Bureau, state attorneys general, who are exceptionally important, increasingly important in holding corporate power to account, can still bring suit. The problem, of course, is that enforcement resources are limited. In some other countries, and my understanding is in lots of Europe, they don't have the same sort of reliance on private enforcement because they've invested so heavily in public enforcement.

Right. But we don't have the same sort of scale of public enforcement in the United States, 'cause we built a system which relies on this, like, private-public coordination around enforcement. And the development of arbitration has been bound up with, with big tech, the role of private equity in big tech,

the role of big law in big tech. It has been rolled out as a, like, a key way for big tech to evade accountability, even while violating the law and exercising all kinds of outrageous control over people.

Alix: What an American thing to do.

David: Totally, totally. So when you hear people say, well, what about the innovation here? You know, all these sorts of innovations here, and sometimes even in the legal context, right? You'll hear people talk about the legal innovations

You know, all these sort of innovations here, and sometimes even in the legal context, right? You'll hear people talk about the legal innovations. [00:25:00] Of these new models very often the core innovation here is really just forced arbitration. That's it. Right? An innovation blessed by the Supreme Court to allow companies to evade accountability.

That's the fricking innovation we're talking about here.

Alix: Yeah. Okay. Well, I mean, you used the word lawlessness to describe some of this, and I feel like, I'm kind of wondering what you think in terms of the way that this is manifesting in federal government politics, 'cause it feels like essentially the fusion of big tech approaches and government is happening before our very eyes.

And basically every headline, every 24 hours, it's becoming more and more obvious, which feels partly connected to, like, brain-rot Silicon Valley thinking about the role of the state. But it also feels like an expectation of a society where the powerful don't experience any inconvenience of accountability.

Um, do you wanna say anything about like this kind of tech oligarchy moment and kinda what we're seeing politically at the federal level as it connects to some of these practices that you've seen kinda emerging and building over the last decade or so?

David: Absolutely. I mean, [00:26:00] I'm gonna borrow something that I heard just the other day, which I thought was beautiful and spot on, from Representative Greg Casar from Arizona.

The way that he put it was, we are seeing a convergence of corporate greed and government corruption. That should be our top priority right now. And he was talking about the Democratic Party in particular, but it should be all of our top priority right now, that it is, like, an incredibly dangerous state of affairs.

And just to be clear, I don't think things were great under the first Trump administration just because, like, Amazon put, like, a Black Lives Matter symbol right in its Twitter handle or whatever, right? That was never gonna freaking save us. Okay. But I think that we are at a moment, right, when even for many people and for many of these firms and for many of these billionaires, even the pretext of hashtag resistance not being there has set up new kinds of dangers.

And I think that the really scary thing, and I think we need to be taking it on aggressively, and that's gonna have to happen in communities, in our organizing [00:27:00] at the state level. It's something that we should be exceptionally concerned about. And the examples abound. I mean, just one of them that I'd like to share, something that has gotten lost just a little bit in Elon Musk's dismantling of the Consumer Financial Protection Bureau, is just days before the dismantling began, right?

Just days before, X announced that it was going to enter into a deal with Visa to start getting involved in the payment space. So I think what they've announced is that there's gonna be some kind of new digital payments, digital wallet called X Money that's gonna launch at some point in 2025. Digital wallets have been the subject of all kinds of concern from the Consumer Financial Protection Bureau, from state regulators, because they're ripe for abuse and for fraud.

There's very little oversight and regulation. There are often ways to sidestep regulations put on banks. By the way, when combined with, you know, for example, Musk's interest in stablecoins, right, where Musk seems like he's gonna start doing his own money and having his own payments platform, right? It just sounds absolutely outrageous.

So we're talking about dismantling the Consumer Financial Protection [00:28:00] Bureau days after you've announced that you're gonna start to get into the consumer financial space. That's an outrageously dangerous state of affairs, right, for our marketplace. And we're gonna see it around consumer protection. We're gonna see it around worker protection.

We shouldn't lose track of what it means also for environmental protections, right? I think, when it comes to, like, AI and crypto, those obviously pose all kinds of threats to the environment, right? Like, dismantling environmental protections, I think, is gonna be a key part of this project, especially because so many of these guys, right, are so heavily invested in those industries.

We could go on forever. The point is that the threat of big tech to our democracy right now, I think, is more acute than it's ever been. Big tech was never going to save us, and now, weaponized against us, it is going to be a huge part of the problem. And I'll also say too, like, taking it on is going to be essential for saving our democracy.

You know, we see how our democracy can be manipulated by big tech. And in addition, I'm very concerned, I think we should all be very concerned, about a state of affairs that I think we're rapidly entering into now, where most [00:29:00] people, most of the time, are going to be able to lead lives that look kind of like the lives they had before, right?

Where society, civil society, protections for lots of people have evaporated and dissolved. I think big tech is already a part of making us feel like our lives, for most people, are largely the same as they've ever been. And especially when, you know, these firms are making more money than ever, that will allow

lots of people to, like, you know, continue to have just enough to get by, and that will make it really hard for us to fight back against the sort of rapid proliferation of autocracy.

Alix: Yeah. It's the, it's the other, I think under-engaged-with, part of the banality of evil and Arendt's thinking, that, like, basically life can continue being really banal, even if you're

under a system of fascism.

David: There's this concept, and I just wrote about this and I forget who wrote about it, but this concept of what's called the dual state, right? Which is, like, someone initially wrote about this at [00:30:00] the dawn of fascism in Nazi Germany and identified that one of the real dangers of

fascism was this creation of this dual state, where some people were living in absolute hell and everything in their lives had crumbled and they were subject to all kinds of absurd threats from a dangerous autocracy. And for most people, most of the time, life was kind of the same, right? That's what allows these dangerous systems to propagate.

So I, I think big tech is absolutely a part of that.

Alix: So if we think about, I mean, there's so much happening that's so intense at the federal level, and thinking about how difficult it is to imagine a strategy that would emerge now that isn't already kind of underway to stop it or change it. I feel like people are kind of thinking at the state level, those are more controlled political spaces.

There's more options for maneuvering, at least at the policy level, potentially also at the organizing level. Like, how do you think about working in Colorado as part of a strategy to help get at some of these federal issues? Yeah. How do you think about that [00:31:00] juxtaposition of state activity and federal change and, and power?

David: Bottom line is, I think we need to be doing everything we can, everywhere we can, all the time, right? If we have an opportunity to get stuff done at the state level, we should do it at the state level. An opportunity to get stuff done at the city level, we should do it at the city level. If there's an opportunity to get something done at the community level, we should absolutely do it at the community level, right?

It's not only about enforcing the law and passing new laws, right? It's about how we stand up to corporate greed and government corruption, and we stand up, have to stand up, in all kinds of ways all the time. Very often, that's also gonna involve showing solidarity across groups that have come under attack.

The states are absolutely going to be a key part of the effort to enforce the laws on the books, a key part of the effort to pass new laws, a key part of the effort to stand up to protect people. But you know, I don't think that it's necessarily worth overthinking things, right? We need to do everything we can, all the time, however we can. And, and it's just one example: in our recent report, which we call surveillance [00:32:00] wages and prices,

we talk about how the Federal Trade Commission had been, up until Trump's inauguration, interested in taking on the threat of surveillance prices. Uh, had sort of done the beginning work on a study on surveillance prices, identifying how pervasive this problem may be, how expansive the technology may be to allow

companies, major corporations, to charge people different amounts at different times based on their financial vulnerabilities, which is really dangerous. We are not optimistic about anything happening at the federal level. Right? So the first day on the job, Andrew Ferguson, the chair of the Federal Trade Commission, he announces that he is going to, first, they're gonna, like, withdraw all of the FTC's internal HR DEI guidance, right?

And they're also gonna cut off comments on the FTC's surveillance price study, right? So again, we see this as a coupling of bullying, right? Pulling back, you know, the diversity, equity and inclusion stuff so you can bang your chest on, you know, Elon Musk's social media platform [00:33:00] while at the same time you're undermining any efforts to investigate or hold accountable major corporations that charge people different amounts and pay people different amounts

based on their financial vulnerabilities. So that's what we're dealing with at the federal level. That's why in our report we call upon states to do more, states to enforce the laws on the books, and also for states to pass laws. And now there are bills running across the country that would prohibit corporations from using surveillance wages and prices to charge us as much as possible and pay us as little as possible.

Alix: God, that's so grim. It makes me think about ATMs and placement of ATMs and how banks use where you live to change fees. And I feel like that is now illegal, isn't it? Um, like banks aren't allowed to charge place-based fees where they charge essentially poorer communities more in let's say overdraft charges, et cetera.

That's not legal anymore, is it?

David: Yes. It should never have been legal. Right. Um, of course, I expect, we know that stuff like that happens all the time, often through surveillance wage and price mechanisms, right? Often through, like, online algorithms. We know that it happens all the time, [00:34:00] but you're right, it's illegal.

I mean, that's, that's something that we discuss in our report.

Alix: How, how do we get in these, how do we keep getting in these situations where existing legislation isn't applied when digital business models start breaking laws that already are on the books?

David: Honestly, first, forced arbitration. Back to forced arbitration: like, forced arbitration fundamentally is a business model for big banks, big tech,

big everything to violate the law and screw us over with impunity. Right? Like, that has been foundational to what has happened over the last couple of decades, for sure. I think also there's a lot of this that comes from this pervasive marketing and political strategy that what we're doing here in a lot of these cases is novel and different,

when very often it involves nothing more than breaking the law, right? FinTech is one example, right? You know, the new technologies in financial services spaces, which very often, not all the time, but very often, just involve finding new ways to break the law with a fancier website and some private equity funding and some, [00:35:00] like, bros on YouTube talking about how cool their stuff is, right?

Like, that's, it's innovation. It's just innovation, right? I mean, and that's a real problem. I mean, you know, we think about, I think about this, like, one time I was talking to a lawmaker in Colorado. We were talking, this was years ago, we were talking about the gig economy and worker protections, and he's not a bad guy at all.

Right? And I think his heart actually is sort of in the right place on this stuff. We were talking about how many delivery drivers are living in really destitute poverty, right? Really are not even making close to enough money to put food on the table. I was describing a common problem, a really sad problem.

I think a really powerful illustration of how harmful some of these arrangements are is that we've got so many clients who describe situations when their kid is home sick from school, and the kid has to ride in the back of their car and watch an iPad while they make deliveries, right? Like, how, how sad is that? I can sit at home when my kid watches Bluey, right?

For many of my clients, their kid is watching Bluey in the back of the car while they make DoorDash deliveries, feeling like crap, [00:36:00] right? It's horrible. And I was describing that situation, and he basically said, yeah, it sounds horrible. Isn't it amazing that I can just get a burrito on my smartphone in, like, 10 minutes? And I, yeah.

Back to this consumer-worker thing, right? So these sorts of models of control without responsibility, the marketing and gloss of private equity, big tech lobbyists describing these innovations, that allows some of these new models to persist in ways that I think often, not always, but often involve nothing more than just breaking the law.

Alix: And it goes back to that point about enforcement. When you don't have muscular enforcement that's actually enforcing those laws, um, yeah, a lot of those laws don't really matter anyway. So.

David: Yeah, yeah. There's something called earned wage access. So, earned wage access is what they call this product, which just sounds like a payday loan to me.

But basically what they say is, this is not a payday loan. What we do is we partner with your employer to give you your wages, the wages you've already earned, purportedly, before your paycheck comes, right? So situations where people are living so paycheck to paycheck, right, that they have a difficult time putting food on the table before their next paycheck comes two weeks later.

First of all, let's take a step back. Let's talk about where our society is, right, when that's what we're talking about here. And by the way, some of the biggest consumers of these products are service members. So we're talking about military service members who can't put food on the table between paychecks.

You know, these companies target them and say, we'll partner with your employer and we'll allow you to gain early access to the wages you've already earned in exchange for a fee. Then when your paycheck comes, we take the amount you owe us right outta your paycheck. That's, that's just like a payday loan, right?

Repackaged as a different kind of product. Um, and...

Alix: Worse, because it's automated, which means structurally they probably end up losing more money than you would if you did it one time. Um, and it essentially...

David: A hundred percent. Cycles of debt. You can't make decisions about who to pay off first, right? 'cause it's gonna come right outta your wages, right?

Like, they're more powerful than any of your other creditors. Not only that, here's where it intersects with surveillance wage issues, right? There's all kinds of evidence that employers, or companies that [00:38:00] employ you to do work, when they have information about your financial vulnerabilities, they may pay you less in wages because they know that you have lower bargaining power.

Now, if your employer knows how financially desperate you are, right, are they paying you less because of that? Like, you know, how much more power do they have over you? But in states across the country, you know, this industry is running around and saying, we're totally different, right? We're totally different from payday lenders. The Biden administration's Consumer Financial Protection Bureau says, this is just a loan.

We just gotta treat it like a loan. That's what it is. Of course, Trump rolls that back. Again, another illustration of this sort of dangerous pattern of creating new products that appear, on their face, designed to evade the law and screw people over, but alongside forced arbitration and this sort of marketing ploy of repackaging them as some, like, innovative and disruptive new product,

they effectively go unregulated. You know, a case that we brought, the Gill versus Uber case that we filed a couple of years ago in California, brings a couple of claims on behalf of drivers against [00:39:00] Uber and Lyft. One of those claims is that Uber and Lyft are involved in, as I explained earlier, that they're involved in price fixing, right?

Because they claim that their drivers are independent, and that matters to Uber and Lyft's business model because independence, the fact that they, these

They aren't subject to the National Labor Relations Act that would give them the right to collective bargaining and the opportunity to come together and exercise the countervailing power of the union of their numbers, right? But we say that you're actually operating under a model of control without that responsibility, and that control includes.

fixing prices, because the opportunity to set individual prices for independent businesses is core to our purported free and fair market, right? So you're rigging the market in ways that violate the antitrust laws in California. But as part of that case too, and one of the really powerful things that flows outta that case is that it got us thinking more, we did this working especially with driver [00:40:00] organizations like Rideshare Drivers United in California,

Colorado Independent Drivers Union in Colorado, the New York Taxi Workers Alliance in New York City. We see how those companies use a model of what Professor Veena Dubal calls algorithmic wage discrimination to exercise control without responsibility. Algorithmic wage discrimination, or what we also call in our report surveillance wages, means using

Also, surveillance wages means using. Individualized data about people, their status, their behavior in order to assign them different wages for the same work in order to pay them as little as possible while still getting them to perform the work. Right? So there's all kinds of examples of this in Rideshare that we hear about anecdotally, right?

You hear about people who. Are paid less at the end of a shift for a job that might bring them closer to home, right? So because they might be going home anyway, Uber and Lyft will try to pay them as little as possible, or [00:41:00] situations in which drivers think that because they're more financially desperate, right?

for whatever reason, there's some indication that they are more financially, financially desperate, they get paid less. Uber and Lyft have sought patents on the use of algorithms, of surveillance wage and price algorithms, to allow 'em to set different prices and different wages based on all kinds of personal, individualized characteristics.

And in that case, we also brought a claim under California law that the companies were engaged in what we call illegal secret discounting. As a general matter, under California law, you can't provide discounts to some independent entities that you aren't providing to others based on considerations that are hidden, right?

Because our market says that, as a general matter, we want information about discounts, information about prices, to be public, to be publicly available. And rideshare drivers, as we see over and over and over again, are really the canaries in the coal mine for so much, right? So we're seeing the same models proliferate into nursing, into teaching, right, into all kinds of professions.

And I think we are rapidly approaching a world [00:42:00] where a teacher could get paid less because an app that assigns her to work may know, from her online behavior, that she searched for divorce attorneys and may be more financially desperate. Like, that's the world that we're entering right now, and we may not be that far away from it.

Stuff like that is already happening, and I think that if we don't do something to hold these systems of control without responsibility to account, then we're gonna be there before we know it.

Alix: I think it's hard to even wrap your head around the level of surveillance that's happening. I feel like people are starting to understand, and there's, like, an instinct to defend, but I don't think that, at the granular level, they realize both the extent of the surveillance or the

extent to which these companies are willing to go to fuck you over and take advantage of this information. 'cause it's disturbing. Like, it's disturbing to think that there's these huge companies that would be willing to systematize this type of abuse and abusive practices.

David: Absolutely. Absolutely. And I think very often, I mean, you know, they don't understand them as abuse, and that's part of the,

like, this, this is innovation. And I think that's the problem.

Alix: Yeah. When really it's just extraction, um, zero-sum extraction. It's not creating value that it then gets to reap the rewards of; it's not creating value, and it's taking from places that it shouldn't be able to. Yeah. I don't know if you have many thoughts on, um, I mean, I imagine in Colorado it's particularly pronounced, but the intersection of immigration

enforcement and technology, and the kind of fucked-up, like, not just at borders, but also in the context of ICE. Things like ICE buying third-party data on people to be able to surveil 'em outside of what government is allowed to do, and these kinds of increased militarization at the technology level also with immigration enforcement and deportation.

Yeah. What your thoughts are on what might happen or, or, or what types of, um, considerations you all are thinking about in terms of legal protections.

David: There's no question in a lot of these cases that we will have to fight back in court. I do think that forced arbitration [00:44:00] might enter into the fray at some point.

Right? Especially when you have abuses by corporate entities that you have a direct contractual relationship with. So I think that's something that we all ought to be paying attention to, and a way in which forced arbitration can help to insulate, again, the collaboration between big tech greed and, you know, corporate corruption and brutality and autocracy.

Alix: Oh my God. So the government contracts a company. That company then does something, and if they break the law in the context of doing that, they can then push it into forced arbitration. So ultimately, you don't even get state accountability in that loop, because essentially the company acts as a mediator between that.

Had not thought about that until this second. That's really...

David: There, there are arguments that you can make to try to avoid arbitration in those contexts, but depending on the context, it can be hard. Courts, of course, may not accept them. Right? So, like, yeah, are we gonna be able to hold accountable the private entities that do

the work of autocracy if they force us into private arbitration? It is not clear, and it is going [00:45:00] to depend on a case-by-case basis. That is a very scary prospect, but I do worry that that's the place where we may be headed, and that is something that we've been thinking about and been concerned about.

But again, just broadly, as a political matter, I think we need to, like, continue to attack the relationship between big tech and the Trump administration directly and, you know, show some solidarity across all of civil society, whether it's the big law firms that are attacked, the unions under attack, whatever, and standing together and fighting back.

Alix: I hope that was as inspiring and engaging for you as it was for me, and also that you learned a little bit, especially if you work in the field of advocacy around these issues, about how to really center people and center what's at stake when we talk about these topics. I think it's so important right now that the big case that is being made is one of power.

It's one of money. It's one of control. It's one of oligarchy. And if we get too nitty [00:46:00] gritty in terms of, like, the digital issues and the digital detail and even the legal detail, I think we can lose that perspective on really what the story is, and I think David just does an incredible job of that, and I hope you enjoyed it.

Thank you so much to Georgia Iacovou and Prathm Juneja for helping structure this episode, this conversation, and to Sarah Myles, who as ever makes these things sound so nice. And next week, total change of topic. Um, but we get into kids online and how to understand that topic in a way that isn't influenced by moral panic, but also doesn't disregard how much devices and connectivity are changing the lives of kids.

So next week, uh, you get to look forward to that and we will see you then.
