Logging Off w/ Adele Walton
Alix: [00:00:00] Hey there, welcome to Computer Says Maybe. This is your host, Alix Dunn, and in this episode I sat down with Adele Walton, who just came out with a book called Logging Off that I highly recommend, particularly if you have friends or family who maybe aren't nerds like me or you on technology politics questions, but who you want to understand the various ways that technology companies are influencing our lives. Oftentimes negatively, sometimes positively, but it's complicated. That can be a very difficult conversation to start because there are so many different ways individuals experience these technologies, and Logging Off is basically a chapter-by-chapter exploration of all these different intersectional ways that technology affects us, with really fantastic grounding: compelling stories about people that have been affected.

There are two things that motivated Adele to write the book, which she and I will talk a bit about, two [00:01:00] very different things. The first is that she lost her younger sister to suicide, and it turns out that her sister had spent a very long time on platforms that were encouraging suicide. This experience, the loss of her sister, inspired her, angered her, motivated her to explore and galvanize people around the problems and tragedies that these technologies are bringing about in communities, to hopefully spur action and change. And she continues to work with a network of bereaved families who have been on the receiving end of platform pain.

The second motivation for writing the book? Very different. She answered a classified ad, so I'll let Adele explain a little bit about that. But I highly encourage reading Logging Off, and also buying it for family and friends who you want to be able to have these conversations with, without having to, you know, take them on a 101 university course of all the ways these companies are causing harm, because this is a [00:02:00] much easier, more efficient, more compelling way to onboard more people into this conversation. So definitely check out the book. We will link to it in the show notes. It just came out this week. But with that, Adele Walton.
Adele: I am Adele Zeynep Walton. I am a journalist and, more recently, the author of Logging Off: The Human Cost of Our Digital World. And I'm also an online safety campaigner with Bereaved Families for Online Safety and, more recently, also an EU youth movement called Control Alt Reclaim.
Alix: Obviously you're looking back at how you used it when you were younger and now you have the benefit of being slightly older.
You're still in your mid twenties.
Adele: Yeah. 25. Yeah.
Alix: Yeah. The technology's also changed while that's been happening. Like, what are your reflections on your practices now? Because sometimes you can critique a system, or understand a system, and still be in it in a way that you, like, recognize. Like, how do you feel about your [00:03:00] technology use now?
Adele: I'm very much still in it, undoubtedly, but it's interesting that I've sort of shifted away from it, and this is an internal thing more so than, I think, the technology. Like, as I've improved my relationship with my body image, I wouldn't say that I have body dysmorphia anymore. Of course it comes and goes, but I don't have that anymore.

And I, in that process, have become a bit more of a workaholic: rather than body image being my obsession, work and my sort of productive outputs have become a bit of an obsession since university. I think anyone would see that from following me. Like, if you scroll through my social media, whilst it is personal, it's a lot more of, like: what am I doing in my professional life as a freelance journalist?
And I think now, I am working on it, but there is sort of more that dependency of: am I doing good work? You know, like, I put this out, how is it being received? Is it being talked about? And that's a shame. But, you know, I write in the book about how, for so [00:04:00] many freelancers, whether you're a journalist, a musician, whatever, these platforms have become our main way of sustaining our livelihood, in that we find new clients there, we find new collaborators there. I wouldn't have got a book deal if it wasn't for my social media presence, which I've been curating, as you say, since I was a teenager. So it's really difficult to critique it whilst being within it and whilst being dependent on these platforms.
But I'm definitely still in it, still trying to escape. But I think, as much as we each as individuals might be trying to log off and detach, we at the moment do not have this so-called third space that we can go back to, because as we have become more dependent on these tools, we've been conditioned that this is the only way. This is the only way to live your life as a freelancer.
There's no way you could just go up to someone in the street, or go up to some business on the high street, and pitch yourself to them. Who would do that? You know, it's such a foreign thing [00:05:00] now for us to sort of build those relationships organically. And that's why we do think that the online community is community, but actually it very much doesn't live up to that expectation, or the lie that we've been sold.
Alix: No, and I also think that there are benefits of digital marketplaces that we should get to access. Like, we should have those benefits without having to deal with all of the structural harms. Something I could feel throughout your book is that you're not saying technology's bad; you're saying the way that this technology has evolved is bad, and it has horrific consequences. But we shouldn't necessarily, I dunno, throw the baby out with the bathwater. It's that we don't necessarily have control over forcing accountability for the people that should be responsible for dealing with those harms, and, like, changing the way that these systems function. And I feel like the alternative isn't to log off forever. What I like about it is it's not "log off," it's "logging off." And I feel like there's this [00:06:00] iterative process of, like: I'm in the system, I'm outta the system, I'm in the system, I'm outta the system. And, like, how can I appreciate the bigger picture without feeling like it has to be this binary of I'm either on it or I'm not? Because it maybe isn't realistic, or even good, to check out entirely.
Adele: Definitely. That's a really interesting take on it, that it is an ongoing process and a daily process of, like: how in this day can I make time and space for logging off and for slowing down? My thing is that logging off, what that means, is a rejection of the culture and the sort of attitudes that we've all been conditioned to adopt in our own lives. Like, urgency culture is so, so heavily tied to social media and its design. I think not many people realize that until they start detaching from it and putting those stopgaps in place in their own lives, whether it's using apps like Opal to block out time in their day where they're not gonna be online, and [00:07:00] actually sitting back and being like: oh, I do have time to go for a walk.
I do have time to meditate. But when there's that constant stream of information and of stuff that's going to hold your attention, you don't feel that you have the time. So yeah, it is definitely a rejection of the toxic culture that Big Tech has created. And I'm definitely not, like you say, a Luddite. No shade to Luddites, it's like, you do you, but that's personally not my lifestyle. And I don't think I could go to the opposite extreme, but what we have right now is not working for most of us, and it's harming so many people in so many different ways that I try to sort of reveal in the book. So, yeah.

Alix: Which I love, by the way. Like, the book, I mean, I feel like chapter by chapter it's almost like a human-interest lit review of different fields and different people that for the last 15 years have been [00:08:00] trying to articulate harms in ways that are tangible, so that people can understand the kind of systems that are developing around them. I think it's such a good entry point for people that don't wanna go read a bunch of science and technology studies papers about these things, um, but are looking for a holistic perspective on the way that these systems have ended up the way that they are, and the nitty gritty of what that means for people.
Adele: That's something I didn't really realize until writing the chapters where it was more about my personal reflections on growing up on social media, and the fact that, you know, I got my first Facebook account at the age of 10, got my first Instagram account at the age of 12 or 13, I think 12. So many of us are like that now, with TikTok as well especially. I think TikTok has only exacerbated this feeling of, like,
being in social situations and sort of observing and calculating in my mind: what is the content that I can extract out of [00:09:00] this situation? And it sounds so icky, because it is totally unintentional. Like, it's a totally unconscious process that we have learned through being teenagers on these platforms, being wired to turn our lives and our real experiences into something to be shared through a visual lens. Even just the fact of, like, growing up and knowing your angles, I didn't realize that that was a generational thing that is quite tied to my generation.

Alix: Wait, what does that even mean? That, like, you look good from one side but not from another?

Adele: Yeah. And that has come from, in the book I write about how I would
rush home from school and college, and this was particularly when I was struggling with body dysmorphia, and I would spend hours taking selfies every day. That was a totally normal part of my, like, daily life and daily routine. [00:10:00]

Alix: Was it because, yeah, if you could capture, like, the perfect image of yourself, it would somehow freeze something, or, like, help you reaffirm something? Like, what do you think was the motivation for that?
Adele: I think it's control. If I could secure and capture, you know, an image of my body in which I felt good about how my body looked, in that one photo, it was like: okay, I can relax a little bit now, because at least there's, you know, one pose that I feel really good in, or one angle, or one sort of lighting or outfit, whatever it is.
And yeah, it's really sad that that was so much of my sort of mindset as a teenager for many years. But that was definitely the feeling, I think. And then of course the validation cycle that comes with that: the fact that I would then upload that to Instagram and be flooded with comments, likes. And, you know, I didn't have tons of followers, but I've always had a sort of presence online.[00:11:00]
As a teenager I had around a thousand followers. So that meant I was always thinking, like: right, what selfie is gonna get to the next best level? It was that, like, internal competitiveness with yourself, and then of course the comparison with people on the Explore page.

Alix: Did you... because it strikes me that you were spending time in person with lots of friends and things while curating this online version of yourself. Like, was it collaborative or social with your friends, or was it more that you were by yourself doing this, and then most of the interactions were happening in digital spaces?
Adele: I think what a lot of teenagers feel today still is I'm more understood through my online community or I'm more understood through my online presence, and I think that's how I felt at the time.
I am better understood through the way that I present myself online, because that's my higher self, or, like, that's my ambitious self, you know. It's the [00:12:00] curated, controlled self. Yeah, exactly. And, like, I've always been a very ambitious person, and I've always been quite tough on myself from a teenage age.
And I think for people that have that type of personality, social media can become so addictive, because it is that sense of, like: why didn't I say that in person? That was such a funny or witty thing to say. Oh, it's okay, I'll make it a caption on Instagram. Or: oh my God, if only I'd said that in conversation earlier. Oh, it's all right, I'll tweet it, or I'll write a Substack post about it. And I think that's what, for so many people, keeps us coming back to these online communities. You know, supposedly communities.

Alix: I was gonna say, do you feel like it's, yeah, like, sort of empty communities?

Adele: In ways. But the thing is that actually it is a really gratifying thing for the ego, and for us.
I sort of ended up going on this sort of semi-spiritual journey when writing Logging Off that I didn't expect to have, because it made me really unpack my own relationship with the way that these [00:13:00] tools are designed. And honestly, that did just sort of come by chance, because in 2022, me and my partner moved to a really, like, rural part of the UK, and it is a part of the UK where, like, community newsletters still exist. So we got our first community newsletter through the door, and there was sort of an ad section in the back, and I was reading it for the first time, and someone had put an ad saying, "Intelligent pensioner requires smartphone tuition." And I was like, oh, this is so interesting.
Alix: That is the most British shit I have ever heard in my life.

Adele: I know, literally. I was like, love that you are, like, bold enough to call yourself intelligent. And, you know what, stuff it, I was like, if I can't teach someone how to use a smartphone, no one can. So I rang this number, and within, like, a week I was sat in the flat of Tony, who we are still very close friends with.
I was sat in his flat teaching him how to use his smartphone. And what had happened was he had bought a [00:14:00] smartphone, and a really, really not advanced smartphone as well. I was actually getting imposter syndrome thinking, oh my God, what if he's got a really, really advanced smartphone that even I dunno how to use, and it was literally, like, one...

Alix: I love that that's your first thought. I know, at 25, there is no technology that we could put in front of you that you can't figure out. But carry on.
Adele: I get there and it's, like, my dad's phone, and I was like, oh, bless him. He didn't even know how to turn his phone on. What had happened was he had bought the phone and then had a fall and gone into hospital, and this fall had triggered the onset of Parkinson's.
So he was finding learning new skills really difficult when he left hospital, and when I met him, he just couldn't self-teach. So week by week, we'd sort of teach him how to use his phone: how to call someone, how to turn it on and off, how to download an app so he could get, like, the bus timetable up. And what it got me thinking about was someone else's digital experience that I had never seen in the flesh. [00:15:00] Like, I had heard of digital exclusion, but I never really knew what that actually meant for someone.

Alix: He doesn't know his good angles.

Adele: No, he doesn't, exactly. He's losing out.
But, like, I didn't know what that actually looked like, 'cause I didn't have elderly relatives around me. So it made me get outside my own digital bubble, and I was like, wow, it is really necessary that we all step outside our digital bubbles, 'cause we are literally all living in bubbles. And then a few months after that, I lost my sister to online harms.

And 'cause I was already sort of thinking and reflecting about who's losing out from Big Tech, when that happened, I was like, this is the worst-case scenario that someone could experience from the lack of safeguards that we have, and just the carelessness that we've allowed to unfold in the tech space. So when that happened, I was like, I have to write Logging Off. And I've always wanted it to be [00:16:00] focused on human stories, because I know that's also what persuades people who feel intimidated and turned off by the techy stuff and the jargon. And that's always turned me off, not necessarily in tech, but in, like, political spaces and progressive spaces.
I think sometimes we have the tendency to just sort of preach to the choir and, you know, again, get stuck in our bubble of assuming that everyone knows what these words mean, like austerity, or, you know, whatever it is. And then we don't get more people onside, because we are not speaking to regular people the way that we should be.
And we are just making it difficult when it doesn't need to be difficult. And I think in tech that often happens around these conversations. So yeah, that's basically how Logging Off started. And then I just started to sort of widely connect the dots between, like: right, how are workers being affected by digital technology and algorithmic management, or how are women [00:17:00] being affected by the lack of safeguards we have online?
So, and then it just sort of spiraled from there and I just couldn't stop.
Alix: I'm glad you didn't, 'cause it's really good. You describe the, like, endorphin hit of getting validation and approval, and, like, feeling that online version of yourself is the better version. I also think, with what you're describing about political communication around these things, that we use language that is exclusive, that there's a policing around who gets to talk about what in particular ways. It feels connected to me that, like, political movements have the same problem you were describing that individuals have, where it's like: I have to be perfect in every articulation of my political position, and I have to use words that make it clear that I have done the reading. And actually, that really does a disservice to this attempt to get more ordinary people mass mobilized to sort of combat the things that we're seeing. I don't know if you have thoughts on that dynamic, or what it would look like for there to be a larger number of everyday people engaged in policy and visioning around what we wanna see.[00:18:00]
Adele: I think that dynamic is really real. I observed that a lot on Twitter, and I think especially around election times. I dunno if it's the same for you, but, like, I really noticed around the 2019 election in the UK that it just felt so bitchy. Like, that's what it felt like. It's just like, God's sake, do people not have better things to do than be arguing about the real nitty-gritty stuff that actually isn't gonna make a difference, like the slight wording difference that you use? And, you know, I think when we talk about cancel culture, I think sometimes people don't wanna talk about it, and I was a bit nervous about writing about it in the book, because I totally understand that
cancel culture is a very, like, polarizing phrase in itself now, and I think when you say that you are against it, maybe people think you are [00:19:00] like an apologist for bad behavior. But I think cancel culture has sort of come out of that outrage-machine-type engineering that we have on social media, off the back of the pile-ons and the like. You know, if you do say something slightly wrong, you are gonna be, like, exiled from this community or this space or this group, or, you know, your clout will be immediately gone, because you don't deserve that anymore. And I think you are right, it is that whole thing of, like, having to be perfect when you present yourself online.
I used to get really paranoid before I would, like, tweet something political, because I was worried. And this was when I was a student. I studied international development, so, like, I was very much on leftist Twitter. I would get really worried and nervous, because I would construct something, and it would literally just be a really unproblematic opinion. Like, it would literally just be what Twitter was originally designed for: sharing your [00:20:00] opinion. And I would write a sentence and then be really scared to actually post it, because I just had this paranoia of: are people gonna come for me? It's not even from lived experience. It's just, like,
witnessing this culture unfold online and being sort of exhausted by it. And I think it also ties to that urgency culture that's been instilled within us, of, like, we need to be ever-online, and missing out on the conversation is not something you want to do. But also, owning up and saying, hang on a second, I've got something wrong?
We never see that online. Like, so rarely do we see people come onto the feed and be like, you know what, my opinion's changed. Because we are meant to be fixed entities on social media. We are meant to have a niche. We are meant to have a brand that shouldn't change. And, you know, I sometimes feel that way.

I've noticed that on TikTok, the algorithm really pushes your niche. And, [00:21:00] like, I started out on TikTok basically on BookTok, like, I was promoting nonfiction books that I was really enjoying. And I had one video that went semi-viral, and it got, like, over a million views. After that, TikTok would only sort of boost the videos that I shared to do with nonfiction books. So if I would post anything to do with fashion, anything to do with my journalism, it would be crickets. It wouldn't get anything. And it made me feel really frustrated, 'cause I was like, I'm not just one thing. I don't have just one interest. Like, well, if I wanna share a video of me doing yoga, or, you know, doing my witchy, like, tarot stuff, I could of course still share it, but I was like, oh well, no one from that community is gonna find me. Because that is how we get found, sadly: virality and engagement. So I think it's really sad that we feel we have to stick to a niche, and that because we have that inseparability between our online persona and our offline interests, so many [00:22:00] people now feel that we can only be our niche.
I think that's, like, depriving us.

Alix: It connects to that idea that you were saying earlier, that there's, like, a quantification of the validation, and you start, like, optimizing in this way. It's not that surprising that it happens in, like, more content contexts that aren't just about you as a person, but that there's this systemic attempt to make us quantifiable by engagement-generating machines, uh, which you kind of, like,
logically get, but it also has all these effects. There's this great book called Re-Engineering Humanity that's all about not just what we are making that is new, but, like, how what we are making is making us new, and how we're actually bending and conforming to the thing around us, and that it's this disturbing
process that's hard to explain, because it's not [00:23:00] necessarily quantitative, 'cause we're responding in qualitative ways to a quantitative, uh, system. And I find that creepy, when you feel yourself changing.
Adele: It is creepy. And it's, like, when you start to notice this third eye that you've got. There's an article, "The Facebook Eye," from a long time ago, I think, and now it's probably more like the TikTok eye or the Instagram eye. But it's that idea of, like, you have a third eye that is geared and designed to be looking at your life through the lens of:
what is gonna perform? What is gonna be engaging? Whilst I have always loved photography, and I think that's what drew me to Instagram as a young person anyway. I've always enjoyed photography; my dad was a photographer, and we've always had lots of film cameras around the house. So I've always been intrigued by, like, how you can present certain things and capture moments.
At the same time, it does make you feel icky when you notice yourself constantly [00:24:00] looking at your own life and being like: oh, I'm having a lovely time with my friends, right, let's organize the picnic blanket for a story. You know, let's curate this content.

Alix: Yeah. Yeah, no, I think that's real.
There's, um, PJ Vogt, who did Reply All. He was going on some trip, and there were, like, a bunch of Instagram influencers, and he was, like, the Twitter guy on the trip, and he realized how it wasn't just that platforms generally change you, it's that specific platforms do specific things to you. So he was watching these Instagram influencers, and he was like, you could see them just, like, clocking for the right frame and the right lighting and the right building or whatever. And he was like, and I have a Twitter brain, I was just in my head making up jokes about them that I would tweet. Um, and he was like, because we'd both been inundated or shaped by a platform we used a lot, we were absorbing the world through a very specific lens to create a particular kind of content, and that was a really disturbing realization.
Adele: Yeah, no, totally. And [00:25:00] it's like when I realized I'd spent too long on Twitter, which I don't spend, like, any time on anymore since Elon. But when I did used to spend too much time on Twitter, I would notice how angry I was all the time in my offline life. I was just carrying this internal anger and rage at the world and at society, and I think that's just not a healthy way to live. And I think, you know, it's the same way you say, like, how people that spend more time on Instagram have got that lens of, like: what will look beautiful right now, you know? And it's that perfectionist, like, visual lens. Living like that is equally depriving us, because it just flattens everything. Like, it's just sad.
Alix: Yeah, I know. I'm very glad to be off the Twitter treadmill, where I'm just thinking about, like, shitty little comments I could make, like, all the time, which is just not a good way to live. Um, you wanna say a little bit about the work you've been doing to organize around some of these issues? Because I'd love to hear more about kind of what you wanna [00:26:00] see, and the movements that you see emerging that might make change.
Adele: I guess it started when, in 2022, after we lost my sister, I got involved pretty much overnight with Bereaved Families for Online Safety in the UK. They're a group of families who have all lost loved ones to online harms, at various ages and from various causes, but all relating to the lack of safeguards that we have online. And they have been very involved with the Online Safety Act, ensuring that it gets through in the UK, pressing MPs to ensure that it's not just a token act but actually goes the distance. Since Trump has been in again, there have definitely been some rumblings and worries around whether the Online Safety Act will actually be substantial. And this has sort of come from recent trade talks between the US and the [00:27:00] UK government, because the US government have obviously said they're gonna put tariffs on the UK. The UK and the US have now been having conversations around potentially watering down the Online Safety Act in exchange for some leeway on those tariffs, which would be disastrous for young people in the UK, and it would be really quite scandalous and insulting to bereaved families.
Not just bereaved families, but anyone who has experienced online harms, because this is exactly what we've been fighting for for so long. And I think it's just a bit of a joke, really. I mean, to be honest, I'm not surprised, but I'm very disappointed that Keir Starmer's government has sort of pandered to Big Tech's agenda, just like so many other governments have, because the UK government, with the Online Safety Act, had the aim of making the UK the safest place in the world to exist online.
And now we're gonna see that totally [00:28:00] won't be the case. But what I'm also worried about is that it puts the EU in a really difficult position to continue to stay strong against Big Tech and the US government around online safety and tech legislation, because really we've been seeing that the EU has been a lot stronger on legislation, whether it's the Digital Services Act, the AI Act, or a possible Digital Fairness Act, which is sort of underway at the moment, being talked about and, you know, drafted up. It would make the EU's life harder, and their job harder, to actually continue to stay strong against Big Tech. And Kim van Sparrentak is one MEP who is doing incredible work in the EU to push back against Big Tech pressure and lobbying. But I mean, what I've seen as an online safety campaigner is:
it feels like there should be so much more public outrage [00:29:00] around what is happening. The conversations and concerns around online safety are happening. You know, Smartphone Free Childhood has built such an incredible movement. There are so many parents who are mobilized, writing to MPs, you know, saying that they're really concerned about their kids' safety. But then, when it comes to, like, pushing MPs and making sure they follow through on what they promised five years ago, that isn't happening. And I feel that that is the gap people in the tech policy space need to start to bridge, by mobilizing the public more. They already support it.
Public polling, for example from the Molly Rose Foundation, shows that the public are so in support of actually another online safety act. Like, they're not happy with the first one; they want another one. That in itself shows that there is public demand for this, but we've not yet closed that gap, and I feel that with tech policy we're getting stuck, sort of, [00:30:00] again, preaching to the choir, like I said earlier,
of, like, having our conversations in our bubble, but not using that tidal wave of anger that we have seen after Adolescence. There has been so much conversation around online safety, around violence against women and girls, and how social media is fueling that in many ways, how toxic influencers are fueling that. We really need to close that gap and push for more legislation that protects us.
I mean, we even saw this tidal wave of anger right when it was Trump's inauguration. I was, like, sort of not glad to see them at the inauguration, but glad that that photo was captured. Yeah, in HD, like, no questions, exactly. Like, live in the flesh. These guys are so interconnected; there is no more doubt about this anymore, guys. Finally we can wake up to that, and we can get moving and fighting back, because I feel like before [00:31:00] that there wasn't really that collective consciousness.

Alix: I mean, it's interesting. So recently Tanya O'Carroll sued Meta. Yeah, you know about this case.
And what's so interesting is that it's not new legislation, it's GDPR, which I think most people think of as pretty boring. It's responsible for the stupid pop-ups; it hasn't actually had any meaningful deterrent effect. But she got a really meaningful ruling after 10 years of that legislation being in place. And I think what's interesting is that the accountability comes when the laws are used, and actually the accountability is there. And so what I see happening is: you get this tidal wave of anger, as you're describing, that then gets smooshed down into policy debates and conversations. It takes ages to get a law written. There's so much detail in negotiating, and it kind of sucks all the energy out of that tidal wave. It becomes a law, and then the law is either not enforced, not enforced particularly consistently, or the consequences of the law are so comparably [00:32:00] small for these huge companies. So it's like, oh, we're gonna fine you this tiny amount in six years. You know, it's so insignificant in terms of the scale of the problem.
I don't know if you have thoughts on, like, where could that wave of anger go so that it's translated more directly into power, rather than through this, like, tiny, tiny vector of policy?
Adele: I think I maybe got swept up into that, like, policy is the solution, like, good laws. And I think it is. That's part of it, for sure.
Yeah, of course. And when you are meeting MPs, that's sort of what they wanna hear, 'cause that's their business. And as a campaigner, a lot of what you are doing is meeting people in decision-making positions, being like, look, you are really not doing what you should be doing. Like, catch up, take your power
Alix: seriously.
Yeah,
Adele: exactly. And I think it would be great to see more of that power and anger channeled into actual innovation. And of course, innovation and technological change can get hijacked. Like, we see that happen all the time in [00:33:00] capitalism. But I do think we need more of it, because at the moment there's like policy, and then there's like trolling billionaires, which has gotten really, really popular, um, with like trolling Elon, you know?
And I'm like, yeah, I love that. But then it's like, okay, what's next? And I think, from writing my book, I found and saw one really brilliant example of someone channeling that anger, and a personal loss that they had experienced, into actual technological change, into what the solution would be, and that was Alice Hendy.
So Alice sadly lost her brother to online harms in similar circumstances to my sister. After he passed, she found out that he was looking for harmful material, like he was searching up how to take your life online, and he was being shown that information, and basically got sucked down a really harmful rabbit hole online.
And Alice, you [00:34:00] know, sort of thought: what would the online world look like if it actually directed vulnerable people towards support rather than towards harmful content? And she designed Ripple, which is a free browser extension that anyone can download, and it does exactly that. Essentially, it directs people that search for harmful material to do with self-harm or suicide away from that and towards safety and resources and the help that they actually need. And you know, that is a prime example of safety by design. That is safety by design in action. I learned the term trauma-informed design principles whilst writing the book. And I think this is why it's so important that survivors of online harms, and also people whose lived experience is in some way connected to online harms, really need to be part of the conversations around tech and around tech policy. Because from everyone that I [00:35:00] interviewed for the book, often the solutions come from those people who have lived the impacts, whether it's survivors of image-based sexual abuse, or survivors of the type of harm that my sister had experienced. Those were the people who knew what would've been helpful in the time of their crisis, or in the time immediately after this harm had been committed against them. And I think we have, undeniably, a huge gap between services and support in the offline world and what's happening online.
And that gap has been exacerbated because we've been really slow to catch up with technological change. It would be a huge help to service workers, whether that's ambulance teams, GPs, police departments, schools, universities, whoever it is. Those are the people who need to understand what online harms are, [00:36:00] what they look like, how are people vulnerable to online harms?
Who is more vulnerable? How can we prevent them from happening? Because the digital world's not going away, and we need to have those safeguards not just in the online space, but also in the offline spaces that we exist and live in too. For example, you know, the World Health Organization has found that problematic internet use is more likely in people who are neurodivergent. And that's something that I think about all the time, because my sister was neurodivergent, and I rarely hear it spoken about when it comes to what's happening online and who is more at risk of dangers and risks online. And also, of course, who benefits from the online communities that we want to build.
It is a lot of the time people who are marginalized offline, and we need to make sure that these places are safe for vulnerable people, and make sure that support services [00:37:00] have the training to recognize, right, what does problematic internet use look like, for example, and how do we prevent that? Because, of course, prevention is better than cure. And I think that's something that I really do press in the book: we wouldn't have to have an Online Safety Act if we had thought about these things 30 years ago, had been having these conversations, and had actually been slowing down before we implement things.
That's something I really wanna see as well: a cultural slowing down of adopting technology if it's not necessary, and if it's not actually beneficial to the majority rather than to a handful of people or companies. So obviously
Alix: you come at this from an angle, a very horrible, tangible lived experience with your sister, and I think also other experiences you've had. What role do you think people who have lived experience should play in policymaking and agenda setting? Obviously people shouldn't have to be harmed for these laws to be written appropriately and to prevent things, but in the context of the kind of system we're in, the power [00:38:00] conversations, the narrative conversations, the size and scale of this challenge, what do you think the ideal role would be for people that have had direct harm come to them because of these technologies, in terms of translating that understanding into policy prescription?
Adele: It's tricky to navigate, because from my experience, and from a lot of the bereaved families that I am connected with, and also survivors of other types of online harm, what can often happen is that our engagement becomes just a performative thing. It's like: we've consulted this group, we are not gonna actually implement anything that they've said, but we did talk to them, so that's great. Come on. It's also just really traumatic for people to have to keep talking about
The traumas that they've had in their life over and over to strangers, and then it goes [00:39:00] nowhere. So to be honest, I would love to see more people with direct experience of online harms in positions of power when it comes to decision making, but also voting in some way. And I dunno what that would look like, but say, if there was another Online Safety Act and Ofcom was sort of mapping out what that would look like, and we were invited to that space, actually let us have a voting system in that space that makes our opinions get actually implemented into what the policy will look like at the final stage, rather than just the sort of, yeah, we've consulted them, but to be honest we don't think what they said works for us or could actually be implemented. Because that's just, again, you know, useless. Actually making the inputs go the distance would look like consulting and giving voting power, the same way [00:40:00] that, I guess, in the UN, states have voting power on different bits of legislation.
It would be incredible if we could have those sorts of forums, and not just for people with experience of online harms. Say, in a workplace setting, if a workplace is going to implement algorithmic management, workers need to be consulted on that, and then be allowed to vote on whether they want that to be implemented.
Like, that's the ideal for me. That's what I think should be the case, and I think that should be the case for every type of technology that is going to directly impact people's lived day-to-day experiences, whether that is in an educational setting, a workplace setting, a healthcare setting. Because so far those decisions have just unfolded without anyone's consent. And that is also why I think we're seeing so much digital exclusion, because [00:41:00] those services are going digital without having first consulted the public: do you have the device, do you have the means, do you have the access to even engage with the service in this way?
Adele: And yeah, even if you're an intelligent pensioner, you might not be able to.
Alix: I think that's a really good point. I mean, what you're also pointing to, I think, with the worker example, is that there need to be power structures that are not just the companies. 'Cause in that context, the way you get to vote on whether or not something happens in the workplace is by joining a union. And I think finding these structures that have been eroded over the last however many decades, and basically saying we actually need those as a counterweight to these companies who currently can act with impunity.
Even if someone with lived experience, even if someone who will be directly affected, has a strong opinion about what it should look like, they don't actually have any meaningful levers of power in many contexts to be able to assert their views. And to me that speaks to a broader question of, like, where does [00:42:00] power sit?
You know, like you named MPs, and that's just one form of representative power. And I think we need to probably think more about how to give everyday people more leverage to participate meaningfully in what happens. Thank you, this was lovely. Thank you. The book is great: Logging Off. Everyone should read it.
It's really good. Also buy it for your family and friends who are less nerdy about this, as a way of helping them get, I think, the human interest stories that help give you access to understanding how these systems are affecting all of us and what's at stake. Um, so thanks, Adele. This is awesome.
Adele: Thank you so much.
Alix: All right. I hope that was interesting. And if you're a millennial, you learned about angles. One of my big takeaways from this conversation, aside from the fact that Adele is amazing, the book is fantastic, and I highly encourage you to buy it for yourself and also for friends and family, is just how grateful I am that I am the age I am, [00:43:00] um, because I am young enough that I feel like I understand technology pretty natively, and I'm old enough that I wasn't pressured, or didn't feel pressured, to post thirst traps on Instagram when I was 13, and haven't been kind of hardwired mentally to think about platforms to the extent that younger people have. But I also really appreciate the vulnerability and curiosity that Adele brings to this conversation, and I think that there's a lot to learn generationally from how she engages in the conversation.
And also kudos to Adele for writing a book this young. I hope she writes more. Thanks so much to Georgia Iacovou and Sarah Myles for helping produce this, and also to the whole new protagonist network community, where I got to know Adele and her work, and which I really encourage you to check out as well. With that, we will see you next week.
