Net 0++: AI Thirst in a Water-Scarce World w/ Julie McCarthy

Alix: [00:00:00] Hey there, welcome to Computer Says Maybe. This is your host, Alix Dunn, and in this episode we are continuing our Net Zero++ series. If you'll remember, we did a few episodes last year on the environmental implications of AI systems and the physical infrastructure that they insatiably require, um, to work.

In this episode, I'm talking with the CEO of Nature Finance, Julie McCarthy, about a report that the organization put together on the nature-related implications of the data center build-out currently underway. They have some pretty astounding figures about the correlation between where data centers are being built and those being exactly the wrong places you would want to build water-hungry, energy-hungry infrastructure.

We'll get more into the details of that, but generally we are doing a lot more work on data center [00:01:00] expansion, as are a lot of organizations in our network. And we have an upcoming meetup. We've hosted a few of these. So if you're interested in this topic, either from a research perspective, a campaigning perspective, or you just wanna meet other people that are working on better understanding the implications of data center development, and in some cases resisting them when that feels like the right thing to do, you can get more information in the show notes.

But what I'm most excited about in this episode is that Julie hasn't always been working on the nature-related implications of finance. She actually started her career in extractive industry transparency, which I think we have just loads to learn from when thinking about the way that big tech is working and operating, because they're trying to be secretive, they're trying to build a narrative around their work that obscures a lot of the downsides, environmental and otherwise, of the types of work they're doing.

Um, and uh, I just think there's a lot to learn from the extractive industries and the campaigning to make those industries more transparent. We also dig into just generally the last 15, 20 years of efforts [00:02:00] where governments have tried to increase transparency and reduce corruption, and how that connects to, you know, a local policymaker being asked by an LLC, where it's not clear that the LLC is actually Google or Meta, coming to them and saying, hey, we have a really exciting opportunity to build this behemoth thing in your backyard. It's gonna look great in front of constituents, who are gonna be excited about the tech industry coming into their backyard, which could only mean good things because it's gonna create jobs, et cetera. And how that just is a really problematic dynamic, 'cause local policymakers don't have the money or the context to know what to do.

And these tech companies are not being forthright. It's gonna be really important to totally overhaul the information made available to communities who are considering data centers entering their backyards. Um, we're already seeing some of this change, which is really exciting. But I think with Julie's career arc, with extractive industries, with government transparency, and now with nature-related implications for financial investment, she just has a ton, I think, [00:03:00] to teach us about this topic.

And also this specific report, which we'll link in the show notes, is fantastic. And with that, let's jump into it with Julie McCarthy.

Julie: I'm Julie McCarthy. I'm the CEO of Nature Finance. And Nature Finance is a global non-profit that works to try and align global finance with an economy that works for people, climate and nature. So our mission is driven by the fact that our global financial economic system is currently on track, according to the IPCC, for about 2.5 to three degrees warming without dramatic intervention over the next five years.

We have around $7 trillion per annum, around 7% of our global GDP each year, actively destroying nature. And that's irrespective of the quantum of indirect effects that that harmful activity [00:04:00] may have. And then we have a growing number of countries that are struggling to find the finance needed to address the nature and climate crises in the context of broader economic crisis.

So Nature Finance works as a kind of think tank and solutions lab. We partner with financial institutions, governments at the national and sub-national level, multilateral development banks, civil society, and academia to try and develop and pilot new tools, frameworks, innovative financial instruments, partnerships and policy reforms that are aimed at trying to help future-proof economies from the effects of rapid warming, help low- and middle-income countries in particular to generate the resources that they need to invest in their own resilience and adaptation, and then support an enabling regulatory environment that drives investment and accountability for economies that are nature positive and also more equitable.

Alix: That is really dark. [00:05:00] The 7% of global GDP going towards, or actively participating in, nature destruction.

Julie: It is, and it's an estimate, and it's probably honestly an underestimate. It's several years old. It's an estimate that UNEP FI undertook in a sort of global landscaping report. And it's really important to hold that perspective in the field in which I work, the sort of broader nature and climate finance field, right now in particular, because there's been traditionally a lot of focus on gap filling.

And so when you look at the Global Biodiversity Framework, the sort of centrally important set of commitments that governments have made to restoring and protecting biodiversity, there's a target around financing there, around 200 billion per year in financing to invest in biodiversity. And countries, multilateral development banks, donors, civil society are all constantly attuned to filling that gap and fundraising to fill the 200 [00:06:00] billion that's missing.

At the same time, you have an economy with $7 trillion in activity that is actively harmful to nature. So the gap filling feels akin to kind of treading water in a monsoon. It's not ultimately gonna have an effect on the broader overall system unless you are engaging structurally within the broader financial and investment policy environment to try and turn the larger ship around in the nature-positive direction.

Alix: I'd love to transition, 'cause I think in that moment when governments are acting much more nationally or locally, it feels like companies are now the primary global movers, and I feel like big tech is taking up a lot of the space within the power vacuum that's being created.

And I feel like part of this is they're just so much more comfortable acting globally. They're so much more comfortable asserting a governance role in huge, ideally democratically decided decisions that are now just being taken [00:07:00] by like seven dudes. Um, and part of that connects to this environmental crisis, this sort of crisis of nature, because they're essentially asserting the right and the good of building out technical infrastructure around the world. Like, we couldn't have imagined even 15 years ago that they would essentially assert physical control over places in so many different countries around the world and have basically no oversight or control on that, or like governance standards on that. I wanna turn to this intersection of nature, the economy, and big tech. I know you all just produced a report on this. Like, what got you interested in big tech?

Like how are you thinking about the size and the scope of that problem, issue, challenge, opportunity? Yeah, whatever we wanna call it.

Julie: Yeah, I think it's all of those things. So obviously AI is a huge interest and sort of iconic, at least perceived, driver of economic growth and [00:08:00] innovation, especially in the US and Europe right now.

In the lead-up to the big Macron AI Summit that happened in February, about six months before, in conversations with different stakeholders in the field and mutual colleagues, I started a conversation about the sustainability impacts of AI infrastructure and what it meant for nature, broadly speaking, and economic development in some of these places where these investments were being made.

And at that time, you know, I had read a lot, saw a ton of attention being paid, obviously, to the extraordinary energy demands of this infrastructure. And I had read that tech companies were looking to reopen Three Mile Island and looking at nuclear, and it was raising all of these questions: in an overall moment of looking to decarbonize and digitize the economy, with AI having a big role to play in that, how did one approach that in a way that didn't undermine [00:09:00] broader sustainability imperatives? Nature had really been left out of that mainstream conversation, and I wasn't really sure at the time, this is sort of middle of last year, how much of a concern that was for the work that we do more broadly at Nature Finance, for example, trying to interrogate systemic challenges around nature risks and resilience and adaptation in the context of the global financial system.

And so, really as part of our own learning exercise from a strategic perspective, we partnered up with a group called Nature Alpha that does incredible analytic work, taking different kinds of bio data and ecosystems and looking at the asset level to do different kinds of analysis for financial institutions and investors. And we ended up producing this report called Navigating AI's Thirst in a Water-Scarce World: A Governance Agenda for AI and the Environment. And the goal [00:10:00] was to try and take as much publicly available information as we could find around existing AI hyperscale data centers right now, which are largely in the US and Europe for the most part, though growing globally and rapidly, and take what information we could find, down to the watershed level, for each of these 8,000-plus data centers that we ultimately ended up looking at.

What could we tell about the different kinds of nature-related risks and dependencies that were present in those environments? And what could we learn more broadly around the kind of regulatory and voluntary environment right now: if you're Microsoft, or Meta, or Google, or Amazon, and you want to build a data center in the middle of Texas, what kinds of environmental assessments are you required to do?

What kinds of information are you required to disclose at the city or the state or the federal [00:11:00] level to help people who are making decisions about whether to host this infrastructure in their communities to understand and weigh what the costs and benefits are in terms of economic and environmental and social dimensions?

And so what we found, uh, not great, not not great, Bob, to quote Mad Men. Um, yeah, I mean, the headline is there are actually extraordinary embedded risks related to nature, specifically related to water, within the current AI infrastructure that exists, and likewise within the information that we could find on potentially planned data centers in the US and Europe and other parts of the world, some of the places that are on the radar for future centers.

And so just to give a flavor: 45% of the data centers globally that we looked at are located in places where water availability is high risk. 47% of data centers are located in [00:12:00] river basins with high drought risk. 55% of data centers are in places with high risk of water pollution in already water-scarce environments. And 68% of data centers are located within five kilometers of key biodiversity areas, key protected areas that also, again, are relying on those water resources for all kinds of essential ecosystem services.

To back up, in terms of what's going on with all of this water demand and usage and AI more broadly: data centers rely on both electricity and water, sometimes to an inverse degree, to undertake the extraordinary amount of computing that they're doing, in terms of both training AI models and then deploying them as well to different communities. These servers get very hot in the process of doing all of this computing, and you can either use [00:13:00] lots of electricity to cool them down by air conditioning, or you can use lots of water to cool them down in different ways. And depending on the approach that you take, it can either have really high carbon footprint implications, if you're using lots of electricity, unless you are using renewable energy sources.

Or obviously it can have a huge impact on water usage if you are using that to cool these facilities. What we have found is that, you know, there's a tremendous amount of water cooling going on. Not surprisingly, because of all of the energy-related concerns and kind of policy attention in that domain to big tech being sustainable in their pursuit of AI, they have in many instances erred on the side of low electricity, high water usage, which has resulted in the landscape I described.

We had a lot of really interesting conversations off the record with people who had previously served as chief [00:14:00] sustainability officers, for example, for some of these large tech companies, saying that generally, because this is such an area of rapid growth, of intense competition between the small number, especially, of big tech stakeholders that are trying to gain market advantage, energy teams are sent out with the instruction of, find me a hundred megawatts of energy anywhere you can find it, sometimes in instances with very little to no due diligence done on some of these other nature-related risks, impacts, and questions. And that's then resulting in these facilities being put in places where they are in direct competition with local communities for scarce resources.

In some instances, with these communities not having any idea, or having to take legal action in order to find out that that's the case, which is obviously, from an accountability and democracy perspective, a really concerning situation.

Alix: I have [00:15:00] so many questions. I mean, why do you think, and I don't think it was in the report, but data centers are being developed in places that are least suited to have this kind of infrastructure built?

Is it just, like, land is cheap where there's not very much water and laws are lax? Or is there some other element here that means that over time we're gonna see this trend continue, where basically the worst places imaginable are where they're being put?

Julie: So it's interesting, I think, and it depends on where you're looking.

In the case of the United States, a lot of this infrastructure is in Texas, for example, which has not the greatest electricity grid, to be honest, and, um, not uniquely great domestic energy sources, but it has a highly favorable regulatory and tax environment.

For example, I mean, when you look at the new high-profile Stargate, the [00:16:00] $500 billion US investment in AI, Texas is the first place that they're going. And you know, that's also a place where Elon Musk has chosen to build virtually all of his infrastructure for SpaceX, notoriously frustrated with being sued by the EPA for not following regulations around biodiversity protection for some of his rocket launching sites there.

Alix: I will say, I didn't think the end game of that was gonna be he would just destroy the EPA. I was not, that was not on my, no, no.

Julie: No. I mean, in hindsight it seems quite obvious, but yes, in the moment. Yeah. And many unimaginable things have since occurred. So in any event, I think there's a lot of interest and desire by a lot of cities to host these data centers, because there is this perception that it's going to bring lots of jobs and investment in.

That's certainly how it's pitched. The reality, interestingly, is that there is a lot of short-term [00:17:00] economic benefit in terms of construction of these facilities. Once these data centers are up and running, they actually only employ a few dozen people typically, and they're typically highly skilled network engineers.

So you have a data center that is putting extraordinary demands on the water systems and the electrical grid that's only employing 20 to 50 people at the end of the day. So I think it is really important for countries and cities to interrogate what the trade-offs ultimately are in terms of perceived economic benefits versus real economic benefits.

There may be, you know, tax benefits and other dimensions to this as well that need to be taken into account. But the whole sort of getting dollar signs in your eyes at the idea of a big data center wanting to come to town, I think that there's the potential for a really nasty hangover [00:18:00] in some of these environments, looking at the net challenges in the medium to long term in terms of the environmental impacts. But I will say, likewise, that there are increasingly innovations that over time can absolutely improve the nature-related footprint of these facilities.

So you have closed-loop cooling systems, which is basically the same kind of idea as a radiator in a car, where you're using the same coolant in a cycle continuously. There's also immersion cooling, where you're putting the entire server within this dielectric fluid, which doesn't conduct electricity, and that keeps the servers cool. All of these methods are very new. They're just being tested. They're much more expensive. The innovation curve, I'm sure, will result in them ultimately being much more affordable and widely available. The challenge, of course, is that [00:19:00] we have these data centers rapidly being developed, as I described, in many of the wrong kinds of places given the technology that they're currently using, and it's really difficult and expensive in many cases to retrofit facilities with any of these new innovations.

Alix: Yeah, and I think also, in an environment where there's no disclosure and there's also an immaturity in our ability to sort of model out the nature impacts over the next, like, 10, 15 years, the chances that a company in a secretive context is gonna make the more expensive choice feels, um, unlikely. But I also wanna take this back to, I don't know, however many years ago it was that you were running OGP, the Open Government Partnership, because it feels to me like it's a very similar kind of problem.

So, like, you worked for a really long time on governments not systematically understanding what transparency should do for how they engage with people, [00:20:00] understanding democracy and, like, the role people should play. And now it feels like you're doing a lot more work on corporate transparency, and it feels like the sort of power center has almost shifted in some of these contexts.

So I dunno if you have, like, reflections on the dark days, not to say we haven't returned to those dark days, where governments literally didn't wanna disclose things, where even, like, pre-Freedom of Information Act, there was a political movement to essentially require transparency of governments in a way that it feels like right now we need in the context of private sector companies, who are doing everything they can to keep these projects as secret as possible so that people can't be involved in allocating resources, nature and otherwise. When doing this work, like, how did you feel about the echoes in your career?

Julie: There were a lot of echoes to unpack. So in the earliest part of my career, I was working on transparency and accountability and governance issues in the oil, gas, and mining sector, and that was a place where you [00:21:00] were dealing with these really large multinational corporations that were making huge investments, often in countries that had really weak governance environments. And surprise, surprise, there was a lot of corruption and rent taking and misbehavior going on in these contexts.

And one of the big things that I spent a lot of time pushing for early in my career, in contexts like Africa or Latin America, Asia, um, Central Asia, the Caucasus, where new oil and gas and mineral discoveries were happening and there was a chance to try and do things differently from how we had seen things unfold in the first generation of oil and gas and mining development in places like Nigeria or Angola, for example, was making the terms of these contracts public, having some ability for people to discuss and debate whether or not the deal was fair, whether or not environmental and social protections were being put in place, who was benefiting, and how [00:22:00] people were in a position to hold both government and corporations accountable if bad things started to happen, like what we've seen in the Niger Delta, for example.

And yeah, there were a lot of the same arguments in that context that we hear now, where, you know, a mining company would say, well, it's proprietary information to share how much we're paying for this particular investment, or what some of the environmental impacts are.

And similarly, you know, with Google building an AI data center in The Dalles in Oregon: the local paper asked Google initially to disclose what its water usage was in the community, and Google said, oh, this is proprietary information, sorry, we can't tell you. And the city actually aligned with Google. And so the newspaper had to sue both of them, and in the process won the lawsuit, got the information, and the data center was using a quarter of the city's daily water usage to [00:23:00] cool its facility. And you can imagine that, you know, in a non-water-scarce environment, that's a lot, but it may be sustainable. In a water-scarce environment, that's a lot of alarm bells going off. But the point is that communities need to be in a position to have this information and have these debates before facilities are built.

You just had Elon Musk brag about building a Grok 3 data center facility in Memphis in 19 days, and there was so much pride in that. And then you had the city council coming out, you know, on day 20, when they find out about this in the newspaper, saying, what the hell just happened? How did we not know anything about this?

What are the impacts gonna be in terms of the environment, in terms of pollution, where there's already an incredible amount of corporate-driven pollution in that particular community, a long history of it? What are the economic benefits? Why weren't we in a position to debate [00:24:00] this publicly and make sure that these trade-offs are worth it?

So there are a lot of echoes there. And again, it's one of those lessons that for some reason is so hard for governments to learn, but you've seen in the past. Take India, for example. There's a great story about when they passed the National Rural Employment Guarantee Act, where all Dalits were able to have over a hundred days of federal employment throughout the year, that they were guaranteed the right to this employment and these wages. And there was a whole website that was tracking, you know, who was claiming their days. And this group called MKSS in Gujarat was working in rural communities that were supposed to benefit from this but had no access to computers and no ability to sort of engage with the material. So they ended up painting murals of the website on all of these walls in the community. It's the most amazing story. And people [00:25:00] started going up to the walls and saying, hey, wait a minute, I didn't work those days. Or, like, that person's dead, how did they work, you know, 50 days this year? And so the government, you know, quickly realized, as many governments around the world from the Philippines to South Africa and onwards have, that when you get citizens engaged in monitoring and stress testing claims and information, you get a better result at the end of the day. You save money, you have more eyes on behavior and activities, in order to try and ensure better governance.

Uh, there's another great example that I heard from the Open Contracting Partnership just this week, where they were in Assam and they were looking at where money allocated for disaster and flood response was going, versus which communities were actually suffering the most and were most vulnerable to flooding. And there was huge divergence between where the money was going and where it was needed. And by making that information publicly [00:26:00] available, with the Open Contracting Partnership working with different groups to crunch the numbers and surface these issues with the government, they were able to redirect the resources to where they were needed. And they also now want to do the same thing in other places, because they see the broader benefit. So, you know, there's a whole history there that it really felt like people were understanding and embracing around 2011, 2012, 2013.

We're in a cycle right now where people seem willing to sacrifice a lot in terms of oversight and control and engagement and accountability, putting their trust in leaders and a vision that feels almost religious in this country in terms of how it's gonna transform everybody's lives for the better.

But in reality, it's just Elon Musk building a data center without any oversight or regulations in 19 days, that's probably gonna [00:27:00] employ 20 to 50 people, and Memphis has every right and reason to be concerned about this. So there's definitely, I think, a lot of lessons to draw on in terms of how people in the past, journalists, civil society organizations, in some instances local governments, have used the legal system in particular to try and create new precedent around what kinds of information need to be put in the public domain in advance of really huge decisions being made from an investment perspective that have the ability to impact communities for decades to come. And, you know, we're at a moment where we're gonna need to see a lot more communities stepping up, I think, in doing that, and not taking for granted that all of these big tech stakeholders are acting in their best interests. It's not just a US phenomenon. The government of Chile canceled a permit that they had given to Google to [00:28:00] build a big data center there, because they're experiencing extraordinary drought.

And there was tremendous concern about the plans that Google had submitted, and a rethinking of whether or not they could ask for more and do better in terms of the impacts in that country. And so now they're in a new process, with a lot more public discussion and transparency about their performance in that domain.

So, you know, there are positive examples one can point to of people stepping up and asking the kind of questions, and demanding the kind of accountability, that we need to see more of.

Alix: Looking back myself at that period, it also feels like the power analysis was so partial. Like, for one, in those spaces there was this very, like, technocratic, polite engagement from civil society of, like, if we build governance standards, uh, and have this point person in this particular government agree to them, um, that will be a good lever to change this [00:29:00] government's entire way of being.

If you kind of transport to another field, like digital rights, it was at the same time when Google was organizing internet freedom conferences, which was just, like, going back to that era of, like, buying civil society's participation in a nominal vision, at the end of which rainbow was always going to be corporate capture, but essentially trying to preempt some of the community building within civil society to resist some of those structures.

And I think also, like, the Memphis example is really good, because it's not just, are there appropriate city council mechanisms or standards set that Musk has to obey? It's that, like, it's a democracy in quotes that has been historically so racially hierarchized.

So if you look at, like, Memphis Community Against Pollution and the work to basically prevent not just xAI, but historically all of this infrastructure that is related to the chemical industries. Like, it's essentially the northernmost tip of [00:30:00] Cancer Alley, and it is basically, like, the biggest Black urban place you could put this infrastructure.

The fact that, like, a city council doesn't listen to that community isn't just because there's not standards in place, it's because they don't listen to Black people, and they've historically basically said, if you're poor and you're Black, we actually don't care if you die at 60, because basically you're not worth governing around.

So I feel like there's just so much here that civil society, and maybe kind of thinking from a governance standpoint, we need to reckon with. There's, like, a power analysis, a power frame and a racial frame that we missed, and I think we have missed historically, and I feel like it's just all coming home to roost in this way. Yeah. Yeah.

Julie: Well, I think that's such a good point in the context of your previous question about why we are seeing this infrastructure be built where it's built. Why did Elon Musk, deciding he wanted to build infrastructure really fast without a lot of environmental considerations, [00:31:00] happen to go to the place where lots of corporations, for all the reasons that you just described having to do with race and power and equity, feel like they can act with impunity and get away with it?

And then add to that the fact that there's no longer a public discourse allowed around diversity, equity, and inclusion and issues of race, conveniently, by an administration, where you have this kind of activity happening. It's an extraordinarily concerning but very explicit overlap of dynamics going on there.

I mean, to your point about corporate power, it's interesting, because just reflecting back on the initial days of forming OGP, you actually had multiple big tech companies in the room during the earliest days of OGP forming. Google was in the room, for example. I don't remember if Microsoft was there.

And you may recall also, in the Obama administration, there was a lot of criticism around Google being too close to the government, a lot of former Google employees going into the government and then [00:32:00] recycling out, and there were all kinds of sensitivities there. There was a desire in the beginning to engage big tech, again, more from this sort of optimistic, they can be a partner, yeah, you know, a clean, individual, polite,

Alix: technocratic vision. Yeah.

Julie: But, but I have to say, you know, there was a concern at the end of the day that it felt like this was an inappropriate backdoor for potential advantage. (Alix: Interesting.) In terms of certain companies getting privileged access to certain countries that were in the mix and around the table.

And a desire from a governance perspective actually to exclude them from that. Yeah. From the governance structure to try and ensure some integrity there.

Alix: I also feel like, I mean, this is where the fact that you co-directed the Economic Justice Program at OSF (Julie: Yes. Yeah.) is interesting. Like, 'cause I feel like you must have had to deal with the confluence of, like, the impact investing kind of safer fallback position of, like, ESG capitalism-light (Julie: Yes.), um, combined with the challenge of redistribution in the context of a [00:33:00] multiracial democracy and all that comes with that. (Julie: Yes.) I assume there's stuff there that, um, is also intersecting with some of your strategy and thinking.

Julie: Oh, definitely. I mean, you even see it now. I mean, I can point to two instances.

There's a big sustainable finance conference that happens in Switzerland every year. It's called Building Bridges, and it brings together the whole Swiss banking and finance community, plus civil society and academia and different internationals. And you had one very large multinational bank stand up there,

I think this was about two years ago, to great applause and with a lot of self-satisfaction, saying that, I think, like 7% of assets under their management were now ESG-aligned. What? And yes. And so, uh, a colleague of mine stood up, bless him, during the question period and said, so just to be clear, that means that 93% of your assets are not currently [00:34:00] aligned with sustainability outcomes.

But there is this real risk, and you get it at Davos, and you know, you get it in contexts like Building Bridges, of everybody feeling like these really incremental, symbolic baby steps are indicative of the kind of transformational change that we actually need to see occurring. And then you can sort of get back, you know, in your jet and fly home and,

you know, feel good about yourself in terms of the reputation that you're building for your company. And you know, the private sector's not a monolith. There are companies and financial institutions that are taking extraordinary voluntary measures in the right direction, but there is equally this dynamic, which is more pervasive for sure right now, of lowest common denominator,

incremental, insufficient progress being celebrated as an extraordinary accomplishment and lift. And to be honest, there was that same, [00:35:00] to a certain degree, kind of techno-optimist, self-congratulatory spirit a little bit at the Macron AI Summit in February, where, you know, the only conversations that people wanted to have in terms of sustainability were about how all of

this incredible new bio data, and the ability of AI to make sense of it and use remote sensing and all these new tools, was gonna transform our ability to address sustainability challenges. And it's not to say at all that AI doesn't have a tremendous amount to contribute in a positive way to sustainability challenges, but the conspicuous absence of a willingness to discuss

the existing risks and challenges and implications of AI infrastructure in particular for sustainability right now, I found really challenging. And for the one or two questions that large companies in this space were willing to entertain, the response was an incredibly dismissive, yes, you know, we know it's a challenge, but don't worry, we'll [00:36:00] innovate our way out of it really quickly,

so it's not even anything you need to be worried about. And obviously the story that the paper that we did with Nature Alpha tells is a very different one. If I was an investor in companies right now that have this level of embedded risk and exposure, I would be pretty horrified. I'd be having a lot of conversations trying to understand what they plan to do about it. And it gets to

the more positive side of trends that I think we've seen, something I mentioned to you earlier in the conversation around, for example, the Norwegian pension fund, Norges Bank, which has 1.6 trillion in assets under management and recently made the decision, which is extraordinary in the context of the broader ESG backlash happening right now, to subject 96% of those assets, of that 1.6 trillion,

to a natural capital risk assessment, because they see all of the ways in which this exposure to vulnerability in ecosystems, and their destruction, and lack of [00:37:00] attention to them, has the potential to come back and bite everyone in the rear going forward if you aren't proactively attending to these concerns.

Alix: I think that's also why the joint statement that came out, I think just before the Action Summit, that Green Web Foundation among others put together, mattered. It was like, Hey, so we think it's important to include on the agenda that any AI infrastructure needs to be within planetary boundaries, and that from a narrative perspective, you can't have a vibe of infiniteness, that, like, it'll be fine.

I don't know, all signs point to it not being fine, but, like, let's just keep on this path and trust that eventually technology will kind of evolve in a way that is beneficial. And I, I think the drum we should bang is that basically, as, like, Abeba Birhane says, um, there's potential benefits and definite harms.

Yes. And you can't, you can't, yes. That, that formula just doesn't work. Like, that's not a narrative that we can lean on. It's a huge gamble. So to basically just say, like, we will sort it out, it feels very immature [00:38:00] for people who nominally are responsible for these huge consequential decisions for how resources are allocated to just, like, YOLO this.

Julie: Yeah. Yeah. It feels patronizing and reckless. And also, as I described earlier in the conversation, there is a broader pushback right now around more regulations, particularly around environmental, social, and governance issues, and, you know, the incoming US administration has accelerated it, but it was already happening before that in Europe and in the US.

That said, over the past decade or so, there has been a general trend from a regulatory perspective towards more mandatory proactive disclosure of climate and, increasingly, nature-related risks, impacts, and dependencies, because they're recognized to be material to the risk framework for investors that are looking at engaging with [00:39:00] particular companies.

And so in that context, I think there's a lot of room for improvement and low-hanging fruit in terms of the kinds of basic information investors, from a prophylactic perspective, should wanna see coming out of this sector in particular: not only looking at some of the highly aggregated basic data that's reported right now around energy usage or efficiency, or in the best of cases, sometimes high-level water consumption, but also looking at issues of what local water availability and risk actually is in the places where these assets are.

You know, in terms of pollution, drought risk, biodiversity, protected areas, these different metrics where there's a lot going on, we have found, to be concerned about. And you know, this is the kind of data that communities, as we discussed, should have in order to interrogate and properly consent

to hosting this [00:40:00] infrastructure in their communities. And it's not that data and disclosure is by any means a silver bullet, which, as I have learned in spades over my decades of working in this space, it's necessary but insufficient. Yes, yes. It's only one piece of the ecosystem. But obviously, in this case, if a company is moving into your backyard and is gonna start draining, you know, a quarter of the water from your community, and you are in the middle of Texas, in a drought-prone area, or Nevada, or some of these other places where this infrastructure is being built, that's something that you wanna know about and figure out in advance, and maybe insist that a different kind of cooling infrastructure be used, even if it's more expensive for the company, in order to agree to host the data center.

These are the kinds of questions and stipulations that one can only develop in the context of having some information to base that on. To me, there's so much power and agency that's not [00:41:00] being deployed right now by local communities and city governments and national governments that are entertaining the hosting of this infrastructure, where,

again, there should never be an instance where it's even possible to come into a community and, without any sort of pre-approval by a city council, in 19 days build a huge hyperscale center and then have people find out about it after the fact. But these are instances where, you know, what we saw in Chile with Google, it reminds me very much also of Mariana Mazzucato's work on the entrepreneurial state, and this whole sense that,

you know, really what a lot of policymakers need are sort of life coaches and therapists to help remind them that they actually have a lot more power and ability, you know, to assert, and responsibility, yeah, leverage and responsibility, on behalf of, you know, those whom they're meant to serve. They can ask for a lot [00:42:00] more than they're demanding right now and not jeopardize the opportunity to still potentially host these investments.

And there's a lot of great information already moving through the atmosphere about what kinds of questions they should be asking, aside from the report that we produced, which has recommendations in it for people who are looking for a place to start. The United Nations Environment Programme has developed this whole proposal for how one thinks about

performance standards in terms of energy and nature for AI infrastructure, and they actually developed it for investors. But it's equally as useful if you are on a city council and you're considering whether or not you wanna host this infrastructure. I mean, asking basic questions about the energy usage, the source of energy, the water consumption, how that jibes with local water availability and risk, how that's expected to play out over the coming

decade. And then what are you looking at in [00:43:00] terms of jobs that are being promised, both short term and medium to long term? And what are you looking at in terms of potential tax revenues, and what's the history of this company in terms of paying the taxes that they're meant to pay? And, you know, due diligence

that people could already be taking up. But I, you know, I think in certain instances, a lot of people are looking at these data centers, again, with dollar signs in their eyes, because it looks and smells like a big-ticket, long-term investment, without looking under the hood to really interrogate what it might mean for their communities.

And so I do think that there just needs to be general awareness of what some of these risks are. If I could do a day-long seminar with every city council in the US and Europe and India and Malaysia and all these places where these centers are starting to pop up, one would hope that a portion of those stakeholders would then use that information to start asking more questions.

Alix: So basically people representing constituencies need to [00:44:00] buck up and learn stuff and be more assertive and be more mindful and engaged in the process of deciding what resources should be used for. Is that what I'm hearing? Yeah.

Julie: Yes. And same for journalists, and same for civil society. You know, it was the newspaper in The Dalles that ended up taking Google and its own local government to court and winning,

in order to get this information on water consumption disclosed. And in Memphis and a number of places now, you're seeing similar legal action emerging. As in all accountability ecosystems, you know, it relies on multiple stakeholders sort of playing their part, and assuming that some stakeholders are not necessarily acting in the best public interest, and therefore, you know, you need others that are there to correct and remind them of, of what that looks like.

So again, I think that there's a lot more general awareness raising and sensitization [00:45:00] needed across civil society as well. I will say, though, that being at the Sustainable AI Summit, which happened the day after the primary Macron summit, when I did stand up in the audience and raise questions to the Nvidia chief sustainability officer around some of these water risk questions, I received, unfortunately, a very dismissive and patronizing answer about them fixing all of it before it really becomes an issue,

noting that actually Nvidia, with their semiconductors, has huge water risk issues in Taiwan right now. They have huge vulnerability; the water in the dams there that they rely on is down an extraordinary percentage. So they're by no means invulnerable here. But even afterwards, I had a whole raft of people in the audience come up to me, from academia, from the University of California, Santa Barbara, to corporate stakeholders, to investors, to Friends of the Earth, noting their own sense that, yes, this is such an important issue, and why isn't anybody talking about this?

So I do think that it's a wave that's [00:46:00] coming, and in a context where, over the next decade, water demand globally is expected to exceed supply by 40%, and this is fresh water, people are gonna have to pay attention in different ways. This is part of living in a warming world, and investing in a warming world.

Alix: Just quickly, so the stakeholders you mentioned were people that are acting in the public interest. If we recognize that a lot of companies aren't, are there things you would require of companies, if you could be queen for a day?

Julie: Oh yeah. I mean, I don't like to live in this universe because it's not a real universe.

Um,

Alix: Okay. That's fair.

Julie: But yeah, I mean, again, like, you gotta know

Alix: what we want, though. 'Cause I think without knowing

Julie: For sure, for sure. And I, again, I have to say, like, that's part of the reason I'm really excited about doing the sectoral guidance with the Taskforce on Nature-related Financial Disclosures, and also the work that UNEP is doing on performance standards.

Because again, just laying out the set of [00:47:00] questions in terms of core metrics that everybody needs to assess in advance, disclose to those in the community where they're investing, and have a conversation about how they're gonna handle the implications of all of that data. Again, related to, what's the energy source that you're using from this plant?

Is it nuclear? Is it coal? Is it renewable? What are the consumption patterns? How is water engaged? What are the challenges related to water? What other technologies could you be using? And then, again, not to speak of all the questions in terms of economic benefits and pinning that down. Like, all of that being proactively assessed and disclosed prior to investment taking place, and then also tracked over time.

I mean, Microsoft's water usage overall went up, I think it was something like 34% over the past two years. Yeah, year on year. So these things are also not static. They change over time, and they need to be tracked over time. And so it's that kind of proactive assessment and disclosure that, [00:48:00] again, you know, is in a company's enlightened self-interest.

Do you really wanna build infrastructure in a community that suddenly you're gonna become embroiled in legal battles with, over social license to operate, for the next five to 10 years, because you forgot to ask whether or not there was enough water for you both to share? Yeah. In terms of you and local communities.

Alix: But they don't even... I feel like it's this, like, infinity industrial complex where it's like they just have to pretend like it's all immaterial, because they are gonna be designing this future that we all want. And I just feel like part of the lack of engagement isn't a ruthless strategic attempt to get stuff they shouldn't have.

I think it's that even engaging in the conversation about what they should have undermines their position of authority and power, in this way that isn't conducive to it continuing.

Julie: It's, it's, I think, I think that it's hard to discuss some of these companies as a monolith, because, for example, in talking to one former chief sustainability officer for one of [00:49:00] the big tech companies,

there was a sense that there's this arms race going on of keeping up and grabbing the energy and building the facilities and having the computing power to train these models, to, you know, hopefully come out in the top three in the coming years. And there was also a sense of, you know, I feel like bringing these issues to people's attention, they would have this kind of oh-shoot moment and could potentially start thinking differently.

It's just, it needs to be tabled with them also in a way that they're able to navigate and not feel like it's imploding their business model and everything that they're trying to do, but in, like, the former kind of semi-rules-based system, still highly characterized by corporate capture, but not the sort of fiefdom that it has become.

You know, I don't know if it's a concern for precedent setting, of, you know, sharing some of this information and then losing leverage and power. I think for some of them, yes, that's probably the case. I think for others, [00:50:00] it's just that classic sort of market racing to keep up and maintain advantage in a fast-moving space, and acting now and asking questions later.

So I don't wanna presume, and I'm hoping that we'll be able to engage with some of these members in the Sustainable AI Coalition, for example, to have really open, candid conversations about this, recognizing that this is a really fast-developing space and speed is certainly an asset, you know, in terms of what people are doing, but longevity and resilience are ultimately gonna be the most important thing.

Nobody wants to build a house of cards and have it implode suddenly for all these predictable reasons. And so, you know, I think investors hopefully are gonna become more sensitive to that. And I hope that there's at least a subset of these companies that, from a pure rational self-interest perspective, can also recognize the ways that, in a rapidly warming world of scarce resources, these are issues that they need to take really seriously.

It's like a game of musical chairs. [00:51:00] Everyone's not gonna have a seat at the end unless they've integrated some of these core nature and climate related risks into their investment and corporate value chains.

Alix: Well, I think, thank you. Let's leave it there. Yes. This was what I wanted to talk about. Okay, good.

It was a lot of fun. No, no, it's so... like, I just think what you've worked on is so relevant in so many different ways. Yeah. It's like this, I don't know, like, Rubik's cube of where we are now. And I love the report. I thought you guys did such a good job clarifying the risks, but also the policy opportunities and the kind of global governance mechanisms that might be available to

sort of work at this problem from both ends, in terms of setting, you know, standards, but also encouraging and creating space for resistance and opposition using those standards. I feel like it's an important piece of the puzzle, so thank you.

Julie: Thank you. I feel like we have our work cut out for us, and I have to give a shout-out to my colleague, Constantina Kori, who played a huge role in helping author the report, and also colleagues at Nature Alpha. It was a real team effort, and it has certainly set out an [00:52:00] agenda that will keep us busy for

years to come, I suspect.

Alix: Okay. I hope that was as interesting to you as it was to me. I think I could have talked to Julie for hours and hours. As I said up top, I think her experience is just so relevant to the moment we find ourselves in now, and I'm really grateful that there are people like her who are earnestly engaging in both governance standards that should be set around the reporting requirements and transparency requirements, and also just the decision making around how and when to invest in huge build-outs of infrastructure that have very big implications for communities and nature.

So do read that report. We are gonna be doing a couple more of these net zero plus plus episodes, um, just because the implications of AI for the environment are huge and growing, and we wanna platform people who are doing work in this area. The next episode is actually gonna be digging into some work we've just put together around five case studies of data center development in five different [00:53:00] countries,

And digging into how that came about, the organizing around it, um, the process by which governments consulted or didn't consult local communities. We have two researchers from that report who are gonna share a little bit about the findings and the kind of trends that we're seeing across jurisdictions and some of the differences.

So stay tuned for our next episode. Thanks for joining. Thank you as always to Georgie Iacovou and Sarah Myles for producing the episode, and we will see you soon.
