AI's Energy & Water Demands: Sorting Fact from Fiction with Andy Masley

Andy Masley examines AI's energy and water use, unpacking myths and offering heuristics (e.g., a ChatGPT query uses roughly as much energy as running a microwave for one second), along with comparisons of data center power, solar land area, and an estimated 80 GW buildout. He also covers local impacts, water trade-offs, and uncertainties.

AI's Energy & Water Demands: Sorting Fact from Fiction with Andy Masley

Watch Episode Here


Listen to Episode Here


Show Notes

Andy Masley unpacks common myths about AI's energy and water use and brings an Effective Altruism lens to the debate. PSA for AI builders: Interested in alignment, governance, or AI safety? Learn more about the MATS Summer 2026 Fellowship and submit your name to be notified when applications open: https://matsprogram.org/s26-tcr. You’ll get memorable heuristics—one ChatGPT query ≈ running a microwave for one second; 10,000 queries ≈ a cross-town car trip, burger, or hot shower—and clear comparisons about data center power, solar land area, and an estimated 80 GW buildout. We also cover local impacts, water trade-offs, uncertainty drivers, and why AI’s global resource footprint is smaller than many fear while noting where caution is still warranted.

LINKS:

Sponsors:

Framer:

Framer is the all-in-one tool to design, iterate, and publish stunning websites with powerful AI features. Start creating for free and use code COGNITIVE to get one free month of Framer Pro at https://framer.com/design

Agents of Scale:

Agents of Scale is a podcast from Zapier CEO Wade Foster, featuring conversations with C-suite leaders who are leading AI transformation. Subscribe to the show wherever you get your podcasts

Tasklet:

Tasklet is an AI agent that automates your work 24/7; just describe what you want in plain English and it gets the job done. Try it for free and use code COGREV for 50% off your first month at https://tasklet.ai

Shopify:

Shopify powers millions of businesses worldwide, handling 10% of U.S. e-commerce. With hundreds of templates, AI tools for product descriptions, and seamless marketing campaign creation, it's like having a design studio and marketing team in one. Start your $1/month trial today at https://shopify.com/cognitive

CHAPTERS:

(00:00) About the Episode

(03:39) Meet Andy Masley

(11:19) AI resource myths (Part 1)

(20:10) Sponsors: Framer | Agents of Scale

(22:31) AI resource myths (Part 2)

(25:01) Per-prompt energy math

(37:20) Chips versus electricity (Part 1)

(37:25) Sponsors: Tasklet | Shopify

(40:33) Chips versus electricity (Part 2)

(47:10) Everyday energy comparisons

(59:56) Debunking water myths

(01:09:20) Gigawatt data center scale

(01:21:12) Local air pollution

(01:32:06) Electric bills and policy

(01:45:18) Desert siting tradeoffs

(01:53:54) Real risks, real benefits

(02:05:19) Outro

PRODUCED BY:

https://aipodcast.ing

SOCIAL LINKS:

Website: https://www.cognitiverevolution.ai

Twitter (Podcast): https://x.com/cogrev_podcast

Twitter (Nathan): https://x.com/labenz

LinkedIn: https://linkedin.com/in/nathanlabenz/

Youtube: https://youtube.com/@CognitiveRevolutionPodcast

Apple: https://podcasts.apple.com/de/podcast/the-cognitive-revolution-ai-builders-researchers-and/id1669813431

Spotify: https://open.spotify.com/show/6yHyok3M3BjqzR0VB5MSyk


Transcript

Introduction

AI Energy Usage: Sorting Fact from Fiction with Andy Masley

Hello, and welcome back to the Cognitive Revolution!

Today my guest is Andy Masley, a blogger and thinker who has recently done some of the best independent analysis on the energy and water demands associated with AI.

Andy is the director of Effective Altruism DC, and this conversation served as a great reminder of why I appreciate the EA community. Their emphasis on identifying causes that are neglected, tractable, and large-scale seems right on to me, and the culture of epistemic humility and earnest truth-seeking is admirable. 

What's more, Andy is a great example of how EA thinkers are generally very pro-progress, even as they worry about extreme risks from AI – a consistent pattern in my experience, which contradicts the popular "doomer" caricature. 

In all sincerity, the main reason I've never gone around calling myself an EA is simply because I don't consider myself virtuous enough to deserve the label.

Regardless of affiliation, this conversation also demonstrates how a genuinely curious, truth-seeking, numerate person can, with AI help, make a meaningful contribution to the discourse on a complicated topic, even without formal credentials or deep experience.

In this conversation, we tackle the prevailing narratives around AI’s consumption of energy and water. While we acknowledge that large-scale data centers can create local issues that really do matter to specific communities, and that local leaders should be careful about the deals they strike with data center companies, the bottom line is that AI is not a huge deal when it comes to global emissions or water use.

The main reason is simply that bits really are that much less massive and therefore easier to manipulate than atoms.

For the purposes of quick mental math, I find it helpful to remember a few key heuristics that we develop in this conversation (a quick numeric sanity check follows the list):

  • a single ChatGPT query uses roughly as much energy as running a microwave for 1 second
  • a single cross-town car trip, a hamburger, and a hot shower all use roughly as much energy as 10,000 ChatGPT queries, which means that if you can save just one car trip with AI usage – which I have done a number of times in just the last couple months – you have more than offset your ChatGPT usage for the year
  • a 1 GW data center uses as much electricity as 1 million American homes
  • 1 GW of power requires roughly 10 square miles of solar panels
  • the full $7T buildout, which is estimated to use something like 80 GW of power, represents a 1-2% increase in global energy usage, which is less than the expected increase due to general global economic development
  • that full 80 GW of power, if it were all powered with solar panels, would require 800 square miles, which is less than 1% of the land area of the state of Nevada
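
As a quick sanity check on a few of these, here's a small back-of-the-envelope sketch in Python. The per-query energy and the 10-square-miles-per-GW solar figure are the estimates from the conversation, and the household and Nevada numbers are rough reference values, not precise measurements.

```python
# Back-of-the-envelope check of the heuristics above.
# All inputs are rough estimates from the conversation or round reference values.

WH_PER_QUERY = 0.3                 # estimated energy per median ChatGPT query (Wh)
MICROWAVE_WATTS = 1_100            # a typical microwave draws ~1,000-1,200 W
US_HOME_AVG_KWH_PER_YEAR = 10_500  # average US household electricity use

# One query vs. one second of microwave time
microwave_second_wh = MICROWAVE_WATTS * (1 / 3600)   # ~0.31 Wh
print(f"1 s of microwave ~ {microwave_second_wh:.2f} Wh vs. ~{WH_PER_QUERY} Wh per query")

# A 1 GW data center vs. US homes (1 GW running continuously = 8.76 TWh/year)
homes_per_gw = (1e9 * 8760 / 1000) / US_HOME_AVG_KWH_PER_YEAR
print(f"1 GW continuous ~ {homes_per_gw / 1e6:.1f} million average US homes")

# 80 GW buildout: solar land area vs. Nevada
SQ_MILES_PER_GW_SOLAR = 10         # figure used in the conversation
NEVADA_SQ_MILES = 110_000
solar_area = 80 * SQ_MILES_PER_GW_SOLAR
print(f"80 GW of solar ~ {solar_area} sq mi ~ {solar_area / NEVADA_SQ_MILES:.1%} of Nevada")
```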

The water usage analysis is very similar, albeit a bit more complicated, for reasons you'll hear.  

Obviously there is some uncertainty associated with all of these numbers.  And of course things will continue to change, with forces such as increased usage and longer thinking times pushing AI resource intensity up, while increased efficiency and new energy sources push it down.

Nevertheless, I feel very good about the back-of-the-envelope calculations and memorable comparisons in this conversation, and hope it gives everyone a confident, grounded sense of how resource-intensive AI really is.

With that, let's get into the fundamentals of AI energy and water use, with the director of EA DC, Andy Masley.


Main Episode

Nathan Labenz: Andy Masley, Director of Effective Altruism DC, Internet Truth Warrior online at andymasley.substack.com. Welcome to the Cognitive Revolution.

Andy Masley: Yeah, so great to be here. Long time listener, first time caller. Like really, yeah, crazy privilege to be here. Thank you, David.

Nathan Labenz: My pleasure. Thank you. That's very kind. So today we're going to get into AI use of energy and a little bit of other resources, focusing on water probably. This has been something that for me has really kind of surprised me over the last couple of months. In particular, I went and did a talk at an AI for public school educators event in October, and there's a lot of questions on everybody's mind across society and certainly in the education space right now. What is this going to mean? Whatever, whatever. But aside from the very education-focused questions they were asking, I think the number one question that kept coming up over and over again is, what about energy? Is it going to ruin the environment? Is it going to blow up our electricity bills? What's going to happen with this energy thing? It is clearly something that people have understood is a potentially big problem. And I think you and I are both, you know, spoiler, generally on the same page that a lot of the fears that are out there circulating are pretty overblown relative to what we actually have to worry about, which is not nothing, but, you know, hopefully is something that we as a society can manage. So maybe for starters, why don't you just introduce yourself a little bit? I mean, I think one thing that is striking is, you know, I already gave your role as the director of EA DC. I think you may surprise some people with your profile, as I, you know, hope to do sometimes as well, with a mix of opinions, which aren't super easily bucketed into doomer or accelerationist. Maybe give like your kind of big picture worldview and, you know, relationship to AI as it exists today. And then we'll really dig into the resources question.

Andy Masley: Oh, my, my full theory of the world. Yeah, we'll, we'll take a little while. Yeah, I mean, basic place I'm coming from is the general EA perspective that it seems very likely that machines that can replicate a lot or even most or all of what can happen in the human mind might not be too far away. They seem more likely than not to happen in my lifetime with massive error bars there, obviously. And that can come with a lot of risks. So I'm pretty agnostic about P doom stuff. I would probably lean in the direction of like a lower P doom purely because I'm a little bit wary of the very specific philosophy arguments that get made around like advanced AI, like rising up and having misaligned goals and stuff. It's kind of a separate conversation. But I am very high P weird. I do expect the world to get very weird in very potentially dangerous ways as more and more powerful AI systems come online. And this is all kind of downstream of me believing that AI systems as they currently are and as they will be in the future just seem very likely to be very capable in general. And as a result of AI being capable, I'm also just pretty excited about AI tools as they currently exist. I think this is a big disconnect that actually just makes complete sense if you think about it from the perspective of AI being capable, but I think a lot of people are also thrown off by, where I'm very visibly a massive chatbot user, like Opus 4.5, basically my new best friend, so cool, love it, but I'm also quite wary and worried about AI's overall impacts on society in the long term. And so I think that combination is actually pretty common behind the scenes, and a lot of people probably identify with both being excited about AI as it exists and scared in the future, but I think a lot of people have this really simplified version of the debate, where you're either a Luddite doomer who hates all technology, including AI, and wants to stop it, or you're an accelerationist who wants to go full speed ahead and loves AI because they think nothing bad will ever happen. And there's just an obvious middle ground here, like a pretty easy Venn diagram intersection, where it's like, yeah, I don't know, for the same reason that I think nuclear power is really good. But also, nuclear stuff comes with some tail risks too. Like machines that mimic a lot of what happens in the human mind, like very useful for me personally, very excited about a lot of what they're doing, but also like, man, this could get really crazy, really fast. And there are a lot of potential risks that come with that. So that's the general background attitude that I'm coming in with.

Nathan Labenz: Yeah, I think that is pretty representative of the EA community. It's obviously hard to paint a movement with such a simple descriptor, but when I've attended EA events, for example, my own personal experience has been mostly what people seem to want to talk to me about is like, hey, could you maybe help advise my organization on how we can make better use of AIs as they exist today? While at the same time, maybe one of the things that they're trying to do is either raise awareness or try to figure out some of these big picture, long-term tail risks, wrap their heads around them better in any number of ways. But that fear is, in my experience, like very often packaged, I would say, you know, much more often than not packaged with an appreciation for what the technology can do for us today. And certainly, you know, nobody... well, not nobody, but, you know, there's, I'd say, very few who fail to recognize that in the EA movement as I've encountered it.

Andy Masley: Very much, yeah. And I think behind the scenes, EA is much more heterogeneous than a lot of people might be aware of, where there are wildly different takes about the basic AI X-risk case. Definitely different scenes where you can maybe receive some social capital or punishment for believing different things. And I think that's bad, obviously. But at least where I am in DC, which is one of the largest EA communities anywhere, I've never personally experienced any kind of social pressure to believe the absolute craziest things about AI risk. I feel very confident being very agnostic about that actually, which is very nice. So I'm a fan there. And yeah, I think most EAs I know are coming at this from a general, we're very excited about a lot of technology and In some ways, it wouldn't really make sense if everyone were like, oh, AI is going to be so capable in 20 years that it could kill everyone, but it's not currently capable enough to be useful in answering my emails or something like that. That combination, I think, is becoming pretty rare pretty fast. And usually, a lot of skepticism about current AI models is also paired with skepticism about future risks from misuse or misalignment and stuff like that.

Nathan Labenz: Yeah, the coalitions are a little odd these days. The scrambling is well underway, which is probably healthy, I think, better that than just total polarization. But it does mean people sometimes incorrectly project certain beliefs onto their debate partners.

Andy Masley: Yeah, sorry, go ahead.

Nathan Labenz: No, please.

Andy Masley: Oh, yeah, without calling anyone out, every now and then, I'll bump into someone being like, oh, this guy's an EA. He's a communist Luddite who wants to destroy all technology and just level everything because he hates freedom and stuff. And this doesn't really describe the profile of the people I'm bumping into. I think there's just a lot of AI doom in the air right now from a bunch of different directions, including climate doom, which we're going to get into here, and I think in most cases, there are just a lot of examples of worry about AI that a lot of EAs just don't really share. I think my takes on AI and the environment are shared not just in EA, but with a lot of other people who are just knowledgeable about the technical aspects of this. But there's a huge disconnect between there and the broader society and a lot of circles who are much more worried. So yeah, a lot to say about that. But basically, a part of this has just been presenting, hey, look, I'm an EA. I can say what I believe very directly about this stuff. And I definitely don't believe a lot of the standard environmental worry about AI as it's presented in a lot of the media right now.

Nathan Labenz: Yeah. So let's get into that. How would you describe... what people broadly are hearing from the media? Like, what is the baseline mainstream narrative that you are kind of setting out to correct?

Andy Masley: Yeah, I mean, there are different levels of confidence they have about different claims here. So the media is covering the general data center build out a lot. And they're like a huge amount of uncertainty and concern is, I think, definitely warranted because this is just a huge new industrial project. We don't really know how it's going to go or what the consequences are going to be. So I think that The main thing I'm trying to correct is there are a lot of assumptions that this will just automatically definitely be bad for the environment, or it's already causing these huge cataclysms. And I think I want to adjust people more to observing, what are the actual facts on the ground? What are the potential trade-offs here? Let's not just assume automatically that using more energy or water is always necessarily bad. There are a lot of places where this can potentially be good for communities that receive a new massive taxable industry that's using a normal amount of water and providing of new revenue for utility and stuff like that. And I'm not saying this doesn't also come with problems. I'm just asking that the positives be weighed against the negatives on a case-by-case basis. And I do worry that a lot of the coverage there, especially, is framing like, oh, any environmental harm cannot be traded off with the positives. There's nothing good that's coming from this. And I just want to push against that extreme case while still leaving a lot of uncertainty. And then the thing that I feel most confident about separately that I do want to demolish, honestly. I'm just so tired of this talking point bouncing around. It sounds like you bumped into this as well, is the idea that your personal chatbot use is significantly contributing to your emissions or your personal water footprint. We can have a debate about maybe Personal footprints don't even matter, and all that matters is big systematic stuff. I actually think that's pretty healthy as a way of thinking. But if you are worried about this stuff, I'm more convinced than I can be of most things that poking around on chatbots, even if you use them a lot throughout the day, is just not going to add a meaningful amount to your carbon or water footprint, and in many cases might actually reduce it. We can get into that more later. I still bump into so many people with such wild misconceptions about this, including people who are very influential in big institutions who are making decisions saying, the people in my institution cannot use chatbots because the individual environmental harm of those individual prompts is too much, basically. And that, I think, is ridiculous. We just have the numbers, it's looking pretty good, there's uncertainty, but the range of uncertainty isn't enough to justify this right now. So that's a big thing I'm pushing back against.

Nathan Labenz: Yeah, that's crazy. I haven't heard of any... organization leaders giving dictates along those lines.

Andy Masley: Yeah, I'm sworn to secrecy on who, basically, unfortunately. Basically, I recently interacted with a large educational institution where the people in charge of the technology there were just completely convinced that this isn't actually so bad, and we should really be able to pay for this for our low-income students, especially, to have access to chatbots and stuff. But we're getting a lot of pushback from a lot of different places, basically being like this will be bad for the environment if we do this to the point that we can't do this. Using that logic, it also wouldn't make sense to ever purchase books for any students and stuff like that. There's just a lot of stuff to say about that. But this is happening. A lot of people who are focused on AI on the ground are pretty shielded from this because I think they're much more aware of what the actual numbers look like. People might be blind to just how common this is and how many people have not touched chatbots at all specifically because they're completely convinced that this is a significant add to their personal emissions. A lot more to say about that. Some people are not touching them because they're worried about contributing to a systematic thing. I can say more about that. But the general misconception here, I think, is just so widespread still. And it's a massive tragedy. I think in blogging about this, it kind of feels like shooting fish in a barrel. It's just a little bit goofy when you actually look at the numbers. But yeah, I tend to get into there. But that's the other main thing that I'm trying to address.

Nathan Labenz: Yeah, one strategy for rising to prominence online, for building your online voice, is to pick an area where people are dramatically and consistently wrong, and then you can be very consistently right. Okay, so, so where do you think, like what is the sort of, what is the move that's happening that's like distributing this bad information, you know, incorrect understandings in the first place? I mean, one thing I've observed in general is just that like the world is really huge. And so when you aggregate numbers associated with almost anything, you get to these like really big numbers. So I think one pattern that maybe is happening is people are sort of saying, here's how much energy AI is going to use. It's yada, whatever kind of watts. And isn't that insane? And people are like, that is insane. And then there's just no reference to anything else. And so it's like this sort of very big number hangs out in a vacuum and intimidates people because they don't see the other even much bigger numbers that maybe they should be comparing it to. How much of it is that? What do you think is kind of going on that is creating this sort of broad misconception in the first place?

Andy Masley: Yeah, it's coming from a lot of different directions. There was a lot of coverage of this early on. There was a lot of individual news coverage about the cost of individual prompts. And there's an outdated talking point about how it's 10 times as much as a Google search. And I think for a lot of people, this was the first time they'd ever seriously thought about their internet activity using resources in general. And I think there's actually something that just feels really wrong to people about that because digital goods are ephemeral and they're just this 2D thing on your screen. And especially if you don't like AI or if you think AI is all silly or whatever, it's this ridiculous, meaningless use of these valuable resources, especially water, actually. I think the idea of AI burning through water really sticks with people because they kind of imagine it as this dwindling resource that we might eventually run out of, and we need to conserve it and protect ourselves from that. And there's this ridiculous new thing that they don't like burning through a huge amount of it for silly reasons. I think there's almost a religious sense of sin about that which really gets to people at some deep level, just because they might not have thought about how data centers have used water for a while for other things. People also watch Netflix shows that I don't think are very good, and that uses water, and maybe I would be more upset about that if I thought it was a serious problem. But even there, not too much water is being used. I think there's just been a lot of really weird and bad media coverage of this that's definitely caught on in social media a lot too. There are a lot of TikToks and tweets giving very silly explanations, very grimly talking about how a chatbot is 10 times as much as a Google search. You're multiplying your emissions by 10 times. And they never stopped to mention, oh, by the way, 10 times a Google search is still not that much. If I search Google 1,000 times throughout the day, that's actually still not going to add too much to my emissions. And so I shouldn't expect that 100 ChatGPT prompts will either. And so there's a lot of hyper-focus on these relative numbers because almost no one, including me too much before I actually started looking into this, actually knows where a lot of the energy they use is or how much it is and stuff. So a lot of different stuff. There are general anti-AI attitudes, which I can otherwise understand. There's a sense that this is all these valuable resources. One very last one is this very strong sense that people have that because we're in a climate crisis, it is never acceptable to add a new source of emissions. This is very common. I bump into this all the time, where even if AI is adding so little, it's new emissions that didn't exist before. This is actually a really strange category when you think about it, because every day we all wake up and cause new emissions in everything we do. Every time I drive my car, I'm adding new emissions to the atmosphere that will stick around for a thousand years or something. The source doesn't really matter, in my opinion. It definitely matters to an extent. But I think a lot of people are really hung up on the idea that this is a new thing or a new way of emitting and thinking less about other stuff.
And then the very last thing I'll say is that the data center buildout is objectively huge, and I think people have a lot of trouble computing how such a huge new use of energy can relate to such small amounts on a personal level. And I think a lot of this is just that data centers are these really weird paradoxical buildings that aggregate huge amounts of very tiny emissions in one place. They all look really huge together, but this would all appear if we were able to aggregate anything else we do in society as well. If all toasters or all microwaves were in one spot, that would also look very large by the standards of an individual person. A lot to say, a lot of stuff that's floating around in the background here, but all of this is congealed to this very broad sentiment that using ChatGPT is bad for the environment, and in my opinion, it is not.

Nathan Labenz: Yeah. I, one of my general principles is don't psychologize others' AI takes because I think the whole sit, you know, the facts are confusing enough and the, you know, the fog of AI forecasting is pretty thick in many cases. So I really do try to avoid that. I guess I'll, you know, maybe break that rule very slightly, very briefly by asking like, do you think this is primarily coming from the environmental movement and it's just bad tactics or misunderstanding? Or another corner, and maybe it can be both, but another corner it could be coming from is sort of an anti-AI motivation that is like trying to find things that are resonant with the public. Even if those arguments that work aren't necessarily at the sort of core of the motivation, like what would you say you've encountered in your online battles?

Andy Masley: Yeah, I actually mostly don't think this is coming from the actual environmental movement, like on-the-ground environmentalists. Maybe a few of them get mad about this in the way that other people do, but it's not really originating there at all. I think it's mostly coming from a much broader swath of people who I would say are fired up and sometimes a little bit confused about climate, where they engage with a lot of doomy climate social media. I just want to flag throughout all of this that I am very worried about climate change. I don't want to say that this is not a problem or whatever. But there's this separate category of person who isn't actually involved in actual environmental work, but just engages with a lot of very guilt-ridden media about very specific things that they do in their personal lives. I think the environmental movement, for the most part, has correctly moved on from policing personal lifestyle stuff so much into focusing on systematic changes to the energy grid. Huge win there, that's great. But the public discourse hasn't really followed along. I used to be a physics teacher for seven years, which is also important context here. And I just remember a lot of my students actually really believe really strange things about climate, mostly because of TikTok, where they'd be like, oh, we're all going to die by 2024 or something because of climate change or society's going to fall apart. And I think this is mostly coming from people who have this cluster of beliefs about climate who have memed themselves into this idea that any tiny amount of new emissions is always a catastrophe, and oh, this silly thing AI is using new emissions, and so that's bad, basically. So yeah, that's my basic guess about the main culprit here. There's a general broader social media sense of climate doom that sometimes causes people to not think so much about the actual numbers involved.

Nathan Labenz: Yeah. Okay, great. Well, let's get into some of the actual numbers involved. How do you want to attack it? I mean, I think we can come at it from multiple angles. There's the sort of you know, unit of one ChatGPT request and, you know, what that uses. There's also, you know, all sorts of comparisons we could use like to a microwave or to a, you know, heating one's home or driving one's car. There's also kind of top down economic stuff, which I think is often like way too quickly brushed past as well. Like ChatGPT is 20 bucks a month. You know, that sort of puts a pretty clear limit on like how much energy it could really be using. Of course, they could be subsidizing it, but they are subsidizing the free users, presumably in part with the revenue from the paid users. What do you find to be the most, probably at least dabble in all three of those and maybe even more different ways of thinking about it, but what do you find to be the most intuitive or resonant way to start to break this down for people?

Andy Masley: For a while, I was mostly making those kinds of individual comparisons, and I was finding that in a lot of places that didn't hit, and I was trying to figure out why. It's this much to charge your phone, a microwave can add up to being like hundreds or even if you use it for long enough, I'll limit it to hundreds for now, ChatGPT prompts and stuff like that. But for some reason that wasn't getting through to people and people would still be like, oh, but it's still more. No matter how little it is, it's still more emissions in total. And I realized that they weren't getting this other big intuition I had. So my elevator pitch, if I meet someone at a party now, my general pitch is that, to the best of my knowledge, based on all the resources that we have on this, once you add up all the ways that a chatbot can cause emissions, like not just including the inference, like the cooling, the training, the embodied cost of the hardware, my best guess like right now is that lands you at about 0.3 grams of carbon emitted per prompt on average. And I'm making a lot of assumptions, assuming that's actually kind of high. And that's about one 100,000th of your daily emissions. And so what that means is that you would have to prompt ChatGPT something like 1,000 times to increase your emissions by 1%. And if you did that, if you were spending the entire day doing nothing but prompting ChatGPT and reading all the responses, that would probably take you something like 10 hours or something at least. And if you think about it, spending 10 hours a day on something that only uses 1% of your emissions actually implies that your emissions will be way lower than they would otherwise be, if that makes sense. Because most other things that we do in our lives, we're not just gonna be sitting, not doing anything all day. We would be taking a walk or driving our car or something like that, or even watching a show on TV. And so if ChatGPT is this small, it actually probably means that anything you replace it with will emit more, except for other things online. Computing is just so wildly efficient that almost nothing else we do... uses as little energy, basically. And so my elevator pitch is basically, ChatGPT actually uses so little energy that on net, it actually seems much more likely to reduce your emissions. Because in general, probably the single most climate-optimized way you can live your life is living in a big city, being vegan, and spending literally all your day on your computer. And I check two of three of those boxes. I try to get out. And I don't spend all day on my computer. But computing is just so efficient compared to most other things that we do that it's just not a good thing to trade off if that's what you're worried about for the climate, basically. So that's my basic pitch. And I can go into a lot of the other comparisons. I like making the comparisons to microwaves and stuff like that, but the basic ground-level thing is that this is going to reduce your emissions if you trade it off against anything that's not on the computer specifically. And it's just so small that it would kind of be sad if we were focused on this in any other context. It's like taking a little dropper from your pot of spaghetti and being like, oh, I need to save a few drops for later. Something like that, I think we would call correctly a distraction for someone so focused on climate and environment stuff. So that's my basic pitch in a nutshell.
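
To make the arithmetic in that pitch concrete, here is a rough sketch using the figure Andy cites (0.3 g of CO2 per median prompt) plus an assumed average US per-capita footprint of roughly 15 tonnes of CO2 per year; both numbers are estimates, not measurements.

```python
# Rough check of "one prompt is ~1/100,000th of daily emissions" and
# "1,000 prompts raise your emissions by about 1%".
G_CO2_PER_PROMPT = 0.3                 # Andy's all-in estimate per median prompt
US_PER_CAPITA_T_CO2_PER_YEAR = 15      # assumed average US footprint (rough)

daily_g = US_PER_CAPITA_T_CO2_PER_YEAR * 1_000_000 / 365   # ~41,000 g/day
print(f"One prompt ~ 1/{daily_g / G_CO2_PER_PROMPT:,.0f} of a day's emissions")
print(f"1,000 prompts ~ {1000 * G_CO2_PER_PROMPT / daily_g:.1%} of a day's emissions")
```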

Nathan Labenz: Yeah, I think the bits versus atoms distinction is a pretty strong place to start. I often think about the energy that it takes to charge a phone. And this is something actually GPT-4, a little bit of lore, is that the original GPT-4 while I was doing the red teaming, this is now basically three years ago, a little more than three years ago at this point, it really took me through some of these energy analyses very early on. And it was one of the kind of Eureka moments where I was like, man, this thing can really help me educate myself in all these different areas. But I was shocked to learn that a typical cell phone to charge it fully is something like 20 watt hours of energy, which in my jurisdiction in Detroit, Michigan, is about a third of a cent worth of electricity. And we've now also heard a lot recently that the brain itself runs on 20 watts. So the energy that your phone works on for, say, whatever, depending on how intensively you use it, a 12- or 24-hour cycle, that's the same energy that your own brain is using in just one hour. So already, all the things that your phone is doing, and obviously we've seen language models that can run on phones, not the frontier ones, of course, but you can run Llama models, you can run things on your phone, you can run things on your laptop. Those are literally running on less energy than your own brain, which is an interesting starting comparison. And it really just kind of goes up from there.

Andy Masley: Yeah, and this is actually the reason why I've mostly moved away from comparisons to individual objects, because realized that most people, including me before I started, don't actually have much of a solid idea of how much energy individual objects use. I've actually come to think that most people seem to have a scale in their head where something either uses a little amount of energy, a normal amount, or a lot. And it's one of those three, basically. And so they might think, oh, ChatGPT uses This isn't actually true, but let's say ChatGPT uses as much energy as it takes to charge your phone. They might think, Oh, that's a normal amount. So ChatGPT is adding this new significant thing to my carbon emissions. And they're not thinking about just how big the scale is. Without getting into Twitter spats too much, I just had a guy respond to me the other day and being like, Oh, ChatGPT uses as much as charging your phone or running a microwave for an hour. And I wanted to hit the brakes and be like, Well, those are two very different quantities. Almost nobody seems to notice just how little energy it takes to charge your phone. specifically, and this is a comparison that gets made a lot. And so I can make these other comparisons where it's like, yeah, it's something like 60 chatbot prompts per phone charge or something like that, or per minute of microwave or something. A chatbot prompt is effectively a second of a microwave. I will flag actually here that whenever I say a prompt, I'm talking about the median chatbot prompt, and this is a whole separate thing we can get into with much longer prompts and stuff like that, but we need a place to start. And so yeah, these comparisons to household stuff can sometimes be less helpful than I expected because a lot of people still, including me a lot of the time, don't have much of an idea about this. But yeah, it's something like 60 prompts per phone charge or about a prompt per second of using a microwave or something like that.

Nathan Labenz: Yeah, that's helpful. Well, maybe let's dwell on it a little bit more and maybe we can kind of help popularize a few of these memes. So I guess orders of magnitude are always kind of helpful, right? Like the human body runs on basically 100 watts. 20 watts of that is the brain. So now we can just look at a few household appliances. The microwave typically runs at about 1,000 watts. An electric teapot-- and a theme here obviously is that heating things is energy intensive. So the microwave is heating; the electric teapot is also about 1,000 watts. Those are heating things. Running my furnace here in the wintertime, a lot of energy is going through that. And obviously, it depends on how cold it is outside and all that kind of stuff. It roughly shakes out to, at least for electricity, from what I understand, the average US home is doing about a thousand watts of electricity on an averaged out basis. And obviously, you know, when you turn on your microwave and whatever, that spikes up relative to your base load. But like rough intuition is a home is running on a thousand watts, a body is running on a hundred watts, a brain is running on 20 watts, and ChatGPT then comes in with, you said one prompt roughly equal to one second of a microwave.

Andy Masley: Yeah, uh-huh, yep, yep, just like that, yeah. And like, you know, inner physics teacher is coming out, so just want to flag for the listeners, yeah. A watt is a rate of using energy, and then a watt hour is a unit of energy because it's that rate multiplied by a certain amount of time, basically. So yeah, a thousand watts, if you run that amount of power for an hour, you use up a thousand watt hours specifically. And so yeah, a ChatGPT prompt, from best I can tell is between 0.3 and 0.6 watt hours for the median prompt. And that is going to-- yeah, it's effectively running a one-watt thing for about 0.3 to 0.6 hours. And a one-watt thing is also an alarm clock or something. So if your clock is just running-- a digital clock, specifically. So if your digital clock is running normally, 0.3 watt hours is about the same as 0.3 hours of running a digital clock, specifically.
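
A few quick conversions behind these comparisons, as a sketch: the per-prompt range (0.3-0.6 Wh) is Andy's estimate, the ~20 Wh phone charge is Nathan's figure, and the microwave and clock wattages are typical round values.

```python
# Energy = power x time. Quick conversions behind the comparisons above.
WH_PER_PROMPT = 0.3        # low end of Andy's 0.3-0.6 Wh estimate for a median prompt
MICROWAVE_WATTS = 1_100    # typical microwave
PHONE_CHARGE_WH = 20       # roughly a full smartphone charge (Nathan's figure)
CLOCK_WATTS = 1            # a digital clock draws on the order of 1 W

print(f"1 s of microwave = {MICROWAVE_WATTS / 3600:.2f} Wh (~1 median prompt)")
print(f"Prompts per phone charge ~ {PHONE_CHARGE_WH / WH_PER_PROMPT:.0f}")
print(f"0.3 Wh = {0.3 / CLOCK_WATTS:.1f} hours of running a 1 W digital clock")
```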

Nathan Labenz: Yeah, that's really, and I guess it's also worth noting that a kilowatt hour is the typical unit of electrical billing. So when I said it's, you know, 20 cents in my power jurisdiction, that 20 cents is for a kilowatt hour, which also then leads to another kind of interesting comparison that if my $20 ChatGPT subscription went entirely to energy, it would buy a hundred kilowatt hours, which would be basically four days of a typical home electrical usage. And that would assume, again, that they're taking, obviously we know that we can get into this in more detail too, but obviously we know they're not turning around and spending all of my $20 on electricity because for one thing, you know, they've got chips to buy and they've got, you know, researchers to train and all sorts of other costs. But that kind of gives you an absolute max if you were to just spend $20 more on electricity. You know, that's kind of roughly increasing your typical home electrical bill by 10%. And so again, it's like way less than that. So we kind of seem to be coming down to these like 1%-ish numbers. It's probably still significantly less than that today. You had said like, you got to get to a thousand prompts a day to increase your overall footprint by 1%. And I do think that's where, you know, certainly the people at the model development companies think we're going and that's why they're, you know, planning such big build outs. It is hard to imagine usage that moves one's overall energy footprint all that much more than that.

Andy Masley: Yeah, and it will flag too that when people talk about energy, obviously there's some concern about electric bills around data centers, but mostly when people talk about this, they're actually using it as a proxy for carbon emissions, right? Because like we're worried about climate. And something else that I think doesn't get through to people a lot is that something like... I don't have the exact number, but it's something like 25% to a third of our carbon emissions come from the electricity that we use, and two thirds come from other sources, like direct use of fossil fuels, like driving your car, as an example. And so even within that electricity, so assuming that all of your money, all of the $20 is going to ChatGPT, and it becomes four days of your household bill and 10% of your electricity use, because your electricity use is only about 25% of your actual emissions, that actually drops you even more to something like maybe, you know, like 3% of your actual emissions. So like the absolute most ridiculous situation where like OpenAI is taking your money, spending all of it on electricity all the time, which it's actually just much smaller than that. Like even there, the absolute most, this could possibly be raising your emissions is something like 3%. And that would actually be significant, you know, like something that raises your... emissions by 3% is definitely not nothing. But my claim is also that it's much smaller than that. And that's the absolute most it can be economically. Unless, of course, ChatGPT is just giving everyone huge amounts of free energy, which just doesn't seem likely. AI companies just don't want to do that if they can avoid it.
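
Here is the economic upper bound they just walked through, as a sketch: all figures (the $20 subscription, a ~$0.20/kWh residential rate, a roughly 900 kWh/month household, and electricity being ~25-30% of personal emissions) are the round numbers used in the conversation.

```python
# Upper-bound sketch: what if all of a $20/month subscription bought electricity?
SUBSCRIPTION_USD = 20
USD_PER_KWH = 0.20                       # Nathan's local residential rate
HOME_KWH_PER_MONTH = 900                 # roughly an average US household
ELECTRICITY_SHARE_OF_EMISSIONS = 0.25    # ~25-30% of personal emissions, per Andy

kwh = SUBSCRIPTION_USD / USD_PER_KWH                  # 100 kWh
days_of_home_use = kwh / (HOME_KWH_PER_MONTH / 30)    # roughly 3-4 days
share_of_electricity = kwh / HOME_KWH_PER_MONTH       # roughly 10%
worst_case_emissions_bump = share_of_electricity * ELECTRICITY_SHARE_OF_EMISSIONS

print(f"{kwh:.0f} kWh ~ {days_of_home_use:.1f} days of home electricity")
print(f"~{share_of_electricity:.0%} of a month's household electricity")
print(f"Worst-case emissions increase ~ {worst_case_emissions_bump:.1%}")
```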

Nathan Labenz: Yeah, well, there's maybe a good way... I don't know if this is the next place we should go immediately, but another good angle of attack on that question is what does it cost to buy chips and then what does it cost to run chips? Do you think we should dig into that now? Sure.

Andy Masley: Yeah, let's do it. Yeah, I was looking into this a little bit, and it's actually pretty counterintuitive where the cost of a chip versus the cost of running it in money actually doesn't seem to tell you very much about the carbon costs involved because most of the actual cost of the chip seems to be the Nvidia premium and the cost of these incredibly complex supply chains all working so well together and the market, basically Nvidia running a lot of this stuff basically, and so there's this huge premium that you're paying. And so as an example, an eight-GPU H100 server node costs about $300,000. And over a four year lifespan, the server will consume about like $35,000 in electricity, which is only about 10% of its purchase price. So like this chip, as an example, 90% of the cost is in purchasing it and only 10% is in actually running it. And so you might think that like, if that's the case, maybe most of the carbon cost is also in actually producing it, but the price is actually just mostly reflecting other stuff. One intuition I have that's pretty useful for this, and I'll talk about more of the proportions in a second, but ultimately when you think about it, an AI chip is just designed to have electricity flowing through it as much as possible over its entire lifespan. And if I designed a wire and it just started to flow electricity through the wire until the wire burned out, and I was like, okay, what took more carbon? Making the wire itself? Or making all that electricity flow through the wire to the point that it eventually blows out? It's kind of similar to, I've never once thought about the carbon emissions of the wires in my home, but I have thought about the carbon emissions of my electricity bills. And so in the same way, because so much electricity flows through AI chips, we should basically expect most of their carbon cost to actually be in the electricity itself rather than in the embedded cost of making the chip in the first place. And Nvidia conveniently actually shares the carbon cost of all its chips. Now it seems like one positive aspect of people focusing more on the environmental stuff on this is that a lot of companies are just doing this more. And based on what I can tell, it seems like the ratio is something like 20:1 of the carbon of the electricity versus the actual embodied cost of making the chips themselves. So it's actually like wildly disproportionate where it's something like at least 10 to one in the other direction of the cost of, like, purchasing the chips in money versus, like, the cost of the electricity. But the carbon cost is completely flipped where most of the carbon cost is in the electricity.

Nathan Labenz: Yeah. So another way to think about that or kind of validate that would be if it costs $300,000 to buy this pot of eight chips and then $35,000 to run it over a lifespan of four years, then we all know right off the bat that Nvidia's margins are something like 80%. So that takes down their, you know, cost of marginal production to something like $60,000. And that still includes, you know, the relationship with TSMC and, you know, shipping the things all around the world and all that sort of stuff. And so you kind of have at least a governor there on the top of like, well, it can't be more than that 'cause we know what their margins are. And then you're also highlighting that they're just straight up publishing an analysis of what their energy is and saying it's like, did you say 1% of total?

Andy Masley: Oh, it's 10%, it seems like that. Or like about the energy cost of running energy in like dollars is about 10% of the cost of actually purchasing the chip in the first place with money. And then the carbon cost is flipped where it's something like 20 to 1, the carbon of the emissions of actually using the electricity is 20 times as much as that. And it could get even higher because a lot of AI data centers are built with grids with obviously reliable, cheap energy. And unfortunately, right now, in the reality we live in, that tends toward more fossil fuels being used. And so as a result, their electricity might actually be even dirtier than the grid average. And so that's another thing where that's actually raising the carbon cost even more.

Nathan Labenz: So out of a $300,000 pot of eight, you can then expect a, call it, you know, 10 to one, call it a $30,000 energy bill over the lifetime of the chip. And then if you flip back to, well, what was the energy cost of that on the production side, it's another 20 to one reduction, AKA something like $1,500 spent on energy. Or, in other words, half a percent, basically, of the original cost of the chip. It's always interesting to kind of flip these things back and forth between money and emissions, but...
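
A rough plausibility check on those ratios, as a sketch: the $300,000 server price and the ~20:1 use-phase-to-embodied carbon ratio are the episode's estimates, while the node power draw and the industrial electricity rate are assumptions I'm plugging in to see whether the ~$35,000 lifetime electricity figure hangs together.

```python
# Sanity check on the chip-versus-electricity ratios discussed above.
SERVER_COST_USD = 300_000          # 8x H100 server node (episode estimate)
NODE_POWER_KW = 10                 # assumed draw incl. host/cooling overhead
YEARS = 4
USD_PER_KWH = 0.10                 # assumed industrial electricity rate
CARBON_RATIO_USE_TO_EMBODIED = 20  # ~20:1 use-phase vs. embodied carbon (episode estimate)

lifetime_kwh = NODE_POWER_KW * 24 * 365 * YEARS
electricity_usd = lifetime_kwh * USD_PER_KWH
print(f"Lifetime electricity ~ ${electricity_usd:,.0f} "
      f"(~{electricity_usd / SERVER_COST_USD:.0%} of purchase price)")

# If embodied carbon is ~1/20th of use-phase carbon, its "electricity-equivalent" cost is tiny:
embodied_equiv_usd = electricity_usd / CARBON_RATIO_USE_TO_EMBODIED
print(f"Embodied-energy equivalent ~ ${embodied_equiv_usd:,.0f} "
      f"(~{embodied_equiv_usd / SERVER_COST_USD:.1%} of purchase price)")
```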

Andy Masley: Yeah, because the markets are so weird here, it just doesn't track very closely. Money just tells you very little about how much was emitted at different points. And I think that, in general, this is a really funny area where sometimes when I talk about the carbon cost of a median prompt, people will be like, Oh, but you're not including these hidden costs. things like the embodied cost of the hardware itself. And they'll never actually try to make a guess at what that actually is. And when you poke at the embodied cost of chips, that anyway seems pretty marginal. It's like 10%, 5% sometimes. So that doesn't really concern me super much. It doesn't add too much to the cost. The real secret cost that I do think actually significantly raises it is the cost of training the models. But that's a software question rather than a hardware question, obviously. So that's something else we can talk about. But my best guess right now is that maybe it doubles the full energy or emissions cost, so it goes from 0.3 to 0.6 watt hours or something like that. That has pretty wide error bars, but yeah, I can go into that separately. But at least for the hardware, I think it makes about as much sense to worry a lot about the embodied carbon cost of AI chips as it does to worry about the embodied carbon cost of the wires in your home that electricity flows through. These things are just so hyper-optimized, and there's also a common theme about this is that so many people are being paid to think in really complicated ways of optimizing the energy use of these chips as much as possible. There's just a huge amount of incentive to do that. And so we should expect that there's a lot of electricity flowing through them, but also that they're relatively optimized. Just a lot of stuff to say, basically. But yeah, I expect that most of the costs in general of a chip to be in running the energy through it rather than in the energy used to make it.

Nathan Labenz: Yeah, okay. I think that's quite helpful. Do you wanna do, I know you said these don't always hit, but I wanna- Oh, I mean, I love them. Do you wanna do one to driving a car or taking a flight?

Andy Masley: Yeah, yeah. So this is actually, let's see, best I can tell is that if you're driving a car for 20 miles or so, that's gonna probably emit three to four kilograms of CO2 if it's a sedan. And that's something like 10,000 chatbot prompts or so. And so I sometimes think about it like, if I didn't want to feel guilty about chatbot prompts, if I was worried about that, if I avoid a single car trip, I get 10,000 free chatbot prompts to use however I want. This is also an interesting argument for ways that chatbots might be able to on-net reduce someone's emissions in other ways. I think in these conversations in general, something else I try to flag is that almost all of the climate impacts of AI and environmental impacts in general are almost definitely going to come from how AI is used rather than the actual energy used in data centers, for the same reason that all other computing works the same way. If we're trying to think about Amazon's impact on the environment, there's shipping and other things that it affects. But looking at the energy cost of running the Amazon website and data centers isn't going to tell us very much about its total impacts. And so if AI is actually changing your behavior at all, that's just where most of the impact is going to be. And so if one of those 10,000 chatbot prompts that equals a 20-mile car ride prevents you from taking that trip, you realize, oh, I looked into this and I don't actually need to drive out to do this thing, or I can get it somewhere else, that's at least going to modify your behavior enough that that's going to significantly change your emissions. Potentially not for the better either. Maybe it's the case that it will cause you to do something else that is maybe silly, but the comparisons here get kind of comical. Flights can get up to the millions at times. A single intercontinental flight, or sorry, transcontinental flight can, I think, go over at least 1 million and potentially 2 million chatbot prompts, which is more than I'll probably do in my lifetime. I consider myself to be a chatbot power user, but I'm probably not going to hit that. And so if I decide, if I make a very extreme decision, like, oh, I won't take that European vacation this year, but at least now I can use chatbots basically forever without any guilt. Water comparisons actually get even more ridiculous because I think the average person just doesn't actually know where most of their own water footprint is, because the main way we experience water in our lives is in the water we drink, maybe a shower, watering our lawns if we have one. And most of the actual ways that our lifestyles consume water is way far away from our homes. It's in the food that we eat, it's in the power that generates our home, or that is used to power our home and stuff like that. And I'm pretty sure that the average water cost of a prompt, if you include everything, including the offsite cost, rises to something like 1/800,000th of your daily water consumption when you include everything else. And so there, the numbers get really silly. I'm pretty sure that if someone avoids buying a single additional pair of jeans, that is worth a million chatbot prompts worth of water or something, just because jeans might require irrigated cotton to grow or something like that. And this makes intuitive sense to me because if I think about how water is being used, in the jeans case, huge amounts of water are flowing to grow plants and they become part of the plant itself.
Whereas with AI, it's flowing in the vicinity of hot computers, basically. And it's capturing some of the heat that my individual prompt generated. And I'm like, well, how would I expect the proportions here to look? Ultimately, it makes sense that there can be these huge order of magnitude differences between those things. So yeah, a lot more comparisons I can make, but at least with energy, yeah, a single second of your microwave is probably about a ChatGPT prompt or even three or so. It just gets pretty comical pretty fast. And so if you're willing to use your microwave for a few additional seconds, you should be okay with a few additional chatbot prompts too. And this is actually a good comparison for longer prompts because some people will be like, oh, what Andy's saying is about median prompts. But obviously, there are all these other things. There are these long coding tasks and reasoning tasks and stuff like that. And it's true that those use a lot more energy, but it seems kind of like just running your microwave for longer. And if I had a friend with a microwave that if you used it for a minute could produce a very thorough research overview of a field, or it's like, ooh, I heated up a ton of new useful code for whatever I'm working on, I wouldn't actually be super bothered if they were using the microwave for that. I use it to heat vegan chicken nuggets and I don't feel bad about that. And in comparison, when these larger models start to get talked about and it's like, oh, either the prompt is much longer or the reasoning time is much longer, that definitely adds to the cost. But I think anyone who's used a deep research prompt or a coding prompt also knows how valuable the outputs are compared to even a normal prompt. Basically, the microwave comparison is really useful here because it can basically be like, oh, this thing is like a microwave. And however long it takes is kind of like running a microwave that long with error bars, obviously. But at the end, if I had a microwave in my home that produced a deep research level response to something, I would be using that quite a bit and not feel super bad about the results, personally. So yeah, a lot to say about that. But the comparisons can be useful here in a lot of ways once you get the context down.
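
For the car and flight comparisons in that answer, here is an order-of-magnitude sketch. The per-prompt figure and the car-trip emissions are the episode's estimates; the per-passenger flight emissions vary a lot by route and seating, so the ~500 kg value is an assumption within a typical 300-600 kg range for a long one-way flight.

```python
# Order-of-magnitude check: car trips and flights in units of median chatbot prompts.
G_CO2_PER_PROMPT = 0.3     # episode's all-in estimate per median prompt
CAR_TRIP_KG = 3.5          # ~3-4 kg CO2 for a 20-mile sedan trip (episode estimate)
LONG_FLIGHT_KG = 500       # assumed ~300-600 kg CO2 per passenger, long one-way flight

print(f"20-mile car trip ~ {CAR_TRIP_KG * 1000 / G_CO2_PER_PROMPT:,.0f} prompts")
print(f"Long flight      ~ {LONG_FLIGHT_KG * 1000 / G_CO2_PER_PROMPT:,.0f} prompts")
```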

Nathan Labenz: Yeah, so let's, let me just kind of re-derive maybe a car one again, just for my own and everybody's benefit to get the reps on kind of stepping through it. So let's say I have a 20 gallon tank in my car. That's one way I like to think about it. The unit of the fill up, right? So I've got gasoline weighs about five pounds a gallon. Water is eight pounds a gallon. Gasoline is lighter. So roughly you fill up your car with a 20 gallon tank, you have put a hundred pounds of fuel into the tank. Now that actually becomes a lot, somewhat surprisingly or weirdly, becomes a lot more massive as it gets converted to exhaust because your hydrocarbon is basically just a chain of carbons with hydrogen atoms alongside it, and each carbon atom becomes a CO2 in full combustion. You can have CO, I guess, in there as well, but obviously we're going to simplify away a lot of details, so we'll call it CO2. Oxygen weighs more than carbon, so you've more than, roughly speaking, you've tripled the mass. For every carbon atom that gets combusted, you have a CO2 triple the mass. So I've gone from a hundred pounds of fuel into the car to 300 pounds of emissions, which if I convert that to kilograms, we'll call it 150, and then that becomes 150,000 grams. 150,000 grams compared to one chatbot prompt was a third of a gram. So my full tank of gas gets into the range of roughly half a million chatbot, median chatbot prompts. And then you had scaled that down by, let's say I took just one 20-mile trip, that would be one of my 20 gallons. So if I divided that by 20 from half a million, I get down to 20, 25,000, like 10,000. So I kind of came at it from a different angle, but we ended up within 50% of each other.
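
Nathan's fuel-mass route to the same answer, step by step, using his round numbers (gasoline is actually closer to 6 lb per gallon, which would push the estimate somewhat higher; the factor-of-three mass increase on combustion is roughly right):

```python
# Nathan's fuel-mass derivation, using his round numbers.
# (Gasoline is actually closer to ~6 lb/gal, which would raise these estimates a bit.)
TANK_GALLONS = 20
LB_PER_GALLON = 5               # round number used above
CO2_MASS_MULTIPLIER = 3         # combustion roughly triples the mass (C -> CO2)
G_CO2_PER_PROMPT = 0.3

fuel_lb = TANK_GALLONS * LB_PER_GALLON            # 100 lb of fuel
co2_g = fuel_lb * CO2_MASS_MULTIPLIER * 453.6     # ~136,000 g of CO2 per tank
prompts_per_tank = co2_g / G_CO2_PER_PROMPT
print(f"Full tank ~ {prompts_per_tank:,.0f} prompts; "
      f"one of 20 trips ~ {prompts_per_tank / 20:,.0f} prompts")
```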

Andy Masley: Yeah, yeah. I should double check the numbers there, obviously, because it gets a little hand wavy at the extremes here. But yeah, yeah, exactly. This brings back when I was first learning about climate change. Actually, I remember hearing the fact that if you see a huge pile of coal and that's all emitted, the CO2 weighs so much more than the pile because oxygen is being added to it. And I remember just having this deep sense of evil. Coal is just such the ultimate bad guy in climate conversations, that it's just a really funny fact that this stuff that you use will end up actually adding way more to the atmosphere than it itself weighs. It's pretty crazy. But yeah, that's a really good intuition. at least hundreds of thousands of chatbot prompts. And we do that pretty regularly. This leads into another interesting general comment on climate stuff, where the main way that we emit and the main way that most of our emissions happen is actually with stuff that is otherwise very useful to us. I think people sometimes have this idea that, oh, well, I should focus on chatbots a lot because I think chatbots are useless and it's a useless form of emissions. And I think that... Something like 99% of the actual emissions reductions that we need to make are actually going to be in the useful category. And highlighting these tiny little useless emissions, it makes sense to worry about those. Obviously, we want to emit less if we can, and if something's useless, we want to cut it. But in terms of the amount of attention it should actually take up, the big challenge of climate change that's actually preventing us from making change is how much emissions are actually very useful. Driving a car is very useful to me, and so another part of this debate that I find comes up a lot is that no matter how small I can express the chatbot as how small the emissions are, people will be like, oh yeah, but it's useless, and it's just this silly plagiarism machine. So we should somehow focus on it more than all these useful emissions. And I kind of want to get it through to them that the useful emissions are actually so large and so much more difficult to change. That's what needs the most of our attention. And so a lot to say about that, but the car example just brings up a lot of different associations there.

Nathan Labenz: So, just to repeat it one more time: one crosstown 20-mile trip is worth, order of magnitude, 10,000 chatbot prompts. One tank of gas, 28 times more than that, is worth something like a quarter to a half a million chatbot prompts. And then a cross-country flight takes us a couple orders of magnitude above the single trip, to roughly a million, which again makes sense if you figure the cost of the flight and how much of that goes to energy. From what I've seen, unlike the AI companies, the airlines really are turning around and spending a pretty significant share of your money on the jet fuel, so you might be spending north of $100 per passenger on jet fuel to get across the country.
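As a rough cross-check on the flight comparison, here is a similar sketch. The flight distance and per-passenger emissions factor are generic assumptions (roughly 4,000 km one way at about 100 g of CO2 per passenger-kilometer), not figures from the episode.

```python
# Rough check: one cross-country flight (per passenger) vs. chatbot prompts.
# Assumed figures; real flights vary a lot by aircraft, load factor, and routing.

FLIGHT_KM = 4_000         # ~one-way cross-country distance
CO2_PER_PAX_KM_G = 100    # ~100 g CO2 per passenger-km (generic estimate)
CO2_PER_PROMPT_G = 1 / 3  # same per-prompt assumption as above

flight_co2_g = FLIGHT_KM * CO2_PER_PAX_KM_G
print(f"One flight ~= {flight_co2_g / CO2_PER_PROMPT_G:,.0f} prompts")  # ~1.2 million
```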

Andy Masley: Yeah. It actually brings up a funny intuition, where driving solo seems to emit about as much per mile as being a passenger on a long plane flight, which I think people have some weird intuitions about; they think the plane might be a thousand times worse. But driving is just surprisingly bad, and the main way that flying adds to our emissions is that it causes us to travel longer distances, rather than being worse per mile. So it makes sense that the plane, scaled up, is going to be about the same as an equivalent distance traveled in the car, and the flight adds up to comical amounts of chatbot prompts. Even if you thought this was evil, but you saw any value in chatbots at all, you could just skip a single car trip and be good for at least a year of prompting or something like that. I try to get across that all these cuts just seem very depressing to focus on in the first place. Back when I was in college and talking a lot more about climate change, there was a really great general consensus that the name of the game is systematic change and changing the energy grid, where you can have hundreds of thousands of times as much impact as anything you can do in your personal life. Even more extreme personal things, like being vegan and living in a city, which I do, are good, but they're just not nearly as impactful as being one small part of a group of people who opened a new solar plant or a battery facility or kept a nuclear power plant open. And so within personal lifestyle stuff, if you're going to litigate anything, litigate a few big things specifically; by the time you get to chatbot prompts, it's as if you're pausing YouTube a few seconds early for the sake of the climate. Honestly, part of this debate is quite depressing to me, because I feel like the general climate conversation has receded or gotten worse since I was talking about it, where there's so much hyperfocus on these tiny things that just aren't going to matter compared to the big systematic stuff we need to do.

Nathan Labenz: Yeah. That's that's weird. I don't really know what to say about that.

Andy Masley: It's upsetting, yeah.

Nathan Labenz: But I guess that's why we're here.

Andy Masley: Yeah.

Nathan Labenz: Okay, so 10,000 prompts per 20-mile car trip, a million prompts-ish for a cross-country flight, and one prompt is one second of microwave. Hopefully these start to put people's minds a little bit more at ease.

Andy Masley: Yeah. Actually, one other comparison that I realized we hadn't made yet: I just want to make it completely clear that there's a really common talking point that I want to never be mentioned again, because it's just so wrong and so off, which is that a chatbot prompt uses a bottle of water. If you look into where this came from, it was this really misleading article from the Washington Post that took a really wild estimate of how many chatbot prompts it would require to write a 100-word e-mail. It seems to assume 10 to 20 prompts to write the single e-mail, and I'm a power user of chatbots, but I don't use that many. And even then, it assumes all these other wild things, like that chips hadn't gained any efficiency in the five years since before they were commercialized, plus specific things about training and evaporating water from hydroelectric plants, all of which together tell this really misleading story. The actual amount is something like one two-hundredth of that; it's something like two milliliters of water. And this seems pretty conclusive based on a bunch of different estimates and studies from different people. So for the general comparison to water specifically, I always want to flag that maybe the single most popular wrong piece of information about these chatbots is that they consume a bottle of water per prompt. That's just not the case. You'd have to prompt them about 200 times to use a bottle of water. And also, you use thousands, or maybe not thousands, but at least hundreds of bottles of water every day in your consumptive use that you don't see, because most of that is off-site, in the electricity generated for you and in the food that you eat.
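To make the bottle comparison concrete, here is the one-line version of the arithmetic. The roughly two milliliters per prompt is the estimate Andy cites, and a standard 500 mL bottle is assumed.

```python
# How many prompts to consume one bottle of water, on the estimate discussed above?
WATER_PER_PROMPT_ML = 2    # ~2 mL consumed per prompt (rough estimate)
BOTTLE_ML = 500            # a standard water bottle

print(f"Prompts per bottle ~= {BOTTLE_ML / WATER_PER_PROMPT_ML:.0f}")  # ~250
```

Depending on the exact per-prompt figure, that lands in the 200 to 250 range, consistent with the "about 200 prompts per bottle" rule of thumb above.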

Nathan Labenz: Yeah, I didn't pre-run the numbers on a hot shower, but I suspect there's an awful lot of prompts in a hot shower as well.

Andy Masley: Yeah, and I could say all kinds of crazy things about hot showers, actually, where a lot of the water that goes into a hot shower is the water used to generate the electricity to heat the water. The way we use water is just very interesting and sometimes comical, because so much of the water we use is actually used to generate electricity, where even your alarm clock has a big off-site water cost. My best estimate is that the average digital clock uses something like three liters of water off-site at a power plant per month. Even there, that's invisible to you, and you might not know that this thing is burning through three liters a month. A lot of stuff to say about that. Part of the reason I've been writing about this so much recently is that it's honestly just pretty cool. There are a lot of counterintuitive ways that we use energy and water. I think the chatbot conversation can honestly be really fun once you contextualize it in the counterintuitive ways that society uses water and the goofy excesses in animal agriculture and things like that. But that's its own separate topic.

Nathan Labenz: Yeah, I think agriculture would definitely be worth doing a couple of little comparison exercises with too. But maybe before we do that, how should we think about the relationship between energy and water, and what does it mean to use water? I think there's conflation going on in many cases, where people sometimes think this water is used in the sense that it's destroyed or contaminated. Obviously contaminating water is a big problem, but my sense is that's not really what's going on, and that's not what people are referring to when they refer to water use, right? This is more like evaporative cooling kinds of processes, where the water is not removed from the water cycle entirely; it's just put into the air from wherever it was before. But help me sharpen that up.

Andy Masley: Yeah, there are three different categories of water use that are useful to think about. First of all, and I think this is the most important category to know about, there's consumptive use of water, which is water that you withdraw from a very local source and either evaporate or do something else to that causes it to leave that very specific source. That can be disruptive for high water stress areas, because the water might not be returned at the same rate, and over time things can dry up. So consumptive use is the main harmful way that water can be used, basically. There's also water pollution, where water is returned polluted in some way. Data centers can contribute to this a little bit, because they can use specific chemicals in the water to keep it very clean, but they might return some of it to a local system polluted with these chemicals. This doesn't seem to be nearly as big of a problem as the pollution that might come from agriculture, but it's definitely not nothing; it's something to worry about a little bit. And then the other thing to focus on is non-consumptive withdrawal of water, which is where you take water, use it, and then basically give it back to the source. A lot of the ways that we generate electricity, as an example, mostly use non-consumptive withdrawal, where you take water and maybe return it a little bit warmer than it was before. That can sometimes create some problems, but for the most part it's just taken and put back. And so this can lead to some really wild, confusing statements about how AI uses water. One very popular statement about AI water use is that by 2027, it's projected to use 50% as much water as the entire United Kingdom. If you actually look at what that 50% means, something like 90% of it is water that power plants withdraw and then immediately put back, and I think that paints a very different picture. Then another something like 5% or 7% of that water is consumed at the power plant and not in the data center itself, and only about 3% of that amount is going to be in data centers themselves specifically. So when people hear that statistic, I think they imagine half of the UK's water flowing through data centers, and what's actually happening is that at most maybe 5% of the UK's water is actually consumed, which is definitely significant. It's not nothing, but it just shows how easy it is to really wildly and sometimes unintentionally mislead people, because the way we use water is so counterintuitive. One other thing to note is that there's an important difference between freshwater and potable water, which is water that's been treated enough to basically be drinking quality. Data centers themselves rely a lot on potable water, because it's very important for the water to be very clean as it moves through the data center so as not to gunk things up, basically. But from what I can tell, there's not too much of a problem right now in converting fresh water to potable water, and there are actually economies of scale where, if you're converting more of it, that can make it cheaper rather than more expensive. There's a lot to say about that, but another common thing that gets said is that AI is using drinking water, and that sounds like a big emergency, because obviously people need water to drink.
People have a really weird understanding of the relationship between drinking water and fresh water, which is basically just water that isn't salt water. The big constraint in a lot of areas is how much access to fresh water people have, not so much how much access to drinkable water they have, because there can be times when economies of scale kick in. And there's a lot of uncertainty about this. I want to flag that I'm basically just some guy here who's done a lot of deep dives on this topic over the past few months, and I'm reporting what I know to the best of my ability, but I'm not an expert, and if you think I'm getting anything wrong, please e-mail me right away. That would be very embarrassing. But what I'm saying is based on a lot of reading and a lot of individual ideas that each aren't actually super complicated, but can be confusing when you first bump into them. So yeah, a lot more to say about water, but those are the really useful categories to understand.
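Here is a small sketch of the decomposition Andy walks through for the "50% as much water as the UK" headline, assuming the rough 90/7/3 split he describes between returned withdrawals, consumption at power plants, and consumption in data centers. The split itself is his rough characterization, not an official figure.

```python
# Decomposing the headline "AI will use 50% as much water as the UK by 2027",
# using the rough shares described above (assumptions, not official statistics).

headline_fraction_of_uk = 0.50  # the headline claim, as a fraction of UK water use

shares = {
    "withdrawn by power plants and returned": 0.90,
    "consumed at power plants": 0.07,
    "consumed in data centers themselves": 0.03,
}

for label, share in shares.items():
    print(f"{label}: {share * headline_fraction_of_uk:.1%} of UK water use")
# Total actually consumed (power plants + data centers): ~5% of UK water use.
```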

Nathan Labenz: So let's bottom line that again one more time. We got the three categories. A lot of what we're taking out to do this cooling can be put back. That's not without any issues. What should I say at a party?

Andy Masley: Oh, so super quick, God, yeah, what should you say at a party? I will flag that most of the water that flows through data centers themselves actually does seem to be consumed, in that it's evaporated, and maybe 20% of it is returned to local sources, sometimes a little bit polluted in a way that needs to be treated. What should you say at a party? Okay, so my best guess right now: first of all, I would just describe consumptive versus non-consumptive use, where if someone says AI uses half as much water as the UK, I would say about 90% of that is withdrawn by power plants and then returned, so we're actually talking more like 5% of the UK's water that's actually consumed. Consumption is the big problem. Obviously withdrawal has some issues, but the main thing I'm worried about is water that's not returned. And then within that, AI is a small percentage. My best guess right now is for 2023, and obviously a lot has changed since then, but that's the last time we have good data. I'm from a town of about 15,000 people, and in 2023 I'm pretty sure that AI in America used something like eight to ten times as much water as my town used. That's not nothing, but it's basically like spreading ten towns of 15,000 people each all across America, and if you ask how big of a water problem that would be for the country, I don't actually think it would be a super big deal. It comes with some interesting planning problems, but I usually keep that in my back pocket: it's about this much, at least it was two years ago, and then I can talk about how much it's changing. So the two things I would say are, here's how much of those water statistics is actually consumed, and then within that consumption, how does it compare to towns or other things that are using water? And I kind of lose sight of how clear this is to other people when I'm talking, because I've been so in the weeds. So if you want to rephrase that for yourself, that might be useful here.

Nathan Labenz: Yeah, well, I do think the comparison to towns and cities is really helpful. I've done that for myself a little bit more on the energy side, so maybe I'll do the energy and then you can try to do the water. It is pretty handy that the latest chips basically run at a thousand watts. The H100 was around 700 watts, there's a little bit of padding on top of that for the auxiliary things that also consume power at the data center, and the bigger and better chips use a little bit more. They're getting a little more efficient, though not necessarily a ton more efficient in terms of flops per unit of energy these days, because a lot of the low-hanging fruit there has been picked. But roughly speaking, we can still round to 1,000 watts per chip, which notably is about the same as a home. And then you can work that up to these big data center build outs. If somebody's going to do a gigawatt data center, that is a billion watts, which would be a million chips if each chip is at 1,000 watts, and that would translate to basically a million homes' worth of electricity. What I've seen from the Stargate plans, for example, is that they are trying to get to five gigawatts over a couple of years of build out. So five gigawatts would be 5 million chips, or 5 million homes, a city of 5 million homes, which would be maybe more like 10 or 12 million people, something around the order of magnitude of New York or Los Angeles. That again is not small, but it's only a single digit percentage of the American population, for one thing. And as you noted earlier, with electricity being roughly a quarter of our emissions footprint, if you say, okay, New York City is, I don't know, call it 4 or 5% of US electricity use, depending on where you draw the metro area line, and then dial that down by the fact that only about 25% of the overall emissions footprint is electricity in the first place, you're basically saying Stargate in its full five gigawatt build out is roughly a 1% contribution to US emissions as they exist at baseline.
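Here is the same arithmetic as a sketch, using the round numbers from the conversation: roughly 1,000 watts per accelerator and per average home, a 5 GW buildout, a New-York-scale share of US electricity of about 4 to 5%, and electricity at roughly a quarter of the US emissions footprint. All of these are rough, assumed figures.

```python
# Rough check of the gigawatt-scale comparison above. All inputs are round assumptions.

WATTS_PER_CHIP = 1_000
WATTS_PER_HOME = 1_000
BUILDOUT_GW = 5

NYC_SCALE_SHARE_OF_US_ELECTRICITY = 0.045  # assumed ~4-5% share of US electricity
ELECTRICITY_SHARE_OF_EMISSIONS = 0.25      # electricity ~25% of US emissions footprint

chips = BUILDOUT_GW * 1e9 / WATTS_PER_CHIP
homes = BUILDOUT_GW * 1e9 / WATTS_PER_HOME
emissions_share = NYC_SCALE_SHARE_OF_US_ELECTRICITY * ELECTRICITY_SHARE_OF_EMISSIONS

print(f"{BUILDOUT_GW} GW ~= {chips:,.0f} chips ~= {homes:,.0f} homes' worth of power")
print(f"Rough emissions contribution ~= {emissions_share:.1%} of current US emissions")  # ~1.1%
```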

Andy Masley: Yeah, man, a ton to say about this. I actually want to flag that now that we're talking about big data center level stuff, all the confidence I just had about the individual prompts, I have way less of about how much the data center build out affects things. Some people get a little confused by that, because isn't there some relationship there? I can go into details on how I think these individual prompts contribute to the overall stuff, and why that's confusing and why it's not. And once we get in here, I have to say, gigawatt data centers are just mind-blowing. It's crazy to think about buildings, just stepping back and being like, oh yeah, a single building could use a percentage point of America's household electricity. It's insane that that's happening. That's just completely mind-blowing to me. So a lot to think about here. At the individual prompt level, I'm not worried about electricity or water. At the global level, at the level of data centers, I think it's very clear that electricity is going to be the big challenge. At least from the estimates that I've seen, even at the maximum amount of projected data center growth over the next few years, the water cost seems to be about an additional 1% of the water used on America's irrigated corn, which again is not nothing, it's actually quite a bit, but that's potentially adaptable; the system can handle an increase of that size. But the electricity stuff is orders of magnitude larger as a proportion of America's total grid right now. That's going to be the huge challenge, and there I'm a lot more uncertain, where I could totally believe this will be very bad for local environments. A lot to say about that. But yeah, gigawatt data centers, very useful comparison. Each of these chips is effectively like a home, or at least an apartment: a thousand watts is a really great, simple comparison to a home. And it's very convenient that it's around there, because you can just do the orders of magnitude multiples. So a gigawatt is something like a million homes' worth of energy, five gigawatts is five million homes, completely wild. This is a place where I'm like, yeah, I don't know, I'm quite worried about the local environmental effects of the electricity specifically. A ton to think about there. I'm shooting off in a lot of different directions here because I'm kind of awed by the scale of it as a baseline and just want to flag a ton of stuff. So yeah, that's a really great comparison between homes and data centers.

Nathan Labenz: So how do you think about... I guess, if I had to summarize the conversation so far, it's that at the very aggregate level, we are looking at something like a low single digit increase in emissions from AI, even with the mega projects that have been announced all coming to fruition. From an individual user standpoint, you almost can't get there. Those visions are obviously predicated on a vision for the evolution of AI where it's going off and doing long-running autonomous science, coming back with discoveries, and working in the background for you nonstop. If you ask how one gets to 10,000 prompts a day, where you would potentially be using as much energy on AI as on a daily commute, the answer is: well, it's got to be doing an unbelievable amount of stuff in the background for you, right? So that's a very different regime of AI use, and we're accounting for all of that when we say we're getting to a low single digit percentage increase in emissions, and then water is even less. So that's kind of, okay, great. Now, the next level of analysis that concerned citizens, including yourself, would get to is, okay, well, maybe that's true. And by the way, I would say, just as a brief anecdote for myself, the offset has been very real for me personally recently. I've talked about this on another episode, but my son currently has Burkitt lymphoma, Burkitt leukemia, and I've used AI nonstop to try to help me understand what that is, whether the doctors are doing the right thing, and whether there is anything else we might be doing, et cetera, et cetera. Okay, how many prompts is that? And I do everything in triplicate also: I go to ChatGPT and Gemini and Claude and sometimes even Grok, although it's really the main three that I trust the most. So even doing everything in triplicate, I'm probably a few hundred prompts over the last month of doing this, maybe a thousand. So I still have a long way to go even to one crosstown trip. And I would really emphasize how much it has indeed saved me in the real world, because the things we were considering, which the AI helped us understand we didn't really need to do, were things like, maybe we should move our entire family to another city to take advantage of some other medical center that might have better care. Or another thing, and I talked about this in the other episode, is that it just so happened that I had a possible issue of mold in my basement at the same time that my kid is going through this and is vulnerable to possible mold infection. So I've got this question of, well, geez, what do I do about the fact that I might have mold in my house and my kid's extra vulnerable to it? And I was able, with the help of AI analysis, to get comfortable with buying a few HEPA filters and putting them around my house. That does have its own energy cost, which again probably trumps the AI use by a lot, but it's also still a lot cheaper than going out and renting some other house; heating my own house and living in another one, having two houses that I'm heating for the winter, would easily be another order of magnitude up.
So I have personally, in just the last month, offset enough real-world stuff, you know, transporting my family or renting and heating another house kind of stuff, to basically pay for your and my lifetime of at least conventional AI prompting, if not necessarily the full Altman vision of tons of agents working for you around the clock in the background. But that part is worth highlighting, because it can feel hand-wavy to people, and I've definitely lived in a very concrete way how this has saved me from having to do dramatically more energy and resource intensive stuff, not to mention financially intensive as well. So the ROI on every dimension there has been outstanding, not even mentioning the impact it's hopefully had in terms of reducing the overall risk to my son's long-term well-being. So that's notable. But okay, let's get into the local stuff. Now, these things are mega projects, no doubt about that. One big building or cluster of buildings might use as much as a million person city, or you could even imagine that getting into a couple million people's worth of energy consumed at a particular site. One way I started to think about this, and we can also maybe talk about what kinds of energy are used, because that's going to change things a lot too: I get the sense that in the short term we're going to burn a lot more gas, and in the medium term we hope to have a lot more solar, and perhaps a lot more nuclear as well. I looked at solar and found that one gigawatt can be provided with basically ten square miles of solar panels, which is not a small area. But then, okay, if you can power one gigawatt with ten square miles, what if we go to the full 80 gigawatts that is projected as the total usage of the grand vision, Altman's $7 trillion build out or whatever? Now you're going from 10-ish square miles to, let's say, 800 square miles. And that is still less than 1% of the land area of the state of Michigan, where I live, or of the state of Nevada, which is sunnier and much emptier, although there's a lot of empty space everywhere. That's another thing I think is dramatically underappreciated: the view from the cockpit of a plane, where so much of the time there's just no sign of civilization in front of you, and the cities are these relatively small things over a vast landscape. So if we can power the entire grand vision of the $7 trillion build out, 80 gigawatts, with less than 1% of the land area of either Michigan or Nevada, it seems like it's not land use per se that is the problem. So how should we start to think about the impact in localities? There obviously still is the question of, well, what about Memphis, right by where the Colossus thing is being built? What's happening there? Is there something unfairly locally concentrated that we should be more concerned about, even if this aggregate all sounds fine?
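For the land-area point, here is the sketch version, assuming roughly 10 square miles of solar per gigawatt and Nevada at roughly 110,000 square miles; both figures are approximations.

```python
# Rough check: land area for an 80 GW solar buildout vs. the state of Nevada.
# Both the sq-miles-per-GW figure and the state area are approximations.

SQ_MILES_PER_GW = 10
BUILDOUT_GW = 80
NEVADA_SQ_MILES = 110_000

solar_sq_miles = SQ_MILES_PER_GW * BUILDOUT_GW
print(f"~{solar_sq_miles} sq mi of solar, "
      f"~{solar_sq_miles / NEVADA_SQ_MILES:.1%} of Nevada's area")  # ~0.7%
```

Note that 10 square miles per gigawatt is closer to a nameplate-capacity figure; delivering a steady gigawatt of solar around the clock would take several times more panel area plus storage, so treat this as a lower bound on land use.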

Andy Masley: Oh, yeah, a ton to say about that, a lot of different directions. I wanted to circle back and just say that your story has been really amazing to follow over the last few weeks, by the way, and I'm just really grateful that you're sharing so much. And yeah, obviously you're doing a ton. The Memphis example actually brings up the single thing that I'm most worried about by far as an environmental impact of data centers. I have a ranking in my head, and far and away the biggest threat I'm worried about is air pollution, much more so than climate impacts as a whole, and I can explain why those are much lower on my list, and much more so than water. Land use gets really ridiculous, because America just has a lot of land, and data centers are so incredibly compact for what they're doing. If you think about the processes happening inside of them, they're in some ways miracles of compactness; it's like Moore's law of the building or something. It's crazy. But with climate stuff, it seems very likely to me that most of the impacts AI will have on the climate will be in how it's used, not in the data centers themselves. And it seems very likely that there are just so many opportunities to reduce emissions using AI. This isn't a great example of current AI, but take Google Maps: if you believe that Google Maps has reduced total car emissions on net by a percent, that's already a huge portion of the emissions of all global data centers, and that's one app. But the thing about climate is that it's actually very fungible. If I emit one unit of carbon somewhere and reduce ten units of carbon somewhere else, the harm is exactly equivalent to me just removing nine units of carbon, because the impact of carbon plays out over the long term; it's not an immediate bad thing. But air pollution is very different; it's just not fungible in the same way. If a nearby coal plant built a pipe directly into my home, and they were like, oh, sorry, Andy, we did the math, and on net this will reduce air pollution as a whole, even though it will make it much worse for you, I'd be like, well, that's not cool, because air pollution has immediate, very bad health consequences. Something that really puts this into stark perspective is that every year, it seems like millions of people are dying from the effects of indoor air pollution especially, and this is more than the WHO expects will die from climate change every year, even into the 2040s and the 2050s. So air pollution is already a much bigger disaster than climate change will be in the medium term anyway. In America we have pretty clean air, but there's still a lot of bad things that can happen, and it's such a normalized part of our lives that I expect that even if AI is just using the normal grid, adding a lot of natural gas and some combination of that and renewables, the air pollution actually seems like the biggest threat, because as a baseline it's already just very bad for us. Memphis, I'm very wary to comment on, because I really don't want to get this wrong. It does seem like the local community reported that they were literally smelling gas in the air, there was a lot of movement about this, and there was a recording of Colossus using more gas turbines than they were permitted.
I have to admit, I just don't know nearly as much about this as I should. There's a lot of contested stuff that happened. It does seem like the city has run a lot of tests since, and this is not a continuing problem, but it might have been happening at the time. Again, I want to quadruple check this before making any big claims. But it's illustrative of the broader point that even if data centers operate as just normal parts of the grid, the normal grid is actually somewhat bad for us right now, and that cost is often shifted to very poor communities who live in undesirable areas with more air pollution as a baseline. The area that the Memphis data center was polluting already had grade F air. It's just an obvious case of how pollution is distributed. And if you look at the forecasts for AI, it's forecasted to use a lot of coal and a lot of natural gas, at least until 2030 or so. There are all these complicating factors, where there are also all these agreements to buy new renewable energy, and there can be huge benefits to renewable energy from economies of scale there. The climate impacts get really uncertain for me, where on net this could actually be a benefit for the climate, both in how it's used and in building out so much new renewable capacity that economies of scale kick in and it becomes cheaper overall. But air pollution I really worry about, as something that communities need to take deathly seriously. That's the environmental impact I'm most worried about. Water just doesn't seem to add up in most places. In most places I've investigated, I just can't find a single place where a data center has impacted water access at all. A lot of people will immediately be up in arms if I say that, because they've seen a lot of scary stories. My claim is that basically every scary story about water access doesn't really add up when you actually look at the numbers, or is related to the construction of a data center rather than its normal operations. So I'm not really worried about water in the short term, but I am pretty worried about the air pollution effects of a lot of additional gas, and especially new coal plants opening up. That seems really bad. So I'm moderately worried about climate impacts, not really worried about water, and there's a huge gap where I'm extremely worried about air pollution, just because air pollution is a very normalized, very bad thing in the country right now.

Nathan Labenz: Yeah, I've learned a little bit over time about air pollution and how bad it is. It seems like it is kind of like lead, in the sense that you want basically as little as you can possibly get.

Andy Masley: Yeah.

Nathan Labenz: My sense about the numbers that you led off with there in terms of like millions of people dying annually from indoor air pollution.

Andy Masley: Yeah, that's different, I realize.

Nathan Labenz: That's like people mostly like heating and cooking their homes with like... wood fires in the home, right?

Andy Masley: Yeah, yeah. In America, I can't get good numbers on how many people are dying from air pollution, but a lot of the estimates are really shocking and range from something like 30,000 to 100,000 people a year, which is more than guns, and in the same range as cars. It seems like it's pretty hard to actually pin down the causality here, and it could be more and it could be less. I get fired up a lot about how I'm excited about Waymo, because 40,000 people die in cars each year, and I'd like to at least automate that away; that would be really amazing. Air pollution specifically just does seem, by any measure, to be killing at least tens of thousands of people every year. And just thinking about whether building these new data centers could contribute to additional deaths from air pollution, that is something I would really want communities to be really careful with. That's the main reason I'm not really gung-ho about building these really massive data centers in and around vulnerable communities specifically. Yeah, a lot to say about that. And I'm pretty agnostic about the specifics, where I haven't done the deep dive on exactly how bad it is to have a gas turbine a mile from you versus a coal plant ten miles away or something like that. But that's the main thing I would be concerned about. And in some ways, I get quite frustrated by the water conversation for distracting so much from what I think is a much more serious issue.

Nathan Labenz: How much do we know about how much of that is a sort of threshold effect problem versus... Do you have any advice for everyday listeners who are like, geez, I hadn't thought about air pollution? Should they go buy HEPA filters for their own homes? If you're just living in a place where you don't smell gas, should you still be taking action on the margin to clean your own air more?

Andy Masley: I haven't really thought about this enough to comment. I know I've read scary stories about how, back when toll booths were common, there was a disproportionate amount of surprising negative health impacts for the people in toll booths standing next to idling cars so much. And so I sometimes worry about just being in a city with a lot of traffic, where maybe that on its own is pretty bad. But this is actually a place I haven't really done a deep dive, so I don't know the exact numbers. Basically, just prioritizing the general quality of the air where you live probably matters a lot for your health. Besides that, it's not really something I know very much about, honestly, so I don't really want to comment too much.

Nathan Labenz: Yeah. One thing I think we'll probably never go back on is having upgraded to the highest quality filters that you can put into your furnace. We have just a central forced-air furnace in the house, and those things do come out amazingly dirty after the eight weeks or whatever that they're in there. So right off the top, you can see there's some benefit to it, and I think we'll stay top of the line on that. On the HEPA filters, it's interesting, because they also bring a noise pollution element, and, no pun intended, there's some increasing noise being made about the evils of noise pollution too. So I'm not sure how to think about trading those off. But at a minimum, the furnace filters seem like a no-brainer, and then maybe having a couple of HEPA filters around one's home is the kind of thing that would be a good precautionary investment in one's long-term health.

Andy Masley: Yeah, yeah, yeah. It's super interesting. And it's kind of fun. It's just evidence of how actually interesting the AI environment conversation can be where it's like, oh, by the way, HEPA filters. It just bleeds into so many other things that can be pretty interesting. What was I going to say? Do you also worry a lot in these conversations that... in purely focusing on the negatives, and I'm not really trying to bracket off air pollution here, air pollution is uniquely bad, but assuming that a data center contributes minimal air pollution, and we think instead about noise pollution, I do worry that by the time we get down to noise pollution, a lot of the positives of the data center might not be being considered, where I kind of think about a data center as a new, huge taxable industry in an area. And a lot of people will talk about how, oh, so many parts of America are in decline and the people are having so much trouble because industry left. And a lot of those industries were also sometimes noisy. If there was a factory near you, you would probably sometimes hear noise pollution and stuff. And so now, when new data centers are being built, they just often contribute so much to the local tax base and can sometimes help the economy in other ways too. I don't want to say they're always good, but I do sometimes worry that people will be like, oh, maybe they contribute millions of dollars a year to the town, but oh, they make some noise. This would have prevented us from building factories back in the day and the town would have never got off its feet in the first place. I'm very happy to go into the nitty gritty of what about land use, what about noise, what about these other weird aspects of data centers. But I do worry that at that point we're starting to lose the forest for the trees and stuff like that and not consider, oh, yeah, there's a trade-off to be made here. This introduces some new noise pollution and you get a million dollars a year in tax revenue for your town or something like that. Is the trade-off worth it? I would say so in a lot of cases, obviously with the consent of the people around it. But yeah, definitely important to consider the positives as well.

Nathan Labenz: Another thing that makes noise is neighbors. I've got neighbors that run their lawn mowers during the day and have a leaf blower or whatever. And nothing's quieter than a ghost town, so in the limit, that argument doesn't really work. How about the energy cost to the consumer? One of the big complaints we're hearing now is that this is going to jack up the cost of everybody's electricity, and that's not fair, and I think there's at least something to be said for that. So I guess maybe a two-part question: how big of a deal is that, and what have we seen so far? And then maybe you could play policy advisor. It seems like at the macro level of policy, your mainline recommendation is: do the AI build out, it'll probably have enough offset, it'll probably work. But then there's the local policy level: okay, what if you were advising the Memphis city council, or whoever, who have more particular concerns around these water, land, air pollution, and noise pollution sorts of issues? So how much of an issue have we seen in electricity bills? And how would you advise the local decision makers? Everybody wants to get these industry investments, and the companies are often pretty good at playing jurisdictions off against one another. Where would you say, okay, hold the line here for your people to make sure you're getting a good deal, versus what can you accept, where maybe you give some ground because, again, it might just be worth it?

Andy Masley: Yeah, going into the electricity thing for a second: again, the Lawrence Berkeley National Laboratory are some of my favorite sources on data center stuff; they just produce a lot of really good work. They had produced a general report at least very strongly implying that almost all of the recent increases in electricity costs, at the average national level anyway, seem to be due mostly to inflation, downstream effects of inflation, and a few other bumps. There was a point where the war in Ukraine was raising gas prices for a little while, and things like that. I can't really figure out exactly what's going on, but the general takeaway seems to be that a lot of the increase in average electricity prices at the national level in the past five years has been caused by issues with supply rather than demand. And so even though we've added a lot of additional energy demand, data centers don't seem to have contributed to that rise. There's this common statistic going around right now that since 2020, the average American electricity bill has risen by 35%, and from what I can tell, data centers don't seem to have contributed at all to that 35% rise. A huge amount of it is just inflation, and when you factor out inflation, the rest seems to be a combination of a few weird issues with supply. I'm deferring here to the report itself; the report is making these claims, and I'm just trying to summarize it and hopefully doing a good job. So at the national level, it doesn't seem to be an issue. There have definitely been local places where utilities have come out and said, we've had to raise electricity rates because of the data center. I'm not aware that they've done that with water yet; I could be wrong. But with electricity, that's definitely happened. There's a whole separate weird debate to be had about this, where maybe we need to pressure utilities to do more buildouts of infrastructure anyway for the green energy transition, and so maybe we have to pay higher electricity bills for that anyway, because electrifying everything is going to require huge amounts of new green energy, but also maybe data centers eat up too much of that. There are so many unknown unknowns here that it's very hard to say too much, and I don't feel fully equipped to diagnose it. But basically, I think there are definitely places where data centers have affected electricity prices so far. I think those stories are pretty overblown, in that most of the price rises have come from these other things, but this could all change in the future, just because data centers are projected to use such a massive amount of new energy in America. One other complicating factor is that, from what I can tell, American total electricity use over the next five years will definitely rise significantly, primarily driven by data centers, plus a few other things like the electrification of cars. But this rise will actually be slower than it has been in the past. We were in this really weird state from about 2008 until 2021 or so where electricity consumption in America was mostly staying flat, because we both optimized a lot of things and also shifted more to a commercial economy. We lost industry and, as a result, used less total electricity. So total American electricity use is rising for the first time since the financial crisis, basically.
This is a weird new situation for us to be in in the short term, but in the long term, electricity use is actually projected to rise more slowly than it did historically from 1950 to 2000 or so. I think we can manage that, honestly. The big difference is that now we really don't want to be building a ton of new fossil fuel capacity, which wasn't really a consideration in the 1960s. It's going to be harder to do this build out with green energy, but not impossible. As that rise happens, it seems manageable to keep bills low, but data centers are also so much more concentrated compared to other sources of demand that I have to throw my hands up and say I don't really know. There are a lot of people smarter than me who have looked into this a lot more and who are pretty concerned about future electricity bills rising a lot. One very last note: there's this funny misconception a lot of people have that if demand for electricity goes up in your region, your electricity costs stay higher forever. That's not actually the case at all, because that would imply that very large cities would always have higher electricity bills than very rural areas. What's actually happening is that there are a lot of economies of scale, where as more demand comes online there's a temporary increase in cost, but over time that can level out. But another weird thing that's happening is that data center demand might just keep going up and up and not give grids enough time to catch up, so we might be facing very long periods of higher electricity bills to fund the continuous build out to support data centers. It's really fascinating; how the American energy grid is actually going to deal with this is one of the most interesting topics I've ever scraped up against. I'm left, first of all, believing that people actually have a surprising lack of understanding of what causes electricity bills to rise and fall; even experts disagree about this a lot. And the future is just so wildly uncertain. I will say that I'm very agnostic about whether we should do the AI build out in general, because I'm excited about AI, but I want to maximize the benefits and minimize the risks, and I'm totally open to there being a huge amount of risk involved in the uses of AI. So I just want to flag that. Finally, one last thing: at the community level, I actually don't feel like I have enough knowledge to give specific communities advice, beyond: do a full, thorough analysis of the pluses and minuses. Don't just look at the fact that the data center uses water at all. Also consider the tax revenue and the utility revenue; there are some cases where using water is actually good, because it gives the utility more revenue to upgrade aging pipes and things like that. And really push for renewable energy if you can, and build out battery storage, and maybe pressure the utility to build out a lot of high transmission capacity anyway. But this is all very hand-wavy. I'm totally just some guy on this, so I don't want to speak more confidently than I should on this one thing.

Nathan Labenz: So, to try to bottom-line a couple of things there. One is, if there's a concern about the macro AI build out, it's the macro issue of controlling the AIs long term, not the marginal emissions. Locally, there can be a lot of issues, but it seems like maybe what we need for the general protection of the population is some sort of federal standard, not to invoke the impossible. We saw this with Amazon and its second headquarters famously, and even with pro sports teams: build us a stadium or we'll go to some other city that will. We do have a problematic dynamic where companies have a direct incentive, and often do choose, to locate themselves in the jurisdiction that will give them the most breaks and the fewest obligations. So it does seem like we are probably headed for at least some instances of local governments making deals with companies where they effectively race to the bottom in terms of what protections they insist on for their local populations, ultimately leading to what might not be a great deal for those populations. The real solution to that seems to be some sort of cooperative equilibrium, or, if not an equilibrium, some higher level rule to insist that certain things get paid back into the community or get remediated, so that whoever's living in the area isn't breathing bad air. It seems like there's enough surplus to go around for that, for sure; it's just a question of whether it actually happens. And I don't mean to dismiss this, because this is always the case, right? The same was said about the China shock in the first place, which left a lot of towns that are now looking for new industry in a bad place. Sure, aggregate welfare goes up, and you can always redistribute, but is that effectively going to happen? It hasn't always; some might say it hasn't in most cases. So we face that same challenge again. But it seems like the bottom line on all these resource dimensions, and even the pollution externality dimensions, is that the surplus is definitely there. If we want one big takeaway, it's: let's get our act together this time and actually make sure we insist on reasonable standards, or get the right compensation for people that are directly locally affected. And if we do that, we should be fine. It seems like that still shouldn't be so much that it would cede the AI race to China, or prevent us from being able to use all the chatbots and AI agents and, for that matter, image and video generation models that we could possibly want. But the local political economy of this can be problematic if we're not careful about it.

Andy Masley: Yeah, it's funny. This is something else I worry doesn't always come through, where I feel like I'm often writing in response to the most extreme takes about data centers, where people will write implying that it is literally always bad to use water on data centers. I'm like, well, here are all these trade-offs. In a lot of America, it is good to use more water, because economies of scale happen and the main access problem is aging infrastructure, not raw access to fresh water. But in doing that, I often don't also stake out that, yeah, I'd like a lot more rigorous environmental regulation on this stuff, with the goal of doing a green energy build out. There's another nuance you can get into, where certain environmental regulations can actually stifle the types of builds that we need for green energy, like high-transmission, long-distance power, but I won't get into that here. And the race to the bottom thing is interesting too, where it's definitely a huge problem. You read about how people are delaying taxes on data centers for longer and longer periods of time, among other things. And people will talk about this, but then in the same breath say, oh, and there's no benefit to communities anyway. And my question is always: well, if there's a race to the bottom, why are all these city governments racing in the first place? A big part of it is that there are some tangible benefits to having a data center. All else equal, I grew up pretty well off personally, but I'm from a not super well off area that is in some ways left behind by industry. I think I would be actively excited if a medium-sized data center were built in the vicinity that the town could then tax and use for stuff. I think that would be net good, personally. But there are a lot of other places where I would probably be much more hesitant, especially about a super large data center that was going to build a huge amount of gas turbines everywhere. I'd be personally pretty excited about a lot of very specific, targeted regulation that prevented air from being polluted in significant ways, but didn't prevent the kinds of grid upgrades that we need for the green energy transition anyway. Again, I'm kind of a hobbyist here, so I'm not speaking from a position of authority, but my general ask for most people considering this is that even in race to the bottom dynamics, it's important to consider what the actual trade-offs are and whether it's worth it. Sometimes the answer is definitely no; I don't think data centers should be built everywhere. But it's important to consider the positives and the negatives, as opposed to just deciding that, because you don't see any value in AI, it shouldn't be able to use any of the community's water, because there are a lot of instances where even if the thing inside the data center is useless, it can actually have positive effects on the local community.

Nathan Labenz: Are there any rules of thumb that you could venture around who should not build a data center? I mean, one candidate would be like, if water's already really scarce in your area, maybe you're not a good candidate for a data center.

Andy Masley: I have a really crazy take on the water stuff, actually, which is that I think data centers are actually quite good candidates for very water scarce areas, like Maricopa County in Arizona. A lot of new data centers are being built around the Phoenix area, and you might think this is ridiculous: Phoenix is a city built in a desert. But there are a lot of other industries and commercial operations that use a lot more water than data centers do. The golf courses around Phoenix, or maybe the golf courses in the state of Arizona overall, use something like 30 times as much water as all the data centers in Arizona, from what I can tell, and that on its own is pretty crazy to me. I don't know if you've ever been to that area specifically, but it's kind of eerie to stumble on this sickly sweet green in the middle of a desert. And if these industries are using water, the difference between them and data centers is that data centers are often generating way more tax revenue per gallon of water used. Best I can tell right now, if you add up the total tax revenue and water used from both golf and data centers in Arizona, data centers are generating something like 50 times as much revenue per gallon used. So thinking about it that way: I wouldn't support additional water being used in Arizona, but I would definitely support less effective industries being swapped out for data centers. If I could close all the golf courses in Arizona and replace them with data centers, that would generate billions of dollars in revenue for the state without increasing water use at all. And so because data centers generate so much revenue per gallon used, at least right now, they actually seem like surprisingly good candidates for the desert. A lot to say about that, and I definitely don't want to increase water use there more than it already is. Besides that, the main rule of thumb is that it seems really preferable not to build data centers where you would have to open new coal plants, or keep coal plants running, to operate them. Coal is just so bad for people's health and for the climate that avoiding that seems good. Beyond that, though, I don't have any clear rules of thumb. That's not to say I wouldn't object in more specific cases; it's kind of hard to think about on a case-by-case basis, and I haven't really developed an overarching strategy. And it seems a little bit goofy, because all this started with me wanting to win arguments at parties about using chatbots, and I've escalated to, here's my complete plan for the data center build out. At some point I have to step back and say, I don't know. I kind of trust people who've been thinking about the very local ecologies of these regions and the local economics to know enough to make good decisions here. In general, I'm somewhat deferential to local experts.
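To make the swap argument concrete, here is a tiny sketch of how the two rough ratios combine. The 30x water figure and the 50x revenue-per-gallon figure are Andy's rough estimates; the units below are arbitrary, and only the ratios matter.

```python
# Illustrating the golf-vs-data-center swap with arbitrary units; only the
# ratios (golf ~30x the water, data centers ~50x the revenue per gallon) are
# taken from the conversation, and both are rough estimates.

dc_water = 1.0                       # data center water use (arbitrary units)
golf_water = 30.0 * dc_water         # golf courses use ~30x the water
revenue_per_gallon_ratio = 50.0      # data centers earn ~50x the revenue per gallon

golf_revenue = golf_water * 1.0                          # golf revenue per unit of water = 1
swapped_revenue = golf_water * revenue_per_gallon_ratio  # same water, data center revenue rate

print(f"Same water, swapped to data centers: ~{swapped_revenue / golf_revenue:.0f}x the revenue")
```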

Nathan Labenz: Yeah, well, I would say the world is grappling with this in real time, for one thing. One of my big themes across the overall journey of making this show is that AI is intersecting with everything, and we need people to step into these niches, whether they started in a particular niche or are jumping into it based on a motivation even as pedestrian as being annoyed by stupid stuff other people are saying. It is important that people take ownership of figuring out different corners of this whole phenomenon, so I applaud you for doing that. And I would say, obviously, more expertise and more local knowledge is always good, but we are moving very quickly into this, and I think you've definitely done the overall discourse a service by piping up many times in many places with more grounded analysis than we're otherwise typically getting. And then, on the local stuff, I'm glad we got to golf, because that was another one I was really marveling at. I was looking at LA area golf courses, and again, these are giant numbers. One thing I did want to double click on for a second is: are we using the same meaning of "use" when we talk about using water? I've seen that an LA golf course uses something like 90 million gallons of water a year, or that LA area golf courses, plural, are estimated at something like 1.6 billion gallons per year, which is a ridiculous number and dwarfs what it took to train GPT-3 and things like this. But are we talking about the same use? Because that water sort of seeps back into the ground, right? Or is it the same? I can't tell. Is that the same or different?

Andy Masley: This is where I start to scrape up against some issues here. Yeah, I have to admit I'm not an expert on this, so I don't want to comment too closely. I mostly do try to stick to places where water is being actively chosen to be delivered by people. So I try not to count rainwater, as an example. If crops are mainly reliant on rainwater that would otherwise just be falling into the ground, you can say that crops are using that, but it's different. And so at least for agriculture, I mainly focus on irrigated agriculture specifically. And I have to admit, I don't know too much about the water dynamics of how this eventually flows back, so I would have to circle back on this. This is an autodidact curse that you're bumping up against, where I know a lot about the specific water that is being delivered to these places, but whether this counts ultimately as consumption or withdrawal is a little bit beyond me, and what the difference is and how big of a deal it is. One other important difference here is that the water used in data centers themselves, like I said, is potable, whereas a lot of water delivered to crops or golf courses is just fresh water, though sometimes it's potable as well. So that's an important difference to keep in mind. But it just doesn't seem too costly to turn fresh water potable, based on my understanding; there's an initial upfront infrastructure cost of a water treatment plant. But yeah, a lot of unknowns there. I wish I had a better answer; that was a good question. I think in general, the singular, really weird way that we use water that I would like more people to know about is irrigated alfalfa farms, which, again, I'm not entirely sure how this eventually flows back into the system, but they seem to use something like a thousand times as much water as all of AI used in 2023. And there are a lot of irrigated alfalfa farms in Arizona and other places too, where these are also just blooming in deserts. And a lot of alfalfa is used to feed animals, which we then eat. So I'm biased here because I'm against animal agriculture for animal welfare reasons. But this is another place where, if we're going to focus on one water bad guy in the country, I would like to start with the bizarre amount of irrigated alfalfa we have, and all the corn that is eventually converted into ethanol, where maybe we could use something else instead. Strong recommendation: Hank Green recently did a big video on data center and AI water stuff that I thought was just masterful. I definitely disagreed with some of the ways that he hedged and some of the specific things he said, but for the most part it was one of the best single pieces of media I've seen about it anyway. He talks a lot about corn there, how goofy it is that we use so much of it, and how much water we use on our lawns every year. And he just puts things into a lot of really useful context. So if people are looking for more of a deep dive on the water stuff, that's probably the best single resource to start with. So yeah, definitely start there.

Nathan Labenz: Yeah, cool. Just to fill in a little bit of the detail on food, since you'd mentioned animal agriculture: I had GPT in the background answer the question, how much energy and how many emissions go into the production of a hamburger? And the answer comes back at a few kilowatt-hours of primary energy and a few kilograms of CO2 equivalent. So if we just, again, do a quick energy-to-energy conversion, let's say it's 2 kilowatt-hours; that would be running your microwave for 2 hours. With 3,600 seconds in an hour, you're talking 7,200 seconds in two hours. And we already previously established that one chatbot prompt is worth about a second of microwave use. So we get, again, to a sort of comical ratio of something like 5,000 to 10,000 chatbot queries for one hamburger. And by the way, the exact same number came back for the question of how much for a hot shower. Again, 5,000 to 10,000 ChatGPT queries for one hot shower.
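The arithmetic here can be made explicit with a small sketch, under the assumptions stated in the conversation: a microwave drawing roughly 1 kW, one chatbot prompt costing about one microwave-second of energy, and a hamburger at roughly 2 kWh of primary energy.

```python
# Back-of-envelope version of the hamburger comparison, assuming a ~1 kW
# microwave and the "one chatbot prompt ~ one microwave-second" heuristic
# established earlier in the conversation.
MICROWAVE_KW = 1.0                            # assumed microwave power draw
WH_PER_PROMPT = MICROWAVE_KW * 1000 / 3600    # ~0.28 Wh per prompt (one microwave-second)

burger_kwh = 2.0                              # "a few kWh" of primary energy, rounded
burger_wh = burger_kwh * 1000

microwave_seconds = burger_wh / (MICROWAVE_KW * 1000) * 3600  # 7,200 s, i.e. two hours
prompts_per_burger = burger_wh / WH_PER_PROMPT                # ~7,200 prompts

print(f"one burger ~ {microwave_seconds:,.0f} seconds of microwave time")
print(f"one burger ~ {prompts_per_burger:,.0f} chatbot prompts")
```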

Andy Masley: I've totally forgotten if I've already shared this story, but a friend was in a pizza place a few weeks ago and overheard two teenagers talking, and one of them said, oh, this mutual friend of ours uses ChatGPT, so it's ridiculous that she calls herself an environmentalist. And then she immediately went and ordered a meat lover's pizza, specifically. And this kind of just sums up a lot of the debate here for me, where I worry that in hyper-focusing so much on this, people are really losing sight of where the big environmental bad guys are. They're not where you'd expect. A lot of them in your personal life are probably going to involve very different things, like maybe the food that you eat. But even there, I'm mostly not eating animal products for animal welfare reasons. And I have to admit that if someone's eating a chicken sandwich around me, I actually think it uses more resources than some of the things that I eat. But even there, I actually don't want to litigate the small amounts of environmental harm. Usually, if I wanted to talk to them about it, I'd be like, oh, that was a little guy who had a really bad time in his life, and try to talk more about the animal welfare thing. And I think saying, oh, and it uses this many tofu slabs' worth of energy or something, would actually dilute the point that I'm making. I think one other theme of a lot of my writing is that I'm quite concerned about most other aspects of AI. I love it and fear it. Current chatbots are great. I'm very worried about the consequences of even very near-term AI, and I'm pretty worried that a lot of my fellow AI critics and worriers are potentially diluting a lot of the points that they make by just adding on: oh, this thing could make surveillance so much more effective, and every time it does that, it uses a drop of water. Isn't that terrible? The second point just takes away from the first in a way that I think a lot of people are missing. My background motivation here isn't to say, oh, everything's fine, the data center buildout is fine, you don't have anything to worry about. It's more to say that I'd really like us to focus on the actual things and not just throw everything we have at AI. Because I think people who are listening to us notice very fast when you're just reaching for any tool you can to attack something, as opposed to actually building up legitimate criticisms that you're really worried about.

Nathan Labenz: Yeah. Another way to put that, I guess, is in terms of contribution to overall pDoom, the climate change contribution of AI is very minimal.

Andy Masley: Yeah, yeah. And potentially even positive. Again, the International Energy Agency projects that by 2035, all AI applications together might be preventing something like four times as many emissions as all data centers cause. And that's wildly uncertain for a lot of different reasons. Another weird thing about that is that it will mostly rely on deep learning models that aren't the big chatbots being powered in these huge data centers. So there's a lot of hand-wavy stuff to say about that. But I think the existence of AI in the short term reduces my odds of severe climate change by a little bit. It also increases my worry in a lot of other ways, and there's other environmental stuff, like air pollution, that I do take seriously.

Nathan Labenz: Yeah, I'm really glad you mentioned that point about non-LLMs, or non-general-purpose models, driving a lot of value. I think that's something the world would do well to keep in mind in general. But specifically on this resources point, it's always really striking to me how small some of the models that are driving breakthrough contributions to various fields of science actually are. For one, I did an episode not long ago with Professor Jim Collins, who has discovered new antibiotics with models that are, I forget exactly how small, but I think we're talking millions of parameters and tens of thousands of data points, like super small stuff that they can run on a couple of computers in a couple of days' time. And you get new antibiotics out of it. It's like, okay, this is a big deal. Similar things probably are going to start to happen with materials science, with just all sorts of optimizations. One thing we had touched on a little bit in preparing for this is optimizing energy use in buildings. There's so much waste all over the place that you could just get smarter about some of these core operations and get a ton of value from there. But then when you get into materials science, you really can move the fundamentals of energy in a significant way. And literally one materials science breakthrough could very plausibly offset all the inference that the world is going to use.

Andy Masley: Yeah, exactly. Especially materials science for batteries and solar panels; there have already been some really exciting photovoltaic improvements enabled in part by AI. And the scale of that on its own could start to dwarf a lot of what happens in data centers specifically. It's just another obvious example of how computing is so wildly efficient per task; it's basically comically optimized. I ran some kind of goofy back-of-the-envelope calculation where I'm pretty sure that to run a game of Minecraft in 1950 would have required half the energy of the United States, and now it can run on your computer. I think people really underestimate how much energy use has been optimized in computers in the last 70 years or so. And because it's so wildly optimized, what you're getting out of it is going to have so much more effect on the world than the actual energy that went into it. The big holdup to this case is that a lot of people believe that chatbots are still completely useless and are going to be the main things in data centers. So if you're still in that camp and you don't believe chatbots will ever have any value, then my case becomes more tenuous. I have to put my chips in and be like, man, chatbots are a miracle to me. I'm sure they're adding quite a bit of value and changing a lot of my behavior outside of the computer itself. I think that for anyone who is very worried about the environment and AI, all the game is going to be in the way that AI is used, not necessarily in the data centers themselves, though you should worry about specific consequences of data centers. But framing the debate around the energy used in data centers does feel disturbingly similar to me to seeing Amazon take off and thinking, oh, I should mainly worry about how much energy the website is using.

Nathan Labenz: Yeah, maybe one kind of final point of comparison, and then I'll invite your closing thoughts, but it does touch on this question of how AI is going to be used. Obviously there are some confidence bars around these estimates, but again, 80 gigawatts seems to be the sort of grand vision, with the Stargate project being five gigawatts of that, and, you know, imagining the full $7 trillion buildout, whatever, that's kind of come in at 80 gigawatts in the analyses and estimates that I've run. Compare that to total global energy use today, which is estimated at 6,000 gigawatts. That kind of backs out: when we talked about one gigawatt being a million people, then a thousand gigawatts could be like a billion people at US energy levels, and six thousand gigawatts would be like 6 billion people. There's the electricity-to-non-electricity conversion factor to keep in mind there, but roughly speaking, that seems like a reasonable order-of-magnitude estimate. So we're talking basically about a little bit more than a 1% increase. And another thing to keep in mind is that the overall increase is almost certainly going to be dominated by the increase in energy use in the developing world, by the just general background process of people getting wealthier and being able to afford more energy. And that's something I certainly do not begrudge the global poor: their marginal energy use over the coming decades. Maybe some do, but I certainly want to go on record saying I think it's good when people can do more stuff and have better lives. So I don't know if there is any downside scenario that you think would be worth worrying about in terms of how it's going to be used. We've kind of focused on the rosy side. I mean, one other short version, super compressed, would be: if AI makes everyone richer and then we can all afford more energy, maybe there's just a lot more energy used broadly. But is there anything more specific that you think would be the kind of surprising, possibly negative story in the macro sense?
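As a quick sanity check on the scale comparison, here's a minimal sketch using the figures from the conversation: an 80 GW AI buildout against roughly 6,000 GW of total global energy use expressed as continuous average power, plus the one-gigawatt-per-million-people heuristic mentioned earlier.

```python
# Rough scale check: the estimated AI data center buildout against total
# global energy use, both expressed as continuous average power in gigawatts.
ai_buildout_gw = 80.0       # estimated AI buildout discussed above
global_energy_gw = 6000.0   # total global energy use, all sources, as average power

increase_pct = ai_buildout_gw / global_energy_gw * 100
print(f"AI buildout ~ {increase_pct:.1f}% of current global energy use")  # ~1.3%

# The "1 GW ~ 1 million people at US energy levels" heuristic from earlier
# in the conversation implies the buildout is comparable to ~80 million people.
print(f"~ energy use of {ai_buildout_gw:.0f} million people at US levels")
```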

Andy Masley: I mean, I can get into really goofy sci-fi scenarios where it's like, oh, if there's some global war over AGI, which I don't actually put super strong probability on, but very powerful AI systems in the future maybe destabilize things, and nuclear war is bad for the environment, so that's something. At the edge cases, it just gets really ridiculous, so I don't want to go quite that far. I'll bracket that for now and just say that AI can probably reduce a lot of emissions, but it does seem like there are a lot of potential ways it could really radically increase how much energy we use as well, which might just make the green energy transition harder. Super-optimized shopping apps might just make it much more tempting to buy more stuff. Or making manufacturing more efficient in some ways might make us use a lot more energy in total without transitioning to green energy. I also want to make it clear that I'm coming from a very specific perspective: it is definitely important to get the poorest people in the world a lot more access to energy, basically by any means. And when I talk about cuts to emissions, I'm mainly talking about people like us, in comparatively very wealthy communities globally. I'm very much not a degrowth person, as an example. Yeah, a lot to say about that. Even just self-driving cars: I could see a world where that just makes driving much more tempting, because suddenly a five-year-old kid can take a self-driving car to their friend's house if the parents okayed it, or something like that. And that opens up so much additional travel that, on net, it creates a lot more energy use. And it's a funny world, where the value we're getting per unit of energy will probably be way higher because we'll have all these nifty gadgets and stuff, but if that doesn't come with the green energy transition specifically, we'll still be emitting more. And ultimately the climate only cares about the total amount of CO2 that's in the air. So I don't know, this all just feels so speculative and weird, where I'm like, oh yeah, AI could radically reduce car trips, but maybe increase them; or decrease building energy use, but increase it elsewhere. Or materials science for green energy is good, but certain scientific discoveries for the military might make global war more likely. It starts to get really hand-wavy really fast, basically. There are a lot of really cool papers on this; I'm pretty sure it was the London School of Economics that recently published one that was really good. Yeah, there's a lot of cool speculative stuff about this, but it's so uncertain that if environmentalists are going to really focus on this, I'd really want them focused there, as soldiers in the battle to make sure that the way AI is used is good for the environment. Data centers, definitely worry about them, but don't treat them as the big central part of the story.

Nathan Labenz: I think that's probably a good place to leave it unless you want to bottom line it any other way for us.

Andy Masley: Yeah, basically, that was super comprehensive. Thank you. That was a blast; I just enjoyed the conversation a lot. Again, just a big fan of the show, and super grateful to make it on.

Nathan Labenz: Love it. Andy Masley, thank you for being part of the Cognitive Revolution.

