The energy and/or water use comes up from time to time so I did a little digging:
The average ChatGPT query uses 0.34 Wh (1.2 kJ, 0.3 kcal) and 0.32 ml of water.
This means if you ask it one question every two minutes, you're using about as much energy as a 10W LED lightbulb.
If you use it 5000 times, you'll use about as much energy as a medium-sized pizza contains, and about as much water as you'd need to wash your hands after that pizza.
Let's do this in reverse. xAI is planning to buy an additional million Blackwell GPUs, right? That's more compute than we'd all need to ask a question every minute or two.
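For the curious, the arithmetic behind those claims can be sanity-checked in a few lines of Python. The 0.34 Wh and 0.32 ml per-query figures are taken from Altman's post; everything else is plain unit conversion:

```python
# Per-query figures quoted from Altman's blog post.
WH_PER_QUERY = 0.34        # watt-hours of electricity
ML_WATER_PER_QUERY = 0.32  # millilitres of water

# One question every two minutes = 30 queries per hour.
avg_power_w = WH_PER_QUERY * 30            # ~10.2 W, about a 10 W LED bulb

# Unit conversions: 1 Wh = 3.6 kJ, 1 kcal = 4.184 kJ.
kj_per_query = WH_PER_QUERY * 3.6          # ~1.22 kJ
kcal_per_query = kj_per_query / 4.184      # ~0.29 kcal

# Over 5000 queries:
total_kcal = kcal_per_query * 5000         # ~1460 kcal, roughly a medium pizza
total_water_l = ML_WATER_PER_QUERY * 5000 / 1000  # 1.6 litres

print(f"{avg_power_w:.1f} W, {total_kcal:.0f} kcal, {total_water_l:.1f} L")
```

So the lightbulb and pizza comparisons hold up, given the input figures.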
The persistent popularity of discussing how AI is supposedly bad for the environment actually gives me the opposite impression.
Ever since reading Ryan Holiday's book, Trust Me, I'm Lying: Confessions of a Media Manipulator, popular controversies like this always make me wonder who benefits from their continued discussion.
Every time someone talks about how AI is using energy, that's free, viral publicity advertising AI, keeping us obsessed with talking about it for one reason or another. The controversy makes it stickier in our brains. It makes it feel relevant even if you aren't using the product yet.
And yes, when done correctly, a leading figure in AI acknowledging the allegation and responding to it helps to further our obsession with the idea. The reassurance might not be to assuage, but rather to keep us engaged.
AI is polarizing. People who don't like it really don't like it. This wouldn't be the first anti-AI argument that was more about sentiment than actual analysis. Again, I think the climate impact argument here is basically imported from the argument against crypto (where I enthusiastically join the skeptics) and doesn't apply here. Either way, my point is, just go look at different sources. Simon Willison has linked to some on his blog.
Unless we crack efficient rainwater harvesting, it's not. Also, clean water that goes up into the air and falls back down with other pollutants makes our food worse.
I'm very sympathetic to those who want to focus on the climate disaster, but I'm not convinced that genAI's carbon pollution is a major problem. Maybe I've missed a memo, but the biggest polluters and criminals in the climate story are still by far the fossil fuel industry (e.g. Shell, Exxon) and the agricultural industry (e.g. Cargill).
At some point I was even hearing the claim that digitization (e.g. GenAI) was finally divorcing the tight connection between economic growth and resource extraction. I'd bet it's incorrect, but it's much less fanciful than thinking that growth in oil or beef would help us grow without strip mining the earth.
Bray's first issue, the influence of GenAI on labour's (and also the democratic people's) decreasing power versus capital, is much more important and interesting.
> the biggest polluters and criminals in the climate story are still by far the fossil fuel industry (eg. Shell, Exxon)
What does this actually mean? It feels like when people say this it implies that gas companies just burn gas randomly for no reason. They sell their gas to everyday consumers, actual people. Why is this Shell/Exxon's fault and not consumers?
No, but that doesn't change the fact that people in general dislike the idea of lowering their standard of living for environmental reasons.
Nobody pollutes in a vacuum, and the truth is that our emissions are the collective responsibility of all of us and not just 10 large companies or whatever the Reddit line is these days.
We need to rewire the economy in a zillion different ways. Nobody should be using fossil fuels if they can avoid it, but so many cannot at present (e.g. air travel, steel mills). Shell and Exxon aren't forcing people to burn their product. If they shut down tomorrow, millions worldwide would starve. We'd better keep them going as long as we need them. Edit: I work in oilfield services and am probably a little biased.
I don't think the oil industry should shut down tomorrow. Because you're right that it would be a disaster and hundreds of millions would starve and die.
But I also don't think the oil industry should put up any barriers to renewable energy. By doing that they force us to need oil. They should accept that they need to wind down.
Employees should be provided good exits from the industry.
> Shell and Exxon aren't forcing people to burn their product.
Is this a joke? They are lobbying against the subsidy of alternatives that are better for society. The repeal of subsidies was just successfully achieved in the OBBB. Yet, they continue to get subsidies themselves. Have you noticed the special exemption for oil and gas extraction on your tax forms? It’s in your face.
So revoking a subsidy is forcing people to act a certain way?
Dude, we burned plenty of oil even when we had the subsidies. Big automakers were on board with the idea, energy companies were pivoting to batteries and renewables - and the average person still cared more about their costs and standard of living than whether things were actually green or not.
Fossil thermal plants (coal in particular) that were planned for shutdown are being kept online or restarted because of AI energy use. Tech had a pretty minor environmental footprint until now, but it's growing rapidly due to AI, for use cases that are clearly not vital and in good part garbage.
> At some point I was even hearing the claim that digitization (e.g. GenAI) was finally divorcing the tight connection between economic growth and resource extraction
Kind of? Mostly it's a result of renewables. The US basically doesn't build new fossil fuel power any more. Not every country is on exactly the same point on the curve but they're all approximately following the same curve. Energy use has also decoupled which I think is what you're referring to, but I'm not sure the cause neatly decouples.
> Then there’s the other thing that nobody talks about, the massive greenhouse-gas load that all those data centers are going to be pumping out.
Arguably, the GPU emissions are lower than the carbon footprint of the human workers they seek to replace.
Before you downvote: yes, we can't just "deprovision" humans, but unless you think reproduction is immoral because of its environmental impact (an extreme position), it has to be possible to increase total industrial production and negate global warming simultaneously. It's a big equation. The hard part seems to be coordinating anything whatsoever as a species.
I don't think we even approach this science-fiction analysis of the carbon footprint of human workers vs. LLMs, because LLM inference apparently has a carbon impact in the vicinity of, like, a couple Google searches. The environmental concerns over LLMs seem mostly to be ported in from cryptocurrency (where they were a very real concern, because crypto put a serious cash value on energy arbitrage).
I do agree. I think it's the most important point: while yes, people rightfully point out that past automation, in the long term and averaged across everyone, probably did have a positive impact on income, that's not what is prompting the current investments. The investments only make sense in the context of reduced payroll, and clearly are a bet on it.
Reduced payroll for a large portion of the members of an already struggling and shrinking middle class. Thinking that some natural law says the jobs that eventually come back will automatically also be middle class is wishful thinking.
This is what makes me uneasy the most right now: The promise of reduced payroll can only materialize if there are no replacement jobs created by AI - otherwise the labor cost would just shift to a different position. So mass unemployment it is then?
At the same time, the same people are also hard at work dismantling the social security net that makes prolonged unemployment survivable.
I don't know, I think there's a real possibility that this ends up increasing the demand for software engineers rather than decreasing it. The lower barrier to entry will mean lots more projects pursued by lots more people, and they'll eventually need help. And all the things non-tech companies would like to do but can't afford a whole software team for: now they can launch with a couple of engineers. All the stuff that has been on tech companies' backlogs forever, they can start tackling, and then begin the new initiatives they've been dreaming about but were too busy with firedrills and keep-the-lights-on work to pursue.
I think as software gets cheaper to build, there's just going to be a lot more of it, for use cases we aren't even thinking about yet. And the more software there is, the more challenges it will create to manage it. Our jobs will be a lot different, but I think anyone who is laying off humans right now to save a bit of cash is really short-sighted. It's going to be a long time before AI can do everything a software engineer does. (Think about accountants: it seems like they could have been replaced long ago, but it's still a huge industry.) In the time between now and then, software engineers will be the highest-ROI employees there are.
The investment makes sense even if you think there will be replacement jobs created because nobody wants to be left behind.
If you are spending X on payroll to do Y, and your competitor is now spending X on payroll to do both Y and Z, then they can (a) undercut you on selling Y, and (b) sell Z that you can't even compete with.
(Or, even if you aren't ever going to do Z yourself, the replacement jobs could be somewhere else entirely. You actually hope this is the case - if there are no replacement jobs, the market for selling Y might shrink, if everyone has less money.)
It's really short-term thinking. What do those employers think all the software developers are going to do when they get home? They're going to start competing companies.
As an indie game developer, it's mind-boggling that MS can fire 9000 staff, and it makes me wonder how much more flooded the indie game market can get :)
Certainly Gen AI is being marketed to business leaders as being capable of reducing their payroll, but I don't believe that's what it's for as such. And as several comments have mentioned, its energy use is not even especially significant.
Gen AI exists to wrest control of information from the internet into the hands of the few. Once upon a time, the Encyclopaedia was a viable business model. It was destroyed as a business model once the internet grew to the point that a large percentage of the population was able to access it. At that point, information became free, and impossible to control.
Look at google's "AI summaries" that they've inserted at the top of their search results. Often wrong, sometimes stupid, occasionally dangerous - but think about what will happen if and when people divert their attention from "the internet" to the AI summaries of the internet. The internet as we know it, the free repository of humanity's knowledge, will wither and die.
And that is the point. The point is to once again lock up the knowledge in obscure unmodifiable black boxes, because this provides opportunity to charge for access to them. They have literally harvested the world's information, given and created freely by all of us, and are attempting to sell it back to us.
Energy use is a distraction, in terms of why we must fight Gen AI. Energy use will go down, it's an argument easily countered by the Gen AI companies. Fight Gen AI because it is an attempt to steal back what was once the property of all of us. You can't ban it, but you can and absolutely should refuse to use it.
The second-order effects are where the real dangers lie: people will lose the ability to understand their own reality.
You see it on Twitter, where community notes are being replaced by AI. Gullible users ask "@grok, is it true that ...?" People are putting trust where they absolutely shouldn't.
Musk wasn't happy about some facts, so grok changed. "Sorry, I was instructed to.."
These tools are seen by the clueless populace, whether it is your own aunt or some HN'ers, as an objective, factually correct oracle, free of influence.
Then there are lobby groups pushing for AI in the judiciary. Always under the banner of "cost savings". Sure, guess who gets their case being handled better.
A debate about what a healthy society would be, about what people share as a common cause, is as urgent as ever, free of reality distortion from autocratic interest groups and allied talking heads. The AI flood is unstoppable, but with the current culturally engineered crises in many democratic countries, it will most likely result in serious catastrophe.
So, I just want to say that increasing productivity over time should result in net higher prosperity.
There are numerous examples of disruptive technologies that reduce labor costs, and the world has gotten better over time, not become a dystopia.
I'm sure there will be winners and losers and it will take time to adjust, but dramatic increases in productivity will make a better world, because it will take less effort for you to get what you want.
Add to this weakened labor and social programs being treated like a four letter word among most politicians, the media and voters. AI isn't going to make the system change.
I think this is a separate argument as productivity is getting squeezed out by competition and the consumer is benefiting.
I'm not sure if capital owners, management or workers are getting equal slices of the benefits, but I do believe that everyone is benefiting and it doesn't make sense to avoid progress as though increased productivity isn't worthwhile.
> Then there’s the other thing that nobody talks about, the massive greenhouse-gas load that all those data centers are going to be pumping out.
This is discussed ad nauseam, and the carbon accounting is very poorly done.
Looking at the capital and operating expenses of datacenters is the right way to think about it. Nothing about that tells me that AI is environmentally worse than driving a big vanity pickup truck, owning a large house, having lots of offspring, or taking many international flights.
Aren't most jobs bullshit jobs already? Fingers crossed genAI doesn't change this too much.
Low empathy has been an issue with humanity since day 1. I wouldn't even know how to begin to fix it. It'll probably still be an issue long after we're dead. If it really bothers you I recommend meditation/therapy/etc.
Don't expect action on climate change until a few million in western countries are killed. Humans are terrible at slow-moving disasters. My parents both died early from being sedentary, despite my best efforts to get them to work out.
With luck, smarter decision-making with genAI might actually improve some societal systems.
> The real problem · It’s the people who are pushing it. Their business goals are quite likely, as a side-effect, to make the world a worse place
Me, 10 months ago:
--- start quote ---
Someone on Twitter said: "Do not fear AI. Fear the people and companies that run AI".
After all, it's the same industry that came up with pervasive and invasive tracking, automated insurance refusals, automated credit lookups and checks, racial ...ahem... neighbourhood profiling for benefits etc. etc.

https://news.ycombinator.com/item?id=41414873

--- end quote ---
I'm sorry, but removing barriers that stop people from creating their own computer programs, their own anime scenes, their own marimba jazz, or their own live action commercial for their used car is not going to make the world a worse place.
There will be downsides and there will be people negatively affected, but democratizing ability has never been a net negative.
This article read like a conspiracy theory, and it offers no evidence for its position. It's not even really making a case. It's just stating that companies are putting money behind AI so that they can lay off employees and reduce the quality of their products, and by saying that they apparently think they've made an argument.
When I've talked to startup CEOs or execs building AI products at larger companies, their sales pitch is usually some form of:
"Right now X is expensive because you have to hire people to do it. That means that access to X is limited. By using AI, we can provide more access to X".
X could be anything from some aspect of healthcare, to some type of business service, or even something in the arts.
It's very clear that the people building AI companies, and the people investing in those companies, think that there is an enormous market to use AI to automate a wide variety of work that currently requires human labor. They may not be explicitly framing it as "we get to reduce our payroll by 50%" - they may be framing it as "now everyone in the world will get access to X" (often tech executives will use a grand mission to justify horrible things), but the upshot is that companies are 100% putting money behind AI because they believe it will help them get more work out of fewer and fewer people.
I would say "works as designed" - efficiency gains add shareholder value which is what most orgs optimize for. Pitching these makes sense. As usual, the costs of second order effects are externalized.
Sources:
Energy and water use: https://blog.samaltman.com/the-gentle-singularity
Typical water pipe flow (with perlator/aerator), 3.5 L/min: https://www.enu.hr/wp-content/uploads/2016/03/7.-Racionalno-... (US stats are 4x that: https://www.nyc.gov/assets/dep/downloads/pdf/environment/edu...)
Time required to properly wash hands: https://www.hzjz.hr/sluzba-zdravstvena-ekologija/pravilno-pr...
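The hand-washing comparison holds up with these figures; a quick sanity check (assuming the roughly 20-30 s wash time from the last link):

```python
# Water used over 5000 queries at 0.32 ml/query (figure from Altman's post).
water_l = 0.32 * 5000 / 1000      # 1.6 L

# Tap flow with an aerator, per the first link: ~3.5 L/min.
flow_l_per_s = 3.5 / 60

# How long you could wash your hands with that much water.
wash_seconds = water_l / flow_l_per_s   # ~27 s, within the 20-30 s guideline
print(f"{wash_seconds:.0f} s of hand-washing")
```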
I don't think he'd outright lie, but possibly the calculation is ... stretched ... to fit the narrative.
Let's call it a small pizza, then :)
Meanwhile your idle desktop PC and monitor are pulling 20-100W each.
People tend to forget that doing the work the old fashioned way uses energy too. Presumably for a longer time for equivalent results.
Big oil fits better into my social media feed and political beliefs.
https://en.wikipedia.org/wiki/ExxonMobil_climate_change_deni...
So what exactly is the endgame here?
https://www.epi.org/productivity-pay-gap/
I wouldn't count on genAI making smarter decisions, only decisions that benefit the people who control the computers that it runs on
If you're willing to only look at the downsides of an issue and not at its upsides, then you can a-priori reject the entirety of human advancement.