One of the potential upsides of AI in the USA is that it could force us to bring electricity prices down to compete with somewhere like China. Power has to be abundant and concentrated.
Maybe then, we could afford to smelt an ingot of aluminum in the USA.
Until then, I guess we're sadly just burning coal to create cat memes. I hope Anthropic can lead the charge. Crypto was already a massive setback for clean power, and AI is already very dirty.
> Cover grid infrastructure costs. We will pay for 100% of the grid upgrades needed to interconnect our data centers, paid through increases to our monthly electricity charges. This includes the shares of these costs that would otherwise be passed onto consumers.
This is great, but do they have an actual example of something that would have been passed on to consumers? Or is it just a hypothetical?
In the location I'm familiar with, large infrastructure projects have to pay their own interconnection costs. Utilities vary across the country, so I wouldn't be surprised if there are differences, but in general I doubt there are many situations where utilities were going to raise consumers' monthly rates specifically to connect some large commercial infrastructure.
Maybe someone more familiar with these locations can provide more details, but I think this public promise is rather easy to make.
There's a huge diversity of pricing and regulatory schemes across the US. I think your skepticism is well placed in general: where I live in California, the price increases have come almost entirely from bad grid-maintenance policies of years past, yet people come up with random other excuses.
However, there are cases where increased demand from one sector leads to higher prices for everyone. The PJM electricity market has a capacity market, where generators are compensated for promising to deliver electricity on demand. When demand goes up, capacity-market prices increase, and those prices get charged to everyone. In the last auction, prices were sky high, which means higher electricity bills for everyone:
https://www.utilitydive.com/news/pjm-interconnection-capacit...
A lot of electricity markets elsewhere allow procurement processes where the increased cost of meeting demand gets passed to all consumers equally. If these utilities were actually using integrated resource plans (IRPs) with up-to-date pricing, adding new capacity from renewables and storage would lower prices; instead, many go with what they know, gas generators, which are in short supply and coming in at very high prices.
And the cost of the grid is high everywhere. As renewables and storage drive down generation prices, the grid will become a larger and larger share of electricity costs. Interconnection is just one piece; transmission needs upgrading all around as overall demand grows. After a few decades of stagnant-to-declining electricity demand, utilities are hungry to do very expensive grid projects, because in most parts of the country they get a guaranteed rate of return on grid expansion.
North Carolina passed Senate Bill 266, changing how utilities can recover costs for projects under construction amid rising energy demand, particularly from data centers. Now Duke Energy wants a double-digit rate increase: https://starw1.ncuc.gov/NCUC/ViewFile.aspx?Id=0ac12377-99be-...
Georgia Power already adds a demand-scaled recovery charge to bills, raising prices for residential customers regardless of where the demand originates. It used to be applied only occasionally during the summer. Now they've adjusted the peak/off-peak rates to be what they used to be plus the demand recovery, and the demand recovery is an additional charge that applies pretty much all the time.
Putting aside interconnection costs, when electricity is auctioned increased demand can increase wholesale prices for everyone.
Generally most distribution costs are socialized starting with the REA and such. My block needed a new transformer a few weeks ago and it will be paid for by every customer of that utility.
Rather have the government tax these entities (great way to have the public support a VAT in this instance) than rely on their "benefactors" that have shown zero remorse in the societal destruction against the planet and humanity, but okay.
Utilities do charge infrastructure projects for their interconnection costs. Maybe there was some hypothetical situation where some costs would have gone into a general budget, but utilities aren’t usually in the habit of doing large interconnection projects for free and sending the bills to consumers.
So do the interconnect costs cover the infrastructure buildout to generate the additional power, or is that spread across all consumers in perpetuity? Because the interconnect itself is the cheap part, afaik. And all of our rates go up to cover the cost of additional generation, whether it's another solar farm or another natural gas plant.
They don't generally have gigawatts of power just sitting idle for a rainy day (I'm not talking about the capacity they reserve for hot July days).
Most of the increase seen in utility costs is for transmission, not generation. Generation is an important piece, but it's not the only piece.
"Committing to buying the glass to replace the window I broke in your shop to rob the place, you're welcome."
> Training a single frontier AI model will soon require gigawatts of power, and the US AI sector will need at least 50 gigawatts of capacity over the next several years.
These things are so hideously inefficient. All of you building these things for these people should be embarrassed and ashamed.
> "Committing to buying the glass to replace the window I broke in your shop to rob the place, you're welcome."
Buying electricity isn't inherently destructive. That's a very bad analogy.
> These things are so hideously inefficient. All of you building these things for these people should be embarrassed and ashamed.
I'm not arguing that they are efficient right now, but how would you measure that? What kind of output does it have to produce per kWh of input to be acceptable? Keep in mind that baseline US power use is around 500 GW, and AI is currently maybe 10.
> These things are so hideously inefficient.
Quite the opposite, really. I did some napkin math for energy and water consumption, and compared to humans these things are very resource efficient.
If LLMs improve productivity by even 5% (studies actually peg productivity gains across various professions at 15 - 30%, and these are from 2024!) the resource savings by accelerating all knowledge workers are significant.
Simplistically, during 8 hours of work a human consumes about 10 kWh of electricity + 27 gallons of water. Sped up by 5%, that drops by 0.5 kWh and 1.35 gallons. Even assuming a higher-end estimate of the resources used by LLMs, 100 large prompts (~1 every 5 minutes) would only consume 0.25 kWh + 0.3 gallons. So we're still saving ~0.25 kWh + ~1 gallon overall per day!
That is, humans + LLMs are way more efficient than humans alone. As such, the more knowledge workers adopt LLMs, the more efficiently they can achieve the same work output!
If we assume a conservative 10% productivity speedup, adoption across all ~100M knowledge workers in the US would recoup the resource cost of a full training run in a few business days, even after accounting for inference costs!
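The napkin math above is easy to sanity-check in a few lines (every input below is one of the rough assumptions stated above, not a measured value):

```python
# Per-knowledge-worker daily resource savings from a 5% LLM speedup.
# All figures are the napkin-math assumptions from the comment above.

human_kwh = 10.0        # electricity attributed to 8h of human work
human_gal = 27.0        # water attributed to 8h of human work
speedup = 0.05          # assumed productivity gain

saved_kwh = human_kwh * speedup   # 0.5 kWh no longer consumed
saved_gal = human_gal * speedup   # 1.35 gallons no longer consumed

llm_kwh = 0.25          # high-end estimate for ~100 large prompts
llm_gal = 0.30

net_kwh = saved_kwh - llm_kwh     # net electricity saved per day
net_gal = saved_gal - llm_gal     # net water saved per day

print(f"net savings per worker-day: {net_kwh:.2f} kWh, {net_gal:.2f} gal")
```

Same ballpark as claimed: roughly 0.25 kWh and 1 gallon saved per worker-day, under those assumptions.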
Additional reading with more useful numbers (independent of my napkin math):
https://www.nature.com/articles/s41598-024-76682-6
https://cacm.acm.org/blogcacm/the-energy-footprint-of-humans...
So with AI doing more of the work and fewer humans needed, what are you doing with the extra humans to eliminate their no-longer-productive resource consumption?
Saying “we can do the same work with less resource use” doesn’t mean resource consumption is reduced. You’ve just gone from humans using resources to humans using the same resources and doing less work, plus AI using more resources.
> So with AI doing more of the work and fewer humans needed, what are you doing with the extra humans to eliminate their no-longer-productive resource consumption?
Soon enough, we won't be able to avoid this question.
You put them to work doing more things than were possible in a month before.
The thing is, there are many interplaying dynamics here that are impossible to unravel. This is why I called it "napkin math", because figuring out the full ramifications of this change is a pretty large economic problem that nobody has figured out!
For instance, I think operating at this level of productivity is unsustainable (https://news.ycombinator.com/item?id=46938038). As discussed in detail by the recent "AI vampire" blog: https://news.ycombinator.com/item?id=46972179 -- most humans are not designed for that level of cognitive intensity.
But even then, the productivity per human will explode, and we will still have the problem of "too many humans." Cynically, if most knowledge workers get laid off, it's good from an environmental perspective because that means much less commuting and pollution! But then they're starving and we will have riots!
This is where I foresee the near-term problems with GenAI: social turmoil rather than resource consumption. I suspect it's not all bad news though. While it's impossible to put numbers on it, it helps to think about the first-order economic principles that are in play:
1. This is hand-wavy, but knowledge work boosts economic growth. If this is massively accelerated, we should be creating surplus value that compensates for a lot of costs.
2. However, a huge chunk of knowledge work is busy work that will be automated away. People can try upskilling, but the skill gap is already huge and growing quickly, and they will lose jobs.
3. The economy is essentially people providing and paying for services and goods. If people lose jobs and cannot earn, they cannot drive the economy and it shrinks.
4. The elite, counter-intuitively enough, do NOT want that because they get richer by taking a massive cut of the economy! (Not to mention life in a doomsday bunker can get pretty dull if starving people start rioting -- https://news.ycombinator.com/item?id=46896066)
There are many more dynamics at play of course, but I think an equilibrium will be found purely because everyone is incentivized to find a solution (UBI?) that keeps both the elites and the plebes living long and prospering. I expect some turmoil, but luckily, the severe resource crunch of GPUs gives us time to figure things out.
How is a human consuming 27 gallons of water in an 8 hour work shift?
Do keep in mind that 1 large prompt every 5 minutes is not how e.g. coding agents are used. There it's 1 large prompt every couple of seconds.
True, but I think in these scenarios they rely on prompt caching, which is much cheaper: https://ngrok.com/blog/prompt-caching/
I have no expertise here, but a couple of years ago I had a prototype using locally deployed Llama 2 that cached the context from previous inference calls (a feature since deprecated: https://github.com/ollama/ollama/issues/10576) and reused it for subsequent calls. The subsequent calls were much, much faster. I suspect prompt caching works similarly, especially given that changed code is very small compared to the rest of the codebase.
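A toy model of why that prefix reuse matters so much for agents (purely illustrative numbers, not a benchmark of any real system): if the long shared prefix is cached, only the short new suffix needs a full forward pass.

```python
# Toy cost model: prefill time scales with the number of uncached tokens.

def prefill_ms(tokens: int, per_token_ms: float = 2.0) -> float:
    """Illustrative prefill latency for `tokens` uncached tokens."""
    return tokens * per_token_ms

prefix = 50_000   # shared context (system prompt + codebase), cached after call 1
suffix = 500      # new tokens per follow-up call (e.g. a small diff)

cold = prefill_ms(prefix + suffix)  # first call processes everything
warm = prefill_ms(suffix)           # later calls reuse the cached prefix

print(f"cold: {cold:.0f} ms, warm: {warm:.0f} ms ({cold / warm:.0f}x)")
```

With a 100:1 prefix-to-suffix ratio, cached calls come out roughly two orders of magnitude cheaper in this sketch, which matches the "much, much faster" experience described above.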
How are you measuring efficiency? They're better than most humans, which is what I would need more of as a substitute.
Adding new electricity demand to the grid should not be viewed as breaking windows and robbing others. When I bought an EV, I increased my electricity demand a huge amount, but it's not like I'm stealing from my neighbors. No rules were broken. We just need to make sure that I pay enough for my additional demand.
> AI sector will need at least 50 gigawatts of capacity over the next several years.
The error bars on this prediction are extremely large. 50 GW would represent a roughly 5% increase in capacity over "the next several years", which is only a percent or two per year, but it could also end up being only 5 GW. 50 GW is about one year of actual grid additions.
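To make that scale explicit (the ~1,000 GW base is implied by the comment's own framing of 50 GW as a ~5% increase; "several years" is assumed here to mean four):

```python
claim_gw = 50        # the "at least 50 GW" claim
base_gw = 1_000      # implied capacity base (50 GW described as ~5%)
years = 4            # assumed reading of "the next several years"

pct_total = claim_gw / base_gw * 100   # total increase over the period
pct_per_year = pct_total / years       # annualized growth

print(f"{pct_total:.1f}% total, ~{pct_per_year:.2f}% per year")
```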
> All of you building these things for these people should be embarrassed and ashamed.
I'm not building these things, and I think AI deserves critique, but this is far over the top. There's great value for all of humanity in these tools. The actual energy use of a typical user is not much more than a typical home appliance's, because so many requests are batched together and processed in parallel.
We should be ashamed of getting into our cars every day; that's a true harm to the environment. We should have built something better and allowed more transit. A daily commute of 30 miles is disastrous for the environment compared to any AI use that's realistically possible at the moment.
Let's be cautious of AI but keep our critiques grounded in reality, so that we have enough powder left to fight the other things we need to change in society.
Every piece of progress looks like this to begin with.
The numbers must go up, there is no other way.
Surprised local power generation isn't on the radar, whether it's solar, natural gas, or something else.
This is all well and good as long as investors are willing to pour money into the bubble. When the music stops is when we'll see the true colors. Corporations are optimized to make money; governments should be optimized to protect people.
> projects will create hundreds of permanent jobs
See, the AI is gonna create jobs, not eliminate them lol. Now let us strip mine your hood G.
Blah, blah, blah. Prices will rise regardless, and they know it.
> Cover grid infrastructure costs. We will pay for 100% of the grid upgrades needed to interconnect our data centers, paid through increases to our monthly electricity charges.
How does paying more monthly cover an infrastructure build out that requires up front capital?
Financing.
shrug as long as the cost of getting that upfront capital is also added to what they pay...