Former Google CEO Eric Schmidt runs the Special Competitive Studies Project, a non-profit vehicle for him to make excuses for the AI bubble industry. [SCSP.ai] Schmidt spoke at SCSP’s inaugural “AI…
how much does it use anyway? the 5GWe was from a delusional openai talk to investors, so maybe lower
That’s the fucking problem: it’s impossible to tell, since MSFT won’t tell you directly, and only the people who run the datacenters could.
The only relatively reliable numbers I was able to find were in this research paper by Luccioni and Strubell from the ACM Conference on Fairness, Accountability, and Transparency 2024. Now, that’s an obscure conference (not even ranked by CORE), but Dr. Luccioni appears to be right on the money about the dangers of AI (https://www.sashaluccioni.com/).
they will tell you the total tho https://www.latitudemedia.com/news/microsoft-reveals-the-energy-impact-of-artificial-intelligence
this works out to 2.7GW in 2023, on average. that’s comparable to peak daily consumption in croatia (today). if that 30%-ish figure is accurate, then something closer to 700MW is ai-only, which is more like a smaller country such as macedonia
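A quick sanity check on that arithmetic, converting an annual energy total into an average draw. The ~24 TWh input is an assumption (it’s roughly what a 2.7 GW year-round average implies, not an official Microsoft breakdown), and the AI share is the thread’s guess:

```python
# back-of-envelope: annual energy use -> average power draw.
# assumption: Microsoft's FY2023 electricity use was ~24 TWh, which is
# what the 2.7 GW average above implies; not an official number.
HOURS_PER_YEAR = 24 * 365  # 8760

annual_twh = 24.0                             # assumed total, TWh
avg_gw = annual_twh * 1000 / HOURS_PER_YEAR   # TWh -> GWh, then / hours
print(f"average draw: {avg_gw:.2f} GW")       # ~2.74 GW

ai_share = 0.27                               # the "30%-ish" guess from the thread
print(f"ai-only: {avg_gw * ai_share * 1000:.0f} MW")  # ~740 MW
```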
which only highlights how bizarre their 5GW proposition is. hey, let’s outbuild ms 2x, like, now
that sounds like much less than crypto at its peak, and even the 2023 crypto estimate differs by over an order of magnitude (14.5GW avg). there’s also google and fb and whoever else (aws?)
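Note the order-of-magnitude claim only clears 10x against the ai-only slice; against Microsoft’s full 2.7 GW it’s more like 5x. A quick check, with both inputs being the thread’s estimates rather than measured values:

```python
# compare the 2023 crypto estimate against the ai-only figure from above.
crypto_2023_gw = 14.5   # crypto average-draw estimate cited in the thread
ai_only_gw = 0.74       # ~27% of Microsoft's ~2.7 GW average, assumed above
print(f"vs ai-only: {crypto_2023_gw / ai_only_gw:.0f}x")  # ~20x, i.e. >10x
print(f"vs total 2.7 GW: {crypto_2023_gw / 2.74:.1f}x")   # ~5.3x, i.e. <10x
```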
i started looking up satellite photos and openinframap to figure out the maximum capacity of their substations, but the power lines feeding them are probably massively oversized, and the substations are probably oversized too for redundancy and high availability. there might be some way to guess it from that, but some of the lines will be underground, and if they’re load-following to match their renewables (which might be cheaper for them), then everything is oversized a bit more on top of that
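For what that estimation approach would look like: three-phase apparent power is S = √3 · V · I, so a voltage class read off openinframap plus a guessed conductor ampacity gives a hard ceiling. Everything below is a hypothetical site, and the derating factors are just the comment’s caveats turned into numbers:

```python
import math

# rough ceiling on what a substation could deliver, from what's visible
# on openinframap / satellite imagery. all inputs are per-site guesses;
# none come from Microsoft.
line_kv = 230          # voltage class, readable from openinframap
ampacity_a = 1200      # guessed conductor ampacity per circuit, amps
circuits = 2           # visible parallel circuits

# three-phase apparent power: S = sqrt(3) * V_line * I
mva_per_circuit = math.sqrt(3) * line_kv * ampacity_a / 1000
raw_mva = mva_per_circuit * circuits

# the thread's caveats as crude derating factors:
n_minus_1 = 0.5        # one circuit is pure redundancy
headroom = 0.7         # oversizing for load-following / future growth
print(f"raw ceiling: {raw_mva:.0f} MVA")                            # ~956 MVA
print(f"plausible actual draw: {raw_mva * n_minus_1 * headroom:.0f} MVA")  # ~335 MVA
```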
Well the main problem is that a datacenter is running much more than just AI. You’d need to somehow subtract “normal” cloud usage to isolate the promptfondling.
ez. remember that announcement when ms said their energy use went up 36%? that’s the ai, and it includes both training and use
this can still be fudged with more efficient office heating, shutdowns of the least efficient dcs, and so on, but only to a limited degree
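A sketch of that attribution, under the same assumptions as above (~24 TWh post-growth total is a guess consistent with the 2.7 GW average; Microsoft publishes no such split). It also shows where the earlier “30%-ish” and ~700MW numbers plausibly come from:

```python
# attribute the announced 36% growth entirely to ai (the claim above),
# using the same assumed ~24 TWh post-growth total as earlier.
total_twh = 24.0
growth = 0.36

baseline_twh = total_twh / (1 + growth)   # what usage would be without ai
ai_twh = total_twh - baseline_twh         # the ai-attributed slice
ai_avg_mw = ai_twh * 1e6 / 8760           # TWh -> MWh, / hours in a year
print(f"ai slice: {ai_twh:.1f} TWh, {ai_avg_mw:.0f} MW average")  # ~6.4 TWh, ~725 MW
print(f"share of total: {ai_twh / total_twh:.0%}")                # ~26%, the "30%-ish"
```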