I wonder if I should write about how useless the per-prompt energy cost is as a metric for generative AI. I see it cited constantly without people actually thinking about what it represents.
The per-prompt energy cost gets compared to so much unrelated shit by AI users going "oh look, but this is much worse, why don't you go after that?", but the individual cost per prompt doesn't matter when 8 billion prompts are being done per day, with half of them being used to generate pineapples with tits.
The cost to train is much, much larger and should be looked at far more imo. Sure it's a production cost & not a usage cost, but you can't remove it from the picture since it's so incredibly large. Plus, since companies are constantly retraining their generative AI, it's irremovably connected to the lifetime of the companies themselves.
Entire towns in rural North America (including majority African-American and Latine towns) have to go through brownouts and rolling blackouts because of genAI training and use. Two such places are in my own home state.
Google wants to build a genAI datacenter in my own homeTOWN, even, which the city council is looking at sideways because it would need more than 5x the power of the ENTIRE COUNTY.
The energy cost of those datacenters is absolutely awful. I'm surprised more people don't look at AI's energy use through the datacenters themselves. Today I read an article comparing per-prompt energy cost to microwave and tea-kettle energy costs, which completely misses the point of cumulative use. That's why I made this post to begin with.
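The cumulative-use point is easy to show with back-of-the-envelope arithmetic. A sketch, assuming 0.3 Wh per prompt (a commonly cited chatbot estimate, not a figure from this post) and a 1100 W microwave run for 2 minutes; only the 8-billion-prompts-per-day figure comes from the text above:

```python
# Back-of-the-envelope: what "tiny" per-prompt costs add up to.
# ASSUMPTIONS (not from the post): 0.3 Wh per prompt, a commonly
# cited chatbot estimate; one microwave run = 1100 W for 2 minutes.

PROMPTS_PER_DAY = 8_000_000_000   # figure cited in the post
WH_PER_PROMPT = 0.3               # assumed per-prompt energy, Wh

total_wh = PROMPTS_PER_DAY * WH_PER_PROMPT
total_mwh = total_wh / 1_000_000  # daily total in megawatt-hours

microwave_wh = 1100 * (2 / 60)    # one 2-minute microwave run, Wh
equivalent_runs = total_wh / microwave_wh

print(f"{total_mwh:,.0f} MWh per day")
print(f"= {equivalent_runs:,.0f} two-minute microwave runs per day")
```

Under those assumptions the "one prompt ≈ a fraction of a microwave run" framing inverts into roughly 2,400 MWh, or tens of millions of microwave runs, every single day — which is exactly why the per-prompt comparison is misleading.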
It is impossible to overstate how much of an energy suck-hole genAI is.