this post was submitted on 24 Feb 2025

Gaming

[–] Lumiluz@slrpnk.net 1 points 3 hours ago (1 children)

Have you ever looked at the file size of something like Stable Diffusion?

Considering the data it's trained on, do you think it's:

A) 3 Petabytes
B) 500 Terabytes
C) 900 Gigabytes
D) 100 Gigabytes
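For a sanity check on the file-size question: a checkpoint's size is roughly parameter count times bytes per weight. A minimal sketch, assuming the commonly cited figure of roughly 1 billion total parameters for Stable Diffusion 1.x:

```python
# Rough checkpoint size: parameters x bytes per weight.
# ~1e9 parameters is an assumed round figure for Stable Diffusion 1.x.
params = 1_000_000_000
for precision, bytes_per_weight in [("fp16", 2), ("fp32", 4)]:
    size_gb = params * bytes_per_weight / 1e9
    print(f"{precision}: ~{size_gb:.0f} GB")  # fp16: ~2 GB, fp32: ~4 GB
```

In other words, the weights come out to single-digit gigabytes regardless of how many petabytes of images went into training.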

Second, what's the electricity cost of generating a single image using Flux vs. 3 minutes of Baldur's Gate or similar on max settings?

Surely you must have some idea on these numbers and aren't just parroting things you don't understand.

[–] finitebanjo@lemmy.world 0 points 2 hours ago* (last edited 1 hour ago) (1 children)

What a fucking curveball joke of a question: you take a nearly impossible-to-quantify comparison and ask if it's equivalent?

Gaming:

A high-scenario electricity consumption figure of around 27 TWh, and a low-scenario figure of 14.7 TWh (for the USA).

The North American gaming market is about 7% of the global total,

so that gives a very, very rough figure of about 210-285 TWh per annum of global electricity used by gamers.
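The extrapolation above can be sketched as follows (a minimal check; it just divides the quoted US figures by North America's ~7% share of the global market):

```python
# Scale the US gaming electricity estimate to a global figure
# using North America's ~7% share of the gaming market.
US_LOW_TWH = 14.7   # low-scenario US gaming consumption
US_HIGH_TWH = 27.0  # high-scenario US gaming consumption
NA_SHARE = 0.07     # assumed North American share of the global market

global_low = US_LOW_TWH / NA_SHARE    # ~210 TWh/year
global_high = US_HIGH_TWH / NA_SHARE  # ~386 TWh/year
print(f"Global gaming estimate: {global_low:.0f}-{global_high:.0f} TWh/year")
```

Note that a straight division puts the high end near 386 TWh rather than 285, so the quoted range is, if anything, on the conservative side.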

AI:

The rapid growth of AI and the investments into the underlying AI infrastructure have significantly intensified the power demands of data centers. Globally, data centers consumed an estimated 240–340 TWh of electricity in 2022—approximately 1% to 1.3% of global electricity use, according to the International Energy Agency (IEA). In the early 2010s, data center energy footprints grew at a relatively moderate pace, thanks to efficiency gains and the shift toward hyperscale facilities, which are more efficient than smaller server rooms.

That stable growth pattern has given way to explosive demand. The IEA projects that global data center electricity consumption could double between 2022 and 2026. Similarly, IDC forecasts that surging AI workloads will drive a massive increase in data center capacity and power usage, with global electricity consumption from data centers projected to double to 857 TWh between 2023 and 2028. Purpose-built AI infrastructure is at the core of this growth, with IDC estimating that AI data center capacity will expand at a 40.5% CAGR through 2027.

Let's just say we're at the halfway point and it's 600 TWh per annum, compared to 285 TWh for gamers.
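The "600 TWh halfway point" is roughly consistent with taking the midpoint of the 2022 and 2028 figures quoted above (an assumption about what "halfway" means here):

```python
# Midpoint check (assumption: "halfway" means the midpoint between the
# 2022 IEA high estimate and the 2028 IDC projection for data centers).
iea_2022_high_twh = 340  # IEA upper estimate, global data centers, 2022
idc_2028_twh = 857       # IDC projection, global data centers, 2028
midpoint_twh = (iea_2022_high_twh + idc_2028_twh) / 2
print(midpoint_twh)  # 598.5, close to the 600 TWh used above
```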

So more than fucking double, yeah.

And to reiterate, people generate thousands of frames in a session of gaming, vs a handful of images or maybe some emails in a session of AI.

[–] Lumiluz@slrpnk.net 0 points 49 minutes ago (1 children)

But we're not comparing the global energy use of LLMs, diffusion engines, other specialized AI (like protein folding), etc., to ONLY the American gaming market.

The conversation was specifically about image generative AI. You can stop moving the goalposts and building a strawman now, and while at it answer the first question too.

[–] finitebanjo@lemmy.world 1 points 46 minutes ago* (last edited 1 minute ago)

Apparently you can only read 2 of 3 lines. That estimate was a global projection of gaming cost IF the globe followed trends similar to the USA's (because that's the only available data), so the real global cost estimate for gaming might be far, far lower.

The USA alone spent 27 TWh on gaming, not 285.