[–] technomad@slrpnk.net 33 points 21 hours ago (2 children)

Also the incessant amounts of power/energy that they consume.

[–] jsomae@lemmy.ml 12 points 16 hours ago (2 children)

Running an LLM locally takes less power than playing a video game.

[–] vividspecter@lemm.ee 7 points 11 hours ago

Training the models themselves also uses a lot of power.

[–] msage@programming.dev 2 points 12 hours ago (1 children)
[–] jsomae@lemmy.ml 5 points 11 hours ago* (last edited 11 hours ago) (1 children)

I don't have a source for that, but the most power that any locally-run program can draw is basically the sum of a few things: maxed-out GPU usage, maxed-out CPU usage, and maxed-out disk access. The GPU is by far the most power-hungry of these, and modern video games make essentially the most use of the GPU that they can get away with.
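
If you'd rather measure than take that on faith, a minimal sketch like the one below (assuming an NVIDIA card with `nvidia-smi` on the PATH; the sampling window is arbitrary) logs GPU power draw over a minute. Run it once while gaming and once while querying a local LLM and compare.

```python
# Minimal sketch: sample GPU power draw once per second for a minute and
# report the average draw and total energy over the window.
# Assumes an NVIDIA GPU and nvidia-smi available on the PATH.
import subprocess
import time

def gpu_power_watts() -> float:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip().splitlines()[0])  # first GPU only

samples = []
for _ in range(60):
    samples.append(gpu_power_watts())
    time.sleep(1)

avg_w = sum(samples) / len(samples)
energy_wh = avg_w * len(samples) / 3600  # watts * seconds -> watt-hours
print(f"average draw: {avg_w:.0f} W, energy over the window: {energy_wh:.1f} Wh")
```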

Running an LLM locally can at most max out the GPU, putting it in the same ballpark as a video game. Typical usage of an LLM is to run it for a few seconds and then submit another query, so it's not running 100% of the time, unlike a video game (which stays open and active the whole time; GPU usage only dips when you're in a menu, for instance).
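
To put rough numbers on that duty-cycle point, here's a back-of-envelope comparison; every figure below is an assumption for illustration, not a measurement:

```python
# Back-of-envelope comparison; all numbers are assumptions, not measurements.
GPU_WATTS = 300                      # assume both workloads peg the GPU at ~300 W

game_hours = 2.0                     # GPU busy for the whole session
game_wh = GPU_WATTS * game_hours     # 600 Wh

llm_queries = 40                     # an evening of back-and-forth prompting
llm_burst_s = 10                     # assumed seconds of GPU work per prompt
llm_wh = GPU_WATTS * (llm_queries * llm_burst_s) / 3600   # ~33 Wh

print(f"game session: {game_wh:.0f} Wh, LLM session: {llm_wh:.0f} Wh")
```

Even if both workloads hit the GPU equally hard while active, the bursty pattern of LLM use comes out well below a continuous gaming session under these assumptions.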

Data centers drain lots of power by running a very large number of machines at the same time.

[–] msage@programming.dev 2 points 4 hours ago

From what I know, local LLMs take minutes to process a single prompt, not seconds, but I guess that depends on the use case.
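
Whether a prompt takes seconds or minutes mostly comes down to model size, quantization, and hardware; as a rough sketch (all figures below assumed for illustration):

```python
# Rough latency estimate; every figure is an assumption and varies wildly
# with model size, quantization, and hardware.
prompt_tokens = 1000
output_tokens = 300
prefill_tok_per_s = 500   # prompt-processing speed on a decent GPU
decode_tok_per_s = 20     # generation speed

seconds = prompt_tokens / prefill_tok_per_s + output_tokens / decode_tok_per_s
print(f"~{seconds:.0f} s per prompt")  # ~17 s with these numbers; a weak CPU
                                       # could easily push this into minutes
```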

As for games, I dunno about maxing out the GPU in most of them. I maxed mine out for crypto mining, and that was power hungry. So I would put LLMs closer to crypto than to games.

Not to mention games will entertain you way more for the same amount of time.

[–] Sixtyforce@sh.itjust.works 1 points 10 hours ago

Curious how resource intensive AI subtitle generation will be. Probably fine on some setups.

Trying to use madVR (tweaker's video post-processing) with an RTX 3090 in the summer was turning my small office into a sauna. Next time I buy a video card I'll deliberately go a tier lower to avoid the higher power draw lol.
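
On the subtitle question, here's a hypothetical sketch of local subtitle generation using openai-whisper (assuming a Whisper-style model is what such a tool would run; the model name and input file are placeholders). Model size is the main resource knob: the smaller models run fine on CPU, the larger ones want a GPU.

```python
# Hypothetical sketch of local subtitle generation with openai-whisper;
# the model name and input file are placeholders. Model size is the main
# resource knob: "tiny"/"base" are fine on CPU, larger models want a GPU.
import whisper

model = whisper.load_model("base")
result = model.transcribe("episode01.mkv")   # ffmpeg extracts the audio track
for seg in result["segments"]:
    print(f"{seg['start']:7.1f} --> {seg['end']:7.1f}  {seg['text'].strip()}")
```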