this post was submitted on 18 Feb 2025
19 points (100.0% liked)

LocalLLaMA

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

Yesterday I got bored and decided to try out my old GPUs with Vulkan. I had an HD 5830, a GTX 460 and a GTX 770 4GB lying around, so I figured "Why not".

Long story short: Vulkan didn't recognize them; hell, Linux didn't even recognize them. They didn't show up in nvtop, nvidia-smi or anything. I didn't think to check dmesg.

Honestly, I thought the 770 would work; it hasn't been in legacy status that long. It might work with an older Nvidia driver version (I'm on 550 now) but I'm not messing with that stuff just because I'm bored.

So for now the oldest GPUs I can get running are a Ryzen 5700G APU and a 1080 Ti. Pascal launched in 2016 (the 1080 Ti itself in early 2017) and Vega in mid-2017, according to Wikipedia. Those people disappointed that their RX 500- and RX 5000-series cards don't work in Ollama should give llama.cpp's Vulkan backend a shot. KoboldCpp has a Vulkan option too.
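If anyone wants to poke at it from Python, here's a rough, untested sketch using the llama-cpp-python bindings (the model path is a placeholder, and it assumes the package was built with the Vulkan backend enabled):

```python
# Untested sketch: running a GGUF model on the Vulkan backend through the
# llama-cpp-python bindings. Assumes the package was installed with Vulkan
# enabled, e.g.  CMAKE_ARGS="-DGGML_VULKAN=on" pip install llama-cpp-python
# The model path below is just a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="models/your-model-q4_k_m.gguf",  # placeholder
    n_gpu_layers=-1,  # offload all layers to the Vulkan device(s)
    n_ctx=4096,
    verbose=True,     # logs which Vulkan devices were picked up
)

out = llm("Q: Name three GPU architectures.\nA:", max_tokens=64)
print(out["choices"][0]["text"])
```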

The 5700G works fine alongside Nvidia GPUs in Vulkan. The performance is what you'd expect from an APU, but at least it works. Now I'm tempted to buy a 7600 XT just to see how it does.
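And if you want to split a model across the APU and a discrete card, a tensor_split along these lines should do it (again an untested sketch; the ratios are made up, so tune them to each device's VRAM):

```python
# Rough sketch: splitting a model across two Vulkan devices, e.g. a 1080 Ti
# plus the 5700G's iGPU. The split ratios are arbitrary examples; adjust
# them to each device's memory. Device order follows Vulkan enumeration.
from llama_cpp import Llama

llm = Llama(
    model_path="models/your-model-q4_k_m.gguf",  # placeholder
    n_gpu_layers=-1,
    tensor_split=[0.8, 0.2],  # ~80% of layers on device 0, ~20% on device 1
    verbose=True,
)
```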

Has anyone else out there tried Vulkan?

[–] TheHobbyist@lemmy.zip 1 points 3 days ago (1 children)

Well, in the case of legacy GPUs you're forced to downgrade to the legacy drivers. At that point you can no longer use your recent and legacy GPUs simultaneously, if that's what you were hoping for.

But if you do go the route of legacy drivers, they work fine.

[–] OpticalMoose@discuss.tchncs.de 1 points 3 days ago (1 children)

I guess if I get REALLY bored, I might do a fresh install and load up legacy drivers just to see what the performance is like with the old cards. It would be interesting to see how they stack up to the Vega APU.

I'm not going to actually use these cards, just trying them out for the heck of it.

[–] brokenlcd@feddit.it 1 points 3 days ago

I think you may be able to use a Podman container and pass the GPU through. It would definitely be easier than reinstalling.
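Something along these lines is what I mean (untested sketch; the image, paths and flags are just an example, and keep in mind the userspace driver inside the image still has to match the kernel module loaded on the host):

```python
# Untested sketch: spin up a Podman container with the host's GPU device
# nodes passed through, so an older driver userspace can live inside the
# container. Image and mount paths are placeholders.
import subprocess

cmd = [
    "podman", "run", "--rm", "-it",
    "--device", "/dev/dri",             # DRM render nodes, what Vulkan talks to
    "--security-opt", "label=disable",  # sidestep SELinux labeling on the devices
    "-v", "./models:/models",           # model directory from the host
    "docker.io/library/ubuntu:24.04",   # placeholder base image
    "bash",
]
subprocess.run(cmd, check=True)
```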