I'm running Ollama 0.6.3 (pre-release) and ROCm v6.10.5 on Linux 6.11.0-21.
Still getting the same "no compatible GPUs were discovered" error.
I have an RX 6700 XT and I needed to change an environment variable to make it work. Maybe something similar is needed for your GPU. I'd try googling something like "RX 9700 XT ROCM" or "RX 9700 XT ROCM no compatible GPUs were discovered" if you haven't done that already.
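For what it's worth, the usual form of that workaround on RDNA2 cards is a gfx-version override. Treat this as a sketch, since the right value (or whether you need it at all) depends on your GPU:

```sh
# Common ROCm workaround for officially unsupported cards: report a supported
# gfx target. 10.3.0 corresponds to gfx1030 and is what people typically use
# for the RX 6700 XT; other cards may need a different value, or none at all.
HSA_OVERRIDE_GFX_VERSION=10.3.0 ollama serve
```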
When I had my AMD GPU going, the best way to get models running was kobold.cpp using Vulkan. The flag is --usevulkan. It's way easier than getting a ROCm fork working from source.
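If you go that route, a minimal launch looks something like this (the model path is a placeholder, and flag names can shift between kobold.cpp releases):

```sh
# Start kobold.cpp with the Vulkan backend instead of ROCm.
# Replace the .gguf path with your own model file.
python koboldcpp.py --usevulkan --model /path/to/model.gguf
```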