- cross-posted to:
- amd@lemmy.world
- aicompanions@lemmy.world
- selfhosted@lemmit.online
But in all fairness, it’s really llama.cpp that supports AMD.
Now looking forward to the Vulkan support!
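For reference, a rough sketch of building llama.cpp with its Vulkan backend; this assumes the Vulkan SDK is already installed, and the exact CMake flag name may vary between releases:

```shell
# Hedged sketch: building llama.cpp with the Vulkan backend enabled.
# Assumes the Vulkan SDK and drivers are installed; GGML_VULKAN is the
# flag in recent versions (older releases used LLAMA_VULKAN).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release
# Run with some layers offloaded to the GPU (model.gguf is a placeholder):
./build/bin/llama-cli -m model.gguf -ngl 32 -p "Hello"
```

Unlike ROCm, the Vulkan path only needs a working Vulkan driver, which is part of why it's appealing on non-x86 hardware.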
I was sadly stymied by the fact that the ROCm driver install is very much x86-only.
It’s improving very fast. Give it a little time.