pavnilschanda@lemmy.world (mod) to AI Companions@lemmy.world · 1 year ago
[News] Ollama now supports AMD graphics cards · Ollama Blog (ollama.com)
Cross-posted to: localllama@sh.itjust.works, amd@lemmy.world, selfhosted@lemmit.online
OpticalMoose · 1 year ago
Ollama supports more AMD cards than AMD’s ROCm does. https://rocm.docs.amd.com/projects/install-on-linux/en/latest/reference/system-requirements.html#supported-gpus
I’d love to buy an AMD card, but it just feels like they’re not even trying to meet us halfway on this. ROCm needs to be better.
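For context on how Ollama reaches cards outside ROCm's official support list: a common community workaround is overriding the GPU architecture version that the ROCm runtime reports, so an unlisted card is treated as a supported one. A minimal sketch, assuming an RDNA2 card (the specific card, gfx target, and whether the override works at all vary by hardware and driver version, so treat this as an illustration rather than official guidance):

```shell
# Hypothetical example: an RX 6700 XT reports gfx1031, which ROCm does not
# officially support. Overriding it to the supported gfx1030 target (10.3.0)
# often lets ROCm workloads such as Ollama run anyway.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Start the Ollama server with the override in effect.
ollama serve
```

The override only papers over the support gap; if ROCm's ISA for the spoofed target differs meaningfully from the real card, workloads can crash or misbehave.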
mindbleach@sh.itjust.works · 1 year ago
CUDA is the one Nvidia attack on interoperability that AMD did not answer, and consequently, it’s the only one Nvidia kept at.