Ollama now supports AMD graphics cards
(ollama.com)
I was sadly stymied by the fact that the ROCm driver install is very much x86-only.
It's improving very fast. Give it a little time.
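Since ROCm packages are currently built only for x86_64, a quick architecture check before attempting an install can save some frustration. This is a minimal sketch; the helper function name is illustrative, not part of any ROCm or Ollama tooling:

```shell
#!/bin/sh
# Report whether ROCm packages are available for a given CPU architecture.
# ROCm builds target x86_64 only; ARM (aarch64) and others are unsupported.
check_rocm_arch() {
  case "$1" in
    x86_64) echo "supported" ;;
    *)      echo "unsupported" ;;
  esac
}

# Check the current machine's architecture as reported by uname.
check_rocm_arch "$(uname -m)"
```

On an ARM board (e.g. a Raspberry Pi, where `uname -m` reports `aarch64`) this prints `unsupported`, matching the limitation described above.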