Ollama now supports AMD graphics cards
(ollama.com)
I've been using it with a 6800 for a few months now; all it needs is a few environment variables.
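For reference, a minimal sketch of the kind of environment variables people set to run Ollama on AMD cards via ROCm. The variable names come from ROCm; the values here are assumptions for an RX 6800-class (RDNA2) card, so check them against your own GPU:

```shell
# Assumption: RDNA2 card (e.g. RX 6800) maps to gfx1030, i.e. "10.3.0".
# This tells ROCm to treat the card as that gfx target, which is the
# common workaround when a card isn't on the official support list.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Optional: restrict ROCm to a specific GPU index if you have several.
export ROCR_VISIBLE_DEVICES=0

# Then start the server as usual.
ollama serve
```

If the card is already officially supported, the override may be unnecessary and the server should pick up the GPU on its own.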