Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine
(fedoramagazine.org)
I did try to use it on Fedora, but I have a Radeon 6700 XT and it only ran on the CPU. I'll wait until official ROCm support reaches my older model.
I have the same setup. You have to add the line

Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"

to the ollama.service file for that specific GPU.

Ollama runs on the 6700 XT, but you need to add an environment variable for it to work... I just don't remember what it was and am away from my computer right now.
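A sketch of how the override described above could be applied persistently, assuming Ollama is installed as a systemd unit named ollama.service (as the comments suggest). A drop-in file is used rather than editing the unit file directly, so the change survives package updates; the drop-in filename rocm-override.conf is an arbitrary choice:

```shell
# Create a systemd drop-in directory for the Ollama service
sudo mkdir -p /etc/systemd/system/ollama.service.d

# Write the ROCm override; HSA_OVERRIDE_GFX_VERSION=10.3.0 makes ROCm
# treat the RX 6700 XT (gfx1031) as the supported gfx1030 target
sudo tee /etc/systemd/system/ollama.service.d/rocm-override.conf >/dev/null <<'EOF'
[Service]
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
EOF

# Reload systemd and restart the service so the variable takes effect
sudo systemctl daemon-reload
sudo systemctl restart ollama.service

# Confirm the environment variable is set on the unit
systemctl show ollama.service -p Environment
```

Alternatively, `sudo systemctl edit ollama.service` opens an editor and creates the same kind of drop-in interactively.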