Fixed it (sh.itjust.works)
submitted 1 week ago* (last edited 1 week ago) by HumanPerson@sh.itjust.works to c/localllama@sh.itjust.works

Seriously though, does anyone know how to use openwebui with the new version?

Edit: if you go into the ollama container using `sudo docker exec -it ollama bash`, you can pull models with, for example, `ollama pull llama3.1:8b` and have them available.
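The steps from the edit above can be sketched as the following commands (this assumes the Ollama container is named `ollama`; check the actual name with `docker ps`):

```shell
# Open an interactive shell inside the running Ollama container
# (replace "ollama" with your container's name from `docker ps`)
sudo docker exec -it ollama bash

# From inside the container, pull a model by tag
ollama pull llama3.1:8b

# List the models now available to the server
ollama list
```

Alternatively, you can skip the interactive shell and run the pull in one line: `sudo docker exec ollama ollama pull llama3.1:8b`.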

slock@lemmy.world · 1 week ago

For some reason, there are now two model settings pages: one in the workspace, and another in the admin settings (the old one was moved there). The feature you are looking for was probably just moved to the admin settings page.

this post was submitted on 12 Dec 2024

LocalLLaMA


Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 2 years ago