The dream (lemmy.world)
[-] CeeBee@lemmy.world 2 points 1 year ago

> I don't know of an LLM that works decently on personal hardware

Ollama with ollama-webui. Models like solar-10.7b and mistral-7b work nicely on local hardware. Solar 10.7b should run well on a card with 8 GB of VRAM.
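For anyone who wants to try this, the basic Ollama workflow is just pull-then-run from the terminal. A minimal sketch (the model tags here are examples; check `ollama list` and the Ollama model library for current names):

```shell
# Pull quantized models and chat with them locally.
ollama pull mistral   # mistral-7b; runs fine on most modern GPUs
ollama pull solar     # solar-10.7b; should fit in 8 GB of VRAM when quantized
ollama run solar      # opens an interactive chat session in the terminal
```

ollama-webui then points at the same local Ollama server and gives you a browser chat UI on top of it.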

[-] ParetoOptimalDev@lemmy.today 1 point 11 months ago

If you have really low specs, use the recently open-sourced Microsoft Phi model.
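Same workflow as above. Assuming the Phi model as published in the Ollama library (the exact tag may differ; check the library), a small model like this can even run CPU-only:

```shell
# Phi is a ~2.7B-parameter model, so it runs on very modest hardware.
ollama pull phi
ollama run phi "Explain what a hash map is in two sentences."
```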

this post was submitted on 25 Dec 2023
1926 points (97.9% liked)

People Twitter
