submitted 3 days ago by simple@lemm.ee to c/games@lemmy.world

What if I'm buying a graphics card to run Flux or an LLM locally? Aren't these cards good for those use cases?
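For context, here is a rough sketch of what running Flux locally could look like with Hugging Face's diffusers library. The model ID, resolution, and step count are illustrative assumptions rather than details from the thread, and the weights are large, so VRAM and disk requirements are substantial.

```python
# Hypothetical sketch: generating an image locally with FLUX.1-schnell via diffusers.
# Assumes diffusers, transformers, and a CUDA build of PyTorch are installed and the
# model weights have already been downloaded from Hugging Face.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",  # assumed model repo; the dev variant is gated
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # offload idle submodules to system RAM to fit consumer VRAM

image = pipe(
    prompt="a photo of a cat wearing a tiny wizard hat",
    num_inference_steps=4,  # schnell is distilled for few-step generation
    guidance_scale=0.0,     # schnell is trained without classifier-free guidance
    height=768,
    width=768,
).images[0]
image.save("flux_local_test.png")
```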

Breve@pawb.social 4 points 2 days ago

Oh yeah for sure, I've run Llama 3.2 on my RTX 4080 and it struggles, but it's not obnoxiously slow. I think they're betting that more software will ship with integrated LLMs that run locally on users' PCs instead of relying on cloud compute.
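For reference, a minimal sketch of local chat inference along the lines Breve describes, using the transformers pipeline API. The model ID and generation settings are assumptions for illustration; the Llama weights are gated on Hugging Face and require an accepted license.

```python
# Hypothetical sketch: running a small Llama 3.2 chat model on a local GPU with
# Hugging Face transformers. Assumes transformers and a CUDA build of PyTorch are
# installed and the gated meta-llama weights are accessible.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.2-3B-Instruct",  # assumed model ID; any local checkpoint works
    torch_dtype=torch.bfloat16,                # half precision fits comfortably in 16 GB of VRAM
    device_map="auto",                         # let accelerate place the model on the GPU
)

messages = [{"role": "user", "content": "Why run an LLM locally instead of in the cloud?"}]
result = generator(messages, max_new_tokens=128)
print(result[0]["generated_text"][-1]["content"])  # the last message is the assistant reply
```

A local runner such as Ollama would be a lower-setup alternative for the same kind of chat call, which fits the "software ships with an integrated local LLM" scenario the comment describes.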
