[-] Lmaydev@programming.dev 4 points 8 months ago

LLMs are in a position to make boring NPCs much better.

Once they can be run locally at a good speed, it'll be a game changer.

I reckon we'll start getting AI cards for computers soon.

[-] bbuez@lemmy.world 4 points 8 months ago

We already do! And on the cheap! I have a Coral TPU running presence detection on some security cameras; I'm pretty sure they can run LLMs too, but I haven't looked around.

GPT4All runs rather well on a 2060, and I'd imagine it runs a lot better on newer hardware.
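For anyone wanting to try this themselves, the GPT4All Python bindings make a local test fairly painless. A minimal sketch, assuming `pip install gpt4all` and a model name that matches a current release (model names drift between versions, so treat the one below as an example, not a fixed identifier):

```python
from gpt4all import GPT4All

# Downloads the model file on first run (several GB).
# device="gpu" offloads to a supported card (e.g. a 2060);
# omit it to run on CPU.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf", device="gpu")

with model.chat_session():
    reply = model.generate(
        "Greet the player as a grumpy innkeeper.",  # toy NPC prompt
        max_tokens=80,
    )
    print(reply)
```

Generation speed depends heavily on quantization level and VRAM; smaller quantized models are the usual choice for mid-range cards like a 2060.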

this post was submitted on 10 Apr 2024
1300 points (99.0% liked)

Programmer Humor
