But in all fairness, it's really llama.cpp that supports AMD.

Now looking forward to the Vulkan support!
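For context, llama.cpp can target AMD GPUs either through its ROCm/HIP backend or (once merged) through Vulkan, which avoids the ROCm stack entirely. A rough build sketch is below; the exact CMake flag names vary between llama.cpp versions, so treat them as assumptions and check the project's README for the release you are using.

```shell
# Sketch of building llama.cpp for AMD GPUs; flag names are version-dependent.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# Vulkan backend: works on AMD cards (and non-x86 hosts) without ROCm installed.
cmake -B build -DLLAMA_VULKAN=ON
cmake --build build --config Release

# Alternative: ROCm/HIP backend, which requires the ROCm stack on the host.
# cmake -B build -DLLAMA_HIPBLAS=ON
```

The Vulkan path is what makes non-x86 setups viable, since only the Vulkan driver (not the ROCm installer) needs to support the platform.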

[-] stsquad@lemmy.ml 1 points 6 months ago

I was sadly stymied by the fact that the ROCm driver install is very much x86-only.

[-] turkishdelight@lemmy.ml 2 points 6 months ago

It's improving very fast. Give it a little time.

this post was submitted on 16 Mar 2024
76 points (100.0% liked)

LocalLLaMA

2220 readers

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago