OpenAI is reportedly going all-in as a for-profit company
(mashable.com)
Another good resource to help people find models: https://llm.extractum.io
Or just straight up install https://ollama.com
I like Ollama and recommend it for tinkering, but I admit this "LLM Explorer" is quite neat thanks to sections like "LLMs Fit 16GB VRAM".
Ollama just works, but it doesn't help you pick which model best fits your needs.
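That "LLMs Fit 16GB VRAM" filter boils down to a back-of-envelope calculation you can do yourself. A minimal sketch, assuming a simple rule of thumb (quantized weights take roughly params × bits ÷ 8 bytes, plus some working overhead) rather than LLM Explorer's or Ollama's actual accounting; the function name and the 1.5 GB overhead figure are illustrative, not from either tool:

```python
def fits_in_vram(params_billion: float, bits_per_weight: int,
                 vram_gb: float, overhead_gb: float = 1.5) -> bool:
    """Rough check: do the quantized weights plus overhead fit in vram_gb?

    Rule of thumb: 1B parameters at 8-bit quantization ~ 1 GB of weights.
    Ignores KV cache growth with context length, so treat it as a floor.
    """
    weight_gb = params_billion * bits_per_weight / 8
    return weight_gb + overhead_gb <= vram_gb

# A 13B model at 4-bit needs ~6.5 GB of weights, so it fits in 16 GB:
print(fits_in_vram(13, 4, 16))   # True
# A 70B model at 4-bit needs ~35 GB, so it does not:
print(fits_in_vram(70, 4, 16))   # False
```

Longer context windows eat extra VRAM for the KV cache, so in practice you want headroom beyond what this estimate suggests.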
Why would I put in the effort to install all this locally? Websites win in terms of convenience.
I want to work on my stuff in peace and in private, without worrying about a company grabbing it, using it for themselves, and giving or selling it to other outfits, including the government. "If you have nothing to hide..." is bullshit and needs to die.
Good point. Everything you feed into ChatGPT is stored for future reference.
I don't think I understand your point. Are you saying there is no benefit to running locally, and that websites or APIs are more convenient?
I already have Stable Diffusion on a local machine. I was trying to find motivation to install an LLM locally. You answered my question in a different response.