[-] OpticalMoose@discuss.tchncs.de 24 points 6 months ago

Thank you for that explanation. My regex-impaired ass thought he wanted to hurt generation[x|y|z].

I'm like "what'd we ever do to you?"

[-] OpticalMoose@discuss.tchncs.de 85 points 6 months ago

At least the article points out that this is a Wall Street valuation, meaning it's meaningless in reality; the company doesn't have that much money, nor is it actually worth that much. In reality, Nvidia's tangible book value (plant, equipment, brands, logos, patents, etc.) is $37,436,000,000.

$37,436,000,000 / 29,600 employees = $1,264,729.73 per employee

Which isn't bad considering the median salary at Nvidia is $266,939 (up 17% from last year).
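
For anyone who wants to double-check the math, here's a quick sketch using just the two figures quoted above:

```python
# Quick check of the per-employee figure, using the book value and
# headcount quoted above.
book_value_usd = 37_436_000_000   # tangible book value
employees = 29_600

print(f"${book_value_usd / employees:,.2f} per employee")  # ~$1,264,729.73
```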

14
submitted 7 months ago* (last edited 7 months ago) by OpticalMoose@discuss.tchncs.de to c/localllama@sh.itjust.works


So here's the way I see it: with Data Center profits being the way they are, I don't think Nvidia's going to do us any favors with GPU pricing next generation. And apparently, the new rule is that Nvidia cards exist to bring AMD prices up.

So here's my plan, starting with my current system:

OS: Linux Mint 21.2 x86_64  
CPU: AMD Ryzen 7 5700G with Radeon Graphics (16) @ 4.673GHz  
GPU: NVIDIA GeForce RTX 3060 Lite Hash Rate  
GPU: AMD ATI 0b:00.0 Cezanne  
GPU: NVIDIA GeForce GTX 1080 Ti  
Memory: 4646MiB / 31374MiB

I think I'm better off just buying another 3060, or maybe a 4060 Ti 16GB. To be nitpicky, I can get three 3060s for the price of two 4060 Tis and get more VRAM plus a wider memory bus. The 4060 Ti is probably better in the long run; it's just so damn expensive for what you're actually getting. The 3060 really is the working man's compute card. It needs to be on an all-time-greats list.
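
To put rough numbers on that comparison (the VRAM and bus widths are just each card's published specs; I'm leaving prices out since they move around):

```python
# Totals for the two multi-GPU options being compared above.
# 12 GB / 192-bit are the RTX 3060's specs; 16 GB / 128-bit are the
# RTX 4060 Ti 16GB's specs.
options = {
    "3x RTX 3060 12GB":    {"cards": 3, "vram_gb": 12, "bus_bits": 192},
    "2x RTX 4060 Ti 16GB": {"cards": 2, "vram_gb": 16, "bus_bits": 128},
}

for name, spec in options.items():
    total_vram = spec["cards"] * spec["vram_gb"]
    print(f"{name}: {total_vram} GB total VRAM, {spec['bus_bits']}-bit bus per card")
# 3x 3060     -> 36 GB total, 192-bit per card
# 2x 4060 Ti  -> 32 GB total, 128-bit per card
```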

My limitations are that I don't have room for full-length cards (a 1080 Ti, at 267mm, just barely fits), and I don't want the cursed power connector. I also don't really want to buy used, because I've lost all faith in humanity and trust in my fellow man, but I realize that's more of a "me" problem.

Plus, I'm sure that used P40s and P100s are a great value as far as VRAM goes, but how long are they going to last? I've been using GPGPU since the early days of LuxRender OpenCL and Daz Studio Iray, so I know that sinking feeling when older CUDA versions get dropped from support and my GPU becomes a paperweight. Maxwell is already deprecated, so Pascal's days are definitely numbered.

On the CPU side, I'm upgrading to whatever they announce for Ryzen 9000, plus a ton of RAM. Hopefully they'll have some models without NPUs; I don't think I'll need one. As far as what I'm running: it's Ollama and Oobabooga, mostly models 32GB and smaller. My goal is to run Mixtral 8x22B, but I'll probably have to run it at a lower quant, maybe one of the 40 or 50GB versions.
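
For a rough idea of which quant might fit, this is the back-of-the-envelope math I'm working from (the parameter count and bits-per-weight figures are approximations, not official numbers):

```python
# Rough estimate of how much memory a quantized model needs to load:
# parameters * bits-per-weight / 8, plus a little overhead for the KV
# cache and runtime buffers. All figures below are approximations.
def quant_size_gb(params_billion: float, bits_per_weight: float, overhead_gb: float = 2.0) -> float:
    return params_billion * bits_per_weight / 8 + overhead_gb

MIXTRAL_8X22B_PARAMS_B = 141  # ~141B total parameters (approximate)

for name, bpw in [("Q2_K (~2.6 bpw)", 2.6), ("Q3_K_M (~3.9 bpw)", 3.9), ("Q4_K_M (~4.8 bpw)", 4.8)]:
    print(f"{name}: ~{quant_size_gb(MIXTRAL_8X22B_PARAMS_B, bpw):.0f} GB")
# Only the ~2-3 bit quants land anywhere near that 40-50GB range.
```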

My budget: Less than Threadripper level.

Thanks for listening to my insane ramblings. Any thoughts?

[-] OpticalMoose@discuss.tchncs.de 57 points 7 months ago

When I was in Korea, I learned that chickens can (sort of) fly. They can flap their wings hard enough to get from the ground to a tree branch maybe 8 feet or so off the ground, and safely back down.

And I've heard chickens tasted better back in the old days. A bird that eats grubs, worms, grasshoppers, frogs, snakes, etc. tastes different from one that just eats chicken feed all day.

[-] OpticalMoose@discuss.tchncs.de 75 points 7 months ago

Maybe we don't live in the worst possible universe. Madonna and Will Smith in the Matrix, everybody using the Hulk Hogan Grill, Stallone as Axel Foley, OJ as the Terminator. I guess I'm ok with where we are now.

64

I had to take my GPU out to do some troubleshooting, so I figured why not try some games on the old Ryzen 5700G. Ray-traced Quake wasn't exactly playable at 3 fps, but I'm impressed that it could load and display correctly.

Other games I tried: Portal RTX wouldn't start at all. Spider-Man Remastered did start, but I couldn't get past the load menu; that seems unrelated to the Ryzen APU. Most of my library is 10+ years old, so pretty much everything else runs fine on the APU.

[-] OpticalMoose@discuss.tchncs.de 88 points 8 months ago

Even after the price cut, theirs is still 3x the price of Mercedes' system, which works better. I have a feeling Tesla's earnings report won't go well this afternoon. https://finance.yahoo.com/news/tesla-earnings-q1-175358835.html

12

Hartford is credited as creator of Dolphin-Mistral, Dolphin-Mixtral and lots of other stuff.

He's done a huge amount of work on uncensored models.

[-] OpticalMoose@discuss.tchncs.de 16 points 8 months ago

It's out on Ollama already. I'm downloading it now. Can't wait for the uncensored versions.

[-] OpticalMoose@discuss.tchncs.de 18 points 8 months ago

In addition, manufacturers will make a smaller and easier-to-lose format.

[-] OpticalMoose@discuss.tchncs.de 22 points 8 months ago

Using a phone sounds inconvenient to me. I usually just pull my card out of my wallet, wave it over the terminal until I hear a beep, and that's it. Worst-case scenario, I have to insert it into the chip reader or, God forbid, swipe it through the slot like some kind of Neanderthal.

I'm kidding, but seriously, that's easier than screwing around with a phone, to me.

115

An update to this post https://beehaw.org/post/6717143

17
NVIDIA Chat With RTX (www.nvidia.com)

This is an interesting demo, but it has some drawbacks I can already see:

  • It's Windows-only (maybe Win11-only; the documentation isn't clear)
  • It only works with RTX 30 series and up
  • It's closed source, so you have no idea if they're uploading your data somewhere

The concept is great, having an LLM to sort through your local files and help you find stuff, but it seems really limited.

I think you could get the same functionality (and more) by writing an API for text-gen-webui.
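
Something like this sketch, for instance; it assumes text-gen-webui is running locally with its OpenAI-compatible API enabled (the --api flag), and the file path and question are just made-up examples:

```python
# Minimal sketch: ask a local text-gen-webui instance about one local file.
# Assumes the OpenAI-compatible API extension is enabled (--api); adjust the
# host/port to match your setup. This isn't a real RAG pipeline, it just
# pastes the file's text into the prompt.
import pathlib
import requests

API_URL = "http://127.0.0.1:5000/v1/chat/completions"

def ask_about_file(path: str, question: str) -> str:
    context = pathlib.Path(path).read_text(errors="ignore")[:8000]  # crude cut to fit the context window
    payload = {
        "messages": [
            {"role": "system", "content": "Answer using only the provided document."},
            {"role": "user", "content": f"Document:\n{context}\n\nQuestion: {question}"},
        ],
        "max_tokens": 300,
    }
    resp = requests.post(API_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

# Hypothetical example file and question:
print(ask_about_file("notes/meeting.txt", "What action items were assigned to me?"))
```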

More info here: https://videocardz.com/newz/nvidia-unveils-chat-with-rtx-ai-chatbot-powered-locally-by-geforce-rtx-30-40-gpus

[-] OpticalMoose@discuss.tchncs.de 18 points 11 months ago

The real-estate mogul, of course, suggests he’s actually worth much more — valuing his own brand at as much as $10 billion.
... “I think it’s the hottest brand in the world.”

Yeah, I mean the Ford logo was good for $23B back in 2006, but I guess 10 > 23. Alternative math and all.

[-] OpticalMoose@discuss.tchncs.de 45 points 11 months ago

He wanted Playboy to be progressive (on abortion, weed, euthanasia, sexuality, etc.), and he wanted equality for women, but he personally didn't live by those same rules. Rules for thee, not for me, etc.
That's just my opinion, though.

9
submitted 11 months ago* (last edited 11 months ago) by OpticalMoose@discuss.tchncs.de to c/linux_gaming@lemmy.ml

Edit: Best viewed with an ad blocker. Sorry, I didn't notice till someone pointed it out to me.

[-] OpticalMoose@discuss.tchncs.de 35 points 1 year ago

I never knew her, but after reading this, I feel as if I know she existed.

[-] OpticalMoose@discuss.tchncs.de 27 points 1 year ago

Thanks for posting this. I couldn't figure out why Steam was broken on my laptop for the last 3 weeks or so. Installing the .deb from the Steam website fixed it. I'm starting to get fed up with Canonical.

225

In the grand scheme of things, the customer may have slightly more pull than the cashier ringing up their order, but it's the CEO and the board of directors that control the narrative. That's why we're getting bigger and less fuel-efficient vehicles, bigger and more fattening meal portions in restaurants, and bigger, less affordable houses.

