I'm not sure about questions to ask but you should definitely wear a monocle
You'll have to wait until the embargo is lifted. I thought my post would be ok because it was meta but I guess the mods aren't taking any chances. I don't want to give the mods more work
Yacht clubs often have Wednesday afternoon sailing races, and those crews often need extra folks on board. I learned sailing that way when I was in university. An inexperienced, reliable crew member is better than an experienced, unreliable one.
If you’re driving the same speed as the car in front of you, you have no reason to use the left lane
What if the car in front of you is driving at the same speed but heading right at you? Or if there is an angry T-Rex in the right lane?
I've installed Ollama on my gaming rig (RTX 4090 with 128GB RAM), M3 MacBook Pro, and M2 MacBook Air. I'm running Open WebUI on my server, which can connect to multiple Ollama instances. Open WebUI has its own Ollama-compatible API, which I use for projects. I'll only boot up my gaming rig if I need to use larger models; otherwise the M3 MacBook Pro can handle most tasks.
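A minimal sketch of that routing idea: pick a host per request and only send large models to the gaming rig. The host names, model list, and size cutoff here are assumptions for illustration; only the `/api/chat` path and default port 11434 come from Ollama itself.

```python
# Hypothetical multi-host routing for Ollama-compatible endpoints.
# Host names and the "large model" list are assumptions, not the author's config.

GAMING_RIG = "http://gaming-rig:11434"      # booted only for big models
MACBOOK_PRO = "http://macbook-pro:11434"    # default for everyday tasks
LARGE_MODELS = {"llama3:70b", "mixtral:8x22b"}

def pick_host(model: str) -> str:
    """Route large models to the gaming rig, everything else to the MacBook."""
    return GAMING_RIG if model in LARGE_MODELS else MACBOOK_PRO

def build_chat_request(model: str, prompt: str) -> tuple[str, dict]:
    """Build the URL and JSON body for Ollama's POST /api/chat endpoint."""
    url = f"{pick_host(model)}/api/chat"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return url, body
```

Open WebUI can then be pointed at either instance, since it speaks the same Ollama-compatible API.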
Have you downloaded your podcasts while in another country, or with a VPN set to another country?
Try and pee nonstop during totality
Modding is a time consuming and often thankless job, this doesn't sound like power tripping.
Even if you count the fetus as a human, the fetus's bones are still inside the pregnant woman. So on average there is still more than one skeleton inside a human.
Mmmm eat balls
Canada
I agree. Very few people in industry are claiming that LLMs will become AGI. The release of o1 demonstrates that even OpenAI is pivoting away from pure LLM approaches. It was always going to be a framework approach that utilizes LLMs.