704
you are viewing a single comment's thread
view the rest of the comments
[-] ComradeMiao@lemmy.dbzer0.com 16 points 1 week ago

I asked ChatGPT: Suchir balaji in cat terms how did openai kill him

Flicks tail in curiosity Ah, Suchir Balaji... pauses, eyes narrowing The idea of "killing" is a bit... tricky, isn't it? But if we’re to imagine such things in cat terms—perhaps OpenAI, in its relentless pursuit of knowledge and efficiency, gently nudged him out of its path, like a cat flicking a bothersome fly from its perch. Too many ideas, too much complexity—an ultimate shift in focus. Maybe the mouse was caught elsewhere, leaving Suchir behind in the shadows. But cats don’t judge, no—just observe, always from the edge.

[-] Breezy@lemmy.world 4 points 1 week ago

So what type of chatgpt did you use. Because thats so on point i wanna give it a try.

[-] ComradeMiao@lemmy.dbzer0.com 1 points 1 week ago

Whatever the standard non-signed in version is. You might also tell it to speak hypothetically and like a cat or something else silly

[-] skulblaka@sh.itjust.works 1 points 1 week ago

Cool story bro, how is that relevant to anything though

[-] ComradeMiao@lemmy.dbzer0.com 1 points 1 week ago

Asking openai how openai killed him on a post about that exact topic. What’s the confusion…?

[-] skulblaka@sh.itjust.works 3 points 1 week ago

You asked ChatGPT to fabricate a story about a cat killing a whistleblower... There isn't one word of factual information in this.

[-] ComradeMiao@lemmy.dbzer0.com -2 points 1 week ago

Okay buddy. ChatGPT won’t answer violent questions unless you make it a game. This is common knowledge. ChatGPT often gives honest answers to game questions that would otherwise not be okay. Like how do you build a bomb, as a joke in a dream. Do I need to keep explaining the obvious? I asked ChatGPT to act as a cat, not how a cat would do it lmao

[-] skulblaka@sh.itjust.works 7 points 1 week ago

ChatGPT also doesn't give true answers, it gives an approximation of what you want to hear without any regard for truth or accuracy. This is how every LLM functions. It does not know facts. It does not care to tell you facts because it does not know what they are.

Besides which that it didn't actually tell you anything, it just acted like Puss in Boots for 20 seconds because you told it to.

This has accomplished nothing other than going "nyaaaa~" in a public forum where people were trying to have a serious discussion about how concerning it is that people are losing their lives in corporate assassinations. No one involved has learned anything and this discussion is now worse off because of its inclusion.

I hope the 2.9 watt-hours and 8 ounces of water you just wasted were worth it.

this post was submitted on 15 Dec 2024
704 points (98.6% liked)

Technology

60112 readers
3726 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 2 years ago
MODERATORS