submitted 11 months ago by L4s@lemmy.world to c/technology@lemmy.world

AI girlfriend bots are already flooding OpenAI’s GPT store::OpenAI’s store rules are already being broken, illustrating that GPTs could be hard to regulate

you are viewing a single comment's thread
[-] sramder@lemmy.world 43 points 11 months ago

[Yawn]

I’m all for a bit of AI panic, but this is the worst kind of desperate journalism.

The facts as reported:

  • One day before opening the doors of their new online store, OpenAI updated their policy to ban comfort-bots and bad-bots.
  • On opening day, there were 7 AI girlfriends available for purchase/download.

The article’s conclusion: AI regulation is doomed to fail and the machines will wipe out humanity.

[-] afraid_of_zombies@lemmy.world 8 points 11 months ago

If we get wiped out by AI girlfriends, we deserve it. If the only reason a person never reproduced is that they had a chatbot, they really should not reproduce.

[-] sramder@lemmy.world 7 points 11 months ago

I was trying to dream up a justification for this rule that wasn’t about mitigating the ick factor, and fell short… I guess if the machines learn to beguile us by forming relationships, they could be used to manipulate people, honeypot-style?

Honestly, the only point I set out to make was that people had probably been working on virtual girlfriends for weeks (months?) before they were banned. They had probably already been submitted to the store, and the article was just trying to drum up panic.

[-] HelloHotel@lemm.ee 1 points 11 months ago* (last edited 11 months ago)

It’s a hard question to answer; there is a good reason, but it’s several paragraphs long, and I likely have gaps in knowledge and am misguided in places. The reduced idea: being emotionally open (no emotional guarding or sandboxing/RPing) with a creature that lacks many of the traits required to take on that responsibility. The model is being pretrained to perform gestures that make us happy, while having no internal state to ask itself whether it would enjoy garlic bread given its experience with garlic. It’s an advanced tape recorder, pre-populated with an answer. Or it lies and picks something, because saying “I don’t know” is the wrong response. As opposed to a creature that has some kind of consistent external world and a memory system. Firehosing it with data means less room for artistic intent.

If you’re sandboxing/roleplaying, there’s no problem.

this post was submitted on 14 Jan 2024
264 points (95.5% liked)

Technology


This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 2 years ago
MODERATORS