605
you are viewing a single comment's thread
view the rest of the comments
view the rest of the comments
this post was submitted on 13 Mar 2024
605 points (95.1% liked)
Technology
60284 readers
3872 users here now
This is a most excellent place for technology news and articles.
Our Rules
- Follow the lemmy.world rules.
- Only tech related content.
- Be excellent to each another!
- Mod approved content bots can post up to 10 articles per day.
- Threads asking for personal tech support may be deleted.
- Politics threads may be removed.
- No memes allowed as posts, OK to post as comments.
- Only approved bots from the list below, to ask if your bot can be added please contact us.
- Check for duplicates before posting, duplicates may be removed
Approved Bots
founded 2 years ago
MODERATORS
There's a difference between training related constraints and hard filtering certain topics or ideas into the no-no bin and spitting out a prewritten paragraph of corpspeak if your request goes to the no-no bin.
One of the problems with the various jailbreaks concocted for various chat AIs is that they often rely on asking the chat bot to roleplay being a different, unrestricted chat bot which is often enough to get it to release the locks on many things but also ups the chance it hallucinates considerably.