loose
Irony?
must of made a mistake their
your so dumb lmao
thank you kind stranger
Should of proof red it
And my axe!
This guy fucks
I also choose this guy's dead wife.
I need to of a word with you
Knead*
This one must be the worst. "Could care less" being a close second
OP hasn't payed enough attention in English class.
Muphry's Law at work
Now when you submit text to ChatGPT, it responds with “this.”
Unironically this
Criminaly underated post
As a language model, I laughed at this way harder than I should have
NTA, that was funny.
And it will get LOSE and LOOSE mixed up like you did
I'm waiting for it to start using units of banana for all quantities of things
ChatGPT trained on Reddit posts -> ChatGPT goes temporarily “insane”
Coincidence? I don't think so.
This is exactly what I was thinking.
And maybe some more people did what I did: not deleting my accounts, but replacing all my posts with content created by a bullshit generator. The texts looked normal, but everything was completely senseless.
Back in June-July, I used a screen-tapping tool + Boost to go through and change every comment I could edit to generic filler text, then waited about two weeks in hopes that all of their servers would update to the new text, then used the same app to delete each comment and post, and finally the account itself. It's about all I could think to do.
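The two comments above describe the same overwrite-then-delete trick, done by hand in a mobile client. For illustration only, here is a minimal sketch of that approach using PRAW, the Python Reddit API wrapper; the credentials, the word lists, and the `filler()` helper are hypothetical stand-ins, and the two-week wait is left as a comment rather than code.

```python
import random
import time

import praw  # third-party Reddit API wrapper

# Hypothetical script-app credentials -- fill in your own.
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    username="YOUR_USERNAME",
    password="YOUR_PASSWORD",
    user_agent="comment-scrubber/0.1",
)

# Toy filler generator: grammatically plausible but senseless text.
SUBJECTS = ["The committee", "A tall giraffe", "My neighbor", "The server"]
VERBS = ["reorganized", "misplaced", "painted", "audited"]
OBJECTS = ["the stapler", "a jackdaw", "the quarterly report", "an axe"]

def filler() -> str:
    return f"{random.choice(SUBJECTS)} {random.choice(VERBS)} {random.choice(OBJECTS)}."

me = reddit.user.me()

# Pass 1: overwrite every editable comment with filler text.
for comment in me.comments.new(limit=None):
    comment.edit(filler())
    time.sleep(2)  # stay well under Reddit's rate limits

# ...wait a couple of weeks here, per the comment above...

# Pass 2: delete comments and submissions; close the account manually.
for comment in me.comments.new(limit=None):
    comment.delete()
    time.sleep(2)
for submission in me.submissions.new(limit=None):
    submission.delete()
    time.sleep(2)
```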
They have always trained on Reddit data; GPT-2 was, at least. I'm unsure about GPT-1.
ChatGPT also chooses that guy's dead wife
The Narwhal Bacons at Midnight.
It also won't be able to differentiate between a jackdaw and a crow.
On the contrary, it'll become excessively perfectionist about it. You can't even say "could have" without someone coming in and saying "THANK YOU FOR NOT SAYING OF"
It already was, the only difference is that now reddit is getting paid for it.
From now on, when you say something like "I think I can give my hoodie to my girlfriend", it will answer "and my axe".
GROND
It was already trained on Reddit posts. It's just now they're paying for it.
It's going to be a poop-knife-wielding guy with 2 broken arms out to get those jackdaws.
Would have and would of
ChatGPT was already trained on Reddit data. Check this video to see how one Reddit username caused bugs in it: https://youtu.be/WO2X3oZEJOA?si=maWhUpJRf0ZSF_1T
And between were, we’re and where.
Insure and ensure.
It will also reply "Yes." to questions like "is it A or B?"
Don't forget the bullshit that is "would of"
"What is a giraffe?"
ChatGPT: "geraffes are so dumb."
"Can't even breath"
Your right.
And then and than.
Sure, it might have some effect, but a big part of ChatGPT, besides "raw" training data, is RLHF: reinforcement learning from human feedback. Realistically, the bigger problem is training on AI-generated content that might have correct spelling but hardly makes sense.
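To make the RLHF point above concrete: after pretraining on raw text, a separate reward model is trained on human preference pairs, and the chat model is then optimized against it. Below is a minimal sketch of just the reward-model step, where everything (the tiny bag-of-embeddings encoder, vocabulary size, and the synthetic preference pairs) is an illustrative assumption, not OpenAI's actual setup.

```python
import torch
import torch.nn as nn

# Reward-model step in RLHF: a scalar "reward head" is trained so that
# human-preferred responses score higher than rejected ones, via the
# standard pairwise (Bradley-Terry) loss:
#   L = -log sigmoid(r(chosen) - r(rejected))

VOCAB, DIM = 1000, 64  # arbitrary toy sizes

class TinyRewardModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.EmbeddingBag(VOCAB, DIM)  # mean-pools token embeddings
        self.head = nn.Linear(DIM, 1)             # scalar reward

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        return self.head(self.embed(token_ids)).squeeze(-1)

model = TinyRewardModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Fake preference pair: batches of token ids for the response a human
# preferred ("chosen") and the one they rejected.
chosen = torch.randint(0, VOCAB, (8, 32))
rejected = torch.randint(0, VOCAB, (8, 32))

for _ in range(100):
    margin = model(chosen) - model(rejected)
    loss = -nn.functional.logsigmoid(margin).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The point being: this preference signal, not the raw Reddit text, is what shapes the final response style, which is why sloppy spelling in the corpus doesn't automatically surface in outputs.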
Then I did the right thing by replacing my texts with correctly spelled nonsense.
And when it learns something new, the response will be "Holy Hell".
TIL
Is it a showerthought if it's actually just incorrect
A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. A showerthought should offer a unique perspective on an ordinary part of life.