The Vatican’s Anime Mascot Is Now an AI Porn Sensation
(www.404media.co)
It's not victimless. It normalizes the sexualization of children.
Do you have any research that backs this up? Because there is research that claims the opposite and that this can work as a preventative measure.
The people who downvoted you are lazy; do a quick Google search on the topic.
I mean, Japan has had CSAM cartoons for decades, and they have a lower CSA rate than the USA. Not saying the two are directly related, but it doesn't seem like access to cartoon CSAM normalizes it to the point that people act on it in real life.
Sure, the same way video games normalize stealing cars. Or the same way movies normalize killing people. I mean at some point you gotta stop blaming media.
It is already normalized.
And GTA / video games normalizes mass shootings?
So if legalized porn reduces rapes, as studies show, how do we figure out whether this existing leads to less abuse of kids, or whether it spawns long-term interest?
Cartoon CSAM has been legal in Japan for decades, and they have a lower CSA rate per capita than the USA.
There are some brain studies showing that the region of the brain responsible for caring for children sits right up against the region responsible for sexual pleasure. The studies suggest there might be misfiring between these neighboring regions that could cause someone to be a pedophile, so that these people don't experience sexual pleasure without thinking about kids. It's literally a disability.
My opinion is that we don't know if removing AI generated csam would make things worse for real life kids or not. But flat out banning it without proper research would be irresponsible.
I think the whole argument is moot. AI image generation is available to pretty much everyone, so it's impossible to control what people are doing with it.
Maybe if it's self-hosted, but if the AI is hosted by someone else... I imagine it would be as easy as keywords being reported/flagged.
Self-hosting is trivial these days: any modern NVIDIA card will do, and there are hundreds of models available online.
Thanks for the thoughts. The way people were originally only downvoting, without providing any actual explanation of why, had me thinking it was just a dumb thing to ask.