submitted 1 year ago by tux0r@feddit.de to c/technology@beehaw.org

shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

[-] ram@lemmy.ca 36 points 1 year ago

Let's not forget that these AIs aren't limited by age. Like fuck am I gonna be out here defending tech that would turn my kid into CSAM. Fucking disgusting.

[-] PelicanPersuader@beehaw.org 14 points 1 year ago

Worse, people making AI CSAM will wind up causing police to waste resources investigating abuse that didn't happen, meaning those resources won't be used to save real children in actual danger.

[-] MaggiWuerze@feddit.de 1 point 1 year ago

On the other hand, this could be used to create material whose production required no new suffering. So it might reduce the need for actual children to be abused to produce it.

[-] ram@lemmy.ca 4 points 1 year ago

Ya, no, those people need psychological help. Not to feed the beast. This is nonsense.

[-] MaggiWuerze@feddit.de 1 point 1 year ago

Sure they do, but if they're going to consume it anyway, would you rather a real child suffered for it, or just an AI-generated one?

[-] ram@lemmy.ca 1 point 1 year ago

Neither. I would have mental health support that is accessible to them.

[-] tweeks@feddit.nl 2 points 1 year ago

Of course we don't want either, but it comes across as if you're dismissing a possible direction toward a solution to the definitively worse outcome (real-life suffering) with a purely emotional knee-jerk reaction.

Mental health support is already available, and real CSAM is still being produced. I'd suggest we look into both options: advancing the ways therapists can help, and perhaps at least having an open discussion about these sensitive solutions that might feel counter-intuitive at first.

[-] ichbinjasokreativ@beehaw.org 1 point 1 year ago

It's (rightfully) currently illegal, but that doesn't stop people. Keep it illegal, increase punishment drastically, make AI-created material a grey area.

[-] Rekorse@kbin.social 2 points 1 year ago

It's already the worst crime around and people still do it. Maybe it's not the punishment we need to focus on.

[-] ram@lemmy.ca 1 points 1 year ago* (last edited 1 year ago)

I'm not sure increasing punishment is actually an effective means of combating this. The social consequences of being known as a child predator are likely a stronger deterrent than the penal system, imo (I don't have data to back that up).

I, personally, am an advocate for making treatment for pedophiles freely, easily, and safely accessible. I'd much rather help people be productive, non-violent members of society than lock them up, if given a choice.

[-] tweeks@feddit.nl 1 point 1 year ago

That's a fair point. And I believe AI would be able to combine legal material to create illegal material. Although this still feels wrong, if it excludes suffering from the base material and reduces future (child) suffering, I'd say we should at least research it. Even if it's controversial, we need to look at the rationale behind it.

this post was submitted on 22 Aug 2023
96 points (100.0% liked)