submitted 1 year ago* (last edited 1 year ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do: since we changed our registration policy, the attackers simply post from other instances.

We keep working on a solution; we have a few things in the works, but they won't help us right now.

Thank you for your understanding, and apologies to our users, moderators, and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators, and if it wasn't his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team, and we hope to make an announcement about what's next very soon.

Edit 2: removed the bit about the moderator tools. That came out a bit harsher than we meant it. It's been a long day, and having to deal with this kind of stuff has left some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn't the first time we have felt helpless. Anyway, I hope we can announce something more positive soon.

[-] shotgun_crab@lemmy.world 11 points 1 year ago

You'd need data to train an AI... Yeah that won't happen

[-] HikingVet@lemmy.sdf.org 11 points 1 year ago

IIRC there is a database that law enforcement uses during investigations to gain access to these groups (they obtain consent from the victims to use the material).

[-] joshuaacasey@lemmy.world 8 points 1 year ago

the one thing you do not do with this shit is use AI; there's too much risk of false positives. I mean, I remember Facebook's AI thought that an onion was nudity (probably thought it was a boob).

But tools do exist: PhotoDNA by Microsoft, and a much more user-friendly implementation if you use Cloudflare.
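For context, PhotoDNA-style tools work by comparing a perceptual hash of each upload against a database of hashes of already-known material, rather than classifying image content with a model. Here is a minimal sketch of that matching step, using the open-source imagehash library as a stand-in for the proprietary PhotoDNA algorithm; the hash entries and distance threshold below are placeholders:

```python
from PIL import Image
import imagehash

# Placeholder hash list: real deployments match against vetted databases
# (e.g. NCMEC's), which are not publicly distributed.
KNOWN_HASHES = [
    imagehash.hex_to_hash("d1d1d1d1d1d1d1d1"),  # hypothetical 64-bit entry
]

MAX_DISTANCE = 5  # Hamming-distance cutoff; lower means fewer false positives

def matches_known_hash(path: str) -> bool:
    """Return True if the image's perceptual hash is near any known hash."""
    h = imagehash.phash(Image.open(path))  # 64-bit perceptual hash
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(h - known <= MAX_DISTANCE for known in KNOWN_HASHES)
```

Because matching is only against known images, this sidesteps the classifier false-positive problem raised above, at the cost of missing novel material.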

[-] Zeus@lemmy.world 7 points 1 year ago* (last edited 1 year ago)

the "risk" of false positives comes down to the consequence. if the consequence is being stuck in the slammer, don't use ai. if the consequence is you can't upload the image unless you manually appeal, or even maybe have to use an external image host; i think ai is fine

edit: ah bugger, wrong acct. ah well

(please tag @zeus@lemm.ee if you want me to see your response)
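To make the consequence-tiering idea above concrete, here is a hypothetical sketch of how a moderation pipeline might map a classifier's confidence score onto low-stakes actions only; the names and thresholds are invented for illustration:

```python
from enum import Enum

class Action(Enum):
    ALLOW = "allow"                      # publish immediately
    HOLD_FOR_APPEAL = "hold_for_appeal"  # block upload; uploader may appeal
    HUMAN_REVIEW = "human_review"        # block upload; queue for moderators

# Hypothetical classifier-score cutoffs, tuned to taste.
APPEAL_THRESHOLD = 0.5
REVIEW_THRESHOLD = 0.9

def decide(score: float) -> Action:
    """Map a classifier's confidence score to a low-stakes action."""
    if score >= REVIEW_THRESHOLD:
        return Action.HUMAN_REVIEW
    if score >= APPEAL_THRESHOLD:
        return Action.HOLD_FOR_APPEAL
    return Action.ALLOW
```

The key design choice is that the model never triggers anything irreversible on its own; the worst case for a false positive is a delayed upload and a manual appeal.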
