Several months ago Beehaw received a report about CSAM (i.e. Child Sexual Abuse Material). As an admin, I had to investigate this in order to verify and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.
The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were very overwhelming to me. Those images are burnt into my mind and I would love to get rid of them but I don’t know how or if it is possible. Maybe time will take them out of my mind.
In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images. A software platform that makes it nearly impossible for Beehaw to host, in any way, CSAM.
If the other admins want to give their opinions about this, then I am all ears.
I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.
Others have said it better, but I'm sorry you had to experience that.
If Lemmy doesn't support expanding remotely hosted images from platforms like Imgur, that's probably a good feature to look for in a new platform. If it does, things are obviously getting to the point where you need to reduce what you have to worry about, to help focus on moderation and keeping things running. Blocklisting URLs is easier than running an AI and hoping it works.
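As a rough sketch of the blocklisting idea (the hosts and function names here are made up for illustration, not part of Lemmy), a host-based URL check could look something like:

```python
from urllib.parse import urlsplit

# Hypothetical blocklist; a real deployment would load this from admin config.
BLOCKED_HOSTS = {"bad-image-host.example", "another-bad.example"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host, or any parent domain of it, is blocklisted."""
    host = urlsplit(url).hostname or ""
    parts = host.split(".")
    # Check the host itself and every parent domain, so subdomains are covered too.
    return any(".".join(parts[i:]) in BLOCKED_HOSTS for i in range(len(parts)))
```

Running a check like this before rendering a remote image is cheap compared to any kind of content classifier, which is the point: it trades coverage for something admins can actually audit.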
Otherwise, I think the only sane moves are disabling images entirely, or granting the ability only to financial supporters or to accounts strongly verified via upvote counts, number of reputable or well-discussed posts, and account age.
Personally I don't mind clicking through for an image if it interests me.
Somehow exerting more control over the user base, or making the images less useful, seems like an interesting idea.
For example, one could allow only thumbnail-size images, or resample uploads down to that resolution. This might discourage image distribution. Or one could allow only accounts with a certain age, posting history, or verification to load images. That would not eliminate the problem, but it would reduce it.
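A minimal sketch of that account-gating idea, assuming a made-up account record (these field names and thresholds are illustrative, not Lemmy's actual schema):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical account record; fields are placeholders for whatever the
# platform actually tracks (age, posting history, verification status).
@dataclass
class Account:
    created_at: datetime
    post_count: int
    is_verified: bool

def may_post_images(account: Account,
                    min_age: timedelta = timedelta(days=30),
                    min_posts: int = 10) -> bool:
    """Gate image privileges on verification, or on account age plus history."""
    if account.is_verified:
        return True
    old_enough = datetime.now() - account.created_at >= min_age
    return old_enough and account.post_count >= min_posts
```

The thresholds would be an admin policy decision; the useful property is that a brand-new throwaway account gets no image privileges at all.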