submitted 9 months ago* (last edited 9 months ago) by deadsuperhero@lemmy.ml to c/fediverse@lemmy.ml

Highlighting the recent report of users and admins being unable to delete images, and how Trust & Safety tooling is currently lacking.

[-] nutomic@lemmy.ml 112 points 9 months ago

I find it very questionable that you publish this sort of hit piece against Lemmy without even bothering to ask for a comment from our side. This is not how journalism should work.

Effectively you are blowing the complaints of a single user completely out of proportion. It is true that we didn't respond ideally in the mentioned issue, but neither is it okay for a user to act so demandingly towards open source developers who provide software for free. You also completely ignore that this is an exception: there are thousands of issues and pull requests in the Lemmy repos which are handled without any problems.

Besides, you claim that we don't care about moderation, user safety, and tooling, which is simply not true. If you look at the 0.19.0 release notes, there are numerous features in these areas, such as instance blocking, better report handling, and a new moderator view. However, we also have to work on improvements to many other features, and our time is limited.

Finally, you act like 4000€ per month is a lot of money; however, that's only 2000€ for each of us. We could stop developing Lemmy right now and work for a startup or corporation for three or four times that amount of money. Then we also wouldn't have to deal with this kind of meaningless drama. Is that what you want to achieve with your website?

[-] dessalines@lemmy.ml 78 points 9 months ago

The thing that really gets me with these is that we are 2-4 devs working on software used by over 40k people. It is absolutely impossible to please everyone and fix every issue; there just aren't enough of us.

Oftentimes we ask people to do the open source thing and contribute a PR, and many of them do.

Anyone can look at our GitHub profiles and see how busy we've been, and how many moderation-related issues we've been working on; this is all out in the open. Yet writers of these articles somehow never bother to look, or reach out to us with questions. The amount of entitlement and second-hand rumors is really disappointing.

[-] deadsuperhero@lemmy.ml 20 points 9 months ago

I've reviewed both your and @Nutomic's comments, your latest blog updates, and GitHub PRs, and added a section accordingly: https://wedistribute.org/2024/03/lemmy-image-problem/#giving-credit

Thank you for your hard work, and for taking necessary steps to improve something that is essential for instance operators.

[-] nutomic@lemmy.ml 20 points 9 months ago

Thanks, that is a bit better. Unfortunately, people who have already read the article won't see the update, and even people who read it now may not read all the way to the end, and will still leave with a negative impression. Still, it's better than nothing.

To get an idea of how most Lemmy users feel, have a look at this thread. Practically every comment is positive about Lemmy; you can hardly find any negative sentiment. And certainly no one cares about this image deletion issue, which proves that the complaints of a few individuals are completely blown out of proportion.

[-] givesomefucks@lemmy.world 51 points 9 months ago

Well, yeah...

If you upload a picture to Lemmy, it's going to get saved by a shit ton of federated instances.

That's how federation works, but once it happens, it's hard to get all of them to delete it.

The fix is easy:

Upload somewhere else (there's a bunch of image hosts), then make your post point to that image host. Federated instances just have to host the link, so it's good for them too.

I'd love to see something like the RES feature where Lemmy can still show an expandable thumbnail for non-hosted images. RES pulled it off fine years ago; not sure how hard it would be.

But that would fix all these issues.

[-] deadsuperhero@lemmy.ml 44 points 9 months ago

So, to be clear, the story the article links to is specifically a case of local content that didn't actually federate. It was an accidental upload, he cancelled the post, it sat in storage, and even his admin was stumped about how to get it out.

I agree that with federation, it's a lot messier. But having provisions to delete things locally, and to try to push deletes out across the network, is absolutely better than nothing.

The biggest issue I have is that there's really not much an admin can do at the moment if CSAM or some other horrific shit gets into pict-rs, short of using a tool to crawl through the database and use API calls to hackily delete things. Federation aside, at least make it easy for admins and mods to handle this on their home servers.
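
To make that concrete: the "hackily delete things" route today essentially means talking to pict-rs directly. Below is a minimal sketch of what such an admin script might look like, assuming pict-rs's internal purge endpoint and API key; the paths, parameters, and header names are my reading of the pict-rs docs and may vary by version, so check yours before relying on this. It is not an official Lemmy feature.

```python
import requests

PICTRS_URL = "http://127.0.0.1:8080"  # internal pict-rs address; assumption, adjust for your setup
API_TOKEN = "change-me"               # the value configured as pict-rs's server API key

def purge_image(alias: str) -> None:
    """Ask pict-rs to purge a stored file (and all of its aliases) by alias."""
    resp = requests.post(
        f"{PICTRS_URL}/internal/purge",          # internal admin endpoint (assumption, version-dependent)
        params={"alias": alias},
        headers={"X-Api-Token": API_TOKEN},
        timeout=30,
    )
    resp.raise_for_status()
    print(f"purged {alias}")

if __name__ == "__main__":
    # The alias is the filename part of the public URL, e.g. .../pictrs/image/<alias>
    purge_image("0b5d1aee-example.webp")
```

That works for a single known file, but it still leaves the admin to figure out which aliases to purge, which is exactly the gap better first-party tooling would close.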

[-] bloup@lemmy.sdf.org 21 points 9 months ago

I have to say, I think the article actually does address what you’re saying, in particular here:

There are a couple of reasons as to why this is so surprising. Firstly, the Trust & Safety aspect: a few months ago, several Lemmy servers were absolutely hammered with CSAM, to the point that communities shut down and several servers were forced to defederate from one another or shut down themselves.

Simply put, the existing moderation tooling is not adequate for removing illegal content from servers. It’s bad enough to have to jump through hoops dealing with local content, but when it comes to federated data, it’s a whole other ball game.

The second, equally important aspect is one of user consent. If a user accidentally uploads a sensitive image, or wants to wipe their account off of a server, the instance should make an effort to comply with their wishes. Federated deletions fail sometimes, but an earnest attempt to remove content from a local server should be trivial, and attempting to perform a remote delete is better than nothing.

I also just want to point out that the knife cuts both ways. Yes, it's impossible to guarantee that nodes you're federating with aren't just ignoring remote delete requests. But there is a benefit to acting in good faith that I think is easy to infer from the CSAM example the article presents.
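
For anyone wondering what a "remote delete request" concretely is: in ActivityPub terms it's a Delete activity pushed to the other servers' inboxes, roughly like the sketch below (all IDs and actors are made up for illustration). Receivers are expected to drop or tombstone their cached copy, but nothing in the protocol forces them to, which is the good-faith problem in a nutshell.

```python
# Illustrative shape of a federated Delete activity (made-up URLs).
delete_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": "https://example.social/activities/delete/123",
    "type": "Delete",
    "actor": "https://example.social/u/alice",
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "id": "https://example.social/post/456",  # the thing being deleted
        "type": "Tombstone",                      # what the origin leaves behind
    },
}
```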

[-] gabe@literature.cafe 14 points 9 months ago

Upload somewhere else (there's a bunch of image hosts), then make your post point to that image host. Federated instances just have to host the link, so it's good for them too.

Those images are still cached, as are the thumbnails.

[-] Fisch@lemmy.ml 12 points 9 months ago

Couldn't images and videos just be loaded from the instance they were uploaded to, instead of getting copied to each instance? It would work almost the same as uploading to a file host, but it would be a lot easier usability-wise, and illegal content would still only have to be deleted at a single point.

[-] morrowind@lemmy.ml 43 points 9 months ago

At this point, most of the solutions the ecosystem has relied on have been third-party tools, such as db0’s fantastic Fediseer and Fedi-Safety initiatives. While I’m sure many people are glad these tools exist, the fact that instances have to rely on third-party solutions is downright baffling.

I'm not sure I see the issue here. What's the point of an open ecosystem if you don't make use of any third-party tools? Fedi-Safety in particular feels like it should not be part of the core project.

[-] deadsuperhero@lemmy.ml 33 points 9 months ago

There's nothing wrong with having good third-party tools; that was not my point. db0 in particular has done some amazing, amazing work.

What's fucked, however, is having a project where:

  • the core infrastructure only offers the most threadbare tools
  • there's zero consideration from development on privacy, user safety, or basic controls to handle when shit hits the bed
  • the devs are stone silent when waves of CSAM crash through instances
  • they openly mock people or say they're "too busy to do this" when it comes to meeting the most basic expectations of how a social platform ought to work

Like, this is not an attack on Lemmy itself; I think the platform can be a real force for good in the Fediverse. But let's be honest: this project is not going to live very long if nothing changes.

Basic things like having the ability to easily remove images from storage should be part of the core platform. The fact that this still isn't a thing even four years into the project is insane.

[-] nutomic@lemmy.ml 30 points 9 months ago

It's simply not true that we have zero consideration for privacy or user safety. But that is only one aspect of Lemmy; we also have to work on many other things. And we weren't silent during the CSAM wave, but most of it was handled by admins, and all the related issues are long resolved. Lemmy has 50k active users; it's obvious that we are too busy to work on every single thing that some individual user demands.

There is a reason that Lemmy is still at version 0.x. If you have such high demands, then you shouldn't use it and should switch to another platform instead. And yes, you are clearly stoking an attack against Lemmy; I wonder why you hate our project so much.

[-] DieguiTux8623@feddit.it 22 points 9 months ago

The first time some random user files a lawsuit, the admins of their instance will be in trouble.

Lemmy devs are not affected, but instance admins are: according to the GDPR, they are considered "data controllers" and are responsible for the processing of users' data.

As far as I understand it, this missing feature is an open "challenge" to existing regulation and legislators, and maybe also a way to open people's eyes to the fact that privacy claims are often not enforced, even by those who claim to enforce them.

[-] eveninghere@beehaw.org 13 points 9 months ago

There's no guarantee that third-party tools will continue to work with Lemmy. Something as critical as deleting images, where failure can enable problems like revenge porn, must be given priority by the official project.

[-] dessalines@lemmy.ml 21 points 9 months ago

We will never block third-party tools, and will always have an open API.

One of the PRs I've been working on is an interface to view your image uploads and delete them. This is not trivial, but it will probably be in the next release.
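
For the curious, the plumbing such an interface has to wrap already exists at the pict-rs level: each upload comes back with a per-file delete token. A rough sketch of that low-level flow follows; the proxy paths and auth header are my reading of current Lemmy/pict-rs behavior, so treat them as assumptions and check your instance's version.

```python
import requests

INSTANCE = "https://lemmy.example"  # your instance; placeholder
JWT = "eyJ..."                      # login token; how it's passed can vary by Lemmy version

def upload_image(path: str) -> dict:
    """Upload an image through Lemmy's pict-rs proxy and keep the delete_token it returns."""
    with open(path, "rb") as f:
        resp = requests.post(
            f"{INSTANCE}/pictrs/image",
            files={"images[]": f},
            headers={"Authorization": f"Bearer {JWT}"},
            timeout=60,
        )
    resp.raise_for_status()
    return resp.json()["files"][0]  # e.g. {"file": "...", "delete_token": "..."}

def delete_image(file: str, delete_token: str) -> None:
    """Delete a previously uploaded image using its delete token."""
    resp = requests.get(
        f"{INSTANCE}/pictrs/image/delete/{delete_token}/{file}",
        headers={"Authorization": f"Bearer {JWT}"},
        timeout=30,
    )
    resp.raise_for_status()
```

The catch today is that nothing surfaces those tokens to users after the fact, which is presumably what an uploads view in the UI would fix.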

[-] ada@lemmy.blahaj.zone 28 points 9 months ago

Lemmy's Image Problem

I see what you did there...

[-] buh@hexbear.net 23 points 9 months ago

fedposting that's not a bug

[-] corymbia@reddthat.com 23 points 9 months ago

Top tip: don’t take nude photos of yourself. Ever.

[-] The_wild_card@lemmy.today 15 points 9 months ago* (last edited 9 months ago)

Even though I don't do it, I think people should have a right to do what they want with their body without fearing what others think or will do.

[-] Zerush@lemmy.ml 22 points 9 months ago* (last edited 9 months ago)

Because of this, I never upload an image directly to Lemmy and other platforms; instead I use a sharing tool (FileCoffee, IMHO the best). With my account there, I can easily delete the hosted image, and with that it disappears everywhere.

[-] maegul@lemmy.ml 21 points 9 months ago

Moderation is obviously important, but what are the realities around deletion in a federated ecosystem?

I feel like the push around this and the GDPR is similar to the DMs situation in Mastodon, where you might feel like you've deleted your stuff, but it's actually very much still out there.

I’m sure there’s a middle ground where some amount of deletion occurs and it’s better than nothing. But as with the BlueSky bridge conversation, it seems to me having a frank conversation about the kind of system we’re dealing with here is just as important.

[-] deadsuperhero@lemmy.ml 13 points 9 months ago

Yeah, I agree. I think the important thing is "was the local content scrubbed?" Because at least if that was done, the place of origin no longer has it.

Federated deletes will always be imperfect, but I'd rather have them than not have them.

What might actually be interesting would be if someone could figure out this type of content negotiation: deletes get federated, but some servers miss it. Maybe there's a way to get servers to check their cache and, if a corresponding origin value is no longer there, dump it?
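
As a sketch of what that kind of sweep could look like, assuming nothing about Lemmy's internals (the helper names below are hypothetical): periodically re-fetch the origin URL of each cached remote object and drop the local copy if the origin answers 404 or 410 Gone.

```python
import requests

def origin_is_gone(object_url: str) -> bool:
    """Re-fetch a cached remote object and report whether its origin has removed it.

    404 and 410 Gone (often served with a Tombstone body) count as deleted;
    network errors do not, so a flaky origin doesn't wipe local caches.
    """
    try:
        resp = requests.get(
            object_url,
            headers={"Accept": "application/activity+json"},
            timeout=15,
        )
    except requests.RequestException:
        return False
    return resp.status_code in (404, 410)

def sweep(cached_objects, delete_local):
    """cached_objects: iterable of (local_id, origin_url) pairs; delete_local: hypothetical callback."""
    for local_id, origin_url in cached_objects:
        if origin_is_gone(origin_url):
            delete_local(local_id)
```

It would only catch deletes a server missed, and it still relies on every implementation choosing to run something like it.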

[-] maegul@lemmy.ml 12 points 9 months ago

Well I’m sure there are a number of nice ways of arranging federated delete, including your suggestion, but it seems to me that the issue is guaranteeing a delete across all federated servers where the diversity of software and the openness/opt-out-ness of federation basically ensure something somewhere will not respect a request out of either malice, ignorance or error.

Ultimately, it seems a weird thing to be creating and expecting fediverse platforms in the image of those designed with complete central control over all data and servers. Like we’re still struggling to break out of the mould.

Even if one platform makes a perfect arrangement for something like deletes, so long as servers running that platform push to or federate with servers running something else (where it's ultimately impossible to tell what they're running, because it's someone else's server), there will be broken promises.

I’m interested to hear your response on this actually, because it increasingly seems to me like we haven’t got to terms yet with what decentralisation actually means and how libertarian some of its implications are once you care about these sorts of issues.

I suspect it gets to the point where, for social activity, some people may start realising that they actually want a centralised body they can hold to account.

And that feeling secure on a decentralised social media platform requires significant structural adjustments, like e2ee, allow-list federation, and private spaces, with public spaces left for more blog-like and anonymous interactions.

Also, sorry, end rant.

[-] mactan@lemmy.ml 9 points 9 months ago

Tbh I just assume that, despite GDPR or CCPA, any images I put up are there forever; it's not like the tech in the fediverse is that different from anywhere else. Nowhere is going to be scrubbing their storage drive blocks on delete, if it even can be deleted.

[-] deadsuperhero@lemmy.ml 20 points 9 months ago

I take it you've never run a community instance. The problem is, laws vary by jurisdiction, and can have a very real effect on how you run your server when shit hits the fan.

We recently ran a story about a guy building his own Fediverse community and platform, who just happened to be a bit naive about the network. He's off in his corner, doing his own thing, and people find his project and assume it's some kind of weird scraper. After disinformation came out about it, someone remote-loaded child pornography onto his server for the purpose of filing a report with the police.

The guy is based in Germany. Local jurisdiction requires a minimum of one year of prison time. It matters.
