EDIT
TO EVERYONE ASKING TO OPEN AN ISSUE ON GITHUB, IT HAS BEEN OPEN SINCE JULY 6: https://github.com/LemmyNet/lemmy/issues/3504
June 24 - https://github.com/LemmyNet/lemmy/issues/3236
TO EVERYONE SAYING THAT THIS IS NOT A CONCERN: Everybody has different laws in their countries (in other words, not everyone is American), and whether or not an admin is liable for such content residing on their servers without their knowledge, don’t you think it’s still an issue? Are you not bothered by the fact that somebody could be sharing illegal images from your server without you ever knowing? Is that okay with you? OR are you only saying this because you’re NOT an admin? Several admins have already responded in the comments and suggested ways to solve the problem, because they are genuinely as concerned about it as I am. Thank you to all the hard-working admins. I appreciate and love you all.
ORIGINAL POST
You can upload images to a Lemmy instance without anyone knowing that the image is there if the admins are not regularly checking their pictrs database.
To do this, you create a post on any Lemmy instance, upload an image, and never click the “Create” button. The post is never created but the image is uploaded. Because the post isn’t created, nobody knows that the image is uploaded.
You can also go to any post, upload a picture in a comment, copy the URL, and never post the comment. You can also upload an image as your avatar or banner and just close the tab. The image will still reside on the server.
You can (possibly) do the same with community icons and banners.
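To make the mechanics concrete, here is a minimal sketch of that upload flow in Python. It assumes the instance exposes the standard pict-rs proxy endpoint at `/pictrs/image`, that uploads are authenticated with a `jwt` cookie, and that the response looks like current pict-rs output; the instance URL and token are placeholders, and the details may differ across Lemmy versions:

```python
# Minimal sketch: upload an image to a Lemmy instance without ever
# creating the post, comment, or avatar that would normally reference it.
# Endpoint, cookie name, and response shape are assumptions based on
# current Lemmy/pict-rs behavior; verify against your version.
import requests

INSTANCE = "https://example-instance.tld"  # placeholder
JWT = "eyJ..."                             # auth token from a normal login

with open("cat.png", "rb") as f:
    resp = requests.post(
        f"{INSTANCE}/pictrs/image",
        files={"images[]": ("cat.png", f, "image/png")},
        cookies={"jwt": JWT},
        timeout=30,
    )
resp.raise_for_status()

# pict-rs answers with the stored filename (the "alias"); nothing in
# Lemmy's database ties it to any post.
alias = resp.json()["files"][0]["file"]
print(f"{INSTANCE}/pictrs/image/{alias}")  # this URL now serves the image
```

If you never hit “Create”, that alias URL is the only record a human will ever see.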
Why does this matter?
Because anyone can upload illegal images without the admin knowing, and the admin will be held liable for it. With everything that has been going on lately, I wanted to remind all of you about this. Don’t think that disabling caching is enough: bad actors can secretly stash illegal images on your Lemmy instance if you aren’t checking!
These bad actors can then share those links around and you would never know! They can even report it to the FBI, and if you haven’t taken the image down (because you didn’t know it was there) within a certain period, say goodbye to your instance and see you in court.
Only your backend admins who have access to the database (or object storage or whatever) can check this, meaning non-backend admins and moderators WILL NOT BE ABLE TO MONITOR THESE, and regular users WILL NOT BE ABLE TO REPORT THESE.
Aren’t these images deleted if they aren’t used for the post/comment/banner/avatar/icon?
NOPE! The image stays uploaded! Lemmy doesn’t check whether uploaded images are actually used! Try it out yourself. Just make sure to grab the link first, either by copying the link text or by clicking the image and then choosing “Copy image link”.
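If you want to check the persistence yourself, a snippet like this (using the hypothetical alias URL from the sketch above) is enough; come back hours later and it still answers:

```python
# Sketch: the upload was never attached to anything, yet the alias URL
# keeps serving the file. The URL is a placeholder.
import requests

url = "https://example-instance.tld/pictrs/image/abcd1234.png"
print(requests.get(url, timeout=30).status_code)  # 200, image still there
```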
How come this hasn’t been addressed before?
I don’t know. I’m fairly certain this has been brought up before, but nobody paid attention, so I’m bringing it up again after all the shit that happened in the past week. I can’t even find it on the GitHub issue tracker.
I’m an instance administrator, what the fuck do I do?
Check your pictrs images (good luck) or nuke the whole store. Disable pictrs, restrict sign-ups, or watch your database like a hawk. You can also delete your instance.
Good luck.
Perhaps someone should create a script to purge orphaned images.
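As a rough sketch of what such a script could look like: list every file pict-rs is serving and grep your Postgres data for references, flagging anything unreferenced. The paths and DSN are placeholders, the assumption that on-disk filenames match the URL aliases holds for a filesystem-backed pict-rs setup as far as I know but may not for yours, and the table/column list is illustrative rather than exhaustive; treat hits as candidates for manual review, not automatic deletion:

```python
# Rough orphan-image finder: a sketch, not a drop-in tool.
# Assumptions (verify for your setup!):
#  - pict-rs uses filesystem storage and on-disk names match URL aliases
#  - Lemmy's Postgres schema has the tables/columns referenced below
import os
import psycopg2

PICTRS_FILES = "/var/lib/pictrs/files"             # placeholder path
DB_DSN = "dbname=lemmy user=lemmy host=localhost"  # placeholder DSN

# Columns where a pictrs alias can legitimately appear (not exhaustive).
QUERIES = [
    "SELECT url FROM post WHERE url IS NOT NULL",
    "SELECT thumbnail_url FROM post WHERE thumbnail_url IS NOT NULL",
    "SELECT content FROM comment",  # inline image links in comment bodies
    "SELECT avatar FROM person WHERE avatar IS NOT NULL",
    "SELECT banner FROM person WHERE banner IS NOT NULL",
    "SELECT icon FROM community WHERE icon IS NOT NULL",
    "SELECT banner FROM community WHERE banner IS NOT NULL",
]

def referenced_blob() -> str:
    """Concatenate every column value that could contain an image URL."""
    chunks = []
    with psycopg2.connect(DB_DSN) as conn, conn.cursor() as cur:
        for q in QUERIES:
            cur.execute(q)
            chunks.extend(row[0] for row in cur.fetchall())
    return "\n".join(chunks)

def main() -> None:
    blob = referenced_blob()
    for name in os.listdir(PICTRS_FILES):
        if name not in blob:
            print(f"possible orphan: {name}")  # review before purging

if __name__ == "__main__":
    main()
```

Actually deleting flagged files is the risky part; rather than removing them from disk, purge them through pict-rs itself (newer versions expose an internal purge endpoint guarded by an API key, if I remember right) so its own metadata stays consistent.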
Wasn’t Facebook also found to store images that were uploaded but not posted? This is just a resource leak. I can’t believe no one has used that phrase yet. I’m more concerned about DoS attacks that fill up the instance’s storage with unused images. I think the issue of illegal content is being blown out of proportion: as long as it’s removed promptly (I believe the standard is 1 hour) once the mods/admins learn about it, there should be no liability. Otherwise every site that allows users to post media would be dead by now.
I’m a pentester and security consultant. From my point of view, this vulnerability has more impact than just a resource leak or DoS. We all know how often CSAM or other illegal material is uploaded to communities here as actual posts (where hundreds of viewers run into it and report it). Now imagine someone uploading it and spreading it like this, where only the admin can catch it, and only if they go out of their way to check.
I wouldn’t call this a high-risk issue, for sure, but it’s a significant security risk regardless.
Pedo trolls will be the death of Lemmy, you heard it here first!
How come this hasn’t been addressed before?
Because pictrs and most other components of Lemmy were designed for a much smaller use case by a very small development team, primarily by people volunteering their time and expertise. Most of the contributors have other things to do on a full-time basis. If you really want to see a change like this implemented NOW, then code it yourself, file a new issue directly on their page with potential solutions, or donate to the people working on it.
Your post is good for the most part, but my patience is limited for the kind of entitled attitude you show under that heading specifically. Thanks for hearing me out.
Entitled attitude? I’m just bringing it up again. It was brought up some time ago but wasn’t given attention so I’m bringing it up again after the recent CSAM attacks.
I didn’t demand anything in the post. I brought up the issue, explained why it’s important, and what admins could do about it.
I don’t know how to code but that doesn’t mean I’m not allowed to bring this issue to light…
I have no issue with your post itself, and discussing this issue is important; it’s worth highlighting things like this. Thank you for bringing it up, and sorry if I sound mad at you for doing that.
I will point out that the specific thing that bothers me is the heading
How come this hasn’t been addressed before?
contains an incomplete answer that ignores work currently in progress by the devs to address it. I don’t blame you for not knowing the answer, but for including and answering that question when you don’t know the answer. To me it’s reminiscent of Tucker Carlson-style questioning, where some issue is brought up and questions are asked, but the answer is sparsely researched and the viewer is expected to come to some conclusion about who to blame. That specifically is what gets on my nerves.
If you can include where work to rectify the issue has been discussed and is in progress (GitHub issues, discussion throughout Lemmy, and so on), I’ll edit my first reply to note that my concern is assuaged.
E: Here are some of the relevant issues and discussion:
- This sort of thing has been discussed since 2020
- Adding admin purging of DB items and pictures. #904 #1331 #1809
- #3504, #2277, these and related issues are still open.
I don’t care if you don’t like my English writing. I brought up the issue, and if people don’t care about it then whatever. We’ll just have to wait until it’s abused; maybe then people will actually be concerned.
Entitlement? The subheadings act as questions the reader may have, with the answers below them. OP is not demanding anything.
@bmygsbvur Pleroma has been exactly the same for six years and no one cared.
Doesn’t change the fact that this is an issue.
Other than filling up storage, what is the actual issue? If the image is orphaned, then surely nobody can actually access the content? Sure, you could be blind-hosting things, but if nobody can get the content back out, then the abuse is surely minimal, apart from, say, a complex cyber and physical targeted campaign, or simply filling up storage…
The issue is that you can share the image link with other people. People CAN get the content back out, and admins or moderators WILL NOT KNOW about it.
So if someone uploads an illegal image in a comment, copies the link, and never posts the comment, they have a link to an illegal image hosted on someone’s Lemmy instance. They can share that image with other people or report it to the FBI. Admins won’t know about it UNLESS they look through their pictrs database. Nobody else can see it, so nobody can report it.