Some trolls post CSAM, which is why this is currently set to remove everything up front rather than handle it afterwards, since I would rather not let our users see CSAM. Once you get a bit of activity on your account that handling goes away, and usually I'll restore the post a bit later if I see the automod has removed it, but it looks like you reposted.
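As I understand it, the rule described above boils down to something like the sketch below (hypothetical names and threshold, assumed for illustration, not the actual programming.dev automod code):

```typescript
// Minimal sketch: remove image posts from accounts with essentially no prior
// activity, so a human can review and restore them later.

interface Post {
  id: number;
  hasImage: boolean;
  authorPostCount: number; // prior posts by the author on the instance
}

// Threshold is an assumption; "basically no posts" per the admin's comment.
const MIN_ACTIVITY = 1;

function shouldAutoRemove(post: Post): boolean {
  // Only image posts are affected; text posts pass through untouched.
  if (!post.hasImage) return false;
  // Image posts from new/inactive accounts are held until a human checks them.
  return post.authorPostCount < MIN_ACTIVITY;
}

// Example: a first image post from a brand-new account gets removed for review.
console.log(shouldAutoRemove({ id: 42, hasImage: true, authorPostCount: 0 }));  // true
console.log(shouldAutoRemove({ id: 43, hasImage: true, authorPostCount: 25 })); // false
```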
Thank you @Ategon@programming.dev
Is there so much CSAM[^1] that all posts with a picture need to be censored automatically? I'm amazed if that's the case.
FYI: in Lemmy, a user can block another user! (so you don't see their bullshit content, and you can still report the content to the admins and the authorities)
I'm not a huge supporter of automatic censorship...
[^1]: Child Sexual Abuse Material
Blocking doesn't help if they just make new accounts each time
And yes, there was a lot for a while, and now there's also image spam coming in from the Mastodon side.
It's only image posts, and it's not all posts with a picture, just ones from users with basically no posts.
I guess there are some Lemmy haters out there :)
Ah ok, that's indeed better.
Thanks for the clarification.
Personally, I'd rather not see any child porn than see pictures from relatively inactive users. Even one image getting through is too much.
It's also not censoring but moderating. Two very different concepts.