submitted 1 year ago by admin@beehaw.org to c/chat@beehaw.org

Several months ago Beehaw received a report about CSAM (i.e. Child Sexual Abuse Material). As an admin, I had to investigate this in order to verify and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.

The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were very overwhelming to me. Those images are burnt into my mind and I would love to get rid of them but I don’t know how or if it is possible. Maybe time will take them out of my mind.

In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images: a software platform that makes it nearly impossible for Beehaw to host CSAM in any way.

If the other admins want to give their opinions about this, then I am all ears.

I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.

[-] pemmykins@beehaw.org 15 points 1 year ago

I mentioned this in Discord a while back, but there are image-matching databases for known instances of CSAM that you can apply for access to as an admin of a forum or social media site. If you had access, you could scan each image uploaded or linked to in a post or comment and compare it against the database for matches. I think Mastodon is adding some hooks for this kind of checking during the upload phase, but I'm not sure what the status is with Lemmy.
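To make the idea concrete, here is a minimal sketch of that kind of upload-time hash matching. Note the assumptions: real deployments use perceptual-hash databases (such as PhotoDNA) obtained through vetted organizations like NCMEC or the IWF, not plain SHA-256, and the hash value, function names, and `handle_upload` hook below are all hypothetical illustrations rather than any actual Lemmy or Mastodon API.

```python
import hashlib

# Hypothetical local blocklist of known-bad image hashes.
# A real system would query a vetted perceptual-hash database
# (e.g. PhotoDNA), which also matches re-encoded/resized copies.
KNOWN_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

def sha256_hex(data: bytes) -> str:
    """Hash the raw upload bytes; illustrative stand-in for a perceptual hash."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    """Return True if the upload's hash appears in the blocklist."""
    return sha256_hex(image_bytes) in KNOWN_HASHES

def handle_upload(image_bytes: bytes) -> str:
    """Upload hook: reject a matching file before it is ever stored."""
    if is_known_match(image_bytes):
        # In practice: block the file, report to the relevant agency, log.
        return "rejected"
    return "accepted"
```

The key design point is that matching happens during the upload phase, so a flagged file is never written to the instance's storage and no human moderator has to view it.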

I’m happy to help facilitate a solution like this, as it’s something I also care about. Feel free to find me on discord if you want to talk.

Also, as others have said - I’m sorry you had to go through that. The same thing happened to me many years ago and it definitely affected me for a long time.

[-] ShellMonkey@lemmy.socdojo.com 8 points 1 year ago

There are some automated options out there at the frontend already. One that looks simple, if not absolute, is putting the site behind Cloudflare with their CSAM scanning engine. It'll even do some of the dirty work for you: reporting to the appropriate agency and putting a legal block page up on the link until someone can delete it.

this post was submitted on 17 Sep 2023
