this post was submitted on 26 Sep 2023
42 points (97.7% liked)
Feddit UK
Community for the Feddit UK instance.
A place to log issues, and for the admins to communicate with everyone.
This is probably the highest risk, in my opinion. I honestly think Lemmy needs some automated solution for monitoring and scanning for CSAM content, because as a user I don't want to be exposed to it, and as an admin I wouldn't want to be in the position of being responsible for censoring it.
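For what it's worth, the kind of automated check being asked for usually boils down to matching uploads against a list of known-bad hashes. A minimal sketch of that flow (the function name and blocklist here are made up for illustration; real deployments use perceptual hashing services like PhotoDNA or PDQ rather than plain SHA-256, since those survive resizing and re-encoding):

```python
# Hypothetical sketch: hash each upload and reject it if the hash
# appears on a known-bad list. Real systems use perceptual hashes,
# not SHA-256, but the overall flow is the same.
import hashlib

def is_blocked(upload_bytes: bytes, blocklist: set[str]) -> bool:
    """Return True if the upload's hash matches a known-bad hash."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in blocklist

# Example with a made-up blocklist entry:
bad_hash = hashlib.sha256(b"example-bad-file").hexdigest()
blocklist = {bad_hash}
print(is_blocked(b"example-bad-file", blocklist))       # True
print(is_blocked(b"harmless-cat-picture", blocklist))   # False
```

The hard part in practice isn't this check, it's getting access to a vetted hash database and wiring the scan into the upload path, which is exactly why it needs an active admin.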
I think lemmy.world have kind of made a good point here: we need an admin in place who's reactive and willing to maintain the server regularly - in both a moderation and a technical sense - or we should consider migrating to a larger instance.
This is no dig at @tom, as he's done a phenomenal job here and has undoubtedly spent time and money creating this instance, but it would be good to get a sense from him of whether he really wants to continue with the project. If not, it should lead to a larger discussion of where to go from here, because I don't think the status quo is sustainable.
What's CSAM? (I don't want to google it, it sounds "risky" from the context).
It basically means "pornography with children in it", though I'm unsure of the specifics of the abbreviation and likewise don't want to google it.
It stands for "child sexual abuse material", if I remember correctly.