We're looking to put together some more detailed rules on what should and should not be submitted to the instance. Things such as, but not exclusively:
- What types of message you would always like to see removed on sight
- Whether there are any types of message which should be left up (borderline, with strong corrections from the community)
- Where the line is drawn on political views (and how gray areas should be treated)
I'll make no bones about it: moderating uk/ukpol has been a learning experience for me.
I've learned that there often isn't much difference between "leaving a comment up because the community has done an excellent job highlighting flaws" and "I should have removed this hours ago, the community shouldn't have to do this".
As there isn't a way to mod-tag a post, inaction on negative posts can reflect badly on the instance as a whole.
Having some clear guidelines/rules will hopefully simplify things.
And more admins should mean that if a report isn't looked at, someone can review it as an escalation.
I've also enabled the slur filters. And we'll be listening to see if anything needs adding/removing (the template had swearing blocked :| )
So...Answers on a postcard, I guess!
I've been at the thick end of this fight for over a decade (on reddit). The vast majority of the work I and my co-mods do is never seen by anyone outside the mod team.
It depends. Very few are stupid enough to say anything that's obviously bad. Most use dog whistles and innuendo. And when they do speak plainly, there's an army of their like-minded friends to drown out any dissenting voices (brigading) and inflate their (and similar) comments.
Yes, bad actors can have their own community. No, you don't want to go there. Did you ever see /r/MGTOW, or /r/PussyPassDenied? Misogyny is rife, and homophobia, transphobia, xenophobia, racism, etc. are all more common than they were 5-10 years ago.
If a mod is any good, and knows the group the bad actors are from, then the majority of the work is done behind closed doors (automod for dog whistles and known phrases, etc).
Bans are seen as a "badge of honor" by most of these people, since it's so easy to create a new account. They can also wait to appeal after 3 months, since that's how far back the moderation log goes (on reddit), so unless you've kept notes and evidence it's easy for them to play the fool and say they've "turned over a new leaf".
This doesn't even touch on suspected state actors / state-run bot accounts. Several UK regional subreddits saw a wave of anti-Ukrainian posts (false news reports about robbery, attacks, theft, etc. by refugees) in an attempt to destabilise the UK's support for Ukraine shortly after the Russian invasion.
Or the "bad news" accounts that just go from regional subreddit to subreddit posting (legitimate) news stories about bad things. Rape, murder, assault. Anything that can stir up some rage. They never comment, and post at all hours of the day, every day of the week.
Thanks for the reply. As I said, I think I may have a very strong filter. I know these things exist; I just seem to be able to avoid and ignore them.
I've rarely ventured outside of my own communities except out of curiosity, and I always found what I suspected was there, on both Reddit and Lemmy. That's probably why I'm not subjected to it.
I should add that I'm aware of how much good work the mods generally do, and that's helped keep my experience a good one.
Thanks for the chat, it's helped me to re-evaluate things. A self-reflection of sorts.