We're looking to put together some more detailed rules on what should and should not be submitted to the instance. Things such as, but not exclusively:
- What types of message you would always like to see removed on sight
- Whether there are any types of message which should be left up (borderline, with strong corrections from the community)
- Where the line is drawn on political views (and how gray areas should be treated)
I'll make no bones about it: moderating uk/ukpol has been a learning experience for me.
I've learned that there often isn't much difference between "leaving a comment up because the community has done an excellent job highlighting flaws" and "I should have removed this hours ago, the community shouldn't have to do this".
As there isn't a way to mod-tag a post, inaction on negative posts can reflect badly on the instance as a whole.
Having some clear guidelines/rules will hopefully simplify things.
And more admins should mean that if a report isn't looked at, someone can review it as an escalation.
I've also enabled the slur filters. And we'll be listening to see if anything needs adding/removing (the template had swearing blocked :| )
So...Answers on a postcard, I guess!
Yeah, I have to agree with this. If I don't know much about a post's topic and read a comment that could be arguing in bad faith, it's usually a good indicator if it has lots of down votes.
In my experience, if bad actors have a specific agenda to push they usually hang around in groups of likeminded people and search for specific discussions, posts, subjects, etc to push their agenda in. This can often result in a toxic / bad faith comment being one of, if not the most highly upvoted comments (due to brigading). Most people that take the middle ground on a topic rarely have the energy to fight against a dog pile of downvotes and bad actors. Automatically assuming that bad actors are single outliers operating by themselves is a bad position that IMHO leads to "too little, too late" moderation reactions after the damage is done and the post has turned toxic.
This is a very complex topic, and not one I believe can be solved by just letting the voting system work it out. I don't know the right answer, it's one I've been searching for, for years. But the assumption of "good faith, until proven otherwise" in conjunction with "it's only a few bad actors" is specifically the mentality the bad actors are exploiting. IMHO, YMMV, IANAL, etc...
I've seen this mentioned over the years, I've just never experienced it. I'd like to ask you about it because I'm fairly certain I would know if I had. Please bear with me, I'm not trying to be antagonistic.
If it's an agenda that goes against what people generally think, don't they get down voted?
That sounds like their own communities? Those are easy to spot and back away from. If it's just a group of individuals whose agenda goes against the grain, those "discussions, posts, subjects" are either controversial to begin with or they aren't of enough significance to make a difference anyway. Where it matters, wouldn't they be overwhelmingly down voted?
I'm not aware of having seen this when it hasn't been dealt with by mods/admins, usually by locking the post or deleting the comment and with bans. Where you see damage being done through the post having quickly turned toxic, I see the moderation that follows as a red flag.
Maybe I'm thick skinned and cynical enough that it's obvious to me and I just ignore it and move on? It's also possible that the places I frequent aren't targets for the bad actors you describe?
Any thoughts on this would be very much appreciated as I'm trying to keep up with the modern world and I need all the help I can get.
I've been at the thick end of this fight for over a decade (on reddit). The vast majority of the work I and my co-mods do is never seen by anyone outside the mod team.
It depends. Very few are stupid enough to say anything that's obviously bad. Most use dog whistles and innuendo. And when they do speak plainly, there's an army of their likeminded friends to drown out any dissenting voices (brigading) and inflate their (and similar) comments.
Yes, bad actors can have their own community. No, you don't want to go there. Did you ever see /r/MGTOW, or /r/PussyPassDenied? Misogyny is rife; homophobia, transphobia, xenophobia, racism, etc, etc are all more common than they were 5-10 years ago.
If a mod is any good, and knows the group the bad actors are from, then the majority of the work is done behind closed doors (automod for dog whistles and known phrases, etc).
Bans are seen as a "badge of honor" by most of these people since it's so easy to create a new account. They can also wait to appeal after 3 months since that's how far back the moderation log goes (on reddit), so unless you've kept notes and evidence it's easy for them to play the fool and say they've "turned over a new leaf".
This doesn't even touch on suspected state actors / state-run bot accounts. Several UK regional subreddits saw a wave of anti-Ukrainian posts (false news reports about robbery, attacks, theft, etc by refugees) in an attempt to destabilise the UK's support for Ukraine shortly after the Russian invasion.
Or the "bad news" accounts that just go from regional subreddit to subreddit posting (legitimate) news stories about bad things. Rape, murder, assault. Anything that can stir up some rage. They never comment, and post at all hours of the day, every day of the week.
Thanks for the reply. As I said, I think I may have a very strong filter. I know these things exist, I just seem to be able to avoid and ignore them.
I've rarely ventured outside of my own communities other than to be curious and always found what I suspected was there, on both Reddit and Lemmy, and that's probably why I'm not subjected to it as well.
I must add that I am aware of how much good work the mods generally do, and I'm relying on that to keep my experience a good one.
Thanks for the chat, it's helped me to evaluate things again. A self-reflection of sorts.