Comics
This is a community for everything comics related! A place for all comics fans.
Rules:
1- Do not violate lemmy.ml site-wide rules
2- Be civil.
3- If you are going to post NSFW content that doesn't violate the lemmy.ml site-wide rules, please mark it as NSFW and add a content warning (CW). This includes content that shows the killing of people and/or animals, gore, content that talks about or shows suicide, content that talks about sexual assault, etc. Please use your best judgement. We want to keep this space safe for all our comic lovers.
4- No Zionism or Hasbara apologia of any kind. We stand with Palestine 🇵🇸. Zionists will be banned on sight.
5- The moderation team reserves the right to remove any post or comment that it deems necessary for the well-being and safety of the members of this community; the same goes for temporarily or permanently banning any user.
Guidelines:
- If possible, give us your sources.
- If possible, credit the creator of each comic in the title or body of your post. If you are the creator, please credit yourself. A simple “- Me” would suffice.
- In general terms, write in the body of your post as much information as possible (dates, creators, editors, links).
- If you found the image on the web, it is encouraged to put the direct link to the image in the ‘Link’ field when creating a post, instead of uploading the image to Lemmy. Direct links usually end in .jpg, .png, etc.
- One post per topic.
A society that values free speech rejects Mustache's philosophy. He never gains enough of a following in panel 1, 2, or 3 to be able to enact panel 4.
As soon as we allow ourselves to silence someone, Mustache can use the same argument to justify silencing Black Shirt. When we allow ourselves to suppress an enemy of society, Mustache merely needs to suggest to us that Black Shirt is such an enemy.
The insidious part of fascism is that by the time we get to Panel 4, we are the ones carrying Black Shirt to the gallows.
I couldn't have said it better.
There are dictatorships you would not stereotypically identify as fascist, yet they silence anyone they deem dangerous by calling them a fascist. Oldest trick in the book.
A very simple test: A f*ing fascist could use the same comic to justify repressing communists in a fascist regime. They would just have to replace those "fascist" beliefs with communist ones.
This is what worries me about large centralized platforms. They normalize the idea that offensive speakers should be silenced, or should be able to silence dissent. They shouldn't. They should be challenged or ignored. You can block an individual, controlling what you listen to. You can urge others to ignore them. But it should be a cringeworthy act of authoritarianism to lay down a banhammer and block someone from speaking.
The offensive, intolerant asshole should not be banning dissenters; dissenters should not be banning assholes. Any banning anywhere should be seen as deeply troubling, and only done openly, publicly, and with the consent and agreement of the community.
Unilateral control over the process should be seen as fascism.
I am thrilled that the decentralized nature of Lemmy effectively eliminates that capability.
i really don't understand this perspective. we aren't talking about the ability for anybody to silence anyone for any reason, we're talking specifically about rhetoric calling for the death of human beings. is that not a well defined category of speech we should at least keep an eye on? should we let people actively call for the death of other people, when we know historically that that specific kind of rhetoric can lead to people being put in camps?
like, if somebody's sole contribution to a platform is doxxing anybody they don't like, they should be stopped. if they shout death threats in a public forum, they shouldn't be in that forum. we don't need to give platforms unchecked power over our lives to put reasonable limitations on conduct for public platforms.
There is a difference between speech and violence. "Calling for the death of a human being" is violence, not speech. The speaker making that call should not be silenced; they should be jailed. And we have a process for doing just that. That process involves far more than someone unilaterally deciding to take away their microphone or ban them from a platform.
That process involves judges, either elected directly, or appointed by elected officials. It involves the community in the form of a jury of one's peers. It involves open processes and procedures, an appellate process, and a wide variety of protections for the accused.
Banning them from the platform is not a sufficient response to such an act of violence.
Threats of violence are not social disputes.
The rest of your argument is predicated on this fallacy, so I will ignore it.
oh that's why you'll ignore it, huh?
Correct. I am not defending death threats or threats of violence in any way, and I will not allow you to portray me as doing so. Please confine your arguments to forms of speech that do not rise to the level of violent criminality.
Fascism arises when dissent is silenced. Death threats are not dissent.
that's the thing, we don't live in a world where death threats and threats of violence are being dealt with in the way you seem to think they are, and community tools like bans are sometimes the only recourse people have that isn't ruinously expensive, glacially slow, and uncertain to work.
but sure, let's say we aren't talking about explicit death threats or threats of violence. instead, they just... post the account information of queer tiktok creators, and spend most of their time calling queer people groomers and pedophiles. it's not directly a threat of violence, but every time they post something, the accounts they post get harassed by tons of anonymous followers, one of them figures out where they live, and then starts bombarding a real human person with death threats. everybody making the death threats is anonymous, there's no way for the legal system to touch them. what do we do? nothing? or somebody's whole online presence is talking about the great replacement, how the anglo-saxon race is being exterminated, and somewhere down the line we start seeing mass shooters pop up saying nearly the exact same thing in their manifestos. stochastic terrorism. using speech to motivate anonymous observers to take violent action, without calling for violence explicitly. should nothing be done about that? is that not concerning to you?
i think you have a very simplistic definition of what fascism is, and what can or cannot be defined as a threat of violence. there is nuance to what should and should not be considered hate speech, and if you're defending the institution of slavery, implying queer people are groomers, really doing any sort of bigotry, it can meaningfully cause harm to people even if it isn't in and of itself a threat of violence. what do we do then? either nothing or put them in jail? because i think that having more than one way of mediating and enacting punishment for misbehavior is a good thing. i think that being able to respond proportionately to assholes without waiting for them to reach the threshold of illegality is a more healthy way of maintaining a community than putting a firm barrier between "dissent" and "actual crime".
I am not interested in discussing death threats.
I will not discuss criminal speech, let alone defend it. I refuse to take the position you are attempting to assign to me. I do not accept your red herring and strawman arguments.
The overwhelming majority of bans, blocks, and other fascist, silencing behaviors are in response to non-criminal speech. Please confine your arguments to such speech.
right... did you read the rest of it? because i did make a relevant argument like right below that.
No, I did not read the rest of it. Again, the premise of your argument was a strawman about death threats, and I refuse to engage with that premise. Demonstrate comprehension of that distinction, or find someone else to argue with.
read the rest of it. or don't, whatever. the majority of the post did conform to your specifications. i object to your framing, i just don't think it's settled ground that these things would be handled appropriately by a court of law, or that they are being handled in the way you have previously described. but i would also just generally recommend reading what somebody says before deciding what their argument is? even if just for curiosity's sake. that's a weird way of engaging with somebody.
I'll read it eventually, but I won't engage with it. This topic is too sensitive and contentious to allow that sort of misconception to creep in. I am not interested in derailing a discussion on censorship by conflating speech with violence.
Apply that argument to someone who has been censored/silenced, and you might begin to understand why I oppose it.
ugh. i know you think that's clever, but it's just confusing. why would they be judged by anything other than the content of their arguments? that's why people get banned, it's because of what they're saying! i don't hold the position that people should be banned or moderated for something other than their behavior, that wouldn't make sense. in any case, i'm not conflating speech with violence, i'm not misconceiving anything. i disagree with the premise that speech and violence are discrete from one another. they operate on a continuum. there is speech that is more violent than other speech, and we should have tools for dealing with the things that can lead to but are not in and of themselves violence. content moderation is one of those tools.
Those two sentences are contradictory. There is no such thing as lawful violent speech, nor unlawful non-violent speech. No violent speech is protected; no non-violent speech is prohibited. We don't have an authority to tell us exactly where that line is. We do have the consensus of society in general, whom we can consult - formally or informally - on whether that line has been crossed.
"Content moderation" replaces that societal consensus with authoritarian opinion. When you decide I don't need to hear from Redneck Russell about how he hates Jews, I am harmed. I don't get to challenge Russell's opinions, or argue with him, or rally people against him. In silencing him, you've taken away my ability to engage him. He still gets to recruit his disciples into his own little spaces out of your control. If I try to engage him there, he merely silences me, censors me. His acolytes never hear a dissenting opinion against him, because he, and you, have decided I don't need to engage him.
They occasionally come out of their little holes, spout their nonsense in your forums, and proudly tell their compatriots that you banned them from talking to your community members because you couldn't engage them.
Content moderation should not take the form of banning or blocking speech outright, and should not be conducted unilaterally. Moderation should be community driven and transparent. Anyone should be able to see what was hidden, so they can determine for themselves if the censorship was reasonable and appropriate. The content should remain readily available, perhaps "hidden" behind an unexpanded tab rather than deleted entirely.
i've given several examples where that isn't as clear cut, but whatever. speech is a behavior, and can modulate how we act. if you tell people that a group of people is evil, and never say what to do about it, you still increase the likelihood that somebody will act on the belief that that group of people is evil. there are material consequences for speech that fall between causing violence and not causing violence.
the boundaries of lawfulness, violence, and all that are socially defined, yes, but if you concede that much, then there will be communities that define racism, bigotry, and other forms of inflammatory speech as violent, and decide that those things ought not to be in their social spaces. unless you're appealing to the group consensus of the largest possible group, there will be subcultures that disagree with each other on what does and doesn't constitute violent speech. if you're appealing to the legality of speech, you aren't appealing to group consensus, you're appealing to the government. so either we as autonomous communities ought to draw our own lines for what is and isn't violent speech (what i believe), or there is a precise legal definition we have to adhere to, given to us by the government. in reality, it's both. there are firm lines of conduct that the government prohibits in theory (though i would dispute their efficacy), and there are communities that disagree on what the limit should be. i don't think that having codes of conduct in this way is necessarily authoritarian.
to be clear, i am here talking to you because i prefer the model that federated services use for moderating their communities, and believe that having tech companies be the sole arbiter of what is and isn't proper speech is a fundamentally flawed approach. that being said, the problem i have with your solution is one that's shared with a lot of community moderation on platforms. it relies on people being willing and able to confront and defuse bigotry on an individual level. i'm jewish. i don't want to hear what Redneck Russell has to say. i doubt that i could say anything to him to change his mind, and i don't want my internet experience to be saturated in Russells, for the basic reason that i want my time online to be relatively relaxing. people who are less attached to jewish identity are even less likely to engage with him, because it doesn't affect them personally, internet arguments are often unpleasant, and they also want their time online to be relatively relaxing. so how do things pan out if a community is only loosely engaged? well, if we aren't relying on moderators to curate our platforms, the hate motivated Russells of the world are empowered to say their bullshit, they receive relatively little resistance, and the relative permissiveness attracts more Russells. the people who want a nice place to hang out online go elsewhere, the concentration of Russells rises, and we're left with a platform that is actively hostile towards jewish people. oops!
if you are part of a focused, highly engaged community, maybe your solution works, but most online spaces are not focused and highly engaged. i agree generally that echo chambers are problematic, but i think on the whole that federation does more to mitigate that than large, algorithmically segregated platforms. i don't really agree that banning or blocking don't or won't play a role in ensuring that social spaces are friendly and enjoyable to be in, especially for marginalized groups. if you let people say the n word on your platform, and don't do anything about the people who do, don't expect many people of color to want to be where you are. it's just not fun to hang out with bigots if you're the one they're targeting, and that will affect the culture of your platform.
i think it really isn't so simple. some people are more invested in a community than others, lots of people are just... not interested in auditing their moderators. generally i think it's a good idea to have it be transparent, certainly better than what any major social media platforms do, but at a certain point it does just come down to trust. for example, i agree broadly with the code of conduct for Beehaw, that's why i have an account there. i'm generally uninterested in trying to verbally spar with bigots, i don't want to engage deeply with the moderation of the platform, i have no interest in litigating what is and isn't proper conduct on the site, that's not what i use the internet for. lots of people who are the target of bigotry and hatred just... don't really want to constantly be on guard for that shit. they want a space where they can exist without being confronted with cruelty. i wouldn't want to be on the kind of platform you're describing, sorry.
in any case, i think i'm basically done with you. the world isn't made of neat little blocks you can arrange to your liking. the barrier between criminal and non-criminal speech is socially constructed, and the conduct of individuals doesn't go from perfectly fine to absolutely unacceptable in an instant. it's more nuanced than that, and the way we interact with each other should reflect that nuance. like it or not, we have to be the ones to determine what is and is not a threat, it cannot be deferred to an authority unquestioningly.
On the other hand, calling for the death of capitalists or billionaires, and the politicians who enable them, should be protected speech. I'd go so far as to say that anything short of actually committing physical violence directly upon them and their family should be the most protected speech.
If you are exploiting society so completely, so wantonly, that people want to actually kill you, then you SHOULD feel uncomfortable in that society. You should feel the need to hire an army of private security; going outside should be a burden for you because of what you have done.
You seem to have learnt nothing from history and how fascism manifests itself. Adderaline had many good points but you just don't want to actually respond to them? There are so many rightwing, fascistic parties in various countries that already use the rhetoric of panels 1-3. And now society debates if e.g. trans people should be allowed to exist or not, if immigrants should be deported or not, if racism is actually a thing or not. We need to define a line where we will not tolerate further discussions. Because if we allow any form of discussion on certain topics, we will again and again get to the point where we argue about someone's right to exist. And this will result in panel 4. I'm glad for you that you don't seem to be affected by this. But please listen to people who are. It is very very frightening if people are publicly debating if they should consider you a valuable human being or not. And even more so as right wing and fascist politics are gaining more traction worldwide.
Adderaline did, indeed, have many good points, just not any that were actually relevant. None of my arguments denied the prosecution or condemnation of death threats. As I am not defending threats or other forms of violence, there is no issue under dispute, and nothing for me to engage.
Every fascist movement has attempted to suppress groups they deem undesirable or offensive. Your determination that racists are undesirable does not impress me. Nor your targeting of homophobes, transphobes, sexists. The reason your calls for suppression against these people don't impress me today is because I have no idea who you are going to be trying to suppress tomorrow.
I take my guidance from Thomas Paine:
> He that would make his own liberty secure, must guard even his enemy from oppression; for if he violates this duty, he establishes a precedent that will reach to himself.
Fascism manifests by constantly identifying new and exciting targets for oppression. I reserve my right to disagree with you in the future, so I must defend against your suppressive acts today.
this isn't about "offense" or whatever. it's specifically about people calling for other people to be removed from society. when is stopping the spread of that specific conduct ever a bad thing? i broadly agree that decentralized platforms are a good thing overall, but we can't ignore the many ways in which poor moderation makes spaces hostile to people, right? or the very real problems that unmoderated, anonymous spaces have with bigotry, CSAM, and other bullshit. i don't fuckin' like 4chan, places like that suck balls. not every aspect of our social lives should be governed by some blanket approval for saying whatever the fuck you like whenever the fuck you want, no actual real life social situation plays by those rules. if a person consistently talks about killing jews, or other bigoted bullshit, i don't want to occupy the same social space as that person, and i want there to be mechanisms in place to stop people like that from bothering people.
Well, I'll give you an example of exactly when that is a problem:
Your very own statement could be considered a "call for other people to be removed from society". Specifically, you were calling for "anti-semites" to be forced to "stop bothering people".
Now, you and I might agree that anti-semites don't bring much of value to the table, but in calling for them to be silenced, we are likely going to impact Palestinians, for example, who arguably have a legitimate grievance that could also be classified as anti-semitism.
The Germans believed they were protecting society when they rounded up Jews and other undesirables. They raised the same arguments you have. They didn't believe at the time that they were the mustached man in the comics.
The allies believed they were protecting society when they continued to imprison the homosexuals that Hitler had arrested. They raised the same arguments you have. They didn't believe at the time that they were the mustached man in the comics.
If we are appalled at what Mustache is saying and doing to the people he considers his enemies, we must refrain from saying and doing the same things to those who offend us. We cannot assume that our adversaries are completely wrong. We each have to consider that we might be the ones wearing the mustaches.
you're wrong? you're just wrong. the Nazis called for death. i am not. they spoke their desire to see the extermination of millions of people, and then they did it. saying that bigots shouldn't be able to speak hate on a public platform and saying that people should die are NOT even remotely similar, and are extremely easy to distinguish from one another. we are not, at all times, at risk of becoming fascists because we don't want people to try organizing pogroms in the modern era.
you're flattening speech down into this binary choice, where we can either allow everybody to say anything or we cede our right to speech entirely. that has always been, and will always be, a falsehood. there has never been, at any point in human history, the kind of free speech you seem to think is an intrinsic right. it simply is not how the world has ever worked. if you make threats at people in a public place, you can be made to leave. if you attend a party and scream racist slurs at a guest, you can be made to leave. if you join a club and can't stop yourself from ranting about jews, you can be made to leave. it is not wrong or authoritarian to expel anti-social weirdos from the places where you socialize.
no community of people is obligated to tolerate bigotry, hatred, and threats, and communities that do tolerate such things are usually shitty places only assholes like to hang out in. it is not some great miscarriage of justice that we are implementing ways of removing assholes from our digital spaces, in the same way we have done so in meatspace since before we had agriculture, and it is not a sign that we're sliding into fascism. what is a sign that we're sliding into fascism is platforms empowering extremists to speak their desire to do harm to others into being, allowing them to say their bullshit without meaningful consequence.
the Holocaust started with words. it started with speeches and books and pamphlets, and it culminated in the extermination of millions of human beings. words are powerful, and we must treat them as such. that being said, i am generally opposed to corporate middlemen monopolizing the social spaces we inhabit, that's why we're having this conversation here instead of elsewhere. i just think this whole idea, that we ought not ban people from platforms for being bigots and ought not have codes of conduct for our social spaces, is fundamentally at odds with any sort of functional, friendly, welcoming community, and i don't want to have to sort through racist screeds every time i go on the internet.
the reality is, if you want to cultivate a space that accommodates bigotry and hatred, expect to find only bigots and racists to hang with, because all the nice people are not going to want to visit your Nazi bar.
FWIW, .world had nothing to do with your previous comment
.world might be sending DMs when your stuff gets removed, but that was removed by a .ml mod.