this post was submitted on 20 Aug 2023
31 points (100.0% liked)

SneerClub

989 readers

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago
[–] [email protected] 18 points 1 year ago (1 children)

I haven't listened to this podcast, but

[–] [email protected] 5 points 1 year ago

Far as I can tell none of them had the attention span for the BtB live show on Yudkowsky either which is a shame because I actually learned quite a bit from it (being that I'm not going to slog through HPMOR, even out of spite.)

[–] [email protected] 18 points 1 year ago (2 children)

a quick supercut of garbage Scott Alexander takes for your sneering convenience:

I bet the billionaires who have donated the majority of their fortune to the cause also enjoy being told it's just so they can get a few percent tax break.

does anyone have the slightest idea what Scott’s talking about here? cause I feel like I’d have heard of a billionaire donating most of their money to any cause, unless my intuition’s right and scott’s playing with words pretending that treating a non-profit as a piggy bank somehow isn’t possible or common

I am of the opinion that most billionaires are good and have nothing to whitewash. But even aside from that, this is the worst possible way to whitewash money. If you spend your money on a yacht, people say "cool yacht".

I… I don’t have words for this one actually, the sneers write themselves

"Doesn't actually do selfless things with the money" - can you explain how this works? My impression is if your foundation spends money on yachts or mansions for yourself, that's plain tax fraud.

a bunch of folks in the thread and the original podcast have already explained exactly how this works, Scott, and it’s also kinda fucking obvious, but I guess don’t let that stop you from JAQing off about it

[–] [email protected] 7 points 1 year ago* (last edited 1 year ago) (1 children)

@self @dgerard Wanna bet Scott didn't listen to the podcast before he wrote these replies?

Also note that this post fails their own moderation rules. It isn't true or kind (they admit their reaction is in bad faith), and it is, according to them, culture war shit. I wonder, if you were to post this podcast and say you agreed with it, whether your post would be removed as culture war.

(Also wow, Scott's replies are so bad, layers of strawmen and misunderstanding of the left's arguments)

[–] [email protected] 6 points 1 year ago (1 children)

Well technically, there’s no culture war thread on the subreddit anymore, so all of the culture war rules they had to invent to contain the culture war they started in part by inventing a series of culture war rules which constantly fed the flames of culture war are irrelevant, since when there’s no culture war thread to quarantine the culture war it’s fine to wage culture war elsewhere (and if you took away culture war, what would be the point of either the sub or its inspiring author?)

[–] [email protected] 4 points 1 year ago (1 children)

@YouKnowWhoTheFuckIAM And they have not even updated their rules after all those years. Prob because parts of the rules come from 'The Rightful Caliph' himself, and, well, they can't change them without divine guidance about what to do about the themotte/cw problem.

[–] [email protected] 4 points 1 year ago

@YouKnowWhoTheFuckIAM (My joke about the 'The Rightful Caliph' doesn't come from sneerclub btw, it was actually what some of the Rationalists said about Scott when he became more popular than Yudkowsky)

[–] [email protected] 17 points 1 year ago (2 children)

But dismissing the majority of the community is like saying the entirety of Antifa

drink!

it’s actually amazing to me how many folks in that thread say they love behind the bastards but don’t appear to have learned anything from it. it’s all “oh man I thought Robert Evans would love my particular techfash grift run by billionaires” which, like, what did they think the point of the podcast was this whole time?

[–] [email protected] 8 points 1 year ago (1 children)

If you listen to Behind the Bastards for a while and don't come away at least inching a little closer towards being a crazy anti-capitalist man with a lot of guns hiding somewhere in the woods, I can't help you.

Anderson would be disappointed

[–] [email protected] 3 points 1 year ago (2 children)

Still not a gun fan (I look at statistics) but I hear you on all other counts.

[–] [email protected] 14 points 1 year ago (4 children)

I constantly experience [the Gell-Mann amnesia] effect on this subreddit; everyone sounds so smart and so knowledgeable until they start talking about the handful of things I know a little bit about (leftism, the arts, philosophy) and they’re so far off the mark — then there’s another post and I’ve forgotten all about it

Bias noted, impact not reduced. Basic rationality failed. These people are so willing to discard their own sense of right and wrong, moral or rational, just to belong in their weird cult. Why is it so hard for these dorks to admit that they don't actually care about being smart or rational and that they just want a bunch of other dorks to be friends with?

[–] [email protected] 15 points 1 year ago

Why is it so hard for these dorks to admit that they don’t actually care about being smart or rational and that they just want a bunch of other dorks to be friends with?

because that'd involve talking about feelings, and those are the domain of weird squishy gooey human things, disgusting! :sarcmark:

[–] [email protected] 9 points 1 year ago

then there’s another post and I’ve forgotten all about it

was so sensible until this twisty tail

[–] [email protected] 7 points 1 year ago

Why is it so hard for these dorks to admit that they don’t actually care about being smart or rational and that they just want a bunch of other dorks to be friends with?

(Not to be trivially reductionist, but...) Because admitting you want friends is unmasculine. IBTP

[–] [email protected] 6 points 1 year ago (1 children)

Why is it so hard for these dorks to admit that they don't actually care about being smart or rational and that they just want a bunch of other dorks to be friends with?

Do they just want some likemindedly dorky friends, or a sense of superiority over the normies/NPCs/whatever they call everyone else, complete with extensively footnoted justifications for why they are the cognitive aristocracy?

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (2 children)

@AllNewTypeFace @swlabr Because they are in it for the status boost. (At least, that is what they accuse sneerclubbers of wanting, so I'm just going off the 'accusations are projections' bit.) And being part of the rich tech influencer crowd is quite a bit of status.

They also convince themselves they are actually right, and we just are factually wrong.

See this, esp the first comment. https://www.lesswrong.com/posts/xMkDui7oCFZPaWYPp/retracted-a-really-simplistic-experiment-for-lesswrong-and-r (but the rest of the thread is also quite something)

[–] [email protected] 5 points 1 year ago (1 children)

Ima sit here and imagine a world where SneerClub makes enough money to cover its own infrastructure, much less the literal fuckton of cash that gets thrown at EA and rationalist groups because there’s good money in carrying water for billionaires who need a smart-sounding excuse for doing a whole bunch of racism

[–] [email protected] 5 points 1 year ago

Since that does not seem likely to be the sort of answer you’re looking for though, if I wanted to bridge the inferential gap with a hypothetical Sneer Clubber who genuinely cared about truth, or indeed about anything other than status (which they do not), I’d tell them that convention doesn’t work as well as one might think. If you think that the conventional way to approach the world is usually right, the rationalist community will seem unusually stupid.

get in the fucking locker

[–] [email protected] 3 points 1 year ago (1 children)

wow the angry downvotes, much coolness so rational

[–] [email protected] 4 points 1 year ago (1 children)

the downvotes and absolute lack of replies whenever SneerClubMod made a point they couldn’t pedantically dismiss are very noticeable

[–] [email protected] 4 points 1 year ago (1 children)

@self @gerikson Combined with the fact that they started this post as an invitation for dialogue, an attempt to bridge the gaps, but it quickly turned into reasoning from first feelings and dismissal of the actual experts.

[–] [email protected] 2 points 1 year ago (1 children)

* an invitation for monologue

[–] [email protected] 3 points 1 year ago

@dgerard Yes, indeed, I unconsciously steelmanned it sorry.

[–] [email protected] 14 points 1 year ago (1 children)

For a subreddit of supposedly ultra-rational, dispassionate intellectuals, willing to consider any idea on its merits based on pure argument while ignoring the speaker, they sure do spend a lot of words talking about the kind of man Robert is vs what he actually says in the ep.

[–] [email protected] 11 points 1 year ago* (last edited 1 year ago) (1 children)

My rational analysis — Your emotional reaction

[–] [email protected] 12 points 1 year ago (1 children)

I have priors
You have biases
She is toxoplasmotic SJW filth

[–] [email protected] 9 points 1 year ago

She is toxoplasmotic SJW filth

If you say this in front of a mirror 3 times I show up

spoiler: garf-chan

[–] [email protected] 12 points 1 year ago (2 children)

At risk of going NSFW, it's obvious that none of these folks have read Singer 1971, which is the paper that kickstarted the EA movement. This paper's argument has a massive fucking hole right in the middle.

Without cracking open the paper, I seem to recall that it is specifically about Oxfam and famine in Africa. The central claim of the paper is that everybody should donate to Oxfam. However, if one is an employee of Oxfam, then suddenly the utilitarian arithmetic fails; his argument only allows for money going from non-Oxfam taxpayers to Oxfam employees.

Can't help but notice how the main problem with EA charities is the fucking nepotism. Almost as if the EA movement rests on a philosophical foundation of ignoring when charities employ friends of donors.

[–] [email protected] 5 points 1 year ago

there was a while I was working at an org that would occasionally do things with the wikimedia foundation

for similar reasons as what you remark on here: when the walesbegging banners would pop up on wikipedia, I'd only chuckle and move on

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago) (1 children)

I don’t see how this works.

On one point:

The utilitarian argument construes the relevant ethical concerns, unsurprisingly, as utilitarian: the starting point doesn’t matter so long as the right results get over the line. This can be both one of utilitarianism’s greatest strengths and greatest weaknesses, and in this case the strength is that utilitarianism is highly accommodating of the fact that some but not all people are employees of Oxfam (or indeed any relevant charity or similar organisation). The obvious point to make is that if you’re not an employee of Oxfam then the utilitarian argument goes through, because giving to Oxfam is your means of getting those results over the line. If you are an employee of Oxfam, then perhaps you don’t need to give, because working for Oxfam is your means.

On another:

The sentence “his argument only allows for money going from non-Oxfam taxpayers to Oxfam employees” doesn’t include the important premise “the role of an Oxfam employee is to convert that money into good deeds done for the poor, for example by using it to pay for food in a famine”. The intended result is the same whether you are an employee of Oxfam or not (viz. paying for food in a famine). You want us to quibble about the wording (or rather: the wording as you have summarised it here) on grounds (which you leave implicit, so correct me if I’m wrong) that it is incoherent to say “everybody” when some people are already employees of Oxfam.

This seems to drastically confuse Singer’s actual aim (to convince the vast majority of people who are not Oxfam employees to give to Oxfam) for something not only very odd but plainly non-utilitarian, something like: “it is a deontological requirement that everybody give money to Oxfam”.

[–] [email protected] 4 points 1 year ago (2 children)

I was incorrect; the paper is about famine and aid in Bengal.

NSFW: Here is a PDF of Singer's paper. On p4 you can see the closest he gets to actually doing arithmetic. At that point he does not notice the problem I pointed out; he only notes that we can contribute labor instead of money, without considering that money is what compensates laborers. On p7 he admits that utilitarianism does not give a complete analysis, because it cannot predict a time when charity will no longer be necessary; however, he does not note that many charities are set up to provide eternal grift, including some of the biggest humanitarian-aid charities in the world.

Bonus sneer! Quote from Singer's paper (p9):

Another, more serious reason for not giving to famine relief funds is that until there is effective population control, relieving famine merely postpones starvation. … The conclusion that should be drawn is that the best means of preventing famine, in the long run, is population control. It would then follow from the position reached earlier that one ought to be doing all one can to promote population control (unless one held that all forms of population control were wrong in themselves, or would have significantly bad consequences). Since there are organizations working specifically for population control, one would then support them rather than more orthodox methods of preventing famine.

Isn't Singer so polite to leave us an escape hatch just in case we happen to "[hold] that all forms of population control [are] wrong in themselves"? But we have enough experience to know now that sterilization (USA), rules against too many children (CCP), and straight-up forced starvation (USSR) are inhumane. So while his ignorance could be acceptable in the 70s, I think that our half-century of intervening experience shows that he was, uh, naïve.

[–] [email protected] 6 points 1 year ago

I suspect that was simply Singer's nod to religious opposition to voluntary contraception and he wasn't necessarily suggesting that the things you list are viable options.

[–] [email protected] 2 points 1 year ago (1 children)

I don’t really get the sneer here; he mentions population control at a time when it was widely believed that overpopulation was a looming problem

[–] [email protected] 4 points 1 year ago (1 children)

that the eugenics was mainstream doesn't make it not eugenics tho, or mean that it wasn't bad then too

[–] [email protected] 3 points 1 year ago (1 children)

No, but Singer does mean stuff like “supply birth control to people who don’t have birth control” and “make them rich and educated so they have fewer kids”, which, eugenics or not, is a real policy response pursued by governments which had to deal with famine

[–] [email protected] 2 points 1 year ago (2 children)

singer's a fucking eugenicist though (as any disabled person would tell you).

[–] [email protected] 12 points 1 year ago

Are there crazy people adjacent to the community? Of course, and there are certainly loud billionaires co-opting it to their own purposes, and even some of the people from the beginning have failed to live up to their aspirations.

99% of EA funding went into building a lampshade to hang over this minor quibble.

[–] [email protected] 8 points 1 year ago (1 children)

Not all Effective Altruists! But! all Antifa.

[–] [email protected] 8 points 1 year ago

Antifa just hate helping others u kno

[–] [email protected] 7 points 1 year ago (1 children)

They say that our whole project is the baby killing machine, but it's not. Do I have other examples? No.

[–] [email protected] 10 points 1 year ago

Look you toxoplasmotic SJW filth, Scott donates money personally. You must just hate helping people.

[–] [email protected] 7 points 1 year ago (1 children)

@dgerard I read this and am now substantially stupider