this post was submitted on 06 Sep 2023
10 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago

This is a slightly emotional response off the back of a recent discussion with a heavily TESCREAList family member, which concluded with his belief that there is a very small number of humans with incredible information-processing abilities who know the real truth about humanity's future. He knows I hate Yudkowsky; I know he considers him one of the most important voices of our time. It's not fun listening to someone I love and value heading into borderline Scientology territory. I kind of feel like, just as with Peterson a few years ago, this is the next post-truth battle on our hands.

all 13 comments
[–] [email protected] 11 points 1 year ago (1 children)

The thing about rationalists is that they are fully invested in irrational beliefs, which they prefer not to examine. In other words, just like most people, but with a specific terminology that, if they use it properly, identifies them as one of the elect.

I suggest that whenever your relative talks about EA, you talk about kindness. When they bring up longtermism, point out that you have to survive in the short term to reach the long term, so working on better policies now is rather important. If they start in on life extension, note that until quite recently, all the major advances in average human lifespan came from reducing infant mortality, and be prepared to explain the demographic transition.

When they go extropian, say that it's a nice vision of the future but your kids are unlikely to see it unless we fix the world we're currently in.

But most of all, point out that multiplying infinitesimals by infinities to justify any course of action (a) is Pascal's Wager and (b) justifies every course of action, so it can't justify any.

[–] [email protected] 8 points 1 year ago

then they'll lament that you're a toxoplasmotic victim of the woke mind virus and wonder at your genes

[–] [email protected] 6 points 1 year ago

If you are the sort of person who subscribes to the worldview that rats do (anti-collectivist, libertarian), you absolutely subscribe to the Great Man view of historiography. It's meaningless to ask whether "we" are more fixated on individuals than before; the view among historians is divided, after all.

[–] [email protected] 5 points 1 year ago

Well, I don't know if we're more fixated on individual genius now than ever before. But it does feel like we're living through a retreat from democracy. I don't know what's coming next but it probably won't be good.

[–] [email protected] 5 points 1 year ago

I’ve seen this used to scam rats/postrats. It’s really something to see.

[–] [email protected] 3 points 1 year ago (4 children)

Isn’t Yudkowsky the nimrod from LessWrong? I’ve never actually met a fan of his in real life, but I empathize with your situation.

[–] [email protected] 9 points 1 year ago

SneerClub is, in part, a place for folks who have met fans of Yudkowsky and wish they hadn’t

[–] [email protected] 8 points 1 year ago

I'd never heard of Yudkowsky until a few months ago, when my brother ruined a dinner with doomerism dressed up as credible science and I had to dig deep to discover the insidious bullshit. Big thanks to Timnit Gebru and Émile Torres on that front.

[–] [email protected] 6 points 1 year ago

literally the dude this sub is about lol

[–] [email protected] 4 points 1 year ago

Yudkowsky is in fact the nimrod from LessWrong. Currently doing a fedora'd Chicken Little podcast tour.